hexsha stringlengths 40 40 | size int64 2 991k | ext stringclasses 2 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 4 208 | max_stars_repo_name stringlengths 6 106 | max_stars_repo_head_hexsha stringlengths 40 40 | max_stars_repo_licenses list | max_stars_count int64 1 33.5k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 4 208 | max_issues_repo_name stringlengths 6 106 | max_issues_repo_head_hexsha stringlengths 40 40 | max_issues_repo_licenses list | max_issues_count int64 1 16.3k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 4 208 | max_forks_repo_name stringlengths 6 106 | max_forks_repo_head_hexsha stringlengths 40 40 | max_forks_repo_licenses list | max_forks_count int64 1 6.91k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 2 991k | avg_line_length float64 1 36k | max_line_length int64 1 977k | alphanum_fraction float64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1c14cb52e9ad9439ad3e698ca3bbfb49b924e4be | 63 | ex | Elixir | testData/org/elixir_lang/parser_definition/matched_two_operation_parsing_test_case/UnaryNumericOperation.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 1,668 | 2015-01-03T05:54:27.000Z | 2022-03-25T08:01:20.000Z | testData/org/elixir_lang/parser_definition/matched_two_operation_parsing_test_case/UnaryNumericOperation.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 2,018 | 2015-01-01T22:43:39.000Z | 2022-03-31T20:13:08.000Z | testData/org/elixir_lang/parser_definition/matched_two_operation_parsing_test_case/UnaryNumericOperation.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 145 | 2015-01-15T11:37:16.000Z | 2021-12-22T05:51:02.000Z | +one ++ -two
!three -- ^four
not five..~~~six
+seven <> -eight
| 12.6 | 16 | 0.555556 |
1c15190836360dfbd66ae06e5362e143c3f2ed97 | 1,470 | exs | Elixir | lib/perspective/processor/steps/tests/request_authorizer_test.exs | backmath/perspective | a0a577d0ffb06805b64e4dcb171a093e051884b0 | [
"MIT"
] | 2 | 2020-04-24T19:43:06.000Z | 2020-04-24T19:52:27.000Z | lib/perspective/processor/steps/tests/request_authorizer_test.exs | backmath/perspective | a0a577d0ffb06805b64e4dcb171a093e051884b0 | [
"MIT"
] | null | null | null | lib/perspective/processor/steps/tests/request_authorizer_test.exs | backmath/perspective | a0a577d0ffb06805b64e4dcb171a093e051884b0 | [
"MIT"
] | null | null | null | defmodule Perspective.Processor.RequestAuthorizer.Test do
use ExUnit.Case
defmodule Example do
use Perspective.ActionRequest
domain_event(Perspective.Processor.RequestAuthorizer.Test.ExampleEvent, "1.0")
authorize(%{actor_id: "user/true"}) do
true
end
authorize(%{actor_id: "user/ok"}) do
true
end
authorize(%{actor_id: "user/false"}) do
false
end
authorize(%{actor_id: "user/error"}) do
{:error, :extra_error}
end
end
test "true and ok response return the request" do
request = Example.new("user/true")
assert request == Perspective.Processor.RequestAuthorizer.authorize(request)
request = Example.new("user/ok")
assert request == Perspective.Processor.RequestAuthorizer.authorize(request)
end
test "false failures throw an error" do
assert_raise Perspective.Unauthorized, fn ->
Example.new("user/false")
|> Perspective.Processor.RequestAuthorizer.authorize()
end
end
test "error tuples throw an error with the errors attached" do
error =
assert_raise Perspective.Unauthorized, fn ->
Example.new("user/error")
|> Perspective.Processor.RequestAuthorizer.authorize()
end
assert [:extra_error] == error.errors
end
test "default failures" do
assert_raise Perspective.Unauthorized, fn ->
Example.new("user/missing")
|> Perspective.Processor.RequestAuthorizer.authorize()
end
end
end
| 24.915254 | 82 | 0.690476 |
1c152499e67dce4c9c78c0ae1c0887b0e5398664 | 76,564 | ex | Elixir | lib/elixir/lib/string.ex | dogatuncay/elixir | 42875b97f858a31d3cbb8e1090ffb4d6c443ba75 | [
"Apache-2.0"
] | 243 | 2020-02-03T03:48:51.000Z | 2021-11-08T12:56:25.000Z | lib/elixir/lib/string.ex | dogatuncay/elixir | 42875b97f858a31d3cbb8e1090ffb4d6c443ba75 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/string.ex | dogatuncay/elixir | 42875b97f858a31d3cbb8e1090ffb4d6c443ba75 | [
"Apache-2.0"
] | null | null | null | import Kernel, except: [length: 1]
defmodule String do
@moduledoc ~S"""
Strings in Elixir are UTF-8 encoded binaries.
Strings in Elixir are a sequence of Unicode characters,
typically written between double quoted strings, such
as `"hello"` and `"héllò"`.
In case a string must have a double-quote in itself,
the double quotes must be escaped with a backslash,
for example: `"this is a string with \"double quotes\""`.
You can concatenate two strings with the `<>/2` operator:
iex> "hello" <> " " <> "world"
"hello world"
## Interpolation
Strings in Elixir also support interpolation. This allows
you to place some value in the middle of a string by using
the `#{}` syntax:
iex> name = "joe"
iex> "hello #{name}"
"hello joe"
Any Elixir expression is valid inside the interpolation.
If a string is given, the string is interpolated as is.
If any other value is given, Elixir will attempt to convert
it to a string using the `String.Chars` protocol. This
allows, for example, to output an integer from the interpolation:
iex> "2 + 2 = #{2 + 2}"
"2 + 2 = 4"
In case the value you want to interpolate cannot be
converted to a string, because it doesn't have a human
textual representation, a protocol error will be raised.
## Escape characters
Besides allowing double-quotes to be escaped with a backslash,
strings also support the following escape characters:
* `\a` - Bell
* `\b` - Backspace
* `\t` - Horizontal tab
* `\n` - Line feed (New lines)
* `\v` - Vertical tab
* `\f` - Form feed
* `\r` - Carriage return
* `\e` - Command Escape
* `\#` - Returns the `#` character itself, skipping interpolation
* `\xNN` - A byte represented by the hexadecimal `NN`
* `\uNNNN` - A Unicode code point represented by `NNNN`
Note it is generally not advised to use `\xNN` in Elixir
strings, as introducing an invalid byte sequence would
make the string invalid. If you have to introduce a
character by its hexadecimal representation, it is best
to work with Unicode code points, such as `\uNNNN`. In fact,
understanding Unicode code points can be essential when doing
low-level manipulations of strings, so let's explore them in
detail next.
## Code points and grapheme cluster
The functions in this module act according to the Unicode
Standard, version 13.0.0.
As per the standard, a code point is a single Unicode Character,
which may be represented by one or more bytes.
For example, although the code point "é" is a single character,
its underlying representation uses two bytes:
iex> String.length("é")
1
iex> byte_size("é")
2
Furthermore, this module also presents the concept of grapheme cluster
(from now on referenced as graphemes). Graphemes can consist of multiple
code points that may be perceived as a single character by readers. For
example, "é" can be represented either as a single "e with acute" code point
or as the letter "e" followed by a "combining acute accent" (two code points):
iex> string = "\u0065\u0301"
iex> byte_size(string)
3
iex> String.length(string)
1
iex> String.codepoints(string)
["e", "́"]
iex> String.graphemes(string)
["é"]
Although the example above is made of two characters, it is
perceived by users as one.
Graphemes can also be two characters that are interpreted
as one by some languages. For example, some languages may
consider "ch" as a single character. However, since this
information depends on the locale, it is not taken into account
by this module.
In general, the functions in this module rely on the Unicode
Standard, but do not contain any of the locale specific behaviour.
More information about graphemes can be found in the [Unicode
Standard Annex #29](https://www.unicode.org/reports/tr29/).
For converting a binary to a different encoding and for Unicode
normalization mechanisms, see Erlang's `:unicode` module.
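As a sketch, `:unicode.characters_to_binary/3` can re-encode a UTF-8
string into another encoding (Latin-1 is chosen here because it can
represent "é" in a single byte):

```elixir
# Re-encode a UTF-8 binary as Latin-1 using Erlang's :unicode module.
# "é" is a two-byte sequence in UTF-8 but a single byte (233) in Latin-1.
:unicode.characters_to_binary("héllo", :utf8, :latin1)
#=> <<104, 233, 108, 108, 111>>
```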
## String and binary operations
To act according to the Unicode Standard, many functions
in this module run in linear time, as they need to traverse
the whole string considering the proper Unicode code points.
For example, `String.length/1` will take longer as
the input grows. On the other hand, `Kernel.byte_size/1` always runs
in constant time (i.e. regardless of the input size).
This means often there are performance costs in using the
functions in this module, compared to the more low-level
operations that work directly with binaries:
* `Kernel.binary_part/3` - retrieves part of the binary
* `Kernel.bit_size/1` and `Kernel.byte_size/1` - size related functions
* `Kernel.is_bitstring/1` and `Kernel.is_binary/1` - type-check function
* Plus a number of functions for working with binaries (bytes)
in the [`:binary` module](`:binary`)
There are many situations where using the `String` module can
be avoided in favor of binary functions or pattern matching.
For example, imagine you have a string `prefix` and you want to
remove this prefix from another string named `full`.
One may be tempted to write:
iex> take_prefix = fn full, prefix ->
...> base = String.length(prefix)
...> String.slice(full, base, String.length(full) - base)
...> end
iex> take_prefix.("Mr. John", "Mr. ")
"John"
Although the function above works, it performs poorly. To
calculate the length of the string, we need to traverse it
fully, so we traverse both `prefix` and `full` strings, then
slice the `full` one, traversing it again.
A first attempt at improving it could be with ranges:
iex> take_prefix = fn full, prefix ->
...> base = String.length(prefix)
...> String.slice(full, base..-1)
...> end
iex> take_prefix.("Mr. John", "Mr. ")
"John"
While this is much better (we don't traverse `full` twice),
it could still be improved. In this case, since we want to
extract a substring from a string, we can use `Kernel.byte_size/1`
and `Kernel.binary_part/3` as there is no chance we will slice in
the middle of a code point made of more than one byte:
iex> take_prefix = fn full, prefix ->
...> base = byte_size(prefix)
...> binary_part(full, base, byte_size(full) - base)
...> end
iex> take_prefix.("Mr. John", "Mr. ")
"John"
Or simply use pattern matching:
iex> take_prefix = fn full, prefix ->
...> base = byte_size(prefix)
...> <<_::binary-size(base), rest::binary>> = full
...> rest
...> end
iex> take_prefix.("Mr. John", "Mr. ")
"John"
On the other hand, if you want to dynamically slice a string
based on an integer value, then using `String.slice/3` is the
best option as it guarantees we won't incorrectly split a valid
code point into multiple bytes.
## Integer code points
Although code points are represented as integers, this module
represents code points in their encoded format as strings.
For example:
iex> String.codepoints("olá")
["o", "l", "á"]
There are a couple of ways to retrieve the character code point.
One may use the `?` construct:
iex> ?o
111
iex> ?á
225
Or also via pattern matching:
iex> <<aacute::utf8>> = "á"
iex> aacute
225
As we have seen above, code points can be inserted into
a string by their hexadecimal code:
iex> "ol\u00E1"
"olá"
Finally, to convert a String into a list of integer
code points, known as "charlists" in Elixir, you can call
`String.to_charlist`:
iex> String.to_charlist("olá")
[111, 108, 225]
## Self-synchronization
The UTF-8 encoding is self-synchronizing. This means that
if malformed data (i.e., data that is not possible according
to the definition of the encoding) is encountered, only one
code point needs to be rejected.
This module relies on this behaviour to ignore such invalid
characters. For example, `length/1` will return
a correct result even if an invalid code point is fed into it.
In other words, this module expects invalid data to be detected
elsewhere, usually when retrieving data from the external source.
For example, a driver that reads strings from a database will be
responsible to check the validity of the encoding. `String.chunk/2`
can be used for breaking a string into valid and invalid parts.
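For instance, because each invalid byte is handled on its own,
`String.length/1` still returns a count instead of raising (a small
sketch):

```elixir
# 0xFF can never appear in well-formed UTF-8; String treats the lone
# invalid byte as its own unit, so the count below is "a" + "b" + 0xFF.
String.length(<<"ab", 0xFF>>)
#=> 3
```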
## Compile binary patterns
Many functions in this module work with patterns. For example,
`String.split/3` can split a string into multiple strings given
a pattern. This pattern can be a string, a list of strings or
a compiled pattern:
iex> String.split("foo bar", " ")
["foo", "bar"]
iex> String.split("foo bar!", [" ", "!"])
["foo", "bar", ""]
iex> pattern = :binary.compile_pattern([" ", "!"])
iex> String.split("foo bar!", pattern)
["foo", "bar", ""]
The compiled pattern is useful when the same match will
be done over and over again. Note though that the compiled
pattern cannot be stored in a module attribute as the pattern
is generated at runtime and does not survive compile time.
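For example, the pattern can instead be compiled at runtime inside the
function that uses it (a sketch; the module and function names below are
illustrative):

```elixir
defmodule WordSplitter do
  # Compile the pattern at runtime, inside the function (or once in a
  # process that keeps it in its state), not in a module attribute.
  def split(string) do
    pattern = :binary.compile_pattern([" ", ","])
    String.split(string, pattern)
  end
end

WordSplitter.split("1,2 3")
#=> ["1", "2", "3"]
```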
"""
@typedoc """
A UTF-8 encoded binary.
The types `String.t()` and `binary()` are equivalent from the perspective
of analysis tools. However, for those reading the documentation, `String.t()`
implies it is a UTF-8 encoded binary.
"""
@type t :: binary
@typedoc "A single Unicode code point encoded in UTF-8. It may be one or more bytes."
@type codepoint :: t
@typedoc "Multiple code points that may be perceived as a single character by readers"
@type grapheme :: t
@typedoc "Pattern used in functions like `replace/4` and `split/3`"
@type pattern :: t | [t] | :binary.cp()
@conditional_mappings [:greek, :turkic]
@doc """
Checks if a string contains only printable characters up to `character_limit`.
Takes an optional `character_limit` as a second argument. If `character_limit` is `0`, this
function will return `true`.
## Examples
iex> String.printable?("abc")
true
iex> String.printable?("abc" <> <<0>>)
false
iex> String.printable?("abc" <> <<0>>, 2)
true
iex> String.printable?("abc" <> <<0>>, 0)
true
"""
@spec printable?(t, 0) :: true
@spec printable?(t, pos_integer | :infinity) :: boolean
def printable?(string, character_limit \\ :infinity)
when is_binary(string) and
(character_limit == :infinity or
(is_integer(character_limit) and character_limit >= 0)) do
recur_printable?(string, character_limit)
end
defp recur_printable?(_string, 0), do: true
defp recur_printable?(<<>>, _character_limit), do: true
for char <- 0x20..0x7E do
defp recur_printable?(<<unquote(char), rest::binary>>, character_limit) do
recur_printable?(rest, decrement(character_limit))
end
end
for char <- '\n\r\t\v\b\f\e\d\a' do
defp recur_printable?(<<unquote(char), rest::binary>>, character_limit) do
recur_printable?(rest, decrement(character_limit))
end
end
defp recur_printable?(<<char::utf8, rest::binary>>, character_limit)
when char in 0xA0..0xD7FF
when char in 0xE000..0xFFFD
when char in 0x10000..0x10FFFF do
recur_printable?(rest, decrement(character_limit))
end
defp recur_printable?(_string, _character_limit) do
false
end
defp decrement(:infinity), do: :infinity
defp decrement(character_limit), do: character_limit - 1
@doc ~S"""
Divides a string into substrings at each Unicode whitespace
occurrence with leading and trailing whitespace ignored. Groups
of whitespace are treated as a single occurrence. Divisions do
not occur on non-breaking whitespace.
## Examples
iex> String.split("foo bar")
["foo", "bar"]
iex> String.split("foo" <> <<194, 133>> <> "bar")
["foo", "bar"]
iex> String.split(" foo bar ")
["foo", "bar"]
iex> String.split("no\u00a0break")
["no\u00a0break"]
"""
@spec split(t) :: [t]
defdelegate split(binary), to: String.Break
@doc ~S"""
Divides a string into parts based on a pattern.
Returns a list of these parts.
The `pattern` may be a string, a list of strings, a regular expression, or a
compiled pattern.
The string is split into as many parts as possible by
default, but can be controlled via the `:parts` option.
Empty strings are only removed from the result if the
`:trim` option is set to `true`.
When the pattern used is a regular expression, the string is
split using `Regex.split/3`.
## Options
* `:parts` (positive integer or `:infinity`) - the string
is split into at most as many parts as this option specifies.
If `:infinity`, the string will be split into all possible
parts. Defaults to `:infinity`.
* `:trim` (boolean) - if `true`, empty strings are removed from
the resulting list.
This function also accepts all options accepted by `Regex.split/3`
if `pattern` is a regular expression.
## Examples
Splitting with a string pattern:
iex> String.split("a,b,c", ",")
["a", "b", "c"]
iex> String.split("a,b,c", ",", parts: 2)
["a", "b,c"]
iex> String.split(" a b c ", " ", trim: true)
["a", "b", "c"]
A list of patterns:
iex> String.split("1,2 3,4", [" ", ","])
["1", "2", "3", "4"]
A regular expression:
iex> String.split("a,b,c", ~r{,})
["a", "b", "c"]
iex> String.split("a,b,c", ~r{,}, parts: 2)
["a", "b,c"]
iex> String.split(" a b c ", ~r{\s}, trim: true)
["a", "b", "c"]
iex> String.split("abc", ~r{b}, include_captures: true)
["a", "b", "c"]
A compiled pattern:
iex> pattern = :binary.compile_pattern([" ", ","])
iex> String.split("1,2 3,4", pattern)
["1", "2", "3", "4"]
Splitting on empty string returns graphemes:
iex> String.split("abc", "")
["", "a", "b", "c", ""]
iex> String.split("abc", "", trim: true)
["a", "b", "c"]
iex> String.split("abc", "", parts: 1)
["abc"]
iex> String.split("abc", "", parts: 3)
["", "a", "bc"]
Be aware that this function can split within or across grapheme boundaries.
For example, take the grapheme "é" which is made of the characters
"e" and the acute accent. The following will split the string into two parts:
iex> String.split(String.normalize("é", :nfd), "e")
["", "́"]
However, if "é" is represented by the single character "e with acute"
accent, then it will split the string into just one part:
iex> String.split(String.normalize("é", :nfc), "e")
["é"]
"""
@spec split(t, pattern | Regex.t(), keyword) :: [t]
def split(string, pattern, options \\ [])
def split(string, %Regex{} = pattern, options) when is_binary(string) and is_list(options) do
Regex.split(pattern, string, options)
end
def split(string, "", options) when is_binary(string) and is_list(options) do
parts = Keyword.get(options, :parts, :infinity)
index = parts_to_index(parts)
trim = Keyword.get(options, :trim, false)
if trim == false and index != 1 do
["" | split_empty(string, trim, index - 1)]
else
split_empty(string, trim, index)
end
end
def split(string, pattern, options) when is_binary(string) and is_list(options) do
parts = Keyword.get(options, :parts, :infinity)
trim = Keyword.get(options, :trim, false)
case {parts, trim} do
{:infinity, false} ->
:binary.split(string, pattern, [:global])
_ ->
pattern = maybe_compile_pattern(pattern)
split_each(string, pattern, trim, parts_to_index(parts))
end
end
defp parts_to_index(:infinity), do: 0
defp parts_to_index(n) when is_integer(n) and n > 0, do: n
defp split_empty("", true, 1), do: []
defp split_empty(string, _, 1), do: [string]
defp split_empty(string, trim, count) do
case next_grapheme(string) do
{h, t} -> [h | split_empty(t, trim, count - 1)]
nil -> split_empty("", trim, 1)
end
end
defp split_each("", _pattern, true, 1), do: []
defp split_each(string, _pattern, _trim, 1) when is_binary(string), do: [string]
defp split_each(string, pattern, trim, count) do
case do_splitter(string, pattern, trim) do
{h, t} -> [h | split_each(t, pattern, trim, count - 1)]
nil -> []
end
end
@doc """
Returns an enumerable that splits a string on demand.
This is in contrast to `split/3` which splits the
entire string upfront.
This function does not support regular expressions
by design. When using regular expressions, it is often
more efficient to have the regular expressions traverse
the string at once than in parts, like this function does.
## Options
* `:trim` (boolean) - when `true`, does not emit empty patterns
## Examples
iex> String.splitter("1,2 3,4 5,6 7,8,...,99999", [" ", ","]) |> Enum.take(4)
["1", "2", "3", "4"]
iex> String.splitter("abcd", "") |> Enum.take(10)
["", "a", "b", "c", "d", ""]
iex> String.splitter("abcd", "", trim: true) |> Enum.take(10)
["a", "b", "c", "d"]
A compiled pattern can also be given:
iex> pattern = :binary.compile_pattern([" ", ","])
iex> String.splitter("1,2 3,4 5,6 7,8,...,99999", pattern) |> Enum.take(4)
["1", "2", "3", "4"]
"""
@spec splitter(t, pattern, keyword) :: Enumerable.t()
def splitter(string, pattern, options \\ [])
def splitter(string, "", options) when is_binary(string) and is_list(options) do
if Keyword.get(options, :trim, false) do
Stream.unfold(string, &next_grapheme/1)
else
Stream.unfold(:match, &do_empty_splitter(&1, string))
end
end
def splitter(string, pattern, options) when is_binary(string) and is_list(options) do
pattern = maybe_compile_pattern(pattern)
trim = Keyword.get(options, :trim, false)
Stream.unfold(string, &do_splitter(&1, pattern, trim))
end
defp do_empty_splitter(:match, string), do: {"", string}
defp do_empty_splitter(:nomatch, _string), do: nil
defp do_empty_splitter("", _), do: {"", :nomatch}
defp do_empty_splitter(string, _), do: next_grapheme(string)
defp do_splitter(:nomatch, _pattern, _), do: nil
defp do_splitter("", _pattern, false), do: {"", :nomatch}
defp do_splitter("", _pattern, true), do: nil
defp do_splitter(bin, pattern, trim) do
case :binary.split(bin, pattern) do
["", second] when trim -> do_splitter(second, pattern, trim)
[first, second] -> {first, second}
[first] -> {first, :nomatch}
end
end
defp maybe_compile_pattern(pattern) when is_tuple(pattern), do: pattern
defp maybe_compile_pattern(pattern), do: :binary.compile_pattern(pattern)
@doc """
Splits a string into two parts at the specified offset. When the offset given
is negative, the location is counted from the end of the string.
The offset is capped to the length of the string. Returns a tuple with
two elements.
Note: keep in mind this function splits on graphemes, and as such it
has to linearly traverse the string. If you want to split a string or
a binary based on the number of bytes, use `Kernel.binary_part/3`
instead.
## Examples
iex> String.split_at("sweetelixir", 5)
{"sweet", "elixir"}
iex> String.split_at("sweetelixir", -6)
{"sweet", "elixir"}
iex> String.split_at("abc", 0)
{"", "abc"}
iex> String.split_at("abc", 1000)
{"abc", ""}
iex> String.split_at("abc", -1000)
{"", "abc"}
"""
@spec split_at(t, integer) :: {t, t}
def split_at(string, position)
def split_at(string, position)
when is_binary(string) and is_integer(position) and position >= 0 do
do_split_at(string, position)
end
def split_at(string, position)
when is_binary(string) and is_integer(position) and position < 0 do
position = length(string) + position
case position >= 0 do
true -> do_split_at(string, position)
false -> {"", string}
end
end
defp do_split_at(string, position) do
{byte_size, rest} = String.Unicode.split_at(string, position)
{binary_part(string, 0, byte_size), rest || ""}
end
@doc ~S"""
Returns `true` if `string1` is canonically equivalent to `string2`.
It performs Normalization Form Canonical Decomposition (NFD) on the
strings before comparing them. This function is equivalent to:
String.normalize(string1, :nfd) == String.normalize(string2, :nfd)
If you plan to compare multiple strings, multiple times in a row, you
may normalize them upfront and compare them directly to avoid multiple
normalization passes.
## Examples
iex> String.equivalent?("abc", "abc")
true
iex> String.equivalent?("man\u0303ana", "mañana")
true
iex> String.equivalent?("abc", "ABC")
false
iex> String.equivalent?("nø", "nó")
false
"""
@spec equivalent?(t, t) :: boolean
def equivalent?(string1, string2) when is_binary(string1) and is_binary(string2) do
normalize(string1, :nfd) == normalize(string2, :nfd)
end
@doc """
Converts all characters in `string` to Unicode normalization
form identified by `form`.
Invalid Unicode code points are skipped and the remainder of
the string is converted.
and return on invalid codepoint, use `:unicode.characters_to_nfd_binary/1`,
`:unicode.characters_to_nfc_binary/1`, `:unicode.characters_to_nfkd_binary/1`,
and `:unicode.characters_to_nfkc_binary/1` instead.
Normalization forms `:nfkc` and `:nfkd` should not be blindly applied
to arbitrary text. Because they erase many formatting distinctions,
they will prevent round-trip conversion to and from many legacy
character sets.
## Forms
The supported forms are:
* `:nfd` - Normalization Form Canonical Decomposition.
Characters are decomposed by canonical equivalence, and
multiple combining characters are arranged in a specific
order.
* `:nfc` - Normalization Form Canonical Composition.
Characters are decomposed and then recomposed by canonical equivalence.
* `:nfkd` - Normalization Form Compatibility Decomposition.
Characters are decomposed by compatibility equivalence, and
multiple combining characters are arranged in a specific
order.
* `:nfkc` - Normalization Form Compatibility Composition.
Characters are decomposed and then recomposed by compatibility equivalence.
## Examples
iex> String.normalize("yêṩ", :nfd)
"yêṩ"
iex> String.normalize("leña", :nfc)
"leña"
iex> String.normalize("fi", :nfkd)
"fi"
iex> String.normalize("fi", :nfkc)
"fi"
"""
def normalize(string, form)
def normalize(string, :nfd) when is_binary(string) do
case :unicode.characters_to_nfd_binary(string) do
string when is_binary(string) -> string
{:error, good, <<head, rest::binary>>} -> good <> <<head>> <> normalize(rest, :nfd)
end
end
def normalize(string, :nfc) when is_binary(string) do
case :unicode.characters_to_nfc_binary(string) do
string when is_binary(string) -> string
{:error, good, <<head, rest::binary>>} -> good <> <<head>> <> normalize(rest, :nfc)
end
end
def normalize(string, :nfkd) when is_binary(string) do
case :unicode.characters_to_nfkd_binary(string) do
string when is_binary(string) -> string
{:error, good, <<head, rest::binary>>} -> good <> <<head>> <> normalize(rest, :nfkd)
end
end
def normalize(string, :nfkc) when is_binary(string) do
case :unicode.characters_to_nfkc_binary(string) do
string when is_binary(string) -> string
{:error, good, <<head, rest::binary>>} -> good <> <<head>> <> normalize(rest, :nfkc)
end
end
@doc """
Converts all characters in the given string to uppercase according to `mode`.
`mode` may be `:default`, `:ascii`, `:greek` or `:turkic`. The `:default` mode considers
all non-conditional transformations outlined in the Unicode standard. `:ascii`
uppercases only the letters a to z. `:greek` includes the context sensitive
mappings found in Greek. `:turkic` properly handles the letter i with the dotless variant.
## Examples
iex> String.upcase("abcd")
"ABCD"
iex> String.upcase("ab 123 xpto")
"AB 123 XPTO"
iex> String.upcase("olá")
"OLÁ"
The `:ascii` mode ignores Unicode characters and provides a more
performant implementation when you know the string contains only
ASCII characters:
iex> String.upcase("olá", :ascii)
"OLá"
And `:turkic` properly handles the letter i with the dotless variant:
iex> String.upcase("ıi")
"II"
iex> String.upcase("ıi", :turkic)
"Iİ"
"""
@spec upcase(t, :default | :ascii | :greek | :turkic) :: t
def upcase(string, mode \\ :default)
def upcase("", _mode) do
""
end
def upcase(string, :default) when is_binary(string) do
String.Casing.upcase(string, [], :default)
end
def upcase(string, :ascii) when is_binary(string) do
IO.iodata_to_binary(upcase_ascii(string))
end
def upcase(string, mode) when is_binary(string) and mode in @conditional_mappings do
String.Casing.upcase(string, [], mode)
end
defp upcase_ascii(<<char, rest::bits>>) when char >= ?a and char <= ?z,
do: [char - 32 | upcase_ascii(rest)]
defp upcase_ascii(<<char, rest::bits>>), do: [char | upcase_ascii(rest)]
defp upcase_ascii(<<>>), do: []
@doc """
Converts all characters in the given string to lowercase according to `mode`.
`mode` may be `:default`, `:ascii`, `:greek` or `:turkic`. The `:default` mode considers
all non-conditional transformations outlined in the Unicode standard. `:ascii`
lowercases only the letters A to Z. `:greek` includes the context sensitive
mappings found in Greek. `:turkic` properly handles the letter i with the dotless variant.
## Examples
iex> String.downcase("ABCD")
"abcd"
iex> String.downcase("AB 123 XPTO")
"ab 123 xpto"
iex> String.downcase("OLÁ")
"olá"
The `:ascii` mode ignores Unicode characters and provides a more
performant implementation when you know the string contains only
ASCII characters:
iex> String.downcase("OLÁ", :ascii)
"olÁ"
The `:greek` mode properly handles the context sensitive sigma in Greek:
iex> String.downcase("ΣΣ")
"σσ"
iex> String.downcase("ΣΣ", :greek)
"σς"
And `:turkic` properly handles the letter i with the dotless variant:
iex> String.downcase("Iİ")
"ii̇"
iex> String.downcase("Iİ", :turkic)
"ıi"
"""
@spec downcase(t, :default | :ascii | :greek | :turkic) :: t
def downcase(string, mode \\ :default)
def downcase("", _mode) do
""
end
def downcase(string, :default) when is_binary(string) do
String.Casing.downcase(string, [], :default)
end
def downcase(string, :ascii) when is_binary(string) do
IO.iodata_to_binary(downcase_ascii(string))
end
def downcase(string, mode) when is_binary(string) and mode in @conditional_mappings do
String.Casing.downcase(string, [], mode)
end
defp downcase_ascii(<<char, rest::bits>>) when char >= ?A and char <= ?Z,
do: [char + 32 | downcase_ascii(rest)]
defp downcase_ascii(<<char, rest::bits>>), do: [char | downcase_ascii(rest)]
defp downcase_ascii(<<>>), do: []
@doc """
Converts the first character in the given string to
uppercase and the remainder to lowercase according to `mode`.
`mode` may be `:default`, `:ascii`, `:greek` or `:turkic`. The `:default` mode considers
all non-conditional transformations outlined in the Unicode standard. `:ascii`
capitalizes only the letters A to Z. `:greek` includes the context sensitive
mappings found in Greek. `:turkic` properly handles the letter i with the dotless variant.
## Examples
iex> String.capitalize("abcd")
"Abcd"
iex> String.capitalize("fin")
"Fin"
iex> String.capitalize("olá")
"Olá"
"""
@spec capitalize(t, :default | :ascii | :greek | :turkic) :: t
def capitalize(string, mode \\ :default)
def capitalize(<<char, rest::binary>>, :ascii) do
char = if char >= ?a and char <= ?z, do: char - 32, else: char
<<char>> <> downcase(rest, :ascii)
end
def capitalize(string, mode) when is_binary(string) do
{char, rest} = String.Casing.titlecase_once(string, mode)
char <> downcase(rest, mode)
end
@doc false
@deprecated "Use String.trim_trailing/1 instead"
defdelegate rstrip(binary), to: String.Break, as: :trim_trailing
@doc false
@deprecated "Use String.trim_trailing/2 with a binary as second argument instead"
def rstrip(string, char) when is_integer(char) do
replace_trailing(string, <<char::utf8>>, "")
end
@doc """
Replaces all leading occurrences of `match` by `replacement` in `string`.
Returns the string untouched if there are no occurrences.
If `match` is `""`, this function raises an `ArgumentError` exception: this
happens because this function replaces **all** the occurrences of `match` at
the beginning of `string`, and it's impossible to replace "multiple"
occurrences of `""`.
## Examples
iex> String.replace_leading("hello world", "hello ", "")
"world"
iex> String.replace_leading("hello hello world", "hello ", "")
"world"
iex> String.replace_leading("hello world", "hello ", "ola ")
"ola world"
iex> String.replace_leading("hello hello world", "hello ", "ola ")
"ola ola world"
"""
@spec replace_leading(t, t, t) :: t
def replace_leading(string, match, replacement)
when is_binary(string) and is_binary(match) and is_binary(replacement) do
if match == "" do
raise ArgumentError, "cannot use an empty string as the match to replace"
end
prefix_size = byte_size(match)
suffix_size = byte_size(string) - prefix_size
replace_leading(string, match, replacement, prefix_size, suffix_size, 0)
end
defp replace_leading(string, match, replacement, prefix_size, suffix_size, acc)
when suffix_size >= 0 do
case string do
<<prefix::size(prefix_size)-binary, suffix::binary>> when prefix == match ->
replace_leading(
suffix,
match,
replacement,
prefix_size,
suffix_size - prefix_size,
acc + 1
)
_ ->
prepend_unless_empty(duplicate(replacement, acc), string)
end
end
defp replace_leading(string, _match, replacement, _prefix_size, _suffix_size, acc) do
prepend_unless_empty(duplicate(replacement, acc), string)
end
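  # Note on the helpers above (illustrative comment): `replace_leading/6`
  # walks the string in `prefix_size`-byte chunks, counting consecutive
  # matches in `acc`, and only materializes the result once via
  # `duplicate(replacement, acc)`. For example,
  # replace_leading("hello hello world", "hello ", "ola ") counts two
  # matches and prepends duplicate("ola ", 2), giving "ola ola world"
  # (as in the doctest above).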
@doc """
Replaces all trailing occurrences of `match` by `replacement` in `string`.
Returns the string untouched if there are no occurrences.
If `match` is `""`, this function raises an `ArgumentError` exception: this
happens because this function replaces **all** the occurrences of `match` at
the end of `string`, and it's impossible to replace "multiple" occurrences of
`""`.
## Examples
iex> String.replace_trailing("hello world", " world", "")
"hello"
iex> String.replace_trailing("hello world world", " world", "")
"hello"
iex> String.replace_trailing("hello world", " world", " mundo")
"hello mundo"
iex> String.replace_trailing("hello world world", " world", " mundo")
"hello mundo mundo"
"""
@spec replace_trailing(t, t, t) :: t
def replace_trailing(string, match, replacement)
when is_binary(string) and is_binary(match) and is_binary(replacement) do
if match == "" do
raise ArgumentError, "cannot use an empty string as the match to replace"
end
suffix_size = byte_size(match)
prefix_size = byte_size(string) - suffix_size
replace_trailing(string, match, replacement, prefix_size, suffix_size, 0)
end
defp replace_trailing(string, match, replacement, prefix_size, suffix_size, acc)
when prefix_size >= 0 do
case string do
<<prefix::size(prefix_size)-binary, suffix::binary>> when suffix == match ->
replace_trailing(
prefix,
match,
replacement,
prefix_size - suffix_size,
suffix_size,
acc + 1
)
_ ->
append_unless_empty(string, duplicate(replacement, acc))
end
end
defp replace_trailing(string, _match, replacement, _prefix_size, _suffix_size, acc) do
append_unless_empty(string, duplicate(replacement, acc))
end
@doc """
Replaces prefix in `string` by `replacement` if it matches `match`.
Returns the string untouched if there is no match. If `match` is an empty
string (`""`), `replacement` is just prepended to `string`.
## Examples
iex> String.replace_prefix("world", "hello ", "")
"world"
iex> String.replace_prefix("hello world", "hello ", "")
"world"
iex> String.replace_prefix("hello hello world", "hello ", "")
"hello world"
iex> String.replace_prefix("world", "hello ", "ola ")
"world"
iex> String.replace_prefix("hello world", "hello ", "ola ")
"ola world"
iex> String.replace_prefix("hello hello world", "hello ", "ola ")
"ola hello world"
iex> String.replace_prefix("world", "", "hello ")
"hello world"
"""
@spec replace_prefix(t, t, t) :: t
def replace_prefix(string, match, replacement)
when is_binary(string) and is_binary(match) and is_binary(replacement) do
prefix_size = byte_size(match)
case string do
<<prefix::size(prefix_size)-binary, suffix::binary>> when prefix == match ->
prepend_unless_empty(replacement, suffix)
_ ->
string
end
end
@doc """
Replaces suffix in `string` by `replacement` if it matches `match`.
Returns the string untouched if there is no match. If `match` is an empty
string (`""`), `replacement` is just appended to `string`.
## Examples
iex> String.replace_suffix("hello", " world", "")
"hello"
iex> String.replace_suffix("hello world", " world", "")
"hello"
iex> String.replace_suffix("hello world world", " world", "")
"hello world"
iex> String.replace_suffix("hello", " world", " mundo")
"hello"
iex> String.replace_suffix("hello world", " world", " mundo")
"hello mundo"
iex> String.replace_suffix("hello world world", " world", " mundo")
"hello world mundo"
iex> String.replace_suffix("hello", "", " world")
"hello world"
"""
@spec replace_suffix(t, t, t) :: t
def replace_suffix(string, match, replacement)
when is_binary(string) and is_binary(match) and is_binary(replacement) do
suffix_size = byte_size(match)
prefix_size = byte_size(string) - suffix_size
case string do
<<prefix::size(prefix_size)-binary, suffix::binary>> when suffix == match ->
append_unless_empty(prefix, replacement)
_ ->
string
end
end
@compile {:inline, prepend_unless_empty: 2, append_unless_empty: 2}
defp prepend_unless_empty("", suffix), do: suffix
defp prepend_unless_empty(prefix, suffix), do: prefix <> suffix
defp append_unless_empty(prefix, ""), do: prefix
defp append_unless_empty(prefix, suffix), do: prefix <> suffix
@doc false
@deprecated "Use String.trim_leading/1 instead"
defdelegate lstrip(binary), to: String.Break, as: :trim_leading
@doc false
@deprecated "Use String.trim_leading/2 with a binary as second argument instead"
def lstrip(string, char) when is_integer(char) do
replace_leading(string, <<char::utf8>>, "")
end
@doc false
@deprecated "Use String.trim/1 instead"
def strip(string) do
trim(string)
end
@doc false
@deprecated "Use String.trim/2 with a binary second argument instead"
def strip(string, char) do
trim(string, <<char::utf8>>)
end
@doc ~S"""
Returns a string where all leading Unicode whitespaces
have been removed.
## Examples
iex> String.trim_leading("\n abc ")
"abc "
"""
@spec trim_leading(t) :: t
defdelegate trim_leading(string), to: String.Break
@doc """
Returns a string where all leading `to_trim` characters have been removed.
## Examples
iex> String.trim_leading("__ abc _", "_")
" abc _"
iex> String.trim_leading("1 abc", "11")
"1 abc"
"""
@spec trim_leading(t, t) :: t
def trim_leading(string, to_trim)
when is_binary(string) and is_binary(to_trim) do
replace_leading(string, to_trim, "")
end
@doc ~S"""
Returns a string where all trailing Unicode whitespaces
have been removed.
## Examples
iex> String.trim_trailing(" abc\n ")
" abc"
"""
@spec trim_trailing(t) :: t
defdelegate trim_trailing(string), to: String.Break
@doc """
Returns a string where all trailing `to_trim` characters have been removed.
## Examples
iex> String.trim_trailing("_ abc __", "_")
"_ abc "
iex> String.trim_trailing("abc 1", "11")
"abc 1"
"""
@spec trim_trailing(t, t) :: t
def trim_trailing(string, to_trim)
when is_binary(string) and is_binary(to_trim) do
replace_trailing(string, to_trim, "")
end
@doc ~S"""
Returns a string where all leading and trailing Unicode whitespaces
have been removed.
## Examples
iex> String.trim("\n abc\n ")
"abc"
"""
@spec trim(t) :: t
def trim(string) when is_binary(string) do
string
|> trim_leading()
|> trim_trailing()
end
@doc """
Returns a string where all leading and trailing `to_trim` characters have been
removed.
## Examples
iex> String.trim("a abc a", "a")
" abc "
"""
@spec trim(t, t) :: t
def trim(string, to_trim) when is_binary(string) and is_binary(to_trim) do
string
|> trim_leading(to_trim)
|> trim_trailing(to_trim)
end
@doc ~S"""
Returns a new string padded with a leading filler
which is made of elements from the `padding`.
Passing a list of strings as `padding` will take one element of the list
for every missing entry. If the list is shorter than the number of inserts,
the filling will start again from the beginning of the list.
Passing a string `padding` is equivalent to passing the list of graphemes in it.
If no `padding` is given, it defaults to whitespace.
When `count` is less than or equal to the length of `string`,
the given `string` is returned.
Raises `ArgumentError` if the given `padding` contains a non-string element.
## Examples
iex> String.pad_leading("abc", 5)
" abc"
iex> String.pad_leading("abc", 4, "12")
"1abc"
iex> String.pad_leading("abc", 6, "12")
"121abc"
iex> String.pad_leading("abc", 5, ["1", "23"])
"123abc"
"""
@spec pad_leading(t, non_neg_integer, t | [t]) :: t
def pad_leading(string, count, padding \\ [" "])
def pad_leading(string, count, padding) when is_binary(padding) do
pad_leading(string, count, graphemes(padding))
end
def pad_leading(string, count, [_ | _] = padding)
when is_binary(string) and is_integer(count) and count >= 0 do
pad(:leading, string, count, padding)
end
@doc ~S"""
Returns a new string padded with a trailing filler
which is made of elements from the `padding`.
Passing a list of strings as `padding` will take one element of the list
for every missing entry. If the list is shorter than the number of inserts,
the filling will start again from the beginning of the list.
Passing a string `padding` is equivalent to passing the list of graphemes in it.
If no `padding` is given, it defaults to whitespace.
When `count` is less than or equal to the length of `string`,
the given `string` is returned.
Raises `ArgumentError` if the given `padding` contains a non-string element.
## Examples
iex> String.pad_trailing("abc", 5)
"abc "
iex> String.pad_trailing("abc", 4, "12")
"abc1"
iex> String.pad_trailing("abc", 6, "12")
"abc121"
iex> String.pad_trailing("abc", 5, ["1", "23"])
"abc123"
"""
@spec pad_trailing(t, non_neg_integer, t | [t]) :: t
def pad_trailing(string, count, padding \\ [" "])
def pad_trailing(string, count, padding) when is_binary(padding) do
pad_trailing(string, count, graphemes(padding))
end
def pad_trailing(string, count, [_ | _] = padding)
when is_binary(string) and is_integer(count) and count >= 0 do
pad(:trailing, string, count, padding)
end
defp pad(kind, string, count, padding) do
string_length = length(string)
if string_length >= count do
string
else
filler = build_filler(count - string_length, padding, padding, 0, [])
case kind do
:leading -> [filler | string]
:trailing -> [string | filler]
end
|> IO.iodata_to_binary()
end
end
defp build_filler(0, _source, _padding, _size, filler), do: filler
defp build_filler(count, source, [], size, filler) do
rem_filler =
rem(count, size)
|> build_filler(source, source, 0, [])
filler =
filler
|> IO.iodata_to_binary()
|> duplicate(div(count, size) + 1)
[filler | rem_filler]
end
defp build_filler(count, source, [elem | rest], size, filler)
when is_binary(elem) do
build_filler(count - 1, source, rest, size + 1, [filler | elem])
end
defp build_filler(_count, _source, [elem | _rest], _size, _filler) do
raise ArgumentError, "expected a string padding element, got: #{inspect(elem)}"
end
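  # Note on `build_filler/5` (illustrative comment): the padding deficit
  # counts *elements* of the padding list, not graphemes, so a
  # multi-grapheme element can overshoot the requested width — in the
  # doctest above, pad_leading("abc", 5, ["1", "23"]) consumes two
  # elements ("1" and "23") for a deficit of 2 and returns "123abc".
  # When the list is exhausted mid-fill, the filler built so far (one
  # full cycle) is duplicated wholesale and only the remainder is
  # rebuilt element by element.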
@doc false
@deprecated "Use String.pad_leading/2 instead"
def rjust(subject, length) do
rjust(subject, length, ?\s)
end
@doc false
@deprecated "Use String.pad_leading/3 with a binary padding instead"
def rjust(subject, length, pad) when is_integer(pad) and is_integer(length) and length >= 0 do
pad(:leading, subject, length, [<<pad::utf8>>])
end
@doc false
@deprecated "Use String.pad_trailing/2 instead"
def ljust(subject, length) do
ljust(subject, length, ?\s)
end
@doc false
@deprecated "Use String.pad_trailing/3 with a binary padding instead"
def ljust(subject, length, pad) when is_integer(pad) and is_integer(length) and length >= 0 do
pad(:trailing, subject, length, [<<pad::utf8>>])
end
@doc ~S"""
Returns a new string created by replacing occurrences of `pattern` in
`subject` with `replacement`.
The `subject` is always a string.
The `pattern` may be a string, a list of strings, a regular expression, or a
compiled pattern.
The `replacement` may be a string or a function that receives the matched
pattern and must return the replacement as a string or iodata.
By default it replaces all occurrences but this behaviour can be controlled
through the `:global` option; see the "Options" section below.
## Options
* `:global` - (boolean) if `true`, all occurrences of `pattern` are replaced
with `replacement`, otherwise only the first occurrence is
replaced. Defaults to `true`
## Examples
iex> String.replace("a,b,c", ",", "-")
"a-b-c"
iex> String.replace("a,b,c", ",", "-", global: false)
"a-b,c"
The pattern may also be a list of strings and the replacement may also
be a function that receives the matches:
iex> String.replace("a,b,c", ["a", "c"], fn <<char>> -> <<char + 1>> end)
"b,b,d"
When the pattern is a regular expression, one can give `\N` or
`\g{N}` in the `replacement` string to access a specific capture in the
regular expression:
iex> String.replace("a,b,c", ~r/,(.)/, ",\\1\\g{1}")
"a,bb,cc"
Note that we had to escape the backslash escape character (i.e., we used `\\N`
instead of just `\N`; the same applies to `\\g{N}`). By
giving `\0`, one can inject the whole match in the replacement string.
A compiled pattern can also be given:
iex> pattern = :binary.compile_pattern(",")
iex> String.replace("a,b,c", pattern, "[]")
"a[]b[]c"
When an empty string is provided as a `pattern`, the function treats it as
an implicit empty string between each grapheme, so the `replacement` is
interspersed throughout `string`. If an empty string is provided as
`replacement`, the `subject` is returned unchanged:
iex> String.replace("ELIXIR", "", ".")
".E.L.I.X.I.R."
iex> String.replace("ELIXIR", "", "")
"ELIXIR"
"""
@spec replace(t, pattern | Regex.t(), t | (t -> t | iodata), keyword) :: t
def replace(subject, pattern, replacement, options \\ [])
when is_binary(subject) and
(is_binary(replacement) or is_function(replacement, 1)) and
is_list(options) do
replace_guarded(subject, pattern, replacement, options)
end
defp replace_guarded(subject, %{__struct__: Regex} = regex, replacement, options) do
Regex.replace(regex, subject, replacement, options)
end
defp replace_guarded(subject, "", "", _) do
subject
end
defp replace_guarded(subject, "", replacement_binary, options)
when is_binary(replacement_binary) do
if Keyword.get(options, :global, true) do
IO.iodata_to_binary([replacement_binary | intersperse_bin(subject, replacement_binary)])
else
replacement_binary <> subject
end
end
defp replace_guarded(subject, "", replacement_fun, options) do
if Keyword.get(options, :global, true) do
IO.iodata_to_binary([replacement_fun.("") | intersperse_fun(subject, replacement_fun)])
else
IO.iodata_to_binary([replacement_fun.("") | subject])
end
end
defp replace_guarded(subject, pattern, replacement, options) do
if insert = Keyword.get(options, :insert_replaced) do
IO.warn(
"String.replace/4 with :insert_replaced option is deprecated. " <>
"Please use :binary.replace/4 instead or pass an anonymous function as replacement"
)
binary_options = if Keyword.get(options, :global) != false, do: [:global], else: []
:binary.replace(subject, pattern, replacement, [insert_replaced: insert] ++ binary_options)
else
matches =
if Keyword.get(options, :global, true) do
:binary.matches(subject, pattern)
else
case :binary.match(subject, pattern) do
:nomatch -> []
match -> [match]
end
end
IO.iodata_to_binary(do_replace(subject, matches, replacement, 0))
end
end
defp intersperse_bin(subject, replacement) do
case next_grapheme(subject) do
{current, rest} -> [current, replacement | intersperse_bin(rest, replacement)]
nil -> []
end
end
defp intersperse_fun(subject, replacement) do
case next_grapheme(subject) do
{current, rest} -> [current, replacement.("") | intersperse_fun(rest, replacement)]
nil -> []
end
end
defp do_replace(subject, [], _, n) do
[binary_part(subject, n, byte_size(subject) - n)]
end
defp do_replace(subject, [{start, length} | matches], replacement, n) do
prefix = binary_part(subject, n, start - n)
middle =
if is_binary(replacement) do
replacement
else
replacement.(binary_part(subject, start, length))
end
[prefix, middle | do_replace(subject, matches, replacement, start + length)]
end
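  # Sketch of `do_replace/4` (illustrative comment): `matches` are
  # `{start, length}` byte offsets from `:binary.matches/2`, and `n`
  # tracks how far into `subject` has already been consumed. For
  # replace("a,b,c", ",", "-") the matches are [{1, 1}, {3, 1}],
  # producing the iodata ["a", "-", "b", "-", "c"].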
@doc ~S"""
Reverses the graphemes in given string.
## Examples
iex> String.reverse("abcd")
"dcba"
iex> String.reverse("hello world")
"dlrow olleh"
iex> String.reverse("hello ∂og")
"go∂ olleh"
Keep in mind reversing the same string twice does
not necessarily yield the original string:
iex> "̀e"
"̀e"
iex> String.reverse("̀e")
"è"
iex> String.reverse(String.reverse("̀e"))
"è"
In the first example the accent is before the vowel, so
it is considered two graphemes. However, when you reverse
it once, you have the vowel followed by the accent, which
becomes one grapheme. Reversing it again will keep it as
one single grapheme.
"""
@spec reverse(t) :: t
def reverse(string) when is_binary(string) do
do_reverse(next_grapheme(string), [])
end
defp do_reverse({grapheme, rest}, acc) do
do_reverse(next_grapheme(rest), [grapheme | acc])
end
defp do_reverse(nil, acc), do: IO.iodata_to_binary(acc)
@compile {:inline, duplicate: 2}
@doc """
Returns a string `subject` repeated `n` times.
Inlined by the compiler.
## Examples
iex> String.duplicate("abc", 0)
""
iex> String.duplicate("abc", 1)
"abc"
iex> String.duplicate("abc", 2)
"abcabc"
"""
@spec duplicate(t, non_neg_integer) :: t
def duplicate(subject, n) when is_binary(subject) and is_integer(n) and n >= 0 do
:binary.copy(subject, n)
end
@doc ~S"""
Returns a list of code points encoded as strings.
To retrieve code points in their natural integer
representation, see `to_charlist/1`. For details about
code points and graphemes, see the `String` module
documentation.
## Examples
iex> String.codepoints("olá")
["o", "l", "á"]
iex> String.codepoints("оптими зации")
["о", "п", "т", "и", "м", "и", " ", "з", "а", "ц", "и", "и"]
iex> String.codepoints("ἅἪῼ")
["ἅ", "Ἢ", "ῼ"]
iex> String.codepoints("\u00e9")
["é"]
iex> String.codepoints("\u0065\u0301")
["e", "́"]
"""
@spec codepoints(t) :: [codepoint]
defdelegate codepoints(string), to: String.Unicode
@doc ~S"""
Returns the next code point in a string.
The result is a tuple with the code point and the
remainder of the string or `nil` in case
the string reached its end.
As with other functions in the `String` module, `next_codepoint/1`
works with binaries that are invalid UTF-8. If the string starts
with a sequence of bytes that is not valid in UTF-8 encoding, the
first element of the returned tuple is a binary with the first byte.
## Examples
iex> String.next_codepoint("olá")
{"o", "lá"}
iex> invalid = "\x80\x80OK" # first two bytes are invalid in UTF-8
iex> {_, rest} = String.next_codepoint(invalid)
{<<128>>, <<128, 79, 75>>}
iex> String.next_codepoint(rest)
{<<128>>, "OK"}
## Comparison with binary pattern matching
Binary pattern matching provides a similar way to decompose
a string:
iex> <<codepoint::utf8, rest::binary>> = "Elixir"
"Elixir"
iex> codepoint
69
iex> rest
"lixir"
though not entirely equivalent because `codepoint` comes as
an integer, and the pattern won't match invalid UTF-8.
Binary pattern matching, however, is simpler and more efficient,
so pick the option that better suits your use case.
"""
@compile {:inline, next_codepoint: 1}
@spec next_codepoint(t) :: {codepoint, t} | nil
defdelegate next_codepoint(string), to: String.Unicode
@doc ~S"""
Checks whether `string` contains only valid characters.
## Examples
iex> String.valid?("a")
true
iex> String.valid?("ø")
true
iex> String.valid?(<<0xFFFF::16>>)
false
iex> String.valid?(<<0xEF, 0xB7, 0x90>>)
true
iex> String.valid?("asd" <> <<0xFFFF::16>>)
false
"""
@spec valid?(t) :: boolean
def valid?(<<string::binary>>), do: valid_utf8?(string)
def valid?(_), do: false
defp valid_utf8?(<<_::utf8, rest::bits>>), do: valid_utf8?(rest)
defp valid_utf8?(<<>>), do: true
defp valid_utf8?(_), do: false
@doc false
@deprecated "Use String.valid?/1 instead"
def valid_character?(string) do
case string do
<<_::utf8>> -> valid?(string)
_ -> false
end
end
@doc ~S"""
Splits the string into chunks of characters that share a common trait.
The trait can be one of two options:
* `:valid` - the string is split into chunks of valid and invalid
character sequences
* `:printable` - the string is split into chunks of printable and
non-printable character sequences
Returns a list of binaries each of which contains only one kind of
characters.
If the given string is empty, an empty list is returned.
## Examples
iex> String.chunk(<<?a, ?b, ?c, 0>>, :valid)
["abc\0"]
iex> String.chunk(<<?a, ?b, ?c, 0, 0xFFFF::utf16>>, :valid)
["abc\0", <<0xFFFF::utf16>>]
iex> String.chunk(<<?a, ?b, ?c, 0, 0x0FFFF::utf8>>, :printable)
["abc", <<0, 0x0FFFF::utf8>>]
"""
@spec chunk(t, :valid | :printable) :: [t]
def chunk(string, trait)
def chunk("", _), do: []
def chunk(string, trait) when is_binary(string) and trait in [:valid, :printable] do
{cp, _} = next_codepoint(string)
pred_fn = make_chunk_pred(trait)
do_chunk(string, pred_fn.(cp), pred_fn)
end
defp do_chunk(string, flag, pred_fn), do: do_chunk(string, [], <<>>, flag, pred_fn)
defp do_chunk(<<>>, acc, <<>>, _, _), do: Enum.reverse(acc)
defp do_chunk(<<>>, acc, chunk, _, _), do: Enum.reverse(acc, [chunk])
defp do_chunk(string, acc, chunk, flag, pred_fn) do
{cp, rest} = next_codepoint(string)
if pred_fn.(cp) != flag do
do_chunk(rest, [chunk | acc], cp, not flag, pred_fn)
else
do_chunk(rest, acc, chunk <> cp, flag, pred_fn)
end
end
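  # Example of the chunking above (illustrative comment):
  # chunk(<<?a, ?b, ?c, 0, 0xFFFF::utf16>>, :valid) starts with flag
  # `true` (?a is valid), accumulates "abc\0", flips the flag at the
  # first invalid sequence, and returns ["abc\0", <<0xFFFF::utf16>>]
  # as in the doctest.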
defp make_chunk_pred(:valid), do: &valid?/1
defp make_chunk_pred(:printable), do: &printable?/1
@doc ~S"""
Returns Unicode graphemes in the string as per the Extended Grapheme
Cluster algorithm.
The algorithm is outlined in the [Unicode Standard Annex #29,
Unicode Text Segmentation](https://www.unicode.org/reports/tr29/).
For details about code points and graphemes, see the `String` module documentation.
## Examples
iex> String.graphemes("Ńaïve")
["Ń", "a", "ï", "v", "e"]
iex> String.graphemes("\u00e9")
["é"]
iex> String.graphemes("\u0065\u0301")
["é"]
"""
@spec graphemes(t) :: [grapheme]
defdelegate graphemes(string), to: String.Unicode
@compile {:inline, next_grapheme: 1, next_grapheme_size: 1}
@doc """
Returns the next grapheme in a string.
The result is a tuple with the grapheme and the
remainder of the string or `nil` in case
the string reached its end.
## Examples
iex> String.next_grapheme("olá")
{"o", "lá"}
iex> String.next_grapheme("")
nil
"""
@spec next_grapheme(t) :: {grapheme, t} | nil
def next_grapheme(binary) when is_binary(binary) do
case next_grapheme_size(binary) do
{size, rest} -> {binary_part(binary, 0, size), rest}
nil -> nil
end
end
@doc """
Returns the size (in bytes) of the next grapheme.
The result is a tuple with the next grapheme size in bytes and
the remainder of the string or `nil` in case the string
reached its end.
## Examples
iex> String.next_grapheme_size("olá")
{1, "lá"}
iex> String.next_grapheme_size("")
nil
"""
@spec next_grapheme_size(t) :: {pos_integer, t} | nil
defdelegate next_grapheme_size(string), to: String.Unicode
@doc """
Returns the first grapheme from a UTF-8 string,
`nil` if the string is empty.
## Examples
iex> String.first("elixir")
"e"
iex> String.first("եոգլի")
"ե"
iex> String.first("")
nil
"""
@spec first(t) :: grapheme | nil
def first(string) when is_binary(string) do
case next_grapheme(string) do
{char, _} -> char
nil -> nil
end
end
@doc """
Returns the last grapheme from a UTF-8 string,
`nil` if the string is empty.
## Examples
iex> String.last("elixir")
"r"
iex> String.last("եոգլի")
"ի"
"""
@spec last(t) :: grapheme | nil
def last(string) when is_binary(string) do
do_last(next_grapheme(string), nil)
end
defp do_last({char, rest}, _) do
do_last(next_grapheme(rest), char)
end
defp do_last(nil, last_char), do: last_char
@doc """
Returns the number of Unicode graphemes in a UTF-8 string.
## Examples
iex> String.length("elixir")
6
iex> String.length("եոգլի")
5
"""
@spec length(t) :: non_neg_integer
defdelegate length(string), to: String.Unicode
@doc """
Returns the grapheme at the `position` of the given UTF-8 `string`.
If `position` is greater than or equal to the length of `string`, then it returns `nil`.
## Examples
iex> String.at("elixir", 0)
"e"
iex> String.at("elixir", 1)
"l"
iex> String.at("elixir", 10)
nil
iex> String.at("elixir", -1)
"r"
iex> String.at("elixir", -10)
nil
"""
@spec at(t, integer) :: grapheme | nil
def at(string, position) when is_binary(string) and is_integer(position) and position >= 0 do
do_at(string, position)
end
def at(string, position) when is_binary(string) and is_integer(position) and position < 0 do
position = length(string) + position
case position >= 0 do
true -> do_at(string, position)
false -> nil
end
end
defp do_at(string, position) do
case String.Unicode.split_at(string, position) do
{_, nil} -> nil
{_, rest} -> first(rest)
end
end
@doc """
Returns a substring starting at the offset `start`, and of the given `length`.
If the offset is greater than string length, then it returns `""`.
Remember this function works with Unicode graphemes and considers
the slices to represent grapheme offsets. If you want to split
on raw bytes, check `Kernel.binary_part/3` instead.
## Examples
iex> String.slice("elixir", 1, 3)
"lix"
iex> String.slice("elixir", 1, 10)
"lixir"
iex> String.slice("elixir", 10, 3)
""
iex> String.slice("elixir", -4, 4)
"ixir"
iex> String.slice("elixir", -10, 3)
""
iex> String.slice("a", 0, 1500)
"a"
iex> String.slice("a", 1, 1500)
""
iex> String.slice("a", 2, 1500)
""
"""
@spec slice(t, integer, non_neg_integer) :: t
def slice(_, _, 0) do
""
end
def slice(string, start, length)
when is_binary(string) and is_integer(start) and is_integer(length) and start >= 0 and
length >= 0 do
case String.Unicode.split_at(string, start) do
{_, nil} ->
""
{start_bytes, rest} ->
{len_bytes, _} = String.Unicode.split_at(rest, length)
binary_part(string, start_bytes, len_bytes)
end
end
def slice(string, start, length)
when is_binary(string) and is_integer(start) and is_integer(length) and start < 0 and
length >= 0 do
start = length(string) + start
case start >= 0 do
true -> slice(string, start, length)
false -> ""
end
end
@doc """
Returns a substring from the offset given by the start of the
range to the offset given by the end of the range.
If the start of the range is not a valid offset for the given
string or if the range is in reverse order, returns `""`.
If the start or end of the range is negative, the whole string
is traversed first in order to convert the negative indices into
positive ones.
Remember this function works with Unicode graphemes and considers
the slices to represent grapheme offsets. If you want to split
on raw bytes, check `Kernel.binary_part/3` instead.
## Examples
iex> String.slice("elixir", 1..3)
"lix"
iex> String.slice("elixir", 1..10)
"lixir"
iex> String.slice("elixir", -4..-1)
"ixir"
iex> String.slice("elixir", -4..6)
"ixir"
For ranges where `start > stop`, you need to explicitly
mark them as increasing:
iex> String.slice("elixir", 2..-1//1)
"ixir"
iex> String.slice("elixir", 1..-2//1)
"lixi"
If values are out of bounds, it returns an empty string:
iex> String.slice("elixir", 10..3)
""
iex> String.slice("elixir", -10..-7)
""
iex> String.slice("a", 0..1500)
"a"
iex> String.slice("a", 1..1500)
""
"""
@spec slice(t, Range.t()) :: t
def slice(string, first..last//step = range) when is_binary(string) do
# TODO: Deprecate negative steps on Elixir v1.16
# TODO: There are two features we can add to slicing ranges:
# 1. We can allow the step to be any positive number
# 2. We can allow slice and reverse at the same time. However, we can't
# implement so right now. First we will have to raise if a decreasing
# range is given on Elixir v2.0.
if step == 1 or (step == -1 and first > last) do
slice_range(string, first, last)
else
raise ArgumentError,
"String.slice/2 does not accept ranges with custom steps, got: #{inspect(range)}"
end
end
defp slice_range("", _, _), do: ""
defp slice_range(string, first, -1) when first >= 0 do
case String.Unicode.split_at(string, first) do
{_, nil} -> ""
{start_bytes, _} -> binary_part(string, start_bytes, byte_size(string) - start_bytes)
end
end
defp slice_range(string, first, last) when first >= 0 and last >= 0 do
if last >= first do
slice(string, first, last - first + 1)
else
""
end
end
defp slice_range(string, first, last) do
{bytes, length} = acc_bytes(next_grapheme_size(string), [], 0)
first = add_if_negative(first, length)
last = add_if_negative(last, length)
if first < 0 or first > last or first > length do
""
else
last = min(last + 1, length)
bytes = Enum.drop(bytes, length - last)
first = last - first
{length_bytes, start_bytes} = split_bytes(bytes, 0, first)
binary_part(string, start_bytes, length_bytes)
end
end
defp acc_bytes({size, rest}, bytes, length) do
acc_bytes(next_grapheme_size(rest), [size | bytes], length + 1)
end
defp acc_bytes(nil, bytes, length) do
{bytes, length}
end
defp add_if_negative(value, to_add) when value < 0, do: value + to_add
defp add_if_negative(value, _to_add), do: value
defp split_bytes(rest, acc, 0), do: {acc, Enum.sum(rest)}
defp split_bytes([], acc, _), do: {acc, 0}
defp split_bytes([head | tail], acc, count), do: split_bytes(tail, head + acc, count - 1)
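  # Note on the negative-range slice above (illustrative comment):
  # `acc_bytes/3` collects grapheme byte sizes in reverse order, so
  # `Enum.drop(bytes, length - last)` trims graphemes from the *end* of
  # the string, and `split_bytes/3` then measures the slice length and
  # the remaining start offset in one pass. For slice("elixir", -4..-1)
  # this resolves to binary_part("elixir", 2, 4), i.e. "ixir"
  # (matching the doctest).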
@doc """
Returns `true` if `string` starts with any of the prefixes given.
`prefix` can be either a string, a list of strings, or a compiled
pattern.
## Examples
iex> String.starts_with?("elixir", "eli")
true
iex> String.starts_with?("elixir", ["erlang", "elixir"])
true
iex> String.starts_with?("elixir", ["erlang", "ruby"])
false
A compiled pattern can also be given:
iex> pattern = :binary.compile_pattern(["erlang", "elixir"])
iex> String.starts_with?("elixir", pattern)
true
An empty string will always match:
iex> String.starts_with?("elixir", "")
true
iex> String.starts_with?("elixir", ["", "other"])
true
"""
@spec starts_with?(t, pattern) :: boolean
def starts_with?(string, prefix) when is_binary(string) and is_binary(prefix) do
starts_with_string?(string, byte_size(string), prefix)
end
def starts_with?(string, prefix) when is_binary(string) and is_list(prefix) do
string_size = byte_size(string)
Enum.any?(prefix, &starts_with_string?(string, string_size, &1))
end
def starts_with?(string, prefix) when is_binary(string) do
Kernel.match?({0, _}, :binary.match(string, prefix))
end
@compile {:inline, starts_with_string?: 3}
defp starts_with_string?(string, string_size, prefix) when is_binary(prefix) do
prefix_size = byte_size(prefix)
if prefix_size <= string_size do
prefix == binary_part(string, 0, prefix_size)
else
false
end
end
@doc """
Returns `true` if `string` ends with any of the suffixes given.
`suffixes` can be either a single suffix or a list of suffixes.
## Examples
iex> String.ends_with?("language", "age")
true
iex> String.ends_with?("language", ["youth", "age"])
true
iex> String.ends_with?("language", ["youth", "elixir"])
false
An empty suffix will always match:
iex> String.ends_with?("language", "")
true
iex> String.ends_with?("language", ["", "other"])
true
"""
@spec ends_with?(t, t | [t]) :: boolean
def ends_with?(string, suffix) when is_binary(string) and is_binary(suffix) do
ends_with_string?(string, byte_size(string), suffix)
end
def ends_with?(string, suffix) when is_binary(string) and is_list(suffix) do
string_size = byte_size(string)
Enum.any?(suffix, &ends_with_string?(string, string_size, &1))
end
@compile {:inline, ends_with_string?: 3}
defp ends_with_string?(string, string_size, suffix) when is_binary(suffix) do
suffix_size = byte_size(suffix)
if suffix_size <= string_size do
suffix == binary_part(string, string_size - suffix_size, suffix_size)
else
false
end
end
@doc """
Checks if `string` matches the given regular expression.
## Examples
iex> String.match?("foo", ~r/foo/)
true
iex> String.match?("bar", ~r/foo/)
false
"""
@spec match?(t, Regex.t()) :: boolean
def match?(string, regex) when is_binary(string) do
Regex.match?(regex, string)
end
@doc """
Checks if `string` contains any of the given `contents`.
`contents` can be either a string, a list of strings,
or a compiled pattern.
## Examples
iex> String.contains?("elixir of life", "of")
true
iex> String.contains?("elixir of life", ["life", "death"])
true
iex> String.contains?("elixir of life", ["death", "mercury"])
false
The argument can also be a compiled pattern:
iex> pattern = :binary.compile_pattern(["life", "death"])
iex> String.contains?("elixir of life", pattern)
true
An empty string will always match:
iex> String.contains?("elixir of life", "")
true
iex> String.contains?("elixir of life", ["", "other"])
true
Be aware that this function can match within or across grapheme boundaries.
For example, take the grapheme "é" which is made of the characters
"e" and the acute accent. The following returns `true`:
iex> String.contains?(String.normalize("é", :nfd), "e")
true
However, if "é" is represented by the single character "e with acute"
accent, then it will return `false`:
iex> String.contains?(String.normalize("é", :nfc), "e")
false
"""
@spec contains?(t, pattern) :: boolean
def contains?(string, []) when is_binary(string) do
false
end
def contains?(string, contents) when is_binary(string) and is_list(contents) do
"" in contents or :binary.match(string, contents) != :nomatch
end
def contains?(string, contents) when is_binary(string) do
"" == contents or :binary.match(string, contents) != :nomatch
end
@doc """
Converts a string into a charlist.
Specifically, this function takes a UTF-8 encoded binary and returns a list of its integer
code points. It is similar to `codepoints/1` except that the latter returns a list of code points as
strings.
In case you need to work with bytes, take a look at the
[`:binary` module](`:binary`).
## Examples
iex> String.to_charlist("æß")
'æß'
"""
@spec to_charlist(t) :: charlist
def to_charlist(string) when is_binary(string) do
case :unicode.characters_to_list(string) do
result when is_list(result) ->
result
{:error, encoded, rest} ->
raise UnicodeConversionError, encoded: encoded, rest: rest, kind: :invalid
{:incomplete, encoded, rest} ->
raise UnicodeConversionError, encoded: encoded, rest: rest, kind: :incomplete
end
end
@doc """
Converts a string to an atom.
Warning: this function creates atoms dynamically and atoms are
not garbage-collected. Therefore, `string` should not be an
untrusted value, such as input received from a socket or during
a web request. Consider using `to_existing_atom/1` instead.
By default, the maximum number of atoms is `1_048_576`. This limit
can be raised or lowered using the VM option `+t`.
The maximum atom size is 255 Unicode code points.
Inlined by the compiler.
## Examples
iex> String.to_atom("my_atom")
:my_atom
"""
@spec to_atom(String.t()) :: atom
def to_atom(string) when is_binary(string) do
:erlang.binary_to_atom(string, :utf8)
end
@doc """
Converts a string to an existing atom.
The maximum atom size is 255 Unicode code points.
Inlined by the compiler.
## Examples
iex> _ = :my_atom
iex> String.to_existing_atom("my_atom")
:my_atom
"""
@spec to_existing_atom(String.t()) :: atom
def to_existing_atom(string) when is_binary(string) do
:erlang.binary_to_existing_atom(string, :utf8)
end
@doc """
Returns an integer whose text representation is `string`.
`string` must be the string representation of an integer.
Otherwise, an `ArgumentError` will be raised. If you want
to parse a string that may contain an ill-formatted integer,
use `Integer.parse/1`.
Inlined by the compiler.
## Examples
iex> String.to_integer("123")
123
Passing a string that does not represent an integer leads to an error:
String.to_integer("invalid data")
** (ArgumentError) argument error
"""
@spec to_integer(String.t()) :: integer
def to_integer(string) when is_binary(string) do
:erlang.binary_to_integer(string)
end
@doc """
Returns an integer whose text representation is `string` in base `base`.
Inlined by the compiler.
## Examples
iex> String.to_integer("3FF", 16)
1023
"""
@spec to_integer(String.t(), 2..36) :: integer
def to_integer(string, base) when is_binary(string) and is_integer(base) do
:erlang.binary_to_integer(string, base)
end
@doc """
Returns a float whose text representation is `string`.
`string` must be the string representation of a float including a decimal point.
To parse a string without a decimal point as a float, use `Float.parse/1`
instead. Otherwise, an `ArgumentError` will be raised.
Inlined by the compiler.
## Examples
iex> String.to_float("2.2017764e+0")
2.2017764
iex> String.to_float("3.0")
3.0
String.to_float("3")
** (ArgumentError) argument error
"""
@spec to_float(String.t()) :: float
def to_float(string) when is_binary(string) do
:erlang.binary_to_float(string)
end
@doc """
Computes the bag distance between two strings.
Returns a float value between 0 and 1 representing the bag
distance between `string1` and `string2`.
The bag distance is meant to be an efficient approximation
of the distance between two strings to quickly rule out strings
that are largely different.
The algorithm is outlined in the "String Matching with Metric
Trees Using an Approximate Distance" paper by Ilaria Bartolini,
Paolo Ciaccia, and Marco Patella.
## Examples
iex> String.bag_distance("abc", "")
0.0
iex> String.bag_distance("abcd", "a")
0.25
iex> String.bag_distance("abcd", "ab")
0.5
iex> String.bag_distance("abcd", "abc")
0.75
iex> String.bag_distance("abcd", "abcd")
1.0
"""
@spec bag_distance(t, t) :: float
@doc since: "1.8.0"
def bag_distance(_string, ""), do: 0.0
def bag_distance("", _string), do: 0.0
def bag_distance(string1, string2) when is_binary(string1) and is_binary(string2) do
{bag1, length1} = string_to_bag(string1, %{}, 0)
{bag2, length2} = string_to_bag(string2, %{}, 0)
diff1 = bag_difference(bag1, bag2)
diff2 = bag_difference(bag2, bag1)
1 - max(diff1, diff2) / max(length1, length2)
end
defp string_to_bag(string, bag, length) do
case next_grapheme(string) do
{char, rest} ->
bag =
case bag do
%{^char => current} -> %{bag | char => current + 1}
%{} -> Map.put(bag, char, 1)
end
string_to_bag(rest, bag, length + 1)
nil ->
{bag, length}
end
end
defp bag_difference(bag1, bag2) do
Enum.reduce(bag1, 0, fn {char, count1}, sum ->
case bag2 do
%{^char => count2} -> sum + max(count1 - count2, 0)
%{} -> sum + count1
end
end)
end
@doc """
Computes the Jaro distance (similarity) between two strings.
Returns a float value between `0.0` (no similarity) and `1.0` (an exact
match) representing the [Jaro](https://en.wikipedia.org/wiki/Jaro-Winkler_distance)
distance between `string1` and `string2`.
The Jaro distance metric is designed and best suited for short
strings such as person names. Elixir itself uses this function
to provide the "did you mean?" functionality. For instance, when you
are calling a function in a module and you have a typo in the
function name, we attempt to suggest the most similar function
name available, if any, based on the `jaro_distance/2` score.
## Examples
iex> String.jaro_distance("Dwayne", "Duane")
0.8222222222222223
iex> String.jaro_distance("even", "odd")
0.0
iex> String.jaro_distance("same", "same")
1.0
"""
@spec jaro_distance(t, t) :: float
def jaro_distance(string1, string2)
def jaro_distance(string, string), do: 1.0
def jaro_distance(_string, ""), do: 0.0
def jaro_distance("", _string), do: 0.0
def jaro_distance(string1, string2) when is_binary(string1) and is_binary(string2) do
{chars1, len1} = chars_and_length(string1)
{chars2, len2} = chars_and_length(string2)
case match(chars1, len1, chars2, len2) do
{0, _trans} ->
0.0
{comm, trans} ->
(comm / len1 + comm / len2 + (comm - trans) / comm) / 3
end
end
@compile {:inline, chars_and_length: 1}
defp chars_and_length(string) do
chars = graphemes(string)
{chars, Kernel.length(chars)}
end
defp match(chars1, len1, chars2, len2) do
if len1 < len2 do
match(chars1, chars2, div(len2, 2) - 1)
else
match(chars2, chars1, div(len1, 2) - 1)
end
end
defp match(chars1, chars2, lim) do
match(chars1, chars2, {0, lim}, {0, 0, -1}, 0)
end
defp match([char | rest], chars, range, state, idx) do
{chars, state} = submatch(char, chars, range, state, idx)
case range do
{lim, lim} -> match(rest, tl(chars), range, state, idx + 1)
{pre, lim} -> match(rest, chars, {pre + 1, lim}, state, idx + 1)
end
end
defp match([], _, _, {comm, trans, _}, _), do: {comm, trans}
defp submatch(char, chars, {pre, _} = range, state, idx) do
case detect(char, chars, range) do
nil ->
{chars, state}
{subidx, chars} ->
{chars, proceed(state, idx - pre + subidx)}
end
end
defp detect(char, chars, {pre, lim}) do
detect(char, chars, pre + 1 + lim, 0, [])
end
defp detect(_char, _chars, 0, _idx, _acc), do: nil
defp detect(_char, [], _lim, _idx, _acc), do: nil
defp detect(char, [char | rest], _lim, idx, acc), do: {idx, Enum.reverse(acc, [nil | rest])}
defp detect(char, [other | rest], lim, idx, acc),
do: detect(char, rest, lim - 1, idx + 1, [other | acc])
defp proceed({comm, trans, former}, current) do
if current < former do
{comm + 1, trans + 1, current}
else
{comm + 1, trans, current}
end
end
@doc """
Returns a keyword list that represents an edit script.
Check `List.myers_difference/2` for more information.
## Examples
iex> string1 = "fox hops over the dog"
iex> string2 = "fox jumps over the lazy cat"
iex> String.myers_difference(string1, string2)
[eq: "fox ", del: "ho", ins: "jum", eq: "ps over the ", del: "dog", ins: "lazy cat"]
"""
@doc since: "1.3.0"
@spec myers_difference(t, t) :: [{:eq | :ins | :del, t}]
def myers_difference(string1, string2) when is_binary(string1) and is_binary(string2) do
graphemes(string1)
|> List.myers_difference(graphemes(string2))
|> Enum.map(fn {kind, chars} -> {kind, IO.iodata_to_binary(chars)} end)
end
@doc false
@deprecated "Use String.to_charlist/1 instead"
@spec to_char_list(t) :: charlist
def to_char_list(string), do: String.to_charlist(string)
end
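
The bag distance implemented above can be traced by hand; this sketch only restates arithmetic already guaranteed by the doctests:

```elixir
# "abcd" vs "ab": the bags are %{"a" => 1, "b" => 1, "c" => 1, "d" => 1}
# (length 4) and %{"a" => 1, "b" => 1} (length 2). The asymmetric bag
# differences are 2 ("c" and "d" are unmatched) and 0, so:
#
#   1 - max(2, 0) / max(4, 2) = 1 - 2/4 = 0.5
#
String.bag_distance("abcd", "ab")
# => 0.5
```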
# lib/live_sup/helpers/string_helper.ex (livesup-dev/livesup; Apache-2.0, MIT)
defmodule LiveSup.Helpers.StringHelper do
def truncate(text, opts \\ []) do
max_length = opts[:max_length] || 50
omission = opts[:omission] || "..."
cond do
not String.valid?(text) ->
text
String.length(text) < max_length ->
text
true ->
length_with_omission = max_length - String.length(omission)
"#{String.slice(text, 0, length_with_omission)}#{omission}"
end
end
def find_placeholders(str) do
Regex.scan(~r{\{(.*?)\}}, str) |> Enum.map(&tl/1) |> List.flatten()
end
def keys_to_strings(map) do
for {key, val} <- map, into: %{}, do: {convert_to_string(key), val}
end
defp convert_to_string(value) when is_binary(value), do: value
defp convert_to_string(value) when is_atom(value), do: Atom.to_string(value)
end
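
A usage sketch for the helpers above; the `alias` line is an assumption about the calling context, and the return values follow directly from the code:

```elixir
alias LiveSup.Helpers.StringHelper

# max_length counts the omission, so 7 characters of text survive:
StringHelper.truncate("a very long piece of text", max_length: 10)
# => "a very ..."

StringHelper.find_placeholders("Hello {name}, welcome to {place}")
# => ["name", "place"]

StringHelper.keys_to_strings(%{foo: 1, bar: 2})
# => %{"bar" => 2, "foo" => 1}
```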
# lib/markright/collectors/fuerer.ex (betrybe/markright; MIT)
defmodule Markright.Collectors.Fuerer do
@moduledoc ~S"""
Collector that turns the text of the topmost paragraph or heading into an `:h2` tag, falling back to "★ ★ ★" when none is found.
"""
@behaviour Markright.Collector
@empty_header "★ ★ ★"
def on_ast(%Markright.Continuation{ast: ast} = _cont, acc) do
case ast do
{:article, %{}, [{tag, _, text} | _]}
when tag == :p or tag == :h1 or tag == :h2 or tag == :h3 ->
Keyword.put_new(acc, :header, text)
_ ->
acc
end
end
def afterwards(acc, opts) do
acc = Keyword.put_new(acc, :header, @empty_header)
{:h2, opts[:attrs] || %{}, acc[:header]}
end
end
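
The collector callbacks above can be exercised directly; the `%Markright.Continuation{}` literal is an assumption about Markright's internal struct, inferred only from the pattern match in `on_ast/2`:

```elixir
alias Markright.Collectors.Fuerer

# The first :p/:h1/:h2/:h3 child of the article becomes the header:
Fuerer.on_ast(
  %Markright.Continuation{ast: {:article, %{}, [{:p, %{}, "Title"}]}},
  []
)
# => [header: "Title"]

# With no header collected, afterwards/2 falls back to the ★ ★ ★ placeholder:
Fuerer.afterwards([], [])
# => {:h2, %{}, "★ ★ ★"}
```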
# lib/codes/codes_v15.ex (badubizzle/icd_code; Apache-2.0)
defmodule IcdCode.ICDCode.Codes_V15 do
alias IcdCode.ICDCode
def _V150XXA do
%ICDCode{full_code: "V150XXA",
category_code: "V15",
short_code: "0XXA",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter"
}
end
def _V150XXD do
%ICDCode{full_code: "V150XXD",
category_code: "V15",
short_code: "0XXD",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter"
}
end
def _V150XXS do
%ICDCode{full_code: "V150XXS",
category_code: "V15",
short_code: "0XXS",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in nontraffic accident, sequela"
}
end
def _V151XXA do
%ICDCode{full_code: "V151XXA",
category_code: "V15",
short_code: "1XXA",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter"
}
end
def _V151XXD do
%ICDCode{full_code: "V151XXD",
category_code: "V15",
short_code: "1XXD",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter"
}
end
def _V151XXS do
%ICDCode{full_code: "V151XXS",
category_code: "V15",
short_code: "1XXS",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in nontraffic accident, sequela"
}
end
def _V152XXA do
%ICDCode{full_code: "V152XXA",
category_code: "V15",
short_code: "2XXA",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, initial encounter"
}
end
def _V152XXD do
%ICDCode{full_code: "V152XXD",
category_code: "V15",
short_code: "2XXD",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, subsequent encounter"
}
end
def _V152XXS do
%ICDCode{full_code: "V152XXS",
category_code: "V15",
short_code: "2XXS",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, sequela",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in nontraffic accident, sequela"
}
end
def _V153XXA do
%ICDCode{full_code: "V153XXA",
category_code: "V15",
short_code: "3XXA",
full_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, initial encounter",
short_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, initial encounter",
category_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, initial encounter"
}
end
def _V153XXD do
%ICDCode{full_code: "V153XXD",
category_code: "V15",
short_code: "3XXD",
full_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, subsequent encounter",
short_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, subsequent encounter",
category_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, subsequent encounter"
}
end
def _V153XXS do
%ICDCode{full_code: "V153XXS",
category_code: "V15",
short_code: "3XXS",
full_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, sequela",
short_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, sequela",
category_name: "Person boarding or alighting a pedal cycle injured in collision with railway train or railway vehicle, sequela"
}
end
def _V154XXA do
%ICDCode{full_code: "V154XXA",
category_code: "V15",
short_code: "4XXA",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, initial encounter"
}
end
def _V154XXD do
%ICDCode{full_code: "V154XXD",
category_code: "V15",
short_code: "4XXD",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter"
}
end
def _V154XXS do
%ICDCode{full_code: "V154XXS",
category_code: "V15",
short_code: "4XXS",
full_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, sequela",
short_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, sequela",
category_name: "Pedal cycle driver injured in collision with railway train or railway vehicle in traffic accident, sequela"
}
end
def _V155XXA do
%ICDCode{full_code: "V155XXA",
category_code: "V15",
short_code: "5XXA",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, initial encounter"
}
end
def _V155XXD do
%ICDCode{full_code: "V155XXD",
category_code: "V15",
short_code: "5XXD",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter"
}
end
def _V155XXS do
%ICDCode{full_code: "V155XXS",
category_code: "V15",
short_code: "5XXS",
full_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, sequela",
short_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, sequela",
category_name: "Pedal cycle passenger injured in collision with railway train or railway vehicle in traffic accident, sequela"
}
end
def _V159XXA do
%ICDCode{full_code: "V159XXA",
category_code: "V15",
short_code: "9XXA",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, initial encounter",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, initial encounter"
}
end
def _V159XXD do
%ICDCode{full_code: "V159XXD",
category_code: "V15",
short_code: "9XXD",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, subsequent encounter"
}
end
def _V159XXS do
%ICDCode{full_code: "V159XXS",
category_code: "V15",
short_code: "9XXS",
full_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, sequela",
short_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, sequela",
category_name: "Unspecified pedal cyclist injured in collision with railway train or railway vehicle in traffic accident, sequela"
}
end
end
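
Each generated function above simply returns a populated `%ICDCode{}` struct, so lookups are plain function calls:

```elixir
code = IcdCode.ICDCode.Codes_V15._V150XXA()
code.full_code     # => "V150XXA"
code.category_code # => "V15"
code.short_code    # => "0XXA"
```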
# apps/core/test/unit/employee/user_role_creator_test.exs (ehealth-ua/ehealth.api; Apache-2.0)
defmodule Core.Unit.Employee.UserRoleCreatorTest do
@moduledoc false
use Core.ConnCase
import Mox
alias Core.Employees.UserRoleCreator
alias Ecto.UUID
test "create/2" do
expect(MithrilMock, :get_roles_by_name, fn _, _ ->
{:ok, %{"data" => []}}
end)
user_id = UUID.generate()
legal_entity = insert(:prm, :legal_entity)
expect(MithrilMock, :get_user_roles, fn _, _, _ ->
{:ok, %{"data" => [%{"user_id" => user_id, "client_id" => legal_entity.id}]}}
end)
insert(:prm, :division)
party = insert(:prm, :party)
insert(:prm, :party_user, party: party, user_id: user_id)
employee = insert(:prm, :employee, party: party, legal_entity: legal_entity)
assert :ok == UserRoleCreator.create(employee, get_headers())
end
defp get_headers do
[
{"x-consumer-id", Ecto.UUID.generate()}
]
end
end
# apps/site/lib/site_web/plugs/rewrite_urls.ex (paulswartz/dotcom; MIT)
defmodule SiteWeb.Plugs.RewriteUrls do
@moduledoc """
Plug to redirect before other kinds of data are loaded.
Currently, only used to redirect from the old Boat-F3 (Hull Ferry) schedule
to the new Boat-F1 (Hingham/Hull Ferry) schedule.
"""
@behaviour Plug
import Plug.Conn
import Phoenix.Controller, only: [redirect: 2]
@impl true
def init([]), do: []
@impl true
def call(conn, _) do
if new_base_url = rewrite_url(conn) do
conn
|> redirect(to: merge_url(new_base_url, conn.query_string))
|> halt
else
conn
end
end
defp rewrite_url(%{path_info: ["schedules", "Boat-F3" | _]} = conn) do
String.replace(conn.request_path, "Boat-F3", "Boat-F1")
end
defp rewrite_url(%{path_info: ["schedules", "62" | _]} = conn) do
String.replace(conn.request_path, "62", "627")
end
defp rewrite_url(%{path_info: ["schedules", "76" | _]} = conn) do
String.replace(conn.request_path, "76", "627")
end
defp rewrite_url(_conn) do
nil
end
defp merge_url(base_url, "") do
base_url
end
defp merge_url(base_url, query_string) do
"#{base_url}?#{query_string}"
end
end
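
A sketch of wiring the plug into a Phoenix router (the router module and pipeline name are hypothetical); a request to `/schedules/Boat-F3?tab=timetable` is then halted and redirected to `/schedules/Boat-F1?tab=timetable`:

```elixir
defmodule SiteWeb.Router do
  use Phoenix.Router

  pipeline :browser do
    # Must run before plugs that load schedule data for the old route IDs.
    plug SiteWeb.Plugs.RewriteUrls
  end
end
```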
# test/phoenix_live_view/integrations/assigns_test.exs (gaslight/live_element; MIT)
defmodule LiveElement.AssignsTest do
use ExUnit.Case, async: true
import Plug.Conn
import Phoenix.ConnTest
import LiveElementTest
alias LiveElementTest.Endpoint
@endpoint Endpoint
setup do
{:ok, conn: Plug.Test.init_test_session(Phoenix.ConnTest.build_conn(), %{})}
end
describe "assign_new" do
test "uses conn.assigns on static render then fetches on connected mount", %{conn: conn} do
user = %{name: "user-from-conn", id: 123}
conn =
conn
|> Plug.Conn.assign(:current_user, user)
|> Plug.Conn.put_session(:user_id, user.id)
|> get("/root")
assert html_response(conn, 200) =~ "root name: user-from-conn"
assert html_response(conn, 200) =~ "child static name: user-from-conn"
{:ok, _, connected_html} = live(conn)
assert connected_html =~ "root name: user-from-root"
assert connected_html =~ "child static name: user-from-root"
end
test "uses assign_new from parent on dynamically added child", %{conn: conn} do
user = %{name: "user-from-conn", id: 123}
{:ok, view, _html} =
conn
|> Plug.Conn.assign(:current_user, user)
|> Plug.Conn.put_session(:user_id, user.id)
|> live("/root")
assert render(view) =~ "child static name: user-from-root"
refute render(view) =~ "child dynamic name"
:ok = GenServer.call(view.pid, {:dynamic_child, :dynamic})
html = render(view)
assert html =~ "child static name: user-from-root"
assert html =~ "child dynamic name: user-from-child"
end
end
describe "temporary assigns" do
test "can be configured with mount options", %{conn: conn} do
{:ok, conf_live, html} =
conn
|> put_session(:opts, temporary_assigns: [description: nil])
|> live("/opts")
assert html =~ "long description. canary"
assert render(conf_live) =~ "long description. canary"
socket = GenServer.call(conf_live.pid, {:exec, fn socket -> {:reply, socket, socket} end})
assert socket.assigns.description == nil
assert socket.assigns.canary == "canary"
end
test "raises with invalid options", %{conn: conn} do
assert_raise Plug.Conn.WrapperError,
~r/invalid option returned from LiveElementTest.OptsLive.mount\/3/,
fn ->
conn
|> put_session(:opts, oops: [:description])
|> live("/opts")
end
end
end
end
# test/helpers/element_with_selectors_test.exs (NeoArcanjo/hound; MIT)
defmodule ElementWithSelectorsTest do
@moduledoc """
A behaviour module for implementing the adapter for a particular journal.
"""
alias Accounting.{Account, Entry, Journal}
@typep account_number :: Accounting.account_number
@callback child_spec(keyword) :: Supervisor.child_spec
@callback setup_accounts(Journal.id, [Account.setup, ...], timeout) :: :ok | {:error, term}
@callback setup_account_conversions(Journal.id, 1..12, pos_integer, [Account.setup, ...], timeout) :: :ok | {:error, term}
@callback list_accounts(Journal.id, timeout) :: {:ok, [account_number]} | {:error, term}
@callback fetch_accounts(Journal.id, [account_number], timeout) :: {:ok, Journal.accounts} | {:error, term}
@callback record_entries(Journal.id, [Entry.t, ...], timeout) :: :ok | {:error, [Entry.Error.t] | term}
@callback record_invoices(Journal.id, [Entry.t, ...], timeout) :: :ok | {:error, [Entry.Error.t] | term}
@callback register_account(Journal.id, account_number, String.t, timeout) :: :ok | {:error, term}
@callback register_categories(Journal.id, [atom], timeout) :: :ok | {:error, term}
@callback start_link(opts :: any) :: Supervisor.on_start
end
| 55.714286 | 124 | 0.698291 |
1c15a0cdf26ba20e7b22ea3271001b360f481aef | 539 | ex | Elixir | lib/ex_bitmex/rest/user/commission.ex | eduardoscottini/ex_bitmex | f8528bd635922e1777a5b01ea4941d625da7396e | [
"MIT"
] | 6 | 2019-02-13T04:05:19.000Z | 2020-12-31T07:40:09.000Z | lib/ex_bitmex/rest/user/commission.ex | eduardoscottini/ex_bitmex | f8528bd635922e1777a5b01ea4941d625da7396e | [
"MIT"
] | 28 | 2021-03-29T06:46:42.000Z | 2022-03-28T11:03:38.000Z | lib/ex_bitmex/rest/user/commission.ex | yurikoval/ex_bitmex | d9492789fb319fbdf78d90a99f7c0e40c95c1885 | [
"MIT"
] | 4 | 2019-05-03T21:27:10.000Z | 2021-01-12T09:26:34.000Z | defmodule ExBitmex.Rest.User.Commission do
alias ExBitmex.Rest
@type credentials :: ExBitmex.Credentials.t() | nil
@type params :: map
@type rate_limit :: ExBitmex.RateLimit.t()
@path "/user/commission"
def get(%ExBitmex.Credentials{} = credentials, params \\ %{}) do
@path
|> Rest.HTTPClient.auth_get(credentials, params)
|> parse_response
end
defp parse_response({:ok, data, rate_limit}) when is_map(data) do
{:ok, data, rate_limit}
end
defp parse_response({:error, _, _} = error), do: error
end
| 24.5 | 67 | 0.682746 |
1c15b14c6500be766d8f3d258ec7f7038ada687d | 5,591 | exs | Elixir | test/helpers/element_with_selectors_test.exs | NeoArcanjo/hound | 31f15d35aafcebc6263c28948f2bc84eefe8892d | [
"MIT"
] | null | null | null | test/helpers/element_with_selectors_test.exs | NeoArcanjo/hound | 31f15d35aafcebc6263c28948f2bc84eefe8892d | [
"MIT"
] | null | null | null | test/helpers/element_with_selectors_test.exs | NeoArcanjo/hound | 31f15d35aafcebc6263c28948f2bc84eefe8892d | [
"MIT"
] | null | null | null | defmodule ElementWithSelectorsTest do
use ExUnit.Case
use Hound.Helpers
hound_session()
test "should get visible text of an element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert visible_text({:class, "example"}) == "Paragraph"
end
test "should raise when passed selector does not match any element" do
navigate_to("http://localhost:9090/page1.html")
assert_raise Hound.NoSuchElementError, fn ->
visible_text({:class, "i-dont-exist"})
end
end
test "should input value into field, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
element = {:name, "username"}
input_into_field(element, "john")
assert attribute_value(element, "value") == "john"
input_into_field(element, "doe")
assert attribute_value(element, "value") == "johndoe"
end
test "should fill a field with a value, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
element = {:name, "username"}
fill_field(element, "johndoe")
assert attribute_value(element, "value") == "johndoe"
fill_field(element, "janedoe")
assert attribute_value(element, "value") == "janedoe"
end
test "should get tag name of element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert tag_name({:name, "username"}) == "input"
end
test "should clear field, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
element = {:name, "username"}
fill_field(element, "johndoe")
assert attribute_value(element, "value") == "johndoe"
clear_field(element)
assert attribute_value(element, "value") == ""
end
test "should return true if item is selected in a checkbox or radio, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
element = {:id, "speed-superpower"}
click(element)
assert selected?(element)
end
test "should return false if item is *not* selected, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert selected?({:id, "speed-flying"}) == false
end
test "Should return true if element is enabled, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert element_enabled?({:name, "username"}) == true
end
test "Should return false if element is *not* enabled, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert element_enabled?({:name, "promocode"}) == false
end
test "should get attribute value of an element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert attribute_value({:class, "example"}, "data-greeting") == "hello"
end
test "should return true when an element has a class, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert has_class?({:class, "example"}, "example")
assert has_class?({:class, "another_example"}, "another_class")
end
test "should return false when an element does not have a class, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
refute has_class?({:class, "example"}, "ex")
refute has_class?({:class, "example"}, "other")
end
test "should return true if an element is displayed, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert element_displayed?({:class, "example"})
end
test "should return false if an element is *not* displayed, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert element_displayed?({:class, "hidden-element"}) == false
end
test "should get an element's location on screen, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
{loc_x, loc_y} = element_location({:class, "example"})
assert is_integer(loc_x) || is_float(loc_x)
assert is_integer(loc_y) || is_float(loc_y)
end
test "should get an element's size, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
size = element_size({:class, "example"})
assert size == {400, 100}
end
test "should get css property of an element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
assert css_property({:class, "container"}, "display") == "block"
end
test "should click on an element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
click({:class, "submit-form"})
assert current_url() == "http://localhost:9090/page2.html"
end
test "should submit a form element, when selector is passed" do
navigate_to("http://localhost:9090/page1.html")
submit_element({:name, "username"})
Process.sleep(50)
assert current_url() == "http://localhost:9090/page2.html"
end
test "should move mouse to an element" do
navigate_to("http://localhost:9090/page1.html")
move_to({:id, "mouse-actions"}, 5, 5)
assert visible_text({:id, "mouse-actions"}) == "Mouse over"
end
test "should mouse down on an element" do
navigate_to("http://localhost:9090/page1.html")
move_to({:id, "mouse-actions"}, 5, 5)
mouse_down()
assert visible_text({:id, "mouse-actions"}) == "Mouse down"
end
test "should mouse up on an element" do
navigate_to("http://localhost:9090/page1.html")
move_to({:id, "mouse-actions"}, 5, 5)
# Mouse up needs a mouse down before
mouse_down()
mouse_up()
assert visible_text({:id, "mouse-actions"}) == "Mouse up"
end
end
| 34.726708 | 98 | 0.688786 |
1c15c193c8a2087ee9d991b7d3e54aa0448a132a | 787 | ex | Elixir | lib/basic_web/controllers/user_session_controller.ex | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 7 | 2021-07-14T15:45:55.000Z | 2022-01-25T11:13:01.000Z | lib/basic_web/controllers/user_session_controller.ex | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 10 | 2021-08-09T15:54:05.000Z | 2022-02-17T04:18:38.000Z | lib/basic_web/controllers/user_session_controller.ex | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 5 | 2021-07-23T05:54:35.000Z | 2022-01-28T04:14:51.000Z | defmodule BasicWeb.UserSessionController do
use BasicWeb, :controller
alias Basic.Accounts
alias BasicWeb.UserAuth
def new(conn, _params) do
render(conn, "new.html", error_message: nil)
end
def create(conn, %{"user" => user_params}) do
%{"email" => email, "password" => password} = user_params
if user = Accounts.get_user_by_email_and_password(email, password) do
UserAuth.log_in_user(conn, user, user_params)
else
# render(conn, "new.html", error_message: "Invalid email or password")
render(conn, "new.html", error_message: "メールアドレスかパスワードが正しくありません")
end
end
def delete(conn, _params) do
conn
# |> put_flash(:info, "Logged out successfully.")
|> put_flash(:info, "ログアウトしました")
|> UserAuth.log_out_user()
end
end
| 27.137931 | 75 | 0.687421 |
1c15f2b2eb5f906a93c36678a7bc9787a2fa43aa | 1,692 | ex | Elixir | clients/dns/lib/google_api/dns/v1/model/operation_dns_key_context.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/dns/lib/google_api/dns/v1/model/operation_dns_key_context.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/dns/lib/google_api/dns/v1/model/operation_dns_key_context.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DNS.V1.Model.OperationDnsKeyContext do
@moduledoc """
## Attributes
* `newValue` (*type:* `GoogleApi.DNS.V1.Model.DnsKey.t`, *default:* `nil`) - The post-operation DnsKey resource.
* `oldValue` (*type:* `GoogleApi.DNS.V1.Model.DnsKey.t`, *default:* `nil`) - The pre-operation DnsKey resource.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:newValue => GoogleApi.DNS.V1.Model.DnsKey.t() | nil,
:oldValue => GoogleApi.DNS.V1.Model.DnsKey.t() | nil
}
field(:newValue, as: GoogleApi.DNS.V1.Model.DnsKey)
field(:oldValue, as: GoogleApi.DNS.V1.Model.DnsKey)
end
defimpl Poison.Decoder, for: GoogleApi.DNS.V1.Model.OperationDnsKeyContext do
def decode(value, options) do
GoogleApi.DNS.V1.Model.OperationDnsKeyContext.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DNS.V1.Model.OperationDnsKeyContext do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 33.84 | 116 | 0.727541 |
1c163dc799075d18f6a1af4a9597fcfb589d4310 | 1,115 | exs | Elixir | config/config.exs | shufo/bom | c37735d5867b2d52996c073b70f9b8a349e5ad8c | [
"MIT"
] | 2 | 2019-11-19T12:48:55.000Z | 2020-01-12T18:59:41.000Z | config/config.exs | shufo/bom | c37735d5867b2d52996c073b70f9b8a349e5ad8c | [
"MIT"
] | 1 | 2021-06-25T15:34:11.000Z | 2021-06-25T15:34:11.000Z | config/config.exs | shufo/bom | c37735d5867b2d52996c073b70f9b8a349e5ad8c | [
"MIT"
] | null | null | null | # This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure your application as:
#
# config :bom, key: :value
#
# and access this configuration in your application as:
#
# Application.get_env(:bom, :key)
#
# You can also configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env}.exs"
| 35.967742 | 73 | 0.749776 |
1c167784d1844a8faaea5ff633bac7eb3b89b822 | 804 | exs | Elixir | apps/day21/mix.exs | jwarwick/aoc_2019 | 04229b86829b72323498b57a6649fcc6f7c96406 | [
"MIT"
] | 2 | 2019-12-21T21:21:04.000Z | 2019-12-27T07:00:19.000Z | apps/day21/mix.exs | jwarwick/aoc_2019 | 04229b86829b72323498b57a6649fcc6f7c96406 | [
"MIT"
] | null | null | null | apps/day21/mix.exs | jwarwick/aoc_2019 | 04229b86829b72323498b57a6649fcc6f7c96406 | [
"MIT"
] | null | null | null | defmodule Day21.MixProject do
use Mix.Project
def project do
[
app: :day21,
version: "0.1.0",
build_path: "../../_build",
config_path: "../../config/config.exs",
deps_path: "../../deps",
lockfile: "../../mix.lock",
elixir: "~> 1.9-dev",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
# Run "mix help deps" to learn about dependencies.
defp deps do
[
# {:dep_from_hexpm, "~> 0.3.0"},
# {:dep_from_git, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"},
# {:sibling_app_in_umbrella, in_umbrella: true}
{:util, in_umbrella: true}
]
end
end
| 22.971429 | 88 | 0.56592 |
1c16b37c0b32e083e548d6510c9bf7dc3ca039b0 | 609 | exs | Elixir | test/views/error_view_test.exs | batmany13/github-ci | e67df76aaeee5e829b923ad09140dc6628ef979b | [
"Apache-2.0"
] | null | null | null | test/views/error_view_test.exs | batmany13/github-ci | e67df76aaeee5e829b923ad09140dc6628ef979b | [
"Apache-2.0"
] | null | null | null | test/views/error_view_test.exs | batmany13/github-ci | e67df76aaeee5e829b923ad09140dc6628ef979b | [
"Apache-2.0"
] | null | null | null | defmodule GithubCi.ErrorViewTest do
use GithubCi.ConnCase, async: true
# Bring render/3 and render_to_string/3 for testing custom views
import Phoenix.View
test "renders 404.json" do
assert render(GithubCi.ErrorView, "404.json", []) ==
%{errors: %{detail: "Page not found"}}
end
test "render 500.json" do
assert render(GithubCi.ErrorView, "500.json", []) ==
%{errors: %{detail: "Server internal error"}}
end
test "render any other" do
assert render(GithubCi.ErrorView, "505.json", []) ==
%{errors: %{detail: "Server internal error"}}
end
end
| 27.681818 | 66 | 0.648604 |
1c16e52b836d1e20e2206e9e41e572b43966d527 | 186 | exs | Elixir | priv/repo/migrations/20160421085335_add_settings_to_categories.exs | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | 8 | 2016-06-12T20:11:26.000Z | 2017-05-02T04:36:41.000Z | priv/repo/migrations/20160421085335_add_settings_to_categories.exs | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | 2 | 2016-06-12T15:49:06.000Z | 2016-06-12T20:00:02.000Z | priv/repo/migrations/20160421085335_add_settings_to_categories.exs | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | null | null | null | defmodule WhistlerNewsReader.Repo.Migrations.AddSettingsToCategories do
use Ecto.Migration
def change do
alter table(:categories) do
add :settings, :map
end
end
end
| 18.6 | 71 | 0.741935 |
1c16e72798d83f3385fd355b9bb1be4d4ed77e43 | 1,793 | ex | Elixir | clients/cloud_trace/lib/google_api/cloud_trace/v2/model/links.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | [
"Apache-2.0"
] | null | null | null | clients/cloud_trace/lib/google_api/cloud_trace/v2/model/links.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | [
"Apache-2.0"
] | null | null | null | clients/cloud_trace/lib/google_api/cloud_trace/v2/model/links.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | [
"Apache-2.0"
] | 1 | 2020-11-10T16:58:27.000Z | 2020-11-10T16:58:27.000Z | # Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.CloudTrace.V2.Model.Links do
@moduledoc """
A collection of links, which are references from this span to a span in the same or different trace.
## Attributes
- droppedLinksCount (integer()): The number of dropped links after the maximum size was enforced. If this value is 0, then no links were dropped. Defaults to: `null`.
- link ([Link]): A collection of links. Defaults to: `null`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:droppedLinksCount => any(),
:link => list(GoogleApi.CloudTrace.V2.Model.Link.t())
}
field(:droppedLinksCount)
field(:link, as: GoogleApi.CloudTrace.V2.Model.Link, type: :list)
end
defimpl Poison.Decoder, for: GoogleApi.CloudTrace.V2.Model.Links do
def decode(value, options) do
GoogleApi.CloudTrace.V2.Model.Links.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudTrace.V2.Model.Links do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 35.156863 | 168 | 0.735081 |
1c16f9a033c2c8ad358d87d6bdecb67e65e44158 | 594 | exs | Elixir | test/views/error_view_test.exs | rubencaro/pedro | b550b3af700962283fa9e3985e1dcc2da2e14d0d | [
"MIT"
] | null | null | null | test/views/error_view_test.exs | rubencaro/pedro | b550b3af700962283fa9e3985e1dcc2da2e14d0d | [
"MIT"
] | null | null | null | test/views/error_view_test.exs | rubencaro/pedro | b550b3af700962283fa9e3985e1dcc2da2e14d0d | [
"MIT"
] | null | null | null | defmodule Pedro.ErrorViewTest do
use Pedro.ConnCase, async: true
# Bring render/3 and render_to_string/3 for testing custom views
import Phoenix.View
test "renders 404.json" do
assert render(Pedro.ErrorView, "404.json", []) ==
%{errors: %{detail: "Page not found"}}
end
test "render 500.json" do
assert render(Pedro.ErrorView, "500.json", []) ==
%{errors: %{detail: "Server internal error"}}
end
test "render any other" do
assert render(Pedro.ErrorView, "505.json", []) ==
%{errors: %{detail: "Server internal error"}}
end
end
| 27 | 66 | 0.639731 |
1c17074951fbac8ed475f93e048000f865b0ed30 | 5,972 | exs | Elixir | test/components/form/error_tag_test.exs | EddyLane/surface | 1f13259cbdf81b5a4740ee13349a48f8b6c54bb5 | [
"MIT"
] | 1 | 2020-12-29T10:43:19.000Z | 2020-12-29T10:43:19.000Z | test/components/form/error_tag_test.exs | EddyLane/surface | 1f13259cbdf81b5a4740ee13349a48f8b6c54bb5 | [
"MIT"
] | null | null | null | test/components/form/error_tag_test.exs | EddyLane/surface | 1f13259cbdf81b5a4740ee13349a48f8b6c54bb5 | [
"MIT"
] | null | null | null | defmodule Surface.Components.Form.ErrorTagTest.Common do
@moduledoc """
Common functions used by both ErrorTagTest and ErrorTagSyncTest
"""
def changeset do
{%{}, %{name: :string}}
|> Ecto.Changeset.cast(%{name: "myname"}, [:name])
|> Ecto.Changeset.add_error(:name, "is already taken")
|> Ecto.Changeset.add_error(:name, "another test error")
# Simulate that form submission already occurred so that error message will display
|> Map.put(:action, :insert)
end
end
defmodule Surface.Components.Form.ErrorTagTest do
use Surface.ConnCase, async: true
alias Surface.Components.Form.ErrorTagTest.Common
alias Surface.Components.Form
alias Surface.Components.Form.Field
alias Surface.Components.Form.ErrorTag
setup do
%{changeset: Common.changeset()}
end
test "multiple error messages", %{changeset: changeset} do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag />
</Field>
</Form>
"""
end
assert html =~
"<span phx-feedback-for=\"user_name\">is already taken</span>"
assert html =~
"<span phx-feedback-for=\"user_name\">another test error</span>"
end
test "no errors are shown if changeset.action is empty", %{changeset: changeset} do
changeset_without_action = Map.put(changeset, :action, nil)
assigns = %{changeset: changeset_without_action}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag />
</Field>
</Form>
"""
end
refute html =~ "is already taken"
refute html =~ "another test error"
end
test "prop phx_feedback_for", %{changeset: changeset} do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag phx_feedback_for="test-id" />
</Field>
</Form>
"""
end
assert html =~
"<span phx-feedback-for=\"test-id\">is already taken</span>"
end
test "prop class", %{changeset: changeset} do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag class="test-class" />
</Field>
</Form>
"""
end
assert html =~
"<span class=\"test-class\" phx-feedback-for=\"user_name\">is already taken</span>"
end
test "no changeset shows no errors" do
html =
render_surface do
~H"""
<Form for={{ :user }}>
<Field name="name">
<ErrorTag />
</Field>
</Form>
"""
end
# The error tags are displayed as spans, so this demonstrates that none were rendered
refute html =~ "<span"
end
end
defmodule Surface.Components.Form.ErrorTagSyncTest do
use Surface.ConnCase
alias Surface.Components.Form.ErrorTagTest.Common
alias Surface.Components.Form
alias Surface.Components.Form.Field
alias Surface.Components.Form.ErrorTag
setup do
%{changeset: Common.changeset()}
end
test "translator from config", %{changeset: changeset} do
using_config ErrorTag,
default_translator: {Surface.Components.Form.ErrorTagSyncTest, :config_translate_error} do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag />
</Field>
</Form>
"""
end
assert html =~
"<span phx-feedback-for=\"user_name\">translated by config translator</span>"
end
end
test "prop translator overrides config and fallback", %{changeset: changeset} do
using_config ErrorTag,
default_translator: {Surface.Components.Form.ErrorTagSyncTest, :config_translate_error} do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag translator={{ fn _ -> "translated by prop translator" end }} />
</Field>
</Form>
"""
end
assert html =~
"<span phx-feedback-for=\"user_name\">translated by prop translator</span>"
refute html =~
"<span phx-feedback-for=\"user_name\">translated by config translator</span>"
end
end
test "default_class from config", %{changeset: changeset} do
using_config ErrorTag, default_class: "class-from-config" do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag />
</Field>
</Form>
"""
end
assert html =~
"<span class=\"class-from-config\" phx-feedback-for=\"user_name\">is already taken</span>"
end
end
test "prop class overrides config", %{changeset: changeset} do
using_config ErrorTag, default_class: "class-from-config" do
assigns = %{changeset: changeset}
html =
render_surface do
~H"""
<Form for={{@changeset}} opts={{ as: :user }}>
<Field name="name">
<ErrorTag class="class-from-prop" />
</Field>
</Form>
"""
end
assert html =~
"<span class=\"class-from-prop\" phx-feedback-for=\"user_name\">is already taken</span>"
end
end
def config_translate_error({_msg, _opts}) do
"translated by config translator"
end
end
| 26.780269 | 105 | 0.574849 |
1c170dd52385c074772c5fbd7a286f2dae2d048b | 4,395 | ex | Elixir | lib/sentry/logger.ex | sztosz/sentry-elixir | 3a2cf14eb4a4778d7e77e3b851897a2575b08052 | [
"MIT"
] | null | null | null | lib/sentry/logger.ex | sztosz/sentry-elixir | 3a2cf14eb4a4778d7e77e3b851897a2575b08052 | [
"MIT"
] | null | null | null | lib/sentry/logger.ex | sztosz/sentry-elixir | 3a2cf14eb4a4778d7e77e3b851897a2575b08052 | [
"MIT"
] | null | null | null | defmodule Sentry.Logger do
require Logger
@moduledoc """
This is based on the Erlang [error_logger](http://erlang.org/doc/man/error_logger.html).
To set this up, add `:ok = :error_logger.add_report_handler(Sentry.Logger)` to your application's start function. Example:
```elixir
def start(_type, _opts) do
children = [
supervisor(Task.Supervisor, [[name: Sentry.TaskSupervisor]]),
:hackney_pool.child_spec(Sentry.Client.hackney_pool_name(), [timeout: Config.hackney_timeout(), max_connections: Config.max_hackney_connections()])
]
opts = [strategy: :one_for_one, name: Sentry.Supervisor]
:ok = :error_logger.add_report_handler(Sentry.Logger)
Supervisor.start_link(children, opts)
end
```
  Your application will then be running a `Sentry.Logger` event handler that receives error report messages and sends them to Sentry.
  It is important to note that the same report handler can be added multiple times. If you run an umbrella app and add the report handler in multiple individual applications, the same error will be reported multiple times (once for each handler). There are two solutions to fix it.
The first is to ensure that the handler is only added at the primary application entry-point. This will work, but can be brittle, and will not work for applications running the multiple release style.
The other solution is to check for existing handlers before trying to add another. Example:
```elixir
if !(Sentry.Logger in :gen_event.which_handlers(:error_logger)) do
:ok = :error_logger.add_report_handler(Sentry.Logger)
end
```
With this solution, if a `Sentry.Logger` handler is already running, it will not add another. One can add the code to each application, and there will only ever be one handler created. This solution is safer, but slightly more complex to manage.
"""
@behaviour :gen_event
def init(_mod), do: {:ok, []}
def handle_call({:configure, new_keys}, _state), do: {:ok, :ok, new_keys}
def handle_event({:error, _gl, {_pid, _type, [pid, {exception, stack}]} = info}, state)
when is_list(stack) and is_pid(pid) do
try do
opts =
Keyword.put([], :event_source, :logger)
|> Keyword.put(:stacktrace, stack)
|> Keyword.put(:error_type, :error)
Sentry.capture_exception(exception, opts)
rescue
ex ->
Logger.warn(fn -> "Unable to notify Sentry due to #{inspect(ex)}! #{inspect(info)}" end)
end
{:ok, state}
end
def handle_event({:error_report, _gl, {_pid, _type, [message | _]}}, state)
when is_list(message) do
try do
{kind, exception, stacktrace, module} =
get_exception_and_stacktrace(message[:error_info])
|> get_initial_call_and_module(message)
opts =
(get_in(message, ~w[dictionary sentry_context]a) || %{})
|> Map.take(Sentry.Context.context_keys())
|> Map.to_list()
|> Keyword.put(:event_source, :logger)
|> Keyword.put(:stacktrace, stacktrace)
|> Keyword.put(:error_type, kind)
|> Keyword.put(:module, module)
Sentry.capture_exception(exception, opts)
rescue
ex ->
Logger.warn(fn -> "Unable to notify Sentry due to #{inspect(ex)}! #{inspect(message)}" end)
end
{:ok, state}
end
def handle_event(_, state) do
{:ok, state}
end
def handle_info(_msg, state) do
{:ok, state}
end
def code_change(_old, state, _extra) do
{:ok, state}
end
def terminate(_reason, _state) do
:ok
end
defp get_exception_and_stacktrace({kind, {exception, sub_stack}, _stack})
when is_list(sub_stack) do
{kind, exception, sub_stack}
end
defp get_exception_and_stacktrace({kind, exception, stacktrace}) do
{kind, exception, stacktrace}
end
# GenServer exits will usually only report a stacktrace containing core
# GenServer functions, which causes Sentry to group unrelated exits
# together. This gets the `:initial_call` to help disambiguate, as it contains
# the MFA for how the GenServer was started.
defp get_initial_call_and_module({kind, exception, stacktrace}, error_info) do
case Keyword.get(error_info, :initial_call) do
{module, function, arg} ->
{kind, exception, stacktrace ++ [{module, function, arg, []}], module}
_ ->
{kind, exception, stacktrace, nil}
end
end
end
| 34.606299 | 283 | 0.687144 |
1c1757e223e8ac879519e9d6ebe40ff772ac5bd5 | 7,954 | ex | Elixir | architect/lib/architect/projects/projects.ex | VJftw/velocity | 8335c39c510dbde1446e6cde03eebb450339d212 | [
"Apache-2.0"
] | 3 | 2017-12-09T21:05:54.000Z | 2019-08-06T08:13:34.000Z | architect/lib/architect/projects/projects.ex | VJftw/velocity | 8335c39c510dbde1446e6cde03eebb450339d212 | [
"Apache-2.0"
] | 63 | 2017-09-09T15:44:24.000Z | 2022-03-03T22:16:24.000Z | architect/lib/architect/projects/projects.ex | VJftw/velocity | 8335c39c510dbde1446e6cde03eebb450339d212 | [
"Apache-2.0"
] | 5 | 2017-09-14T00:17:22.000Z | 2019-11-27T14:43:45.000Z | defmodule Architect.Projects do
@moduledoc """
The Projects context.
"""
import Ecto.Query, warn: false
alias Architect.Repo
alias Architect.Projects.{Project, Starter}
alias Git.Repository
alias Architect.Events
alias Architect.Accounts.User
use Supervisor
require Logger
@registry __MODULE__.Registry
@supervisor __MODULE__.Supervisor
def start_link(_opts \\ []) do
Logger.debug("Starting #{Atom.to_string(__MODULE__)}")
Supervisor.start_link(__MODULE__, :ok, name: __MODULE__)
end
@doc """
Returns the list of projects.
## Examples
iex> list_projects()
[%Project{}, ...]
"""
def list_projects() do
Repo.all(Project)
end
@doc """
Gets a single project by id.
  Raises `Ecto.NoResultsError` if the Project does not exist.
"""
def get_project!(id), do: Repo.get!(Project, id)
@doc """
Gets a single project by slug.
Raises `Ecto.NoResultsError` if the Project does not exist.
## Examples
iex> get_project_by_slug!("velocity")
      %Project{}
iex> get_project_by_slug!("Not a slug")
** (Ecto.NoResultsError)
"""
def get_project_by_slug!(slug), do: Repo.get_by!(Project, slug: slug)
@doc """
Creates a project.
## Examples
iex> create_project(%User{}, "https://github.com/velocity-ci/velocity.git")
      {:ok, {%Project{}, %Event{}}}
      iex> create_project(%User{}, "banter")
{:error, %Ecto.Changeset{}}
"""
def create_project(%User{} = u, address, private_key \\ nil) when is_binary(address) do
Repo.transaction(fn ->
changeset =
Project.changeset(%Project{}, %{
address: address,
private_key: private_key,
created_by_id: u.id
})
case Repo.insert(changeset) do
{:ok, p} ->
event = Events.create_event!(u, p, %{type: :project_created})
{p, event}
{:error, e} ->
Repo.rollback(e)
end
end)
end
@doc ~S"""
Get a list of branches
## Examples
iex> list_branches(project)
[%Branch{}, ...]
"""
def list_branches(%Project{} = project),
do: call_repository(project, {:list_branches, []})
@doc ~S"""
Get a list of branches for a specific commit SHA
## Examples
iex> list_branches_for_commit(project, "925fbc450c8bdb7665ec3af3129ce715927433fe")
[%Branch{}, ...]
"""
def list_branches_for_commit(%Project{} = project, sha) when is_binary(sha),
do: call_repository(project, {:list_branches_for_commit, [sha]})
@doc ~S"""
Get a list of commits by branch
## Examples
iex> list_commits(project, "master")
[%Commit{}, ...]
"""
def list_commits(%Project{} = project, branch) when is_binary(branch),
do: call_repository(project, {:list_commits, [branch]})
@doc ~S"""
Get the default branch
## Examples
iex> default_branch(project)
%Branch{}
"""
def default_branch(%Project{} = project),
do: call_repository(project, {:default_branch, []})
@doc ~S"""
Get specific branch
## Examples
iex> get_branch(project, "master")
%Branch{}
"""
def get_branch(%Project{} = project, branch) when is_binary(branch),
do: call_repository(project, {:get_branch, [branch]})
@doc ~S"""
Get the amount of commits for the project
## Examples
iex> commit_count(project)
123
"""
def commit_count(%Project{} = project),
do: call_repository(project, {:commit_count, []})
@doc ~S"""
Get the amount of commits for the project, for a specific branch
## Examples
iex> commit_count(project, "master")
42
"""
def commit_count(%Project{} = project, branch) when is_binary(branch),
do: call_repository(project, {:commit_count, [branch]})
@doc ~S"""
List Blueprints
## Examples
iex> list_blueprints(project, {:sha, "925fbc450c8bdb7665ec3af3129ce715927433fe"})
[%Architect.Projects.Blueprint{}, ...]
"""
def list_blueprints(%Project{} = project, index) do
{:ok, cwd} = File.cwd()
{out, 0} =
call_repository(
project,
{:exec, [index, ["#{cwd}/vcli", "list", "blueprints", "--machine-readable"]]}
)
Poison.decode!(out)
end
@doc ~S"""
List Pipelines
## Examples
iex> list_pipelines(project, {:sha, "925fbc450c8bdb7665ec3af3129ce715927433fe"})
[%Architect.Projects.Pipeline{}, ...]
"""
def list_pipelines(%Project{} = project, index) do
{:ok, cwd} = File.cwd()
{out, 0} =
call_repository(
project,
{:exec, [index, ["#{cwd}/vcli", "list", "pipelines", "--machine-readable"]]}
)
Poison.decode!(out)
end
@doc ~S"""
Project Configuration
"""
def project_configuration(%Project{} = project) do
{:ok, cwd} = File.cwd()
default_branch = call_repository(project, {:default_branch, []})
{out, 0} =
call_repository(
project,
{:exec, [default_branch.name, ["#{cwd}/vcli", "info", "--machine-readable"]]}
)
Poison.decode!(out)
end
@doc ~S"""
Get the construction plan for a Blueprint on a commit sha
"""
  def plan_blueprint(%Project{} = project, branch_name, _commit, blueprint_name) do
{:ok, cwd} = File.cwd()
{out, 0} =
call_repository(
project,
{:exec,
[
branch_name,
[
"#{cwd}/vcli",
"run",
"blueprint",
"--plan-only",
"--machine-readable",
"--branch",
branch_name,
blueprint_name
]
]}
)
Poison.decode!(out)
end
@doc ~S"""
Get the construction plan for a Pipeline on a commit sha
"""
  def plan_pipeline(%Project{} = project, branch_name, _commit, pipeline_name) do
{:ok, cwd} = File.cwd()
{out, 0} =
call_repository(
project,
{:exec,
[
branch_name,
[
"#{cwd}/vcli",
"run",
"pipeline",
"--plan-only",
"--machine-readable",
"--branch",
branch_name,
pipeline_name
]
]}
)
Poison.decode!(out)
end
### Server
@impl true
def init(:ok) do
:ets.new(:simple_cache, [:named_table, :public])
children = [
{Registry, keys: :unique, name: @registry},
{DynamicSupervisor, name: @supervisor, strategy: :one_for_one, max_restarts: 3},
worker(
Starter,
[%{registry: @registry, supervisor: @supervisor, projects: list_projects()}],
restart: :transient
)
]
Logger.info("Running #{Atom.to_string(__MODULE__)}")
Supervisor.init(children, strategy: :one_for_one)
end
defp call_repository(project, callback, cache \\ true, attempt \\ 1)
defp call_repository(_, _, _, attempt) when attempt > 2, do: {:error, "Failed"}
defp call_repository(project, {f, args}, cache, attempt) do
case Registry.lookup(@registry, project.slug) do
[{repo, _}] ->
apply(Repository, f, [repo | args])
[] ->
Logger.error("Repository #{project.slug} does not exist")
call_repository(project, {f, args}, cache, attempt + 1)
end
end
end
defmodule Architect.Projects.Starter do
use Task
require Logger
alias Git.Repository
alias Architect.Projects.Project
def start_link(opts) do
Task.start_link(__MODULE__, :run, [opts])
end
def run(%{projects: projects, supervisor: _supervisor, registry: _registry}) do
for project <- projects do
process_name = {:via, Registry, {Architect.Projects.Registry, project.slug}}
known_hosts = Project.known_hosts(project.address)
{:ok, _pid} =
DynamicSupervisor.start_child(
# supervisor,
Architect.Projects.Supervisor,
{Repository, {process_name, project.address, project.private_key, known_hosts}}
)
end
end
end
# File: clients/video_intelligence/lib/google_api/video_intelligence/v1/model/google_cloud_videointelligence_v1p1beta1__normalized_vertex.ex
# Repo: pojiro/elixir-google-api (Apache-2.0)
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1p1beta1_NormalizedVertex do
@moduledoc """
A vertex represents a 2D point in the image. NOTE: the normalized vertex coordinates are relative to the original image and range from 0 to 1.
## Attributes
* `x` (*type:* `number()`, *default:* `nil`) - X coordinate.
* `y` (*type:* `number()`, *default:* `nil`) - Y coordinate.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:x => number() | nil,
:y => number() | nil
}
field(:x)
field(:y)
end
defimpl Poison.Decoder,
for: GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1p1beta1_NormalizedVertex do
def decode(value, options) do
GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1p1beta1_NormalizedVertex.decode(
value,
options
)
end
end
defimpl Poison.Encoder,
for: GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1p1beta1_NormalizedVertex do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# File: test/contracts_api/legal_entytis_test.exs
# Repo: gissandrogama/contracts_api (MIT)
defmodule ContractsApi.LegalEntytisTest do
use ContractsApi.DataCase, async: true
alias ContractsApi.LegalEntytis
describe "create_company/1" do
test "return company structure when parameters are valid" do
{:ok, result} = LegalEntytis.create_company(%{name: "SCJ LTDA", cnpj: "1245786000177"})
assert result.name == "SCJ LTDA"
assert result.cnpj == "1245786000177"
end
  test "return company error when parameters are invalid" do
assert {:error, %Ecto.Changeset{} = changeset} =
LegalEntytis.create_company(%{name: "", cnpj: "1245786000177"})
    assert %{name: ["can't be blank"]} = errors_on(changeset)
end
end
describe "get_company!" do
test "return a company when the id is valid" do
{:ok, %{id: id}} = LegalEntytis.create_company(%{name: "CLOE LTDA", cnpj: "1245786000177"})
{:ok, result} = LegalEntytis.get_company!(id)
assert result.name == "CLOE LTDA"
assert result.cnpj == "1245786000177"
end
test "Return error when id or company does not exist" do
LegalEntytis.create_company(%{name: "CLOE LTDA", cnpj: "1245786000177"})
id_invalid = "2af084fe-a625-44d1-b3f5-dda7000501a9"
assert LegalEntytis.get_company!(id_invalid) == {:error, :not_found}
end
end
end
# File: test/stub_modules/producer.ex
# Repo: trusty/elixir_google_spreadsheets (MIT)
defmodule GSS.StubModules.Producer do
use GenStage
def start_link(events) do
GenStage.start_link(__MODULE__, {events, self()})
end
def init({events, owner}) do
{:producer, {events, owner}}
end
def handle_demand(demand, {events, owner}) do
{result, tail} = Enum.split(events, demand)
send(owner, {:handled_demand, result, demand})
{:noreply, result, {tail, owner}}
end
def handle_call({:add, new_events}, _from, {events, owner}) do
{:reply, :ok, [], {events ++ new_events, owner}}
end
end
# File: lib/aws/generated/sage_maker_runtime.ex
# Repo: salemove/aws-elixir (Apache-2.0)
# WARNING: DO NOT EDIT, AUTO-GENERATED CODE!
# See https://github.com/aws-beam/aws-codegen for more details.
defmodule AWS.SageMakerRuntime do
@moduledoc """
The Amazon SageMaker runtime API.
"""
alias AWS.Client
alias AWS.Request
def metadata do
%AWS.ServiceMetadata{
abbreviation: nil,
api_version: "2017-05-13",
content_type: "application/x-amz-json-1.1",
credential_scope: nil,
endpoint_prefix: "runtime.sagemaker",
global?: false,
protocol: "rest-json",
service_id: "SageMaker Runtime",
signature_version: "v4",
signing_name: "sagemaker",
target_prefix: nil
}
end
@doc """
After you deploy a model into production using Amazon SageMaker hosting
services, your client applications use this API to get inferences from the model
hosted at the specified endpoint.
For an overview of Amazon SageMaker, see [How It Works](https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works.html).
Amazon SageMaker strips all POST headers except those supported by the API.
Amazon SageMaker might add additional headers. You should not rely on the
behavior of headers outside those enumerated in the request syntax.
Calls to `InvokeEndpoint` are authenticated by using AWS Signature Version 4.
For information, see [Authenticating Requests (AWS Signature Version 4)](https://docs.aws.amazon.com/AmazonS3/latest/API/sig-v4-authenticating-requests.html)
in the *Amazon S3 API Reference*.
A customer's model containers must respond to requests within 60 seconds. The
model itself can have a maximum processing time of 60 seconds before responding
to invocations. If your model is going to take 50-60 seconds of processing time,
the SDK socket timeout should be set to be 70 seconds.
Endpoints are scoped to an individual account, and are not public. The URL does
not contain the account ID, but Amazon SageMaker determines the account ID from
the authentication token that is supplied by the caller.
"""
def invoke_endpoint(%Client{} = client, endpoint_name, input, options \\ []) do
url_path = "/endpoints/#{URI.encode(endpoint_name)}/invocations"
{headers, input} =
[
{"Accept", "Accept"},
{"ContentType", "Content-Type"},
{"CustomAttributes", "X-Amzn-SageMaker-Custom-Attributes"},
{"InferenceId", "X-Amzn-SageMaker-Inference-Id"},
{"TargetContainerHostname", "X-Amzn-SageMaker-Target-Container-Hostname"},
{"TargetModel", "X-Amzn-SageMaker-Target-Model"},
{"TargetVariant", "X-Amzn-SageMaker-Target-Variant"}
]
|> Request.build_params(input)
query_params = []
options =
Keyword.put(
options,
:response_header_parameters,
[
{"Content-Type", "ContentType"},
{"X-Amzn-SageMaker-Custom-Attributes", "CustomAttributes"},
{"x-Amzn-Invoked-Production-Variant", "InvokedProductionVariant"}
]
)
options =
Keyword.put(
options,
:send_body_as_binary?,
true
)
options =
Keyword.put(
options,
:receive_body_as_binary?,
true
)
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
nil
)
end
end
# File: lib/freddie/security/aes.ex
# Repo: kernelgarden/freddie (MIT)
defmodule Freddie.Security.Aes do
@moduledoc false
# http://erlang.org/doc/man/crypto.html#block_encrypt-4
alias __MODULE__
@cipher_mode :aes_gcm
@aad "FREDDIE_AES256GCM"
@iv_size 16
@tag_size 16
@spec generate_aes_key(any()) :: binary() | {:error, {:generate_aes_key, <<_::64, _::_*8>>}}
def generate_aes_key(secret_key) when is_bitstring(secret_key) do
:crypto.hash(:sha256, secret_key)
end
def generate_aes_key(secret_key) do
{:error, {:generate_aes_key, "not valid type #{inspect(secret_key)}"}}
end
@spec encrypt(
binary()
| maybe_improper_list(
binary() | maybe_improper_list(any(), binary() | []) | byte(),
binary() | []
),
binary()
| maybe_improper_list(
binary() | maybe_improper_list(any(), binary() | []) | byte(),
binary() | []
)
) :: binary()
def encrypt(aes_key, value) do
iv = :crypto.strong_rand_bytes(@iv_size)
{cipher_text, cipher_tag} =
:crypto.block_encrypt(@cipher_mode, aes_key, iv, {@aad, value, @tag_size})
iv <> cipher_tag <> cipher_text
end
@spec decrypt(any(), any()) :: :error | binary()
def decrypt(<<iv::binary-@iv_size, cipher_tag::binary-@tag_size, cipher_text::binary>>, aes_key) do
:crypto.block_decrypt(@cipher_mode, aes_key, iv, {@aad, cipher_text, cipher_tag})
end
  def decrypt(_data, _aes_key) do
:error
end
def test() do
client_private_key =
Freddie.Security.DiffieHellman.generate_private_key()
|> IO.inspect(label: "[Debug] client_private_key: ")
server_private_key =
Freddie.Security.DiffieHellman.generate_private_key()
|> IO.inspect(label: "[Debug] server_private_key: ")
client_public_key =
Freddie.Security.DiffieHellman.generate_public_key(client_private_key)
|> IO.inspect(label: "[Debug] client_public_key: ")
server_public_key =
Freddie.Security.DiffieHellman.generate_public_key(server_private_key)
|> IO.inspect(label: "[Debug] server_public_key: ")
_client_secret_key =
Freddie.Security.DiffieHellman.generate_secret_key(server_public_key, client_private_key)
|> IO.inspect(label: "[Debug] client_secret_key: ")
server_secret_key =
Freddie.Security.DiffieHellman.generate_secret_key(client_public_key, server_private_key)
|> IO.inspect(label: "[Debug] server_secret_key: ")
key =
server_secret_key
|> IO.inspect(label: "[Debug] key: ")
aes_key =
Aes.generate_aes_key(key)
|> IO.inspect(label: "[Debug] aes_key: ")
plain_text = "plain text.plain text.plain text.plain text."
cipher_block =
Aes.encrypt(aes_key, plain_text)
|> IO.inspect(label: "[Debug] cipher_block: ")
_decrypt_text =
Aes.decrypt(cipher_block, aes_key)
|> IO.inspect(label: "[Debug] decrypt_text: ")
end
end
# File: lib/shippex/rate.ex
# Repo: kianmeng/shippex (MIT)
defmodule Shippex.Rate do
@moduledoc """
A `Rate` is a representation of a price estimate from a given carrier for a
`Service`, which is typically selected by the end user for a desired shipping
speed.
"""
alias Shippex.Service
@enforce_keys [:service, :price, :line_items]
defstruct [:service, :price, :line_items]
@type t :: %__MODULE__{
service: Service.t(),
price: integer(),
line_items: nil | [%{name: String.t(), price: integer()}]
}
end
# File: clients/cloud_run/lib/google_api/cloud_run/v1alpha1/model/exec_action.ex
# Repo: matehat/elixir-google-api (Apache-2.0)
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.CloudRun.V1alpha1.Model.ExecAction do
@moduledoc """
ExecAction describes a "run in container" action.
## Attributes
* `command` (*type:* `String.t`, *default:* `nil`) - Command is the command line to execute inside the container, the working
directory for the command is root ('/') in the container's filesystem. The
command is simply exec'd, it is not run inside a shell, so traditional
shell instructions ('|', etc) won't work. To use a shell, you need to
explicitly call out to that shell. Exit status of 0 is treated as
live/healthy and non-zero is unhealthy. +optional
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:command => String.t()
}
field(:command)
end
defimpl Poison.Decoder, for: GoogleApi.CloudRun.V1alpha1.Model.ExecAction do
def decode(value, options) do
GoogleApi.CloudRun.V1alpha1.Model.ExecAction.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudRun.V1alpha1.Model.ExecAction do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# File: lib/mix/test/fixtures/deps_status/custom/noscm_repo/mix.exs
# Repo: jbcrail/elixir (Apache-2.0)
defmodule NoSCMRepo do
use Mix.Project
def project do
[ app: :noscm_repo,
version: "0.1.0",
deps: [
{:git_repo, "0.1.0"}
]
]
end
end
# File: lib/xandra.ex
# Repo: Nitrino/xandra (0BSD)
defmodule Xandra do
@moduledoc """
This module provides the main API to interface with Cassandra.
This module handles the connection to Cassandra, queries, connection pooling,
connection backoff, logging, and more. Many of these features are provided by
the [`DBConnection`](https://hex.pm/packages/db_connection) library, which
Xandra is built on top of.
## Errors
Many of the functions in this module (whose names don't end with a `!`)
return values in the form `{:ok, result}` or `{:error, error}`. While `result`
varies based on the specific function, `error` is always one of the following:
* a `Xandra.Error` struct: such structs represent errors returned by
Cassandra. When such an error is returned, it means that communicating
with the Cassandra server was successful, but the server returned an
error. Examples of these errors are syntax errors in queries, non-existent
tables, and so on. See `Xandra.Error` for more information.
* a `Xandra.ConnectionError` struct: such structs represent errors in the
communication with the Cassandra server. For example, if the Cassandra
server dies while the connection is waiting for a response from the
server, a `Xandra.ConnectionError` error will be returned. See
`Xandra.ConnectionError` for more information.
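  A `case` on the result is usually enough to tell the two kinds of errors
  apart. A minimal sketch (the table name and the `handle_*` helpers are
  made up for illustration):

      case Xandra.execute(conn, "SELECT * FROM some_table") do
        {:ok, page} ->
          page
        {:error, %Xandra.Error{} = error} ->
          # Cassandra replied, but with an error (bad syntax, missing table, ...)
          handle_cassandra_error(error)
        {:error, %Xandra.ConnectionError{} = error} ->
          # Talking to the server failed (network error, server down, ...)
          handle_connection_error(error)
      end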
## Parameters, encoding, and types
Xandra supports parameterized queries (queries that specify "parameter" values
through `?` or `:named_value`):
SELECT * FROM users WHERE name = ? AND email = ?
SELECT * FROM users WHERE name = :name AND email = :email
When a query has positional parameters, parameters can be passed as a list to
functions like `execute/4`: in this case, a parameter in a given position in
the list will be used as the `?` in the corresponding position in the
query. When a query has named parameters, parameters are passed as a map with
string keys representing each parameter's name and values representing the
corresponding parameter's value.
### Types
Cassandra supports many types of values, and some types have "shades" that
cannot be represented by Elixir types. For example, in Cassandra an integer
could be a "bigint" (a 64 bit integer), an "int" (a 32 bit integer), a
"smallint" (a 16 bit integer), or others; in Elixir, however, integers are
just integers (with varying size to be precise), so it is impossible to
univocally map Elixir integers to a specific Cassandra integer type. For this
reason, when executing simple parameterized queries (statements) it is
necessary to explicitly specify the type of each value.
To specify the type of a value, that value needs to be provided as a
two-element tuple where the first element is the value's type and the second
element is the value itself. Types are expressed with the same syntax used in
CQL: for example, 16-bit integers are represented as `"smallint"`, while maps
of strings to booleans are represented as `"map<text, boolean>"`.
# Using a list of parameters:
statement = "INSERT INTO species (name, properties) VALUES (?, ?)"
Xandra.execute(conn, statement, [
{"text", "human"},
{"map<text, boolean>", %{"legs" => true, "arms" => true, "tail" => false}},
])
# Using a map of parameters:
statement = "INSERT INTO species (name, properties) VALUES (:name, :properties)"
Xandra.execute(conn, statement, %{
"name" => {"text", "human"},
"properties" => {"map<text, boolean>", %{"legs" => true, "arms" => true, "tail" => false}},
})
You only need to specify types for simple queries (statements): when using
prepared queries, the type information of each parameter of the query is
encoded in the prepared query itself.
# Using a map of parameters:
prepared = Xandra.prepare!(conn, "INSERT INTO species (name, properties) VALUES (:name, :properties)")
Xandra.execute(conn, prepared, %{
"name" => "human",
"properties" => %{"legs" => true, "arms" => true, "tail" => false},
})
#### User-defined types
Xandra supports user-defined types (UDTs). A UDT can be inserted as a map with
string fields. For example, consider having created the following UDTs:
CREATE TYPE full_name (first_name text, last_name text)
CREATE TYPE profile (username text, full_name frozen<full_name>)
and having the following table:
CREATE TABLE users (id int PRIMARY KEY, profile frozen<profile>)
Inserting rows will look something like this:
prepared_insert = Xandra.prepare!(conn, "INSERT INTO users (id, profile) VALUES (?, ?)")
profile = %{
"username" => "bperry",
"full_name" => %{"first_name" => "Britta", "last_name" => "Perry"},
}
Xandra.execute!(conn, prepared_insert, [_id = 1, profile])
Note that inserting UDTs is only supported on prepared queries.
When retrieved, UDTs are once again represented as maps with string
keys. Retrieving the row inserted above would look like this:
%{"profile" => profile} = conn |> Xandra.execute!("SELECT id, profile FROM users") |> Enum.fetch!(0)
profile
#=> %{"username" => "bperry", "full_name" => %{"first_name" => "Britta", "last_name" => "Perry"}}
## Reconnections
Thanks to the `DBConnection` library, Xandra is able to handle connection
losses and to automatically reconnect to Cassandra. By default, reconnections
are retried at exponentially increasing randomized intervals, but backoff can
be configured through a subset of the options accepted by
`start_link/2`. These options are described in the documentation for
`DBConnection.start_link/2`.
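  For example, a connection can be started with explicit backoff settings
  (these are `DBConnection` options; the values below are illustrative):

      {:ok, conn} =
        Xandra.start_link(
          backoff_type: :rand_exp,
          backoff_min: 1_000,
          backoff_max: 30_000
        )

  See `DBConnection.start_link/2` for the authoritative list of backoff
  options.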
## Clustering
Xandra supports connecting to multiple nodes in a Cassandra cluster and
executing queries on different nodes based on load balancing strategies. See
the documentation for `Xandra.Cluster` for more information.
## Authentication
Xandra supports Cassandra authentication. See the documentation for
`Xandra.Authenticator` for more information.
## Retrying failed queries
Xandra takes a customizable and extensible approach to retrying failed queries
through "retry strategies" that encapsulate the logic for retrying
queries. See `Xandra.RetryStrategy` for documentation on retry strategies.
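  For instance, assuming `MyApp.RetryStrategy` is a module that implements
  the `Xandra.RetryStrategy` behaviour, it can be selected on a per-query
  basis through the `:retry_strategy` option:

      Xandra.execute(conn, "SELECT * FROM users", _params = [],
        retry_strategy: MyApp.RetryStrategy)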
## Compression
Xandra supports compression. To inform the Cassandra server that the
connections you start should use compression for data transmitted to and from
the server, you can pass the `:compressor` option to `start_link/1`; this
option should be a module that implements the `Xandra.Compressor`
behaviour. After this, all compressed data that Cassandra sends to the
connection will be decompressed using this behaviour module.
To compress outgoing data (such as when issuing or preparing queries), the
`:compressor` option should be specified explicitly. When it's specified, the
given module will be used to compress data. If no `:compressor` option is
passed, the outgoing data will not be compressed.
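  As a sketch, a compressor module could look like the following, where
  `MyApp.LZ4` stands in for whatever LZ4 binding the application uses and
  the callbacks follow the `Xandra.Compressor` behaviour:

      defmodule MyApp.LZ4Compressor do
        @behaviour Xandra.Compressor

        def algorithm(), do: :lz4

        # MyApp.LZ4 is a hypothetical wrapper around an LZ4 library.
        def compress(body), do: MyApp.LZ4.compress(body)
        def decompress(body), do: MyApp.LZ4.decompress(body)
      end

      Xandra.start_link(compressor: MyApp.LZ4Compressor)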
"""
alias __MODULE__.{Batch, Connection, ConnectionError, Error, Prepared, Page, PageStream, Simple}
@type statement :: String.t()
@type values :: list | map
@type error :: Error.t() | ConnectionError.t()
@type result :: Xandra.Void.t() | Page.t() | Xandra.SetKeyspace.t() | Xandra.SchemaChange.t()
@type conn :: DBConnection.conn()
@type xandra_start_option ::
{:nodes, [String.t()]}
| {:compressor, module}
| {:authentication, {module, Keyword.t()}}
| {:atom_keys, boolean}
@type db_connection_start_option :: {atom(), any}
@type start_option :: xandra_start_option | db_connection_start_option
@type start_options :: [start_option]
@default_port 9042
@default_start_options [
nodes: ["127.0.0.1"],
idle_timeout: 30_000
]
@doc """
Starts a new connection or pool of connections to Cassandra.
This function starts a new connection or pool of connections to the provided
Cassandra server. `options` is a list of both Xandra-specific options, as well
as `DBConnection` options.
## Options
These are the Xandra-specific options supported by this function:
* `:nodes` - (list of strings) the Cassandra nodes to connect to. Each node
in the list has to be in the form `"ADDRESS:PORT"` or in the form
`"ADDRESS"`: if the latter is used, the default port (`#{@default_port}`)
will be used for that node. Defaults to `["127.0.0.1"]`. This option must
contain only one node unless the `:pool` option is set to
`Xandra.Cluster`; see the documentation for `Xandra.Cluster` for more
information.
* `:compressor` - (module) the compressor module to use for compressing and
decompressing data. See the "Compression" section in the module
documentation. By default this option is not present.
* `:authentication` - (tuple) a two-element tuple: the authenticator
module to use for authentication and its supported options. See the
"Authentication" section in the module documentation.
* `:atom_keys` - (boolean) whether or not results of and parameters to
`execute/4` will have atom keys. If `true`, the result maps will have
column names returned as atoms rather than as strings. Additionally,
maps that represent named parameters will need atom keys. Defaults to `false`.
The rest of the options are forwarded to `DBConnection.start_link/2`. For
example, to start a pool of connections to Cassandra, the `:pool` option can
be used:
Xandra.start_link(pool: DBConnection.Poolboy)
Note that this requires the `poolboy` dependency to be specified in your
application. The following options have default values that are different from
the default values provided by `DBConnection`:
* `:idle_timeout` - defaults to `30_000` (30 seconds)
## Examples
# Start a connection:
{:ok, conn} = Xandra.start_link()
# Start a connection and register it under a name:
{:ok, _conn} = Xandra.start_link(name: :xandra)
# Start a named pool of connections:
{:ok, _pool} = Xandra.start_link(name: :xandra_pool, pool: DBConnection.Poolboy)
As the `DBConnection` documentation states, if using a pool it's necessary to
pass a `:pool` option with the pool module being used to every call. For
example:
{:ok, _pool} = Xandra.start_link(name: :xandra_pool, pool: DBConnection.Poolboy)
Xandra.execute!(:xandra_pool, "SELECT * FROM users", _params = [], pool: DBConnection.Poolboy)
### Using a keyspace for new connections
It is common to start a Xandra connection or pool of connections that will use
a single keyspace for their whole life span. Doing something like:
{:ok, conn} = Xandra.start_link()
Xandra.execute!(conn, "USE my_keyspace")
will work just fine when you only have one connection. If you have a pool of
connections (with the `:pool` option), however, the code above won't work:
that code would start the pool, and then checkout one connection from the pool
to execute the `USE my_keyspace` query. That specific connection will then be
using the `my_keyspace` keyspace, but all other connections in the pool will
not. Fortunately, `DBConnection` provides an option we can use to solve this
problem: `:after_connect`. This option can specify a function that will be run
after each single connection to Cassandra. This function will take a
connection and can be used to setup that connection; since this function is
run for every established connection, it will work well with pools as well.
{:ok, conn} = Xandra.start_link(after_connect: fn(conn) -> Xandra.execute(conn, "USE my_keyspace") end)
See the documentation for `DBConnection.start_link/2` for more information
about this option.
"""
@spec start_link(start_options) :: GenServer.on_start()
def start_link(options \\ []) when is_list(options) do
options =
@default_start_options
|> Keyword.merge(options)
|> parse_start_options()
|> Keyword.put(:prepared_cache, Prepared.Cache.new())
DBConnection.start_link(Connection, options)
end
@doc """
Streams the results of a simple query or a prepared query with the given `params`.
This function can be used to stream the results of `query` so as not to load
them entirely in memory. This function doesn't send any query to Cassandra
right away: it will only execute queries as necessary when results are
requested out of the returned stream.
The returned value is a stream of `Xandra.Page` structs, where each of such
structs contains at most as many rows as specified by the `:page_size`
option. Every time an element is requested from the stream, `query` will be
executed with `params` to get that result.
In order to get each result from Cassandra, `execute!/4` is used: this means
that if there is an error (such as a network error) when executing the
queries, that error will be raised.
### Simple or prepared queries
Regardless of `query` being a simple query or a prepared query, this function
will execute it every time a result is needed from the returned stream. For
this reason, it is usually a good idea to use prepared queries when streaming.
## Options
`options` supports all the options supported by `execute/4`, with the same
default values.
## Examples
prepared = Xandra.prepare!(conn, "SELECT * FROM users")
users_stream = Xandra.stream_pages!(conn, prepared, _params = [], page_size: 2)
[%Xandra.Page{} = _page1, %Xandra.Page{} = _page2] = Enum.take(users_stream, 2)
"""
@spec stream_pages!(conn, statement | Prepared.t(), values, Keyword.t()) :: Enumerable.t()
def stream_pages!(conn, query, params, options \\ [])
def stream_pages!(conn, statement, params, options) when is_binary(statement) do
%PageStream{conn: conn, query: statement, params: params, options: options}
end
def stream_pages!(conn, %Prepared{} = prepared, params, options) do
%PageStream{conn: conn, query: prepared, params: params, options: options}
end
@doc """
Prepares the given query.
This function prepares the given statement on the Cassandra server. If
preparation is successful and there are no network errors while talking to the
server, `{:ok, prepared}` is returned, otherwise `{:error, error}` is
returned.
The returned prepared query can be run through `execute/4`, or used inside a
batch (see `Xandra.Batch`).
Errors returned by this function can be either `Xandra.Error` or
`Xandra.ConnectionError` structs. See the module documentation for more
information about errors.
Supports all the options supported by `DBConnection.prepare/3`, and the
following additional options:
* `:force` - (boolean) when `true`, forces the preparation of the query on
the server instead of trying to read the prepared query from cache. See
the "Prepared queries cache" section below. Defaults to `false`.
* `:compressor` - (module) the compressor module used to compress and
decompress data. See the "Compression" section in the module
documentation. By default, this option is not present.
## Prepared queries cache
Since Cassandra prepares queries on a per-node basis (and not on a
per-connection basis), Xandra internally caches prepared queries for each
connection or pool of connections. This means that if you prepare a query that
was already prepared, no action will be executed on the Cassandra server and
the prepared query will be returned from the cache.
If the Cassandra node goes down, however, the prepared query will be
invalidated and trying to use the one from cache will result in a
`Xandra.Error`. Xandra handles this case automatically: when such an
error is returned, it will first retry to prepare the query and only
return an error if the preparation fails.
If you want to ensure a query is prepared on the server, you can set the
`:force` option to `true`.
## Examples
{:ok, prepared} = Xandra.prepare(conn, "SELECT * FROM users WHERE id = ?")
{:ok, _page} = Xandra.execute(conn, prepared, [_id = 1])
{:error, %Xandra.Error{reason: :invalid_syntax}} = Xandra.prepare(conn, "bad syntax")
# Force a query to be prepared on the server and not be read from cache:
Xandra.prepare!(conn, "SELECT * FROM users WHERE ID = ?", force: true)
"""
@spec prepare(conn, statement, Keyword.t()) :: {:ok, Prepared.t()} | {:error, error}
def prepare(conn, statement, options \\ []) when is_binary(statement) do
DBConnection.prepare(conn, %Prepared{statement: statement}, options)
end
@doc """
Prepares the given query, raising if there's an error.
This function works exactly like `prepare/3`, except it returns the prepared
query directly if preparation succeeds, otherwise raises the returned error.
## Examples
prepared = Xandra.prepare!(conn, "SELECT * FROM users WHERE id = ?")
{:ok, _page} = Xandra.execute(conn, prepared, [_id = 1])
"""
@spec prepare!(conn, statement, Keyword.t()) :: Prepared.t() | no_return
def prepare!(conn, statement, options \\ []) do
case prepare(conn, statement, options) do
{:ok, result} -> result
{:error, exception} -> raise(exception)
end
end
@doc """
Executes the given simple query, prepared query, or batch query.
Returns `{:ok, result}` if executing the query was successful, or `{:error,
error}` otherwise. The meaning of the `params_or_options` argument depends on
what `query` is:
* if `query` is a batch query, then `params_or_options` has to be a list of
options that will be used to run the batch query (batch queries don't
take parameters here, since parameters are attached to each query in the
batch).
* if `query` is a simple query (a string) or a prepared query, then
`params_or_options` is a list or map of parameters, and this function is
exactly the same as calling `execute(conn, query, params_or_options, [])`.
When `query` is a batch query, successful results will always be `Xandra.Void`
structs.
When `{:error, error}` is returned, `error` can be either a `Xandra.Error` or
a `Xandra.ConnectionError` struct. See the module documentation for more
information on errors.
## Options for batch queries
When `query` is a batch query, `params_or_options` is a list of options. All
options supported by `DBConnection.execute/4` are supported, plus the
following batch-specific options:
* `:consistency` - same as the `:consistency` option described in the
documentation for `execute/4`.
* `:serial_consistency` - same as the `:serial_consistency` option described
in the documentation for `execute/4`.
* `:timestamp` - (integer) using this option means that the provided
timestamp will apply to all the statements in the batch that do not
explicitly specify a timestamp.
## Examples
For examples on executing simple queries or prepared queries, see the
documentation for `execute/4`. Examples below specifically refer to batch
queries. See the documentation for `Xandra.Batch` for more information about
batch queries and how to construct them.
prepared_insert = Xandra.prepare!(conn, "INSERT INTO users (email, name) VALUES (?, ?)")
batch =
Xandra.Batch.new()
|> Xandra.Batch.add(prepared_insert, ["abed@community.com", "Abed Nadir"])
|> Xandra.Batch.add(prepared_insert, ["troy@community.com", "Troy Barnes"])
|> Xandra.Batch.add(prepared_insert, ["britta@community.com", "Britta Perry"])
# Execute the batch:
Xandra.execute(conn, batch)
#=> {:ok, %Xandra.Void{}}
# Execute the batch with a default timestamp for all statements:
Xandra.execute(conn, batch, timestamp: System.system_time(:millisecond) - 1_000)
#=> {:ok, %Xandra.Void{}}
All `DBConnection.execute/4` options are supported here as well:
Xandra.execute(conn, batch, pool: DBConnection.Poolboy)
#=> {:ok, %Xandra.Void{}}
"""
@spec execute(conn, statement | Prepared.t(), values) :: {:ok, result} | {:error, error}
@spec execute(conn, Batch.t(), Keyword.t()) :: {:ok, Xandra.Void.t()} | {:error, error}
def execute(conn, query, params_or_options \\ [])
def execute(conn, statement, params) when is_binary(statement) do
execute(conn, statement, params, _options = [])
end
def execute(conn, %Prepared{} = prepared, params) do
execute(conn, prepared, params, _options = [])
end
def execute(conn, %Batch{} = batch, options) when is_list(options) do
execute_with_retrying(conn, batch, nil, options)
end
@doc """
Executes the given simple query or prepared query with the given parameters.
Returns `{:ok, result}` where `result` is the result of executing `query` if
the execution is successful (there are no network errors or semantic errors
with the query), or `{:error, error}` otherwise.
`result` can be one of the following:
* a `Xandra.Void` struct - returned for queries such as `INSERT`, `UPDATE`,
or `DELETE`.
* a `Xandra.SchemaChange` struct - returned for queries that perform changes
on the schema (such as creating tables).
* a `Xandra.SetKeyspace` struct - returned for `USE` queries.
* a `Xandra.Page` struct - returned for queries that return rows (such as
`SELECT` queries).
The properties of each of the results listed above are described in each
result's module.
## Options
This function accepts all options accepted by `DBConnection.execute/4`, plus
the following ones:
* `:consistency` - (atom) specifies the consistency level for the given
query. See the Cassandra documentation for more information on consistency
levels. The value of this option can be one of:
* `:one` (default)
* `:two`
* `:three`
* `:any`
* `:quorum`
* `:all`
* `:local_quorum`
* `:each_quorum`
* `:serial`
* `:local_serial`
* `:local_one`
* `:page_size` - (integer) the size of a page of results. If `query` returns a
`Xandra.Page` struct, that struct will contain at most `:page_size` rows.
Defaults to `10_000`.
* `:paging_state` - (binary) the offset where rows should be
returned from. By default this option is not present and paging starts
from the beginning. See the "Paging" section below for more information on
how to page queries.
* `:timestamp` - (integer) the default timestamp for the query (in
microseconds). If provided, overrides the server-side assigned timestamp;
however, a timestamp in the query itself will still override this
timestamp.
* `:serial_consistency` - (atom) specifies the serial consistency to use for
executing the given query. Can be one of `:serial` or `:local_serial`.
* `:compressor` - (module) the compressor module used to compress and
decompress data. See the "Compression" section in the module
documentation. By default, this option is not present.
* `:retry_strategy` - (module) the module implementing the
`Xandra.RetryStrategy` behaviour that is used in case the query fails to
determine whether to retry it or not. See the "Retrying failed queries"
section in the module documentation. By default, this option is not
present.
* `:date_format` - (`:date` or `:integer`) controls the format in which
dates are returned. When set to `:integer` the returned value is
a number of days from the Unix epoch, a date struct otherwise.
Defaults to `:date`.
* `:time_format` - (`:time` or `:integer`) controls the format in which
times are returned. When set to `:integer` the returned value is
a number of nanoseconds from midnight, a time struct otherwise.
Defaults to `:time`.
* `:timestamp_format` - (`:datetime` or `:integer`) controls the format in which
timestamps are returned. When set to `:integer` the returned value is
a number of milliseconds from the Unix epoch, a datetime struct otherwise.
Defaults to `:datetime`.
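As a sketch of combining several of these options (here, `MyApp.RetryStrategy`
is a hypothetical module implementing the `Xandra.RetryStrategy` behaviour):

    Xandra.execute(conn, statement, _params = [],
      consistency: :quorum,
      page_size: 100,
      retry_strategy: MyApp.RetryStrategy
    )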
## Parameters
The `params` argument specifies parameters to use when executing the query; it
can be either a list of positional parameters (specified via `?` in the query)
or a map of named parameters (specified as `:named_parameter` in the
query). When `query` is a simple query, the value of each parameter must be a
two-element tuple specifying the type used to encode the value and the value
itself; when `query` is a prepared query, this is not necessary (and values
can just be values) as the type information is encoded in the prepared
query. See the module documentation for more information about query
parameters, types, and encoding values.
## Examples
Executing a simple query (which is just a string):
statement = "INSERT INTO users (first_name, last_name) VALUES (:first_name, :last_name)"
{:ok, %Xandra.Void{}} = Xandra.execute(conn, statement, %{
"first_name" => {"text", "Chandler"},
"last_name" => {"text", "Bing"},
})
Executing the query when `atom_keys: true` has been specified in `Xandra.start_link/1`:
Xandra.execute(conn, statement, %{
first_name: {"text", "Chandler"},
last_name: {"text", "Bing"}
})
Executing a prepared query:
prepared = Xandra.prepare!(conn, "INSERT INTO users (first_name, last_name) VALUES (?, ?)")
{:ok, %Xandra.Void{}} = Xandra.execute(conn, prepared, ["Monica", "Geller"])
Performing a `SELECT` query and using `Enum.to_list/1` to convert the
`Xandra.Page` result to a list of rows:
statement = "SELECT * FROM users"
{:ok, %Xandra.Page{} = page} = Xandra.execute(conn, statement, _params = [])
Enum.to_list(page)
#=> [%{"first_name" => "Chandler", "last_name" => "Bing"},
#=> %{"first_name" => "Monica", "last_name" => "Geller"}]
Performing the query when `atom_keys: true` has been specified in `Xandra.start_link/1`:
{:ok, page} = Xandra.execute(conn, statement, _params = [])
Enum.to_list(page)
#=> [%{first_name: "Chandler", last_name: "Bing"},
#=> %{first_name: "Monica", last_name: "Geller"}]
Ensuring the write is written to the commit log and memtable of at least three replica nodes:
statement = "INSERT INTO users (first_name, last_name) VALUES ('Chandler', 'Bing')"
{:ok, %Xandra.Void{}} = Xandra.execute(conn, statement, _params = [], consistency: :three)
This function supports all options supported by `DBConnection.execute/4`; for
example, if the `conn` connection was started with `pool: DBConnection.Poolboy`,
then the `:pool` option would have to be passed here as well:
statement = "DELETE FROM users WHERE first_name = 'Chandler'"
{:ok, %Xandra.Void{}} = Xandra.execute(conn, statement, _params = [], pool: DBConnection.Poolboy)
## Paging
Since `execute/4` supports the `:paging_state` option, it is possible to manually
implement paging. For example, given the following prepared query:
prepared = Xandra.prepare!(conn, "SELECT first_name FROM users")
We can now execute such query with a specific page size using the `:page_size`
option:
{:ok, %Xandra.Page{} = page} = Xandra.execute(conn, prepared, [], page_size: 2)
Since `:page_size` is `2`, `page` will contain at most `2` rows:
Enum.to_list(page)
#=> [%{"first_name" => "Ross"}, %{"first_name" => "Rachel"}]
Now, we can pass `page.paging_state` as the value of the `:paging_state` option to let the paging
start from where we left off:
{:ok, %Xandra.Page{} = new_page} = Xandra.execute(conn, prepared, [], page_size: 2, paging_state: page.paging_state)
Enum.to_list(new_page)
#=> [%{"first_name" => "Joey"}, %{"first_name" => "Phoebe"}]
However, using `:paging_state` and `:page_size` directly with `execute/4` is not
recommended when the intent is to "stream" a query. For that, it's recommended
to use `stream_pages!/4`. Also note that if the `:paging_state` option is set to `nil`,
meaning there are no more pages to fetch, an `ArgumentError` exception will be raised;
be sure to check for this with `page.paging_state != nil`.
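A defensive form of that check might look like this (a sketch):

    case page.paging_state do
      nil ->
        :no_more_pages

      paging_state ->
        Xandra.execute(conn, prepared, [], page_size: 2, paging_state: paging_state)
    end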
"""
@spec execute(conn, statement | Prepared.t(), values, Keyword.t()) ::
{:ok, result} | {:error, error}
def execute(conn, query, params, options)
def execute(conn, statement, params, options) when is_binary(statement) do
query = %Simple{statement: statement}
execute_with_retrying(conn, query, params, validate_paging_state(options))
end
def execute(conn, %Prepared{} = prepared, params, options) do
execute_with_retrying(conn, prepared, params, validate_paging_state(options))
end
@doc """
Executes the given simple query, prepared query, or batch query, raising if
there's an error.
This function behaves exactly like `execute/3`, except that it returns
successful results directly and raises on errors.
## Examples
Xandra.execute!(conn, "INSERT INTO users (name, age) VALUES ('Jane', 29)")
#=> %Xandra.Void{}
"""
@spec execute!(conn, statement | Prepared.t(), values) :: result | no_return
@spec execute!(conn, Batch.t(), Keyword.t()) :: Xandra.Void.t() | no_return
def execute!(conn, query, params_or_options \\ []) do
case execute(conn, query, params_or_options) do
{:ok, result} -> result
{:error, exception} -> raise(exception)
end
end
@doc """
Executes the given simple query, prepared query, or batch query, raising if
there's an error.
This function behaves exactly like `execute/4`, except that it returns
successful results directly and raises on errors.
## Examples
statement = "INSERT INTO users (name, age) VALUES ('John', 43)"
Xandra.execute!(conn, statement, _params = [], consistency: :quorum)
#=> %Xandra.Void{}
"""
@spec execute!(conn, statement | Prepared.t(), values, Keyword.t()) :: result | no_return
def execute!(conn, query, params, options) do
case execute(conn, query, params, options) do
{:ok, result} -> result
{:error, exception} -> raise(exception)
end
end
@doc """
Acquires a locked connection from `conn` and executes `fun`, passing that
connection as its argument.
All options are forwarded to `DBConnection.run/3` (and thus some of them to
the underlying pool).
The return value of this function is the return value of `fun`.
## Examples
Preparing a query and executing it on the same connection:
Xandra.run(conn, fn conn ->
prepared = Xandra.prepare!(conn, "INSERT INTO users (name, age) VALUES (:name, :age)")
Xandra.execute!(conn, prepared, %{"name" => "John", "age" => 84})
end)
"""
@spec run(conn, Keyword.t(), (conn -> result)) :: result when result: var
def run(conn, options \\ [], fun) when is_function(fun, 1) do
DBConnection.run(conn, fun, options)
end
defp reprepare_queries(conn, [%Simple{} | rest], options) do
reprepare_queries(conn, rest, options)
end
defp reprepare_queries(conn, [%Prepared{statement: statement} | rest], options) do
with {:ok, _prepared} <- prepare(conn, statement, Keyword.put(options, :force, true)) do
reprepare_queries(conn, rest, options)
end
end
defp reprepare_queries(_conn, [], _options) do
:ok
end
defp validate_paging_state(options) do
case Keyword.fetch(options, :paging_state) do
{:ok, nil} ->
raise ArgumentError, "no more pages are available"
{:ok, value} when not is_binary(value) ->
raise ArgumentError,
"expected a binary as the value of the :paging_state option, " <>
"got: #{inspect(value)}"
_other ->
maybe_put_paging_state(options)
end
end
defp maybe_put_paging_state(options) do
case Keyword.pop(options, :cursor) do
{%Page{paging_state: nil}, _options} ->
raise ArgumentError, "no more pages are available"
{%Page{paging_state: paging_state}, options} ->
IO.warn("the :cursor option is deprecated, please use :paging_state instead")
Keyword.put(options, :paging_state, paging_state)
{nil, options} ->
options
{other, _options} ->
raise ArgumentError,
"expected a Xandra.Page struct as the value of the :cursor option, " <>
"got: #{inspect(other)}"
end
end
defp execute_with_retrying(conn, query, params, options) do
case Keyword.pop(options, :retry_strategy) do
{nil, options} ->
execute_without_retrying(conn, query, params, options)
{retry_strategy, options} ->
execute_with_retrying(conn, query, params, options, retry_strategy)
end
end
defp execute_with_retrying(conn, query, params, options, retry_strategy) do
with {:error, reason} <- execute_without_retrying(conn, query, params, options) do
{retry_state, options} =
Keyword.pop_lazy(options, :retrying_state, fn ->
retry_strategy.new(options)
end)
case retry_strategy.retry(reason, options, retry_state) do
:error ->
{:error, reason}
{:retry, new_options, new_retry_state} ->
new_options = Keyword.put(new_options, :retrying_state, new_retry_state)
execute_with_retrying(conn, query, params, new_options, retry_strategy)
other ->
raise ArgumentError,
"invalid return value #{inspect(other)} from " <>
"retry strategy #{inspect(retry_strategy)} " <>
"with state #{inspect(retry_state)}"
end
end
end
defp execute_without_retrying(conn, %Batch{} = batch, nil, options) do
run(conn, options, fn conn ->
case DBConnection.execute(conn, batch, nil, options) do
{:ok, %Error{reason: :unprepared}} ->
with :ok <- reprepare_queries(conn, batch.queries, options) do
execute(conn, batch, options)
end
{:ok, %Error{} = error} ->
{:error, error}
other ->
other
end
end)
end
defp execute_without_retrying(conn, %Simple{} = query, params, options) do
with {:ok, %Error{} = error} <- DBConnection.execute(conn, query, params, options) do
{:error, error}
end
end
defp execute_without_retrying(conn, %Prepared{} = prepared, params, options) do
run(conn, options, fn conn ->
case DBConnection.execute(conn, prepared, params, options) do
{:ok, %Error{reason: :unprepared}} ->
# We can ignore the newly returned prepared query since it will have the
# same id of the query we are repreparing.
case DBConnection.prepare_execute(
conn,
prepared,
params,
Keyword.put(options, :force, true)
) do
{:ok, _prepared, %Error{} = error} ->
{:error, error}
{:ok, _prepared, result} ->
{:ok, result}
{:error, _reason} = error ->
error
end
{:ok, %Error{} = error} ->
{:error, error}
other ->
other
end
end)
end
defp parse_start_options(options) do
cluster? = options[:pool] == Xandra.Cluster
Enum.flat_map(options, fn
{:nodes, nodes} when cluster? ->
[nodes: Enum.map(nodes, &parse_node/1)]
{:nodes, [string]} ->
{address, port} = parse_node(string)
[address: address, port: port]
{:nodes, _nodes} ->
raise ArgumentError,
"multi-node use requires the :pool option to be set to Xandra.Cluster"
{_key, _value} = option ->
[option]
end)
end
defp parse_node(string) do
case String.split(string, ":", parts: 2) do
[address, port] ->
case Integer.parse(port) do
{port, ""} ->
{String.to_charlist(address), port}
_ ->
raise ArgumentError, "invalid item #{inspect(string)} in the :nodes option"
end
[address] ->
{String.to_charlist(address), @default_port}
end
end
end
| 39.653595 | 122 | 0.682572 |
1c1868d468e11f3372ba5d58d2fcbca5ded36926 | 1,049 | ex | Elixir | {{cookiecutter.project_slug}}/apps/{{cookiecutter.phoenix_app_slug}}_web/test/support/channel_case.ex | ibakami/cookiecutter-elixir-phoenix | 672b9e05f40b01a810a073a9712fc3300c396e40 | [
"MIT"
] | 14 | 2019-08-01T07:55:50.000Z | 2021-04-24T09:14:09.000Z | {{cookiecutter.project_slug}}/apps/{{cookiecutter.phoenix_app_slug}}_web/test/support/channel_case.ex | ibakami/cookiecutter-elixir-phoenix | 672b9e05f40b01a810a073a9712fc3300c396e40 | [
"MIT"
] | 1 | 2019-08-02T03:03:40.000Z | 2019-08-02T03:03:40.000Z | {{cookiecutter.project_slug}}/apps/{{cookiecutter.phoenix_app_slug}}_web/test/support/channel_case.ex | ibakami/cookiecutter-elixir-phoenix | 672b9e05f40b01a810a073a9712fc3300c396e40 | [
"MIT"
] | null | null | null | defmodule {{cookiecutter.phoenix_app_module}}Web.ChannelCase do
@moduledoc """
This module defines the test case to be used by
channel tests.
Such tests rely on `Phoenix.ChannelTest` and also
import other functionality to make it easier
to build common data structures and query the data layer.
Finally, if the test case interacts with the database,
it cannot be async. For this reason, every test runs
inside a transaction which is reset at the beginning
of the test unless the test case is marked as async.
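For example, a test module that does not interact with the database can opt in
to running concurrently (a sketch):

    defmodule {{cookiecutter.phoenix_app_module}}Web.MyChannelTest do
      use {{cookiecutter.phoenix_app_module}}Web.ChannelCase, async: true
      # ...
    end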
"""
use ExUnit.CaseTemplate
alias Ecto.Adapters.SQL.Sandbox
using do
quote do
# Import conveniences for testing with channels
use Phoenix.ChannelTest
# The default endpoint for testing
@endpoint {{cookiecutter.phoenix_app_module}}Web.Endpoint
end
end
setup tags do
:ok = Sandbox.checkout({{cookiecutter.phoenix_app_module}}.Repo)
unless tags[:async] do
Sandbox.mode({{cookiecutter.phoenix_app_module}}.Repo, {:shared, self()})
end
:ok
end
end
| 26.225 | 79 | 0.7245 |
1c1898f0a07410eb63d60799083c3cbf990572ae | 2,177 | exs | Elixir | apps/tracing_1/config/prod.exs | WhiteRookPL/elixir-fire-brigade-workshop | 1c6183339fc623842a09f4d10be75bcecf2c37e7 | [
"MIT"
] | 14 | 2017-08-09T14:21:47.000Z | 2022-03-11T04:10:49.000Z | apps/tracing_1/config/prod.exs | nicholasjhenry/elixir-fire-brigade-workshop | 1c6183339fc623842a09f4d10be75bcecf2c37e7 | [
"MIT"
] | null | null | null | apps/tracing_1/config/prod.exs | nicholasjhenry/elixir-fire-brigade-workshop | 1c6183339fc623842a09f4d10be75bcecf2c37e7 | [
"MIT"
] | 15 | 2017-09-05T15:43:53.000Z | 2020-04-13T16:20:18.000Z | use Mix.Config
# For production, we often load configuration from external
# sources, such as your system environment. For this reason,
# you won't find the :http configuration below, but set inside
# RestApiWeb.Endpoint.init/2 when load_from_system_env is
# true. Any dynamic configuration should be done there.
#
# Don't forget to configure the url host to something meaningful,
# Phoenix uses this information when generating URLs.
#
# Finally, we also include the path to a cache manifest
# containing the digested version of static files. This
# manifest is generated by the mix phx.digest task
# which you typically run after static files are built.
config :rest_api, RestApiWeb.Endpoint,
load_from_system_env: true,
url: [host: "example.com", port: 80],
cache_static_manifest: "priv/static/cache_manifest.json"
# Do not print debug messages in production
config :logger, level: :info
# ## SSL Support
#
# To get SSL working, you will need to add the `https` key
# to the previous section and set your `:url` port to 443:
#
# config :rest_api, RestApiWeb.Endpoint,
# ...
# url: [host: "example.com", port: 443],
# https: [:inet6,
# port: 443,
# keyfile: System.get_env("SOME_APP_SSL_KEY_PATH"),
# certfile: System.get_env("SOME_APP_SSL_CERT_PATH")]
#
# Where those two env variables return an absolute path to
# the key and cert in disk or a relative path inside priv,
# for example "priv/ssl/server.key".
#
# We also recommend setting `force_ssl`, ensuring no data is
# ever sent via http, always redirecting to https:
#
# config :rest_api, RestApiWeb.Endpoint,
# force_ssl: [hsts: true]
#
# Check `Plug.SSL` for all available options in `force_ssl`.
# ## Using releases
#
# If you are doing OTP releases, you need to instruct Phoenix
# to start the server for all endpoints:
#
# config :phoenix, :serve_endpoints, true
#
# Alternatively, you can configure exactly which server to
# start per endpoint:
#
# config :rest_api, RestApiWeb.Endpoint, server: true
#
# Finally import the config/prod.secret.exs
# which should be versioned separately.
import_config "prod.secret.exs"
| 33.492308 | 67 | 0.722095 |
1c18af5b57622f00d21972909eaf4e95c840f567 | 1,565 | ex | Elixir | lib/ex_admin/sidebar.ex | andriybohdan/ex_admin | e31c725078ac4e7390204a87d96360a21ffe7b90 | [
"MIT"
] | 1 | 2018-08-30T20:20:56.000Z | 2018-08-30T20:20:56.000Z | lib/ex_admin/sidebar.ex | 8thlight/ex_admin | 314d4068270c47799ec54f719073a565222bcfad | [
"MIT"
] | null | null | null | lib/ex_admin/sidebar.ex | 8thlight/ex_admin | 314d4068270c47799ec54f719073a565222bcfad | [
"MIT"
] | 2 | 2018-07-12T07:44:50.000Z | 2018-07-19T11:45:09.000Z | Code.ensure_compiled(ExAdmin.Utils)
defmodule ExAdmin.Sidebar do
@moduledoc false
require Logger
require Ecto.Query
use Xain
def sidebars_visible?(_conn, %{sidebars: []}), do: false
def sidebars_visible?(conn, %{sidebars: sidebars}) do
Enum.reduce sidebars, false, fn({_, opts, _}, acc) ->
acc || visible?(conn, opts)
end
end
def sidebar_view(_conn, %{sidebars: []}, _), do: ""
def sidebar_view(conn, %{sidebars: sidebars}, resource) do
for sidebar <- sidebars do
_sidebar_view(conn, sidebar, resource)
end
end
defp _sidebar_view(conn, {name, opts, {mod, fun}}, resource) do
if visible? conn, opts do
ExAdmin.Theme.Helpers.theme_module(conn, Layout).sidebar_view(
conn, {name, opts, {mod, fun}}, resource)
else
""
end
end
def visible?(conn, opts) do
Phoenix.Controller.action_name(conn)
|> _visible?(Enum.into opts, %{})
end
def _visible?(action, %{only: only}) when is_atom(only) do
if action == only, do: true, else: false
end
def _visible?(action, %{only: only}) when is_list(only) do
if action in only, do: true, else: false
end
def _visible?(action, %{except: except}) when is_atom(except) do
if action == except, do: false, else: true
end
def _visible?(action, %{except: except}) when is_list(except) do
if action in except, do: false, else: true
end
def _visible?(_, _), do: true
def get_actions(item, opts) do
case opts[item] || [] do
atom when is_atom(atom) -> [atom]
other -> other
end
end
end
| 26.982759 | 68 | 0.647284 |
1c18b8a81ac0e7c9453b3a24b99f509bcfd03564 | 516 | ex | Elixir | lib/type_check/default_overrides/range.ex | kkentzo/elixir-type_check | bec089445286e4a420d653276e7ba96dd1016876 | [
"MIT"
] | 291 | 2020-07-07T18:14:46.000Z | 2022-03-29T22:36:48.000Z | lib/type_check/default_overrides/range.ex | kkentzo/elixir-type_check | bec089445286e4a420d653276e7ba96dd1016876 | [
"MIT"
] | 71 | 2020-07-07T11:50:37.000Z | 2022-03-23T21:20:54.000Z | lib/type_check/default_overrides/range.ex | kkentzo/elixir-type_check | bec089445286e4a420d653276e7ba96dd1016876 | [
"MIT"
] | 12 | 2020-10-07T16:28:22.000Z | 2022-02-17T16:31:05.000Z | defmodule TypeCheck.DefaultOverrides.Range do
use TypeCheck
@type! limit() :: integer()
if Version.compare(System.version(), "1.12.0") == :lt do
@type! t() :: %Elixir.Range{first: limit(), last: limit()}
@type! t(first, last) :: %Elixir.Range{first: first, last: last}
else
@type! step() :: pos_integer() | neg_integer()
@type! t() :: %Elixir.Range{first: limit(), last: limit(), step: step()}
@type! t(first, last) :: %Elixir.Range{first: first, last: last, step: step()}
end
end
| 28.666667 | 82 | 0.616279 |
1c18db0f28f85ab16e81a6097092692353763801 | 1,256 | ex | Elixir | lib/awesome_elixir/web/views/error_helpers.ex | sprql/awesome_elixir | ae1a372bf3060142a546aaf6cb28ffda491d9fa0 | [
"MIT"
] | 1 | 2017-04-13T05:37:08.000Z | 2017-04-13T05:37:08.000Z | lib/awesome_elixir/web/views/error_helpers.ex | sprql/awesome_elixir | ae1a372bf3060142a546aaf6cb28ffda491d9fa0 | [
"MIT"
] | null | null | null | lib/awesome_elixir/web/views/error_helpers.ex | sprql/awesome_elixir | ae1a372bf3060142a546aaf6cb28ffda491d9fa0 | [
"MIT"
] | null | null | null | defmodule AwesomeElixir.Web.ErrorHelpers do
@moduledoc """
Conveniences for translating and building error messages.
"""
use Phoenix.HTML
@doc """
Generates a tag for inlined form input errors.
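For example, in an `.eex` form template (a sketch):

    <%= error_tag f, :email %>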
"""
def error_tag(form, field) do
if error = form.errors[field] do
content_tag :span, translate_error(error), class: "help-block"
end
end
@doc """
Translates an error message using gettext.
"""
def translate_error({msg, opts}) do
# Because error messages were defined within Ecto, we must
# call the Gettext module passing our Gettext backend. We
# also use the "errors" domain as translations are placed
# in the errors.po file.
# Ecto will pass the :count keyword if the error message is
# meant to be pluralized.
# On your own code and templates, depending on whether you
# need the message to be pluralized or not, this could be
# written simply as:
#
# dngettext "errors", "1 file", "%{count} files", count
# dgettext "errors", "is invalid"
#
if count = opts[:count] do
Gettext.dngettext(AwesomeElixir.Web.Gettext, "errors", msg, msg, count, opts)
else
Gettext.dgettext(AwesomeElixir.Web.Gettext, "errors", msg, opts)
end
end
end
| 30.634146 | 83 | 0.672771 |
1c18edbb3dbaacfbf4d1d8a400b7d4ea5ff99652 | 491 | ex | Elixir | lib/history_web/views/error_view.ex | fremantle-industries/history | a8a33744279ff4ca62620785f9a2e9c0c99e4de7 | [
"MIT"
] | 20 | 2021-08-06T01:09:48.000Z | 2022-03-28T18:44:56.000Z | lib/history_web/views/error_view.ex | fremantle-industries/history | a8a33744279ff4ca62620785f9a2e9c0c99e4de7 | [
"MIT"
] | 13 | 2021-08-21T21:17:02.000Z | 2022-03-27T06:33:51.000Z | lib/history_web/views/error_view.ex | fremantle-industries/history | a8a33744279ff4ca62620785f9a2e9c0c99e4de7 | [
"MIT"
] | 2 | 2021-09-23T11:31:59.000Z | 2022-01-09T16:19:35.000Z | defmodule HistoryWeb.ErrorView do
use HistoryWeb, :view
# If you want to customize a particular status code
# for a certain format, you may uncomment below.
# def render("500.html", _assigns) do
# "Internal Server Error"
# end
# By default, Phoenix returns the status message from
# the template name. For example, "404.html" becomes
# "Not Found".
def template_not_found(template, _assigns) do
Phoenix.Controller.status_message_from_template(template)
end
end
| 28.882353 | 61 | 0.735234 |
1c1916f5c5a1b170b9f5c658c96fb3376392cd55 | 2,302 | ex | Elixir | clients/monitoring/lib/google_api/monitoring/v3/model/monitored_resource_metadata.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/monitoring/lib/google_api/monitoring/v3/model/monitored_resource_metadata.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/monitoring/lib/google_api/monitoring/v3/model/monitored_resource_metadata.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Monitoring.V3.Model.MonitoredResourceMetadata do
@moduledoc """
Auxiliary metadata for a MonitoredResource object. MonitoredResource objects contain the minimum set of information to uniquely identify a monitored resource instance. There is some other useful auxiliary metadata. Monitoring and Logging use an ingestion pipeline to extract metadata for cloud resources of all types, and store the metadata in this message.
## Attributes
* `systemLabels` (*type:* `map()`, *default:* `nil`) - Output only. Values for predefined system metadata labels. System labels are a kind of metadata extracted by Google, including "machine_image", "vpc", "subnet_id", "security_group", "name", etc. System label values can be only strings, Boolean values, or a list of strings. For example: { "name": "my-test-instance", "security_group": ["a", "b", "c"], "spot_instance": false }
* `userLabels` (*type:* `map()`, *default:* `nil`) - Output only. A map of user-defined metadata labels.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:systemLabels => map(),
:userLabels => map()
}
field(:systemLabels, type: :map)
field(:userLabels, type: :map)
end
defimpl Poison.Decoder, for: GoogleApi.Monitoring.V3.Model.MonitoredResourceMetadata do
def decode(value, options) do
GoogleApi.Monitoring.V3.Model.MonitoredResourceMetadata.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Monitoring.V3.Model.MonitoredResourceMetadata do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
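A hedged usage sketch of the model above; it assumes the `google_api_monitoring` package is compiled alongside, and the field values mirror the moduledoc example:

```elixir
# Illustrative only: systemLabels values may be strings, booleans, or
# lists of strings, as described in the moduledoc above.
metadata = %GoogleApi.Monitoring.V3.Model.MonitoredResourceMetadata{
  systemLabels: %{
    "name" => "my-test-instance",
    "security_group" => ["a", "b", "c"],
    "spot_instance" => false
  },
  userLabels: %{"team" => "sre"}
}
```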
| 46.04 | 436 | 0.743267 |
1c194225292a238ba3abe1d13ab8ba6f3221ce87 | 746 | ex | Elixir | lib/absinthe/blueprint/document/fragment/inline.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 4,101 | 2016-03-02T03:49:20.000Z | 2022-03-31T05:46:01.000Z | lib/absinthe/blueprint/document/fragment/inline.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 889 | 2016-03-02T16:06:59.000Z | 2022-03-31T20:24:12.000Z | lib/absinthe/blueprint/document/fragment/inline.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 564 | 2016-03-02T07:49:59.000Z | 2022-03-06T14:40:59.000Z | defmodule Absinthe.Blueprint.Document.Fragment.Inline do
@moduledoc false
alias Absinthe.Blueprint
@enforce_keys [:type_condition]
defstruct [
:type_condition,
selections: [],
directives: [],
source_location: nil,
# Populated by phases
schema_node: nil,
complexity: nil,
flags: %{},
errors: []
]
@type t :: %__MODULE__{
directives: [Blueprint.Directive.t()],
errors: [Absinthe.Phase.Error.t()],
flags: Blueprint.flags_t(),
selections: [Blueprint.Document.selection_t()],
schema_node: nil | Absinthe.Type.t(),
source_location: nil | Blueprint.SourceLocation.t(),
type_condition: Blueprint.TypeReference.Name.t()
}
end
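A small self-contained sketch of the `@enforce_keys` behavior this struct relies on (stand-in module name, not part of Absinthe):

```elixir
defmodule Demo.InlineFragment do
  @enforce_keys [:type_condition]
  defstruct [:type_condition, selections: [], directives: []]
end

# Defaults fill in for the optional keys; omitting :type_condition
# raises ArgumentError at construction time.
frag = %Demo.InlineFragment{type_condition: "User"}
# frag.selections == []
```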
| 25.724138 | 62 | 0.628686 |
1c196531f398125226859cfc081b15ff1c5d3b13 | 2,088 | ex | Elixir | clients/video_intelligence/lib/google_api/video_intelligence/v1/model/google_cloud_videointelligence_v1__text_detection_config.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/video_intelligence/lib/google_api/video_intelligence/v1/model/google_cloud_videointelligence_v1__text_detection_config.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/video_intelligence/lib/google_api/video_intelligence/v1/model/google_cloud_videointelligence_v1__text_detection_config.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1_TextDetectionConfig do
@moduledoc """
Config for TEXT_DETECTION.
## Attributes
* `languageHints` (*type:* `list(String.t)`, *default:* `nil`) - Language hint can be specified if the language to be detected is known a
priori. It can increase the accuracy of the detection. Language hint must
be language code in BCP-47 format.
Automatic language detection is performed if no hint is provided.
* `model` (*type:* `String.t`, *default:* `nil`) - Model to use for text detection.
Supported values: "builtin/stable" (the default if unset) and
"builtin/latest".
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:languageHints => list(String.t()),
:model => String.t()
}
field(:languageHints, type: :list)
field(:model)
end
defimpl Poison.Decoder,
for: GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1_TextDetectionConfig do
def decode(value, options) do
GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1_TextDetectionConfig.decode(
value,
options
)
end
end
defimpl Poison.Encoder,
for: GoogleApi.VideoIntelligence.V1.Model.GoogleCloudVideointelligenceV1_TextDetectionConfig do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 34.229508 | 141 | 0.736111 |
1c19797f5d1eb29807a50582f857e04df501b962 | 1,235 | ex | Elixir | lib/codes/codes_z14.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | lib/codes/codes_z14.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | lib/codes/codes_z14.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | defmodule IcdCode.ICDCode.Codes_Z14 do
alias IcdCode.ICDCode
def _Z1401 do
%ICDCode{full_code: "Z1401",
category_code: "Z14",
short_code: "01",
full_name: "Asymptomatic hemophilia A carrier",
short_name: "Asymptomatic hemophilia A carrier",
category_name: "Asymptomatic hemophilia A carrier"
}
end
def _Z1402 do
%ICDCode{full_code: "Z1402",
category_code: "Z14",
short_code: "02",
full_name: "Symptomatic hemophilia A carrier",
short_name: "Symptomatic hemophilia A carrier",
category_name: "Symptomatic hemophilia A carrier"
}
end
def _Z141 do
%ICDCode{full_code: "Z141",
category_code: "Z14",
short_code: "1",
full_name: "Cystic fibrosis carrier",
short_name: "Cystic fibrosis carrier",
category_name: "Cystic fibrosis carrier"
}
end
def _Z148 do
%ICDCode{full_code: "Z148",
category_code: "Z14",
short_code: "8",
full_name: "Genetic carrier of other disease",
short_name: "Genetic carrier of other disease",
category_name: "Genetic carrier of other disease"
}
end
end
| 28.72093 | 60 | 0.610526 |
1c19a88787f72d5804ad594828a59fb9c4d8891a | 692 | exs | Elixir | apps/gapi/priv/repo/seeds.exs | rolandolucio/gapi_umbrella | b1e690804d7d2a79d3ece7ca12edf65197ca7145 | [
"MIT"
] | null | null | null | apps/gapi/priv/repo/seeds.exs | rolandolucio/gapi_umbrella | b1e690804d7d2a79d3ece7ca12edf65197ca7145 | [
"MIT"
] | null | null | null | apps/gapi/priv/repo/seeds.exs | rolandolucio/gapi_umbrella | b1e690804d7d2a79d3ece7ca12edf65197ca7145 | [
"MIT"
] | null | null | null | # Script for populating the database. You can run it as:
#
# mix run priv/repo/seeds.exs
#
# Inside the script, you can read and write to any of your
# repositories directly:
#
# Gapi.Repo.insert!(%Gapi.SomeSchema{})
#
# We recommend using the bang functions (`insert!`, `update!`
# and so on) as they will fail if something goes wrong.
alias Gapi.Accounts
alias Gapi.Posts
Accounts.create_user(%{name: "John Doe", email: "example@test.com"})
Accounts.create_user(%{name: "Jackie rossy", email: "jrossy@test.com"})
for _ <- 1..10 do
Posts.create_post(%{
title: Faker.Lorem.sentence,
body: Faker.Lorem.paragraph,
    user_id: Enum.random([1, 2])
})
end
| 25.62963 | 71 | 0.686416 |
1c19b769d2c29e92d278c8418ded4978ddeb049d | 6,845 | exs | Elixir | test/chat_api_web/controllers/forwarding_address_controller_test.exs | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 4,942 | 2020-07-20T22:35:28.000Z | 2022-03-31T15:38:51.000Z | test/chat_api_web/controllers/forwarding_address_controller_test.exs | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 552 | 2020-07-22T01:39:04.000Z | 2022-02-01T00:26:35.000Z | test/chat_api_web/controllers/forwarding_address_controller_test.exs | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 396 | 2020-07-22T19:27:48.000Z | 2022-03-31T05:25:24.000Z | defmodule ChatApiWeb.ForwardingAddressControllerTest do
use ChatApiWeb.ConnCase, async: true
import ChatApi.Factory
alias ChatApi.ForwardingAddresses.ForwardingAddress
@update_attrs %{
forwarding_email_address: "updated@forwarding.com",
source_email_address: "updated@source.com",
description: "some updated description",
state: "some updated state"
}
@invalid_attrs %{
forwarding_email_address: nil
}
setup %{conn: conn} do
account = insert(:account)
user = insert(:user, account: account)
forwarding_address = insert(:forwarding_address, account: account)
conn = put_req_header(conn, "accept", "application/json")
authed_conn = Pow.Plug.assign_current_user(conn, user, [])
{:ok,
conn: conn,
authed_conn: authed_conn,
account: account,
forwarding_address: forwarding_address}
end
describe "index" do
test "lists all forwarding addresses", %{
authed_conn: authed_conn,
forwarding_address: forwarding_address
} do
resp = get(authed_conn, Routes.forwarding_address_path(authed_conn, :index))
ids = json_response(resp, 200)["data"] |> Enum.map(& &1["id"])
assert ids == [forwarding_address.id]
end
end
describe "show forwarding_address" do
test "shows forwarding_address by id", %{
account: account,
authed_conn: authed_conn
} do
forwarding_address = insert(:forwarding_address, %{account: account})
conn =
get(
authed_conn,
Routes.forwarding_address_path(authed_conn, :show, forwarding_address.id)
)
assert json_response(conn, 200)["data"]
end
test "renders 404 when asking for another user's forwarding_address", %{
authed_conn: authed_conn
} do
# Create a new account and give it a forwarding_address
another_account = insert(:account)
forwarding_address =
insert(:forwarding_address, %{
forwarding_email_address: "another@chat.papercups.io",
account: another_account
})
# Using the original session, try to delete the new account's forwarding_address
conn =
get(
authed_conn,
Routes.forwarding_address_path(authed_conn, :show, forwarding_address.id)
)
assert json_response(conn, 404)
end
end
describe "create forwarding_address" do
test "renders forwarding_address when data is valid", %{
authed_conn: authed_conn,
account: account
} do
resp =
post(authed_conn, Routes.forwarding_address_path(authed_conn, :create),
forwarding_address:
params_for(:forwarding_address,
account: account,
forwarding_email_address: "test@chat.papercups.io"
)
)
assert %{"id" => id} = json_response(resp, 201)["data"]
resp = get(authed_conn, Routes.forwarding_address_path(authed_conn, :show, id))
account_id = account.id
assert %{
"id" => ^id,
"account_id" => ^account_id,
"object" => "forwarding_address",
"forwarding_email_address" => "test@chat.papercups.io"
} = json_response(resp, 200)["data"]
end
test "renders errors when data is invalid", %{authed_conn: authed_conn} do
conn =
post(authed_conn, Routes.forwarding_address_path(authed_conn, :create),
forwarding_address: @invalid_attrs
)
assert json_response(conn, 422)["errors"] != %{}
end
end
describe "update forwarding_address" do
test "renders forwarding_address when data is valid", %{
authed_conn: authed_conn,
forwarding_address: %ForwardingAddress{id: id} = forwarding_address
} do
conn =
put(authed_conn, Routes.forwarding_address_path(authed_conn, :update, forwarding_address),
forwarding_address: @update_attrs
)
assert %{"id" => ^id} = json_response(conn, 200)["data"]
conn = get(authed_conn, Routes.forwarding_address_path(authed_conn, :show, id))
account_id = forwarding_address.account_id
assert %{
"id" => ^id,
"account_id" => ^account_id,
"object" => "forwarding_address",
"forwarding_email_address" => "updated@forwarding.com",
"source_email_address" => "updated@source.com",
"description" => "some updated description",
"state" => "some updated state"
} = json_response(conn, 200)["data"]
end
test "renders errors when data is invalid", %{
authed_conn: authed_conn,
forwarding_address: forwarding_address
} do
conn =
put(authed_conn, Routes.forwarding_address_path(authed_conn, :update, forwarding_address),
forwarding_address: @invalid_attrs
)
assert json_response(conn, 422)["errors"] != %{}
end
test "renders 404 when editing another account's forwarding_address",
%{authed_conn: authed_conn} do
# Create a new account and give it a forwarding_address
another_account = insert(:account)
forwarding_address =
insert(:forwarding_address, %{
forwarding_email_address: "forwarding@another.co",
account: another_account
})
# Using the original session, try to update the new account's forwarding_address
conn =
put(
authed_conn,
Routes.forwarding_address_path(authed_conn, :update, forwarding_address),
forwarding_address: @update_attrs
)
assert json_response(conn, 404)
end
end
describe "delete forwarding_address" do
test "deletes chosen forwarding_address", %{
authed_conn: authed_conn,
forwarding_address: forwarding_address
} do
conn =
delete(
authed_conn,
Routes.forwarding_address_path(authed_conn, :delete, forwarding_address)
)
assert response(conn, 204)
assert_error_sent(404, fn ->
get(authed_conn, Routes.forwarding_address_path(authed_conn, :show, forwarding_address))
end)
end
test "renders 404 when deleting another account's forwarding_address",
%{authed_conn: authed_conn} do
# Create a new account and give it a forwarding_address
another_account = insert(:account)
forwarding_address =
insert(:forwarding_address, %{
forwarding_email_address: "another@forwarding.co",
account: another_account
})
# Using the original session, try to delete the new account's forwarding_address
conn =
delete(
authed_conn,
Routes.forwarding_address_path(authed_conn, :delete, forwarding_address)
)
assert json_response(conn, 404)
end
end
end
| 31.113636 | 98 | 0.646603 |
1c19c5e8739603162c206432e42baf3ef9945d3b | 1,170 | ex | Elixir | apps/artemis/lib/artemis/drivers/ibm_cloudant/delete_all.ex | artemis-platform/artemis_dashboard | 5ab3f5ac4c5255478bbebf76f0e43b44992e3cab | [
"MIT"
] | 9 | 2019-08-19T19:56:34.000Z | 2022-03-22T17:56:38.000Z | apps/artemis/lib/artemis/drivers/ibm_cloudant/delete_all.ex | chrislaskey/atlas_dashboard | 9009ef5aac8fefba126fa7d3e3b82d1b610ee6fe | [
"MIT"
] | 7 | 2019-07-12T21:41:01.000Z | 2020-08-17T21:29:22.000Z | apps/artemis/lib/artemis/drivers/ibm_cloudant/delete_all.ex | chrislaskey/atlas_dashboard | 9009ef5aac8fefba126fa7d3e3b82d1b610ee6fe | [
"MIT"
] | 2 | 2019-07-05T22:51:47.000Z | 2019-08-19T19:56:37.000Z | defmodule Artemis.Drivers.IBMCloudant.DeleteAll do
alias Artemis.Drivers.IBMCloudant
@moduledoc """
Deletes all IBM Cloudant databases on all IBM Cloudant hosts
"""
def call() do
hosts_config = IBMCloudant.Config.get_hosts_config!()
databases_config = IBMCloudant.Config.get_databases_config!()
databases_by_host = Enum.group_by(databases_config, &Keyword.fetch!(&1, :host))
results =
Enum.map(hosts_config, fn host_config ->
host_name = Keyword.fetch!(host_config, :name)
existing_databases = get_existing_databases(host_name)
expected_databases = Map.fetch!(databases_by_host, host_name)
Enum.map(expected_databases, fn database ->
database_name = Keyword.fetch!(database, :name)
if Enum.member?(existing_databases, database_name) do
{:ok, _} = IBMCloudant.Delete.call(host_config, database)
end
end)
end)
{:ok, results}
end
# Helpers
defp get_existing_databases(host) do
{:ok, databases} =
IBMCloudant.Request.call(%{
host: host,
method: :get,
path: "_all_dbs"
})
databases
end
end
| 26.590909 | 83 | 0.664103 |
1c19c90d9b1b66539411f4c14bca2d2edd2c6c01 | 4,283 | ex | Elixir | lib/krakex/api.ex | lukebelbina/krakex | a0cce787d13d000e12dca098531aeacc4548256f | [
"MIT"
] | 23 | 2018-01-29T15:46:16.000Z | 2022-02-24T05:35:37.000Z | lib/krakex/api.ex | lukebelbina/krakex | a0cce787d13d000e12dca098531aeacc4548256f | [
"MIT"
] | 9 | 2018-02-27T01:03:00.000Z | 2022-01-08T12:09:37.000Z | lib/krakex/api.ex | lukebelbina/krakex | a0cce787d13d000e12dca098531aeacc4548256f | [
"MIT"
] | 14 | 2018-08-10T12:48:14.000Z | 2022-02-27T19:10:27.000Z | defmodule Krakex.API do
@moduledoc """
Access to public and private APIs.
This module defines functions for building calls for the public and private
APIs and handles things such as request signing.
"""
@type response :: {:ok, term} | {:error, any}
alias Krakex.Client
@public_path "/0/public/"
@private_path "/0/private/"
@doc """
Access public API calls.
"""
@spec public_request(Client.t(), binary, keyword) :: response
def public_request(%Client{http_client: http_client} = client, resource, params \\ []) do
url = client.endpoint <> public_path(resource)
params = process_params(params)
response = http_client.get(url, params, [])
handle_http_response(response)
end
defmodule MissingConfigError do
@moduledoc """
Error raised when attempting to use private API functions without having specified
values for the API and/or private keys.
"""
defexception [:message]
end
defmodule MissingCredentialsError do
@moduledoc """
Error raised when attempting to use private API functions and the `Krakex.Client` struct
didn't specify values for the API and/or private keys.
"""
defexception [:message]
end
@doc """
Access private API calls.
It signs requests using the API and private keys.
It will raise a `Krakex.API.MissingCredentialsError` if provided a `Krakex.Client` struct
without values for either the API or private keys.
"""
@spec private_request(Client.t(), binary, keyword) :: response
def private_request(client, resource, params \\ [])
def private_request(%Client{key: key, secret: secret} = client, resource, params)
when is_binary(key) and is_binary(secret) do
path = private_path(resource)
nonce = nonce()
form_data = params |> Keyword.put(:nonce, nonce) |> process_params()
headers = [
{"Api-Key", key},
{"Api-Sign", signature(path, nonce, form_data, secret)}
]
response = client.http_client.post(client.endpoint <> path, form_data, headers)
handle_http_response(response)
end
def private_request(%Client{}, _, _) do
raise MissingCredentialsError, message: "the client is missing values for :key and/or :secret"
end
@doc false
@spec public_client() :: Client.t()
def public_client do
%Client{}
end
@doc false
@spec private_client() :: Client.t()
def private_client do
config = Application.get_all_env(:krakex)
case {config[:api_key], config[:private_key]} do
      {api_key, private_key} when is_binary(api_key) and is_binary(private_key) ->
        Client.new(api_key, private_key)
_ ->
raise MissingConfigError,
message: """
This API call requires an API key and a private key and I wasn't able to find them in your
mix config. You need to define them like this:
config :krakex,
api_key: "KRAKEN_API_KEY",
private_key: "KRAKEN_PRIVATE_KEY"
"""
end
end
defp public_path(resource) do
@public_path <> resource
end
defp private_path(resource) do
@private_path <> resource
end
defp nonce do
System.system_time() |> to_string()
end
defp signature(path, nonce, form_data, secret) do
params = URI.encode_query(form_data)
sha_sum = :crypto.hash(:sha256, nonce <> params)
mac_sum = :crypto.mac(:hmac, :sha512, secret, path <> sha_sum)
Base.encode64(mac_sum)
end
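In isolation, the signing scheme above is SHA-256 over `nonce <> encoded_params`, then HMAC-SHA-512 keyed with the secret over `path <> sha_sum`, then Base64. A sketch with made-up values, assuming OTP 22+ for `:crypto.mac/4`:

```elixir
path = "/0/private/Balance"
nonce = "1616492376594"
form_data = [nonce: nonce]
secret = "hypothetical-secret"

params = URI.encode_query(form_data)
sha_sum = :crypto.hash(:sha256, nonce <> params)
signature = Base.encode64(:crypto.mac(:hmac, :sha512, secret, path <> sha_sum))
# signature is an 88-character Base64 string (a 64-byte SHA-512 MAC)
```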
defp handle_http_response({:ok, response}), do: handle_api_response(response)
defp handle_http_response({:error, reason}), do: {:error, reason}
defp handle_api_response(%{"error" => [], "result" => result}), do: {:ok, result}
# Not sure if more than one error can occur - just take the first one.
defp handle_api_response(%{"error" => errors}), do: {:error, hd(errors)}
defp process_params(params) do
params
|> Enum.map(¶m_mapper/1)
|> List.flatten()
|> Enum.reject(&is_empty/1)
end
defp param_mapper({k, v}) do
cond do
Keyword.keyword?(v) ->
Enum.map(v, fn {mk, mv} -> {:"#{k}[#{mk}]", mv} end)
is_list(v) ->
{k, Enum.join(v, ",")}
true ->
{k, v}
end
end
defp is_empty({_, v}) when v in [nil, ""], do: true
defp is_empty(_), do: false
end
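The parameter flattening implemented by `process_params/1` above, reproduced as a self-contained pipeline with illustrative input:

```elixir
# Nested keyword lists become key[subkey] pairs, plain lists are joined
# with commas, and empty values are dropped.
mapper = fn
  {k, v} when is_list(v) ->
    if Keyword.keyword?(v) do
      Enum.map(v, fn {mk, mv} -> {:"#{k}[#{mk}]", mv} end)
    else
      {k, Enum.join(v, ",")}
    end

  {k, v} ->
    {k, v}
end

[close: [ordertype: "limit", price: 9001], assets: ["XBT", "ETH"], userref: nil]
|> Enum.map(mapper)
|> List.flatten()
|> Enum.reject(fn {_, v} -> v in [nil, ""] end)
# => [{:"close[ordertype]", "limit"}, {:"close[price]", 9001}, {:assets, "XBT,ETH"}]
```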
| 27.993464 | 100 | 0.660518 |
1c1a065a65bd6f43331ea9c78c99c977f04ec685 | 1,391 | ex | Elixir | lib/mix/utils.ex | chaince/ex_admin | dee0b0fcf6c1c95d71290a8375a75b7da35c7c25 | [
"MIT"
] | null | null | null | lib/mix/utils.ex | chaince/ex_admin | dee0b0fcf6c1c95d71290a8375a75b7da35c7c25 | [
"MIT"
] | null | null | null | lib/mix/utils.ex | chaince/ex_admin | dee0b0fcf6c1c95d71290a8375a75b7da35c7c25 | [
"MIT"
] | 2 | 2018-07-12T07:44:50.000Z | 2018-07-19T11:45:09.000Z | defmodule Mix.ExAdmin.Utils do
def get_package_path do
__ENV__.file
|> Path.dirname
|> String.split("/lib/mix")
|> hd
end
def get_module do
Mix.Project.get
|> Module.split
|> Enum.reverse
|> Enum.at(1)
end
def web_path() do
path1 = Path.join ["lib", to_string(Mix.Phoenix.otp_app()) <> "_web"]
path2 = "web"
cond do
File.exists? path1 -> path1
File.exists? path2 -> path2
true ->
raise "Could not find web path '#{path1}'."
end
end
@doc "Print a status message to the console"
def status_msg(status, message),
do: IO.puts "#{IO.ANSI.green}* #{status}#{IO.ANSI.reset} #{message}"
  @doc "Print a notice message to the console"
  def notice_msg(status, message),
    do: IO.puts "#{IO.ANSI.yellow}* #{status}#{IO.ANSI.reset} #{message}"
@doc "Print an informational message without color"
def debug(message), do: IO.puts "==> #{message}"
@doc "Print an informational message in green"
def info(message), do: IO.puts "==> #{IO.ANSI.green}#{message}#{IO.ANSI.reset}"
@doc "Print a warning message in yellow"
def warn(message), do: IO.puts "==> #{IO.ANSI.yellow}#{message}#{IO.ANSI.reset}"
@doc "Print a notice in yellow"
def notice(message), do: IO.puts "#{IO.ANSI.yellow}#{message}#{IO.ANSI.reset}"
@doc "Print an error message in red"
def error(message), do: IO.puts "==> #{IO.ANSI.red}#{message}#{IO.ANSI.reset}"
end
| 30.23913 | 83 | 0.629763 |
1c1a11bbb41270e985578dca6e7f5576184fc6b8 | 22,472 | ex | Elixir | lib/elixir/lib/kernel/parallel_compiler.ex | matiasgarciaisaia/elixir | d0a3fdbfd774e0a6972513dcb82c2683400e67a0 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/kernel/parallel_compiler.ex | matiasgarciaisaia/elixir | d0a3fdbfd774e0a6972513dcb82c2683400e67a0 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/kernel/parallel_compiler.ex | matiasgarciaisaia/elixir | d0a3fdbfd774e0a6972513dcb82c2683400e67a0 | [
"Apache-2.0"
] | null | null | null | defmodule Kernel.ParallelCompiler do
@moduledoc """
A module responsible for compiling and requiring files in parallel.
"""
@doc """
Starts a task for parallel compilation.
If you have a file that needs to compile other modules in parallel,
the spawned processes need to be aware of the compiler environment.
This function allows a developer to create a task that is aware of
those environments.
  See `Task.async/1` for more information. The task spawned must always be
  awaited on by calling `Task.await/1`.
"""
@doc since: "1.6.0"
def async(fun) when is_function(fun) do
if parent = :erlang.get(:elixir_compiler_pid) do
file = :erlang.get(:elixir_compiler_file)
dest = :erlang.get(:elixir_compiler_dest)
{:error_handler, error_handler} = :erlang.process_info(self(), :error_handler)
Task.async(fn ->
send(parent, {:async, self()})
:erlang.put(:elixir_compiler_pid, parent)
:erlang.put(:elixir_compiler_file, file)
dest != :undefined and :erlang.put(:elixir_compiler_dest, dest)
:erlang.process_flag(:error_handler, error_handler)
fun.()
end)
else
raise ArgumentError,
"cannot spawn parallel compiler task because " <>
"the current file is not being compiled/required"
end
end
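A hedged sketch of using `async/1` from code that runs while a file is being compiled (`expensive_check/0` is a hypothetical function):

```elixir
# Only valid inside a file currently being compiled/required; otherwise
# async/1 raises ArgumentError, as documented above.
task = Kernel.ParallelCompiler.async(fn -> expensive_check() end)
result = Task.await(task)
```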
@doc """
Compiles the given files.
Those files are compiled in parallel and can automatically
detect dependencies between them. Once a dependency is found,
the current file stops being compiled until the dependency is
resolved.
It returns `{:ok, modules, warnings}` or `{:error, errors, warnings}`.
Both errors and warnings are a list of three-element tuples containing
the file, line and the formatted error/warning.
## Options
* `:each_file` - for each file compiled, invokes the callback passing the
file
* `:each_long_compilation` - for each file that takes more than a given
timeout (see the `:long_compilation_threshold` option) to compile, invoke
this callback passing the file as its argument
* `:each_module` - for each module compiled, invokes the callback passing
the file, module and the module bytecode
* `:each_cycle` - after the given files are compiled, invokes this function
that should return the following values:
* `{:compile, modules}` - to continue compilation with a list of further modules to compile
* `{:runtime, modules}` - to stop compilation and verify the list of modules because
dependent modules have changed
* `:long_compilation_threshold` - the timeout (in seconds) after the
`:each_long_compilation` callback is invoked; defaults to `15`
* `:profile` - if set to `:time` measure the compilation time of each compilation cycle
and group pass checker
* `:dest` - the destination directory for the BEAM files. When using `compile/2`,
this information is only used to properly annotate the BEAM files before
they are loaded into memory. If you want a file to actually be written to
`dest`, use `compile_to_path/3` instead.
* `:beam_timestamp` - the modification timestamp to give all BEAM files
"""
@doc since: "1.6.0"
def compile(files, options \\ []) when is_list(options) do
spawn_workers(files, :compile, options)
end
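A hedged invocation sketch for `compile/2` (file paths and callbacks are illustrative):

```elixir
{:ok, modules, warnings} =
  Kernel.ParallelCompiler.compile(
    ["lib/a.ex", "lib/b.ex"],
    each_file: fn file -> IO.puts("compiled #{file}") end,
    each_module: fn _file, module, _binary -> IO.inspect(module) end,
    long_compilation_threshold: 30
  )
```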
@doc since: "1.6.0"
def compile_to_path(files, path, options \\ []) when is_binary(path) and is_list(options) do
spawn_workers(files, {:compile, path}, options)
end
@doc """
Requires the given files in parallel.
Opposite to compile, dependencies are not attempted to be
automatically solved between files.
It returns `{:ok, modules, warnings}` or `{:error, errors, warnings}`.
Both errors and warnings are a list of three-element tuples containing
the file, line and the formatted error/warning.
## Options
* `:each_file` - for each file compiled, invokes the callback passing the
file
* `:each_module` - for each module compiled, invokes the callback passing
the file, module and the module bytecode
"""
@doc since: "1.6.0"
def require(files, options \\ []) when is_list(options) do
spawn_workers(files, :require, options)
end
@doc false
@deprecated "Use Kernel.ParallelCompiler.compile/2 instead"
def files(files, options \\ []) when is_list(options) do
case spawn_workers(files, :compile, options) do
{:ok, modules, _} -> modules
{:error, _, _} -> exit({:shutdown, 1})
end
end
@doc false
@deprecated "Use Kernel.ParallelCompiler.compile_to_path/2 instead"
def files_to_path(files, path, options \\ []) when is_binary(path) and is_list(options) do
case spawn_workers(files, {:compile, path}, options) do
{:ok, modules, _} -> modules
{:error, _, _} -> exit({:shutdown, 1})
end
end
defp spawn_workers(files, output, options) do
{:module, _} = :code.ensure_loaded(Kernel.ErrorHandler)
compiler_pid = self()
:elixir_code_server.cast({:reset_warnings, compiler_pid})
schedulers = max(:erlang.system_info(:schedulers_online), 2)
result =
spawn_workers(files, 0, [], [], %{}, [], %{
dest: Keyword.get(options, :dest),
each_cycle: Keyword.get(options, :each_cycle, fn -> {:runtime, []} end),
each_file: Keyword.get(options, :each_file, fn _, _ -> :ok end) |> each_file(),
each_long_compilation: Keyword.get(options, :each_long_compilation, fn _file -> :ok end),
each_module: Keyword.get(options, :each_module, fn _file, _module, _binary -> :ok end),
beam_timestamp: Keyword.get(options, :beam_timestamp),
long_compilation_threshold: Keyword.get(options, :long_compilation_threshold, 15),
profile: Keyword.get(options, :profile),
cycle_start: System.monotonic_time(),
module_counter: 0,
output: output,
schedulers: schedulers
})
    # In case --warnings-as-errors is enabled and there was a warning,
    # compilation status will be set to error.
compilation_status = :elixir_code_server.call({:compilation_status, compiler_pid})
case {result, compilation_status} do
{{:ok, _, warnings}, :error} ->
message = "Compilation failed due to warnings while using the --warnings-as-errors option"
IO.puts(:stderr, message)
{:error, warnings, []}
{{:error, errors, warnings}, :error} ->
{:error, errors ++ warnings, []}
_ ->
result
end
end
defp each_file(fun) when is_function(fun, 1), do: fn file, _ -> fun.(file) end
defp each_file(fun) when is_function(fun, 2), do: fun
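The arity normalization in the two clauses above, shown standalone:

```elixir
# One-argument callbacks are wrapped so callers can always pass two
# arguments; two-argument callbacks pass through untouched.
normalize = fn
  fun when is_function(fun, 1) -> fn file, _lexical -> fun.(file) end
  fun when is_function(fun, 2) -> fun
end

callback = normalize.(fn file -> "saw " <> file end)
callback.("a.ex", :lexical_tracker_pid)
# => "saw a.ex"
```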
defp each_file(file, lexical, parent) do
ref = Process.monitor(parent)
send(parent, {:file_ok, self(), ref, file, lexical})
receive do
^ref -> :ok
{:DOWN, ^ref, _, _, _} -> :ok
end
end
# We already have n=schedulers currently running, don't spawn new ones
defp spawn_workers(
queue,
spawned,
waiting,
files,
result,
warnings,
%{schedulers: schedulers} = state
)
when spawned - length(waiting) >= schedulers do
wait_for_messages(queue, spawned, waiting, files, result, warnings, state)
end
# Release waiting processes
defp spawn_workers([{ref, found} | t], spawned, waiting, files, result, warnings, state) do
waiting =
case List.keytake(waiting, ref, 2) do
{{_kind, pid, ^ref, _on, _defining, _deadlock}, waiting} ->
send(pid, {ref, found})
waiting
nil ->
# In case the waiting process died (for example, it was an async process),
# it will no longer be on the list. So we need to take it into account here.
waiting
end
spawn_workers(t, spawned, waiting, files, result, warnings, state)
end
defp spawn_workers([file | queue], spawned, waiting, files, result, warnings, state) do
%{output: output, long_compilation_threshold: threshold, dest: dest} = state
parent = self()
file = Path.expand(file)
{pid, ref} =
:erlang.spawn_monitor(fn ->
:erlang.put(:elixir_compiler_pid, parent)
:erlang.put(:elixir_compiler_file, file)
try do
case output do
{:compile, path} -> compile_file(file, path, parent)
:compile -> compile_file(file, dest, parent)
:require -> require_file(file, parent)
end
catch
kind, reason ->
send(parent, {:file_error, self(), file, {kind, reason, __STACKTRACE__}})
end
exit(:shutdown)
end)
timer_ref = Process.send_after(self(), {:timed_out, pid}, threshold * 1000)
files = [{pid, ref, file, timer_ref} | files]
spawn_workers(queue, spawned + 1, waiting, files, result, warnings, state)
end
# No more queue, nothing waiting, this cycle is done
defp spawn_workers([], 0, [], [], result, warnings, state) do
state = cycle_timing(result, state)
case each_cycle_return(state.each_cycle.()) do
{:runtime, dependent_modules} ->
write_and_verify_modules(result, warnings, dependent_modules, state)
{:compile, []} ->
write_and_verify_modules(result, warnings, [], state)
{:compile, more} ->
spawn_workers(more, 0, [], [], result, warnings, state)
end
end
# files x, waiting for x: POSSIBLE ERROR! Release processes so we get the failures
# Single entry, just release it.
defp spawn_workers(
[],
1,
[{_, pid, ref, _, _, _}] = waiting,
[{pid, _, _, _}] = files,
result,
warnings,
state
) do
spawn_workers([{ref, :not_found}], 1, waiting, files, result, warnings, state)
end
# Multiple entries, try to release modules.
defp spawn_workers([], spawned, waiting, files, result, warnings, state)
when length(waiting) == spawned do
# There is potentially a deadlock. We will release modules with
# the following order:
#
# 1. Code.ensure_compiled?/1 checks (deadlock = soft)
# 2. Struct checks (deadlock = hard)
# 3. Modules without a known definition
# 4. Code invocation (deadlock = raise)
#
# In theory there is no difference between hard and raise, the
# difference is where the raise is happening, inside the compiler
# or in the caller.
cond do
deadlocked = deadlocked(waiting, :soft) || deadlocked(waiting, :hard) ->
spawn_workers(deadlocked, spawned, waiting, files, result, warnings, state)
without_definition = without_definition(waiting, files) ->
spawn_workers(without_definition, spawned, waiting, files, result, warnings, state)
true ->
errors = handle_deadlock(waiting, files)
{:error, errors, warnings}
end
end
# No more queue, but spawned and length(waiting) do not match
defp spawn_workers([], spawned, waiting, files, result, warnings, state) do
wait_for_messages([], spawned, waiting, files, result, warnings, state)
end
defp compile_file(file, path, parent) do
:erlang.process_flag(:error_handler, Kernel.ErrorHandler)
:erlang.put(:elixir_compiler_dest, path)
:elixir_compiler.file(file, &each_file(&1, &2, parent))
end
defp require_file(file, parent) do
case :elixir_code_server.call({:acquire, file}) do
:required ->
send(parent, {:file_cancel, self()})
:proceed ->
:elixir_compiler.file(file, &each_file(&1, &2, parent))
:elixir_code_server.cast({:required, file})
end
end
defp cycle_timing(result, %{profile: :time} = state) do
%{cycle_start: cycle_start, module_counter: module_counter} = state
num_modules = count_modules(result)
diff_modules = num_modules - module_counter
now = System.monotonic_time()
time = System.convert_time_unit(now - cycle_start, :native, :millisecond)
IO.puts(
:stderr,
"[profile] Finished compilation cycle of #{diff_modules} modules in #{time}ms"
)
%{state | cycle_start: now, module_counter: num_modules}
end
defp cycle_timing(_result, %{profile: nil} = state) do
state
end
defp count_modules(result) do
Enum.count(result, &match?({{:module, _}, _}, &1))
end
# TODO: Deprecate on v1.12
defp each_cycle_return(modules) when is_list(modules), do: {:compile, modules}
defp each_cycle_return(other), do: other
defp write_and_verify_modules(result, warnings, dependent_modules, state) do
%{output: output, beam_timestamp: beam_timestamp} = state
{binaries, checker_warnings} = maybe_check_modules(result, dependent_modules, state)
write_module_binaries(output, beam_timestamp, binaries)
warnings = Enum.reverse(warnings, checker_warnings)
{:ok, Enum.map(binaries, &elem(&1, 0)), warnings}
end
defp write_module_binaries({:compile, path}, timestamp, result) do
Enum.each(result, fn {module, binary} ->
full_path = Path.join(path, Atom.to_string(module) <> ".beam")
File.write!(full_path, binary)
if timestamp, do: File.touch!(full_path, timestamp)
end)
end
defp write_module_binaries(_output, _timestamp, _result) do
:ok
end
defp maybe_check_modules(result, runtime_modules, state) do
%{schedulers: schedulers, profile: profile} = state
if :elixir_config.get(:bootstrap) do
binaries = for {{:module, module}, {binary, _map}} <- result, do: {module, binary}
{binaries, []}
else
compiled_modules = checker_compiled_modules(result)
runtime_modules = checker_runtime_modules(runtime_modules)
profile_checker(profile, compiled_modules, runtime_modules, fn ->
Module.ParallelChecker.verify(compiled_modules, runtime_modules, schedulers)
end)
end
end
defp checker_compiled_modules(result) do
for {{:module, _module}, {binary, module_map}} <- result do
{module_map, binary}
end
end
defp checker_runtime_modules(modules) do
for module <- modules,
path = :code.which(module),
is_list(path) do
{module, File.read!(path)}
end
end
defp profile_checker(_profile = :time, compiled_modules, runtime_modules, fun) do
{time, result} = :timer.tc(fun)
time = div(time, 1000)
num_modules = length(compiled_modules) + length(runtime_modules)
IO.puts(:stderr, "[profile] Finished group pass check of #{num_modules} modules in #{time}ms")
result
end
defp profile_checker(_profile = nil, _compiled_modules, _runtime_modules, fun) do
fun.()
end
# The goal of this function is to find leaves in the dependency graph,
# i.e. to find code that depends on code that we know is not being defined.
defp without_definition(waiting, files) do
nillify_empty(
for {pid, _, _, _} <- files,
{_, ^pid, ref, on, _, _} = List.keyfind(waiting, pid, 1),
not Enum.any?(waiting, fn {_, _, _, _, defining, _} -> on in defining end),
do: {ref, :not_found}
)
end
defp deadlocked(waiting, type) do
nillify_empty(for {_, _, ref, _, _, ^type} <- waiting, do: {ref, :not_found})
end
defp nillify_empty([]), do: nil
defp nillify_empty([_ | _] = list), do: list
# Wait for messages from child processes
defp wait_for_messages(queue, spawned, waiting, files, result, warnings, state) do
%{output: output} = state
receive do
{:async, process} ->
Process.monitor(process)
wait_for_messages(queue, spawned + 1, waiting, files, result, warnings, state)
{:available, kind, module} ->
available =
for {^kind, _, ref, ^module, _defining, _deadlock} <- waiting,
do: {ref, :found}
result = Map.put(result, {kind, module}, true)
spawn_workers(available ++ queue, spawned, waiting, files, result, warnings, state)
{:module_available, child, ref, file, module, binary, module_map} ->
state.each_module.(file, module, binary)
# Release the module loader which is waiting for an ack
send(child, {ref, :ack})
available =
for {:module, _, ref, ^module, _defining, _deadlock} <- waiting,
do: {ref, :found}
cancel_waiting_timer(files, child)
result = Map.put(result, {:module, module}, {binary, module_map})
spawn_workers(available ++ queue, spawned, waiting, files, result, warnings, state)
# If we are simply requiring files, we do not add to waiting.
{:waiting, _kind, child, ref, _on, _defining, _deadlock} when output == :require ->
send(child, {ref, :not_found})
spawn_workers(queue, spawned, waiting, files, result, warnings, state)
{:waiting, kind, child, ref, on, defining, deadlock?} ->
# If we already got what we were waiting for, do not put it on waiting.
# Alternatively, we're waiting on ourselves,
# send :found so that we can crash with a better error.
waiting =
if Map.has_key?(result, {kind, on}) or on in defining do
send(child, {ref, :found})
waiting
else
[{kind, child, ref, on, defining, deadlock?} | waiting]
end
spawn_workers(queue, spawned, waiting, files, result, warnings, state)
{:timed_out, child} ->
case List.keyfind(files, child, 0) do
{^child, _, file, _} -> state.each_long_compilation.(file)
_ -> :ok
end
spawn_workers(queue, spawned, waiting, files, result, warnings, state)
{:warning, file, line, message} ->
file = file && Path.absname(file)
message = :unicode.characters_to_binary(message)
warning = {file, line, message}
wait_for_messages(queue, spawned, waiting, files, result, [warning | warnings], state)
{:file_ok, child_pid, ref, file, lexical} ->
state.each_file.(file, lexical)
send(child_pid, ref)
cancel_waiting_timer(files, child_pid)
discard_down(child_pid)
new_files = List.keydelete(files, child_pid, 0)
# Sometimes we may have spurious entries in the waiting list
# because someone invoked try/rescue UndefinedFunctionError
new_waiting = List.keydelete(waiting, child_pid, 1)
spawn_workers(queue, spawned - 1, new_waiting, new_files, result, warnings, state)
{:file_cancel, child_pid} ->
cancel_waiting_timer(files, child_pid)
discard_down(child_pid)
new_files = List.keydelete(files, child_pid, 0)
spawn_workers(queue, spawned - 1, waiting, new_files, result, warnings, state)
{:file_error, child_pid, file, {kind, reason, stack}} ->
print_error(file, kind, reason, stack)
cancel_waiting_timer(files, child_pid)
discard_down(child_pid)
files |> List.keydelete(child_pid, 0) |> terminate()
{:error, [to_error(file, kind, reason, stack)], warnings}
{:DOWN, ref, :process, pid, reason} ->
waiting = List.keydelete(waiting, pid, 1)
case handle_down(files, ref, reason) do
:ok -> wait_for_messages(queue, spawned - 1, waiting, files, result, warnings, state)
{:error, errors} -> {:error, errors, warnings}
end
end
end
defp discard_down(pid) do
receive do
{:DOWN, _, :process, ^pid, _} -> :ok
end
end
defp handle_down(_files, _ref, :normal) do
:ok
end
defp handle_down(files, ref, reason) do
case List.keyfind(files, ref, 1) do
{child_pid, ^ref, file, _timer_ref} ->
print_error(file, :exit, reason, [])
files
|> List.keydelete(child_pid, 0)
|> terminate()
{:error, [to_error(file, :exit, reason, [])]}
_ ->
:ok
end
end
defp handle_deadlock(waiting, files) do
deadlock =
for {pid, _, file, _} <- files do
{:current_stacktrace, stacktrace} = Process.info(pid, :current_stacktrace)
Process.exit(pid, :kill)
{kind, ^pid, _, on, _, _} = List.keyfind(waiting, pid, 1)
description = "deadlocked waiting on #{kind} #{inspect(on)}"
error = CompileError.exception(description: description, file: nil, line: nil)
print_error(file, :error, error, stacktrace)
{Path.relative_to_cwd(file), on, description}
end
IO.puts("""
Compilation failed because of a deadlock between files.
The following files depended on the following modules:
""")
max =
deadlock
|> Enum.map(&(&1 |> elem(0) |> String.length()))
|> Enum.max()
for {file, mod, _} <- deadlock do
IO.puts([" ", String.pad_leading(file, max), " => " | inspect(mod)])
end
IO.puts(
"\nEnsure there are no compile-time dependencies between those files " <>
"and that the modules they reference exist and are correctly named\n"
)
for {file, _, description} <- deadlock, do: {Path.absname(file), nil, description}
end
defp terminate(files) do
for {pid, _, _, _} <- files, do: Process.exit(pid, :kill)
for {pid, _, _, _} <- files, do: discard_down(pid)
:ok
end
defp print_error(file, kind, reason, stack) do
IO.write([
"\n== Compilation error in file #{Path.relative_to_cwd(file)} ==\n",
Kernel.CLI.format_error(kind, reason, stack)
])
end
defp cancel_waiting_timer(files, child_pid) do
case List.keyfind(files, child_pid, 0) do
{^child_pid, _ref, _file, timer_ref} ->
Process.cancel_timer(timer_ref)
# Let's flush the message in case it arrived before we canceled the timeout.
receive do
{:timed_out, ^child_pid} -> :ok
after
0 -> :ok
end
nil ->
:ok
end
end
defp to_error(file, kind, reason, stack) do
line = get_line(file, reason, stack)
file = Path.absname(file)
message = :unicode.characters_to_binary(Kernel.CLI.format_error(kind, reason, stack))
{file, line, message}
end
defp get_line(_file, %{line: line}, _stack) when is_integer(line) and line > 0 do
line
end
defp get_line(file, :undef, [{_, _, _, []}, {_, _, _, info} | _]) do
if Keyword.get(info, :file) == to_charlist(Path.relative_to_cwd(file)) do
Keyword.get(info, :line)
end
end
defp get_line(file, _reason, [{_, _, _, info} | _]) do
if Keyword.get(info, :file) == to_charlist(Path.relative_to_cwd(file)) do
Keyword.get(info, :line)
end
end
defp get_line(_, _, _) do
nil
end
end
| 34.256098 | 98 | 0.646938 |
1c1a2720297c2ecd8d1897e9ae5418ba75a9be2e | 1,094 | exs | Elixir | test/furlex/oembed_test.exs | fanhero/furlex | 3b77bb7b19f3cee5b2e03d37a997dcec1225d47a | [
"Apache-2.0"
] | 1 | 2019-01-23T13:39:18.000Z | 2019-01-23T13:39:18.000Z | test/furlex/oembed_test.exs | alexcastano/furlex | c8e4e474aa2494285fdc19b133d3ad1ea348add9 | [
"Apache-2.0"
] | null | null | null | test/furlex/oembed_test.exs | alexcastano/furlex | c8e4e474aa2494285fdc19b133d3ad1ea348add9 | [
"Apache-2.0"
] | 1 | 2019-01-11T09:52:03.000Z | 2019-01-11T09:52:03.000Z | defmodule Furlex.OembedTest do
use ExUnit.Case
alias Furlex.Oembed
setup do
bypass = Bypass.open()
url = "http://localhost:#{bypass.port}"
config = Application.get_env :furlex, Oembed
new_config = Keyword.put config, :oembed_host, url
Application.put_env :furlex, Oembed, new_config
on_exit fn ->
Application.put_env :furlex, Oembed, config
:ok
end
{:ok, bypass: bypass}
end
test "returns endpoint from url", %{bypass: bypass} do
Bypass.expect bypass, &handle/1
assert {:error, :no_oembed_provider} ==
Oembed.endpoint_from_url("foobar")
url = "https://vimeo.com/88856141"
params = %{"format" => "json"}
{:ok, endpoint} = Oembed.endpoint_from_url(url, params, [skip_cache?: true])
assert endpoint == "https://vimeo.com/api/oembed.json"
end
def handle(%{request_path: "/providers.json"} = conn) do
assert conn.method == "GET"
providers =
[__DIR__ | ~w(.. fixtures providers.json)]
|> Path.join()
|> File.read!()
Plug.Conn.resp conn, 200, providers
end
end
| 22.791667 | 80 | 0.638026 |
1c1a596940b37e3a4bb6163708c18fbb9178cb5c | 122 | exs | Elixir | test/coin_ticker_test.exs | StGerman/CoinTickers | 9ff0128b946b7f11642911c0dc7aea6846f13e62 | [
"MIT"
] | null | null | null | test/coin_ticker_test.exs | StGerman/CoinTickers | 9ff0128b946b7f11642911c0dc7aea6846f13e62 | [
"MIT"
] | 3 | 2017-08-28T21:20:42.000Z | 2017-08-28T21:22:06.000Z | test/coin_ticker_test.exs | StGerman/CoinTickers | 9ff0128b946b7f11642911c0dc7aea6846f13e62 | [
"MIT"
] | null | null | null | defmodule CoinTickerTest do
use ExUnit.Case
doctest CoinTicker
test "the truth" do
assert 1 + 1 == 2
end
end
| 13.555556 | 27 | 0.688525 |
1c1a90a958ec4901a13ec8c2779ed7dd37d4a0bf | 452 | ex | Elixir | lib/ory/hydra/helpers/body.ex | sitch/ory-hydra-elixir | 529b5d120e0e857f9fadecf0e05f023f56394826 | [
"MIT"
] | null | null | null | lib/ory/hydra/helpers/body.ex | sitch/ory-hydra-elixir | 529b5d120e0e857f9fadecf0e05f023f56394826 | [
"MIT"
] | null | null | null | lib/ory/hydra/helpers/body.ex | sitch/ory-hydra-elixir | 529b5d120e0e857f9fadecf0e05f023f56394826 | [
"MIT"
] | null | null | null | defmodule ORY.Hydra.Helpers.Body do
@spec encode!(ORY.Hydra.Operation.t(), ORY.Hydra.Config.t()) :: String.t() | no_return
def encode!(%{method: :get}, _config) do
""
end
def encode!(operation, config) do
operation.params
|> Map.drop(operation.params_in_query)
|> config.json_codec.encode!()
end
  def form_encode!(operation, _config) do
    # Encode the non-query params as an application/x-www-form-urlencoded body.
    operation.params
    |> Map.drop(operation.params_in_query)
    |> URI.encode_query()
  end
end
| 22.6 | 88 | 0.659292 |
1c1ae433e3ac4c51a9cf95f0ad2f0acc138653dc | 2,393 | ex | Elixir | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_intent_message_browse_carousel_card.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_intent_message_browse_carousel_card.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_intent_message_browse_carousel_card.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCard do
@moduledoc """
Browse Carousel Card for Actions on Google. https://developers.google.com/actions/assistant/responses#browsing_carousel
## Attributes
* `imageDisplayOptions` (*type:* `String.t`, *default:* `nil`) - Optional. Settings for displaying the image. Applies to every image in items.
* `items` (*type:* `list(GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCardBrowseCarouselCardItem.t)`, *default:* `nil`) - Required. List of items in the Browse Carousel Card. Minimum of two items, maximum of ten.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:imageDisplayOptions => String.t() | nil,
:items =>
list(
GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCardBrowseCarouselCardItem.t()
)
| nil
}
field(:imageDisplayOptions)
field(:items,
as:
GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCardBrowseCarouselCardItem,
type: :list
)
end
defimpl Poison.Decoder,
for: GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCard do
def decode(value, options) do
GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCard.decode(
value,
options
)
end
end
defimpl Poison.Encoder,
for: GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1IntentMessageBrowseCarouselCard do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 37.390625 | 256 | 0.754283 |
1c1ae515913ad1f5d748c361350758c0a44e5f49 | 1,133 | ex | Elixir | lib/chatbot_web/channels/user_socket.ex | mikehelmick/meme-bot | 52a84cfb3f5ddcdddadf59b0ba3976f9e3f23800 | [
"Apache-2.0"
] | 7 | 2019-04-05T06:12:56.000Z | 2021-04-03T11:39:40.000Z | lib/chatbot_web/channels/user_socket.ex | mikehelmick/meme-bot | 52a84cfb3f5ddcdddadf59b0ba3976f9e3f23800 | [
"Apache-2.0"
] | null | null | null | lib/chatbot_web/channels/user_socket.ex | mikehelmick/meme-bot | 52a84cfb3f5ddcdddadf59b0ba3976f9e3f23800 | [
"Apache-2.0"
] | 3 | 2019-04-20T13:05:48.000Z | 2019-06-05T16:52:46.000Z | defmodule ChatbotWeb.UserSocket do
use Phoenix.Socket
## Channels
# channel "room:*", ChatbotWeb.RoomChannel
## Transports
transport :longpoll, Phoenix.Transports.LongPoll
# Socket params are passed from the client and can
# be used to verify and authenticate a user. After
# verification, you can put default assigns into
# the socket that will be set for all channels, ie
#
# {:ok, assign(socket, :user_id, verified_user_id)}
#
# To deny connection, return `:error`.
#
# See `Phoenix.Token` documentation for examples in
# performing token verification on connect.
def connect(_params, socket, _connect_info) do
{:ok, socket}
end
# Socket id's are topics that allow you to identify all sockets for a given user:
#
# def id(socket), do: "user_socket:#{socket.assigns.user_id}"
#
# Would allow you to broadcast a "disconnect" event and terminate
# all active sockets and channels for a given user:
#
# ChatbotWeb.Endpoint.broadcast("user_socket:#{user.id}", "disconnect", %{})
#
# Returning `nil` makes this socket anonymous.
def id(_socket), do: nil
end
| 30.621622 | 83 | 0.700794 |
1c1afeae9d04228c953ecd86ab53e8532e8659b3 | 1,036 | ex | Elixir | test/support/factory.ex | paulswartz/arrow | c1ba1ce52107c0ed94ce9bca2fef2bfeb606b8f9 | [
"MIT"
] | null | null | null | test/support/factory.ex | paulswartz/arrow | c1ba1ce52107c0ed94ce9bca2fef2bfeb606b8f9 | [
"MIT"
] | null | null | null | test/support/factory.ex | paulswartz/arrow | c1ba1ce52107c0ed94ce9bca2fef2bfeb606b8f9 | [
"MIT"
] | null | null | null | defmodule Arrow.Factory do
use ExMachina.Ecto, repo: Arrow.Repo
def adjustment_factory do
%Arrow.Adjustment{
route_id: "Red",
source: "gtfs_creator",
source_label: sequence(:source_label, &"Adjustment-#{&1}")
}
end
def disruption_factory do
%Arrow.Disruption{
ready_revision: nil,
published_revision: nil
}
end
def disruption_revision_factory do
%Arrow.DisruptionRevision{
start_date: ~D[2020-01-01],
end_date: ~D[2020-01-15],
disruption: build(:disruption),
days_of_week: [build(:day_of_week)],
adjustments: [build(:adjustment)],
trip_short_names: [build(:trip_short_name)]
}
end
def day_of_week_factory do
%Arrow.Disruption.DayOfWeek{
day_name: "saturday",
start_time: nil,
end_time: nil
}
end
def exception_factory do
%Arrow.Disruption.Exception{excluded_date: ~D[2020-01-10]}
end
def trip_short_name_factory do
%Arrow.Disruption.TripShortName{trip_short_name: "1234"}
end
end
| 22.521739 | 64 | 0.671815 |
1c1b0b20ac96595c0cf7f9f89e1e560001f22220 | 1,794 | ex | Elixir | lib/elixir_script/passes/translate/forms/pattern/patterns.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | 1 | 2021-09-14T14:28:39.000Z | 2021-09-14T14:28:39.000Z | lib/elixir_script/passes/translate/forms/pattern/patterns.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | null | null | null | lib/elixir_script/passes/translate/forms/pattern/patterns.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | null | null | null | defmodule ElixirScript.Translate.Forms.Pattern.Patterns do
@moduledoc false
alias ESTree.Tools.Builder, as: J
alias ElixirScript.Translate.Helpers
@parameter J.member_expression(
Helpers.patterns(),
J.identifier(:variable)
)
@head_tail J.member_expression(
Helpers.patterns(),
J.identifier(:headTail)
)
@starts_with J.member_expression(
Helpers.patterns(),
J.identifier(:startsWith)
)
@capture J.member_expression(
Helpers.patterns(),
J.identifier(:capture)
)
@bound J.member_expression(
Helpers.patterns(),
J.identifier(:bound)
)
@_type J.member_expression(
Helpers.patterns(),
J.identifier(:type)
)
@bitstring_match J.member_expression(
Helpers.patterns(),
J.identifier(:bitStringMatch)
)
def parameter() do
Helpers.call(
@parameter,
[]
)
end
def parameter(name) do
Helpers.call(
@parameter,
[name]
)
end
def head_tail(headParameter, tailParameter) do
Helpers.call(
@head_tail,
[headParameter, tailParameter]
)
end
def starts_with(prefix) do
Helpers.call(
@starts_with,
[J.literal(prefix)]
)
end
def capture(value) do
Helpers.call(
@capture,
[value]
)
end
def bound(value) do
Helpers.call(
@bound,
[value]
)
end
def type(prototype, value) do
Helpers.call(
@_type,
[prototype, value]
)
end
def bitstring_match(values) do
Helpers.call(
@bitstring_match,
values
)
end
end
| 18.306122 | 58 | 0.548495 |
1c1b14b178f318efa43cbbaf300c062e392eb897 | 1,055 | ex | Elixir | lib/wavex/chunk/data.ex | basdirks/wavex | b465c374d4b8a1668187d6c056b1d299fe3a9ffe | [
"Apache-2.0"
] | null | null | null | lib/wavex/chunk/data.ex | basdirks/wavex | b465c374d4b8a1668187d6c056b1d299fe3a9ffe | [
"Apache-2.0"
] | null | null | null | lib/wavex/chunk/data.ex | basdirks/wavex | b465c374d4b8a1668187d6c056b1d299fe3a9ffe | [
"Apache-2.0"
] | null | null | null | defmodule Wavex.Chunk.Data do
@moduledoc """
A data chunk.
"""
alias Wavex.FourCC
@enforce_keys [
:size,
:data
]
defstruct [
:size,
:data
]
@type t :: %__MODULE__{size: non_neg_integer, data: binary}
@four_cc "data"
@doc """
The ID that identifies a data chunk.
"""
@spec four_cc :: FourCC.t()
def four_cc, do: @four_cc
@doc ~S"""
Read a data chunk.
"""
@spec read(binary) ::
{:ok, t, binary}
| {:error,
:unexpected_eof
| {:unexpected_four_cc, %{actual: FourCC.t(), expected: FourCC.t()}}}
def read(binary) do
with <<
# 0 - 3
data_id::binary-size(4),
# 4 - 7
size::32-little,
# 8 - ...
data::binary-size(size),
etc::binary
>> <- binary,
:ok <- FourCC.verify(data_id, @four_cc) do
{:ok, %__MODULE__{size: size, data: data}, etc}
else
binary when is_binary(binary) -> {:error, :unexpected_eof}
error -> error
end
end
end
| 19.537037 | 82 | 0.513744 |
1c1b2b54ea11b73cad66094c03d8620556e452d1 | 1,288 | ex | Elixir | lib/cache.ex | rockerBOO/rest_twitch | 20815fd135ff5a47a9e23ecd7a55dd00ac952722 | [
"MIT"
] | 3 | 2015-06-24T14:58:17.000Z | 2019-04-25T23:38:03.000Z | lib/cache.ex | rockerBOO/rest_twitch | 20815fd135ff5a47a9e23ecd7a55dd00ac952722 | [
"MIT"
] | 1 | 2018-01-09T04:18:10.000Z | 2018-01-09T04:18:10.000Z | lib/cache.ex | rockerBOO/rest_twitch | 20815fd135ff5a47a9e23ecd7a55dd00ac952722 | [
"MIT"
] | 2 | 2015-08-16T20:30:10.000Z | 2016-04-14T11:24:03.000Z | defmodule RestTwitch.Cache do
defmodule Options do
defstruct ttl: 86400
end
def start_link(client) do
GenServer.start_link(__MODULE__, [client], [name: :twitch_cache])
end
def init([client]) do
{:ok, %{client: client}}
end
def handle_call({:get, key}, _from, state) do
{:reply, state.client |> Exredis.query(["GET", key]), state}
end
def handle_call({:set, key, value}, _from, state) do
{:reply, state.client |> Exredis.query(["SET", key, value]), state}
end
def handle_call({:del, key}, _from, state) do
{:reply, state.client |> Exredis.query(["DEL", key]), state}
end
def handle_call({:incr, key}, _from, state) do
{:reply, state.client |> Exredis.query(["INCR", key]), state}
end
def handle_call({:expire, key, ttl}, _from, state) do
{:reply, state.client |> Exredis.query(["EXPIRE", key, ttl]), state}
end
def get(key) do
GenServer.call(:twitch_cache, {:get, key})
end
def set(key, value) do
GenServer.call(:twitch_cache, {:set, key, value})
end
def delete(key) do
GenServer.call(:twitch_cache, {:del, key})
end
def incr(key) do
GenServer.call(:twitch_cache, {:incr, key})
end
def expire(key, ttl \\ 86400) do
GenServer.call(:twitch_cache, {:expire, key, ttl})
end
end | 24.301887 | 72 | 0.640528 |
1c1b7b4b432bd1d88d743e1189ff542b8900d82c | 1,312 | ex | Elixir | clients/content/lib/google_api/content/v21/model/pause_buy_on_google_program_request.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/content/lib/google_api/content/v21/model/pause_buy_on_google_program_request.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/content/lib/google_api/content/v21/model/pause_buy_on_google_program_request.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Content.V21.Model.PauseBuyOnGoogleProgramRequest do
@moduledoc """
Request message for the PauseProgram method.
## Attributes
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{}
end
defimpl Poison.Decoder, for: GoogleApi.Content.V21.Model.PauseBuyOnGoogleProgramRequest do
def decode(value, options) do
GoogleApi.Content.V21.Model.PauseBuyOnGoogleProgramRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Content.V21.Model.PauseBuyOnGoogleProgramRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 31.238095 | 90 | 0.771341 |
1c1b9109067c58a3200c197792e5e43f6d4cad79 | 3,635 | ex | Elixir | lib/host.ex | xeejp/TheTragedyOfTheCommons | a674f24383a91125d25d9b6de55a4ae30750c305 | [
"MIT"
] | null | null | null | lib/host.ex | xeejp/TheTragedyOfTheCommons | a674f24383a91125d25d9b6de55a4ae30750c305 | [
"MIT"
] | 1 | 2016-12-27T02:06:06.000Z | 2016-12-27T02:06:06.000Z | lib/host.ex | xeejp/TheTragedyOfTheCommons | a674f24383a91125d25d9b6de55a4ae30750c305 | [
"MIT"
] | null | null | null | defmodule TragedyOfTheCommons.Host do
alias TragedyOfTheCommons.Main
defp ensure_integer(integer) when is_integer(integer), do: integer
defp ensure_integer(str), do: Integer.parse(str) |> elem(0)
def update_config(data, config) do
    data
    |> Map.put(:capacity, ensure_integer(config["capacity"]))
    |> Map.put(:cost, ensure_integer(config["cost"]))
    |> Map.put(:group_size, ensure_integer(config["groupSize"]))
    |> Map.put(:max_grazing_num, ensure_integer(config["maxGrazingNum"]))
    |> Map.put(:max_round, ensure_integer(config["maxRound"]))
    |> Map.put(:ask_student_id, config["askStudentId"])
end
def update_description(data, description) do
    Map.put(data, :description, description)
end
def change_page(data, page) do
data = if data.page == "waiting" && page == "description" do
Map.update!(data, :results, fn _ -> %{ groups: %{}, participants: %{} } end)
|> Map.update!(:profits_data, fn _ -> [] end)
|> Map.update!(:history, fn _ -> [] end)
|> match()
else
data
end
data = Map.update!(data, :page, fn _ -> page end)
case page do
"waiting" -> Map.update!(data, :joinable, fn _ -> true end)
|> Map.update!(:active_participants_number, fn _ -> data.participants_number end)
_ -> data
end
end
def visit(data) do
Map.put(data, :is_first_visit, false)
end
def match(data) do
%{participants: participants, group_size: group_size} = data
groups_number = round(Float.ceil(Map.size(participants)/group_size))
groups = participants
|> Enum.map(&elem(&1, 0)) # [id...]
|> Enum.shuffle
|> Enum.map_reduce(0, fn(p, acc) -> {{acc, p}, acc + 1} end) |> elem(0) # [{0, p0}, ..., {n-1, pn-1}]
    |> Enum.group_by(fn {i, _p} -> Integer.to_string(div(i, group_size)) end, fn {_i, p} -> p end) # %{"0" => [p0, ...], ..., "l-1" => [...]}
updater = fn participant, group ->
%{ participant |
group: group,
answered: false,
confirmed: false,
is_finish_description: false,
profits: [],
grazings: [],
answers: 0,
confirms: 0,
status: "experiment"
}
end
reducer = fn {group, ids}, {participants, groups} ->
participants = Enum.reduce(ids, participants, fn id, participants ->
Map.update!(participants, id, &updater.(&1, group))
end)
groups = Map.put(groups, group, Main.new_group(ids))
{participants, groups}
end
acc = {participants, %{}}
{participants, groups} = Enum.reduce(groups, acc, reducer)
%{data | participants: participants, groups: groups, groups_number: groups_number, active_participants_number: data.participants_number, joinable: false, results: %{participants: %{}, groups: %{}}
}
end
  def get_filter(_data) do
    %{
_default: true,
is_first_visit: "isFirstVisit",
participants_number: "participantsNumber",
active_participants_number: "activeParticipantsNumber",
finish_description_number: "finishDescriptionNumber",
groups_number: "groupsNumber",
group_size: "groupSize",
max_round: "maxRound",
max_grazing_num: "maxGrazingNum",
ask_student_id: "askStudentId",
profits_data: "profits_data",
groups: %{
_default: %{
_default: true,
group_status: "groupStatus",
group_profits: "groupProfits",
}
}
}
end
def filter_data(data) do
Transmap.transform(data, get_filter(data), diff: false)
end
end
| 34.619048 | 200 | 0.607428 |
1c1ba706c8bea71ec2dab821ed6b575f80026654 | 14,270 | ex | Elixir | lib/mint/http2/frame.ex | moogle19/mint | 80ecfad07dfcefd70d3c729624f7ae3c147ae052 | [
"Apache-2.0"
] | null | null | null | lib/mint/http2/frame.ex | moogle19/mint | 80ecfad07dfcefd70d3c729624f7ae3c147ae052 | [
"Apache-2.0"
] | null | null | null | lib/mint/http2/frame.ex | moogle19/mint | 80ecfad07dfcefd70d3c729624f7ae3c147ae052 | [
"Apache-2.0"
] | null | null | null | defmodule Mint.HTTP2.Frame do
@moduledoc false
use Bitwise, skip_operators: true
import Record
shared_stream = [:stream_id, {:flags, 0x00}]
shared_conn = [stream_id: 0, flags: 0x00]
defrecord :data, shared_stream ++ [:data, :padding]
defrecord :headers, shared_stream ++ [:exclusive?, :stream_dependency, :weight, :hbf, :padding]
defrecord :priority, shared_stream ++ [:exclusive?, :stream_dependency, :weight]
defrecord :rst_stream, shared_stream ++ [:error_code]
defrecord :settings, shared_conn ++ [:params]
defrecord :push_promise, shared_stream ++ [:promised_stream_id, :hbf, :padding]
defrecord :ping, shared_conn ++ [:opaque_data]
defrecord :goaway, shared_conn ++ [:last_stream_id, :error_code, :debug_data]
defrecord :window_update, shared_stream ++ [:window_size_increment]
defrecord :continuation, shared_stream ++ [:hbf]
defrecord :unknown, []
@types %{
data: 0x00,
headers: 0x01,
priority: 0x02,
rst_stream: 0x03,
settings: 0x04,
push_promise: 0x05,
ping: 0x06,
goaway: 0x07,
window_update: 0x08,
continuation: 0x09
}
## Flag handling
@flags %{
data: [end_stream: 0x01, padded: 0x08],
headers: [end_stream: 0x01, end_headers: 0x04, padded: 0x08, priority: 0x20],
settings: [ack: 0x01],
push_promise: [end_headers: 0x04, padded: 0x08],
ping: [ack: 0x01],
continuation: [end_headers: 0x04]
}
@spec set_flags(byte(), atom(), [flag_name :: atom()]) :: byte()
def set_flags(initial_flags \\ 0x00, frame_name, flags_to_set)
when is_integer(initial_flags) and is_list(flags_to_set) do
Enum.reduce(flags_to_set, initial_flags, &set_flag(&2, frame_name, &1))
end
@spec flag_set?(byte(), atom(), atom()) :: boolean()
def flag_set?(flags, frame, flag_name)
for {frame, flags} <- @flags,
{flag_name, flag_value} <- flags do
defp set_flag(flags, unquote(frame), unquote(flag_name)), do: bor(flags, unquote(flag_value))
defp set_flag(unquote(frame), unquote(flag_name)), do: unquote(flag_value)
def flag_set?(flags, unquote(frame), unquote(flag_name)),
do: band(flags, unquote(flag_value)) == unquote(flag_value)
end
defmacrop is_flag_set(flags, flag) do
quote do
band(unquote(flags), unquote(flag)) == unquote(flag)
end
end
## Parsing
@doc """
Decodes the next frame of the given binary.
Returns `{:ok, frame, rest}` if successful, `{:error, reason}` if not.
"""
@spec decode_next(binary()) :: {:ok, tuple(), binary()} | :more | {:error, reason}
when reason:
{:frame_size_error, atom()}
| {:protocol_error, binary()}
| :payload_too_big
def decode_next(bin, max_frame_size \\ 16_384) when is_binary(bin) do
case decode_next_raw(bin) do
{:ok, {_type, _flags, _stream_id, payload}, _rest}
when byte_size(payload) > max_frame_size ->
{:error, :payload_too_big}
{:ok, {type, flags, stream_id, payload}, rest} ->
{:ok, decode_contents(type, flags, stream_id, payload), rest}
:more ->
:more
end
catch
:throw, {:mint, reason} -> {:error, reason}
end
defp decode_next_raw(<<
length::24,
type,
flags,
_reserved::1,
stream_id::31,
payload::size(length)-binary,
rest::binary
>>) do
{:ok, {type, flags, stream_id, payload}, rest}
end
defp decode_next_raw(_other) do
:more
end
for {frame, type} <- @types do
function = :"decode_#{frame}"
defp decode_contents(unquote(type), flags, stream_id, payload) do
unquote(function)(flags, stream_id, payload)
end
end
defp decode_contents(_type, _flags, _stream_id, _payload) do
unknown()
end
# Parsing of specific frames
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.1
defp decode_data(flags, stream_id, payload) do
{data, padding} = decode_padding(:data, flags, payload)
data(stream_id: stream_id, flags: flags, data: data, padding: padding)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.2
defp decode_headers(flags, stream_id, payload) do
{data, padding} = decode_padding(:headers, flags, payload)
{exclusive?, stream_dependency, weight, data} =
if flag_set?(flags, :headers, :priority) do
<<exclusive::1, stream_dependency::31, weight::8, rest::binary>> = data
{exclusive == 1, stream_dependency, weight + 1, rest}
else
{nil, nil, nil, data}
end
headers(
stream_id: stream_id,
flags: flags,
padding: padding,
exclusive?: exclusive?,
stream_dependency: stream_dependency,
weight: weight,
hbf: data
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.3
defp decode_priority(_flags, _stream_id, payload) when byte_size(payload) != 5 do
throw({:mint, {:frame_size_error, :priority}})
end
defp decode_priority(flags, stream_id, payload) do
<<exclusive::1, stream_dependency::31, weight::8>> = payload
priority(
stream_id: stream_id,
flags: flags,
exclusive?: exclusive == 1,
stream_dependency: stream_dependency,
weight: weight + 1
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.4
defp decode_rst_stream(_flags, _stream_id, payload) when byte_size(payload) != 4 do
throw({:mint, {:frame_size_error, :rst_stream}})
end
defp decode_rst_stream(flags, stream_id, <<error_code::32>>) do
rst_stream(
stream_id: stream_id,
flags: flags,
error_code: humanize_error_code(error_code)
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.5
defp decode_settings(_flags, _stream_id, payload) when rem(byte_size(payload), 6) != 0 do
throw({:mint, {:frame_size_error, :settings}})
end
defp decode_settings(flags, stream_id, payload) do
settings(stream_id: stream_id, flags: flags, params: decode_settings_params(payload))
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.6
defp decode_push_promise(flags, stream_id, payload) do
{data, padding} = decode_padding(:push_promise, flags, payload)
<<_reserved::1, promised_stream_id::31, header_block_fragment::binary>> = data
push_promise(
stream_id: stream_id,
flags: flags,
promised_stream_id: promised_stream_id,
hbf: header_block_fragment,
padding: padding
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.7
defp decode_ping(_flags, _stream_id, payload) when byte_size(payload) != 8 do
throw({:mint, {:frame_size_error, :ping}})
end
defp decode_ping(flags, stream_id, payload) do
ping(stream_id: stream_id, flags: flags, opaque_data: payload)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.8
defp decode_goaway(flags, stream_id, payload) do
<<_reserved::1, last_stream_id::31, error_code::32, debug_data::binary>> = payload
goaway(
stream_id: stream_id,
flags: flags,
last_stream_id: last_stream_id,
error_code: humanize_error_code(error_code),
debug_data: debug_data
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.9
defp decode_window_update(_flags, _stream_id, payload) when byte_size(payload) != 4 do
throw({:mint, {:frame_size_error, :window_update}})
end
defp decode_window_update(_flags, _stream_id, <<_reserved::1, 0::31>>) do
throw({:mint, {:protocol_error, "bad WINDOW_SIZE increment"}})
end
defp decode_window_update(flags, stream_id, <<_reserved::1, window_size_increment::31>>) do
window_update(
stream_id: stream_id,
flags: flags,
window_size_increment: window_size_increment
)
end
# http://httpwg.org/specs/rfc7540.html#rfc.section.6.10
defp decode_continuation(flags, stream_id, payload) do
continuation(stream_id: stream_id, flags: flags, hbf: payload)
end
defp decode_padding(frame, flags, <<pad_length, rest::binary>> = payload)
when is_flag_set(flags, unquote(@flags[:data][:padded])) do
if pad_length >= byte_size(payload) do
debug_data =
"the padding length of a #{inspect(frame)} frame is bigger than the payload length"
throw({:mint, {:protocol_error, debug_data}})
else
# 1 byte is for the space taken by pad_length
data_length = byte_size(payload) - pad_length - 1
<<data::size(data_length)-binary, padding::size(pad_length)-binary>> = rest
{data, padding}
end
end
defp decode_padding(_frame, _flags, payload) do
{payload, nil}
end
defp decode_settings_params(payload) do
decode_settings_params(payload, _acc = [])
end
defp decode_settings_params(<<>>, acc) do
Enum.reverse(acc)
end
defp decode_settings_params(<<identifier::16, value::32, rest::binary>>, acc) do
# From http://httpwg.org/specs/rfc7540.html#SettingValues:
# An endpoint that receives a SETTINGS frame with any unknown or unsupported identifier MUST
# ignore that setting.
acc =
case identifier do
0x01 -> [{:header_table_size, value} | acc]
0x02 -> [{:enable_push, value == 1} | acc]
0x03 -> [{:max_concurrent_streams, value} | acc]
0x04 -> [{:initial_window_size, value} | acc]
0x05 -> [{:max_frame_size, value} | acc]
0x06 -> [{:max_header_list_size, value} | acc]
0x08 -> [{:enable_connect_protocol, value == 1} | acc]
_other -> acc
end
decode_settings_params(rest, acc)
end
## Encoding
@doc """
Encodes the given `frame`.
"""
@spec encode(tuple()) :: iodata()
def encode(frame)
def encode(data(stream_id: stream_id, flags: flags, data: data, padding: nil)) do
encode_raw(@types[:data], flags, stream_id, data)
end
def encode(data(stream_id: stream_id, flags: flags, data: data, padding: padding)) do
flags = set_flags(flags, :data, [:padded])
payload = [byte_size(padding), data, padding]
encode_raw(@types[:data], flags, stream_id, payload)
end
def encode(headers() = frame) do
headers(
flags: flags,
stream_id: stream_id,
exclusive?: exclusive?,
stream_dependency: stream_dependency,
weight: weight,
hbf: hbf,
padding: padding
) = frame
payload = hbf
{payload, flags} =
if stream_dependency && weight && is_boolean(exclusive?) do
{
[<<if(exclusive?, do: 1, else: 0)::1, stream_dependency::31>>, weight - 1, payload],
set_flags(flags, :headers, [:priority])
}
else
{payload, flags}
end
{payload, flags} =
if padding do
{[byte_size(padding), payload, padding], set_flags(flags, :headers, [:padded])}
else
{payload, flags}
end
encode_raw(@types[:headers], flags, stream_id, payload)
end
def encode(priority() = frame) do
priority(
stream_id: stream_id,
flags: flags,
exclusive?: exclusive?,
stream_dependency: stream_dependency,
weight: weight
) = frame
payload = [
<<if(exclusive?, do: 1, else: 0)::1, stream_dependency::31>>,
weight - 1
]
encode_raw(@types[:priority], flags, stream_id, payload)
end
def encode(rst_stream(stream_id: stream_id, flags: flags, error_code: error_code)) do
payload = <<dehumanize_error_code(error_code)::32>>
encode_raw(@types[:rst_stream], flags, stream_id, payload)
end
def encode(settings(stream_id: stream_id, flags: flags, params: params)) do
payload =
Enum.map(params, fn
{:header_table_size, value} -> <<0x01::16, value::32>>
{:enable_push, value} -> <<0x02::16, if(value, do: 1, else: 0)::32>>
{:max_concurrent_streams, value} -> <<0x03::16, value::32>>
{:initial_window_size, value} -> <<0x04::16, value::32>>
{:max_frame_size, value} -> <<0x05::16, value::32>>
{:max_header_list_size, value} -> <<0x06::16, value::32>>
{:enable_connect_protocol, value} -> <<0x08::16, if(value, do: 1, else: 0)::32>>
end)
encode_raw(@types[:settings], flags, stream_id, payload)
end
def encode(push_promise() = frame) do
push_promise(
stream_id: stream_id,
flags: flags,
promised_stream_id: promised_stream_id,
hbf: hbf,
padding: padding
) = frame
payload = [<<0::1, promised_stream_id::31>>, hbf]
{payload, flags} =
if padding do
{
[byte_size(padding), payload, padding],
set_flags(flags, :push_promise, [:padded])
}
else
{payload, flags}
end
encode_raw(@types[:push_promise], flags, stream_id, payload)
end
def encode(ping(stream_id: 0, flags: flags, opaque_data: opaque_data)) do
encode_raw(@types[:ping], flags, 0, opaque_data)
end
def encode(goaway() = frame) do
goaway(
stream_id: 0,
flags: flags,
last_stream_id: last_stream_id,
error_code: error_code,
debug_data: debug_data
) = frame
payload = [<<0::1, last_stream_id::31, dehumanize_error_code(error_code)::32>>, debug_data]
encode_raw(@types[:goaway], flags, 0, payload)
end
def encode(window_update(stream_id: stream_id, flags: flags, window_size_increment: wsi)) do
payload = <<0::1, wsi::31>>
encode_raw(@types[:window_update], flags, stream_id, payload)
end
def encode(continuation(stream_id: stream_id, flags: flags, hbf: hbf)) do
encode_raw(@types[:continuation], flags, stream_id, _payload = hbf)
end
def encode_raw(type, flags, stream_id, payload) do
[<<IO.iodata_length(payload)::24>>, type, flags, <<0::1, stream_id::31>>, payload]
end
## Helpers
error_codes = %{
0x00 => :no_error,
0x01 => :protocol_error,
0x02 => :internal_error,
0x03 => :flow_control_error,
0x04 => :settings_timeout,
0x05 => :stream_closed,
0x06 => :frame_size_error,
0x07 => :refused_stream,
0x08 => :cancel,
0x09 => :compression_error,
0x0A => :connect_error,
0x0B => :enhance_your_calm,
0x0C => :inadequate_security,
0x0D => :http_1_1_required
}
for {code, human_code} <- error_codes do
defp humanize_error_code(unquote(code)), do: unquote(human_code)
defp dehumanize_error_code(unquote(human_code)), do: unquote(code)
end
end
| 30.556745 | 97 | 0.652768 |
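Taken together, `encode/1` and `decode_next/2` in the frame codec above are inverses: encoding produces iodata with the 9-byte frame header, and decoding consumes one complete frame and returns any trailing bytes. A minimal round-trip sketch — the enclosing module's name is not visible in this excerpt, so the `Frame` alias below is an assumption:

```elixir
# Round-trip sketch for the HTTP/2 frame codec above. The defmodule line is
# outside this excerpt; `Frame` is assumed to alias that module.
frame = Frame.ping(stream_id: 0, flags: 0x00, opaque_data: <<1, 2, 3, 4, 5, 6, 7, 8>>)

binary =
  frame
  |> Frame.encode()
  |> IO.iodata_to_binary()

# A PING payload is exactly 8 bytes, so decoding consumes the whole binary
# and the rest is empty.
{:ok, decoded, <<>>} = Frame.decode_next(binary)
^decoded = frame
```

Any bytes past the first complete frame come back in the third element of the tuple, which is how a receive loop can keep feeding partially buffered socket data back through `decode_next/2` until it returns `:more`.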
1c1be3e3d52a6f881535feb7982ded41ae0f2ff5 | 546 | ex | Elixir | lib/groundstation_web/live/mavlink_viz_live.ex | joshprice/groundstation | ef08065f32c6389d87c84cd36f14052fafe0e3ee | [
"Apache-2.0"
] | 23 | 2019-10-03T05:40:57.000Z | 2021-09-29T17:03:59.000Z | lib/groundstation_web/live/mavlink_viz_live.ex | joshprice/groundstation | ef08065f32c6389d87c84cd36f14052fafe0e3ee | [
"Apache-2.0"
] | null | null | null | lib/groundstation_web/live/mavlink_viz_live.ex | joshprice/groundstation | ef08065f32c6389d87c84cd36f14052fafe0e3ee | [
"Apache-2.0"
] | 3 | 2019-10-17T04:13:38.000Z | 2020-07-11T02:18:03.000Z | defmodule GroundStationWeb.MavlinkVizLive do
use Phoenix.LiveView
def render(assigns) do
Phoenix.View.render(GroundStationWeb.PageView, "mavlink_viz.html", assigns)
end
def mount(session, socket) do
{:ok, assign(socket, vehicle: session.vehicle)}
end
def handle_info(:mavlink_message, socket) do
socket.assigns.vehicle
|> update_vehicle(socket)
end
def handle_event(_, _, socket) do
{:noreply, socket}
end
def update_vehicle(state, socket) do
{:noreply, assign(socket, :vehicle, state)}
end
end
| 21.84 | 79 | 0.717949 |
1c1c03ffe770a7740aad2c36265623c520916492 | 339 | exs | Elixir | priv/repo/migrations/20201224022124_add_oban_jobs_table.exs | gustavoarmoa/changelog.com | e898a9979a237ae66962714821ed8633a4966f37 | [
"MIT"
] | 2,599 | 2016-10-25T15:02:53.000Z | 2022-03-26T02:34:42.000Z | priv/repo/migrations/20201224022124_add_oban_jobs_table.exs | sdrees/changelog.com | 955cdcf93d74991062f19a03e34c9f083ade1705 | [
"MIT"
] | 253 | 2016-10-25T20:29:24.000Z | 2022-03-29T21:52:36.000Z | priv/repo/migrations/20201224022124_add_oban_jobs_table.exs | sdrees/changelog.com | 955cdcf93d74991062f19a03e34c9f083ade1705 | [
"MIT"
] | 298 | 2016-10-25T15:18:31.000Z | 2022-01-18T21:25:52.000Z | defmodule Changelog.Repo.Migrations.AddObanJobsTable do
use Ecto.Migration
def up do
Oban.Migrations.up()
end
# We specify `version: 1` in `down`, ensuring that we'll roll all the way back down if
# necessary, regardless of which version we've migrated `up` to.
def down do
Oban.Migrations.down(version: 1)
end
end
| 24.214286 | 88 | 0.719764 |
1c1c193bdf4149c32852c7b3c8ef5362f716573e | 3,163 | ex | Elixir | clients/cloud_functions/lib/google_api/cloud_functions/v1/model/operation.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/cloud_functions/lib/google_api/cloud_functions/v1/model/operation.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/cloud_functions/lib/google_api/cloud_functions/v1/model/operation.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudFunctions.V1.Model.Operation do
@moduledoc """
This resource represents a long-running operation that is the result of a network API call.
## Attributes
* `done` (*type:* `boolean()`, *default:* `nil`) - If the value is `false`, it means the operation is still in progress. If `true`, the operation is completed, and either `error` or `response` is available.
* `error` (*type:* `GoogleApi.CloudFunctions.V1.Model.Status.t`, *default:* `nil`) - The error result of the operation in case of failure or cancellation.
* `metadata` (*type:* `map()`, *default:* `nil`) - Service-specific metadata associated with the operation. It typically contains progress information and common metadata such as create time. Some services might not provide such metadata. Any method that returns a long-running operation should document the metadata type, if any.
* `name` (*type:* `String.t`, *default:* `nil`) - The server-assigned name, which is only unique within the same service that originally returns it. If you use the default HTTP mapping, the `name` should be a resource name ending with `operations/{unique_id}`.
* `response` (*type:* `map()`, *default:* `nil`) - The normal response of the operation in case of success. If the original method returns no data on success, such as `Delete`, the response is `google.protobuf.Empty`. If the original method is standard `Get`/`Create`/`Update`, the response should be the resource. For other methods, the response should have the type `XxxResponse`, where `Xxx` is the original method name. For example, if the original method name is `TakeSnapshot()`, the inferred response type is `TakeSnapshotResponse`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:done => boolean(),
:error => GoogleApi.CloudFunctions.V1.Model.Status.t(),
:metadata => map(),
:name => String.t(),
:response => map()
}
field(:done)
field(:error, as: GoogleApi.CloudFunctions.V1.Model.Status)
field(:metadata, type: :map)
field(:name)
field(:response, type: :map)
end
defimpl Poison.Decoder, for: GoogleApi.CloudFunctions.V1.Model.Operation do
def decode(value, options) do
GoogleApi.CloudFunctions.V1.Model.Operation.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudFunctions.V1.Model.Operation do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 53.610169 | 543 | 0.725261 |
1c1c6bfeeeb1661282529d82bdebbce6f263949a | 10,623 | ex | Elixir | lib/avr/programmer/stk500.ex | luisgabrielroldan/avr | a21b3bfce083adbbcd9a223ec6e4b12ae0439e34 | [
"Apache-2.0"
] | 10 | 2020-01-07T06:54:25.000Z | 2021-10-16T06:42:19.000Z | lib/avr/programmer/stk500.ex | luisgabrielroldan/ex_avr | a21b3bfce083adbbcd9a223ec6e4b12ae0439e34 | [
"Apache-2.0"
] | null | null | null | lib/avr/programmer/stk500.ex | luisgabrielroldan/ex_avr | a21b3bfce083adbbcd9a223ec6e4b12ae0439e34 | [
"Apache-2.0"
] | 2 | 2020-11-24T12:06:50.000Z | 2021-03-15T12:48:14.000Z | defmodule AVR.Programmer.Stk500 do
@moduledoc false
alias Circuits.UART
alias AVR.Programmer, as: PGM
import AVR.Helpers, only: [with_retry: 2, read_min_bytes: 3]
import Kernel, except: [send: 2]
@behaviour AVR.Programmer
@default_speed 115_200
@max_cmd_retries 10
@sync_crc_eop 0x20
@cmd_get_sync 0x30
@cmd_set_parameter 0x40
@cmd_get_parameter 0x41
@cmd_enter_progmode 0x50
@cmd_leave_progmode 0x51
@cmd_load_address 0x55
@cmd_universal 0x56
@cmd_prog_page 0x64
@cmd_read_page 0x74
@cmd_read_sign 0x75
@resp_ok 0x10
@resp_failed 0x10
@resp_insync 0x14
@resp_noinsync 0x15
@parameters [
hw_ver: 0x80,
sw_major: 0x81,
sw_minor: 0x82,
leds: 0x83,
vtarget: 0x84,
vadjust: 0x85,
osc_pscale: 0x86,
osc_cmatch: 0x87,
reset_duration: 0x88,
sck_duration: 0x89,
bufsizel: 0x90,
bufsizeh: 0x91,
device: 0x92,
progmode: 0x93,
paramode: 0x94,
polling: 0x95,
selftimed: 0x96,
topcard_detect: 0x98
]
def get_param(%PGM{} = pgm, param) do
case Keyword.get(@parameters, param) do
nil ->
{:error, :param_name}
param_id ->
cmd = <<@cmd_get_parameter, param_id, @sync_crc_eop>>
with_retry(@max_cmd_retries, fn ->
with :ok <- send(pgm, cmd),
{:ok, <<value>>} <- recv_result(pgm, 1) do
{:done, {:ok, value}}
else
{:error, :no_sync} = error ->
case get_sync(pgm) do
:ok ->
{:retry, error}
error ->
{:done, error}
end
{:error, {:failed, _}} ->
{:done, {:error, :failed}}
_ ->
{:retry, {:error, :get_param}}
end
end)
end
end
def set_param(%PGM{} = pgm, param, value) do
case Keyword.get(@parameters, param) do
nil ->
{:error, :param_name}
param_id ->
cmd = <<@cmd_set_parameter, param_id, value, @sync_crc_eop>>
with_retry(@max_cmd_retries, fn ->
with :ok <- send(pgm, cmd),
:ok <- recv_ok(pgm) do
{:done, :ok}
else
{:error, :no_sync} = error ->
case get_sync(pgm) do
:ok ->
{:retry, error}
error ->
{:done, error}
end
{:error, {:failed, _}} ->
{:done, {:error, :failed}}
_ ->
{:retry, {:error, :set_param}}
end
end)
end
end
def paged_read(%PGM{} = pgm, page_size, {mem, baseaddr}, n_bytes)
when is_integer(baseaddr) and is_integer(n_bytes) do
n_pages = div(n_bytes - 1, page_size) + 1
0..(n_pages - 1)
|> Enum.reduce_while([], fn page_num, acc ->
addr = baseaddr + page_num * page_size
case read_page_from_addr(pgm, page_size, mem, addr) do
{:ok, page_data} ->
{:cont, [page_data | acc]}
error ->
{:halt, error}
end
end)
|> case do
pages when is_list(pages) ->
# Discard not requested extra data
<<data::binary-size(n_bytes), _::binary>> =
pages
|> Enum.reverse()
|> Enum.into(<<>>)
{:ok, data}
error ->
error
end
end
def paged_write(%PGM{} = pgm, page_size, {mem, baseaddr}, data)
when is_integer(baseaddr) and is_binary(data) do
data
|> to_pages(page_size)
|> Enum.reduce_while(baseaddr, fn page_data, addr ->
case write_page_in_addr(pgm, page_size, mem, addr, page_data) do
{:ok, addr} ->
{:cont, addr}
error ->
{:halt, error}
end
end)
|> case do
res when is_integer(res) ->
:ok
error ->
error
end
end
def read_sig_bytes(%PGM{} = pgm) do
with :ok <- send(pgm, <<@cmd_read_sign, @sync_crc_eop>>),
{:ok, res} <- recv_result(pgm, 3) do
{:ok, res}
else
{:error, reason} -> {:error, {:read_sign, reason}}
end
end
def cmd(%PGM{} = pgm, <<_::32>> = cmd) do
with :ok <- send(pgm, <<@cmd_universal, cmd::binary, @sync_crc_eop>>),
{:ok, res} <- recv_result(pgm, 1) do
{:ok, res}
else
{:error, reason} -> {:error, {:cmd, reason}}
end
end
def initialize(%PGM{} = pgm) do
with :ok <- enter_prog_mode(pgm),
{:ok, sw_major} <- get_param(pgm, :sw_major),
{:ok, sw_minor} <- get_param(pgm, :sw_minor) do
meta = Keyword.put(pgm.meta, :sw_version, {sw_major, sw_minor})
{:ok, %{pgm | meta: meta}}
end
end
def open(%PGM{} = pgm, port_name, opts \\ []) do
port_opts = [
speed: opts[:speed] || @default_speed,
active: false
]
case open_port(port_name, port_opts) do
{:ok, port} ->
pgm = %{pgm | port: port}
case get_sync(pgm) do
:ok ->
{:ok, pgm}
error ->
case close(pgm) do
:ok ->
error
error1 ->
error1
end
end
error ->
error
end
end
def close(%PGM{} = pgm) do
with :ok <- leave_prog_mode(pgm) do
close_port(pgm.port)
:ok
end
end
def send(%PGM{port: port}, data) do
case UART.write(port, data) do
:ok -> :ok
{:error, reason} -> {:error, {:send, reason}}
end
end
def drain(%PGM{port: nil}),
do: :ok
def drain(%PGM{port: port}) do
case UART.drain(port) do
:ok -> :ok
{:error, reason} -> {:error, {:drain, reason}}
end
end
def get_sync(%PGM{} = pgm) do
cmd = <<@cmd_get_sync, @sync_crc_eop>>
with :ok <- send(pgm, cmd),
:ok <- drain(pgm),
:ok <- send(pgm, cmd),
:ok <- drain(pgm) do
with_retry(@max_cmd_retries, fn ->
with :ok <- send(pgm, cmd),
:ok <- recv_ok(pgm) do
{:done, :ok}
else
_ ->
{:retry, {:error, :get_sync}}
end
end)
end
end
def init_pgm(_opts \\ []) do
{:ok, %PGM{}}
end
def enter_prog_mode(%PGM{} = pgm) do
cmd = <<@cmd_enter_progmode, @sync_crc_eop>>
with_retry(@max_cmd_retries, fn ->
with :ok <- send(pgm, cmd),
:ok <- recv_ok(pgm) do
{:done, :ok}
else
{:error, :no_sync} = error ->
case get_sync(pgm) do
:ok ->
{:retry, error}
error ->
{:done, error}
end
_ ->
{:retry, {:error, :enter_prog_mode}}
end
end)
end
def leave_prog_mode(%PGM{} = pgm) do
with :ok <- send(pgm, <<@cmd_leave_progmode, @sync_crc_eop>>),
:ok <- recv_ok(pgm) do
:ok
else
{:error, reason} ->
{:error, {:leave_prog_mode, reason}}
end
end
def open_port(port_name, opts) do
{:ok, ref} = UART.start_link()
case UART.open(ref, port_name, opts) do
:ok ->
{:ok, ref}
error ->
UART.stop(ref)
error
end
end
def close_port(ref) do
UART.stop(ref)
end
defp read_page_from_addr(pgm, page_size, mem, addr) do
with_retry(@max_cmd_retries, fn ->
with :ok <- load_address(pgm, mem, addr),
{:ok, page_data} <- read_page(pgm, page_size, mem) do
{:done, {:ok, page_data}}
else
{:error, :no_sync} = error ->
case get_sync(pgm) do
:ok ->
{:retry, error}
error ->
{:retry, error}
end
error ->
{:retry, error}
end
end)
end
defp read_page(%PGM{} = pgm, page_size, mem) do
memtype = get_memtype_code(mem)
buffer = <<
@cmd_read_page,
page_size::16-big,
memtype,
@sync_crc_eop
>>
with :ok <- send(pgm, buffer),
{:ok, res} <- recv_result(pgm, page_size) do
{:ok, res}
else
{:error, reason} ->
{:error, {:read_page, reason}}
end
end
defp write_page_in_addr(pgm, page_size, mem, addr, page_data) do
with_retry(@max_cmd_retries, fn ->
with :ok <- load_address(pgm, mem, addr),
:ok <- write_page(pgm, page_size, mem, page_data) do
{:done, {:ok, addr + byte_size(page_data)}}
else
{:error, :no_sync} = error ->
case get_sync(pgm) do
:ok ->
{:retry, error}
error ->
{:retry, error}
end
error ->
{:retry, error}
end
end)
end
defp write_page(pgm, page_size, mem, page_data) do
memtype = get_memtype_code(mem)
cmd = <<
@cmd_prog_page,
page_size::16-big,
memtype,
page_data::binary,
@sync_crc_eop
>>
with :ok <- send(pgm, cmd),
:ok <- recv_ok(pgm) do
:ok
else
{:error, reason} ->
{:error, {:write_page, reason}}
end
end
defp load_address(pgm, mem, addr) do
addr = translate_address(mem, addr)
cmd = <<@cmd_load_address, addr::16-little, @sync_crc_eop>>
with :ok <- send(pgm, cmd),
:ok <- recv_ok(pgm) do
:ok
else
{:error, reason} ->
{:error, {:load_address, reason}}
end
end
defp translate_address(:flash, addr), do: div(addr, 2)
defp translate_address(:eeprom, addr), do: div(addr, 2)
defp to_pages(data, page_size, acc \\ []) do
case data do
<<>> ->
Enum.reverse(acc)
<<page::binary-size(page_size), rest::binary>> ->
to_pages(rest, page_size, [page | acc])
<<rest::binary>> ->
padding_size = page_size - byte_size(rest)
padding = for _ <- 1..padding_size, into: <<>>, do: <<0>>
page = <<rest::binary, padding::binary>>
to_pages(<<>>, page_size, [page | acc])
end
end
defp recv_ok(pgm) do
case recv_result(pgm) do
{:ok, <<>>} ->
:ok
error ->
error
end
end
defp recv_result(pgm, expected_size \\ 0) do
case read_min_bytes(pgm.port, 2 + expected_size, 1000) do
{:ok, <<@resp_insync, data::binary-size(expected_size), @resp_ok, _::binary>>} ->
{:ok, data}
{:ok, <<@resp_insync, value::8, @resp_failed, _::binary>>} ->
{:error, {:failed, value}}
{:ok, <<@resp_noinsync, _>>} ->
{:error, :no_sync}
{:ok, result} ->
{:error, {:unexpected_result, result}}
{:error, reason} ->
{:error, {:recv_result, reason}}
end
end
defp get_memtype_code(:flash), do: ?F
defp get_memtype_code(:eeprom), do: ?E
end
| 22.458774 | 87 | 0.51605 |
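The paged-write path above depends on the private `to_pages/3` helper to split a firmware image into fixed-size pages, zero-padding the last partial page so every STK500 `prog_page` command carries a full page. A standalone sketch of that splitting behavior (reimplemented here only for illustration, since the original helper is `defp`):

```elixir
# Standalone sketch of the private to_pages/3 helper above: split a binary
# into page_size chunks and zero-pad the final partial page to a full page.
defmodule PageSplitSketch do
  def to_pages(data, page_size, acc \\ []) do
    case data do
      <<>> ->
        Enum.reverse(acc)

      <<page::binary-size(page_size), rest::binary>> ->
        to_pages(rest, page_size, [page | acc])

      rest ->
        padded = rest <> :binary.copy(<<0>>, page_size - byte_size(rest))
        to_pages(<<>>, page_size, [padded | acc])
    end
  end
end

# 5 bytes with a 4-byte page size -> one full page plus one zero-padded page.
PageSplitSketch.to_pages(<<1, 2, 3, 4, 5>>, 4)
#=> [<<1, 2, 3, 4>>, <<5, 0, 0, 0>>]
```

Padding with zeros rather than truncating keeps the device-side page buffer fully defined, at the cost of writing a few extra `0x00` bytes past the end of the image.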
1c1c952578175f4d36b00eeead3a5985da3b191b | 831 | ex | Elixir | lib/changelog_web/views/news_item_comment_view.ex | yanokwa/changelog.com | 88093bada9ff294159246b8200b3121cf41666f7 | [
"MIT"
] | 1 | 2021-03-14T21:12:49.000Z | 2021-03-14T21:12:49.000Z | lib/changelog_web/views/news_item_comment_view.ex | yanokwa/changelog.com | 88093bada9ff294159246b8200b3121cf41666f7 | [
"MIT"
] | null | null | null | lib/changelog_web/views/news_item_comment_view.ex | yanokwa/changelog.com | 88093bada9ff294159246b8200b3121cf41666f7 | [
"MIT"
] | 1 | 2018-10-03T20:55:52.000Z | 2018-10-03T20:55:52.000Z | defmodule ChangelogWeb.NewsItemCommentView do
use ChangelogWeb, :public_view
alias Changelog.{Hashid, ListKit, NewsItemComment, StringKit}
alias ChangelogWeb.{LayoutView, PersonView, TimeView}
def hashid(id) when is_integer(id), do: Hashid.encode(id)
def hashid(comment = %NewsItemComment{}), do: Hashid.encode(comment.id)
def modifier_classes(item, comment) do
[(if Enum.any?(comment.children), do: "comment--has_replies"),
(if item.author_id == comment.author_id, do: "is-author")]
|> ListKit.compact_join()
end
def permalink_path(comment), do: "#comment-#{hashid(comment)}"
def transformed_content(content) do
mentioned = NewsItemComment.mentioned_people(content)
content
|> StringKit.mentions_linkify(mentioned)
|> StringKit.md_linkify()
|> md_to_safe_html()
end
end
| 30.777778 | 73 | 0.728039 |
1c1c97be13111b67c849807081d887a4f0e26494 | 488 | ex | Elixir | lib/fomantic_ui/router.ex | easink/fomantic_ui | 069e3d9f8bf6859223d35d8e14e97d9e0fb03848 | [
"Apache-2.0"
] | null | null | null | lib/fomantic_ui/router.ex | easink/fomantic_ui | 069e3d9f8bf6859223d35d8e14e97d9e0fb03848 | [
"Apache-2.0"
] | null | null | null | lib/fomantic_ui/router.ex | easink/fomantic_ui | 069e3d9f8bf6859223d35d8e14e97d9e0fb03848 | [
"Apache-2.0"
] | null | null | null | defmodule FomanticUI.Router do
use FomanticUI, :router
pipeline :browser do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(:fetch_live_flash)
plug :put_root_layout, {FomanticUI.LayoutView, :root}
plug(:protect_from_forgery)
plug(:put_secure_browser_headers)
end
# pipeline :api do
# plug(:accepts, ["json"])
# end
scope "/", FomanticUI do
pipe_through(:browser)
# get("/", PageController, :index)
live("/", IndexLive)
end
end
| 20.333333 | 57 | 0.661885 |
1c1ca51d84ae7ea7b557322533fc1cdb923b0a60 | 7,121 | exs | Elixir | test/nerves_new_test.exs | axelson/nerves_bootstrap | d1f13eb5b186af636c8a3a7ad512400ee6a1653c | [
"Apache-2.0"
] | 33 | 2017-10-20T04:04:00.000Z | 2021-04-27T11:15:08.000Z | test/nerves_new_test.exs | axelson/nerves_bootstrap | d1f13eb5b186af636c8a3a7ad512400ee6a1653c | [
"Apache-2.0"
] | 67 | 2018-01-10T15:41:43.000Z | 2022-02-23T22:11:55.000Z | test/nerves_new_test.exs | axelson/nerves_bootstrap | d1f13eb5b186af636c8a3a7ad512400ee6a1653c | [
"Apache-2.0"
] | 14 | 2018-02-04T16:31:28.000Z | 2022-01-21T11:12:46.000Z | Code.require_file("mix_helper.exs", __DIR__)
defmodule Nerves.NewTest do
use ExUnit.Case
import MixHelper
@app_name "my_device"
setup do
# The shell asks to install deps.
# We will politely say not to.
send(self(), {:mix_shell_input, :yes?, false})
:ok
end
test "new project default targets", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/README.md")
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ "@app :#{@app_name}"
assert file =~ "{:nerves_system_rpi, \"~> 1.17\", runtime: false, targets: :rpi"
assert file =~ "{:nerves_system_rpi0, \"~> 1.17\", runtime: false, targets: :rpi0"
assert file =~ "{:nerves_system_rpi2, \"~> 1.17\", runtime: false, targets: :rpi2"
assert file =~ "{:nerves_system_rpi3, \"~> 1.17\", runtime: false, targets: :rpi3"
assert file =~ "{:nerves_system_rpi3a, \"~> 1.17\", runtime: false, targets: :rpi3a"
assert file =~ "{:nerves_system_rpi4, \"~> 1.17\", runtime: false, targets: :rpi4"
assert file =~ "{:nerves_system_bbb, \"~> 2.12\", runtime: false, targets: :bbb"
assert file =~
"{:nerves_system_osd32mp1, \"~> 0.8\", runtime: false, targets: :osd32mp1"
assert file =~ "{:nerves_system_x86_64, \"~> 1.17\", runtime: false, targets: :x86_64"
end)
end)
end
test "new project single target", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--target", "rpi"])
assert_file("#{@app_name}/README.md")
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ "@app :#{@app_name}"
assert file =~ "{:nerves_system_rpi, \"~> 1.17\", runtime: false, targets: :rpi"
refute file =~ "{:nerves_system_rpi0, \"~> 1.17\", runtime: false, targets: :rpi0"
end)
end)
end
test "new project multiple target", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--target", "rpi", "--target", "rpi3"])
assert_file("#{@app_name}/README.md")
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ "@app :#{@app_name}"
assert file =~ "{:nerves_system_rpi, \"~> 1.17\", runtime: false, targets: :rpi"
assert file =~ "{:nerves_system_rpi3, \"~> 1.17\", runtime: false, targets: :rpi3"
refute file =~ "{:nerves_system_rpi0, \"~> 1.17\", runtime: false, targets: :rpi0"
end)
end)
end
test "new project cookie set", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--cookie", "foo"])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~s{cookie: "foo"}
end)
end)
end
test "new project provides a default cookie", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ "cookie: \"\#{@app}_cookie\""
end)
end)
end
test "new project enables heart", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/rel/vm.args.eex", fn file ->
assert file =~ "-heart -env HEART_BEAT_TIMEOUT"
end)
end)
end
test "new project enables embedded mode", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/rel/vm.args.eex", fn file ->
assert file =~ "-mode embedded"
end)
end)
end
test "new project adds runtime_tools", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/extra_applications:.*runtime_tools/
end)
end)
end
test "new project includes ring_logger", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/:ring_logger/
end)
assert_file("#{@app_name}/config/config.exs", fn file ->
assert file =~ ~r/RingLogger/
end)
end)
end
test "new project includes toolshed", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/:toolshed/
end)
end)
end
test "new project sets build_embedded", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/build_embedded:/
end)
end)
end
test "new project with nerves_pack", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--nerves-pack"])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/:nerves_pack/
end)
assert_file("#{@app_name}/config/target.exs", fn file ->
assert file =~ ~r"nerves_ssh"
assert file =~ ~r"vintage_net"
assert file =~ ~r"mdns_lite"
end)
end)
end
test "new project with implicit nerves_pack", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/mix.exs", fn file ->
assert file =~ ~r/:nerves_pack/
end)
assert_file("#{@app_name}/config/target.exs", fn file ->
assert file =~ ~r"nerves_ssh"
assert file =~ ~r"vintage_net"
assert file =~ ~r"mdns_lite"
end)
end)
end
test "new project without nerves pack", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--no-nerves-pack"])
assert_file("#{@app_name}/mix.exs", fn file ->
refute file =~ ~r"nerves_pack"
end)
assert_file("#{@app_name}/config/config.exs", fn file ->
refute file =~ ~r"nerves_pack"
refute file =~ ~r"nerves_ssh"
end)
end)
end
test "new projects cannot use reserved names", context do
in_tmp(context.test, fn ->
assert_raise(Mix.Error, "New projects cannot be named 'nerves'", fn ->
Mix.Tasks.Nerves.New.run(["nerves"])
end)
end)
end
test "new project sets source_date_epoch time", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name, "--source-date-epoch", "1234"])
assert_file("#{@app_name}/config/config.exs", fn file ->
assert file =~ ~r/source_date_epoch: "1234"/
end)
end)
end
test "new project generates source_date_epoch time", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/config/config.exs", fn file ->
assert file =~ ~r/source_date_epoch: /
end)
end)
end
test "new project generates sample erlinit config", context do
in_tmp(context.test, fn ->
Mix.Tasks.Nerves.New.run([@app_name])
assert_file("#{@app_name}/config/target.exs", fn file ->
assert file =~ ~r"erlinit"
end)
end)
end
end
| 29.920168 | 94 | 0.599635 |
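The generator tests above lean entirely on `=~`, which matches a string against either a regex or a literal substring. A dependency-free illustration of both forms (the `mix_exs` string here is a made-up stand-in, not real generator output):

```elixir
# Hypothetical fragment of a generated mix.exs, for demonstration only.
mix_exs = "def application do [extra_applications: [:logger, :runtime_tools]] end"

# Regex form, as used in the runtime_tools assertion above.
IO.inspect(mix_exs =~ ~r/extra_applications:.*runtime_tools/)

# With a plain string on the right, =~ does substring matching instead.
IO.inspect(mix_exs =~ ":ring_logger")
```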
1c1cd4e79ec0b8cb97137d4739a28ead916bd187 | 900 | ex | Elixir | scenic_live_reload/lib/scenic_live_reload.ex | QuantumProductions/scenic_font_test | ff8d0df6ade399039b9d9e816e398cb1ad80a7db | [
"BSD-3-Clause"
] | 1 | 2019-05-29T00:45:46.000Z | 2019-05-29T00:45:46.000Z | scenic_live_reload/lib/scenic_live_reload.ex | QuantumProductions/scenic_font_test | ff8d0df6ade399039b9d9e816e398cb1ad80a7db | [
"BSD-3-Clause"
] | null | null | null | scenic_live_reload/lib/scenic_live_reload.ex | QuantumProductions/scenic_font_test | ff8d0df6ade399039b9d9e816e398cb1ad80a7db | [
"BSD-3-Clause"
] | null | null | null |
defmodule ScenicLiveReload do
@moduledoc """
A simple, generic code reloader for Scenic Scenes
"""
use GenServer
require Logger
defmodule State do
@moduledoc false
defstruct []
end
def start_link(state, name \\ __MODULE__) do
GenServer.start_link(__MODULE__, state, name: name)
end
@impl GenServer
def init(_) do
Logger.debug("SceneReloader running #{inspect(self())}")
state = %State{}
{:ok, state}
end
@impl GenServer
def handle_call(:reload_current_scene, _, state) do
Logger.info("Reloading current scene!")
reload_current_scenes()
{:reply, nil, state}
end
defp reload_current_scenes do
ScenicLiveReload.Private.GetScenePids.scene_pids()
|> Enum.each(fn
{:ok, pid} ->
Process.exit(pid, :kill)
_ ->
Logger.warn("Unable to find any scene PID's to reload")
nil
end)
end
end
| 19.565217 | 63 | 0.656667 |
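A self-contained sketch of the `GenServer.call/2` round-trip the module above exposes. `ReloaderStub` is an invented stand-in so the snippet runs without Scenic; it replies instead of killing scene processes:

```elixir
defmodule ReloaderStub do
  use GenServer

  # Same start_link/name convention as ScenicLiveReload above.
  def start_link(state \\ %{}, name \\ __MODULE__) do
    GenServer.start_link(__MODULE__, state, name: name)
  end

  @impl GenServer
  def init(state), do: {:ok, state}

  # Stands in for the :reload_current_scene clause.
  @impl GenServer
  def handle_call(:reload_current_scene, _from, state) do
    {:reply, :reloaded, state}
  end
end

{:ok, _pid} = ReloaderStub.start_link()
IO.inspect(GenServer.call(ReloaderStub, :reload_current_scene))
```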
1c1cd5ab6af4ab24e648681dd8ccddf7dd90ceb5 | 6,262 | exs | Elixir | test/banchan_web/controllers/user_auth_test.exs | riamaria/banchan | c4f8bd9374acaf0a8bb2c501e2ae1eb78f96579f | [
"BlueOak-1.0.0",
"Apache-2.0"
] | null | null | null | test/banchan_web/controllers/user_auth_test.exs | riamaria/banchan | c4f8bd9374acaf0a8bb2c501e2ae1eb78f96579f | [
"BlueOak-1.0.0",
"Apache-2.0"
] | null | null | null | test/banchan_web/controllers/user_auth_test.exs | riamaria/banchan | c4f8bd9374acaf0a8bb2c501e2ae1eb78f96579f | [
"BlueOak-1.0.0",
"Apache-2.0"
] | null | null | null |
defmodule BanchanWeb.UserAuthTest do
use BanchanWeb.ConnCase, async: true
alias Banchan.Accounts
alias BanchanWeb.UserAuth
import Banchan.AccountsFixtures
@remember_me_cookie "_banchan_web_user_remember_me"
setup %{conn: conn} do
conn =
conn
|> Map.replace!(:secret_key_base, BanchanWeb.Endpoint.config(:secret_key_base))
|> init_test_session(%{})
%{user: user_fixture(), conn: conn}
end
describe "log_in_user/3" do
test "stores the user token in the session", %{conn: conn, user: user} do
conn = UserAuth.log_in_user(conn, user)
assert token = get_session(conn, :user_token)
assert get_session(conn, :live_socket_id) == "users_sessions:#{Base.url_encode64(token)}"
assert redirected_to(conn) == "/"
assert Accounts.get_user_by_session_token(token)
end
test "clears everything previously stored in the session", %{conn: conn, user: user} do
conn = conn |> put_session(:to_be_removed, "value") |> UserAuth.log_in_user(user)
refute get_session(conn, :to_be_removed)
end
test "redirects to the configured path", %{conn: conn, user: user} do
conn = conn |> put_session(:user_return_to, "/hello") |> UserAuth.log_in_user(user)
assert redirected_to(conn) == "/hello"
end
test "writes a cookie if remember_me is configured", %{conn: conn, user: user} do
conn = conn |> fetch_cookies() |> UserAuth.log_in_user(user, %{"remember_me" => "true"})
assert get_session(conn, :user_token) == conn.cookies[@remember_me_cookie]
assert %{value: signed_token, max_age: max_age} = conn.resp_cookies[@remember_me_cookie]
assert signed_token != get_session(conn, :user_token)
assert max_age == 5_184_000
end
end
describe "logout_user/1" do
test "erases session and cookies", %{conn: conn, user: user} do
user_token = Accounts.generate_user_session_token(user)
conn =
conn
|> put_session(:user_token, user_token)
|> put_req_cookie(@remember_me_cookie, user_token)
|> fetch_cookies()
|> UserAuth.log_out_user()
refute get_session(conn, :user_token)
refute conn.cookies[@remember_me_cookie]
assert %{max_age: 0} = conn.resp_cookies[@remember_me_cookie]
assert redirected_to(conn) == "/"
refute Accounts.get_user_by_session_token(user_token)
end
test "broadcasts to the given live_socket_id", %{conn: conn} do
live_socket_id = "users_sessions:abcdef-token"
BanchanWeb.Endpoint.subscribe(live_socket_id)
conn
|> put_session(:live_socket_id, live_socket_id)
|> UserAuth.log_out_user()
assert_receive %Phoenix.Socket.Broadcast{
event: "disconnect",
topic: "users_sessions:abcdef-token"
}
end
test "works even if user is already logged out", %{conn: conn} do
conn = conn |> fetch_cookies() |> UserAuth.log_out_user()
refute get_session(conn, :user_token)
assert %{max_age: 0} = conn.resp_cookies[@remember_me_cookie]
assert redirected_to(conn) == "/"
end
end
describe "fetch_current_user/2" do
test "authenticates user from session", %{conn: conn, user: user} do
user_token = Accounts.generate_user_session_token(user)
conn = conn |> put_session(:user_token, user_token) |> UserAuth.fetch_current_user([])
assert conn.assigns.current_user.id == user.id
end
test "authenticates user from cookies", %{conn: conn, user: user} do
logged_in_conn =
conn |> fetch_cookies() |> UserAuth.log_in_user(user, %{"remember_me" => "true"})
user_token = logged_in_conn.cookies[@remember_me_cookie]
%{value: signed_token} = logged_in_conn.resp_cookies[@remember_me_cookie]
conn =
conn
|> put_req_cookie(@remember_me_cookie, signed_token)
|> UserAuth.fetch_current_user([])
assert get_session(conn, :user_token) == user_token
assert conn.assigns.current_user.id == user.id
end
test "does not authenticate if data is missing", %{conn: conn, user: user} do
_ = Accounts.generate_user_session_token(user)
conn = UserAuth.fetch_current_user(conn, [])
refute get_session(conn, :user_token)
refute conn.assigns.current_user
end
end
describe "redirect_if_user_is_authenticated/2" do
test "redirects if user is authenticated", %{conn: conn, user: user} do
conn = conn |> assign(:current_user, user) |> UserAuth.redirect_if_user_is_authenticated([])
assert conn.halted
assert redirected_to(conn) == "/"
end
test "does not redirect if user is not authenticated", %{conn: conn} do
conn = UserAuth.redirect_if_user_is_authenticated(conn, [])
refute conn.halted
refute conn.status
end
end
describe "require_authenticated_user/2" do
test "redirects if user is not authenticated", %{conn: conn} do
conn = conn |> fetch_flash() |> UserAuth.require_authenticated_user([])
assert conn.halted
assert redirected_to(conn) == Routes.login_path(conn, :new)
assert get_flash(conn, :error) == "You must log in to access this page."
end
test "stores the path to redirect to on GET", %{conn: conn} do
halted_conn =
%{conn | path_info: ["foo"], query_string: ""}
|> fetch_flash()
|> UserAuth.require_authenticated_user([])
assert halted_conn.halted
assert get_session(halted_conn, :user_return_to) == "/foo"
halted_conn =
%{conn | path_info: ["foo"], query_string: "bar=baz"}
|> fetch_flash()
|> UserAuth.require_authenticated_user([])
assert halted_conn.halted
assert get_session(halted_conn, :user_return_to) == "/foo?bar=baz"
halted_conn =
%{conn | path_info: ["foo"], query_string: "bar", method: "POST"}
|> fetch_flash()
|> UserAuth.require_authenticated_user([])
assert halted_conn.halted
refute get_session(halted_conn, :user_return_to)
end
test "does not redirect if user is authenticated", %{conn: conn, user: user} do
conn = conn |> assign(:current_user, user) |> UserAuth.require_authenticated_user([])
refute conn.halted
refute conn.status
end
end
end
| 35.988506 | 98 | 0.673587 |
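The `live_socket_id` assertion near the top of the file depends only on `Base.url_encode64/1` applied to the session token. A standalone illustration with arbitrary token bytes:

```elixir
# Arbitrary stand-in token; real tokens are opaque binaries.
token = <<1, 2, 3, 4>>
live_socket_id = "users_sessions:" <> Base.url_encode64(token)
IO.inspect(live_socket_id)
```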
1c1cd63e8ecb4aa1f6bdaa2d390f7f2c7cd582b7 | 12,759 | ex | Elixir | deps/ecto/lib/ecto/changeset/relation.ex | rchervin/phoenixportfolio | a5a6a60168d7261647a10a8dbd395b440db8a4f9 | [
"MIT"
] | null | null | null | deps/ecto/lib/ecto/changeset/relation.ex | rchervin/phoenixportfolio | a5a6a60168d7261647a10a8dbd395b440db8a4f9 | [
"MIT"
] | null | null | null | deps/ecto/lib/ecto/changeset/relation.ex | rchervin/phoenixportfolio | a5a6a60168d7261647a10a8dbd395b440db8a4f9 | [
"MIT"
] | null | null | null |
defmodule Ecto.Changeset.Relation do
@moduledoc false
alias Ecto.Changeset
alias Ecto.Association.NotLoaded
@type t :: %{cardinality: :one | :many,
on_replace: :raise | :mark_as_invalid | atom,
relationship: :parent | :child,
owner: atom,
related: atom,
field: atom}
@doc """
Builds the related data.
"""
@callback build(t) :: Ecto.Schema.t
@doc """
Returns empty container for relation.
"""
def empty(%{cardinality: cardinality}), do: cardinality_to_empty(cardinality)
defp cardinality_to_empty(:one), do: nil
defp cardinality_to_empty(:many), do: []
@doc """
Checks if the container can be considered empty.
"""
def empty?(%{cardinality: _}, %NotLoaded{}), do: true
def empty?(%{cardinality: :many}, []), do: true
def empty?(%{cardinality: :one}, nil), do: true
def empty?(%{}, _), do: false
@doc """
Applies related changeset changes
"""
def apply_changes(%{cardinality: :one}, nil) do
nil
end
def apply_changes(%{cardinality: :one}, changeset) do
apply_changes(changeset)
end
def apply_changes(%{cardinality: :many}, changesets) do
for changeset <- changesets,
struct = apply_changes(changeset),
do: struct
end
defp apply_changes(%Changeset{action: :delete}), do: nil
defp apply_changes(%Changeset{action: :replace}), do: nil
defp apply_changes(changeset), do: Changeset.apply_changes(changeset)
@doc """
Loads the relation with the given struct.
Loading will fail if the association is not loaded but the struct is.
"""
def load!(%{__meta__: %{state: :built}}, %NotLoaded{__cardinality__: cardinality}) do
cardinality_to_empty(cardinality)
end
def load!(struct, %NotLoaded{__field__: field}) do
raise "attempting to cast or change association `#{field}` " <>
"from `#{inspect struct.__struct__}` that was not loaded. Please preload your " <>
"associations before manipulating them through changesets"
end
def load!(_struct, loaded), do: loaded
@doc """
Casts related according to the `on_cast` function.
"""
def cast(%{cardinality: :one} = relation, nil, current, _on_cast) do
case current && on_replace(relation, current) do
:error -> :error
_ -> {:ok, nil, true, false}
end
end
def cast(%{cardinality: :many} = relation, params, current, on_cast) when is_map(params) do
params =
params
|> Enum.map(&key_as_int/1)
|> Enum.sort
|> Enum.map(&elem(&1, 1))
cast(relation, params, current, on_cast)
end
def cast(%{related: mod} = relation, params, current, on_cast) do
pks = mod.__schema__(:primary_key)
cast_or_change(relation, params, current, struct_pk(mod, pks),
param_pk(mod, pks), &do_cast(relation, &1, &2, &3, on_cast))
end
defp do_cast(meta, params, nil, allowed_actions, on_cast) do
{:ok,
on_cast.(meta.__struct__.build(meta), params)
|> put_new_action(:insert)
|> check_action!(allowed_actions)}
end
defp do_cast(relation, nil, current, _allowed_actions, _on_cast) do
on_replace(relation, current)
end
defp do_cast(_meta, params, struct, allowed_actions, on_cast) do
{:ok,
on_cast.(struct, params)
|> put_new_action(:update)
|> check_action!(allowed_actions)}
end
@doc """
Wraps related structs in changesets.
"""
def change(%{cardinality: :one} = relation, nil, current) do
case current && on_replace(relation, current) do
:error -> :error
_ -> {:ok, nil, true, false}
end
end
def change(%{related: mod} = relation, value, current) do
get_pks = struct_pk(mod, mod.__schema__(:primary_key))
cast_or_change(relation, value, current, get_pks, get_pks,
&do_change(relation, &1, &2, &3))
end
# This may be an insert or an update, get all fields.
defp do_change(_relation, %{__struct__: _} = changeset_or_struct, nil, _allowed_actions) do
changeset = Changeset.change(changeset_or_struct)
{:ok, put_new_action(changeset, action_from_changeset(changeset))}
end
defp do_change(relation, nil, current, _allowed_actions) do
on_replace(relation, current)
end
defp do_change(_relation, %Changeset{} = changeset, _current, allowed_actions) do
{:ok, put_new_action(changeset, :update) |> check_action!(allowed_actions)}
end
defp do_change(_relation, %{__struct__: _} = struct, _current, allowed_actions) do
{:ok, struct |> Ecto.Changeset.change |> put_new_action(:update) |> check_action!(allowed_actions)}
end
defp do_change(%{related: mod} = relation, changes, current, allowed_actions)
when is_list(changes) or is_map(changes) do
changeset = Ecto.Changeset.change(current || mod.__struct__, changes)
changeset = put_new_action(changeset, action_from_changeset(changeset))
do_change(relation, changeset, current, allowed_actions)
end
defp action_from_changeset(%{data: %{__meta__: %{state: state}}}) do
case state do
:built -> :insert
:loaded -> :update
:deleted -> :delete
end
end
defp action_from_changeset(_) do
:insert # We don't care if it is insert/update for embeds (no meta)
end
@doc """
Handles the changeset or struct when being replaced.
"""
def on_replace(%{on_replace: :mark_as_invalid}, _changeset_or_struct) do
:error
end
def on_replace(%{on_replace: :raise, field: name, owner: owner}, _) do
raise """
you are attempting to change relation #{inspect name} of
#{inspect owner}, but there is missing data.
If you are attempting to update an existing entry, please make sure
you include the entry primary key (ID) alongside the data.
If you have a relationship with many children, at least the same N
children must be given on update. By default it is not possible to
orphan embed nor associated records, attempting to do so results in
this error message.
If you don't desire the current behavior or if you are using embeds
without a primary key, it is possible to change this behaviour by
setting `:on_replace` when defining the relation. See `Ecto.Changeset`'s
section on related data for more info.
"""
end
def on_replace(_relation, changeset_or_struct) do
{:ok, Changeset.change(changeset_or_struct) |> put_new_action(:replace)}
end
defp cast_or_change(%{cardinality: :one} = relation, value, current, current_pks,
new_pks, fun) when is_map(value) or is_list(value) or is_nil(value) do
single_change(relation, value, current_pks, new_pks, fun, current)
end
defp cast_or_change(%{cardinality: :many}, [], [], _current_pks, _new_pks, _fun) do
{:ok, [], true, false}
end
defp cast_or_change(%{cardinality: :many, unique: unique}, value, current, current_pks, new_pks, fun) when is_list(value) do
map_changes(value, new_pks, fun, process_current(current, current_pks), [], true, true, unique && %{})
end
defp cast_or_change(_, _, _, _, _, _), do: :error
# single change
defp single_change(_relation, nil, _current_pks, _new_pks, fun, current) do
single_change(nil, current, fun, [:update, :delete], false)
end
defp single_change(_relation, new, _current_pks, _new_pks, fun, nil) do
single_change(new, nil, fun, [:insert], false)
end
defp single_change(%{on_replace: on_replace} = relation, new, current_pks, new_pks, fun, current) do
pk_values = new_pks.(new)
if on_replace == :update or (pk_values == current_pks.(current) and pk_values != []) do
single_change(new, current, fun, allowed_actions(pk_values), true)
else
case on_replace(relation, current) do
{:ok, _changeset} -> single_change(new, nil, fun, [:insert], false)
:error -> :error
end
end
end
defp single_change(new, current, fun, allowed_actions, skippable?) do
case fun.(new, current, allowed_actions) do
{:ok, changeset} ->
{:ok, changeset, changeset.valid?, skippable? and skip?(changeset)}
:error ->
:error
end
end
# map changes
defp map_changes([changes | rest], new_pks, fun, current, acc, valid?, skip?, acc_pk_values)
when is_map(changes) or is_list(changes) do
pk_values = new_pks.(changes)
{struct, current, allowed_actions} = pop_current(current, pk_values)
case fun.(changes, struct, allowed_actions) do
{:ok, changeset} ->
changeset = maybe_add_error_on_pk(changeset, pk_values, acc_pk_values)
map_changes(rest, new_pks, fun, current, [changeset | acc],
valid? and changeset.valid?, (struct != nil) and skip? and skip?(changeset),
acc_pk_values && Map.put(acc_pk_values, pk_values, true))
:error ->
:error
end
end
defp map_changes([], _new_pks, fun, current, acc, valid?, skip?, _acc_pk_values) do
current_structs = Enum.map(current, &elem(&1, 1))
reduce_delete_changesets(current_structs, fun, Enum.reverse(acc), valid?, skip?)
end
defp map_changes(_params, _new_pks, _fun, _current, _acc, _valid?, _skip?, _acc_pk_values) do
:error
end
defp maybe_add_error_on_pk(%{data: %{__struct__: schema}} = changeset, pk_values, acc_pk_values) do
if is_map(acc_pk_values) and not missing_pks?(pk_values) and
Map.has_key?(acc_pk_values, pk_values) do
Enum.reduce(schema.__schema__(:primary_key), changeset, fn pk, acc ->
Changeset.add_error(acc, pk, "has already been taken")
end)
else
changeset
end
end
defp missing_pks?(pk_values) do
pk_values == [] or Enum.any?(pk_values, &is_nil/1)
end
defp allowed_actions(pk_values) do
if Enum.all?(pk_values, &is_nil/1) do
[:insert, :update, :delete]
else
[:update, :delete]
end
end
defp reduce_delete_changesets([], _fun, acc, valid?, skip?) do
{:ok, acc, valid?, skip?}
end
defp reduce_delete_changesets([struct | rest], fun, acc, valid?, _skip?) do
case fun.(nil, struct, [:update, :delete]) do
{:ok, changeset} ->
reduce_delete_changesets(rest, fun, [changeset | acc],
valid? and changeset.valid?, false)
:error ->
:error
end
end
# helpers
defp check_action!(changeset, allowed_actions) do
action = changeset.action
cond do
action in allowed_actions ->
changeset
action == :insert ->
raise "cannot #{action} related #{inspect changeset.data} " <>
"because it is already associated with the given struct"
true ->
raise "cannot #{action} related #{inspect changeset.data} because " <>
"it already exists and it is not currently associated with the " <>
"given struct. Ecto forbids casting existing records through " <>
"the association field for security reasons. Instead, set " <>
"the foreign key value accordingly"
end
end
defp key_as_int({key, val}) when is_binary(key) do
case Integer.parse(key) do
{key, ""} -> {key, val}
_ -> {key, val}
end
end
defp key_as_int(key_val), do: key_val
defp process_current(nil, _get_pks),
do: %{}
defp process_current(current, get_pks) do
Enum.reduce(current, {%{}, 0}, fn struct, {acc, index} ->
case get_pks.(struct) do
[] -> {Map.put(acc, index, struct), index + 1}
pks -> {Map.put(acc, pks, struct), index}
end
end) |> elem(0)
end
defp pop_current(current, pk_values) do
case Map.fetch(current, pk_values) do
{:ok, struct} ->
{struct, Map.delete(current, pk_values), allowed_actions(pk_values)}
:error ->
{nil, current, [:insert]}
end
end
defp struct_pk(_mod, pks) do
fn
%Changeset{data: struct} -> Enum.map(pks, &Map.get(struct, &1))
[_|_] = struct -> Enum.map(pks, &Keyword.get(struct, &1))
%{} = struct -> Enum.map(pks, &Map.get(struct, &1))
end
end
defp param_pk(mod, pks) do
pks = Enum.map(pks, &{&1, Atom.to_string(&1), mod.__schema__(:type, &1)})
fn params ->
Enum.map pks, fn {atom_key, string_key, type} ->
original = Map.get(params, string_key) || Map.get(params, atom_key)
case Ecto.Type.cast(type, original) do
{:ok, value} -> value
:error -> original
end
end
end
end
defp put_new_action(%{action: action} = changeset, new_action) when is_nil(action),
do: Map.put(changeset, :action, new_action)
defp put_new_action(changeset, _new_action),
do: changeset
defp skip?(%{valid?: true, changes: empty, action: :update}) when empty == %{},
do: true
defp skip?(_changeset),
do: false
end
| 32.968992 | 126 | 0.653891 |
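The `key_as_int/1` helper above exists so that `cast/4` can order map params such as `%{"0" => ..., "10" => ...}` numerically rather than lexicographically. A dependency-free re-creation of that normalization step:

```elixir
# Mirrors key_as_int/1: parse binary keys to integers when possible,
# so Enum.sort/1 orders "10" after "2" instead of before it.
key_as_int = fn
  {key, val} when is_binary(key) ->
    case Integer.parse(key) do
      {int, ""} -> {int, val}
      _ -> {key, val}
    end

  pair ->
    pair
end

sorted_values =
  %{"10" => :c, "2" => :b, "1" => :a}
  |> Enum.map(key_as_int)
  |> Enum.sort()
  |> Enum.map(&elem(&1, 1))

IO.inspect(sorted_values)
```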
1c1cf1fdadbc0aff1555c807523e0f6792fd069f | 533 | ex | Elixir | memory_backend/lib/memory_backend/model/score.ex | AdrianPaulCarrieres/lpiot2020-memory-adrianpaulcarrieres | 0a2d66c6ecf501188a949807c8ea2d99c26c531b | [
"MIT"
] | null | null | null | memory_backend/lib/memory_backend/model/score.ex | AdrianPaulCarrieres/lpiot2020-memory-adrianpaulcarrieres | 0a2d66c6ecf501188a949807c8ea2d99c26c531b | [
"MIT"
] | 15 | 2020-12-23T16:09:28.000Z | 2020-12-26T22:32:47.000Z | memory_backend/lib/memory_backend/model/score.ex | AdrianPaulCarrieres/lpiot2020-memory-adrianpaulcarrieres | 0a2d66c6ecf501188a949807c8ea2d99c26c531b | [
"MIT"
] | null | null | null |
defmodule MemoryBackend.Model.Score do
use Ecto.Schema
import Ecto.Changeset
@derive {Jason.Encoder, only: [:id, :deck_id, :score, :players]}
schema "scores" do
field :score, :integer
belongs_to(:deck, MemoryBackend.Model.Deck)
has_many :players, MemoryBackend.Model.Player, on_replace: :delete
timestamps()
end
@doc false
def changeset(score, attrs) do
score
|> cast(attrs, [:score])
|> cast_assoc(:players)
|> validate_required([:score])
|> assoc_constraint(:deck)
end
end
| 20.5 | 70 | 0.675422 |
1c1cf2463e59bce81036a85877d231cbe8bfd6e3 | 3,977 | ex | Elixir | clients/apigee/lib/google_api/apigee/v1/api/hybrid.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/apigee/lib/google_api/apigee/v1/api/hybrid.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/apigee/lib/google_api/apigee/v1/api/hybrid.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Apigee.V1.Api.Hybrid do
@moduledoc """
API calls for all endpoints tagged `Hybrid`.
"""
alias GoogleApi.Apigee.V1.Connection
alias GoogleApi.Gax.{Request, Response}
@library_version Mix.Project.config() |> Keyword.get(:version, "")
@doc """
Lists hybrid services and its trusted issuers service account ids.
This api is authenticated and unauthorized(allow all the users) and used by
runtime authn-authz service to query control plane's issuer service account
ids.
## Parameters
* `connection` (*type:* `GoogleApi.Apigee.V1.Connection.t`) - Connection to server
* `name` (*type:* `String.t`) - Required. Must be of the form `hybrid/issuers`.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1ListHybridIssuersResponse{}}` on success
* `{:error, info}` on failure
"""
@spec apigee_hybrid_issuers_list(Tesla.Env.client(), String.t(), keyword(), keyword()) ::
{:ok, GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1ListHybridIssuersResponse.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def apigee_hybrid_issuers_list(connection, name, optional_params \\ [], opts \\ []) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/{+name}", %{
"name" => URI.encode(name, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1ListHybridIssuersResponse{}]
)
end
end
| 43.228261 | 196 | 0.659039 |
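The `optional_params_config` map above tags each supported option as a `:query` parameter. A rough, dependency-free sketch of the filtering idea behind `Request.add_optional_params/2` (an illustration of the concept, not the actual implementation):

```elixir
# Only options the config marks as :query survive; unknown keys are dropped.
config = %{alt: :query, fields: :query, prettyPrint: :query}
caller_opts = [alt: "json", fields: "name", bogus: "dropped"]

query_params = for {k, v} <- caller_opts, Map.get(config, k) == :query, do: {k, v}
IO.inspect(query_params)
```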
1c1d05b8966332e8c64c7dfe44bc1a5707fa7ab6 | 640 | ex | Elixir | lib/sobelow/vuln/cookie_rce.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null | lib/sobelow/vuln/cookie_rce.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null | lib/sobelow/vuln/cookie_rce.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null |
defmodule Sobelow.Vuln.CookieRCE do
alias Sobelow.Utils
alias Sobelow.Vuln
use Sobelow.Finding
@vuln_vsn ~w(1.3.1 1.3.0 1.2.2 1.2.1 1.2.0 1.1.6 1.1.5 1.1.4 1.1.3 1.1.2 1.1.1 1.1.0 1.0.3 1.0.2 1.0.1 1.0.0)
def run(root) do
plug_conf = root <> "/deps/plug/mix.exs"
if File.exists?(plug_conf) do
vsn = Utils.get_version(plug_conf)
if Enum.member?(@vuln_vsn, vsn) do
Vuln.print_finding(
vsn,
"Plug",
"Arbitrary Code Execution in Cookie Serialization",
"CVE-2017-1000053"
)
end
end
end
def details() do
Sobelow.Vuln.details()
end
end
| 22.068966 | 111 | 0.592188 |
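Stripped of the file I/O, the vulnerability test in `run/1` above reduces to list membership against the pinned version list (shortened here):

```elixir
# A truncated copy of the @vuln_vsn list for illustration.
vuln_vsn = ~w(1.3.1 1.3.0 1.2.2 1.2.1 1.2.0)
vulnerable? = fn vsn -> Enum.member?(vuln_vsn, vsn) end

IO.inspect(vulnerable?.("1.3.1"))
IO.inspect(vulnerable?.("1.4.0"))
```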
1c1d08123dde61d516744c59f067b81a7cf4a11a | 1,583 | ex | Elixir | apps/auth/web/web.ex | mikrofusion/passwordless | 26621389cc0e8e307080b1eb5be42b0527e6611b | [
"MIT"
] | null | null | null | apps/auth/web/web.ex | mikrofusion/passwordless | 26621389cc0e8e307080b1eb5be42b0527e6611b | [
"MIT"
] | null | null | null | apps/auth/web/web.ex | mikrofusion/passwordless | 26621389cc0e8e307080b1eb5be42b0527e6611b | [
"MIT"
] | null | null | null |
defmodule Auth.Web do
@moduledoc """
A module that keeps using definitions for controllers,
views and so on.
This can be used in your application as:
use Auth.Web, :controller
use Auth.Web, :view
The definitions below will be executed for every view,
controller, etc, so keep them short and clean, focused
on imports, uses and aliases.
Do NOT define functions inside the quoted expressions
below.
"""
def model do
quote do
use Ecto.Schema
import Ecto
import Ecto.Changeset
import Ecto.Query
end
end
def controller do
quote do
use Phoenix.Controller
alias Auth.Repo
import Ecto
import Ecto.Query
import Auth.Router.Helpers
import Auth.Gettext
end
end
def view do
quote do
use Phoenix.View, root: "web/templates"
# Import convenience functions from controllers
import Phoenix.Controller, only: [get_csrf_token: 0,
get_flash: 2,
view_module: 1]
import Auth.Router.Helpers
import Auth.ErrorHelpers
import Auth.Gettext
end
end
def router do
quote do
use Phoenix.Router
end
end
def channel do
quote do
use Phoenix.Channel
alias Auth.Repo
import Ecto
import Ecto.Query
import Auth.Gettext
end
end
@doc """
When used, dispatch to the appropriate controller/view/etc.
"""
defmacro __using__(which) when is_atom(which) do
apply(__MODULE__, which, [])
end
end
| 19.54321 | 61 | 0.62729 |
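The `__using__/1` dispatch at the bottom of `Auth.Web` is the standard Phoenix `web.ex` pattern: each named function returns a quoted block, and `use SomeModule, :which` injects it. A minimal dependency-free module showing the same mechanism (`WebStub` and `MyController` are invented names):

```elixir
defmodule WebStub do
  def controller do
    # Each clause returns a quoted block, exactly as Auth.Web does.
    quote do: import List
  end

  defmacro __using__(which) when is_atom(which) do
    apply(__MODULE__, which, [])
  end
end

defmodule MyController do
  use WebStub, :controller

  # first/1 is available because the injected block imported List.
  def head_of(list), do: first(list)
end

IO.inspect(MyController.head_of([1, 2, 3]))
```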
1c1d0e4234852ca9663e00ccfea9205e74db11a8 | 295 | exs | Elixir | elixir/priv/repo/migrations/20200826195412_create_merchants.exs | TreywRoberts/web-homework | d19b17dd384341d9e6e7e3174372673584289b83 | [
"MIT"
] | null | null | null | elixir/priv/repo/migrations/20200826195412_create_merchants.exs | TreywRoberts/web-homework | d19b17dd384341d9e6e7e3174372673584289b83 | [
"MIT"
] | null | null | null | elixir/priv/repo/migrations/20200826195412_create_merchants.exs | TreywRoberts/web-homework | d19b17dd384341d9e6e7e3174372673584289b83 | [
"MIT"
] | null | null | null |
defmodule Homework.Repo.Migrations.CreateMerchants do
use Ecto.Migration
def change do
create table(:merchants, primary_key: false) do
add(:id, :uuid, primary_key: true)
add(:name, :string)
add(:description, :string)
timestamps()
end
end
end
| 21.071429 | 54 | 0.640678 |
1c1d3c3a46f1113f845ddac2fd2371e9c1111ecb | 220 | exs | Elixir | tapestry_algorithm/project3.exs | kdlogan19/Tapestry-Algorithm | 337445c4569a7f0fd63da31272018a308ed65480 | [
"MIT"
] | null | null | null | tapestry_algorithm/project3.exs | kdlogan19/Tapestry-Algorithm | 337445c4569a7f0fd63da31272018a308ed65480 | [
"MIT"
] | null | null | null | tapestry_algorithm/project3.exs | kdlogan19/Tapestry-Algorithm | 337445c4569a7f0fd63da31272018a308ed65480 | [
"MIT"
] | null | null | null |
defmodule Project3 do
def start_project do
[num_nodes, num_requests] = System.argv()
MainModule.start(String.to_integer(num_nodes), String.to_integer(num_requests))
end
end
Project3.start_project
| 27.5 | 87 | 0.740909 |
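The script above is driven by command-line arguments (for example `elixir project3.exs 100 5`); its parsing step reduces to mapping `String.to_integer/1` over the argv list. A standalone sketch with a simulated argv:

```elixir
# Simulated System.argv() for a run like: elixir project3.exs 100 5
argv = ["100", "5"]
[num_nodes, num_requests] = Enum.map(argv, &String.to_integer/1)
IO.inspect({num_nodes, num_requests})
```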
1c1d4d563bd59fff6cdeeb18b19ccaa4bc2bdb83 | 118 | exs | Elixir | Elixir/Basics/first_app/test/first_app_test.exs | xuedong/programming-language-learning | daba94ffcf99f2ff81224ccbd336b2c0e4013ba1 | [
"MIT"
] | null | null | null | Elixir/Basics/first_app/test/first_app_test.exs | xuedong/programming-language-learning | daba94ffcf99f2ff81224ccbd336b2c0e4013ba1 | [
"MIT"
] | null | null | null | Elixir/Basics/first_app/test/first_app_test.exs | xuedong/programming-language-learning | daba94ffcf99f2ff81224ccbd336b2c0e4013ba1 | [
"MIT"
] | null | null | null |
defmodule FirstAppTest do
use ExUnit.Case
doctest FirstApp
test "the truth" do
assert 1 + 1 == 2
end
end
| 13.111111 | 25 | 0.677966 |
1c1d55fec3fd0d95ae0f23c1ab89b8b28ad21e31 | 2,422 | ex | Elixir | clients/compute/lib/google_api/compute/v1/model/instance_group_manager_status.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/compute/lib/google_api/compute/v1/model/instance_group_manager_status.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/compute/lib/google_api/compute/v1/model/instance_group_manager_status.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Compute.V1.Model.InstanceGroupManagerStatus do
@moduledoc """
## Attributes
* `autoscaler` (*type:* `String.t`, *default:* `nil`) - [Output Only] The URL of the Autoscaler that targets this instance group manager.
* `isStable` (*type:* `boolean()`, *default:* `nil`) - [Output Only] A bit indicating whether the managed instance group is in a stable state. A stable state means that: none of the instances in the managed instance group is currently undergoing any type of change (for example, creation, restart, or deletion); no future changes are scheduled for instances in the managed instance group; and the managed instance group itself is not being modified.
* `versionTarget` (*type:* `GoogleApi.Compute.V1.Model.InstanceGroupManagerStatusVersionTarget.t`, *default:* `nil`) - [Output Only] A status of consistency of Instances' versions with their target version specified by version field on Instance Group Manager.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:autoscaler => String.t(),
:isStable => boolean(),
:versionTarget => GoogleApi.Compute.V1.Model.InstanceGroupManagerStatusVersionTarget.t()
}
field(:autoscaler)
field(:isStable)
field(:versionTarget, as: GoogleApi.Compute.V1.Model.InstanceGroupManagerStatusVersionTarget)
end
defimpl Poison.Decoder, for: GoogleApi.Compute.V1.Model.InstanceGroupManagerStatus do
def decode(value, options) do
GoogleApi.Compute.V1.Model.InstanceGroupManagerStatus.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Compute.V1.Model.InstanceGroupManagerStatus do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 45.698113 | 453 | 0.752684 |
1c1d5748ff8875cad7c5a1273e40530102a27f09 | 2,928 | ex | Elixir | lib/future_made_concerts_web/telemetry.ex | Future-Made/concerts-for-impact | 5532cd1be5252fa0ccb0b956f0961be8701e0e04 | [
"MIT"
] | null | null | null | lib/future_made_concerts_web/telemetry.ex | Future-Made/concerts-for-impact | 5532cd1be5252fa0ccb0b956f0961be8701e0e04 | [
"MIT"
] | null | null | null | lib/future_made_concerts_web/telemetry.ex | Future-Made/concerts-for-impact | 5532cd1be5252fa0ccb0b956f0961be8701e0e04 | [
"MIT"
] | null | null | null | defmodule FutureMadeConcertsWeb.Telemetry do
@moduledoc false
use Supervisor
import Telemetry.Metrics
def start_link(arg) do
Supervisor.start_link(__MODULE__, arg, name: __MODULE__)
end
@impl true
def init(_arg) do
children = [
# Telemetry poller will execute the given period measurements
# every 10_000ms. Learn more here: https://hexdocs.pm/telemetry_metrics
{:telemetry_poller, measurements: periodic_measurements(), period: 10_000}
# Add reporters as children of your supervision tree.
# {Telemetry.Metrics.ConsoleReporter, metrics: metrics()}
]
Supervisor.init(children, strategy: :one_for_one)
end
def metrics do
[
# FutureMadeConcerts Metrics
summary("FutureMadeConcerts.session.count.active"),
summary("FutureMadeConcerts.spotify.api_error.count",
tags: [:error_type]
),
# Phoenix Metrics
summary("phoenix.endpoint.stop.duration",
unit: {:native, :millisecond}
),
summary("phoenix.router_dispatch.stop.duration",
tags: [:route],
unit: {:native, :millisecond}
),
summary("phoenix.live_view.mount.stop.duration",
unit: {:native, :millisecond}
),
summary("phoenix.live_view.handle_params.stop.duration",
tags: [:live_action],
tag_values: &add_live_action/1,
unit: {:native, :millisecond}
),
summary("phoenix.live_view.handle_event.stop.duration",
tags: [:event],
unit: {:native, :millisecond}
),
# VM Metrics
summary("vm.memory.total", unit: {:byte, :kilobyte}),
summary("vm.total_run_queue_lengths.total"),
summary("vm.total_run_queue_lengths.cpu"),
summary("vm.total_run_queue_lengths.io"),
# HTTP - Spotify
summary("finch.request.stop.duration",
unit: {:native, :millisecond},
tags: [:normalized_path],
tag_values: &add_normalized_path/1,
keep: &keep_spotify/1,
reporter_options: [
nav: "HTTP - Spotify"
]
),
summary("finch.response.stop.duration",
unit: {:native, :millisecond},
tags: [:normalized_path],
tag_values: &add_normalized_path/1,
keep: &keep_spotify/1,
reporter_options: [
nav: "HTTP - Spotify"
]
)
]
end
defp periodic_measurements do
[
# A module, function and arguments to be invoked periodically.
# This function must call :telemetry.execute/3 and a metric must be added above.
{FutureMadeConcerts.Spotify.Supervisor, :count_sessions, []}
]
end
defp add_normalized_path(metadata) do
Map.put(metadata, :normalized_path, URI.parse(metadata.path).path)
end
defp add_live_action(metadata) do
Map.put(metadata, :live_action, metadata.socket.assigns.live_action)
end
defp keep_spotify(meta) do
meta.host =~ "spotify"
end
end
| 29.28 | 86 | 0.646175 |
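The `tag_values` and `keep` callbacks in the Telemetry module above are plain functions over event metadata, so their behavior can be exercised in isolation. A standalone sketch (`TagHelpers` is a hypothetical wrapper for the two private helpers):

```elixir
# Mirrors add_normalized_path/1 and keep_spotify/1 from the module above.
defmodule TagHelpers do
  # Keep only the path portion of the request path (drops query strings).
  def add_normalized_path(metadata) do
    Map.put(metadata, :normalized_path, URI.parse(metadata.path).path)
  end

  # Keep only events whose host mentions "spotify".
  def keep_spotify(meta), do: meta.host =~ "spotify"
end
```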
1c1d5765540605c6f9b449c55dc137f3314f3e48 | 8,391 | ex | Elixir | lib/erlef/groups/git_report.ex | joaquinalcerro/website | 52dc89c70cd0b42127ab233a4c0d10f626d2b698 | [
"Apache-2.0"
] | 71 | 2019-07-02T18:06:15.000Z | 2022-03-09T15:30:08.000Z | lib/erlef/groups/git_report.ex | joaquinalcerro/website | 52dc89c70cd0b42127ab233a4c0d10f626d2b698 | [
"Apache-2.0"
] | 157 | 2019-07-02T01:21:16.000Z | 2022-03-30T16:08:12.000Z | lib/erlef/groups/git_report.ex | joaquinalcerro/website | 52dc89c70cd0b42127ab233a4c0d10f626d2b698 | [
"Apache-2.0"
] | 45 | 2019-07-04T05:51:11.000Z | 2022-02-27T11:56:02.000Z | defmodule Erlef.Groups.GitReport do
@moduledoc false
# n.b,
# This is the initial implementation of a concept called GitReport.
  # This however is not a good abstraction. A proper abstraction would not be confined to reports.
  # More features of the app will make use of git for submitting and updating files, and a
# proper abstraction should emerge after more cases. Make it work! :tm:
alias Erlef.Groups.WorkingGroupReport
alias Erlef.Github
def get_status(%WorkingGroupReport{} = report) do
args = %{
report: report,
repo: report_repo(report),
pull_number: report_pull_number(report)
}
with {:ok, state} <- to_state(args),
{:ok, state} <- get_token(state),
{:ok, state} <- get_pr_status(state) do
case state.pull_request_status == report.status do
true -> {:ok, report.status}
false -> {:changed, state.pull_request_status}
end
end
end
def submit(%WorkingGroupReport{} = report) do
case Erlef.is_env?(:prod) do
true ->
args = %{
files: report_to_files(report),
from: branch_name(report),
to: "main",
title: report_title(report),
description: report_description(report),
repo: report_repo(report),
author: report_author(report),
message: report_title(report)
}
create(args)
false ->
# N.B, a sub-sequent PR must be done to support faking github in order
# to avoid this temp hack
{:ok, %{}}
end
end
def update(%WorkingGroupReport{} = report) do
case Erlef.is_env?(:prod) do
true ->
args = %{
files: report_to_existing_files(report),
repo: report_repo(report),
message: report.update_message,
branch_name: branch_name(report),
from: branch_name(report),
author: report_author(report),
last_commit_sha: report_last_commit(report),
pull_number: report_pull_number(report)
}
case do_update(args) do
{:ok, state} ->
meta =
report.meta
|> put_in(["source", "pull_request"], state.pull_request)
|> put_in(["source", "branch"], state.branch)
{:ok, meta}
err ->
err
end
false ->
# N.B, a sub-sequent PR must be done to support faking github in order
# to avoid this temp hack
{:ok, report.meta}
end
end
def create(args) do
with {:ok, state} <- to_state(args),
{:ok, state} <- get_token(state),
{:ok, state} <- get_last_commit(state),
{:ok, state} <- create_files(state),
{:ok, state} <- create_commit(state),
{:ok, state} <- create_branch(state),
{:ok, state} <- create_pr(state) do
meta = %{
source: %{
type: "git",
provider: "github",
params: args,
pull_request: state.pull_request,
branch: state.branch,
commit: state.commit,
tree: state.tree
}
}
{:ok, meta}
end
end
defp do_update(args) do
with {:ok, state} <- to_state(args),
{:ok, state} <- get_token(state),
{:ok, state} <- create_files(state),
{:ok, state} <- create_commit(state),
{:ok, state} <- update_branch(state),
{:ok, state} <- get_pr(state) do
{:ok, state}
end
end
defp get_pr(state) do
p = %{owner: state.owner, repo: state.repo, number: state.pull_number}
case Github.get_pr(state.token, p) do
{:ok, pr} ->
{:ok, set(state, :pull_request, pr)}
err ->
err
end
end
defp get_pr_status(state) do
p = %{owner: state.owner, repo: state.repo, number: state.pull_number}
case Github.get_pr_status(state.token, p) do
{:ok, status} ->
{:ok, set(state, :pull_request_status, status)}
err ->
err
end
end
defp get_token(state) do
case Github.auth_app() do
{:ok, token} ->
{:ok, set(state, :token, token)}
err ->
err
end
end
defp get_last_commit(state) do
p = %{owner: state.owner, repo: state.repo}
case Github.get_main_last_commit(state.token, p) do
{:ok, last_commit} ->
state =
state
|> set(:last_commit_sha, last_commit["sha"])
|> set(:last_commit, last_commit)
{:ok, state}
err ->
err
end
end
defp create_files(state) do
p = %{
owner: state.owner,
repo: state.repo,
files: state.files,
base_tree: state.last_commit_sha
}
case Github.create_files(state.token, p) do
{:ok, tree} ->
state =
state
|> set(:tree_sha, tree["sha"])
|> set(:tree, tree)
{:ok, state}
err ->
err
end
end
defp create_commit(state) do
p = %{
owner: state.owner,
repo: state.repo,
files: state.tree,
author: state.author,
message: state.message,
tree: state.tree_sha,
parents: [state.last_commit_sha]
}
case Github.create_commit(state.token, p) do
{:ok, commit} ->
state =
state
|> set(:commit_sha, commit["sha"])
|> set(:commit, commit)
{:ok, state}
err ->
err
end
end
defp create_branch(state) do
p = %{
owner: state.owner,
repo: state.repo,
name: state.from,
sha: state.commit_sha
}
case Github.create_branch(state.token, p) do
{:ok, branch} ->
{:ok, set(state, :branch, branch)}
err ->
err
end
end
defp update_branch(state) do
p = %{
owner: state.owner,
repo: state.repo,
name: state.from,
sha: state.commit_sha
}
case Github.update_branch(state.token, p) do
{:ok, branch} ->
{:ok, set(state, :branch, branch)}
err ->
err
end
end
defp create_pr(state) do
p = %{
maintainer_can_modify: true,
body: state.description,
head: "refs/heads/#{state.from}",
base: "refs/heads/#{state.to}",
owner: state.owner,
repo: state.repo,
title: state.title
}
case Github.create_pr(state.token, p) do
{:ok, pr} ->
{:ok, set(state, :pull_request, pr)}
err ->
err
end
end
defp timestamp(dt), do: Calendar.strftime(dt, "%Y-%m-%d-%H%M%S")
defp branch_name(%{meta: %{"source" => %{"params" => %{"from" => from}}}}), do: from
defp branch_name(report) do
"#{report.working_group.slug}-#{report.type}-report-#{timestamp(report.inserted_at)}"
end
defp report_author(%{meta: %{"source" => %{"params" => %{"author" => author}}}}), do: author
defp report_author(report) do
%{name: report.submitted_by.name, email: "erlef.bot@erlef.org"}
end
defp report_description(%WorkingGroupReport{type: :quarterly} = report) do
"""
# Quarterly report for #{report.working_group.name} working group
## Submitted by - #{report.submitted_by.name}
"""
end
defp report_title(report) do
"#{report.working_group.name} - #{report.type} report"
end
defp report_repo(_) do
config = Application.get_env(:erlef, :reports)
Keyword.get(config, :report_submission_repo)
end
defp report_to_files(report) do
slug = report.working_group.slug
ts = timestamp(report.inserted_at)
path = "reports/#{slug}/quarterly/#{slug}-#{ts}.md"
[
%{
content: report.content,
path: path
}
]
end
defp report_to_existing_files(report) do
[file] = report.meta["source"]["params"]["files"]
[
%{
content: report.content,
path: file["path"]
}
]
end
defp report_pull_number(report) do
report.meta["source"]["pull_request"]["number"]
end
defp report_last_commit(%{meta: %{"source" => %{"branch" => %{"sha" => sha}}}}) do
sha
end
defp set(state, key, val), do: Map.put(state, key, val)
defp to_state(%{repo: repo} = args) do
case String.split(repo, "/") do
[owner, gh_repo] ->
state =
args
|> set(:owner, owner)
|> set(:repo, gh_repo)
{:ok, state}
_ ->
{:error, "Malformed owner/repo value"}
end
end
end
| 23.63662 | 102 | 0.562031 |
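`GitReport` above threads a growing state map through a `with` chain, where each step returns `{:ok, state}` or an error that short-circuits the pipeline. A minimal sketch of that pattern (`PipelineSketch` and its steps are illustrative, not the original code):

```elixir
# Each step enriches the state map; any non-{:ok, _} return falls
# through the `with` unchanged, just like GitReport's create/1.
defmodule PipelineSketch do
  def run(state) do
    with {:ok, state} <- step_a(state),
         {:ok, state} <- step_b(state) do
      {:ok, state}
    end
  end

  defp step_a(state), do: {:ok, Map.put(state, :a, 1)}
  defp step_b(%{a: _} = state), do: {:ok, Map.put(state, :b, 2)}
end
```

The rebinding of `state` on each `<-` line is what lets later steps (like `create_commit/1`) read values produced earlier (like `:tree_sha`).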
1c1da33fdda552d956149624519d270535169471 | 100 | ex | Elixir | lib/evolution/repo.ex | AlexKovalevych/evolution | 2271546f045d475d676f5993c25824496026694b | [
"MIT"
] | null | null | null | lib/evolution/repo.ex | AlexKovalevych/evolution | 2271546f045d475d676f5993c25824496026694b | [
"MIT"
] | null | null | null | lib/evolution/repo.ex | AlexKovalevych/evolution | 2271546f045d475d676f5993c25824496026694b | [
"MIT"
] | null | null | null | defmodule Evolution.Repo do
use Ecto.Repo, otp_app: :evolution
use Scrivener, page_size: 10
end
| 20 | 36 | 0.77 |
1c1db4861d75f4efce2be554c49e9a30d38a7de9 | 1,386 | exs | Elixir | elixir/getting-started/9_enumerables_and_streams.exs | wesleyegberto/dojos-languages | 87170a722efac1247c713daa21cb3fcc39f5c5c1 | [
"MIT"
] | null | null | null | elixir/getting-started/9_enumerables_and_streams.exs | wesleyegberto/dojos-languages | 87170a722efac1247c713daa21cb3fcc39f5c5c1 | [
"MIT"
] | null | null | null | elixir/getting-started/9_enumerables_and_streams.exs | wesleyegberto/dojos-languages | 87170a722efac1247c713daa21cb3fcc39f5c5c1 | [
"MIT"
] | null | null | null | # Enumerables and Streams
# === Enumarables ===
# Elixir provides the Enum module with functions to
# enumerables manipulations: transform, sort, group, filter an retrieve.
# Those functions are eager and usually returns a list.
Enum.map(1..3, fn x -> x * 2 end)
Enum.reduce(1..3, 0, &+/2)
# question mark is a convention to indicate that it returns a boolean
odd? = &(rem(&1, 2) != 0)
Enum.filter(1..3, odd?)
# each Enum function will return a new list
Enum.sum(Enum.filter(Enum.map(1..100_000, &(&1 * 3)), odd?))
# |> is the pipe operator, takes the output from the left side and use as input
# to the right side
# each step of the pipeline will still return a new list
total_sum = 1..100_000 |> Enum.map(&(&1 * 3)) |> Enum.filter(odd?) |> Enum.sum
# === Streams ===
# Stream module supports lazy operations
total_sum = 1..100_000 |> Stream.map(&(&1 * 3)) |> Stream.filter(odd?) |> Enum.sum
# will only be invoked when passed to a Enum function
odd_numbers_stream = 1..100_000 |> Stream.map(&(&1 * 3)) |> Stream.filter(odd?)
# takes the first 100th numbers
odd_numbers = Enum.take(odd_numbers_stream, 100)
# Create an infinite stream from a cycled list
infinite_stream = Stream.cycle([1, 2, 3])
first_10th = Enum.take(infinite_stream, 10)
# Generate values from a initial value
hello_stream = Stream.unfold("hełło", &String.next_codepoint/1)
Enum.take(hello_stream, 3)
| 30.8 | 82 | 0.702742 |
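Alongside `Stream.cycle/1` and `Stream.unfold/2` shown above, another common lazy generator is `Stream.iterate/2`:

```elixir
# Stream.iterate/2 builds an infinite stream by repeatedly applying
# a function to the previous value; nothing runs until Enum forces it
powers_of_two = Stream.iterate(1, &(&1 * 2))
Enum.take(powers_of_two, 5)  # [1, 2, 4, 8, 16]
```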
1c1dbd1106e1c0f59a883b96ed2f33fe55f24ba2 | 2,276 | ex | Elixir | clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_custom_token_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | 1 | 2018-12-03T23:43:10.000Z | 2018-12-03T23:43:10.000Z | clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_custom_token_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_custom_token_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.IdentityToolkit.V3.Model.VerifyCustomTokenResponse do
@moduledoc """
Response from verifying a custom token
## Attributes
* `expiresIn` (*type:* `String.t`, *default:* `nil`) - If idToken is STS id token, then this field will be expiration time of STS id token in seconds.
* `idToken` (*type:* `String.t`, *default:* `nil`) - The GITKit token for authenticated user.
* `isNewUser` (*type:* `boolean()`, *default:* `nil`) - True if it's a new user sign-in, false if it's a returning user.
* `kind` (*type:* `String.t`, *default:* `identitytoolkit#VerifyCustomTokenResponse`) - The fixed string "identitytoolkit#VerifyCustomTokenResponse".
* `refreshToken` (*type:* `String.t`, *default:* `nil`) - If idToken is STS id token, then this field will be refresh token.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:expiresIn => String.t(),
:idToken => String.t(),
:isNewUser => boolean(),
:kind => String.t(),
:refreshToken => String.t()
}
field(:expiresIn)
field(:idToken)
field(:isNewUser)
field(:kind)
field(:refreshToken)
end
defimpl Poison.Decoder, for: GoogleApi.IdentityToolkit.V3.Model.VerifyCustomTokenResponse do
def decode(value, options) do
GoogleApi.IdentityToolkit.V3.Model.VerifyCustomTokenResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.IdentityToolkit.V3.Model.VerifyCustomTokenResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 38.576271 | 154 | 0.711336 |
1c1dc178ad7e238223e7b90f8f1032892516c3b9 | 392 | ex | Elixir | lib/xiami_daily_recommend.ex | Gizeta/xiami_daily_recommend | e4b510cc02ad544f0007ba78af3e5efc6b6369a3 | [
"MIT"
] | null | null | null | lib/xiami_daily_recommend.ex | Gizeta/xiami_daily_recommend | e4b510cc02ad544f0007ba78af3e5efc6b6369a3 | [
"MIT"
] | null | null | null | lib/xiami_daily_recommend.ex | Gizeta/xiami_daily_recommend | e4b510cc02ad544f0007ba78af3e5efc6b6369a3 | [
"MIT"
] | null | null | null | defmodule XiamiDailyRecommend do
use Application
def start(_type, _args) do
port = Application.get_env(:xiami_daily_recommend, :cowboy_port, 8081)
:ets.new(:xiami, [:set, :public, :named_table])
children = [
Plug.Adapters.Cowboy.child_spec(:http, XiamiDailyRecommend.Router, [], port: port)
]
Supervisor.start_link(children, strategy: :one_for_one)
end
end
| 26.133333 | 88 | 0.711735 |
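The `:ets.new(:xiami, [:set, :public, :named_table])` call in `start/2` above creates a shared in-memory table, presumably to cache recommendations. A sketch of how such a table could be read and written; the key/value shapes here are assumptions:

```elixir
# Hypothetical cache helpers over an ETS set table like :xiami.
defmodule CacheSketch do
  def put(table, date, songs), do: :ets.insert(table, {date, songs})

  def get(table, date) do
    case :ets.lookup(table, date) do
      [{^date, songs}] -> {:ok, songs}
      [] -> :miss
    end
  end
end
```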
1c1dc840485b62e9422992e276ad90fd2bc98026 | 1,132 | ex | Elixir | apps/tai/lib/tai/venue_adapters/binance/asset_balances.ex | ihorkatkov/tai | 09f9f15d2c385efe762ae138a8570f1e3fd41f26 | [
"MIT"
] | 1 | 2019-12-19T05:16:26.000Z | 2019-12-19T05:16:26.000Z | apps/tai/lib/tai/venue_adapters/binance/asset_balances.ex | ihorkatkov/tai | 09f9f15d2c385efe762ae138a8570f1e3fd41f26 | [
"MIT"
] | null | null | null | apps/tai/lib/tai/venue_adapters/binance/asset_balances.ex | ihorkatkov/tai | 09f9f15d2c385efe762ae138a8570f1e3fd41f26 | [
"MIT"
] | null | null | null | defmodule Tai.VenueAdapters.Binance.AssetBalances do
def asset_balances(venue_id, account_id, credentials) do
venue_credentials = struct!(ExBinance.Credentials, credentials)
with {:ok, account} <- ExBinance.Private.account(venue_credentials) do
balances = account.balances |> Enum.map(&build(&1, venue_id, account_id))
{:ok, balances}
else
{:error, :receive_window} = error ->
error
{:error, {:binance_error, %{"code" => -2014, "msg" => "API-key format invalid." = reason}}} ->
{:error, {:credentials, reason}}
{:error, {:http_error, %HTTPoison.Error{reason: "timeout"}}} ->
{:error, :timeout}
end
end
defp build(
%{"asset" => raw_asset, "free" => free, "locked" => locked},
venue_id,
account_id
) do
asset =
raw_asset
|> String.downcase()
|> String.to_atom()
%Tai.Venues.AssetBalance{
venue_id: venue_id,
account_id: account_id,
asset: asset,
free: free |> Decimal.new() |> Decimal.reduce(),
locked: locked |> Decimal.new() |> Decimal.reduce()
}
end
end
| 29.025641 | 100 | 0.600707 |
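The `build/3` function above normalizes Binance's uppercase asset codes into lowercase atoms before constructing the balance struct. That step in isolation (`BalanceSketch` is a hypothetical helper; the real code also wraps the amounts in `Decimal`):

```elixir
# Mirrors the asset normalization in build/3: "BTC" -> :btc.
defmodule BalanceSketch do
  def normalize_asset(raw_asset) do
    raw_asset |> String.downcase() |> String.to_atom()
  end
end
```

Note that `String.to_atom/1` on venue-supplied strings can grow the atom table; the original code accepts that trade-off for a bounded set of asset codes.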
1c1dd25865be8c54d8661b6e118bf7ba641ddad1 | 602 | ex | Elixir | apps/trial/lib/trial/application.ex | losttime/commanded_poc | 9b1101924e8b9918666d565d12119dcd448fd040 | [
"MIT"
] | null | null | null | apps/trial/lib/trial/application.ex | losttime/commanded_poc | 9b1101924e8b9918666d565d12119dcd448fd040 | [
"MIT"
] | null | null | null | apps/trial/lib/trial/application.ex | losttime/commanded_poc | 9b1101924e8b9918666d565d12119dcd448fd040 | [
"MIT"
] | null | null | null | defmodule Trial.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
# List all child processes to be supervised
children = [
# Starts a worker by calling: Trial.Worker.start_link(arg)
# {Trial.Worker, arg},
Trial.Handlers.Driven
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Trial.Supervisor]
Supervisor.start_link(children, opts)
end
end
| 27.363636 | 64 | 0.704319 |
1c1ddef05d912a9d96582a3c538eca42ee47d7dd | 536 | ex | Elixir | testData/org/elixir_lang/parser_definition/matched_call_operation/matched_expression_parsing_test_case/List.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 1,668 | 2015-01-03T05:54:27.000Z | 2022-03-25T08:01:20.000Z | testData/org/elixir_lang/parser_definition/matched_call_operation/matched_expression_parsing_test_case/List.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 2,018 | 2015-01-01T22:43:39.000Z | 2022-03-31T20:13:08.000Z | testData/org/elixir_lang/parser_definition/matched_call_operation/matched_expression_parsing_test_case/List.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | [
"Apache-2.0"
] | 145 | 2015-01-15T11:37:16.000Z | 2021-12-22T05:51:02.000Z | [] &one
[] one \\ default
[] one when guard
[] one :: type
[] one | new
[] one = two
[] one or two
[] one || two
[] one and two
[] one && two
[] one != two
[] one < two
[] one |> two
[] one + two
[] one / two
[] one * two
[] one ^^^ two
[] !one
[] One.Two
[] One.two
[] @one
[] One Two
[] one
[] @1
[] &1
[] !1
[] not 1
[] ?1
[] 0b10
[] 1.2e-3
[] 1
[] 0xFF
[] 0o7
[] 0zZ
[] []
[] "StringLine"
[] """
String
Heredoc
"""
[] 'CharListLine'
[] '''
CharListLine
'''
[] ~x{sigil}modifiers
[] nil
[] :atom
[] One
| 10.938776 | 21 | 0.449627 |
1c1e2bf5e2c954778b23b9501705b9b7c4146cba | 3,205 | ex | Elixir | clients/private_ca/lib/google_api/private_ca/v1beta1/model/reusable_config_values.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/private_ca/lib/google_api/private_ca/v1beta1/model/reusable_config_values.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/private_ca/lib/google_api/private_ca/v1beta1/model/reusable_config_values.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.PrivateCA.V1beta1.Model.ReusableConfigValues do
@moduledoc """
A ReusableConfigValues is used to describe certain fields of an X.509 certificate, such as the key usage fields, fields specific to CA certificates, certificate policy extensions and custom extensions.
## Attributes
* `additionalExtensions` (*type:* `list(GoogleApi.PrivateCA.V1beta1.Model.X509Extension.t)`, *default:* `nil`) - Optional. Describes custom X.509 extensions.
* `aiaOcspServers` (*type:* `list(String.t)`, *default:* `nil`) - Optional. Describes Online Certificate Status Protocol (OCSP) endpoint addresses that appear in the "Authority Information Access" extension in the certificate.
* `caOptions` (*type:* `GoogleApi.PrivateCA.V1beta1.Model.CaOptions.t`, *default:* `nil`) - Optional. Describes options in this ReusableConfigValues that are relevant in a CA certificate.
* `keyUsage` (*type:* `GoogleApi.PrivateCA.V1beta1.Model.KeyUsage.t`, *default:* `nil`) - Optional. Indicates the intended use for keys that correspond to a certificate.
* `policyIds` (*type:* `list(GoogleApi.PrivateCA.V1beta1.Model.ObjectId.t)`, *default:* `nil`) - Optional. Describes the X.509 certificate policy object identifiers, per https://tools.ietf.org/html/rfc5280#section-4.2.1.4.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:additionalExtensions =>
list(GoogleApi.PrivateCA.V1beta1.Model.X509Extension.t()) | nil,
:aiaOcspServers => list(String.t()) | nil,
:caOptions => GoogleApi.PrivateCA.V1beta1.Model.CaOptions.t() | nil,
:keyUsage => GoogleApi.PrivateCA.V1beta1.Model.KeyUsage.t() | nil,
:policyIds => list(GoogleApi.PrivateCA.V1beta1.Model.ObjectId.t()) | nil
}
field(:additionalExtensions, as: GoogleApi.PrivateCA.V1beta1.Model.X509Extension, type: :list)
field(:aiaOcspServers, type: :list)
field(:caOptions, as: GoogleApi.PrivateCA.V1beta1.Model.CaOptions)
field(:keyUsage, as: GoogleApi.PrivateCA.V1beta1.Model.KeyUsage)
field(:policyIds, as: GoogleApi.PrivateCA.V1beta1.Model.ObjectId, type: :list)
end
defimpl Poison.Decoder, for: GoogleApi.PrivateCA.V1beta1.Model.ReusableConfigValues do
def decode(value, options) do
GoogleApi.PrivateCA.V1beta1.Model.ReusableConfigValues.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.PrivateCA.V1beta1.Model.ReusableConfigValues do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 53.416667 | 230 | 0.744774 |
1c1e8076e28921d82dbbf044a42f5c0bf4a2a404 | 376 | ex | Elixir | web/views/error_view.ex | genamerica/ChatLocal | 0a7076a8f25219a0bc694780f3e436eadd357aea | [
"MIT"
] | 11 | 2016-09-10T01:24:32.000Z | 2021-03-20T22:54:55.000Z | web/views/error_view.ex | genamerica/ChatLocal | 0a7076a8f25219a0bc694780f3e436eadd357aea | [
"MIT"
] | null | null | null | web/views/error_view.ex | genamerica/ChatLocal | 0a7076a8f25219a0bc694780f3e436eadd357aea | [
"MIT"
] | 2 | 2017-03-01T04:16:41.000Z | 2018-01-02T10:14:52.000Z | defmodule Chatroom.ErrorView do
use Chatroom.Web, :view
def render("404.html", _assigns) do
"Page not found"
end
def render("500.html", _assigns) do
"Server internal error"
end
# In case no render clause matches or no
# template is found, let's render it as 500
def template_not_found(_template, assigns) do
render "500.html", assigns
end
end
| 20.888889 | 47 | 0.699468 |
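The `template_not_found/2` callback above means any unmatched template falls back to the 500 page. A dependency-free sketch of that dispatch (`ErrorViewSketch` is illustrative, without the Phoenix view machinery):

```elixir
# First-match clause dispatch: known templates render directly,
# anything else falls through to the 500 body.
defmodule ErrorViewSketch do
  def render("404.html", _assigns), do: "Page not found"
  def render("500.html", _assigns), do: "Server internal error"
  def render(_other, assigns), do: render("500.html", assigns)
end
```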
1c1e8c6f52eb3e60ac3613c107b20cc3e5051114 | 272 | ex | Elixir | lib/four_lucha/search.ex | Thomas-Jean/four_lucha | 591627059c02edc3315b5cac2c35eacb821108ff | [
"Apache-2.0"
] | 1 | 2021-02-21T19:15:27.000Z | 2021-02-21T19:15:27.000Z | lib/four_lucha/search.ex | Thomas-Jean/four_lucha | 591627059c02edc3315b5cac2c35eacb821108ff | [
"Apache-2.0"
] | null | null | null | lib/four_lucha/search.ex | Thomas-Jean/four_lucha | 591627059c02edc3315b5cac2c35eacb821108ff | [
"Apache-2.0"
] | null | null | null | defmodule FourLucha.Search do
@moduledoc false
@items "search"
use FourLucha.BaseHelpers
use FourLucha.BasePlural
require FourLucha.Client
defp get_resource do
Map
end
defp get_known_query_params do
[:query, :resources, :limit, :page]
end
end
| 16 | 39 | 0.731618 |
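`get_known_query_params/0` above whitelists the query keys the search endpoint understands. One plausible way such a list gets used is to strip unknown keys before building a request; the filtering helper itself is an assumption:

```elixir
# Hypothetical use of the known-params list: keep only recognized keys.
defmodule QuerySketch do
  @known [:query, :resources, :limit, :page]

  def filter(params), do: Map.take(params, @known)
end
```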
1c1e927cff527570c3b3c04cd638aee04d21dc39 | 1,270 | exs | Elixir | mix.exs | EevanW/elixir-xml_rpc | 46609dadc5df66322153e7505ebe02526d784ab1 | [
"Apache-2.0"
] | null | null | null | mix.exs | EevanW/elixir-xml_rpc | 46609dadc5df66322153e7505ebe02526d784ab1 | [
"Apache-2.0"
] | null | null | null | mix.exs | EevanW/elixir-xml_rpc | 46609dadc5df66322153e7505ebe02526d784ab1 | [
"Apache-2.0"
] | null | null | null | defmodule XmlRpc.Mixfile do
use Mix.Project
def project do
[app: :xmlrpc,
version: "1.3.0",
elixir: "~> 1.4",
name: "XMLRPC",
   description: "XML-RPC encoder/decoder for Elixir. Supports all valid datatypes. Input (i.e. untrusted) is parsed with erlsom against an xml-schema for security.",
source_url: "https://github.com/ewildgoose/elixir-xml_rpc",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps(),
package: package()]
end
# Configuration for the OTP application
#
# Type `mix help compile.app` for more information
def application do
[extra_applications: []]
end
# Dependencies can be Hex packages:
#
# {:mydep, "~> 0.3.0"}
#
# Or git/path repositories:
#
# {:mydep, git: "https://github.com/elixir-lang/mydep.git", tag: "0.1.0"}
#
# Type `mix help deps` for more examples and options
defp deps do
[ {:earmark, "~> 1.0", only: :docs},
{:ex_doc, "~> 0.14", only: :docs},
{:erlsom, "~> 1.4"},
]
end
defp package do
[files: ~w(lib mix.exs README.md LICENSE),
maintainers: ["Ed Wildgoose"],
licenses: ["Apache 2.0"],
links: %{"GitHub" => "https://github.com/ewildgoose/elixir-xml_rpc"}]
end
end
| 27.021277 | 164 | 0.611024 |
1c1e9753dec7d6a0b60822f1b21da3f67ed0bc0a | 6,237 | ex | Elixir | lib/elixir_script/passes/translate/function.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | 1 | 2021-09-14T14:28:39.000Z | 2021-09-14T14:28:39.000Z | lib/elixir_script/passes/translate/function.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | null | null | null | lib/elixir_script/passes/translate/function.ex | alex-min/elixirscript | a2bd2327d0b6bbacf98fb555198acf12c0c20916 | [
"MIT"
] | null | null | null | defmodule ElixirScript.Translate.Function do
@moduledoc false
# Translates the given Elixir function AST into the
# equivalent JavaScript AST.
alias ESTree.Tools.Builder, as: J
alias ElixirScript.Translate.{Clause, Form, Helpers}
alias ElixirScript.Translate.Forms.Pattern
@spec compile(any, map) :: {ESTree.Node.t(), map}
def compile({:fn, _, clauses}, state) do
anonymous? = Map.get(state, :anonymous_fn, false)
state =
Map.put(state, :anonymous_fn, true)
|> Map.put(:in_guard, false)
clauses = compile_clauses(clauses, state)
arg_matches_declaration = Helpers.declare_let("__arg_matches__", J.identifier("null"))
function_recur_dec =
Helpers.function(
"recur",
[J.rest_element(J.identifier("__function_args__"))],
J.block_statement([
arg_matches_declaration,
clauses,
J.throw_statement(
Helpers.new(
J.member_expression(
Helpers.patterns(),
J.identifier("MatchError")
),
[J.identifier("__function_args__")]
)
)
])
)
function_dec =
Helpers.arrow_function(
[J.rest_element(J.identifier("__function_args__"))],
J.block_statement([
function_recur_dec,
J.return_statement(trampoline())
])
)
state = Map.put(state, :anonymous_fn, anonymous?)
{function_dec, state}
end
def compile({{name, arity}, _type, _, clauses}, state) do
state =
Map.put(state, :function, {name, arity})
|> Map.put(:anonymous_fn, false)
|> Map.put(:in_guard, false)
clauses = compile_clauses(clauses, state)
arg_matches_declaration = Helpers.declare_let("__arg_matches__", J.identifier("null"))
intermediate_declaration = Helpers.declare_let("__intermediate__", J.identifier("null"))
function_recur_dec =
Helpers.function(
"recur",
[J.rest_element(J.identifier("__function_args__"))],
J.block_statement([
arg_matches_declaration,
intermediate_declaration,
clauses,
J.throw_statement(
Helpers.new(
J.member_expression(
Helpers.patterns(),
J.identifier("MatchError")
),
[J.identifier("__function_args__")]
)
)
])
)
function_dec =
Helpers.function(
ElixirScript.Translate.Identifier.make_function_name(name),
[J.rest_element(J.identifier("__function_args__"))],
J.block_statement([
function_recur_dec,
J.return_statement(trampoline())
])
)
{function_dec, state}
end
defp compile_clauses(clauses, state) do
clauses
|> Enum.map(&compile_clause(&1, state))
|> Enum.map(fn {patterns, _params, guards, body} ->
match_or_default_call =
Helpers.call(
J.member_expression(
Helpers.patterns(),
J.identifier("match_or_default")
),
[J.array_expression(patterns), J.identifier("__function_args__"), guards]
)
J.if_statement(
J.binary_expression(
:!==,
Helpers.assign(J.identifier("__arg_matches__"), match_or_default_call),
J.identifier("null")
),
J.block_statement(body)
)
end)
|> Enum.reverse()
|> Enum.reduce(nil, fn
if_ast, nil ->
if_ast
if_ast, ast ->
%{if_ast | alternate: ast}
end)
end
defp compile_clause({_, args, guards, body}, state) do
state =
if Map.has_key?(state, :vars) do
state
else
Map.put(state, :vars, %{})
end
{patterns, params, state} = Pattern.compile(args, state)
guard = Clause.compile_guard(params, guards, state)
{body, _state} = compile_block(body, state)
body =
body
|> Clause.return_last_statement()
|> update_last_call(state)
declaration = Helpers.declare_let(params, J.identifier("__arg_matches__"))
body = [declaration] ++ body
{patterns, params, guard, body}
end
defp compile_clause({:->, _, [[{:when, _, params}], body]}, state) do
guards = List.last(params)
params = params |> Enum.reverse() |> tl |> Enum.reverse()
compile_clause({[], params, guards, body}, state)
end
defp compile_clause({:->, _, [params, body]}, state) do
compile_clause({[], params, [], body}, state)
end
@spec compile_block(any, map) :: {ESTree.Node.t(), map}
def compile_block(block, state) do
ast =
case block do
nil ->
J.identifier("null")
{:__block__, _, block_body} ->
{list, _} = Enum.map_reduce(block_body, state, &Form.compile(&1, &2))
List.flatten(list)
_ ->
Form.compile!(block, state)
end
{ast, state}
end
@spec update_last_call([ESTree.Node.t()], map) :: list
def update_last_call(clause_body, %{function: {name, _}, anonymous_fn: anonymous?}) do
last_item = List.last(clause_body)
function_name = ElixirScript.Translate.Identifier.make_function_name(name)
case last_item do
%ESTree.ReturnStatement{
argument: %ESTree.CallExpression{callee: ^function_name, arguments: arguments}
} ->
if anonymous? do
clause_body
else
new_last_item = J.return_statement(recurse(recur_bind(arguments)))
List.replace_at(clause_body, length(clause_body) - 1, new_last_item)
end
_ ->
clause_body
end
end
defp recur_bind(args) do
Helpers.call(
J.member_expression(
J.identifier("recur"),
J.identifier("bind")
),
[J.identifier("null")] ++ args
)
end
defp recurse(func) do
Helpers.new(
J.member_expression(
Helpers.functions(),
J.identifier("Recurse")
),
[
func
]
)
end
defp trampoline() do
Helpers.call(
J.member_expression(
Helpers.functions(),
J.identifier("trampoline")
),
[
recurse(recur_bind([J.rest_element(J.identifier("__function_args__"))]))
]
)
end
end
| 25.9875 | 92 | 0.591951 |
1c1ea13fde336b3e960f0424534a360b03153878 | 2,072 | ex | Elixir | clients/cloud_run/lib/google_api/cloud_run/v1alpha1/model/volume_mount.ex | Contractbook/elixir-google-api | 342751041aaf8c2e7f76f9922cf24b9c5895802b | [
"Apache-2.0"
] | 1 | 2021-10-01T09:20:41.000Z | 2021-10-01T09:20:41.000Z | clients/cloud_run/lib/google_api/cloud_run/v1alpha1/model/volume_mount.ex | Contractbook/elixir-google-api | 342751041aaf8c2e7f76f9922cf24b9c5895802b | [
"Apache-2.0"
] | null | null | null | clients/cloud_run/lib/google_api/cloud_run/v1alpha1/model/volume_mount.ex | Contractbook/elixir-google-api | 342751041aaf8c2e7f76f9922cf24b9c5895802b | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudRun.V1alpha1.Model.VolumeMount do
@moduledoc """
  Not supported by Cloud Run. VolumeMount describes a mounting of a Volume within a container.
## Attributes
* `mountPath` (*type:* `String.t`, *default:* `nil`) - Path within the container at which the volume should be mounted. Must not contain ':'.
* `name` (*type:* `String.t`, *default:* `nil`) - This must match the Name of a Volume.
* `readOnly` (*type:* `boolean()`, *default:* `nil`) - (Optional) Only true is accepted. Defaults to true.
* `subPath` (*type:* `String.t`, *default:* `nil`) - (Optional) Path within the volume from which the container's volume should be mounted. Defaults to "" (volume's root).
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:mountPath => String.t() | nil,
:name => String.t() | nil,
:readOnly => boolean() | nil,
:subPath => String.t() | nil
}
field(:mountPath)
field(:name)
field(:readOnly)
field(:subPath)
end
defimpl Poison.Decoder, for: GoogleApi.CloudRun.V1alpha1.Model.VolumeMount do
def decode(value, options) do
GoogleApi.CloudRun.V1alpha1.Model.VolumeMount.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudRun.V1alpha1.Model.VolumeMount do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 37 | 175 | 0.701255 |
1c1ea4106b99be8d08ac7f127eedef520da96539 | 842 | ex | Elixir | lib/game/socket.ex | jgsmith/ex_venture | 546adaa8fe80d45a72fde6de8d8d6906902c12d4 | [
"MIT"
] | 2 | 2019-05-14T11:36:44.000Z | 2020-07-01T08:54:04.000Z | lib/game/socket.ex | nickwalton/ex_venture | d8ff1b0181db03f9ddcb7610ae7ab533feecbfbb | [
"MIT"
] | null | null | null | lib/game/socket.ex | nickwalton/ex_venture | d8ff1b0181db03f9ddcb7610ae7ab533feecbfbb | [
"MIT"
] | 1 | 2021-01-29T14:12:40.000Z | 2021-01-29T14:12:40.000Z | defmodule Game.Socket do
@moduledoc """
Client to call the socket module
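  For example, a session handler forwards its state to these wrappers:

      Game.Socket.echo(state, "Welcome!")
      Game.Socket.prompt(state, "> ")

  Each call delegates to the socket module configured under `:networking`.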
"""
@socket Application.get_env(:ex_venture, :networking)[:socket_module]
def echo(state, string) do
@socket.echo(state.socket, string)
end
def prompt(state, string) do
@socket.prompt(state.socket, string)
end
def set_config(state, config) do
@socket.set_config(state.socket, config)
end
def disconnect(state) do
@socket.disconnect(state.socket)
end
def set_character_id(state, character_id) do
@socket.set_character_id(state.socket, character_id)
end
def tcp_option(state, option, enabled) do
@socket.tcp_option(state.socket, option, enabled)
end
def nop(state) do
@socket.nop(state.socket)
end
def push_gmcp(state, module, data) do
@socket.push_gmcp(state.socket, module, data)
end
end
| 21.05 | 71 | 0.711401 |
1c1ec094153bc4e19706f79d856ede7e101e165a | 284 | exs | Elixir | priv/repo/migrations/20181001015931_create_videos_tags.exs | the-mikedavis/doc_gen | efcc884ea65bba5748f41c5601abd00db2777ec4 | [
"BSD-3-Clause"
] | null | null | null | priv/repo/migrations/20181001015931_create_videos_tags.exs | the-mikedavis/doc_gen | efcc884ea65bba5748f41c5601abd00db2777ec4 | [
"BSD-3-Clause"
] | 27 | 2018-10-29T18:34:44.000Z | 2019-03-11T18:43:12.000Z | priv/repo/migrations/20181001015931_create_videos_tags.exs | the-mikedavis/doc_gen | efcc884ea65bba5748f41c5601abd00db2777ec4 | [
"BSD-3-Clause"
] | null | null | null | defmodule DocGen.Repo.Migrations.CreateVideosTags do
use Ecto.Migration
def change do
create table(:videos_tags) do
add(:video_id, references(:videos))
add(:tag_id, references(:tags))
end
create unique_index(:videos_tags, [:video_id, :tag_id])
end
end
| 21.846154 | 59 | 0.704225 |
1c1ec52d1cf8dbfc8458fdd5829dd4031ee3fd83 | 396 | exs | Elixir | mix.exs | prosapient/dev_space_auth | 19038344500a713e41ed5b190cabbbae6cb6dfde | [
"Apache-2.0"
] | null | null | null | mix.exs | prosapient/dev_space_auth | 19038344500a713e41ed5b190cabbbae6cb6dfde | [
"Apache-2.0"
] | null | null | null | mix.exs | prosapient/dev_space_auth | 19038344500a713e41ed5b190cabbbae6cb6dfde | [
"Apache-2.0"
] | null | null | null | defmodule DevSpaceAuth.MixProject do
use Mix.Project
def project do
[
app: :dev_space_auth,
version: "0.1.0",
elixir: "~> 1.12",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
def application do
[
extra_applications: [:logger]
]
end
defp deps do
[
{:plug, "~> 1.11"},
{:hammer, "~> 6.0"}
]
end
end
| 14.666667 | 42 | 0.515152 |
1c1ecd5b6dab608c7b016727f2e38768fc77d583 | 43,661 | ex | Elixir | lib/elixir/lib/stream.ex | TurtleAI/elixir | 2fb41ebef4d06315dd6c05ee00899572b27ee50a | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/stream.ex | TurtleAI/elixir | 2fb41ebef4d06315dd6c05ee00899572b27ee50a | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/stream.ex | TurtleAI/elixir | 2fb41ebef4d06315dd6c05ee00899572b27ee50a | [
"Apache-2.0"
] | null | null | null | defmodule Stream do
@moduledoc """
Module for creating and composing streams.
Streams are composable, lazy enumerables. Any enumerable that generates
items one by one during enumeration is called a stream. For example,
Elixir's `Range` is a stream:
iex> range = 1..5
1..5
iex> Enum.map range, &(&1 * 2)
[2, 4, 6, 8, 10]
In the example above, as we mapped over the range, the elements being
enumerated were created one by one, during enumeration. The `Stream`
module allows us to map the range, without triggering its enumeration:
iex> range = 1..3
iex> stream = Stream.map(range, &(&1 * 2))
iex> Enum.map(stream, &(&1 + 1))
[3, 5, 7]
Notice we started with a range and then we created a stream that is
meant to multiply each item in the range by 2. At this point, no
computation was done. Only when `Enum.map/2` is called we actually
enumerate over each item in the range, multiplying it by 2 and adding 1.
We say the functions in `Stream` are *lazy* and the functions in `Enum`
are *eager*.
Due to their laziness, streams are useful when working with large
(or even infinite) collections. When chaining many operations with `Enum`,
intermediate lists are created, while `Stream` creates a recipe of
computations that are executed at a later moment. Let's see another
example:
1..3
|> Enum.map(&IO.inspect(&1))
|> Enum.map(&(&1 * 2))
|> Enum.map(&IO.inspect(&1))
1
2
3
2
4
6
#=> [2, 4, 6]
Notice that we first printed each item in the list, then multiplied each
element by 2 and finally printed each new value. In this example, the list
was enumerated three times. Let's see an example with streams:
stream = 1..3
|> Stream.map(&IO.inspect(&1))
|> Stream.map(&(&1 * 2))
|> Stream.map(&IO.inspect(&1))
Enum.to_list(stream)
1
2
2
4
3
6
#=> [2, 4, 6]
Although the end result is the same, the order in which the items were
printed changed! With streams, we print the first item and then print
its double. In this example, the list was enumerated just once!
That's what we meant when we said earlier that streams are composable,
lazy enumerables. Notice we could call `Stream.map/2` multiple times,
effectively composing the streams and keeping them lazy. The computations
are only performed when you call a function from the `Enum` module.
## Creating Streams
There are many functions in Elixir's standard library that return
streams, some examples are:
* `IO.stream/2` - streams input lines, one by one
* `URI.query_decoder/1` - decodes a query string, pair by pair
This module also provides many convenience functions for creating streams,
like `Stream.cycle/1`, `Stream.unfold/2`, `Stream.resource/3` and more.
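  For example, `Stream.unfold/2` lazily generates a sequence from a seed:

      iex> Stream.unfold(5, fn
      ...>   0 -> nil
      ...>   n -> {n, n - 1}
      ...> end) |> Enum.to_list()
      [5, 4, 3, 2, 1]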
Note the functions in this module are guaranteed to return enumerables.
Since enumerables can have different shapes (structs, anonymous functions,
and so on), the functions in this module may return any of those shapes
and that this may change at any time. For example, a function that today
returns an anonymous function may return a struct in future releases.
"""
@doc false
defstruct enum: nil, funs: [], accs: [], done: nil
@type acc :: any
@type element :: any
@type index :: non_neg_integer
@type default :: any
# Require Stream.Reducers and its callbacks
require Stream.Reducers, as: R
defmacrop skip(acc) do
{:cont, acc}
end
defmacrop next(fun, entry, acc) do
quote do: unquote(fun).(unquote(entry), unquote(acc))
end
defmacrop acc(head, state, tail) do
quote do: [unquote(head), unquote(state) | unquote(tail)]
end
defmacrop next_with_acc(fun, entry, head, state, tail) do
quote do
{reason, [head | tail]} = unquote(fun).(unquote(entry), [unquote(head) | unquote(tail)])
{reason, [head, unquote(state) | tail]}
end
end
## Transformers
@doc """
Shortcut to `chunk(enum, n, n)`.
"""
@spec chunk(Enumerable.t, pos_integer) :: Enumerable.t
def chunk(enum, n), do: chunk(enum, n, n, nil)
@doc """
Streams the enumerable in chunks, containing `n` items each, where
each new chunk starts `step` elements into the enumerable.
`step` is optional and, if not passed, defaults to `n`, i.e.
chunks do not overlap. If the final chunk does not have `n`
elements to fill the chunk, elements are taken as necessary
from `leftover` if it was passed. If `leftover` is passed and
does not have enough elements to fill the chunk, then the chunk is
returned anyway with less than `n` elements. If `leftover` is not
passed at all or is `nil`, then the partial chunk is discarded
from the result.
## Examples
iex> Stream.chunk([1, 2, 3, 4, 5, 6], 2) |> Enum.to_list
[[1, 2], [3, 4], [5, 6]]
iex> Stream.chunk([1, 2, 3, 4, 5, 6], 3, 2) |> Enum.to_list
[[1, 2, 3], [3, 4, 5]]
iex> Stream.chunk([1, 2, 3, 4, 5, 6], 3, 2, [7]) |> Enum.to_list
[[1, 2, 3], [3, 4, 5], [5, 6, 7]]
iex> Stream.chunk([1, 2, 3, 4, 5, 6], 3, 3, []) |> Enum.to_list
[[1, 2, 3], [4, 5, 6]]
"""
@spec chunk(Enumerable.t, pos_integer, pos_integer) :: Enumerable.t
@spec chunk(Enumerable.t, pos_integer, pos_integer, Enumerable.t | nil) :: Enumerable.t
def chunk(enum, n, step, leftover \\ nil)
when is_integer(n) and n > 0 and is_integer(step) and step > 0 do
limit = :erlang.max(n, step)
if is_nil(leftover) do
lazy enum, {[], 0}, fn(f1) -> R.chunk(n, step, limit, f1) end
else
lazy enum, {[], 0},
fn(f1) -> R.chunk(n, step, limit, f1) end,
&do_chunk(&1, n, leftover, &2)
end
end
defp do_chunk(acc(_, {_, 0}, _) = acc, _, _, _) do
{:cont, acc}
end
defp do_chunk(acc(h, {buffer, count} = old, t), n, leftover, f1) do
buffer = :lists.reverse(buffer, Enum.take(leftover, n - count))
next_with_acc(f1, buffer, h, old, t)
end
@doc """
Chunks the `enum` by buffering elements for which `fun` returns
the same value and only emit them when `fun` returns a new value
or the `enum` finishes.
## Examples
iex> stream = Stream.chunk_by([1, 2, 2, 3, 4, 4, 6, 7, 7], &(rem(&1, 2) == 1))
iex> Enum.to_list(stream)
[[1], [2, 2], [3], [4, 4, 6], [7, 7]]
"""
@spec chunk_by(Enumerable.t, (element -> any)) :: Enumerable.t
def chunk_by(enum, fun) do
lazy enum, nil,
fn(f1) -> R.chunk_by(fun, f1) end,
&do_chunk_by(&1, &2)
end
defp do_chunk_by(acc(_, nil, _) = acc, _f1) do
{:cont, acc}
end
defp do_chunk_by(acc(h, {buffer, _}, t), f1) do
next_with_acc(f1, :lists.reverse(buffer), h, nil, t)
end
@doc """
Creates a stream that only emits elements if they are different from the last emitted element.
This function only ever needs to store the last emitted element.
Elements are compared using `===`.
## Examples
iex> Stream.dedup([1, 2, 3, 3, 2, 1]) |> Enum.to_list
[1, 2, 3, 2, 1]
"""
@spec dedup(Enumerable.t) :: Enumerable.t
def dedup(enum) do
dedup_by(enum, fn x -> x end)
end
@doc """
Creates a stream that only emits elements if the result of calling `fun` on the element is
different from the (stored) result of calling `fun` on the last emitted element.
## Examples
iex> Stream.dedup_by([{1, :x}, {2, :y}, {2, :z}, {1, :x}], fn {x, _} -> x end) |> Enum.to_list
[{1, :x}, {2, :y}, {1, :x}]
"""
@spec dedup_by(Enumerable.t, (element -> term)) :: Enumerable.t
def dedup_by(enum, fun) when is_function(fun, 1) do
lazy enum, nil, fn f1 -> R.dedup(fun, f1) end
end
@doc """
Lazily drops the next `n` items from the enumerable.
If a negative `n` is given, it will drop the last `n` items from
the collection. Note that the mechanism by which this is implemented
will delay the emission of any item until `n` additional items have
been emitted by the enum.
## Examples
iex> stream = Stream.drop(1..10, 5)
iex> Enum.to_list(stream)
[6, 7, 8, 9, 10]
iex> stream = Stream.drop(1..10, -5)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
"""
@spec drop(Enumerable.t, non_neg_integer) :: Enumerable.t
def drop(enum, n) when n >= 0 do
lazy enum, n, fn(f1) -> R.drop(f1) end
end
def drop(enum, n) when n < 0 do
n = abs(n)
lazy enum, {0, [], []}, fn(f1) ->
fn
entry, [h, {count, buf1, []} | t] ->
do_drop(:cont, n, entry, h, count, buf1, [], t)
entry, [h, {count, buf1, [next | buf2]} | t] ->
{reason, [h | t]} = f1.(next, [h | t])
do_drop(reason, n, entry, h, count, buf1, buf2, t)
end
end
end
defp do_drop(reason, n, entry, h, count, buf1, buf2, t) do
buf1 = [entry | buf1]
count = count + 1
if count == n do
{reason, [h, {0, [], :lists.reverse(buf1)} | t]}
else
{reason, [h, {count, buf1, buf2} | t]}
end
end
@doc """
Creates a stream that drops every `nth` item from the enumerable.
The first item is always dropped, unless `nth` is 0.
`nth` must be a non-negative integer.
## Examples
iex> stream = Stream.drop_every(1..10, 2)
iex> Enum.to_list(stream)
[2, 4, 6, 8, 10]
iex> stream = Stream.drop_every(1..1000, 1)
iex> Enum.to_list(stream)
[]
iex> stream = Stream.drop_every([1, 2, 3, 4, 5], 0)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
"""
@spec drop_every(Enumerable.t, non_neg_integer) :: Enumerable.t
def drop_every(enum, nth)
def drop_every(enum, 0), do: %Stream{enum: enum}
def drop_every([], _nth), do: %Stream{enum: []}
def drop_every(enum, nth) when is_integer(nth) and nth > 0 do
lazy enum, nth, fn(f1) -> R.drop_every(nth, f1) end
end
@doc """
Lazily drops elements of the enumerable while the given
function returns `true`.
## Examples
iex> stream = Stream.drop_while(1..10, &(&1 <= 5))
iex> Enum.to_list(stream)
[6, 7, 8, 9, 10]
"""
@spec drop_while(Enumerable.t, (element -> as_boolean(term))) :: Enumerable.t
def drop_while(enum, fun) do
lazy enum, true, fn(f1) -> R.drop_while(fun, f1) end
end
@doc """
Executes the given function for each item.
Useful for adding side effects (like printing) to a stream.
## Examples
iex> stream = Stream.each([1, 2, 3], fn(x) -> send self(), x end)
iex> Enum.to_list(stream)
iex> receive do: (x when is_integer(x) -> x)
1
iex> receive do: (x when is_integer(x) -> x)
2
iex> receive do: (x when is_integer(x) -> x)
3
"""
@spec each(Enumerable.t, (element -> term)) :: Enumerable.t
def each(enum, fun) when is_function(fun, 1) do
lazy enum, fn(f1) ->
fn(x, acc) ->
fun.(x)
f1.(x, acc)
end
end
end
@doc """
Creates a stream that will apply the given function on enumeration and
flatten the result, but only one level deep.
## Examples
iex> stream = Stream.flat_map([1, 2, 3], fn(x) -> [x, x * 2] end)
iex> Enum.to_list(stream)
[1, 2, 2, 4, 3, 6]
iex> stream = Stream.flat_map([1, 2, 3], fn(x) -> [[x]] end)
iex> Enum.to_list(stream)
[[1], [2], [3]]
"""
@spec flat_map(Enumerable.t, (element -> Enumerable.t)) :: Enumerable.t
def flat_map(enum, mapper) do
transform(enum, nil, fn val, nil -> {mapper.(val), nil} end)
end
@doc """
Creates a stream that filters elements according to
the given function on enumeration.
## Examples
iex> stream = Stream.filter([1, 2, 3], fn(x) -> rem(x, 2) == 0 end)
iex> Enum.to_list(stream)
[2]
"""
@spec filter(Enumerable.t, (element -> as_boolean(term))) :: Enumerable.t
def filter(enum, fun) do
lazy enum, fn(f1) -> R.filter(fun, f1) end
end
@doc """
Creates a stream that filters and then maps elements according
to given functions.
Exists for symmetry with `Enum.filter_map/3`.
## Examples
iex> stream = Stream.filter_map(1..6, fn(x) -> rem(x, 2) == 0 end, &(&1 * 2))
iex> Enum.to_list(stream)
[4, 8, 12]
"""
@spec filter_map(Enumerable.t, (element -> as_boolean(term)), (element -> any)) :: Enumerable.t
def filter_map(enum, filter, mapper) do
lazy enum, fn(f1) -> R.filter_map(filter, mapper, f1) end
end
@doc """
Creates a stream that emits a value after the given period `n`
in milliseconds.
The values emitted are an increasing counter starting at `0`.
  This operation will block the caller for the given interval
  every time a new item is streamed.
Do not use this function to generate a sequence of numbers.
If blocking the caller process is not necessary, use
`Stream.iterate(0, & &1 + 1)` instead.
## Examples
iex> Stream.interval(10) |> Enum.take(10)
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
"""
@spec interval(non_neg_integer) :: Enumerable.t
def interval(n) do
unfold 0, fn(count) ->
Process.sleep(n)
{count, count + 1}
end
end
@doc """
Injects the stream values into the given collectable as a side-effect.
This function is often used with `run/1` since any evaluation
is delayed until the stream is executed. See `run/1` for an example.
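  ## Examples

  Values flow through the stream unchanged while also being collected:

      iex> Stream.into([1, 2, 3], []) |> Enum.to_list()
      [1, 2, 3]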
"""
@spec into(Enumerable.t, Collectable.t, (term -> term)) :: Enumerable.t
def into(enum, collectable, transform \\ fn x -> x end) when is_function(transform, 1) do
&do_into(enum, collectable, transform, &1, &2)
end
defp do_into(enum, collectable, transform, acc, fun) do
{initial, into} = Collectable.into(collectable)
composed = fn x, [acc | collectable] ->
collectable = into.(collectable, {:cont, transform.(x)})
{reason, acc} = fun.(x, acc)
{reason, [acc | collectable]}
end
do_into(&Enumerable.reduce(enum, &1, composed), initial, into, acc)
end
defp do_into(reduce, collectable, into, {command, acc}) do
try do
reduce.({command, [acc | collectable]})
catch
kind, reason ->
stacktrace = System.stacktrace
into.(collectable, :halt)
:erlang.raise(kind, reason, stacktrace)
else
{:suspended, [acc | collectable], continuation} ->
{:suspended, acc, &do_into(continuation, collectable, into, &1)}
{reason, [acc | collectable]} ->
into.(collectable, :done)
{reason, acc}
end
end
@doc """
Creates a stream that will apply the given function on
enumeration.
## Examples
iex> stream = Stream.map([1, 2, 3], fn(x) -> x * 2 end)
iex> Enum.to_list(stream)
[2, 4, 6]
"""
@spec map(Enumerable.t, (element -> any)) :: Enumerable.t
def map(enum, fun) do
lazy enum, fn(f1) -> R.map(fun, f1) end
end
@doc """
Creates a stream that will apply the given function on
every `nth` item from the enumerable.
The first item is always passed to the given function.
`nth` must be a non-negative integer.
## Examples
iex> stream = Stream.map_every(1..10, 2, fn(x) -> x * 2 end)
iex> Enum.to_list(stream)
[2, 2, 6, 4, 10, 6, 14, 8, 18, 10]
iex> stream = Stream.map_every([1, 2, 3, 4, 5], 1, fn(x) -> x * 2 end)
iex> Enum.to_list(stream)
[2, 4, 6, 8, 10]
iex> stream = Stream.map_every(1..5, 0, fn(x) -> x * 2 end)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
"""
@spec map_every(Enumerable.t, non_neg_integer, (element -> any)) :: Enumerable.t
def map_every(enum, nth, fun)
def map_every(enum, 1, fun), do: map(enum, fun)
def map_every(enum, 0, _fun), do: %Stream{enum: enum}
def map_every([], _nth, _fun), do: %Stream{enum: []}
def map_every(enum, nth, fun) when is_integer(nth) and nth > 0 do
lazy enum, nth, fn(f1) -> R.map_every(nth, fun, f1) end
end
@doc """
Creates a stream that will reject elements according to
the given function on enumeration.
## Examples
iex> stream = Stream.reject([1, 2, 3], fn(x) -> rem(x, 2) == 0 end)
iex> Enum.to_list(stream)
[1, 3]
"""
@spec reject(Enumerable.t, (element -> as_boolean(term))) :: Enumerable.t
def reject(enum, fun) do
lazy enum, fn(f1) -> R.reject(fun, f1) end
end
@doc """
Runs the given stream.
This is useful when a stream needs to be run, for side effects,
and there is no interest in its return result.
## Examples
Open up a file, replace all `#` by `%` and stream to another file
without loading the whole file in memory:
stream = File.stream!("code")
|> Stream.map(&String.replace(&1, "#", "%"))
|> Stream.into(File.stream!("new"))
|> Stream.run
No computation will be done until we call one of the Enum functions
or `Stream.run/1`.
"""
@spec run(Enumerable.t) :: :ok
def run(stream) do
_ = Enumerable.reduce(stream, {:cont, nil}, fn(_, _) -> {:cont, nil} end)
:ok
end
@doc """
Creates a stream that applies the given function to each
element, emits the result and uses the same result as the accumulator
for the next computation.
## Examples
iex> stream = Stream.scan(1..5, &(&1 + &2))
iex> Enum.to_list(stream)
[1, 3, 6, 10, 15]
"""
@spec scan(Enumerable.t, (element, acc -> any)) :: Enumerable.t
def scan(enum, fun) do
lazy enum, :first, fn(f1) -> R.scan2(fun, f1) end
end
@doc """
Creates a stream that applies the given function to each
element, emits the result and uses the same result as the accumulator
for the next computation. Uses the given `acc` as the starting value.
## Examples
iex> stream = Stream.scan(1..5, 0, &(&1 + &2))
iex> Enum.to_list(stream)
[1, 3, 6, 10, 15]
"""
@spec scan(Enumerable.t, acc, (element, acc -> any)) :: Enumerable.t
def scan(enum, acc, fun) do
lazy enum, acc, fn(f1) -> R.scan3(fun, f1) end
end
@doc """
Lazily takes the next `count` items from the enumerable and stops
enumeration.
If a negative `count` is given, the last `count` values will be taken.
  To do so, the collection is fully enumerated, keeping up to `2 * count`
  elements in memory. Once the end of the collection is reached,
  the last `count` elements will be emitted. Therefore, using
  a negative `count` on an infinite collection will never return.
## Examples
iex> stream = Stream.take(1..100, 5)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
iex> stream = Stream.take(1..100, -5)
iex> Enum.to_list(stream)
[96, 97, 98, 99, 100]
iex> stream = Stream.cycle([1, 2, 3]) |> Stream.take(5)
iex> Enum.to_list(stream)
[1, 2, 3, 1, 2]
"""
@spec take(Enumerable.t, integer) :: Enumerable.t
def take(_enum, 0), do: %Stream{enum: []}
def take([], _count), do: %Stream{enum: []}
def take(enum, count) when is_integer(count) and count > 0 do
lazy enum, count, fn(f1) -> R.take(f1) end
end
def take(enum, count) when is_integer(count) and count < 0 do
&Enumerable.reduce(Enum.take(enum, count), &1, &2)
end
@doc """
Creates a stream that takes every `nth` item from the enumerable.
The first item is always included, unless `nth` is 0.
`nth` must be a non-negative integer.
## Examples
iex> stream = Stream.take_every(1..10, 2)
iex> Enum.to_list(stream)
[1, 3, 5, 7, 9]
iex> stream = Stream.take_every([1, 2, 3, 4, 5], 1)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
iex> stream = Stream.take_every(1..1000, 0)
iex> Enum.to_list(stream)
[]
"""
@spec take_every(Enumerable.t, non_neg_integer) :: Enumerable.t
def take_every(enum, nth)
def take_every(_enum, 0), do: %Stream{enum: []}
def take_every([], _nth), do: %Stream{enum: []}
def take_every(enum, nth) when is_integer(nth) and nth > 0 do
lazy enum, nth, fn(f1) -> R.take_every(nth, f1) end
end
@doc """
Lazily takes elements of the enumerable while the given
function returns `true`.
## Examples
iex> stream = Stream.take_while(1..100, &(&1 <= 5))
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5]
"""
@spec take_while(Enumerable.t, (element -> as_boolean(term))) :: Enumerable.t
def take_while(enum, fun) do
lazy enum, fn(f1) -> R.take_while(fun, f1) end
end
@doc """
Creates a stream that emits a single value after `n` milliseconds.
  The value emitted is `0`. This operation will block the caller for
  the given time until the item is streamed.
## Examples
iex> Stream.timer(10) |> Enum.to_list
[0]
"""
@spec timer(non_neg_integer) :: Enumerable.t
def timer(n) do
take(interval(n), 1)
end
@doc """
Transforms an existing stream.
It expects an accumulator and a function that receives each stream item
and an accumulator, and must return a tuple containing a new stream
(often a list) with the new accumulator or a tuple with `:halt` as first
element and the accumulator as second.
Note: this function is similar to `Enum.flat_map_reduce/3` except the
latter returns both the flat list and accumulator, while this one returns
only the stream.
## Examples
`Stream.transform/3` is useful as it can be used as the basis to implement
many of the functions defined in this module. For example, we can implement
`Stream.take(enum, n)` as follows:
iex> enum = 1..100
iex> n = 3
iex> stream = Stream.transform(enum, 0, fn i, acc ->
...> if acc < n, do: {[i], acc + 1}, else: {:halt, acc}
...> end)
iex> Enum.to_list(stream)
[1, 2, 3]
"""
@spec transform(Enumerable.t, acc, fun) :: Enumerable.t
when fun: (element, acc -> {Enumerable.t, acc} | {:halt, acc}),
acc: any
def transform(enum, acc, reducer) when is_function(reducer, 2) do
&do_transform(enum, fn -> acc end, reducer, &1, &2, nil)
end
@doc """
Transforms an existing stream with function-based start and finish.
The accumulator is only calculated when transformation starts. It also
allows an after function to be given which is invoked when the stream
halts or completes.
This function can be seen as a combination of `Stream.resource/3` with
`Stream.transform/3`.
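  ## Examples

  Doubling each element while threading a running sum through the accumulator:

      iex> stream = Stream.transform(1..4, fn -> 0 end, fn i, acc ->
      ...>   {[i * 2], acc + i}
      ...> end, fn _acc -> :ok end)
      iex> Enum.to_list(stream)
      [2, 4, 6, 8]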
"""
@spec transform(Enumerable.t, (() -> acc), fun, (acc -> term)) :: Enumerable.t
when fun: (element, acc -> {Enumerable.t, acc} | {:halt, acc}),
acc: any
def transform(enum, start_fun, reducer, after_fun)
when is_function(start_fun, 0) and is_function(reducer, 2) and is_function(after_fun, 1) do
&do_transform(enum, start_fun, reducer, &1, &2, after_fun)
end
defp do_transform(enumerables, user_acc, user, inner_acc, fun, after_fun) do
inner = &do_transform_each(&1, &2, fun)
step = &do_transform_step(&1, &2)
next = &Enumerable.reduce(enumerables, &1, step)
do_transform(user_acc.(), user, fun, :cont, next, inner_acc, inner, after_fun)
end
defp do_transform(user_acc, _user, _fun, _next_op, next, {:halt, inner_acc}, _inner, after_fun) do
next.({:halt, []})
do_after(after_fun, user_acc)
{:halted, inner_acc}
end
defp do_transform(user_acc, user, fun, next_op, next, {:suspend, inner_acc}, inner, after_fun) do
{:suspended, inner_acc, &do_transform(user_acc, user, fun, next_op, next, &1, inner, after_fun)}
end
defp do_transform(user_acc, _user, _fun, :halt, _next, {_, inner_acc}, _inner, after_fun) do
do_after(after_fun, user_acc)
{:halted, inner_acc}
end
defp do_transform(user_acc, user, fun, :cont, next, inner_acc, inner, after_fun) do
try do
next.({:cont, []})
catch
kind, reason ->
stacktrace = System.stacktrace
do_after(after_fun, user_acc)
:erlang.raise(kind, reason, stacktrace)
else
{:suspended, [val], next} ->
do_transform_user(val, user_acc, user, fun, :cont, next, inner_acc, inner, after_fun)
{_, [val]} ->
do_transform_user(val, user_acc, user, fun, :halt, next, inner_acc, inner, after_fun)
{_, []} ->
do_transform(user_acc, user, fun, :halt, next, inner_acc, inner, after_fun)
end
end
defp do_transform_user(val, user_acc, user, fun, next_op, next, inner_acc, inner, after_fun) do
user.(val, user_acc)
catch
kind, reason ->
stacktrace = System.stacktrace
next.({:halt, []})
do_after(after_fun, user_acc)
:erlang.raise(kind, reason, stacktrace)
else
{[], user_acc} ->
do_transform(user_acc, user, fun, next_op, next, inner_acc, inner, after_fun)
{list, user_acc} when is_list(list) ->
do_list_transform(user_acc, user, fun, next_op, next, inner_acc, inner,
&Enumerable.List.reduce(list, &1, fun), after_fun)
{:halt, user_acc} ->
next.({:halt, []})
do_after(after_fun, user_acc)
{:halted, elem(inner_acc, 1)}
{other, user_acc} ->
do_enum_transform(user_acc, user, fun, next_op, next, inner_acc, inner,
&Enumerable.reduce(other, &1, inner), after_fun)
end
defp do_list_transform(user_acc, user, fun, next_op, next, inner_acc, inner, reduce, after_fun) do
try do
reduce.(inner_acc)
catch
kind, reason ->
stacktrace = System.stacktrace
next.({:halt, []})
do_after(after_fun, user_acc)
:erlang.raise(kind, reason, stacktrace)
else
{:done, acc} ->
do_transform(user_acc, user, fun, next_op, next, {:cont, acc}, inner, after_fun)
{:halted, acc} ->
next.({:halt, []})
do_after(after_fun, user_acc)
{:halted, acc}
{:suspended, acc, c} ->
{:suspended, acc, &do_list_transform(user_acc, user, fun, next_op, next, &1, inner, c, after_fun)}
end
end
defp do_enum_transform(user_acc, user, fun, next_op, next, {op, inner_acc}, inner, reduce, after_fun) do
try do
reduce.({op, [:outer | inner_acc]})
catch
kind, reason ->
stacktrace = System.stacktrace
next.({:halt, []})
do_after(after_fun, user_acc)
:erlang.raise(kind, reason, stacktrace)
else
# Only take into account outer halts when the op is not halt itself.
# Otherwise, we were the ones wishing to halt, so we should just stop.
{:halted, [:outer | acc]} when op != :halt ->
do_transform(user_acc, user, fun, next_op, next, {:cont, acc}, inner, after_fun)
{:halted, [_ | acc]} ->
next.({:halt, []})
do_after(after_fun, user_acc)
{:halted, acc}
{:done, [_ | acc]} ->
do_transform(user_acc, user, fun, next_op, next, {:cont, acc}, inner, after_fun)
{:suspended, [_ | acc], c} ->
{:suspended, acc, &do_enum_transform(user_acc, user, fun, next_op, next, &1, inner, c, after_fun)}
end
end
defp do_after(nil, _user_acc), do: :ok
defp do_after(fun, user_acc), do: fun.(user_acc)
defp do_transform_each(x, [:outer | acc], f) do
case f.(x, acc) do
{:halt, res} -> {:halt, [:inner | res]}
{op, res} -> {op, [:outer | res]}
end
end
defp do_transform_step(x, acc) do
{:suspend, [x | acc]}
end
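# For reference, the clauses above implement `Stream.transform/3`; a small
# usage sketch (values assume the documented semantics of transform/3):
#
#     Stream.transform(1..100, 0, fn i, sum ->
#       if sum < 10, do: {[i], sum + i}, else: {:halt, sum}
#     end) |> Enum.to_list
#     #=> [1, 2, 3, 4]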
@doc """
Creates a stream that only emits elements if they are unique.
Keep in mind that, in order to know if an element is unique
or not, this function needs to store all unique values emitted
by the stream. Therefore, if the stream is infinite, the number
of items stored will grow infinitely, never being garbage collected.
## Examples
iex> Stream.uniq([1, 2, 3, 3, 2, 1]) |> Enum.to_list
[1, 2, 3]
"""
@spec uniq(Enumerable.t) :: Enumerable.t
def uniq(enum) do
uniq_by(enum, fn x -> x end)
end
@doc false
def uniq(enum, fun) do
IO.warn "Stream.uniq/2 is deprecated, use Stream.uniq_by/2 instead"
uniq_by(enum, fun)
end
@doc """
Creates a stream that only emits elements if they are unique, by removing the
elements for which function `fun` returned duplicate items.
The function `fun` maps every element to a term which is used to
determine if two elements are duplicates.
Keep in mind that, in order to know if an element is unique
or not, this function needs to store all unique values emitted
by the stream. Therefore, if the stream is infinite, the number
of items stored will grow infinitely, never being garbage collected.
## Example
iex> Stream.uniq_by([{1, :x}, {2, :y}, {1, :z}], fn {x, _} -> x end) |> Enum.to_list
[{1, :x}, {2, :y}]
iex> Stream.uniq_by([a: {:tea, 2}, b: {:tea, 2}, c: {:coffee, 1}], fn {_, y} -> y end) |> Enum.to_list
[a: {:tea, 2}, c: {:coffee, 1}]
"""
@spec uniq_by(Enumerable.t, (element -> term)) :: Enumerable.t
def uniq_by(enum, fun) do
lazy enum, %{}, fn f1 -> R.uniq_by(fun, f1) end
end
@doc """
Creates a stream where each item in the enumerable will
be wrapped in a tuple alongside its index.
If an `offset` is given, we will index from the given offset instead of from zero.
## Examples
iex> stream = Stream.with_index([1, 2, 3])
iex> Enum.to_list(stream)
[{1, 0}, {2, 1}, {3, 2}]
iex> stream = Stream.with_index([1, 2, 3], 3)
iex> Enum.to_list(stream)
[{1, 3}, {2, 4}, {3, 5}]
"""
@spec with_index(Enumerable.t) :: Enumerable.t
@spec with_index(Enumerable.t, integer) :: Enumerable.t
def with_index(enum, offset \\ 0) do
lazy enum, offset, fn(f1) -> R.with_index(f1) end
end
## Combiners
@doc """
Creates a stream that enumerates each enumerable in an enumerable.
## Examples
iex> stream = Stream.concat([1..3, 4..6, 7..9])
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
"""
@spec concat(Enumerable.t) :: Enumerable.t
def concat(enumerables) do
flat_map(enumerables, &(&1))
end
@doc """
Creates a stream that enumerates the first argument, followed by the second.
## Examples
iex> stream = Stream.concat(1..3, 4..6)
iex> Enum.to_list(stream)
[1, 2, 3, 4, 5, 6]
iex> stream1 = Stream.cycle([1, 2, 3])
iex> stream2 = Stream.cycle([4, 5, 6])
iex> stream = Stream.concat(stream1, stream2)
iex> Enum.take(stream, 6)
[1, 2, 3, 1, 2, 3]
"""
@spec concat(Enumerable.t, Enumerable.t) :: Enumerable.t
def concat(first, second) do
flat_map([first, second], &(&1))
end
@doc """
Zips two collections together, lazily.
The zipping finishes as soon as any enumerable completes.
## Examples
iex> concat = Stream.concat(1..3, 4..6)
iex> cycle = Stream.cycle([:a, :b, :c])
iex> Stream.zip(concat, cycle) |> Enum.to_list
[{1, :a}, {2, :b}, {3, :c}, {4, :a}, {5, :b}, {6, :c}]
"""
@spec zip(Enumerable.t, Enumerable.t) :: Enumerable.t
def zip(left, right), do: zip([left, right])
@doc """
Zips corresponding elements from a collection of enumerables
into one stream of tuples.
The zipping finishes as soon as any enumerable completes.
## Examples
iex> concat = Stream.concat(1..3, 4..6)
iex> cycle = Stream.cycle(["foo", "bar", "baz"])
iex> Stream.zip([concat, [:a, :b, :c], cycle]) |> Enum.to_list
[{1, :a, "foo"}, {2, :b, "bar"}, {3, :c, "baz"}]
"""
@spec zip([Enumerable.t]) :: Enumerable.t
def zip(enumerables) do
step = &do_zip_step(&1, &2)
enum_funs = Enum.map(enumerables, fn enum ->
{&Enumerable.reduce(enum, &1, step), :cont}
end)
&do_zip(enum_funs, &1, &2)
end
# This implementation of do_zip/3 works for any number of
# streams to zip, even if right now zip/2 only zips two streams.
defp do_zip(zips, {:halt, acc}, _fun) do
do_zip_close(zips)
{:halted, acc}
end
defp do_zip(zips, {:suspend, acc}, fun) do
{:suspended, acc, &do_zip(zips, &1, fun)}
end
defp do_zip(zips, {:cont, acc}, callback) do
try do
do_zip_next_tuple(zips, acc, callback, [], [])
catch
kind, reason ->
stacktrace = System.stacktrace
do_zip_close(zips)
:erlang.raise(kind, reason, stacktrace)
else
{:next, buffer, acc} ->
do_zip(buffer, acc, callback)
{:done, _acc} = other ->
other
end
end
# do_zip_next_tuple/5 computes the next tuple formed by
# the next element of each zipped stream.
defp do_zip_next_tuple([{_, :halt} | zips], acc, _callback, _yielded_elems, buffer) do
do_zip_close(:lists.reverse(buffer, zips))
{:done, acc}
end
defp do_zip_next_tuple([{fun, :cont} | zips], acc, callback, yielded_elems, buffer) do
case fun.({:cont, []}) do
{:suspended, [elem], fun} ->
do_zip_next_tuple(zips, acc, callback, [elem | yielded_elems], [{fun, :cont} | buffer])
{_, [elem]} ->
do_zip_next_tuple(zips, acc, callback, [elem | yielded_elems], [{fun, :halt} | buffer])
{_, []} ->
# The current zipped stream terminated, so we close all the streams
# and return {:done, acc} (which is returned as is by do_zip/3).
do_zip_close(:lists.reverse(buffer, zips))
{:done, acc}
end
end
defp do_zip_next_tuple([] = _zips, acc, callback, yielded_elems, buffer) do
# "yielded_elems" is a reversed list of results for the current iteration of
# zipping: it needs to be reversed and converted to a tuple to have the next
# tuple in the list resulting from zipping.
zipped = List.to_tuple(:lists.reverse(yielded_elems))
{:next, :lists.reverse(buffer), callback.(zipped, acc)}
end
defp do_zip_close(zips) do
:lists.foreach(fn {fun, _} -> fun.({:halt, []}) end, zips)
end
defp do_zip_step(x, []) do
{:suspend, [x]}
end
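# Because do_zip_step/2 suspends after every element, do_zip_next_tuple/5
# pulls exactly one element from each stream per iteration, and zipping
# stops with the shortest enumerable, e.g.:
#
#     Stream.zip([[1, 2, 3], [:a, :b]]) |> Enum.to_list
#     #=> [{1, :a}, {2, :b}]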
## Sources
@doc """
Creates a stream that cycles through the given enumerable,
infinitely.
## Examples
iex> stream = Stream.cycle([1, 2, 3])
iex> Enum.take(stream, 5)
[1, 2, 3, 1, 2]
"""
@spec cycle(Enumerable.t) :: Enumerable.t
def cycle(enumerable)
def cycle(enumerable) when is_list(enumerable) do
unfold {enumerable, enumerable}, fn
{source, [h | t]} -> {h, {source, t}}
{source = [h | t], []} -> {h, {source, t}}
end
end
def cycle(enumerable) do
fn acc, fun ->
inner = &do_cycle_each(&1, &2, fun)
outer = &Enumerable.reduce(enumerable, &1, inner)
do_cycle(outer, outer, acc)
end
end
defp do_cycle(_reduce, _cycle, {:halt, acc}) do
{:halted, acc}
end
defp do_cycle(reduce, cycle, {:suspend, acc}) do
{:suspended, acc, &do_cycle(reduce, cycle, &1)}
end
defp do_cycle(reduce, cycle, acc) do
try do
reduce.(acc)
catch
{:stream_cycle, acc} ->
{:halted, acc}
else
{state, acc} when state in [:done, :halted] ->
do_cycle(cycle, cycle, {:cont, acc})
{:suspended, acc, continuation} ->
{:suspended, acc, &do_cycle(continuation, cycle, &1)}
end
end
defp do_cycle_each(x, acc, f) do
case f.(x, acc) do
{:halt, h} -> throw({:stream_cycle, h})
{_, _} = o -> o
end
end
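# The {:stream_cycle, acc} throw above is how a consumer-requested halt
# escapes the (potentially infinite) inner reduction, e.g.:
#
#     Stream.cycle(1..3) |> Enum.take(4)
#     #=> [1, 2, 3, 1]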
@doc """
Emits a sequence of values, starting with `start_value`. Successive
values are generated by calling `next_fun` on the previous value.
## Examples
iex> Stream.iterate(0, &(&1+1)) |> Enum.take(5)
[0, 1, 2, 3, 4]
"""
@spec iterate(element, (element -> element)) :: Enumerable.t
def iterate(start_value, next_fun) do
unfold({:ok, start_value}, fn
{:ok, value} ->
{value, {:next, value}}
{:next, value} ->
next = next_fun.(value)
{next, {:next, next}}
end)
end
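# The {:ok, value} / {:next, value} states above ensure `next_fun` is never
# invoked for the start value, keeping iterate/2 fully lazy, e.g.:
#
#     Stream.iterate(2, &(&1 * 2)) |> Enum.take(4)
#     #=> [2, 4, 8, 16]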
@doc """
Returns a stream generated by calling `generator_fun` repeatedly.
## Examples
# Although not necessary, let's seed the random algorithm
iex> :rand.seed(:exsplus, {1, 2, 3})
iex> Stream.repeatedly(&:rand.uniform/0) |> Enum.take(3)
[0.40502929729990744, 0.45336720247823126, 0.04094511692041057]
"""
@spec repeatedly((() -> element)) :: Enumerable.t
def repeatedly(generator_fun) when is_function(generator_fun, 0) do
&do_repeatedly(generator_fun, &1, &2)
end
defp do_repeatedly(generator_fun, {:suspend, acc}, fun) do
{:suspended, acc, &do_repeatedly(generator_fun, &1, fun)}
end
defp do_repeatedly(_generator_fun, {:halt, acc}, _fun) do
{:halted, acc}
end
defp do_repeatedly(generator_fun, {:cont, acc}, fun) do
do_repeatedly(generator_fun, fun.(generator_fun.(), acc), fun)
end
@doc """
Emits a sequence of values for the given resource.
Similar to `transform/3` but the initial accumulated value is
computed lazily via `start_fun` and executes an `after_fun` at
the end of enumeration (both in cases of success and failure).
Successive values are generated by calling `next_fun` with the
previous accumulator (the initial value being the result returned
by `start_fun`) and it must return a tuple containing a list
of items to be emitted and the next accumulator. The enumeration
finishes if it returns `{:halt, acc}`.
As the name says, this function is useful to stream values from
resources.
## Examples
Stream.resource(fn -> File.open!("sample") end,
fn file ->
case IO.read(file, :line) do
data when is_binary(data) -> {[data], file}
_ -> {:halt, file}
end
end,
fn file -> File.close(file) end)
"""
@spec resource((() -> acc), (acc -> {[element], acc} | {:halt, acc}), (acc -> term)) :: Enumerable.t
def resource(start_fun, next_fun, after_fun) do
&do_resource(start_fun.(), next_fun, &1, &2, after_fun)
end
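# A self-contained sketch of resource/3 that needs no file system (the
# counter-based start/next/after functions here are illustrative, not part
# of the documented example above):
#
#     Stream.resource(fn -> 0 end,
#                     fn 3 -> {:halt, 3}
#                        n -> {[n], n + 1}
#                     end,
#                     fn _ -> :ok end) |> Enum.to_list
#     #=> [0, 1, 2]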
defp do_resource(next_acc, next_fun, {:suspend, acc}, fun, after_fun) do
{:suspended, acc, &do_resource(next_acc, next_fun, &1, fun, after_fun)}
end
defp do_resource(next_acc, _next_fun, {:halt, acc}, _fun, after_fun) do
after_fun.(next_acc)
{:halted, acc}
end
defp do_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun) do
try do
# Optimize the most common cases
case next_fun.(next_acc) do
{[], next_acc} -> {:opt, {:cont, acc}, next_acc}
{[v], next_acc} -> {:opt, fun.(v, acc), next_acc}
{_, _} = other -> other
end
catch
kind, reason ->
stacktrace = System.stacktrace
after_fun.(next_acc)
:erlang.raise(kind, reason, stacktrace)
else
{:opt, acc, next_acc} ->
do_resource(next_acc, next_fun, acc, fun, after_fun)
{:halt, next_acc} ->
do_resource(next_acc, next_fun, {:halt, acc}, fun, after_fun)
{list, next_acc} when is_list(list) ->
do_list_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun,
&Enumerable.List.reduce(list, &1, fun))
{enum, next_acc} ->
inner = &do_resource_each(&1, &2, fun)
do_enum_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun,
&Enumerable.reduce(enum, &1, inner))
end
end
defp do_list_resource(next_acc, next_fun, acc, fun, after_fun, reduce) do
try do
reduce.(acc)
catch
kind, reason ->
stacktrace = System.stacktrace
after_fun.(next_acc)
:erlang.raise(kind, reason, stacktrace)
else
{:done, acc} ->
do_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun)
{:halted, acc} ->
do_resource(next_acc, next_fun, {:halt, acc}, fun, after_fun)
{:suspended, acc, c} ->
{:suspended, acc, &do_list_resource(next_acc, next_fun, &1, fun, after_fun, c)}
end
end
defp do_enum_resource(next_acc, next_fun, {op, acc}, fun, after_fun, reduce) do
try do
reduce.({op, [:outer | acc]})
catch
kind, reason ->
stacktrace = System.stacktrace
after_fun.(next_acc)
:erlang.raise(kind, reason, stacktrace)
else
{:halted, [:outer | acc]} ->
do_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun)
{:halted, [:inner | acc]} ->
do_resource(next_acc, next_fun, {:halt, acc}, fun, after_fun)
{:done, [_ | acc]} ->
do_resource(next_acc, next_fun, {:cont, acc}, fun, after_fun)
{:suspended, [_ | acc], c} ->
{:suspended, acc, &do_enum_resource(next_acc, next_fun, &1, fun, after_fun, c)}
end
end
defp do_resource_each(x, [:outer | acc], f) do
case f.(x, acc) do
{:halt, res} -> {:halt, [:inner | res]}
{op, res} -> {op, [:outer | res]}
end
end
@doc """
Emits a sequence of values for the given accumulator.
Successive values are generated by calling `next_fun` with the previous
accumulator and it must return a tuple with the current value and next
accumulator. The enumeration finishes if it returns `nil`.
## Examples
iex> Stream.unfold(5, fn 0 -> nil; n -> {n, n-1} end) |> Enum.to_list()
[5, 4, 3, 2, 1]
"""
@spec unfold(acc, (acc -> {element, acc} | nil)) :: Enumerable.t
def unfold(next_acc, next_fun) do
&do_unfold(next_acc, next_fun, &1, &2)
end
defp do_unfold(next_acc, next_fun, {:suspend, acc}, fun) do
{:suspended, acc, &do_unfold(next_acc, next_fun, &1, fun)}
end
defp do_unfold(_next_acc, _next_fun, {:halt, acc}, _fun) do
{:halted, acc}
end
defp do_unfold(next_acc, next_fun, {:cont, acc}, fun) do
case next_fun.(next_acc) do
nil -> {:done, acc}
{v, next_acc} -> do_unfold(next_acc, next_fun, fun.(v, acc), fun)
end
end
## Helpers
@compile {:inline, lazy: 2, lazy: 3, lazy: 4}
defp lazy(%Stream{done: nil, funs: funs} = lazy, fun),
do: %{lazy | funs: [fun | funs]}
defp lazy(enum, fun),
do: %Stream{enum: enum, funs: [fun]}
defp lazy(%Stream{done: nil, funs: funs, accs: accs} = lazy, acc, fun),
do: %{lazy | funs: [fun | funs], accs: [acc | accs]}
defp lazy(enum, acc, fun),
do: %Stream{enum: enum, funs: [fun], accs: [acc]}
defp lazy(%Stream{done: nil, funs: funs, accs: accs} = lazy, acc, fun, done),
do: %{lazy | funs: [fun | funs], accs: [acc | accs], done: done}
defp lazy(enum, acc, fun, done),
do: %Stream{enum: enum, funs: [fun], accs: [acc], done: done}
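# The lazy/2..4 helpers fuse stages: wrapping an already-lazy stream only
# prepends to :funs/:accs instead of nesting structs, so a pipeline such as
#
#     [1, 2, 3] |> Stream.map(&(&1 * 2)) |> Stream.filter(&(&1 > 2))
#
# builds a single %Stream{} and traverses the source once when enumerated.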
end
defimpl Enumerable, for: Stream do
@compile :inline_list_funs
def reduce(lazy, acc, fun) do
do_reduce(lazy, acc, fn x, [acc] ->
{reason, acc} = fun.(x, acc)
{reason, [acc]}
end)
end
def count(_lazy) do
{:error, __MODULE__}
end
def member?(_lazy, _value) do
{:error, __MODULE__}
end
defp do_reduce(%Stream{enum: enum, funs: funs, accs: accs, done: done}, acc, fun) do
composed = :lists.foldl(fn fun, acc -> fun.(acc) end, fun, funs)
do_each(&Enumerable.reduce(enum, &1, composed),
done && {done, fun}, :lists.reverse(accs), acc)
end
defp do_each(reduce, done, accs, {command, acc}) do
case reduce.({command, [acc | accs]}) do
{:suspended, [acc | accs], continuation} ->
{:suspended, acc, &do_each(continuation, done, accs, &1)}
{:halted, accs} ->
do_done {:halted, accs}, done
{:done, accs} ->
do_done {:done, accs}, done
end
end
defp do_done({reason, [acc | _]}, nil), do: {reason, acc}
defp do_done({reason, [acc | t]}, {done, fun}) do
[h | _] = Enum.reverse(t)
case done.([acc, h], fun) do
{:cont, [acc | _]} -> {reason, acc}
{:halt, [acc | _]} -> {:halted, acc}
{:suspend, [acc | _]} -> {:suspended, acc, &({:done, elem(&1, 1)})}
end
end
end
defimpl Inspect, for: Stream do
import Inspect.Algebra
def inspect(%{enum: enum, funs: funs}, opts) do
inner = [enum: enum, funs: Enum.reverse(funs)]
concat ["#Stream<", to_doc(inner, opts), ">"]
end
end
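# With this implementation, inspecting a stream yields something like
# #Stream<[enum: [1, 2, 3], funs: [#Function<...>]]> (the exact function
# representation varies by runtime).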
# File: mix.exs (repo: hissssst/Elixir-Slack, license: MIT)
defmodule Slack.Mixfile do
  use Mix.Project

  def project do
    [
      app: :slack,
      version: "0.19.0",
      elixir: "~> 1.6",
      elixirc_paths: elixirc_paths(Mix.env()),
      name: "Slack",
      deps: deps(),
      docs: docs(),
      source_url: "https://github.com/BlakeWilliams/Elixir-Slack",
      description: "A Slack Real Time Messaging API client.",
      package: package()
    ]
  end

  defp elixirc_paths(:test), do: ["lib", "test/support"]
  defp elixirc_paths(_), do: ["lib"]

  def application do
    [
      extra_applications: [:logger]
    ]
  end

  defp deps do
    [
      {:httpoison, "~> 1.2"},
      {:websocket_client, "~> 1.2.4"},
      {:poison, "~> 4.0"},
      {:ex_doc, "~> 0.19", only: :dev},
      {:credo, "~> 0.5", only: [:dev, :test]},
      {:plug, "~> 1.6", only: :test},
      {:cowboy, "~> 1.0.0", only: :test}
    ]
  end

  def docs do
    [
      {:main, Slack},
      {:assets, "guides/assets"},
      {:extra_section, "GUIDES"},
      {:extras, ["guides/token_generation_instructions.md"]}
    ]
  end

  defp package do
    %{
      maintainers: ["Blake Williams"],
      licenses: ["MIT"],
      links: %{
        Github: "https://github.com/BlakeWilliams/Elixir-Slack",
        Documentation: "http://hexdocs.pm/slack/"
      }
    }
  end
end
# File: web/controllers/auth_controller.ex (repo: aforward-oss/phoenix_guardian, license: MIT)
defmodule PhoenixGuardian.AuthController do
  @moduledoc """
  Handles the Überauth integration.

  This controller implements the request and callback phases for all providers.
  The actual creation and lookup of users/authorizations is handled by UserFromAuth
  """
  use PhoenixGuardian.Web, :controller

  alias PhoenixGuardian.UserFromAuth

  plug Ueberauth

  def login(conn, _params, current_user, _claims) do
    render conn, "login.html", current_user: current_user, current_auths: auths(current_user)
  end

  def callback(%Plug.Conn{assigns: %{ueberauth_failure: fails}} = conn, _params, current_user, _claims) do
    conn
    |> put_flash(:error, hd(fails.errors).message)
    |> render("login.html", current_user: current_user, current_auths: auths(current_user))
  end

  def callback(%Plug.Conn{assigns: %{ueberauth_auth: auth}} = conn, _params, current_user, _claims) do
    case UserFromAuth.get_or_insert(auth, current_user, Repo) do
      {:ok, user} ->
        conn
        |> put_flash(:info, "Signed in as #{user.name}")
        |> Guardian.Plug.sign_in(user, :token, perms: %{default: Guardian.Permissions.max})
        |> redirect(to: private_page_path(conn, :index))

      {:error, _reason} ->
        conn
        |> put_flash(:error, "Could not authenticate")
        |> render("login.html", current_user: current_user, current_auths: auths(current_user))
    end
  end

  def logout(conn, _params, current_user, _claims) do
    if current_user do
      conn
      # This clears the whole session.
      # We could use sign_out(:default) to just revoke this token
      # but I prefer to clear out the session. This means that because we
      # use tokens in two locations - :default and :admin - we need to load it (see above)
      |> Guardian.Plug.sign_out
      |> put_flash(:info, "Signed out")
      |> redirect(to: "/")
    else
      conn
      |> put_flash(:info, "Not logged in")
      |> redirect(to: "/")
    end
  end

  defp auths(nil), do: []

  defp auths(%PhoenixGuardian.User{} = user) do
    Ecto.Model.assoc(user, :authorizations)
    |> Repo.all
    |> Enum.map(&(&1.provider))
  end
end
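# Hypothetical router wiring for this controller (the scope, pipeline, and
# route names below are assumptions for illustration, not part of this file):
#
#     scope "/auth", PhoenixGuardian do
#       pipe_through :browser
#       get "/login", AuthController, :login
#       get "/logout", AuthController, :logout
#       get "/:provider", AuthController, :request
#       get "/:provider/callback", AuthController, :callback
#     end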
# File: config/dev.exs (repo: Mdlkxzmcp/mango, license: MIT)
use Mix.Config
config :mango, MangoWeb.Endpoint,
  http: [port: 4000],
  debug_errors: true,
  code_reloader: true,
  check_origin: false,
  watchers: [
    node: [
      "node_modules/brunch/bin/brunch",
      "watch",
      "--stdin",
      cd: Path.expand("../assets", __DIR__)
    ]
  ]

# ## SSL Support
#
# In order to use HTTPS in development, a self-signed
# certificate can be generated by running the following
# command from your terminal:
#
#     openssl req -new -newkey rsa:4096 -days 365 -nodes -x509 -subj "/C=US/ST=Denial/L=Springfield/O=Dis/CN=www.example.com" -keyout priv/server.key -out priv/server.pem
#
# The `http:` config above can be replaced with:
#
#     https: [port: 4000, keyfile: "priv/server.key", certfile: "priv/server.pem"],
#
# If desired, both `http:` and `https:` keys can be
# configured to run both http and https servers on
# different ports.

config :mango, MangoWeb.Endpoint,
  live_reload: [
    patterns: [
      ~r{priv/static/.*(js|css|png|jpeg|jpg|gif|svg)$},
      ~r{priv/gettext/.*(po)$},
      ~r{lib/mango_web/views/.*(ex)$},
      ~r{lib/mango_web/templates/.*(eex)$}
    ]
  ]

# Do not include metadata nor timestamps in development logs
config :logger, :console, format: "[$level] $message\n"

# Set a higher stacktrace during development. Avoid configuring such
# in production as building large stacktraces may be expensive.
config :phoenix, :stacktrace_depth, 20

config :mango, Mango.Repo,
  adapter: Ecto.Adapters.Postgres,
  username: "postgres",
  password: "postgres",
  database: "mango_dev",
  hostname: "localhost",
  pool_size: 10