defmodule VintageNetDirect do
@moduledoc """
Support for directly connected Ethernet configurations.
Direct Ethernet connections are those where the network connects only two
devices. Examples include a virtual Ethernet interface being run over a USB
cable. This is a popular Nerves configuration for development where the USB
cable provides power and networking to a Raspberry Pi, Beaglebone or other
"USB Gadget"-capable board. Another example would be a direct Ethernet
connection between a device and a development computer. Such a connection is
handy when a router isn't readily available.
This `VintageNet.Technology` implementation works by assigning a static IP address to the
Ethernet interface on this side of the connection and running a DHCP server
to assign an IP address to the other side of the cable. IP addresses are
computed based on the hostname and interface name. A /30 subnet is used for
the two IP addresses (one for each side of the cable) to try to avoid conflicts
with IP subnets used on either computer. The DHCP server in use is very
simple and assigns the same IP address every time.
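As a back-of-the-envelope illustration (not part of the library), a /30 prefix leaves 2 host bits, which yields exactly two usable addresses once the network and broadcast addresses are excluded - one per side of the cable:

```elixir
prefix_length = 30
host_bits = 32 - prefix_length
total_addresses = Integer.pow(2, host_bits)
# subtract the network and broadcast addresses
usable_hosts = total_addresses - 2
```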
Note that many decisions were made to make this use case work well. If
you're thinking about use cases with more than just the one cable and two
endpoints, you'll want to look elsewhere.
Configurations for this technology are maps with a `:type` field set to
`VintageNetDirect`. `VintageNetDirect`-specific options are in a map under
the `:vintage_net_direct` key (formerly the `:gadget` key). These include:
* `:hostname` - if non-nil, this overrides the hostname used for computing
a unique IP address for this interface. If unset, `:inet.gethostname/0`
is used.
Most users should specify the following configuration:
```elixir
%{type: VintageNetDirect}
```
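At runtime, a configuration like the one above is typically applied with
`VintageNet.configure/2`. The interface name below is only an example; use
whatever gadget interface your board exposes:

```elixir
VintageNet.configure("usb0", %{type: VintageNetDirect})
```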
"""
@behaviour VintageNet.Technology
alias VintageNet.Interface.RawConfig
alias VintageNet.IP.IPv4Config
@impl VintageNet.Technology
def normalize(%{type: __MODULE__} = config) do
normalized =
get_specific_options(config)
|> normalize_options()
%{type: __MODULE__, vintage_net_direct: normalized}
end
defp get_specific_options(config) do
Map.get(config, :vintage_net_direct) || Map.get(config, :gadget)
end
defp normalize_options(%{hostname: hostname}) when is_binary(hostname) do
%{hostname: hostname}
end
defp normalize_options(_specific_config), do: %{}
@impl VintageNet.Technology
def to_raw_config(ifname, %{type: __MODULE__} = config, opts) do
normalized_config = normalize(config)
# Derive the subnet based on the ifname, but allow the user to force a hostname
subnet =
case normalized_config.vintage_net_direct do
%{hostname: hostname} ->
OneDHCPD.IPCalculator.default_subnet(ifname, hostname)
_ ->
OneDHCPD.IPCalculator.default_subnet(ifname)
end
ipv4_config = %{
ipv4: %{
method: :static,
address: OneDHCPD.IPCalculator.our_ip_address(subnet),
prefix_length: OneDHCPD.IPCalculator.prefix_length()
}
}
%RawConfig{
ifname: ifname,
type: __MODULE__,
source_config: normalized_config,
required_ifnames: [ifname],
child_specs: [{OneDHCPD.Server, [ifname, [subnet: subnet]]}]
}
|> IPv4Config.add_config(ipv4_config, opts)
end
@impl VintageNet.Technology
def ioctl(_ifname, _command, _args) do
{:error, :unsupported}
end
@impl VintageNet.Technology
def check_system(_opts) do
# TODO
:ok
end
end
if Code.ensure_loaded?(Decorator.Define) do
defmodule Nebulex.Caching do
@moduledoc ~S"""
Declarative annotation-based caching via function
[decorators](https://github.com/arjan/decorator).
For caching declaration, the abstraction provides three Elixir function
decorators: `cacheable`, `cache_evict`, and `cache_put`, which allow
functions to trigger cache population or cache eviction.
Let us take a closer look at each annotation.
> Inspired by [Spring Cache Abstraction](https://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/cache.html).
## `cacheable` decorator
As the name implies, `cacheable` is used to demarcate functions that are
cacheable - that is, functions for which the result is stored in the cache
so, on subsequent invocations (with the same arguments), the value in the
cache is returned without having to actually execute the function. In its
simplest form, the decorator/annotation declaration requires the name of
the cache associated with the annotated function:
@decorate cacheable(cache: Cache)
def get_account(id) do
# the logic for retrieving the account ...
end
In the snippet above, the function `get_account/1` is associated with the
cache named `Cache`. Each time the function is called, the cache is checked
to see whether the invocation has been already executed and does not have
to be repeated.
### Default Key Generation
Since caches are essentially key-value stores, each invocation of a cached
function needs to be translated into a suitable key for cache access.
Out of the box, the caching abstraction uses a simple key-generator
based on the following algorithm: `:erlang.phash2({module, func_name})`.
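A sketch of what that default key looks like (the module below is purely illustrative); note that the arguments are not part of the key, so all invocations of the same function share one cache entry:

```elixir
defmodule MyApp.Accounts do
  def get_account(_id), do: :account
end

# the default key the decorator would generate for get_account/1
key = :erlang.phash2({MyApp.Accounts, :get_account})
```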
### Custom Key Generation Declaration
Since caching is generic, it is quite likely the target functions have
various signatures that cannot be simply mapped on top of the cache
structure. This tends to become obvious when the target function has
multiple arguments out of which only some are suitable for caching
(while the rest are used only by the function logic). For example:
@decorate cacheable(cache: Cache)
def get_account(email, include_users?) do
# the logic for retrieving the account ...
end
At first glance, while the boolean argument influences the way the account
is found, it is of no use for the cache.
For such cases, the `cacheable` decorator allows the user to specify the
key explicitly based on the function attributes.
@decorate cacheable(cache: Cache, key: {Account, email})
def get_account(email, include_users?) do
# the logic for retrieving the account ...
end
@decorate cacheable(cache: Cache, key: {Account, user.account_id})
def get_user_account(%User{} = user) do
# the logic for retrieving the account ...
end
It is also possible to pass options to the cache, like so:
@decorate cacheable(cache: Cache, key: {Account, email}, opts: [ttl: 300_000])
def get_account(email, include_users?) do
# the logic for retrieving the account ...
end
See the **"Shared Options"** section below.
### Functions with multiple clauses
Since [decorator lib](https://github.com/arjan/decorator#functions-with-multiple-clauses)
is used, it is important to be aware of the recommendations, warnings,
limitations, and so on. In this case, for functions with multiple clauses
the general advice is to create an empty function head, and call the
decorator on that head, like so:
@decorate cacheable(cache: Cache, key: email)
def get_account(email \\ nil)
def get_account(nil), do: nil
def get_account(email) do
# the logic for retrieving the account ...
end
## `cache_put` decorator
For cases where the cache needs to be updated without interfering with the
function execution, one can use the `cache_put` decorator. That is, the
method will always be executed and its result placed into the cache
(according to the `cache_put` options). It supports the same options as
`cacheable`.
@decorate cache_put(cache: Cache, key: {Account, acct.email})
def update_account(%Account{} = acct, attrs) do
# the logic for updating the account ...
end
Note that using `cache_put` and `cacheable` annotations on the same function
is generally discouraged because they have different behaviors. While the
latter causes the method execution to be skipped by using the cache, the
former forces the execution in order to execute a cache update. This leads
to unexpected behavior and with the exception of specific corner-cases
(such as decorators having conditions that exclude them from each other),
such declarations should be avoided.
## `cache_evict` decorator
The cache abstraction allows not just the population of a cache store but
also eviction. This process is useful for removing stale or unused data from
the cache. Opposed to `cacheable`, the decorator `cache_evict` demarcates
functions that perform cache eviction, which are functions that act as
triggers for removing data from the cache. The `cache_evict` decorator not
only allows a key to be specified, but also a set of keys. Besides, extra
options like `all_entries` indicate whether a cache-wide eviction
needs to be performed rather than just a per-entry one (based on the key or
keys):
@decorate cache_evict(cache: Cache, key: {Account, email})
def delete_account_by_email(email) do
# the logic for deleting the account ...
end
@decorate cache_evict(cache: Cache, keys: [{Account, acct.id}, {Account, acct.email}])
def delete_account(%Account{} = acct) do
# the logic for deleting the account ...
end
@decorate cache_evict(cache: Cache, all_entries: true)
def delete_all_accounts do
# the logic for deleting all the accounts ...
end
The `all_entries: true` option comes in handy when an entire cache region
needs to be cleared out - rather than evicting each entry one by one (which
would be slow and inefficient), all the entries are removed in one operation
as shown above.
## Shared Options
All three cache annotations explained previously accept the following
options:
* `:cache` - Defines what cache to use (required). Raises `ArgumentError`
if the option is not present.
* `:key` - Defines the cache access key (optional). If this option
is not present, a default key is generated by hashing the tuple
`{module, fun_name}`; the first element is the caller module and the
second one the function name (`:erlang.phash2({module, fun})`).
* `:opts` - Defines the cache options that will be passed as argument
to the invoked cache function (optional).
* `:match` - Match function `(term -> boolean | {true, term})` (optional).
This function decides whether or not the code-block evaluation result is
cached. If it returns `true`, the result is cached as it is (the default).
If it returns `{true, value}`, then `value` is what gets cached (useful
to control exactly what to cache). Otherwise, nothing is stored in the
cache.
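A minimal sketch of a match function covering the three possible return shapes (the wrapped tuples are illustrative):

```elixir
match = fn
  {:ok, value} -> {true, value}   # cache only the unwrapped value
  {:error, _reason} -> false      # don't cache failures
  _other -> true                  # cache the result as it is
end
```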
## Putting all together
Supposing we are using `Ecto` and we want to define some cacheable functions
within the context `MyApp.Accounts`:
# The config
config :my_app, MyApp.Cache,
gc_interval: 86_400_000, #=> 1 day
backend: :shards
# The Cache
defmodule MyApp.Cache do
use Nebulex.Cache,
otp_app: :my_app,
adapter: Nebulex.Adapters.Local
end
# Some Ecto schema
defmodule MyApp.Accounts.User do
use Ecto.Schema
schema "users" do
field(:username, :string)
field(:password, :string)
field(:role, :string)
end
def changeset(user, attrs) do
user
|> cast(attrs, [:username, :password, :role])
|> validate_required([:username, :password, :role])
end
end
# Accounts context
defmodule MyApp.Accounts do
use Nebulex.Caching
alias MyApp.Accounts.User
alias MyApp.{Cache, Repo}
@ttl Nebulex.Time.expiry_time(1, :hour)
@decorate cacheable(cache: Cache, key: {User, id}, opts: [ttl: @ttl])
def get_user!(id) do
Repo.get!(User, id)
end
@decorate cacheable(cache: Cache, key: {User, username}, opts: [ttl: @ttl])
def get_user_by_username(username) do
Repo.get_by(User, [username: username])
end
@decorate cache_put(
cache: Cache,
keys: [{User, usr.id}, {User, usr.username}],
match: &match_update/1
)
def update_user(%User{} = usr, attrs) do
usr
|> User.changeset(attrs)
|> Repo.update()
end
defp match_update({:ok, usr}), do: {true, usr}
defp match_update({:error, _}), do: false
@decorate cache_evict(cache: Cache, keys: [{User, usr.id}, {User, usr.username}])
def delete_user(%User{} = usr) do
Repo.delete(usr)
end
def create_user(attrs \\ %{}) do
%User{}
|> User.changeset(attrs)
|> Repo.insert()
end
end
See [Cache Usage Patterns Guide](http://hexdocs.pm/nebulex/cache-usage-patterns.html).
"""
use Decorator.Define, cacheable: 1, cache_evict: 1, cache_put: 1
alias Nebulex.Caching
@doc """
Provides a way of annotating functions to be cached (cacheable aspect).
The value returned by the code block is cached if it doesn't already exist
in the cache; otherwise, it is returned directly from the cache and the
code block is not executed.
## Options
See the "Shared options" section at the module documentation.
## Examples
defmodule MyApp.Example do
use Nebulex.Caching
alias MyApp.Cache
@ttl Nebulex.Time.expiry_time(1, :hour)
@decorate cacheable(cache: Cache, key: name)
def get_by_name(name, age) do
# your logic (maybe the loader to retrieve the value from the SoR)
end
@decorate cacheable(cache: Cache, key: age, opts: [ttl: @ttl])
def get_by_age(age) do
# your logic (maybe the loader to retrieve the value from the SoR)
end
@decorate cacheable(cache: Cache, key: clauses, match: &match_fun/1)
def all(clauses) do
# your logic (maybe the loader to retrieve the value from the SoR)
end
defp match_fun([]), do: false
defp match_fun(_), do: true
end
The **Read-through** pattern is supported by this decorator. The loader to
retrieve the value from the system-of-record (SoR) is your function's logic
and the rest is provided by the macro under-the-hood.
"""
def cacheable(attrs, block, context) do
caching_action(:cacheable, attrs, block, context)
end
@doc """
Provides a way of annotating functions that update the cached key instead
of evicting it.
The content of the cache is updated without interfering with the function
execution. That is, the method would always be executed and the result
cached.
The difference between `cacheable/3` and `cache_put/3` is that `cacheable/3`
will skip running the function if the key exists in the cache, whereas
`cache_put/3` will actually run the function and then put the result in
the cache.
## Options
See the "Shared options" section at the module documentation.
## Examples
defmodule MyApp.Example do
use Nebulex.Caching
alias MyApp.Cache
@ttl Nebulex.Time.expiry_time(1, :hour)
@decorate cache_put(cache: Cache, key: id, opts: [ttl: @ttl])
def update!(id, attrs \\ %{}) do
# your logic (maybe write data to the SoR)
end
@decorate cache_put(cache: Cache, key: id, match: &match_fun/1, opts: [ttl: @ttl])
def update(id, attrs \\ %{}) do
# your logic (maybe write data to the SoR)
end
defp match_fun({:ok, updated}), do: {true, updated}
defp match_fun({:error, _}), do: false
end
The **Write-through** pattern is supported by this decorator. Your function
provides the logic to write data to the system-of-record (SoR) and the rest
is provided by the decorator under-the-hood.
"""
def cache_put(attrs, block, context) do
caching_action(:cache_put, attrs, block, context)
end
@doc """
Provides a way of annotating functions to be evicted (eviction aspect).
On function's completion, the given key or keys (depending on the `:key` and
`:keys` options) are deleted from the cache.
## Options
* `:keys` - Defines the set of keys to be evicted from cache on function
completion.
* `:all_entries` - Defines if all entries must be removed on function
completion. Defaults to `false`.
See the "Shared options" section at the module documentation.
## Examples
defmodule MyApp.Example do
use Nebulex.Caching
alias MyApp.Cache
@decorate cache_evict(cache: Cache, key: id)
def delete(id) do
# your logic (maybe write/delete data to the SoR)
end
@decorate cache_evict(cache: Cache, keys: [object.name, object.id])
def delete_object(object) do
# your logic (maybe write/delete data to the SoR)
end
@decorate cache_evict(cache: Cache, all_entries: true)
def delete_all do
# your logic (maybe write/delete data to the SoR)
end
end
The **Write-through** pattern is supported by this decorator. Your function
provides the logic to write data to the system-of-record (SoR) and the rest
is provided by the decorator under-the-hood. But in contrast with `update`
decorator, when the data is written to the SoR, the key for that value is
deleted from cache instead of updated.
"""
def cache_evict(attrs, block, context) do
caching_action(:cache_evict, attrs, block, context)
end
## Private Functions
defp caching_action(action, attrs, block, context) do
cache = attrs[:cache] || raise ArgumentError, "expected cache: to be given as argument"
key_var =
Keyword.get(
attrs,
:key,
quote(do: :erlang.phash2({unquote(context.module), unquote(context.name)}))
)
keys_var = Keyword.get(attrs, :keys, [])
match_var = Keyword.get(attrs, :match, quote(do: fn _ -> true end))
opts_var = Keyword.get(attrs, :opts, [])
action_logic = action_logic(action, block, attrs)
quote do
cache = unquote(cache)
key = unquote(key_var)
keys = unquote(keys_var)
opts = unquote(opts_var)
match = unquote(match_var)
unquote(action_logic)
end
end
defp action_logic(:cacheable, block, _attrs) do
quote do
if value = cache.get(key, opts) do
value
else
unquote(match_logic(block))
end
end
end
defp action_logic(:cache_put, block, _attrs) do
match_logic(block)
end
defp action_logic(:cache_evict, block, attrs) do
all_entries? = Keyword.get(attrs, :all_entries, false)
quote do
if unquote(all_entries?) do
cache.flush()
else
Enum.each([key | keys], fn k ->
if k, do: cache.delete(k)
end)
end
unquote(block)
end
end
defp match_logic(block) do
quote do
Caching.eval_match(unquote(block), match, cache, key, opts)
end
end
@doc """
This function is for internal purposes.
> **NOTE:** Workaround to avoid dialyzer warnings when using declarative
annotation-based caching via decorators.
"""
@spec eval_match(term, (term -> boolean | {true, term}), module, term, Keyword.t()) :: term
def eval_match(result, match, cache, key, opts) do
case match.(result) do
{true, value} ->
:ok = cache.put(key, value, opts)
result
true ->
:ok = cache.put(key, result, opts)
result
false ->
result
end
end
end
end
defmodule LexOffice do
@moduledoc """
Documentation for `LexOffice` which provides an API for lexoffice.de.
## Installation
This package can be installed by adding `lexoffice` to your list of dependencies in `mix.exs`:
```elixir
def deps do
[
{:lexoffice, "~> 0.1"}
]
end
```
## Configuration
Put the following lines into your `config.exs` or better, into your environment configuration files like `test.exs`, `dev.exs` or `prod.exs`.
```elixir
config :lexoffice, api_key: "<your api key>"
```
## WebHooks in Phoenix
Put the following lines in a file called `lexoffice_controller.ex` inside your controllers directory.
```elixir
defmodule YourAppWeb.LexOfficeController do
use LexOffice.PhoenixController
def handle_event(_org_id, "payment.changed", resource_id, _date) do
# `Documents` is a placeholder context from your own application
resource_id
|> Documents.get_by_lexoffice_id!()
|> Documents.update_document(%{status: :paid})
end
def handle_event(_org_id, type, _resource_id, _date) do
IO.puts "I do not handle LexOffice requests of type \#{type} yet."
end
end
```
Put the following lines into your `router.ex` and configure the WebHook in the lexoffice portal.
```elixir
post "/callbacks/lexoffice", YourAppWeb.LexOfficeController, :webhook
```
## Usage
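A quick sketch of calling the API (the invoice id is illustrative, and the
`{:error, info}` branch carries the failed `Tesla.Env`):

```elixir
case LexOffice.get_invoice("e9066f04-8cc7-4616-93f8-ac9ecc8479c8") do
  {:ok, invoice} -> invoice
  {:error, info} -> {:error, info}
end
```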
"""
import LexOffice.RequestBuilder
alias LexOffice.Connection
@doc """
Create a new contact.
## Parameters
- contact (LexOffice.Model.CreateContact): Contact data
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.ContactResponse{}}` on success
- `{:error, info}` on failure
"""
@spec create_contact(LexOffice.Model.CreateContact.t(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.ContactResponse.t()} | {:error, Tesla.Env.t()}
def create_contact(%LexOffice.Model.CreateContact{} = contact, client \\ Connection.new()) do
%{}
|> method(:post)
|> url("/v1/contacts")
|> add_param(:body, :body, contact)
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, %LexOffice.Model.ContactResponse{}},
{201, %LexOffice.Model.ContactResponse{}},
{400, %LexOffice.Model.ErrorListResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
end
@doc """
Create a new invoice.
## Parameters
- invoice (LexOffice.Model.CreateInvoice): Invoice data
- opts (KeywordList): [optional] Optional parameters
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.InvoiceResponse{}}` on success
- `{:error, info}` on failure
"""
@spec create_invoice(LexOffice.Model.CreateInvoice.t(), keyword(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.InvoiceResponse.t()} | {:error, Tesla.Env.t()}
def create_invoice(
%LexOffice.Model.CreateInvoice{} = invoice,
query \\ [],
client \\ Connection.new()
) do
optional_params = %{
:finalize => :query
}
%{}
|> method(:post)
|> url("/v1/invoices")
|> add_optional_params(optional_params, query)
|> add_param(:body, :body, invoice)
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{201, %LexOffice.Model.InvoiceResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
end
@doc """
Create a new credit-note.
## Parameters
- credit_note (LexOffice.Model.CreateCreditNote): Credit-Note data
- opts (KeywordList): [optional] Optional parameters
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.CreditNoteResponse{}}` on success
- `{:error, info}` on failure
"""
@spec create_credit_note(LexOffice.Model.CreateCreditNote.t(), keyword(), Tesla.Env.client()) ::
{:ok, LexOffice.Model.CreditNoteResponse.t()} | {:error, Tesla.Env.t()}
def create_credit_note(
%LexOffice.Model.CreateCreditNote{} = credit_note,
query \\ [],
client \\ Connection.new()
) do
optional_params = %{
:finalize => :query
}
%{}
|> method(:post)
|> url("/v1/credit-notes")
|> add_optional_params(optional_params, query)
|> add_param(:body, :body, credit_note)
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{201, %LexOffice.Model.CreditNoteResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
end
@doc """
Gets a single invoice.
## Parameters
- id (String): Invoice ID
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.InvoiceResponse{}}` on success
- `{:error, info}` on failure
"""
@spec get_invoice(String.t(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.InvoiceDetailsResponse.t()} | {:error, Tesla.Env.t()}
def get_invoice(id, client \\ Connection.new()) do
%{}
|> method(:get)
|> url("/v1/invoices/#{id}")
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, %LexOffice.Model.InvoiceDetailsResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
end
@doc """
Downloads a single invoice.
## Parameters
- id (String): Invoice ID
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.InvoiceResponse{}}` on success
- `{:error, info}` on failure
"""
@spec download_invoice(String.t(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.InvoiceResponse.t()} | {:error, Tesla.Env.t()}
def download_invoice(id, client \\ Connection.new()) do
file_id_response =
%{}
|> method(:get)
|> url("/v1/invoices/#{id}/document")
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, %LexOffice.Model.FileIdResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
case file_id_response do
{:ok, file_id_response} ->
%{}
|> method(:get)
|> url("/v1/files/#{file_id_response.documentFileId}")
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, :base64},
{400, %LexOffice.Model.ErrorResponse{}},
{404, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
other ->
other
end
end
@doc """
Downloads the document for a single credit-note.
## Parameters
- id (String): CreditNote ID
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.CreditNoteResponse{}}` on success
- `{:error, info}` on failure
"""
@spec get_credit_note(String.t(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.CreditNoteResponse.t()} | {:error, Tesla.Env.t()}
def get_credit_note(id, client \\ Connection.new()) do
file_id_response =
%{}
|> method(:get)
|> url("/v1/credit-notes/#{id}/document")
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, %LexOffice.Model.FileIdResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
case file_id_response do
{:ok, file_id_response} ->
%{}
|> method(:get)
|> url("/v1/files/#{file_id_response.documentFileId}")
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, :base64},
{400, %LexOffice.Model.ErrorResponse{}},
{404, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
other ->
other
end
end
@doc """
List vouchers.
## Parameters
- opts (KeywordList): [optional] Optional parameters
- :voucherType (String.t): Comma separated list of voucher-types, e.g. purchaseinvoice,invoice.
- :voucherStatus (String.t): Find vouchers by their voucherStatus, e.g. open.
- :page (integer()): The page to access.
- :size (integer()): The number of items to return per page.
- connection (LexOffice.Connection): [optional] Connection to server
## Returns
- `{:ok, %LexOffice.Model.VoucherListResponse{}}` on success
- `{:error, info}` on failure
"""
@spec list_vouchers(keyword(), Tesla.Env.client() | nil) ::
{:ok, LexOffice.Model.VoucherListResponse.t()} | {:error, Tesla.Env.t()}
def list_vouchers(opts \\ [], client \\ Connection.new()) do
optional_params = %{
:voucherType => :query,
:voucherStatus => :query,
:page => :query,
:size => :query
}
%{}
|> method(:get)
|> url("/v1/voucherlist")
|> add_optional_params(optional_params, opts)
|> Enum.into([])
|> (&Tesla.request(client, &1)).()
|> evaluate_response([
{200, %LexOffice.Model.VoucherListResponse{}},
{400, %LexOffice.Model.ErrorResponse{}},
{403, %LexOffice.Model.ErrorResponse{}},
{500, %LexOffice.Model.ErrorResponse{}}
])
end
end
defmodule PoxTool.Poxel do
defstruct [
front: [],
back: [],
left: [],
right: [],
bottom: [],
top: []
]
@type colour :: { red :: float, green :: float, blue :: float, alpha :: float }
@type material :: atom
@type depth_range :: { non_neg_integer, pos_integer | nil }
@type chunk :: { depth_range, colour, material }
@type row :: [chunk]
@type face :: [row]
@type t :: %__MODULE__{
front: face,
back: face,
left: face,
right: face,
bottom: face,
top: face
}
@type palette :: %{ colour => non_neg_integer }
@spec size(t) :: { width :: non_neg_integer, height :: non_neg_integer, depth :: non_neg_integer }
def size(%{ front: front, left: left }) do
{ w, h } = face_size(front)
{ d, ^h } = face_size(left)
{ w, h, d }
end
@spec face_size(face) :: { width :: non_neg_integer, height :: non_neg_integer }
def face_size([]), do: { 0, 0 }
def face_size(face = [row|_]), do: { Enum.count(row), Enum.count(face) }
@spec palette(t, map) :: palette
def palette(poxel, palette \\ %{}) do
poxel
|> Map.from_struct
|> Enum.reduce(%{}, fn { _, face }, palette ->
face_palette(face, palette)
end)
end
@spec face_palette(face, map) :: palette
def face_palette(face, palette \\ %{}) do
face_map(face, palette, &(&1), fn result, _ -> result end, &(&1), fn { _, colour, _ }, acc ->
Map.put_new(acc, colour, map_size(acc))
end)
end
@spec face_map(face, any, (acc :: any -> any), (result :: any, acc :: any -> any), (acc :: any -> any), (chunk, acc :: any -> any)) :: any
def face_map(face, acc, row_init, row_merge, seg_init, seg_map) do
Enum.reduce(face, acc, fn row, acc ->
Enum.reduce(row, row_init.(acc), fn segment, acc ->
row_merge.(Enum.reduce(segment, seg_init.(acc), seg_map), acc)
end)
end)
end
end
defmodule Level.DailyDigest do
@moduledoc """
Functions for generating the daily digest email.
"""
import Ecto.Query
alias Level.Digests
alias Level.Digests.FeedSection
alias Level.Digests.InboxSection
alias Level.Digests.Options
alias Level.Repo
alias Level.Schemas.Digest
alias Level.Schemas.DueDigest
alias Level.Schemas.SpaceUser
@doc """
Builds options to pass to the digest generator.
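For instance, for a digest generated "now", the window covers the previous
24 hours and the key embeds the local date. A stdlib sketch of the same
arithmetic (the implementation itself relies on `Timex` and, for the key,
SQL `to_char`):

```elixir
now = ~U[2024-05-01 16:00:00Z]
start_at = DateTime.add(now, -24 * 60 * 60, :second)
digest_key = "daily:" <> Date.to_iso8601(DateTime.to_date(now))
```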
"""
@spec digest_options(String.t(), DateTime.t(), String.t()) :: Options.t()
def digest_options(key, end_at, time_zone) do
%Options{
title: "Daily Summary",
subject: "Daily Summary",
key: key,
start_at: Timex.shift(end_at, hours: -24),
end_at: end_at,
time_zone: time_zone,
now: DateTime.utc_now()
}
end
@doc """
Fetches space user ids that are due to receive the daily digest
at the time the query is run.
"""
@spec due_query(DateTime.t(), integer()) :: Ecto.Query.t()
def due_query(now, hour_of_day \\ 16) do
inner_query =
from su in "space_users",
join: u in "users",
on: su.user_id == u.id,
join: s in "spaces",
on: su.space_id == s.id,
where: s.state == "ACTIVE",
where: su.is_digest_enabled == true,
where: su.state == "ACTIVE",
where: su.is_demo == false,
select: %DueDigest{
id: su.id,
space_id: su.space_id,
space_user_id: su.id,
hour: fragment("EXTRACT(HOUR FROM ? AT TIME ZONE ?)", ^now, u.time_zone),
digest_key:
fragment(
"concat('daily:', to_char(? AT TIME ZONE ?, 'yyyy-mm-dd'))",
^now,
u.time_zone
),
time_zone: u.time_zone
}
from r in subquery(inner_query),
left_join: d in Digest,
on: d.space_user_id == r.space_user_id and d.key == r.digest_key,
where: is_nil(d.id) and r.hour == ^hour_of_day,
select: %DueDigest{
id: fragment("?::text", r.id),
space_id: fragment("?::text", r.space_id),
space_user_id: fragment("?::text", r.space_user_id),
hour: r.hour,
digest_key: r.digest_key,
time_zone: r.time_zone
}
end
@doc """
Builds and sends all due digests.
"""
@spec build_and_send([DueDigest.t()], DateTime.t()) :: [
{:ok, Digest.t()}
| {:skip, DueDigest.t()}
| {:error, DueDigest.t()}
]
def build_and_send(results, now) do
Enum.map(results, fn result ->
space_user = Repo.get(SpaceUser, result.id)
opts = digest_options(result.digest_key, now, result.time_zone)
section_modules = filter_section_modules(space_user, opts)
if Enum.any?(section_modules) do
space_user
|> Digests.build(section_modules, opts)
|> send_after_build(result)
else
{:skip, result}
end
end)
end
def send_after_build({:ok, digest}, _) do
_ = Digests.send_email(digest)
{:ok, digest}
end
def send_after_build(_, result) do
{:error, result}
end
@doc """
Fetches sendables and processes them.
"""
@spec periodic_task(integer()) :: [
{:ok, Digest.t()}
| {:skip, DueDigest.t()}
| {:error, DueDigest.t()}
]
def periodic_task(hour_of_day \\ 16) do
now = DateTime.utc_now()
now
|> due_query(hour_of_day)
|> Repo.all()
|> build_and_send(now)
end
@doc """
Determines if the digest has enough interesting data to actually send.
"""
@spec send?(SpaceUser.t(), Options.t()) :: boolean()
def send?(space_user, opts) do
InboxSection.has_data?(space_user, opts) || FeedSection.has_data?(space_user, opts)
end
defp filter_section_modules(space_user, opts) do
Enum.filter([InboxSection, FeedSection], fn section ->
section.has_data?(space_user, opts)
end)
end
end
defmodule Dpos.Utils do
alias Salty.Sign.Ed25519
@doc """
Generates an Ed25519 keypair from the given secret.
"""
@type keypair() :: {priv_key :: binary(), pub_key :: binary()}
@spec generate_keypair(String.t()) :: keypair()
def generate_keypair(secret) when is_binary(secret) do
{:ok, pk, sk} =
:sha256
|> :crypto.hash(secret)
|> Ed25519.seed_keypair()
{sk, pk}
end
@doc """
Signs a message and returns the signature.
"""
@spec sign_message(String.t(), binary()) :: {:ok, binary()}
def sign_message(msg, priv_key)
when is_binary(msg) and is_binary(priv_key) and byte_size(priv_key) == 64 do
Ed25519.sign(msg, priv_key)
end
@doc """
Verifies a message.
"""
@spec verify_message(String.t(), binary(), binary()) :: :ok | {:error, term()}
def verify_message(msg, sig, pub_key)
when is_binary(msg) and is_binary(sig) and is_binary(pub_key) and byte_size(pub_key) == 32 do
Ed25519.verify_detached(sig, msg, pub_key)
end
@doc """
Derives a wallet address from the public key.
"""
@spec derive_address(binary(), String.t()) :: String.t()
def derive_address(pub_key, suffix)
when is_binary(pub_key) and byte_size(pub_key) == 32 and is_binary(suffix) do
hash = :crypto.hash(:sha256, pub_key)
<<head::bytes-size(8), _tail::bytes>> = hash
head
|> reverse_binary()
|> to_string()
|> Kernel.<>(suffix)
end
@doc """
Reverses the given binary (from little-endian to big-endian).
"""
@spec reverse_binary(binary()) :: Integer.t()
def reverse_binary(bin) when is_binary(bin) do
{int, ""} =
bin
|> :binary.decode_unsigned(:little)
|> :binary.encode_unsigned(:big)
|> Base.encode16()
|> Integer.parse(16)
int
end
@doc """
Encodes the binary in base 16.
Returns nil if binary is nil.
"""
@spec hexdigest(binary() | nil) :: String.t() | nil
def hexdigest(bin)
def hexdigest(nil), do: nil
def hexdigest(bin) when is_binary(bin), do: Base.encode16(bin, case: :lower)
@doc """
Converts the wallet address to binary.
"""
@spec address_to_binary(String.t(), pos_integer()) :: binary()
def address_to_binary(address, suffix_length)
def address_to_binary(nil, _suffix_length), do: :binary.copy(<<0>>, 8)
def address_to_binary(address, suffix_length)
when is_binary(address) and is_integer(suffix_length) do
len = String.length(address) - suffix_length
{int, ""} =
address
|> String.slice(0..(len - 1))
|> Integer.parse()
<<int::size(64)>>
end
@doc """
Ensures signature is a binary 64 bytes long.
Returns an empty binary if signature is nil.
"""
@spec signature_to_binary(binary()) :: binary()
def signature_to_binary(sig)
def signature_to_binary(nil), do: <<>>
def signature_to_binary(sig) when is_binary(sig), do: <<sig::bytes-size(64)>>
end | lib/utils.ex | 0.860017 | 0.408247 | utils.ex | starcoder |
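`derive_address/2` above hashes the public key and feeds the first 8 bytes through `reverse_binary/1`. That round-trip (decode little-endian, re-encode big-endian, hex-encode, parse) is equivalent to a single little-endian decode, as this self-contained sketch demonstrates (the input string is arbitrary):

```elixir
pub_key_hash = :crypto.hash(:sha256, "an arbitrary input")
<<head::bytes-size(8), _rest::bytes>> = pub_key_hash

# Single little-endian decode of the 8-byte prefix.
int = :binary.decode_unsigned(head, :little)

# The reverse_binary/1 round-trip from the module above.
round_trip =
  head
  |> :binary.decode_unsigned(:little)
  |> :binary.encode_unsigned(:big)
  |> Base.encode16()
  |> Integer.parse(16)
  |> elem(0)

int == round_trip
# => true
```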
defmodule Ockam.Messaging.Ordering.Strict.IndexPipe do
@moduledoc """
Strictly ordered pipe using indexing to enforce ordering
See `Ockam.Messaging.IndexPipe.Sender` and
`Ockam.Messaging.Ordering.Strict.IndexPipe.Receiver`
"""
@behaviour Ockam.Messaging.Pipe
@doc "Get sender module"
def sender() do
Ockam.Messaging.IndexPipe.Sender
end
@doc "Get receiver module"
def receiver() do
Ockam.Messaging.Ordering.Strict.IndexPipe.Receiver
end
end
defmodule Ockam.Messaging.Ordering.Strict.IndexPipe.Receiver do
@moduledoc """
Receiver side of strictly ordered pipe using indexing to enforce ordering
Maintains a sent message index and a send queue.
Receives wrapped messages from the sender
if the message index is current+1 - message is sent.
if the message index is lower than current+1 - message is ignored
if the message index is higher than current+1 - message is put in the send queue
When a message with the current+1 index is received, messages from the send queue are processed.
After sending a message, the current index is updated to the message index.
"""
use Ockam.Worker
alias Ockam.Messaging.IndexPipe.Wrapper
require Logger
@impl true
def handle_message(indexed_message, state) do
case Wrapper.unwrap_message(Ockam.Message.payload(indexed_message)) do
{:ok, index, message} ->
case compare_index(index, state) do
:low ->
Logger.warn("Ignoring message #{inspect(message)} with index: #{inspect(index)}")
{:ok, state}
:high ->
Logger.warn("Enqueue message #{inspect(message)} with index: #{inspect(index)}")
{:ok, enqueue_message(index, message, state)}
:next ->
{:ok, send_message(index, message, state)}
end
other ->
Logger.error(
"Unable to decode indexed message: #{inspect(indexed_message)}, reason: #{inspect(other)}"
)
{:error, :unable_to_decode_message}
end
end
defp compare_index(index, state) do
next_index = current_index(state) + 1
case index do
^next_index -> :next
high when high > next_index -> :high
low when low < next_index -> :low
end
end
def send_message(index, message, state) do
Ockam.Router.route(message)
state = Map.put(state, :current_index, index)
process_queue(state)
end
defp process_queue(state) do
next_index = current_index(state) + 1
queue = queue(state)
case Map.pop(queue, next_index) do
{nil, _queue} ->
state
{message, rest} ->
state = Map.put(state, :queue, rest)
send_message(next_index, message, state)
end
end
def enqueue_message(index, message, state) do
queue = queue(state)
case Map.get(queue, index) do
nil ->
:ok
val ->
Logger.debug(
"Duplicate message: #{inspect(message)} overrides #{inspect(val)} for index #{inspect(index)}"
)
end
Map.put(state, :queue, Map.put(queue, index, message))
end
defp queue(state) do
Map.get(state, :queue, %{})
end
defp current_index(state) do
Map.get(state, :current_index, 0)
end
end | implementations/elixir/ockam/ockam/lib/ockam/messaging/ordering/strict/index_pipe.ex | 0.801042 | 0.515864 | index_pipe.ex | starcoder |
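The receiver's buffering logic can be exercised in isolation. The sketch below mirrors `handle_message/2` and `process_queue/1` without the Ockam routing layer (`ReorderSketch` is a hypothetical stand-in that delivers into a list instead of routing):

```elixir
defmodule ReorderSketch do
  # state: %{current: last delivered index, queue: %{index => message}, delivered: [...]}
  def receive_msg({idx, msg}, %{current: cur} = state) when idx == cur + 1 do
    deliver(%{state | current: idx, delivered: state.delivered ++ [msg]})
  end

  # Stale index: ignore (mirrors the :low branch).
  def receive_msg({idx, _msg}, %{current: cur} = state) when idx <= cur, do: state

  # Future index: buffer (mirrors the :high branch).
  def receive_msg({idx, msg}, state), do: put_in(state.queue[idx], msg)

  # Drain the queue while the next index is present (mirrors process_queue/1).
  defp deliver(%{current: cur, queue: queue} = state) do
    case Map.pop(queue, cur + 1) do
      {nil, _} ->
        state

      {msg, rest} ->
        deliver(%{state | current: cur + 1, queue: rest, delivered: state.delivered ++ [msg]})
    end
  end
end

state = %{current: 0, queue: %{}, delivered: []}
final = Enum.reduce([{2, :b}, {3, :c}, {1, :a}], state, &ReorderSketch.receive_msg/2)
final.delivered
# => [:a, :b, :c]
```

Indices 2 and 3 are buffered until 1 arrives, after which the whole queue drains in order.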
defmodule Golos.SocialNetworkApi do
@moduledoc """
Contains all functions to call Golos social_network API methods
"""
def call(method, params) do
Golos.call(["social_network", method, params])
end
# CONTENT
@doc """
Returns content data, accepts author and permlink.
Example response:
```
%{"max_accepted_payout" => "1000000.000 GBG",
"title" => "[объявление] Краудсейл и Шэрдроп. Дистрибьюция",
"category" => "ru--kraudseijl", "promoted" => "0.000 GBG",
"last_update" => "2016-12-06T15:36:54", "created" => "2016-12-05T16:43:03",
"parent_permlink" => "ru--kraudseijl", "total_vote_weight" => 0,
"json_metadata" => "{"tags":["ru--kraudseijl","ru--shyerdrop","ru--golos"],"users":["golos","crowdsale","cyberdrop","misha","ether","bender","hipster","litvintech","vitaly-lvov"],"image":["https://dl.dropboxusercontent.com/u/52209381/golos/golos.png","https://dl.dropboxusercontent.com/u/52209381/golos/Screenshot%202016-12-05%2018.30.00.png","https://dl.dropboxusercontent.com/u/52209381/golos/ico_final-min.jpg","https://dl.dropboxusercontent.com/u/52209381/golos/Screenshot%202016-12-06%2002.25.05.png","https://dl.dropboxusercontent.com/u/52209381/golos/card.png"],"links":["https://docs.google.com/spreadsheets/d/1JwCAeRwsu4NzCG20UDM_CnEEsskl0wtvQ7VYjqi233A/edit?usp=sharing","https://golos.io/@litvintech"]}",
"last_payout" => "2017-01-15T11:00:06",
"total_payout_value" => "2412.784 GBG", "allow_replies" => true,
"children_rshares2" => "0", "id" => "2.8.30160",
"pending_payout_value" => "0.000 GBG", "children" => 15, "replies" => [],
"body" => "...",
"active" => "2016-12-06T22:23:06", "net_rshares" => 0,
"author_rewards" => 10011558, "total_pending_payout_value" => "0.000 GBG",
"root_comment" => "2.8.30160", "max_cashout_time" => "1969-12-31T23:59:59",
"root_title" => "[объявление] Краудсейл и Шэрдроп. Дистрибьюция",
"allow_votes" => true, "percent_steem_dollars" => 10000,
"children_abs_rshares" => 0, "net_votes" => 90, "author" => "litvintech",
"curator_payout_value" => "112.100 GBG",
"permlink" => "obyavlenie-kraudseil-i-sherdrop-distribyuciya",
"url" => "/ru--kraudseijl/@litvintech/obyavlenie-kraudseil-i-sherdrop-distribyuciya",
"cashout_time" => "2017-02-14T11:00:06", "parent_author" => "",
"allow_curation_rewards" => true, "vote_rshares" => 0,
"reward_weight" => 10000,
"active_votes" => [%{"percent" => 1000, "reputation" => "15928643268388",
"rshares" => "1974529666496", "time" => "2016-12-05T17:02:39",
"voter" => "val", "weight" => "99631990926249375"}, %{...}, ...], "depth" => 0,
"mode" => "second_payout", "abs_rshares" => 0,
"author_reputation" => "22784203010137"}
```
"""
@spec get_content(String.t(), String.t()) :: {:ok, map} | {:error, any}
def get_content(author, permlink) do
with {:ok, comment} <- call("get_content", [author, permlink]) do
cleaned =
comment
|> Golos.Cleaner.strip_token_names_and_parse(:float)
|> Golos.Cleaner.parse_json_strings(:json_metadata)
|> Golos.Cleaner.extract_fields()
|> Golos.Cleaner.prepare_tags()
|> Golos.Cleaner.parse_timedate_strings()
|> Golos.Cleaner.parse_empty_strings()
{:ok, cleaned}
else
err -> err
end
end
@doc """
Returns a list of replies to the given content, accepts author and permlink.
Example response:
```
[%{"max_accepted_payout" => "1000000.000 GBG", "title" => "",
"category" => "ru--kraudseijl", "promoted" => "0.000 GBG",
"last_update" => "2016-12-05T16:50:09",
"created" => "2016-12-05T16:50:09",
"parent_permlink" => "obyavlenie-kraudseil-i-sherdrop-distribyuciya",
"total_vote_weight" => 0,
"json_metadata" => "{\"tags\":[\"ru--kraudseijl\"]}",
"last_payout" => "2017-01-15T11:00:06",
"total_payout_value" => "12.892 GBG", "allow_replies" => true,
"children_rshares2" => "0", "id" => "2.8.30165",
"pending_payout_value" => "0.000 GBG", "children" => 1,
"replies" => [],
"body" => "И он сказал поехали...",
"active" => "2016-12-06T01:57:24", "net_rshares" => 0,
"author_rewards" => 53499,
"total_pending_payout_value" => "0.000 GBG",
"root_comment" => "2.8.30160",
"max_cashout_time" => "1969-12-31T23:59:59",
"root_title" => "[объявление] Краудсейл и Шэрдроп. Дистрибьюция",
"allow_votes" => true, "percent_steem_dollars" => 10000,
"children_abs_rshares" => 0, "net_votes" => 6,
"author" => "dmilash", "curator_payout_value" => "4.296 GBG",
"permlink" => "re-litvintech-obyavlenie-kraudseil-i-sherdrop-distribyuciya-20161205t165002890z",
"url" => "/ru--kraudseijl/@litvintech/obyavlenie-kraudseil-i-sherdrop-distribyuciya#@dmilash/re-litvintech-obyavlenie-kraudseil-i-sherdrop-distribyuciya-20161205t165002890z",
"cashout_time" => "1969-12-31T23:59:59",
"parent_author" => "litvintech",
"allow_curation_rewards" => true, "vote_rshares" => 0,
"reward_weight" => 10000, "active_votes" => [], "depth" => 1,
"mode" => "second_payout", "abs_rshares" => 0,
"author_reputation" => "37110534901202"},
%{...},
...]
```
"""
@spec get_content_replies(String.t(), String.t()) :: {:ok, map} | {:error, any}
def get_content_replies(author, permlink) do
call("get_content_replies", [author, permlink])
end
@doc """
Get state for the provided path.
Example result:
```
%{
"accounts" => ...,
"categories" => ...,
"category_idx" => ...,
"content" => ...,
"current_route" => ...,
"discussion_idx" => ...,
"error" => ...,
"feed_price" => ...,
"pow_queue" => ...,
"props" => ...,
"witness_schedule" => ...,
"witnesses" => ... }
```
"""
@spec get_state(String.t()) :: {:ok, map} | {:error, any}
def get_state(path) do
call("get_state", [path])
end
@doc """
Get categories. Accepts wanted metric, after_category, limit.
Example result:
```
%{
"accounts" => ...,
"categories" => ...,
"category_idx" => ...,
"discussion_idx" => ...,
"error" => ...,
"feed_price" => ...,
"pow_queue" => ...,
"props" => ...,
"witness_schedule" => ...,
"current_virtual_time" => ...,
"id" => ...,
"majority_version" => ...,
"median_props" => ...,
"next_shuffle_block_num" => ...,
"witnesses" => ... }
```
"""
@spec get_categories(atom, String.t(), integer) :: {:ok, [map]} | {:error, any}
def get_categories(metric, after_category, limit) do
method = "get_" <> Atom.to_string(metric) <> "_categories"
call(method, [after_category, limit])
end
@doc """
Gets current GBG to GOLOS conversion requests for given account.
Example result:
```
[%{"amount" => "100.000 GBG", "conversion_date" => "2017-02-17T18:59:42",
"id" => "2.15.696", "owner" => "ontofractal", "requestid" => 1486753166}]
```
"""
@spec get_conversion_requests(String.t()) :: {:ok, [map]} | {:error, any}
def get_conversion_requests(account) do
call("get_conversion_requests", [account])
end
@doc """
Returns past owner authorities that are valid for account recovery.
Doesn't seem to work at this moment.
"""
@spec get_owner_history(String.t()) :: {:ok, [map]} | {:error, any}
def get_owner_history(name) do
call("get_owner_history", [name])
end
@doc """
Get witnesses by ids
## Example response
```
[%{"created" => "2016-10-18T11:21:18",
"hardfork_time_vote" => "2016-10-18T11:00:00",
"hardfork_version_vote" => "0.0.0", "id" => "2.3.101",
"last_aslot" => 3323895, "last_confirmed_block_num" => 3318746,
"last_sbd_exchange_update" => "2017-02-09T06:10:33",
"last_work" => "0000000000000000000000000000000000000000000000000000000000000000",
"owner" => "hipster", "pow_worker" => 0,
"props" => %{"account_creation_fee" => "1.000 GOLOS",
"maximum_block_size" => 65536, "sbd_interest_rate" => 1000},
"running_version" => "0.14.2",
"sbd_exchange_rate" => %{"base" => "1.742 GBG",
"quote" => "1.000 GOLOS"},
"signing_key" => "<KEY>",
"total_missed" => 10,
"url" => "https://golos.io/ru--delegaty/@hipster/delegat-hipster",
"virtual_last_update" => "2363092957490310521961963807",
"virtual_position" => "186709431624610119071729411416709427966",
"virtual_scheduled_time" => "2363094451567901047152350987",
"votes" => "102787791122912956"},
%{...} ]
```
"""
@spec get_witnesses([String.t()]) :: {:ok, [map]} | {:error, any}
def get_witnesses(ids) do
call("get_witnesses", [ids])
end
@doc """
Get witnesses by votes. Example response is the same as get_witnesses.
"""
@spec get_witnesses_by_vote(integer, integer) :: {:ok, [map]} | {:error, any}
def get_witnesses_by_vote(from, limit) do
call("get_witnesses_by_vote", [from, limit])
end
@doc """
Lookup witness accounts
Example response:
```
["creator", "creatorgalaxy", "crypto", "cryptocat", "cyberfounder", "cybertech-01", "d00m", "dacom", "dance", "danet"]
```
"""
@spec lookup_witness_accounts(String.t(), integer) :: {:ok, [String.t()]} | {:error, any}
def lookup_witness_accounts(lower_bound_name, limit) do
call("lookup_witness_accounts", [lower_bound_name, limit])
end
@doc """
Get witness count
Example response: `997`
"""
@spec get_witness_count() :: {:ok, integer} | {:error, any}
def get_witness_count() do
call("get_witness_count", [])
end
@doc """
Get active witnesses
Example response:
```
["primus", "litvintech", "yaski", "serejandmyself", "dark.sun", "phenom",
"hipster", "gtx-1080-sc-0048", "lehard", "aleksandraz", "dr2073", "smailer",
"on0tole", "roelandp", "arcange", "testz", "vitaly-lvov", "xtar", "anyx",
"kuna", "creator"]
```
"""
@spec get_active_witnesses() :: {:ok, [String.t()]} | {:error, any}
def get_active_witnesses() do
call("get_active_witnesses", [])
end
@doc """
Get miner queue
Example response:
```
["gtx-1080-sc-0083", "gtx-1080-sc-0016", "gtx-1080-sc-0084", "gtx-1080-sc-0017",
"gtx-1080-sc-0085", "gtx-1080-sc-0018", "penguin-11", "gtx-1080-sc-0028",
"gtx-1080-sc-0023", "gtx-1080-sc-0080", ...]
```
"""
@spec get_miner_queue() :: {:ok, [String.t()]} | {:error, any}
def get_miner_queue() do
call("get_miner_queue", [])
end
@doc """
Get *all* account votes
Example response:
```
[%{"authorperm" => "rusldv/programmiruem-na-php-vvedenie", "percent" => 10000,
"rshares" => 130036223, "time" => "2017-01-26T20:06:03", "weight" => 0},
%{...}, ...]
```
"""
@spec get_account_votes(String.t()) :: {:ok, [map]} | {:error, any}
def get_account_votes(name) do
call("get_account_votes", [name])
end
@doc """
Get active votes on the given content. Accepts author and permlink.
Example response:
```
[%{"percent" => 6900, "reputation" => "28759071217014",
"rshares" => "18897453242648", "time" => "2017-01-27T09:20:21",
"voter" => "hipster", "weight" => "51460692508758354"},
%{...}, ...]
```
"""
@spec get_active_votes(String.t(), String.t()) :: {:ok, [map]} | {:error, any}
def get_active_votes(account, permlink) do
call("get_active_votes", [account, permlink])
end
end | lib/apis/social_network_api.ex | 0.619701 | 0.647213 | social_network_api.ex | starcoder |
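`get_categories/3` above derives the RPC method name from the metric atom. A minimal illustration (`:trending` is one plausible metric name, not confirmed by this file):

```elixir
metric = :trending
method = "get_" <> Atom.to_string(metric) <> "_categories"
# => "get_trending_categories"
```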
defmodule APIDoc.APIDocumenter do
@moduledoc ~S"""
API Documenter.
Documents the main entry of the API.
The following annotations can be set:
- `@api` (string) name of the API.
- `@vsn` (string) version of the API.
- `@moduledoc` (string) API description. Supports markdown.
- `@contact` (string) contact info.
The following macros can be used for documenting:
- `schema/2`, `schema/3`: Adds data schemas to the documentation.
- `server/1`, `server/2`: Add possible servers to the documentation.
"""
require Logger
alias APIDoc.Doc.Contact
alias APIDoc.Doc.Schema
alias APIDoc.Doc.Security
alias APIDoc.Doc.Server
alias Mix.Project
@doc @moduledoc
defmacro __using__(opts \\ []) do
quote do
Module.register_attribute(__MODULE__, :api, accumulate: false, persist: false)
Module.register_attribute(__MODULE__, :contact, accumulate: false, persist: false)
Module.register_attribute(__MODULE__, :security, accumulate: true, persist: false)
Module.register_attribute(__MODULE__, :server, accumulate: true, persist: false)
Module.register_attribute(__MODULE__, :schema, accumulate: true, persist: false)
require APIDoc.APIDocumenter
import APIDoc.APIDocumenter,
only: [server: 1, server: 2, schema: 2, schema: 3, security: 4, security: 5]
@before_compile APIDoc.APIDocumenter
@router unquote(opts[:router])
end
end
@doc false
defmacro __before_compile__(env) do
name = Module.get_attribute(env.module, :api) || "API Documentation"
version = Module.get_attribute(env.module, :vsn) || Project.config()[:version]
servers = Module.get_attribute(env.module, :server) || []
schemas = Module.get_attribute(env.module, :schema) || []
security = Module.get_attribute(env.module, :security) || []
contact =
with %{"email" => email, "name" => name} <-
Regex.named_captures(
~r/^(?'name'.*?)\ ?<(?'email'.*)>$/,
Module.get_attribute(env.module, :contact) || ""
) do
%Contact{email: email, name: name}
else
_ -> nil
end
# Perform validations
schemas |> Enum.each(&Schema.validate!/1)
quote do
@doc ~S"""
Format the document with a given formatter.
Uses `APIDoc.Format.OpenAPI3` by default.
"""
@spec format(atom) :: String.t()
def format(formatter \\ APIDoc.Format.OpenAPI3), do: formatter.format(__document__())
@doc false
@spec __document__ :: map
def __document__ do
%APIDoc.Doc.Document{
info: %APIDoc.Doc.Info{
name: unquote(name),
version: unquote(version),
description: @moduledoc,
contact: unquote(Macro.escape(contact))
},
servers: unquote(Macro.escape(servers)),
schemas: unquote(Macro.escape(schemas)),
security: unquote(Macro.escape(Enum.into(security, %{}))),
endpoints: @router.__api_doc__()
}
end
end
end
@doc ~S"""
Add server to documentation.
## Examples
Only url:
```
server "https://prod.example.com"
server "https://stage.example.com"
```
Url and description:
```
server "https://prod.example.com", "Production example server"
server "https://stage.example.com", "Staging example server"
```
"""
@spec server(String.t(), String.t() | nil) :: term
defmacro server(url, description \\ nil) do
quote do
@server %Server{
url: unquote(url),
description: unquote(description)
}
end
end
@doc ~S"""
Add security scheme to documentation.
```
"""
@spec security(atom, String.t(), Security.type(), Security.location(), String.t() | nil) :: term
defmacro security(id, name, type, location, description \\ nil) do
quote do
@security {unquote(id),
%Security{
name: unquote(Macro.expand(name, __CALLER__)),
type: unquote(type),
in: unquote(location),
description: unquote(description)
}}
end
end
@doc ~S"""
Add schema to documentation.
The `name` and `type` are always required.
For additional optional fields see: `APIDoc.Doc.Schema`.
## Examples
Just name and type:
```
schema Name, :string
schema Age, :integer
```
Additional options:
```
schema Name, :string,
example: "Bob"
schema Age, :integer,
format: :int32,
example: 34,
minimum: 1,
maximum: 150
```
"""
@spec schema(atom, Schema.type(), Keyword.t()) :: term
defmacro schema(name, type, opts \\ []) do
quote do
@schema %Schema{
name: unquote(Macro.expand(name, __CALLER__)),
type: unquote(type),
format: unquote(opts[:format]),
required: unquote(opts[:required]),
properties: unquote(opts[:properties]),
example: unquote(opts[:example]),
minimum: unquote(opts[:minimum]),
maximum: unquote(opts[:maximum]),
items: unquote(opts[:items])
}
end
end
end | lib/documenter/api.ex | 0.833257 | 0.672688 | api.ex | starcoder |
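The `@contact` attribute is parsed with the named-capture regex shown in `__before_compile__/1`. The same regex can be tried standalone (the contact string here is made up):

```elixir
contact = "Jane Doe <jane@example.com>"

%{"name" => name, "email" => email} =
  Regex.named_captures(~r/^(?'name'.*?)\ ?<(?'email'.*)>$/, contact)

{name, email}
# => {"Jane Doe", "jane@example.com"}
```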
defmodule ETag.Plug.Options do
@moduledoc """
Applies defaults and validates the given options for the plug. Allowed options are:
- `generator`
- `methods`
- `status_codes`
For details on their usage, values and defaults take a look at the `ETag.Plug` module.
"""
@defaults %{
generator: ETag.Generator.SHA1,
methods: ["GET"],
status_codes: [200]
}
@spec sanitize!(Keyword.t()) :: Keyword.t()
def sanitize!(opts) do
unless Keyword.keyword?(opts) do
raise ArgumentError,
"Expected to receive a Keyword list as " <>
"options but instead received: #{inspect(opts)}"
end
opts
|> with_default(:generator)
|> with_default(:methods)
|> with_default(:status_codes)
|> do_sanitize!()
end
@spec defaults() :: Keyword.t()
def defaults, do: unquote(Enum.to_list(@defaults))
@spec default(key :: atom()) :: any()
def default(key), do: @defaults[key]
defp with_default(opts, key) do
Keyword.put_new_lazy(opts, key, fn -> config(key) end)
end
defp config(key), do: Application.get_env(:etag_plug, key, default(key))
defp do_sanitize!(opts) do
opts
|> Keyword.update!(:generator, &validate_generator!/1)
|> Keyword.update!(:methods, &validate_and_uppercase_methods!/1)
|> Keyword.update!(:status_codes, &validate_status_codes!/1)
end
defp validate_generator!(generator) do
unless is_atom(generator) do
raise ArgumentError,
"Expected the generator to be a module but received: #{inspect(generator)}"
end
generator
end
defp validate_and_uppercase_methods!(methods) do
methods =
Enum.map(methods, fn
method when is_binary(method) ->
String.upcase(method)
method ->
raise ArgumentError,
"Expected the methods to be strings but received: #{inspect(method)}"
end)
with [] <- methods do
raise ArgumentError, "Received an empty list for `methods` which makes no sense!"
end
end
defp validate_status_codes!(status_codes) do
status_codes = Enum.map(status_codes, &Plug.Conn.Status.code/1)
with [] <- status_codes do
raise ArgumentError, "Received an empty list for `status_codes` which makes no sense!"
end
end
end | lib/etag/plug/options.ex | 0.828592 | 0.463323 | options.ex | starcoder |
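The sanitizer above merges user options over defaults with `Keyword.put_new_lazy/3` and uppercases the methods. A self-contained sketch of that merge (the generator module name is a placeholder atom and is never invoked):

```elixir
defaults = [generator: MyApp.FakeGenerator, methods: ["GET"], status_codes: [200]]
user_opts = [methods: ["get", "head"]]

opts =
  user_opts
  |> Keyword.put_new_lazy(:generator, fn -> defaults[:generator] end)
  |> Keyword.put_new_lazy(:status_codes, fn -> defaults[:status_codes] end)
  |> Keyword.update!(:methods, fn methods -> Enum.map(methods, &String.upcase/1) end)

Keyword.get(opts, :methods)
# => ["GET", "HEAD"]
```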
defmodule FastXlsxExporter do
@moduledoc """
# Fast XLSX Exporter
## Installation
Add `fast_xlsx_exporter` to your mix.ex deps:
```elixir
def deps do
[
{:fast_xlsx_exporter, "~> 0.2.2"}
]
end
```
## Explanation
[Elixlsx](https://github.com/xou/elixlsx) was fine, until really huge exports appeared. Then it took more and more time to generate xlsx reports. And RAM.
So, being really primitive (written in 8 hours one night, from scratch, knowing nothing about xlsx), this library does not store the document in memory. It writes straight to the file system.
Some example:
```elixir
rows = [[1, 2, 3, 10], [4, 5, 6], [7, 8, 9]]
context = FastXlsxExporter.initialize()
context = Enum.reduce(rows, context, &FastXlsxExporter.put_row/2)
{:ok, document} = FastXlsxExporter.finalize(context)
File.write("/home/george/failures.xlsx", document)
```
See? Really simple thing, nothing special.
If you're looking for something that really supports xlsx, go with [elixlsx](https://github.com/xou/elixlsx).
## Supported cell values
### Numbers
Both `float` and `integer` values are supported and special form of `{<float>, :percents}` to write number as xlsx percent.
Example row:
```elixir
[1, 12.5, {0.59, :percents}]
```
### Strings
Strings could be written in two ways.
First one is straight (no special form). In this case strings are written sequentially to **shared strings**, which is RAM-friendly but bloats resulting xlsx file.
Second one requires special form of `{<string>, :dictionary}`. In this case strings are put into dictionary and are put into **shared strings** only once, but are stored in memory, which is good for limited set of values but can cause `OOMKilled` if strings are *random*.
Example rows:
```elixir
# first row
["<NAME>", "<NAME>", "<NAME>"]
# second row
[{"some_string", :dictionary}, {"some_other_string", :dictionary}, {"some_string", :dictionary}]
# third row
["wow!", {"some_other_string", :dictionary}, "yay!"]
```
### Date and time
Both `%Date{}` and `%NaiveDateTime{}` are rendered as dates (not strings).
Example row:
```elixir
[~D[1905-12-11], ~D[2020-04-09], ~N[2020-04-09 12:00:00]]
```
"""
alias FastXlsxExporter.Sheet
@type context() :: {temp_dir_name :: binary(), Sheet.context()}
@doc """
Initializes export
Creates temporary export directory at `System.tmp_dir!()`, writes common files and content file header
"""
@spec initialize() :: context()
def initialize do
temp_name = "xlsx_#{:rand.uniform(1_000_000_000)}"
dir = Path.join(System.tmp_dir!(), temp_name)
File.rm_rf!(dir)
File.mkdir!(dir)
FastXlsxExporter.Sample.write(dir)
{:ok, sheet_context} = Sheet.initialize(dir)
{dir, sheet_context}
end
@doc """
Adds row to document
"""
@spec put_row(Sheet.row(), context()) :: context()
def put_row(row, {dir, sheet_context}) when is_list(row) do
{dir, Sheet.write_row(row, sheet_context)}
end
@doc """
Finalizes export and returns xlsx file binary
Removes temporary directory, closes file descriptors
"""
@spec finalize(context()) :: {:ok, content :: binary()} | {:error, term()}
def finalize({dir, sheet_context} = _context) do
Sheet.finalize(sheet_context)
archive_result =
:zip.create(to_charlist("file.xlsx"), list_files(dir), [:memory, cwd: to_charlist(dir)])
File.rm_rf!(dir)
case archive_result do
{:ok, {_filename, content}} -> {:ok, content}
error -> error
end
end
@doc """
Finalizes export and writes result to file
Removes temporary directory, closes file descriptors
"""
@spec finalize_to_file(context :: context(), :file.name()) :: :ok | {:error, term()}
def finalize_to_file({dir, sheet_context} = _context, filename) do
Sheet.finalize(sheet_context)
archive_result = :zip.create(filename, list_files(dir), cwd: to_charlist(dir))
File.rm_rf!(dir)
case archive_result do
{:ok, _} -> :ok
error -> error
end
end
defp list_files(path) do
path
|> Path.join("*")
|> Path.wildcard()
|> Enum.map(&String.replace_leading(&1, path, ""))
|> Enum.map(&String.replace_leading(&1, "/", ""))
|> Enum.map(&to_charlist/1)
end
end | lib/fast_xlsx_exporter.ex | 0.852966 | 0.886224 | fast_xlsx_exporter.ex | starcoder |
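`list_files/1` above strips the temporary-directory prefix so `:zip` stores entries with relative paths. A standalone sketch of that path rewrite (the file names are illustrative):

```elixir
dir = "/tmp/xlsx_123"

absolute = [
  "/tmp/xlsx_123/[Content_Types].xml",
  "/tmp/xlsx_123/xl/worksheets/sheet1.xml"
]

relative =
  absolute
  |> Enum.map(&String.replace_leading(&1, dir, ""))
  |> Enum.map(&String.replace_leading(&1, "/", ""))
  |> Enum.map(&to_charlist/1)

Enum.map(relative, &List.to_string/1)
# => ["[Content_Types].xml", "xl/worksheets/sheet1.xml"]
```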
defmodule Discogs.Models.Record do
@moduledoc """
Ecto model representing a record belonging to a Discogs release.
This is not a first-class object in Discogs's data model. What this allows us
to do is to join over from `User` -> `Release` -> `Record` and pick out a
sampling of _specific records_ in a user's collection at random (for example,
disc 2 of a 2xLP set).
See `Discogs.Tasks.shuffle_collection/2` for example usage.
"""
use Ecto.Schema
alias Discogs.Models.{Record, Release}
import Ecto.Changeset
schema "records" do
belongs_to(:release, Release)
has_many(:artists, through: [:release, :artists])
field(:disc_number, :integer, null: false)
timestamps()
end
@doc """
Validates the params and returns an Ecto changeset on success.
"""
@type params :: %{
optional(:disc_number) => String.t(),
optional(:release_id) => pos_integer
}
@spec changeset(%Record{}, params) :: Ecto.Changeset.t()
def changeset(record, params \\ %{}) do
record
|> cast(params, [:disc_number])
|> cast_assoc(:release)
|> validate_required(:disc_number)
|> assoc_constraint(:release)
|> unique_constraint(:disc_number,
name: :records_disc_number_release_id_index
)
end
@doc """
Formats the `Record` name for consumption by the `Discogs.Repo` (or
elsewhere).
If the record is part of a 2+xLP set, appends the disc number; else delegates
to the release name.
TODO: This should be done by implementing `String.Chars`.
cf. https://elixirschool.com/en/lessons/advanced/protocols/
This would allow us to:
1. simply call the stringified value of the release, rather than passing around
a formatter
2. consume the stringfied value of the record similarly elsewhere (i.e., via
simple interpolation without a method call).
This works similarly to defining `#to_s` in Ruby.
"""
@spec format_name(%Record{}, fun) :: String.t()
def format_name(
%Record{disc_number: disc_number, release: release},
formatter
) do
release_name = formatter.(release)
if length(release.records) > 1,
do: "#{release_name} (disc #{disc_number})",
else: release_name
end
end | lib/discogs/models/record.ex | 0.766643 | 0.514949 | record.ex | starcoder |
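The TODO in `format_name/2` suggests moving this to the `String.Chars` protocol. A minimal sketch on a hypothetical flat struct (not the real Ecto schema) shows what that would look like:

```elixir
defmodule SketchRecord do
  # Hypothetical flat struct; the real Record reaches these fields via its release.
  defstruct [:disc_number, :release_name, :disc_count]
end

defimpl String.Chars, for: SketchRecord do
  def to_string(%{disc_count: n, release_name: name, disc_number: d}) when n > 1 do
    "#{name} (disc #{d})"
  end

  def to_string(%{release_name: name}), do: name
end

"#{%SketchRecord{disc_number: 2, release_name: "Live", disc_count: 2}}"
# => "Live (disc 2)"
```

With the protocol in place, callers interpolate the record directly instead of passing a formatter around.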
defmodule Comet.TabWorker do
@moduledoc """
Worker macro module for use in your application.
This module will manage the life cycle and interactions with a browser
tab worker being managed by `:poolboy`.
This module cannot be used directly, it should be `use`d in your app:
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
end
In your app's config you will have to explicitly define the `module_worker`
to point to your module:
config :comet, :supervisor,
worker_module: MyApp.TabWorker
Please refer to the public functions documented below.
Each publicly documented function can be overriden in your
custom module.
## Worker Lifecycle Hooks
1. `init`
1. `before_launch/2`
1. `launch`
1. `before_navigate/2`
1. `Comet.navigate_to/2`
1. `after_navigate/2`
1. `after_launch/2`
1. `request/1`
1. `before_visit/2`
1. `visit/2`
1. `after_visit/2`
1. `after_request/1`
## Differences beteen `navigate` and `visit`
### Navigate
When launching a tab a URL is provided. This action is referred to a `navigate`
### Visit
The tab will keep your application running. You should use the `visit` functions to manage
how to route each request to the application.
"""
Module.add_doc(__MODULE__, 266, :def, {:after_launch, 2}, (quote do: [opts, state]), """
Run any code after the worker tab launches.
Default is `noop`. This function is intended to be overridden.
This lifecycle hook is run only once while the worker is in its `init/1` function. It is blocking.
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
def after_launch(_opts, state) do
# work
{:ok, state}
end
end
The return value should be `{:ok, state}`; if for any reason an error occurs, return `{:error, reason}`
""")
Module.add_doc(__MODULE__, 265, :def, {:after_navigate, 2}, (quote do: [opts, state]), """
Run any code after the tab navigation.
Default is `noop`. This function is intended to be overridden.
This lifecycle hook is run only once while the worker is in its `init/1` function. It is blocking.
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
def after_navigate(_opts, state) do
# work
{:ok, state}
end
end
The return value should be `{:ok, state}`; if for any reason an error occurs, return `{:error, reason}`
""")
Module.add_doc(__MODULE__, 336, :def, {:after_visit, 2}, (quote do: [response, state]), """
Run any code after the visit if changes to the response are desirable.
Default is `noop`. This function is intended to be overridden.
This lifecycle hook runs on every request, after `visit/2` returns. It is blocking.
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
def after_visit(%Comet.Response{} = response, state) do
# work
response
end
end
The return value **must** be a `Comet.Response` struct.
""")
Module.add_doc(__MODULE__, 242, :def, {:before_launch, 2}, (quote do: [opts, state]), """
Run any code before the worker tab launches.
Default is `noop`. This function is intended to be overridden.
This lifecycle hook is run only once while the worker is in its `init/1` function. It is blocking.
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
def before_launch(_opts, state) do
# work
{:ok, state}
end
end
The return value should be `{:ok, state}`; if for any reason an error occurs, return `{:error, reason}`
""")
Module.add_doc(__MODULE__, 264, :def, {:before_navigate, 2}, (quote do: [opts, state]), """
Run any code before the worker tab navigates.
Default is `noop`. This function is intended to be overridden.
This lifecycle hook is run only once while the worker is in its `init/1` function. It is blocking.
## Example
defmodule MyApp.TabWorker do
use Comet.TabWorker
def before_navigate(_opts, state) do
# work
{:ok, state}
end
end
The return value should be `{:ok, state}`; if for any reason an error occurs, return `{:error, reason}`
""")
Module.add_doc(__MODULE__, 335, :def, {:visit, 2}, (quote do: [opts, state]), """
Hook for triggering a visit action within your application.
By default this function returns `:not_implemented` and *must* be overridden.
It is recommended that you use `Comet.eval/2` for running the necessary
JavaScript in your application to trigger a visit. The visit action within your client application
should result in a promise. The promise itself should resolve to a JSON object with a `status` and `body` key:
## Example
def visit(path, %{pid: pid}) do
Comet.eval(pid, \"""
MyApp.visit(\#{path}).then((application) => {
return application.getResponse();
});
\""")
end
The default `:not_implemented` return value will result in the worker returning a `%{status: 501, body: "Not Implemented"}` response
that will be set into the `conn` of `Comet.Plug`.
""")
defmacro __using__([]) do
quote do
use GenServer
@init_timeout Application.get_env(:comet, :init_timeout, 2_000)
@resp_timeout Application.get_env(:comet, :resp_timeout, 1_000)
@pool :comet_pool
def start_link(opts) do
GenServer.start_link(__MODULE__, opts)
end
def init(:ignore), do: :ignore
def init(opts) do
with {:ok, state} <- before_launch(opts, %{}),
{:ok, state} <- launch(opts, state),
{:ok, state} <- after_launch(opts, state) do
{:ok, state}
else
{:error, reason} -> {:stop, reason}
unknown -> {:stop, {:unknown, unknown}}
end
end
def before_launch(_opts, state), do: {:ok, state}
def launch(opts, state) do
url = Keyword.get(opts, :launch_url)
timeout = Keyword.get(opts, :timeout, @init_timeout)
server = Comet.server(opts)
{tab, pid} = Comet.new_tab(server)
Comet.enable(pid)
state = Map.merge(state, %{server: server, tab: tab, pid: pid})
{:ok, state} = before_navigate(opts, state)
case Comet.navigate_to(pid, url, timeout) do
:ok -> after_navigate(opts, state)
{_error, reason} -> {:error, reason}
error -> {:error, error}
end
end
def before_navigate(_opts, state), do: {:ok, state}
def after_navigate(_opts, state), do: {:ok, state}
def after_launch(_opts, state), do: {:ok, state}
def handle_call(:pid, _from, %{pid: tab_pid} = state) do
{:reply, tab_pid, state}
end
defp render_template(title, body) do
"""
<!DOCTYPE html>
<html>
<head>
<title>#{title}</title>
</head>
<body>
#{body}
</body>
</html>
"""
end
def handle_call({:request, path}, _from, state) do
response =
path
|> before_visit(state)
|> visit(state)
|> case do
{:error, :not_implemented} ->
body = render_template("Not Implemented", """
To retrieve the response from your app you must
override <code>`Comet.Plug.visit/2`</code>.
""")
%{status: 501, body: body}
{:ok, response} -> response
{:reject, response} -> Map.put(response, :status, 500)
{:error, reason} -> %{status: 500, body: render_template("Internal Server Error", "Something went wrong: #{reason}")}
end
|> Comet.Response.normalize()
|> after_visit(state)
{:reply, response, state}
end
def handle_cast(:after_request, state) do
state = after_request(state)
:poolboy.checkin(@pool, self())
{:noreply, state}
end
def handle_info(_msg, state) do
{:noreply, state}
end
def terminate(reason, %{server: server, tab: tab, pid: pid}) do
Comet.close_tab(server, tab, pid)
{:stop, reason}
end
def terminate(reason, _state), do: {:stop, reason}
def before_visit(path, _state), do: path
def visit(_path, _state), do: {:error, :not_implemented}
def after_visit(%Comet.Response{} = response, _state), do: response
def after_request(state), do: state
defoverridable [
before_launch: 2,
before_navigate: 2,
after_navigate: 2,
after_launch: 2,
before_visit: 2,
visit: 2,
after_visit: 2,
after_request: 1
]
end
end
end | lib/comet/tab_worker.ex | 0.802246 | 0.478102 | tab_worker.ex | starcoder |
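The `init/1` callback above chains the lifecycle hooks with `with`, stopping at the first return that is not `{:ok, state}`. A minimal, self-contained sketch of that pattern, using plain anonymous functions in place of the overridable hooks:

```elixir
# Each hook takes (opts, state) and returns {:ok, state} or {:error, reason}.
before_launch = fn _opts, state -> {:ok, Map.put(state, :launched, false)} end
launch = fn _opts, state -> {:ok, %{state | launched: true}} end
after_launch = fn _opts, state -> {:ok, state} end

init = fn opts ->
  with {:ok, s} <- before_launch.(opts, %{}),
       {:ok, s} <- launch.(opts, s),
       {:ok, s} <- after_launch.(opts, s) do
    {:ok, s}
  else
    # Any non-matching return aborts the pipeline, mirroring {:stop, reason}.
    {:error, reason} -> {:stop, reason}
    unknown -> {:stop, {:unknown, unknown}}
  end
end

init.([])  # {:ok, %{launched: true}}
```

Because `with` falls through to `else` on the first clause that fails to match, a hook that returns `{:error, reason}` short-circuits the remaining hooks.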
defmodule CashAddr do
use Bitwise
@moduledoc ~S"""
Encode and decode the CashAddr format, with checksums.
"""
# Encoding character set. Maps data value -> char
for {encoding, value} <- Enum.with_index('qpzry9x8gf2tvdw0s3jn54khce6mua7l') do
defp do_encode32(unquote(value)), do: unquote(encoding)
defp do_decode32(unquote(encoding)), do: unquote(value)
end
defp do_decode32(_), do: nil
# Human-readable part and data part separator (':')
@separator 0x3A
# Generator coefficients
for {generator_value, index} <-
Enum.with_index([0x98F2BC8E61, 0x79B76D99E2, 0xF33E5FB3C4, 0xAE2EABE2A8, 0x1E4F43E470]) do
defp generator(unquote(index)), do: unquote(generator_value)
end
@uint5_max_value 31
# hash sizes
for {hash_size_value, index} <- Enum.with_index([160, 192, 224, 256, 320, 384, 448, 512]) do
defp hash_size(unquote(index)), do: unquote(hash_size_value)
defp encode_hash_size(unquote(hash_size_value)), do: unquote(index)
end
defp encode_hash_size(_), do: raise(ArgumentError)
@doc ~S"""
Encode a CashAddr string.
## Examples
iex> CashAddr.encode("prefix", "hełło")
"prefix:dpjutqk9sfhsx5tgjch6"
iex> CashAddr.encode("bitcoincash", <<0, 111, 75, 112, 94, 62, 4, 7, 191, 49, 89, 233, 196, 5, 13, 241, 183, 145, 210, 195, 246>>)
"bitcoincash:qph5kuz78czq00e3t85ugpgd7xmer5kr7c5f6jdpwk"
"""
@spec encode(String.t(), binary()) :: String.t()
def encode(hrp, data) when is_binary(data) do
do_encode(hrp, split(data, []))
end
defp do_encode(hrp, data) when is_list(data) do
checksummed = data ++ create_checksum(hrp, data)
dp = for i <- checksummed, into: "", do: <<do_encode32(i)>>
<<hrp::binary, @separator, dp::binary>>
end
defp split(data, acc) do
case data do
<<a::size(5), rest::bitstring>> ->
split(rest, acc ++ [a])
<<a::size(4)>> ->
acc ++ [a <<< 1]
<<a::size(3)>> ->
acc ++ [a <<< 2]
<<a::size(2)>> ->
acc ++ [a <<< 3]
<<a::size(1)>> ->
acc ++ [a <<< 4]
<<>> ->
acc
end
end
@doc ~S"""
Decode a CashAddr string.
## Examples
iex> CashAddr.decode("prefix:dpjutqk9sfhsx5tgjch6")
{:ok, {"prefix", "hełło"}}
iex> CashAddr.decode("bitcoincash:qph5kuz78czq00e3t85ugpgd7xmer5kr7c5f6jdpwk")
{:ok, {"bitcoincash",
<<0, 111, 75, 112, 94, 62, 4, 7, 191, 49, 89, 233, 196, 5, 13, 241, 183, 145, 210, 195, 246>>}}
"""
@spec decode(String.t()) :: {:ok, {String.t(), binary()}} | {:error, String.t()}
def decode(bech) do
with {_, false} <- {:mixed, String.downcase(bech) != bech && String.upcase(bech) != bech},
bech_charlist = :binary.bin_to_list(bech),
bech = String.downcase(bech),
len = Enum.count(bech_charlist),
pos =
Enum.find_index(Enum.reverse(bech_charlist), fn c ->
c == @separator
end),
{_, true} <- {:oor_sep, pos != nil},
pos = len - pos - 1,
{_, false} <- {:empty_hrp, pos < 1},
{_, false} <- {:short_cs, pos + 9 > len},
<<hrp::binary-size(pos), @separator, data::binary>> = bech,
data_charlist =
(for c <- :binary.bin_to_list(data) do
do_decode32(c)
end),
{_, false} <-
{:oor_data,
Enum.any?(
data_charlist,
&match?(nil, &1)
)},
{_, true} <- {:cs, verify_checksum(hrp, data_charlist)},
data_len = Enum.count(data_charlist),
data = Enum.slice(data_charlist, 0, data_len - 8) do
len_bits = (data_len - 8) * 5
bits = div(len_bits, 8) * 8
padding_length = rem(len_bits, 8)
<<decoded::bits-size(bits), _::size(padding_length)>> =
for d <- data, into: <<0::size(0)>>, do: <<d::size(5)>>
{:ok, {hrp, decoded}}
else
{:mixed, _} -> {:error, "Mixed case"}
{:oor_sep, _} -> {:error, "No separator character"}
{:empty_hrp, _} -> {:error, "Empty HRP"}
{:oor_data, _} -> {:error, "Invalid data"}
{:short_cs, _} -> {:error, "Too short checksum"}
{:cs, _} -> {:error, "Invalid checksum"}
_ -> {:error, "Unknown error"}
end
end
@doc ~S"""
Encodes a hash as a CashAddr payload.
## Examples
iex> CashAddr.encode_payload(0, <<118, 160, 64, 83, 189, 160, 168, 139, 218, 81, 119, 184, 106, 21, 195, 178, 159, 85, 152, 115>>)
<<0, 118, 160, 64, 83, 189, 160, 168, 139, 218, 81, 119, 184, 106, 21, 195, 178, 159, 85, 152, 115>>
"""
def encode_payload(type, hash)
when is_integer(type) and type >= 0 and type <= 15 and is_binary(hash) do
encoded_hash_size = encode_hash_size(byte_size(hash) * 8)
<<0::1, type::4, encoded_hash_size::3>> <> hash
end
@doc ~S"""
Decodes a CashAddr payload. Returns a tuple containing the address type, hash size, and hash.
## Examples
iex> CashAddr.decode_payload(<<0, 118, 160, 64, 83, 189, 160, 168, 139, 218, 81, 119, 184, 106, 21, 195, 178, 159, 85, 152, 115>>)
{:ok, {0, 160, <<118, 160, 64, 83, 189, 160, 168, 139, 218, 81, 119, 184, 106, 21, 195, 178, 159, 85, 152, 115>>}}
"""
def decode_payload(<<version_byte::binary-size(1), rest::binary>>) do
<<reserved_bit::1, type::4, encoded_hash_size::3>> = version_byte
with {_, true} <- {:invalid_reserved_bit, reserved_bit == 0},
hash_size <- hash_size(encoded_hash_size),
{_, true} <- {:invalid_hash_size, byte_size(rest) * 8 == hash_size} do
{:ok, {type, hash_size, rest}}
else
{:invalid_reserved_bit, _} -> {:error, "Invalid reserved bit in version byte"}
{:invalid_hash_size, _} -> {:error, "Invalid payload hash size"}
end
end
def decode_payload(<<>>), do: {:error, "No version byte"}
# Create a checksum.
defp create_checksum(hrp, data) do
payload = expand_hrp(hrp) ++ data
values = payload ++ [0, 0, 0, 0, 0, 0, 0, 0]
mod = polymod(values)
for p <- 0..7, do: mod >>> (5 * (7 - p)) &&& @uint5_max_value
end
# Verify a checksum.
defp verify_checksum(hrp, data) do
polymod(expand_hrp(hrp) ++ data) == 0
end
# Expand a HRP for use in checksum computation.
defp expand_hrp(hrp) do
hrp_charlist = :binary.bin_to_list(hrp)
b_values = for c <- hrp_charlist, do: c &&& @uint5_max_value
b_values ++ [0]
end
# Compute the checksum polymod of the 5-bit values, reducing the 40-bit state by the generator coefficients.
defp polymod(data) do
c =
Enum.reduce(data, 1, fn d, c ->
if d > @uint5_max_value or d < 0, do: raise(ArgumentError)
c0 = c >>> 35
c = ((c &&& 0x07FFFFFFFF) <<< 5) ^^^ d
Enum.reduce(for(i <- 0..4, do: i), c, fn i, c ->
g =
if (c0 >>> i &&& 1) != 0 do
generator(i)
else
0
end
c ^^^ g
end)
end)
c ^^^ 1
end
end
| lib/cashaddr.ex | 0.705582 | 0.52476 | cashaddr.ex | starcoder |
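The `split/2` helper above regroups an arbitrary binary into 5-bit values, left-shifting the final partial group so the zero padding ends up in the low bits. A self-contained re-implementation sketch of the same regrouping:

```elixir
defmodule FiveBit do
  import Bitwise

  # Consume 5 bits at a time; a trailing partial group of n bits is
  # left-shifted by (5 - n) so the zero padding sits in the low bits.
  def split(data), do: do_split(data, [])

  defp do_split(<<a::5, rest::bitstring>>, acc), do: do_split(rest, acc ++ [a])
  defp do_split(<<a::4>>, acc), do: acc ++ [a <<< 1]
  defp do_split(<<a::3>>, acc), do: acc ++ [a <<< 2]
  defp do_split(<<a::2>>, acc), do: acc ++ [a <<< 3]
  defp do_split(<<a::1>>, acc), do: acc ++ [a <<< 4]
  defp do_split(<<>>, acc), do: acc
end

# "h" is <<104>> = 01101_000: one full group (13), then 3 leftover bits.
FiveBit.split("h")  # [13, 0]
```

Decoding reverses this: `decode/1` above drops the padding bits by truncating the reassembled bitstring to a whole number of bytes.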
defprotocol Jason.Encoder do
@moduledoc """
Protocol controlling how a value is encoded to JSON.
## Deriving
The protocol allows leveraging the Elixir's `@derive` feature
to simplify protocol implementation in trivial cases. Accepted
options are:
* `:only` - encodes only values of specified keys.
* `:except` - encodes all struct fields except specified keys.
By default all keys except the `:__struct__` key are encoded.
## Example
Let's assume a presence of the following struct:
defmodule Test do
defstruct [:foo, :bar, :baz]
end
If we were to call `@derive Jason.Encoder` just before `defstruct`,
an implementation similar to the following implementation would be generated:
defimpl Jason.Encoder, for: Test do
def encode(value, opts) do
Jason.Encode.map(Map.take(value, [:foo, :bar, :baz]), opts)
end
end
If we called `@derive {Jason.Encoder, only: [:foo]}`, an implementation
similar to the following implementation would be generated:
defimpl Jason.Encoder, for: Test do
def encode(value, opts) do
Jason.Encode.map(Map.take(value, [:foo]), opts)
end
end
If we called `@derive {Jason.Encoder, except: [:foo]}`, an implementation
similar to the following implementation would be generated:
defimpl Jason.Encoder, for: Test do
def encode(value, opts) do
Jason.Encode.map(Map.take(value, [:bar, :baz]), opts)
end
end
The generated implementations are more efficient, computing some data at compile
time, similar to the macros from the `Jason.Helpers` module.
## Explicit implementation
If you wish to implement the protocol fully yourself, it is advised to
use functions from the `Jason.Encode` module to do the actual iodata
generation - they are highly optimized and verified to always produce
valid JSON.
"""
@type t :: term
@type opts :: Jason.Encode.opts()
@fallback_to_any true
@doc """
Encodes `value` to JSON.
The argument `opts` is opaque - it can be passed to various functions in
`Jason.Encode` (or to the protocol function itself) for encoding values to JSON.
"""
@spec encode(t, opts) :: iodata
def encode(value, opts)
end
defimpl Jason.Encoder, for: Any do
defmacro __deriving__(module, struct, opts) do
fields = fields_to_encode(struct, opts)
kv = Enum.map(fields, &{&1, generated_var(&1, __MODULE__)})
escape = quote(do: escape)
encode_map = quote(do: encode_map)
encode_args = [escape, encode_map]
kv_iodata = Jason.Codegen.build_kv_iodata(kv, encode_args)
quote do
defimpl Jason.Encoder, for: unquote(module) do
require Jason.Helpers
def encode(%{unquote_splicing(kv)}, {unquote(escape), unquote(encode_map)}) do
unquote(kv_iodata)
end
end
end
end
# The same as Macro.var/2 except it sets generated: true
defp generated_var(name, context) do
{name, [generated: true], context}
end
def encode(%_{} = struct, _opts) do
raise Protocol.UndefinedError,
protocol: @protocol,
value: struct,
description: """
Jason.Encoder protocol must always be explicitly implemented.
If you own the struct, you can derive the implementation specifying \
which fields should be encoded to JSON:
@derive {Jason.Encoder, only: [....]}
defstruct ...
It is also possible to encode all fields, although this should be \
used carefully to avoid accidentally leaking private information \
when new fields are added:
@derive Jason.Encoder
defstruct ...
Finally, if you don't own the struct you want to encode to JSON, \
you may use Protocol.derive/3 placed outside of any module:
Protocol.derive(Jason.Encoder, NameOfTheStruct, only: [...])
Protocol.derive(Jason.Encoder, NameOfTheStruct)
"""
end
def encode(value, _opts) do
raise Protocol.UndefinedError,
protocol: @protocol,
value: value,
description: "Jason.Encoder protocol must always be explicitly implemented"
end
defp fields_to_encode(struct, opts) do
cond do
only = Keyword.get(opts, :only) ->
only
except = Keyword.get(opts, :except) ->
Map.keys(struct) -- [:__struct__ | except]
true ->
Map.keys(struct) -- [:__struct__]
end
end
end
# The following implementations are formality - they are already covered
# by the main encoding mechanism in Jason.Encode, but exist mostly for
# documentation purposes and if anybody had the idea to call the protocol directly.
defimpl Jason.Encoder, for: Atom do
def encode(atom, opts) do
Jason.Encode.atom(atom, opts)
end
end
defimpl Jason.Encoder, for: Integer do
def encode(integer, _opts) do
Jason.Encode.integer(integer)
end
end
defimpl Jason.Encoder, for: Float do
def encode(float, _opts) do
Jason.Encode.float(float)
end
end
defimpl Jason.Encoder, for: List do
def encode(list, opts) do
Jason.Encode.list(list, opts)
end
end
defimpl Jason.Encoder, for: Map do
def encode(map, opts) do
Jason.Encode.map(map, opts)
end
end
defimpl Jason.Encoder, for: BitString do
def encode(binary, opts) when is_binary(binary) do
Jason.Encode.string(binary, opts)
end
def encode(bitstring, _opts) do
raise Protocol.UndefinedError,
protocol: @protocol,
value: bitstring,
description: "cannot encode a bitstring to JSON"
end
end
defimpl Jason.Encoder, for: [Date, Time, NaiveDateTime, DateTime] do
def encode(value, _opts) do
[?\", @for.to_iso8601(value), ?\"]
end
end
defimpl Jason.Encoder, for: Decimal do
def encode(value, _opts) do
# silence the xref warning
decimal = Decimal
[?\", decimal.to_string(value), ?\"]
end
end
defimpl Jason.Encoder, for: Jason.Fragment do
def encode(%{encode: encode}, opts) do
encode.(opts)
end
end
| lib/encoder.ex | 0.935883 | 0.755118 | encoder.ex | starcoder |
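The `@fallback_to_any true` attribute plus the `Any` implementation above is what lets Jason raise a helpful error for types with no implementation. The mechanism itself is plain protocol dispatch; a minimal self-contained sketch (the `Describable` protocol here is hypothetical, not part of Jason):

```elixir
defprotocol Describable do
  # With this flag, dispatch for a type with no impl falls back to Any.
  @fallback_to_any true
  def describe(value)
end

defimpl Describable, for: Integer do
  def describe(i), do: "integer #{i}"
end

defimpl Describable, for: Any do
  def describe(_), do: "no specific implementation"
end

Describable.describe(42)     # "integer 42"
Describable.describe(:atom)  # "no specific implementation"
```

Jason's `Any` implementation uses this fallback not to encode, but to raise `Protocol.UndefinedError` with guidance on deriving the protocol.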
defmodule DateTime do
@moduledoc """
A datetime implementation with a time zone.
This datetime can be seen as an ephemeral snapshot
of a datetime at a given time zone. For such purposes,
it also includes both UTC and Standard offsets, as
well as the zone abbreviation field used exclusively
for formatting purposes.
Remember, comparisons in Elixir using `==/2`, `>/2`, `</2` and friends
are structural and based on the DateTime struct fields. For proper
comparison between datetimes, use the `compare/2` function.
Developers should avoid creating the `DateTime` struct directly
and instead rely on the functions provided by this module as
well as the ones in third-party calendar libraries.
## Time zone database
Many functions in this module require a time zone database.
By default, it uses the default time zone database returned by
`Calendar.get_time_zone_database/0`, which defaults to
`Calendar.UTCOnlyTimeZoneDatabase` which only handles "Etc/UTC"
datetimes and returns `{:error, :utc_only_time_zone_database}`
for any other time zone.
Other time zone databases can also be configured. For example,
two of the available options are:
* [`tz`](https://hexdocs.pm/tz/)
* [`tzdata`](https://hexdocs.pm/tzdata/)
To use them, first make sure it is added as a dependency in `mix.exs`.
It can then be configured either via configuration:
config :elixir, :time_zone_database, Tzdata.TimeZoneDatabase
or by calling `Calendar.put_time_zone_database/1`:
Calendar.put_time_zone_database(Tzdata.TimeZoneDatabase)
See the proper names in the library installation instructions.
"""
@enforce_keys [:year, :month, :day, :hour, :minute, :second] ++
[:time_zone, :zone_abbr, :utc_offset, :std_offset]
defstruct [
:year,
:month,
:day,
:hour,
:minute,
:second,
:time_zone,
:zone_abbr,
:utc_offset,
:std_offset,
microsecond: {0, 0},
calendar: Calendar.ISO
]
@type t :: %__MODULE__{
year: Calendar.year(),
month: Calendar.month(),
day: Calendar.day(),
calendar: Calendar.calendar(),
hour: Calendar.hour(),
minute: Calendar.minute(),
second: Calendar.second(),
microsecond: Calendar.microsecond(),
time_zone: Calendar.time_zone(),
zone_abbr: Calendar.zone_abbr(),
utc_offset: Calendar.utc_offset(),
std_offset: Calendar.std_offset()
}
@unix_days :calendar.date_to_gregorian_days({1970, 1, 1})
@seconds_per_day 24 * 60 * 60
@doc """
Returns the current datetime in UTC.
## Examples
iex> datetime = DateTime.utc_now()
iex> datetime.time_zone
"Etc/UTC"
"""
@spec utc_now(Calendar.calendar()) :: t
def utc_now(calendar \\ Calendar.ISO) do
System.os_time() |> from_unix!(:native, calendar)
end
@doc """
Builds a datetime from date and time structs.
It expects a time zone to put the `DateTime` in.
If the time zone is not passed it will default to `"Etc/UTC"`,
which always succeeds. Otherwise, the `DateTime` is checked against the time zone database
given as `time_zone_database`. See the "Time zone database"
section in the module documentation.
## Examples
iex> DateTime.new(~D[2016-05-24], ~T[13:26:08.003], "Etc/UTC")
{:ok, ~U[2016-05-24 13:26:08.003Z]}
When the datetime is ambiguous - for instance during changing from summer
to winter time - the two possible valid datetimes are returned. First the one
that happens first, then the one that happens after.
iex> {:ambiguous, first_dt, second_dt} = DateTime.new(~D[2018-10-28], ~T[02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> first_dt
#DateTime<2018-10-28 02:30:00+02:00 CEST Europe/Copenhagen>
iex> second_dt
#DateTime<2018-10-28 02:30:00+01:00 CET Europe/Copenhagen>
When there is a gap in wall time - for instance in spring when the clocks are
turned forward - the latest valid datetime just before the gap and the first
valid datetime just after the gap.
iex> {:gap, just_before, just_after} = DateTime.new(~D[2019-03-31], ~T[02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> just_before
#DateTime<2019-03-31 01:59:59.999999+01:00 CET Europe/Copenhagen>
iex> just_after
#DateTime<2019-03-31 03:00:00+02:00 CEST Europe/Copenhagen>
Most of the time there is one, and just one, valid datetime for a certain
date and time in a certain time zone.
iex> {:ok, datetime} = DateTime.new(~D[2018-07-28], ~T[12:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> datetime
#DateTime<2018-07-28 12:30:00+02:00 CEST Europe/Copenhagen>
"""
@doc since: "1.11.0"
@spec new(Date.t(), Time.t(), Calendar.time_zone(), Calendar.time_zone_database()) ::
{:ok, t}
| {:ambiguous, t, t}
| {:gap, t, t}
| {:error,
:incompatible_calendars | :time_zone_not_found | :utc_only_time_zone_database}
def new(
date,
time,
time_zone \\ "Etc/UTC",
time_zone_database \\ Calendar.get_time_zone_database()
)
def new(%Date{calendar: calendar} = date, %Time{calendar: calendar} = time, "Etc/UTC", _db) do
%{year: year, month: month, day: day} = date
%{hour: hour, minute: minute, second: second, microsecond: microsecond} = time
datetime = %DateTime{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
std_offset: 0,
utc_offset: 0,
zone_abbr: "UTC",
time_zone: "Etc/UTC"
}
{:ok, datetime}
end
def new(date, time, time_zone, time_zone_database) do
with {:ok, naive_datetime} <- NaiveDateTime.new(date, time) do
from_naive(naive_datetime, time_zone, time_zone_database)
end
end
@doc """
Builds a datetime from date and time structs, raising on errors.
It expects a time zone to put the `DateTime` in.
If the time zone is not passed it will default to `"Etc/UTC"`,
which always succeeds. Otherwise, the DateTime is checked against the time zone database
given as `time_zone_database`. See the "Time zone database"
section in the module documentation.
## Examples
iex> DateTime.new!(~D[2016-05-24], ~T[13:26:08.003], "Etc/UTC")
~U[2016-05-24 13:26:08.003Z]
When the datetime is ambiguous - for instance during changing from summer
to winter time - an error will be raised.
iex> DateTime.new!(~D[2018-10-28], ~T[02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
** (ArgumentError) cannot build datetime with ~D[2018-10-28] and ~T[02:30:00] because such instant is ambiguous in time zone Europe/Copenhagen as there is an overlap between #DateTime<2018-10-28 02:30:00+02:00 CEST Europe/Copenhagen> and #DateTime<2018-10-28 02:30:00+01:00 CET Europe/Copenhagen>
When there is a gap in wall time - for instance in spring when the clocks are
turned forward - an error will be raised.
iex> DateTime.new!(~D[2019-03-31], ~T[02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
** (ArgumentError) cannot build datetime with ~D[2019-03-31] and ~T[02:30:00] because such instant does not exist in time zone Europe/Copenhagen as there is a gap between #DateTime<2019-03-31 01:59:59.999999+01:00 CET Europe/Copenhagen> and #DateTime<2019-03-31 03:00:00+02:00 CEST Europe/Copenhagen>
Most of the time there is one, and just one, valid datetime for a certain
date and time in a certain time zone.
iex> datetime = DateTime.new!(~D[2018-07-28], ~T[12:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> datetime
#DateTime<2018-07-28 12:30:00+02:00 CEST Europe/Copenhagen>
"""
@doc since: "1.11.0"
@spec new!(Date.t(), Time.t(), Calendar.time_zone(), Calendar.time_zone_database()) :: t
def new!(
date,
time,
time_zone \\ "Etc/UTC",
time_zone_database \\ Calendar.get_time_zone_database()
)
def new!(date, time, time_zone, time_zone_database) do
case new(date, time, time_zone, time_zone_database) do
{:ok, datetime} ->
datetime
{:ambiguous, dt1, dt2} ->
raise ArgumentError,
"cannot build datetime with #{inspect(date)} and #{inspect(time)} because such " <>
"instant is ambiguous in time zone #{time_zone} as there is an overlap " <>
"between #{inspect(dt1)} and #{inspect(dt2)}"
{:gap, dt1, dt2} ->
raise ArgumentError,
"cannot build datetime with #{inspect(date)} and #{inspect(time)} because such " <>
"instant does not exist in time zone #{time_zone} as there is a gap " <>
"between #{inspect(dt1)} and #{inspect(dt2)}"
{:error, reason} ->
raise ArgumentError,
"cannot build datetime with #{inspect(date)} and #{inspect(time)}, reason: #{inspect(reason)}"
end
end
@doc """
Converts the given Unix time to `DateTime`.
The integer can be given in different unit
according to `System.convert_time_unit/3` and it will
be converted to microseconds internally. Up to
253402300799 seconds is supported.
Unix times are always in UTC and therefore the DateTime
will be returned in UTC.
## Examples
iex> {:ok, datetime} = DateTime.from_unix(1_464_096_368)
iex> datetime
~U[2016-05-24 13:26:08Z]
iex> {:ok, datetime} = DateTime.from_unix(1_432_560_368_868_569, :microsecond)
iex> datetime
~U[2015-05-25 13:26:08.868569Z]
iex> {:ok, datetime} = DateTime.from_unix(253_402_300_799)
iex> datetime
~U[9999-12-31 23:59:59Z]
iex> {:error, :invalid_unix_time} = DateTime.from_unix(253_402_300_800)
The unit can also be an integer as in `t:System.time_unit/0`:
iex> {:ok, datetime} = DateTime.from_unix(143_256_036_886_856, 1024)
iex> datetime
~U[6403-03-17 07:05:22.320312Z]
Negative Unix times are supported up to -377705116800 seconds:
iex> {:ok, datetime} = DateTime.from_unix(-377_705_116_800)
iex> datetime
~U[-9999-01-01 00:00:00Z]
iex> {:error, :invalid_unix_time} = DateTime.from_unix(-377_705_116_801)
"""
@spec from_unix(integer, :native | System.time_unit(), Calendar.calendar()) ::
{:ok, t} | {:error, atom}
def from_unix(integer, unit \\ :second, calendar \\ Calendar.ISO) when is_integer(integer) do
case Calendar.ISO.from_unix(integer, unit) do
{:ok, {year, month, day}, {hour, minute, second}, microsecond} ->
iso_datetime = %DateTime{
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
std_offset: 0,
utc_offset: 0,
zone_abbr: "UTC",
time_zone: "Etc/UTC"
}
convert(iso_datetime, calendar)
{:error, _} = error ->
error
end
end
@doc """
Converts the given Unix time to `DateTime`.
The integer can be given in different unit
according to `System.convert_time_unit/3` and it will
be converted to microseconds internally.
Unix times are always in UTC and therefore the DateTime
will be returned in UTC.
## Examples
# An easy way to get the Unix epoch is passing 0 to this function
iex> DateTime.from_unix!(0)
~U[1970-01-01 00:00:00Z]
iex> DateTime.from_unix!(1_464_096_368)
~U[2016-05-24 13:26:08Z]
iex> DateTime.from_unix!(1_432_560_368_868_569, :microsecond)
~U[2015-05-25 13:26:08.868569Z]
iex> DateTime.from_unix!(143_256_036_886_856, 1024)
~U[6403-03-17 07:05:22.320312Z]
"""
@spec from_unix!(integer, :native | System.time_unit(), Calendar.calendar()) :: t
def from_unix!(integer, unit \\ :second, calendar \\ Calendar.ISO) do
case from_unix(integer, unit, calendar) do
{:ok, datetime} ->
datetime
{:error, :invalid_unix_time} ->
raise ArgumentError, "invalid Unix time #{integer}"
end
end
@doc """
Converts the given `NaiveDateTime` to `DateTime`.
It expects a time zone to put the `NaiveDateTime` in.
If the time zone is "Etc/UTC", it always succeeds. Otherwise,
the NaiveDateTime is checked against the time zone database
given as `time_zone_database`. See the "Time zone database"
section in the module documentation.
## Examples
iex> DateTime.from_naive(~N[2016-05-24 13:26:08.003], "Etc/UTC")
{:ok, ~U[2016-05-24 13:26:08.003Z]}
When the datetime is ambiguous - for instance during changing from summer
to winter time - the two possible valid datetimes are returned. First the one
that happens first, then the one that happens after.
iex> {:ambiguous, first_dt, second_dt} = DateTime.from_naive(~N[2018-10-28 02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> first_dt
#DateTime<2018-10-28 02:30:00+02:00 CEST Europe/Copenhagen>
iex> second_dt
#DateTime<2018-10-28 02:30:00+01:00 CET Europe/Copenhagen>
When there is a gap in wall time - for instance in spring when the clocks are
turned forward - the latest valid datetime just before the gap and the first
valid datetime just after the gap.
iex> {:gap, just_before, just_after} = DateTime.from_naive(~N[2019-03-31 02:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> just_before
#DateTime<2019-03-31 01:59:59.999999+01:00 CET Europe/Copenhagen>
iex> just_after
#DateTime<2019-03-31 03:00:00+02:00 CEST Europe/Copenhagen>
Most of the time there is one, and just one, valid datetime for a certain
date and time in a certain time zone.
iex> {:ok, datetime} = DateTime.from_naive(~N[2018-07-28 12:30:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> datetime
#DateTime<2018-07-28 12:30:00+02:00 CEST Europe/Copenhagen>
This function accepts any map or struct that contains at least the same fields as a `NaiveDateTime`
struct. The most common example of that is a `DateTime`. In this case the information about the time
zone of that `DateTime` is completely ignored. This is the same principle as passing a `DateTime` to
`Date.to_iso8601/2`. `Date.to_iso8601/2` extracts only the date-specific fields (calendar, year,
month and day) of the given structure and ignores all others.
This way if you have a `DateTime` in one time zone, you can get the same wall time in another time zone.
For instance if you have 2018-08-24 10:00:00 in Copenhagen and want a `DateTime` for 2018-08-24 10:00:00
in UTC you can do:
iex> cph_datetime = DateTime.from_naive!(~N[2018-08-24 10:00:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> {:ok, utc_datetime} = DateTime.from_naive(cph_datetime, "Etc/UTC", FakeTimeZoneDatabase)
iex> utc_datetime
~U[2018-08-24 10:00:00Z]
If instead you want a `DateTime` for the same point time in a different time zone see the
`DateTime.shift_zone/3` function which would convert 2018-08-24 10:00:00 in Copenhagen
to 2018-08-24 08:00:00 in UTC.
"""
@doc since: "1.4.0"
@spec from_naive(
Calendar.naive_datetime(),
Calendar.time_zone(),
Calendar.time_zone_database()
) ::
{:ok, t}
| {:ambiguous, t, t}
| {:gap, t, t}
| {:error,
:incompatible_calendars | :time_zone_not_found | :utc_only_time_zone_database}
def from_naive(
naive_datetime,
time_zone,
time_zone_database \\ Calendar.get_time_zone_database()
)
def from_naive(naive_datetime, "Etc/UTC", _) do
utc_period = %{std_offset: 0, utc_offset: 0, zone_abbr: "UTC"}
{:ok, from_naive_with_period(naive_datetime, "Etc/UTC", utc_period)}
end
def from_naive(%{calendar: Calendar.ISO} = naive_datetime, time_zone, time_zone_database) do
case time_zone_database.time_zone_periods_from_wall_datetime(naive_datetime, time_zone) do
{:ok, period} ->
{:ok, from_naive_with_period(naive_datetime, time_zone, period)}
{:ambiguous, first_period, second_period} ->
first_datetime = from_naive_with_period(naive_datetime, time_zone, first_period)
second_datetime = from_naive_with_period(naive_datetime, time_zone, second_period)
{:ambiguous, first_datetime, second_datetime}
{:gap, {first_period, first_period_until_wall}, {second_period, second_period_from_wall}} ->
# `until_wall` is not valid, but any time just before is.
# So by subtracting a second and adding .999999 seconds
# we get the last microsecond just before.
before_naive =
first_period_until_wall
|> Map.put(:microsecond, {999_999, 6})
|> NaiveDateTime.add(-1)
after_naive = second_period_from_wall
latest_datetime_before = from_naive_with_period(before_naive, time_zone, first_period)
first_datetime_after = from_naive_with_period(after_naive, time_zone, second_period)
{:gap, latest_datetime_before, first_datetime_after}
{:error, _} = error ->
error
end
end
def from_naive(%{calendar: calendar} = naive_datetime, time_zone, time_zone_database)
when calendar != Calendar.ISO do
# For non-ISO calendars, convert to ISO, create ISO DateTime, and then
# convert to original calendar
iso_result =
with {:ok, in_iso} <- NaiveDateTime.convert(naive_datetime, Calendar.ISO) do
from_naive(in_iso, time_zone, time_zone_database)
end
case iso_result do
{:ok, dt} ->
convert(dt, calendar)
{:ambiguous, dt1, dt2} ->
with {:ok, dt1converted} <- convert(dt1, calendar),
{:ok, dt2converted} <- convert(dt2, calendar),
do: {:ambiguous, dt1converted, dt2converted}
{:gap, dt1, dt2} ->
with {:ok, dt1converted} <- convert(dt1, calendar),
{:ok, dt2converted} <- convert(dt2, calendar),
do: {:gap, dt1converted, dt2converted}
{:error, _} = error ->
error
end
end
defp from_naive_with_period(naive_datetime, time_zone, period) do
%{std_offset: std_offset, utc_offset: utc_offset, zone_abbr: zone_abbr} = period
%{
calendar: calendar,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
year: year,
month: month,
day: day
} = naive_datetime
%DateTime{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
std_offset: std_offset,
utc_offset: utc_offset,
zone_abbr: zone_abbr,
time_zone: time_zone
}
end
@doc """
Converts the given `NaiveDateTime` to `DateTime`.
It expects a time zone to put the NaiveDateTime in.
If the time zone is "Etc/UTC", it always succeeds. Otherwise,
the NaiveDateTime is checked against the time zone database
given as `time_zone_database`. See the "Time zone database"
section in the module documentation.
## Examples
iex> DateTime.from_naive!(~N[2016-05-24 13:26:08.003], "Etc/UTC")
~U[2016-05-24 13:26:08.003Z]
iex> DateTime.from_naive!(~N[2018-05-24 13:26:08.003], "Europe/Copenhagen", FakeTimeZoneDatabase)
#DateTime<2018-05-24 13:26:08.003+02:00 CEST Europe/Copenhagen>
"""
@doc since: "1.4.0"
@spec from_naive!(
NaiveDateTime.t(),
Calendar.time_zone(),
Calendar.time_zone_database()
) :: t
def from_naive!(
naive_datetime,
time_zone,
time_zone_database \\ Calendar.get_time_zone_database()
) do
case from_naive(naive_datetime, time_zone, time_zone_database) do
{:ok, datetime} ->
datetime
{:ambiguous, dt1, dt2} ->
raise ArgumentError,
"cannot convert #{inspect(naive_datetime)} to datetime because such " <>
"instant is ambiguous in time zone #{time_zone} as there is an overlap " <>
"between #{inspect(dt1)} and #{inspect(dt2)}"
{:gap, dt1, dt2} ->
raise ArgumentError,
"cannot convert #{inspect(naive_datetime)} to datetime because such " <>
"instant does not exist in time zone #{time_zone} as there is a gap " <>
"between #{inspect(dt1)} and #{inspect(dt2)}"
{:error, reason} ->
raise ArgumentError,
"cannot convert #{inspect(naive_datetime)} to datetime, reason: #{inspect(reason)}"
end
end
@doc """
Changes the time zone of a `DateTime`.
Returns a `DateTime` for the same point in time, but instead at
the time zone provided. It assumes that `DateTime` is valid and
exists in the given time zone and calendar.
By default, it uses the default time zone database returned by
`Calendar.get_time_zone_database/0`, which defaults to
`Calendar.UTCOnlyTimeZoneDatabase` which only handles "Etc/UTC" datetimes.
Other time zone databases can be passed as argument or set globally.
See the "Time zone database" section in the module docs.
## Examples
iex> {:ok, pacific_datetime} = DateTime.shift_zone(~U[2018-07-16 10:00:00Z], "America/Los_Angeles", FakeTimeZoneDatabase)
iex> pacific_datetime
#DateTime<2018-07-16 03:00:00-07:00 PDT America/Los_Angeles>
iex> DateTime.shift_zone(~U[2018-07-16 10:00:00Z], "bad timezone", FakeTimeZoneDatabase)
{:error, :time_zone_not_found}
"""
@doc since: "1.8.0"
@spec shift_zone(t, Calendar.time_zone(), Calendar.time_zone_database()) ::
{:ok, t} | {:error, :time_zone_not_found | :utc_only_time_zone_database}
def shift_zone(datetime, time_zone, time_zone_database \\ Calendar.get_time_zone_database())
def shift_zone(%{time_zone: time_zone} = datetime, time_zone, _) do
{:ok, datetime}
end
def shift_zone(datetime, time_zone, time_zone_database) do
%{
std_offset: std_offset,
utc_offset: utc_offset,
calendar: calendar,
microsecond: {_, precision}
} = datetime
datetime
|> to_iso_days()
|> apply_tz_offset(utc_offset + std_offset)
|> shift_zone_for_iso_days_utc(calendar, precision, time_zone, time_zone_database)
end
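# Illustrative usage of `shift_zone/3` (a sketch; assumes the third-party
# `tzdata` package is installed and configured as the default database):
#
#     Calendar.put_time_zone_database(Tzdata.TimeZoneDatabase)
#     DateTime.shift_zone(~U[2018-07-16 10:00:00Z], "America/New_York")
#     #=> {:ok, #DateTime<2018-07-16 06:00:00-04:00 EDT America/New_York>}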
defp shift_zone_for_iso_days_utc(iso_days_utc, calendar, precision, time_zone, time_zone_db) do
case time_zone_db.time_zone_period_from_utc_iso_days(iso_days_utc, time_zone) do
{:ok, %{std_offset: std_offset, utc_offset: utc_offset, zone_abbr: zone_abbr}} ->
{year, month, day, hour, minute, second, {microsecond_without_precision, _}} =
iso_days_utc
|> apply_tz_offset(-(utc_offset + std_offset))
|> calendar.naive_datetime_from_iso_days()
datetime = %DateTime{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: {microsecond_without_precision, precision},
std_offset: std_offset,
utc_offset: utc_offset,
zone_abbr: zone_abbr,
time_zone: time_zone
}
{:ok, datetime}
{:error, _} = error ->
error
end
end
@doc """
Changes the time zone of a `DateTime` or raises on errors.
See `shift_zone/3` for more information.
## Examples
iex> DateTime.shift_zone!(~U[2018-07-16 10:00:00Z], "America/Los_Angeles", FakeTimeZoneDatabase)
#DateTime<2018-07-16 03:00:00-07:00 PDT America/Los_Angeles>
iex> DateTime.shift_zone!(~U[2018-07-16 10:00:00Z], "bad timezone", FakeTimeZoneDatabase)
** (ArgumentError) cannot shift ~U[2018-07-16 10:00:00Z] to "bad timezone" time zone, reason: :time_zone_not_found
"""
@doc since: "1.10.0"
@spec shift_zone!(t, Calendar.time_zone(), Calendar.time_zone_database()) :: t
def shift_zone!(datetime, time_zone, time_zone_database \\ Calendar.get_time_zone_database()) do
case shift_zone(datetime, time_zone, time_zone_database) do
{:ok, datetime} ->
datetime
{:error, reason} ->
raise ArgumentError,
"cannot shift #{inspect(datetime)} to #{inspect(time_zone)} time zone" <>
", reason: #{inspect(reason)}"
end
end
@doc """
Returns the current datetime in the provided time zone.
By default, it uses the default time zone database returned by
`Calendar.get_time_zone_database/0`, which defaults to
`Calendar.UTCOnlyTimeZoneDatabase` which only handles "Etc/UTC" datetimes.
Other time zone databases can be passed as argument or set globally.
See the "Time zone database" section in the module docs.
## Examples
iex> {:ok, datetime} = DateTime.now("Etc/UTC")
iex> datetime.time_zone
"Etc/UTC"
iex> DateTime.now("Europe/Copenhagen")
{:error, :utc_only_time_zone_database}
iex> DateTime.now("bad timezone", FakeTimeZoneDatabase)
{:error, :time_zone_not_found}
"""
@doc since: "1.8.0"
@spec now(Calendar.time_zone(), Calendar.time_zone_database()) ::
{:ok, t} | {:error, :time_zone_not_found | :utc_only_time_zone_database}
def now(time_zone, time_zone_database \\ Calendar.get_time_zone_database())
def now("Etc/UTC", _) do
{:ok, utc_now()}
end
def now(time_zone, time_zone_database) do
shift_zone(utc_now(), time_zone, time_zone_database)
end
@doc """
Returns the current datetime in the provided time zone or raises on errors.
See `now/2` for more information.
## Examples
iex> datetime = DateTime.now!("Etc/UTC")
iex> datetime.time_zone
"Etc/UTC"
iex> DateTime.now!("Europe/Copenhagen")
** (ArgumentError) cannot get current datetime in "Europe/Copenhagen" time zone, reason: :utc_only_time_zone_database
iex> DateTime.now!("bad timezone", FakeTimeZoneDatabase)
** (ArgumentError) cannot get current datetime in "bad timezone" time zone, reason: :time_zone_not_found
"""
@doc since: "1.10.0"
@spec now!(Calendar.time_zone(), Calendar.time_zone_database()) :: t
def now!(time_zone, time_zone_database \\ Calendar.get_time_zone_database()) do
case now(time_zone, time_zone_database) do
{:ok, datetime} ->
datetime
{:error, reason} ->
raise ArgumentError,
"cannot get current datetime in #{inspect(time_zone)} time zone, reason: " <>
inspect(reason)
end
end
@doc """
Converts the given `datetime` to Unix time.
The `datetime` is expected to be using the ISO calendar
with a year greater than or equal to 0.
It will return the integer with the given unit,
according to `System.convert_time_unit/3`.
If you want to get the current time in Unix seconds,
do not do `DateTime.utc_now() |> DateTime.to_unix()`.
Simply call `System.os_time(:second)` instead.
## Examples
iex> 1_464_096_368 |> DateTime.from_unix!() |> DateTime.to_unix()
1464096368
iex> dt = %DateTime{calendar: Calendar.ISO, day: 20, hour: 18, microsecond: {273806, 6},
...> minute: 58, month: 11, second: 19, time_zone: "America/Montevideo",
...> utc_offset: -10800, std_offset: 3600, year: 2014, zone_abbr: "UYST"}
iex> DateTime.to_unix(dt)
1416517099
iex> flamel = %DateTime{calendar: Calendar.ISO, day: 22, hour: 8, microsecond: {527771, 6},
...> minute: 2, month: 3, second: 25, std_offset: 0, time_zone: "Etc/UTC",
...> utc_offset: 0, year: 1418, zone_abbr: "UTC"}
iex> DateTime.to_unix(flamel)
-17412508655
"""
@spec to_unix(Calendar.datetime(), System.time_unit()) :: integer
def to_unix(datetime, unit \\ :second)
def to_unix(%{utc_offset: utc_offset, std_offset: std_offset} = datetime, unit) do
{days, fraction} = to_iso_days(datetime)
unix_units = Calendar.ISO.iso_days_to_unit({days - @unix_days, fraction}, unit)
offset_units = System.convert_time_unit(utc_offset + std_offset, :second, unit)
unix_units - offset_units
end
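# The `unit` argument follows `System.convert_time_unit/3`; for example, the
# same instant can be expressed in milliseconds (illustrative):
#
#     DateTime.to_unix(~U[2016-05-24 13:26:08.003Z], :millisecond)
#     #=> 1464096368003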
@doc """
Converts the given `datetime` into a `NaiveDateTime`.
Because `NaiveDateTime` does not hold time zone information,
any time zone related data will be lost during the conversion.
## Examples
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 1},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_naive(dt)
~N[2000-02-29 23:00:07.0]
"""
@spec to_naive(Calendar.datetime()) :: NaiveDateTime.t()
def to_naive(%{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: _
}) do
%NaiveDateTime{
year: year,
month: month,
day: day,
calendar: calendar,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond
}
end
@doc """
Converts a `DateTime` into a `Date`.
Because `Date` holds neither time nor time zone information,
data will be lost during the conversion.
## Examples
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_date(dt)
~D[2000-02-29]
"""
@spec to_date(Calendar.datetime()) :: Date.t()
def to_date(%{
year: year,
month: month,
day: day,
calendar: calendar,
hour: _,
minute: _,
second: _,
microsecond: _,
time_zone: _
}) do
%Date{year: year, month: month, day: day, calendar: calendar}
end
@doc """
Converts a `DateTime` into `Time`.
Because `Time` holds neither date nor time zone information,
data will be lost during the conversion.
## Examples
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 1},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_time(dt)
~T[23:00:07.0]
"""
@spec to_time(Calendar.datetime()) :: Time.t()
def to_time(%{
year: _,
month: _,
day: _,
calendar: calendar,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: _
}) do
%Time{
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
calendar: calendar
}
end
@doc """
Converts the given datetime to
[ISO 8601:2004](https://en.wikipedia.org/wiki/ISO_8601) format.
By default, `DateTime.to_iso8601/2` returns datetimes formatted in the "extended"
format, for human readability. It also supports the "basic" format by passing `:basic` as the second argument.
Only datetimes in the ISO calendar are supported; attempting to
convert datetimes from other calendars will raise.
WARNING: the ISO 8601 datetime format contains neither the time zone name nor
its abbreviation, which means information is lost when converting to such a
format.
### Examples
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_iso8601(dt)
"2000-02-29T23:00:07+01:00"
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "UTC",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 0, std_offset: 0, time_zone: "Etc/UTC"}
iex> DateTime.to_iso8601(dt)
"2000-02-29T23:00:07Z"
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> DateTime.to_iso8601(dt, :extended)
"2000-02-29T23:00:07-04:00"
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> DateTime.to_iso8601(dt, :basic)
"20000229T230007-0400"
"""
@spec to_iso8601(Calendar.datetime(), :extended | :basic) :: String.t()
def to_iso8601(datetime, format \\ :extended)
def to_iso8601(%{calendar: Calendar.ISO} = datetime, format)
when format in [:extended, :basic] do
%{
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: time_zone,
utc_offset: utc_offset,
std_offset: std_offset
} = datetime
Calendar.ISO.date_to_string(year, month, day, format) <>
"T" <>
Calendar.ISO.time_to_string(hour, minute, second, microsecond, format) <>
Calendar.ISO.offset_to_string(utc_offset, std_offset, time_zone, format)
end
def to_iso8601(%{calendar: _} = datetime, format) when format in [:extended, :basic] do
datetime
|> convert!(Calendar.ISO)
|> to_iso8601(format)
end
@doc """
Parses the extended "Date and time of day" format described by
[ISO 8601:2004](https://en.wikipedia.org/wiki/ISO_8601).
Since ISO 8601 does not include a proper time zone name, the given
string will be converted to UTC and its offset in seconds will be
returned as part of the result. Therefore offset information
must be present in the string.
As specified in the standard, the separator "T" may be omitted if
desired as there is no ambiguity within this function.
The year parsed by this function is limited to four digits and,
while ISO 8601 allows datetimes to specify 24:00:00 as the zero
hour of the next day, this notation is not supported by Elixir.
Note leap seconds are not supported by the built-in Calendar.ISO.
## Examples
iex> {:ok, datetime, 0} = DateTime.from_iso8601("2015-01-23T23:50:07Z")
iex> datetime
~U[2015-01-23 23:50:07Z]
iex> {:ok, datetime, 9000} = DateTime.from_iso8601("2015-01-23T23:50:07.123+02:30")
iex> datetime
~U[2015-01-23 21:20:07.123Z]
iex> {:ok, datetime, 9000} = DateTime.from_iso8601("2015-01-23T23:50:07,123+02:30")
iex> datetime
~U[2015-01-23 21:20:07.123Z]
iex> {:ok, datetime, 0} = DateTime.from_iso8601("-2015-01-23T23:50:07Z")
iex> datetime
~U[-2015-01-23 23:50:07Z]
iex> {:ok, datetime, 9000} = DateTime.from_iso8601("-2015-01-23T23:50:07,123+02:30")
iex> datetime
~U[-2015-01-23 21:20:07.123Z]
iex> DateTime.from_iso8601("2015-01-23P23:50:07")
{:error, :invalid_format}
iex> DateTime.from_iso8601("2015-01-23T23:50:07")
{:error, :missing_offset}
iex> DateTime.from_iso8601("2015-01-23 23:50:61")
{:error, :invalid_time}
iex> DateTime.from_iso8601("2015-01-32 23:50:07")
{:error, :invalid_date}
iex> DateTime.from_iso8601("2015-01-23T23:50:07.123-00:00")
{:error, :invalid_format}
"""
@doc since: "1.4.0"
@spec from_iso8601(String.t(), Calendar.calendar()) ::
{:ok, t, Calendar.utc_offset()} | {:error, atom}
def from_iso8601(string, calendar \\ Calendar.ISO) do
with {:ok, {year, month, day, hour, minute, second, microsecond}, offset} <-
Calendar.ISO.parse_utc_datetime(string) do
datetime = %DateTime{
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
std_offset: 0,
utc_offset: 0,
zone_abbr: "UTC",
time_zone: "Etc/UTC"
}
with {:ok, converted} <- convert(datetime, calendar) do
{:ok, converted, offset}
end
end
end
@doc """
Converts a number of Gregorian seconds to a `DateTime` struct.
The returned `DateTime` will have the `Etc/UTC` time zone; if you want another
time zone, use `DateTime.shift_zone/3`.
## Examples
iex> DateTime.from_gregorian_seconds(1)
~U[0000-01-01 00:00:01Z]
iex> DateTime.from_gregorian_seconds(63_755_511_991, {5000, 3})
~U[2020-05-01 00:26:31.005Z]
iex> DateTime.from_gregorian_seconds(-1)
~U[-0001-12-31 23:59:59Z]
"""
@doc since: "1.11.0"
@spec from_gregorian_seconds(integer(), Calendar.microsecond(), Calendar.calendar()) :: t
def from_gregorian_seconds(
seconds,
{microsecond, precision} \\ {0, 0},
calendar \\ Calendar.ISO
)
when is_integer(seconds) do
iso_days = Calendar.ISO.gregorian_seconds_to_iso_days(seconds, microsecond)
{year, month, day, hour, minute, second, {microsecond, _}} =
calendar.naive_datetime_from_iso_days(iso_days)
%DateTime{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: {microsecond, precision},
std_offset: 0,
utc_offset: 0,
zone_abbr: "UTC",
time_zone: "Etc/UTC"
}
end
@doc """
Converts a `DateTime` struct to a number of Gregorian seconds and microseconds.
## Examples
iex> dt = %DateTime{year: 0000, month: 1, day: 1, zone_abbr: "UTC",
...> hour: 0, minute: 0, second: 1, microsecond: {0, 0},
...> utc_offset: 0, std_offset: 0, time_zone: "Etc/UTC"}
iex> DateTime.to_gregorian_seconds(dt)
{1, 0}
iex> dt = %DateTime{year: 2020, month: 5, day: 1, zone_abbr: "UTC",
...> hour: 0, minute: 26, second: 31, microsecond: {5000, 0},
...> utc_offset: 0, std_offset: 0, time_zone: "Etc/UTC"}
iex> DateTime.to_gregorian_seconds(dt)
{63_755_511_991, 5000}
iex> dt = %DateTime{year: 2020, month: 5, day: 1, zone_abbr: "CET",
...> hour: 1, minute: 26, second: 31, microsecond: {5000, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_gregorian_seconds(dt)
{63_755_511_991, 5000}
"""
@doc since: "1.11.0"
@spec to_gregorian_seconds(Calendar.datetime()) :: {integer(), non_neg_integer()}
def to_gregorian_seconds(
%{
std_offset: std_offset,
utc_offset: utc_offset,
microsecond: {microsecond, _}
} = datetime
) do
{days, day_fraction} =
datetime
|> to_iso_days()
|> apply_tz_offset(utc_offset + std_offset)
seconds_in_day = seconds_from_day_fraction(day_fraction)
{days * @seconds_per_day + seconds_in_day, microsecond}
end
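# `from_gregorian_seconds/3` and `to_gregorian_seconds/1` are inverses for
# UTC datetimes, e.g. (illustrative):
#
#     {secs, micros} = DateTime.to_gregorian_seconds(~U[2020-05-01 00:26:31.005Z])
#     DateTime.from_gregorian_seconds(secs, {micros, 3})
#     #=> ~U[2020-05-01 00:26:31.005Z]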
@doc """
Converts the given `datetime` to a string according to its calendar.
### Examples
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.to_string(dt)
"2000-02-29 23:00:07+01:00 CET Europe/Warsaw"
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "UTC",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 0, std_offset: 0, time_zone: "Etc/UTC"}
iex> DateTime.to_string(dt)
"2000-02-29 23:00:07Z"
iex> dt = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> DateTime.to_string(dt)
"2000-02-29 23:00:07-04:00 AMT America/Manaus"
iex> dt = %DateTime{year: -100, month: 12, day: 19, zone_abbr: "CET",
...> hour: 3, minute: 20, second: 31, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Stockholm"}
iex> DateTime.to_string(dt)
"-0100-12-19 03:20:31+01:00 CET Europe/Stockholm"
"""
@spec to_string(Calendar.datetime()) :: String.t()
def to_string(%{calendar: calendar} = datetime) do
%{
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: time_zone,
zone_abbr: zone_abbr,
utc_offset: utc_offset,
std_offset: std_offset
} = datetime
calendar.datetime_to_string(
year,
month,
day,
hour,
minute,
second,
microsecond,
time_zone,
zone_abbr,
utc_offset,
std_offset
)
end
@doc """
Compares two datetime structs.
Returns `:gt` if the first datetime is later than the second
and `:lt` for the opposite. If the two datetimes are equal,
`:eq` is returned.
Note that both UTC and Standard offsets will be taken into
account when comparison is done.
## Examples
iex> dt1 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> dt2 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.compare(dt1, dt2)
:gt
"""
@doc since: "1.4.0"
@spec compare(Calendar.datetime(), Calendar.datetime()) :: :lt | :eq | :gt
def compare(
%{utc_offset: utc_offset1, std_offset: std_offset1} = datetime1,
%{utc_offset: utc_offset2, std_offset: std_offset2} = datetime2
) do
{days1, {parts1, ppd1}} =
datetime1
|> to_iso_days()
|> apply_tz_offset(utc_offset1 + std_offset1)
{days2, {parts2, ppd2}} =
datetime2
|> to_iso_days()
|> apply_tz_offset(utc_offset2 + std_offset2)
# Ensure the day fractions share a denominator: parts1/ppd1 compares to
# parts2/ppd2 exactly as parts1 * ppd2 compares to parts2 * ppd1, since
# both denominators are positive.
first = {days1, parts1 * ppd2}
second = {days2, parts2 * ppd1}
cond do
first > second -> :gt
first < second -> :lt
true -> :eq
end
end
@doc """
Subtracts `datetime2` from `datetime1`.
The answer can be returned in any `unit` available from `t:System.time_unit/0`.
Leap seconds are not taken into account.
This function returns the difference in seconds where seconds are measured
according to `Calendar.ISO`.
## Examples
iex> dt1 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> dt2 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.diff(dt1, dt2)
18000
iex> DateTime.diff(dt2, dt1)
-18000
"""
@doc since: "1.5.0"
@spec diff(Calendar.datetime(), Calendar.datetime(), System.time_unit()) :: integer()
def diff(
%{utc_offset: utc_offset1, std_offset: std_offset1} = datetime1,
%{utc_offset: utc_offset2, std_offset: std_offset2} = datetime2,
unit \\ :second
) do
naive_diff =
(datetime1 |> to_iso_days() |> Calendar.ISO.iso_days_to_unit(unit)) -
(datetime2 |> to_iso_days() |> Calendar.ISO.iso_days_to_unit(unit))
offset_diff = utc_offset2 + std_offset2 - (utc_offset1 + std_offset1)
naive_diff + System.convert_time_unit(offset_diff, :second, unit)
end
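# Sketch of `diff/3` with an explicit `unit` (the same five-hour offset as in
# the doc example above, expressed in milliseconds):
#
#     DateTime.diff(~U[2000-02-29 23:00:07Z], ~U[2000-02-29 18:00:07Z], :millisecond)
#     #=> 18_000_000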
@doc """
Adds a specified amount of time to a `DateTime`.
Accepts an `amount_to_add` in any `unit` available from `t:System.time_unit/0`.
Negative values will move backwards in time.
Takes changes such as summer time/DST into account. This means that adding time
can cause the wall time to "go backwards" during the autumn "fall back".
Adding just a few seconds to a datetime just before "spring forward" can cause the wall
time to increase by more than an hour.
Fractional second precision stays the same in a similar way to `NaiveDateTime.add/2`.
### Examples
iex> dt = DateTime.from_naive!(~N[2018-11-15 10:00:00], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> dt |> DateTime.add(3600, :second, FakeTimeZoneDatabase)
#DateTime<2018-11-15 11:00:00+01:00 CET Europe/Copenhagen>
iex> DateTime.add(~U[2018-11-15 10:00:00Z], 3600, :second)
~U[2018-11-15 11:00:00Z]
When adding 3 seconds just before "spring forward" we go from 1:59:59 to 3:00:02
iex> dt = DateTime.from_naive!(~N[2019-03-31 01:59:59.123], "Europe/Copenhagen", FakeTimeZoneDatabase)
iex> dt |> DateTime.add(3, :second, FakeTimeZoneDatabase)
#DateTime<2019-03-31 03:00:02.123+02:00 CEST Europe/Copenhagen>
"""
@doc since: "1.8.0"
@spec add(Calendar.datetime(), integer, System.time_unit(), Calendar.time_zone_database()) ::
t()
def add(
datetime,
amount_to_add,
unit \\ :second,
time_zone_database \\ Calendar.get_time_zone_database()
)
when is_integer(amount_to_add) do
%{
utc_offset: utc_offset,
std_offset: std_offset,
calendar: calendar,
microsecond: {_, precision}
} = datetime
ppd = System.convert_time_unit(86400, :second, unit)
total_offset = System.convert_time_unit(utc_offset + std_offset, :second, unit)
result =
datetime
|> to_iso_days()
# Subtract total offset in order to get UTC and add the integer for the addition
|> Calendar.ISO.add_day_fraction_to_iso_days(amount_to_add - total_offset, ppd)
|> shift_zone_for_iso_days_utc(calendar, precision, datetime.time_zone, time_zone_database)
case result do
{:ok, result_datetime} ->
result_datetime
{:error, error} ->
raise ArgumentError,
"cannot add #{amount_to_add} #{unit} to #{inspect(datetime)} (with time zone " <>
"database #{inspect(time_zone_database)}), reason: #{inspect(error)}"
end
end
@doc """
Returns the given datetime with the microsecond field truncated to the given
precision (`:microsecond`, `:millisecond` or `:second`).
The given datetime is returned unchanged if it already has lower precision than
the given precision.
## Examples
iex> dt1 = %DateTime{year: 2017, month: 11, day: 7, zone_abbr: "CET",
...> hour: 11, minute: 45, second: 18, microsecond: {123456, 6},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Paris"}
iex> DateTime.truncate(dt1, :microsecond)
#DateTime<2017-11-07 11:45:18.123456+01:00 CET Europe/Paris>
iex> dt2 = %DateTime{year: 2017, month: 11, day: 7, zone_abbr: "CET",
...> hour: 11, minute: 45, second: 18, microsecond: {123456, 6},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Paris"}
iex> DateTime.truncate(dt2, :millisecond)
#DateTime<2017-11-07 11:45:18.123+01:00 CET Europe/Paris>
iex> dt3 = %DateTime{year: 2017, month: 11, day: 7, zone_abbr: "CET",
...> hour: 11, minute: 45, second: 18, microsecond: {123456, 6},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Paris"}
iex> DateTime.truncate(dt3, :second)
#DateTime<2017-11-07 11:45:18+01:00 CET Europe/Paris>
"""
@doc since: "1.6.0"
@spec truncate(Calendar.datetime(), :microsecond | :millisecond | :second) :: t()
def truncate(%DateTime{microsecond: microsecond} = datetime, precision) do
%{datetime | microsecond: Calendar.truncate(microsecond, precision)}
end
def truncate(%{} = datetime_map, precision) do
truncate(from_map(datetime_map), precision)
end
@doc """
Converts a given `datetime` from one calendar to another.
If it is not possible to convert unambiguously between the calendars
(see `Calendar.compatible_calendars?/2`), an `{:error, :incompatible_calendars}` tuple
is returned.
## Examples
Imagine someone implements `Calendar.Holocene`, a calendar based on the
Gregorian calendar that adds exactly 10,000 years to the current Gregorian
year:
iex> dt1 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> DateTime.convert(dt1, Calendar.Holocene)
{:ok, %DateTime{calendar: Calendar.Holocene, day: 29, hour: 23,
microsecond: {0, 0}, minute: 0, month: 2, second: 7, std_offset: 0,
time_zone: "America/Manaus", utc_offset: -14400, year: 12000,
zone_abbr: "AMT"}}
"""
@doc since: "1.5.0"
@spec convert(Calendar.datetime(), Calendar.calendar()) ::
{:ok, t} | {:error, :incompatible_calendars}
def convert(%DateTime{calendar: calendar} = datetime, calendar) do
{:ok, datetime}
end
def convert(%{calendar: calendar} = datetime, calendar) do
{:ok, from_map(datetime)}
end
def convert(%{calendar: dt_calendar, microsecond: {_, precision}} = datetime, calendar) do
if Calendar.compatible_calendars?(dt_calendar, calendar) do
result_datetime =
datetime
|> to_iso_days
|> from_iso_days(datetime, calendar, precision)
{:ok, result_datetime}
else
{:error, :incompatible_calendars}
end
end
@doc """
Converts a given `datetime` from one calendar to another.
If it is not possible to convert unambiguously between the calendars
(see `Calendar.compatible_calendars?/2`), an `ArgumentError` is raised.
## Examples
Imagine someone implements `Calendar.Holocene`, a calendar based on the
Gregorian calendar that adds exactly 10,000 years to the current Gregorian
year:
iex> dt1 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> DateTime.convert!(dt1, Calendar.Holocene)
%DateTime{calendar: Calendar.Holocene, day: 29, hour: 23,
microsecond: {0, 0}, minute: 0, month: 2, second: 7, std_offset: 0,
time_zone: "America/Manaus", utc_offset: -14400, year: 12000,
zone_abbr: "AMT"}
"""
@doc since: "1.5.0"
@spec convert!(Calendar.datetime(), Calendar.calendar()) :: t
def convert!(datetime, calendar) do
case convert(datetime, calendar) do
{:ok, value} ->
value
{:error, :incompatible_calendars} ->
raise ArgumentError,
"cannot convert #{inspect(datetime)} to target calendar #{inspect(calendar)}, " <>
"reason: #{inspect(datetime.calendar)} and #{inspect(calendar)} have different " <>
"day rollover moments, making this conversion ambiguous"
end
end
# Keep it multiline for proper function clause errors.
defp to_iso_days(%{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond
}) do
calendar.naive_datetime_to_iso_days(year, month, day, hour, minute, second, microsecond)
end
defp from_iso_days(iso_days, datetime, calendar, precision) do
%{time_zone: time_zone, zone_abbr: zone_abbr, utc_offset: utc_offset, std_offset: std_offset} =
datetime
{year, month, day, hour, minute, second, {microsecond, _}} =
calendar.naive_datetime_from_iso_days(iso_days)
%DateTime{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: {microsecond, precision},
time_zone: time_zone,
zone_abbr: zone_abbr,
utc_offset: utc_offset,
std_offset: std_offset
}
end
defp apply_tz_offset(iso_days, 0) do
iso_days
end
defp apply_tz_offset(iso_days, offset) do
Calendar.ISO.add_day_fraction_to_iso_days(iso_days, -offset, 86400)
end
defp from_map(%{} = datetime_map) do
%DateTime{
year: datetime_map.year,
month: datetime_map.month,
day: datetime_map.day,
hour: datetime_map.hour,
minute: datetime_map.minute,
second: datetime_map.second,
microsecond: datetime_map.microsecond,
time_zone: datetime_map.time_zone,
zone_abbr: datetime_map.zone_abbr,
utc_offset: datetime_map.utc_offset,
std_offset: datetime_map.std_offset
}
end
defp seconds_from_day_fraction({parts_in_day, @seconds_per_day}),
do: parts_in_day
defp seconds_from_day_fraction({parts_in_day, parts_per_day}),
do: div(parts_in_day * @seconds_per_day, parts_per_day)
defimpl String.Chars do
def to_string(datetime) do
%{
calendar: calendar,
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: time_zone,
zone_abbr: zone_abbr,
utc_offset: utc_offset,
std_offset: std_offset
} = datetime
calendar.datetime_to_string(
year,
month,
day,
hour,
minute,
second,
microsecond,
time_zone,
zone_abbr,
utc_offset,
std_offset
)
end
end
defimpl Inspect do
def inspect(datetime, _) do
%{
year: year,
month: month,
day: day,
hour: hour,
minute: minute,
second: second,
microsecond: microsecond,
time_zone: time_zone,
zone_abbr: zone_abbr,
utc_offset: utc_offset,
std_offset: std_offset,
calendar: calendar
} = datetime
formatted =
calendar.datetime_to_string(
year,
month,
day,
hour,
minute,
second,
microsecond,
time_zone,
zone_abbr,
utc_offset,
std_offset
)
case datetime do
%{utc_offset: 0, std_offset: 0, time_zone: "Etc/UTC"} ->
"~U[" <> formatted <> suffix(calendar) <> "]"
_ ->
"#DateTime<" <> formatted <> suffix(calendar) <> ">"
end
end
defp suffix(Calendar.ISO), do: ""
defp suffix(calendar), do: " " <> inspect(calendar)
end
end
defmodule PolicrMini.Logger do
@moduledoc """
Log querying and recording.
"""
require Logger
defmodule Record do
@moduledoc false
use PolicrMini.Mnesia
@enforce_keys [:level, :message, :timestamp]
defstruct [:level, :message, :timestamp]
@type t :: %__MODULE__{
level: atom,
message: String.t(),
timestamp: integer
}
def new([level, message, timestamp]) do
%__MODULE__{level: level, message: message, timestamp: timestamp}
end
@impl true
def init(node_list) do
Mnesia.create_table(__MODULE__,
attributes: [:id, :level, :message, :timestamp],
disc_only_copies: node_list
)
end
def write(level, msg, ts) when is_binary(msg) do
{{year, month, day}, {hour, minute, second, _msec}} = ts
ts =
{{year, month, day}, {hour, minute, second}}
|> NaiveDateTime.from_erl!()
# Note: this implementation assumes that logs use UTC time
|> DateTime.from_naive!("Etc/UTC")
|> DateTime.to_unix()
Mnesia.dirty_write({__MODULE__, Sequence.increment(__MODULE__), level, msg, ts})
end
@type query_cont :: [
{:level, atom | nil},
{:beginning, integer | nil},
{:ending, integer | nil}
]
@doc """
Queries logs that have been persisted to storage.
The `query_cont` argument holds the query conditions and supports the following options:
- `level`: the log level, e.g. `:error` or `:warn`.
- `beginning`: the beginning of the time range (Unix timestamp).
- `ending`: the end of the time range (Unix timestamp).
Note: if no time-range parameters are given, all log records are returned, which can be a very large amount of data.
"""
@spec query(query_cont) :: {:ok, [t]} | {:error, any}
def query(cont \\ []) do
cont_combine_fun = fn where, acc ->
case where do
{_, nil} -> acc
{:level, level} -> acc ++ [{:==, :"$1", level}]
{:beginning, beginning} -> acc ++ [{:>=, :"$3", beginning}]
{:ending, ending} -> acc ++ [{:"=<", :"$3", ending}]
end
end
guards = Enum.reduce(cont, [], cont_combine_fun)
select_fun = fn ->
Mnesia.select(__MODULE__, [{{__MODULE__, :_, :"$1", :"$2", :"$3"}, guards, [:"$$"]}])
end
case Mnesia.transaction(select_fun) do
{:atomic, records} ->
records = records |> Enum.map(&new/1) |> Enum.reverse()
{:ok, records}
{:aborted, reason} ->
{:error, reason}
end
end
end
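# Usage sketch (hypothetical; assumes Mnesia is started and the table has
# been created via `Record.init/1`):
#
#     # fetch all :error records logged within a one-hour window
#     {:ok, records} =
#       PolicrMini.Logger.Record.query(
#         level: :error,
#         beginning: 1_600_000_000,
#         ending: 1_600_003_600
#       )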
@doc """
Logs an error message in an enforced, uniform format.
## Parameters
- `action`: the action that failed. Placed at the start of the sentence, e.g. `Message deletion`.
- `details`: details of the failure, typically an error return value. A keyword list is recommended for custom details. Note that there is no need to call `inspect` before passing it.
"""
@spec unitized_error(String.t(), any) :: :ok
def unitized_error(action, details) do
error("#{action} failed, details: #{inspect(details)}")
end
@doc """
Logs a warning message in an enforced, uniform format.
## Parameters
- `message`: the warning message. Free-form, but must not end with a `.`.
- `details`: details of the warning, typically an error return value. A keyword list is recommended for custom details. Note that there is no need to call `inspect` before passing it.
"""
@spec unitized_warn(String.t(), any) :: :ok
def unitized_warn(message, details) do
warn("#{message}, details: #{inspect(details)}")
end
defdelegate warn(chardata_or_fun, metadata \\ []), to: Logger
defdelegate info(chardata_or_fun, metadata \\ []), to: Logger
defdelegate error(chardata_or_fun, metadata \\ []), to: Logger
defdelegate debug(chardata_or_fun, metadata \\ []), to: Logger
defdelegate log(level, chardata_or_fun, metadata \\ []), to: Logger
defmodule Backend do
@moduledoc """
A custom logger backend.
This backend persists logs to Mnesia; they can be queried with the `PolicrMini.Logger.Record.query/1` function.
"""
@behaviour :gen_event
@impl true
def init({__MODULE__, name}) do
{:ok, configure(name, [])}
end
defp configure(name, []) do
base_level = Application.get_env(:logger, name)[:level] || :debug
:logger |> Application.get_env(name, []) |> Enum.into(%{name: name, level: base_level})
end
@impl true
def handle_event(:flush, state) do
{:ok, state}
end
# Persist string log messages.
def handle_event({level, _gl, {Logger, msg, ts, _md}}, %{level: min_level} = state)
when is_binary(msg) do
if right_log_level?(min_level, level) do
PolicrMini.Logger.Record.write(level, msg, ts)
end
{:ok, state}
end
def handle_event(_, state), do: {:ok, state}
@impl true
def handle_call({:configure, opts}, %{name: name} = state) do
{:ok, :ok, configure(name, opts, state)}
end
defp configure(_name, [level: new_level], state) do
Map.merge(state, %{level: new_level})
end
defp configure(_name, _opts, state), do: state
defp right_log_level?(min_level, level) do
Logger.compare_levels(level, min_level) != :lt
end
end
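# Example configuration (hypothetical; the backend name `:mnesia_log` is an
# assumption, not part of this module):
#
#     # config/config.exs
#     config :logger, backends: [:console, {PolicrMini.Logger.Backend, :mnesia_log}]
#     config :logger, :mnesia_log, level: :warn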
end | lib/policr_mini/logger.ex | 0.702632 | 0.453988 | logger.ex | starcoder |
defmodule Mix.Tasks.Hex.Registry do
use Mix.Task
@behaviour Hex.Mix.TaskDescription
@switches [
name: :string,
private_key: :string
]
@shortdoc "Manages local Hex registries"
@moduledoc """
Manages local Hex registries.
## Build a local registry
mix hex.registry build PUBLIC_DIR
To build a registry you need a name, a directory that will be used to store public registry files,
and a private key to sign the registry:
```
mix hex.registry build public --name=acme --private-key=private_key.pem
* creating public/public_key
* creating public/tarballs
* creating public/names
* creating public/versions
```
You can generate a random private key using the following command:
```
openssl genrsa -out private_key.pem
```
Let's say you have a package `foo-1.0.0.tar`. To publish it, simply copy it to the appropriate
directory and re-build the registry:
```
cp foo-1.0.0.tar public/tarballs/
mix hex.registry build public --name=acme --private-key=private_key.pem
* creating public/packages/foo
* updating public/names
* updating public/versions
```
You can test the repository by starting the built-in Erlang/OTP HTTP server, adding the repository,
and retrieving the package that you just published.
```
erl -s inets -eval 'inets:start(httpd,[{port,8000},{server_name,"localhost"},{server_root,"."},{document_root,"public"}]).'
# replace "acme" with the name of your repository
mix hex.repo add acme http://localhost:8000 --public-key=public/public_key
mix hex.package fetch foo 1.0.0 --repo=acme
```
To use the package in your Mix project, add it as a dependency and set the `:repo` option to your repository name:
```elixir
defp deps() do
{:decimal, "~> 2.0", repo: "acme"}
end
```
### Command line options
* `--name` - The name of the registry
* `--private-key` - Path to the private key
"""
@impl true
def run(args) do
Hex.start()
{opts, args} = Hex.OptionParser.parse!(args, strict: @switches)
case args do
["build", public_dir] ->
build(public_dir, opts)
_ ->
Mix.raise("""
Invalid arguments, expected one of:
mix hex.registry build PUBLIC_DIR
""")
end
end
@impl true
def tasks() do
[
{"build PUBLIC_DIR", "Build a local registry"}
]
end
defp build(public_dir, opts) do
repo_name = opts[:name] || raise "missing --name"
private_key_path = opts[:private_key] || raise "missing --private-key"
private_key = private_key_path |> File.read!() |> decode_private_key()
build(repo_name, public_dir, private_key)
end
defp build(repo_name, public_dir, private_key) do
ensure_public_key(private_key, public_dir)
create_directory(Path.join(public_dir, "tarballs"))
paths_per_name =
Enum.group_by(Path.wildcard("#{public_dir}/tarballs/*.tar"), fn path ->
[name | _rest] = String.split(Path.basename(path), ["-", ".tar"], trim: true)
name
end)
versions =
Enum.map(paths_per_name, fn {name, paths} ->
releases =
paths
|> Enum.map(&build_release(repo_name, &1))
|> Enum.sort(&(Hex.Version.compare(&1.version, &2.version) == :lt))
package =
%{repository: repo_name, name: name, releases: releases}
|> :mix_hex_registry.encode_package()
|> sign_and_gzip(private_key)
write_file("#{public_dir}/packages/#{name}", package)
{name, Enum.map(releases, & &1.version)}
end)
for path <- Path.wildcard("#{public_dir}/packages/*"),
not (Path.basename(path) in Map.keys(paths_per_name)) do
remove_file(path)
end
names = for {name, _} <- versions, do: %{name: name}
payload = %{repository: repo_name, packages: names}
names = payload |> :mix_hex_registry.encode_names() |> sign_and_gzip(private_key)
write_file("#{public_dir}/names", names)
versions = for {name, versions} <- versions, do: %{name: name, versions: versions}
payload = %{repository: repo_name, packages: versions}
versions = payload |> :mix_hex_registry.encode_versions() |> sign_and_gzip(private_key)
write_file("#{public_dir}/versions", versions)
end
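# For reference, a successful build produces a public directory layout like
# the following (illustrative, for a single package "foo"):
#
#   public/
#     public_key
#     names
#     versions
#     packages/foo
#     tarballs/foo-1.0.0.tar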
defp build_release(repo_name, tarball_path) do
tarball = File.read!(tarball_path)
{:ok, result} = :mix_hex_tarball.unpack(tarball, :memory)
dependencies =
for {package, map} <- Map.get(result.metadata, "requirements", []) do
app = Map.fetch!(map, "app")
requirement = Map.fetch!(map, "requirement")
optional = map["optional"] == true
repository = map["repository"]
release = %{
package: package,
app: app,
optional: optional,
requirement: requirement
}
if !repository or repository == repo_name do
release
else
Map.put(release, :repository, repository)
end
end
%{
version: result.metadata["version"],
inner_checksum: result.inner_checksum,
outer_checksum: result.outer_checksum,
dependencies: dependencies
}
end
defp sign_and_gzip(protobuf, private_key) do
protobuf
|> :mix_hex_registry.sign_protobuf(private_key)
|> :zlib.gzip()
end
defp ensure_public_key(private_key, public_dir) do
path = "#{public_dir}/public_key"
encoded_public_key = private_key |> extract_public_key() |> encode_public_key()
case File.read(path) do
{:ok, ^encoded_public_key} ->
:ok
{:ok, _} ->
Hex.Shell.info("* public key at #{path} does not match private key, overwriting")
write_file(path, encoded_public_key)
{:error, :enoent} ->
write_file(path, encoded_public_key)
end
end
defp create_directory(path) do
unless File.dir?(path) do
Hex.Shell.info(["* creating ", path])
File.mkdir_p!(path)
end
end
defp write_file(path, data) do
if File.exists?(path) do
Hex.Shell.info(["* updating ", path])
else
File.mkdir_p!(Path.dirname(path))
Hex.Shell.info(["* creating ", path])
end
File.write!(path, data)
end
defp remove_file(path) do
Hex.Shell.info(["* removing ", path])
File.rm!(path)
end
## Key utilities
require Record
Record.defrecordp(
:rsa_private_key,
:RSAPrivateKey,
Record.extract(:RSAPrivateKey, from_lib: "public_key/include/OTP-PUB-KEY.hrl")
)
Record.defrecordp(
:rsa_public_key,
:RSAPublicKey,
Record.extract(:RSAPublicKey, from_lib: "public_key/include/OTP-PUB-KEY.hrl")
)
defp extract_public_key(rsa_private_key(modulus: m, publicExponent: e)) do
rsa_public_key(modulus: m, publicExponent: e)
end
defp encode_public_key(key) do
:public_key.pem_encode([:public_key.pem_entry_encode(:RSAPublicKey, key)])
end
defp decode_private_key(data) do
[entry] = :public_key.pem_decode(data)
:public_key.pem_entry_decode(entry)
end
end | lib/mix/tasks/hex.registry.ex | 0.795221 | 0.731197 | hex.registry.ex | starcoder |
defmodule Ash.Query.Function do
@moduledoc """
A function is a predicate with an arguments list.
For more information on being a predicate, see `Ash.Filter.Predicate`. Most of the complexities
are there. A function must meet both behaviours.
"""
@callback new(list(term)) :: {:ok, term}
@type arg :: :ref | :term | {:options, Keyword.t()}
@doc """
The number and types of arguments supported.
Currently supports three values: `:ref`, `:term`, and `{:options, schema}`.
* `:ref` - a column/relationship path reference. Will be an instance of `Ash.Query.Ref`
* `:term` - any value. No type validation is currently supported except for what is listed here, so it must be done in the `c:new/1` function
* `{:options, keys}` - Only the last arg may be options, and `keys` is a list of atoms for which options are accepted
"""
@callback args() :: [arg]
def new(mod, args, ref) do
args = List.wrap(args)
configured_args = List.wrap(mod.args())
configured_arg_count = Enum.count(configured_args)
given_arg_count = Enum.count(args)
if configured_arg_count == given_arg_count do
args
|> Enum.zip(configured_args)
|> Enum.with_index()
|> Enum.reduce_while({:ok, []}, fn
{{%Ash.Query.Ref{} = ref, :ref}, _i}, {:ok, args} ->
{:cont, {:ok, [ref | args]}}
{{arg, :ref}, i}, {:ok, args} when is_atom(arg) ->
case Ash.Resource.attribute(ref.resource, arg) do
nil ->
{:halt,
{:error, "invalid reference in #{ordinal(i + 1)} argument to #{mod.name()}"}}
attribute ->
{:cont, {:ok, [%{ref | attribute: attribute} | args]}}
end
{{arg, :term}, _}, {:ok, args} ->
{:cont, {:ok, [arg | args]}}
{{arg, {:options, keys}}, i}, {:ok, args} ->
case to_keys(arg, keys) do
{:ok, opts} ->
{:cont, {:ok, [opts | args]}}
{:error, message} when is_binary(message) ->
{:halt,
{:error, "#{ordinal(i + 1)} argument to #{mod.name()} is invalid: #{message}"}}
{:error, exception} ->
{:halt,
{:error,
"#{ordinal(i + 1)} argument to #{mod.name()} is invalid: #{
Exception.message(exception)
}"}}
end
end)
|> case do
{:ok, args} ->
mod.new(Enum.reverse(args))
{:error, error} ->
{:error, error}
end
else
{:error,
"function #{mod.name()} takes #{configured_arg_count} arguments, provided #{
given_arg_count
}"}
end
end
defp to_keys(nil, _), do: {:ok, nil}
defp to_keys(opts, keys) do
if is_map(opts) || Keyword.keyword?(opts) do
string_keys = Enum.map(keys, &to_string/1)
Enum.reduce_while(opts, {:ok, []}, fn
{key, value}, {:ok, opts} when is_binary(key) ->
if key in string_keys do
{:cont, {:ok, [{String.to_existing_atom(key), value} | opts]}}
else
{:halt, {:error, "No such option #{key}"}}
end
{key, value}, {:ok, opts} when is_atom(key) ->
if key in keys do
{:cont, {:ok, [{key, value} | opts]}}
else
{:halt, {:error, "No such option #{key}"}}
end
{key, _}, _ ->
{:halt, {:error, "No such option #{key}"}}
end)
else
{:error, "Invalid options #{inspect(opts)}"}
end
end
# Copied from https://github.com/andrewhao/ordinal/blob/master/lib/ordinal.ex
@doc """
Attaches the appropriate suffix to refer to an ordinal number, e.g. 1 -> "1st"
"""
def ordinal(num) do
cond do
Enum.any?([11, 12, 13], &(&1 == Integer.mod(num, 100))) ->
"#{num}th"
Integer.mod(num, 10) == 1 ->
"#{num}st"
Integer.mod(num, 10) == 2 ->
"#{num}nd"
Integer.mod(num, 10) == 3 ->
"#{num}rd"
true ->
"#{num}th"
end
end
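# The suffix rules above follow English ordinal conventions, e.g.:
#
#   ordinal(1)   #=> "1st"
#   ordinal(2)   #=> "2nd"
#   ordinal(3)   #=> "3rd"
#   ordinal(11)  #=> "11th"
#   ordinal(112) #=> "112th"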
defmacro __using__(opts) do
quote do
@behaviour Ash.Filter.Predicate
alias Ash.Query.Ref
defstruct [
:arguments,
name: unquote(opts[:name]),
embedded?: false,
__function__?: true,
__predicate__?: unquote(opts[:predicate?] || false)
]
def name, do: unquote(opts[:name])
if unquote(opts[:predicate?]) do
def match?(struct) do
evaluate(struct) not in [nil, false]
end
end
defimpl Inspect do
import Inspect.Algebra
def inspect(%{arguments: args, name: name}, opts) do
concat(
to_string(name),
container_doc("(", args, ")", opts, &to_doc(&1, &2), separator: ", ")
)
end
end
end
end
end | lib/ash/query/function/function.ex | 0.851027 | 0.563498 | function.ex | starcoder |
defmodule ExWire.Message do
@moduledoc """
Defines a behavior for messages so that they can be easily encoded and
decoded for transfer from and to the wire.
"""
alias ExthCrypto.Key
alias ExWire.Crypto
alias ExWire.Message.{FindNeighbours, Neighbours, Ping, Pong}
alias ExWire.Struct.Endpoint
defmodule UnknownMessageError do
defexception [:message]
end
@type t :: Ping.t() | Pong.t() | FindNeighbours.t() | Neighbours.t()
@type message_id :: integer()
@callback message_id() :: message_id
@callback encode(t) :: binary()
@callback to(t) :: Endpoint.t() | nil
@message_types %{
0x01 => Ping,
0x02 => Pong,
0x03 => FindNeighbours,
0x04 => Neighbours
}
@doc """
Decodes a message of the given `type` based on the encoded data. Effectively
reverses the `encode/1` function.
## Examples
iex> ExWire.Message.decode(0x01, <<210, 1, 199, 132, 1, 2, 3, 4, 128, 5, 199, 132, 5, 6, 7, 8, 6, 128, 4>>)
%ExWire.Message.Ping{
version: 1,
from: %ExWire.Struct.Endpoint{ip: [1, 2, 3, 4], tcp_port: 5, udp_port: nil},
to: %ExWire.Struct.Endpoint{ip: [5, 6, 7, 8], tcp_port: nil, udp_port: 6},
timestamp: 4
}
iex> ExWire.Message.decode(0x02, <<202, 199, 132, 5, 6, 7, 8, 6, 128, 2, 3>>)
%ExWire.Message.Pong{
to: %ExWire.Struct.Endpoint{ip: [5, 6, 7, 8], tcp_port: nil, udp_port: 6}, hash: <<2>>, timestamp: 3
}
iex> ExWire.Message.decode(0x99, <<>>)
** (ExWire.Message.UnknownMessageError) Unknown message type: 0x99
"""
@spec decode(integer(), binary()) :: t
def decode(type, data) do
case @message_types[type] do
nil -> raise UnknownMessageError, "Unknown message type: #{inspect(type, base: :hex)}"
mod -> mod.decode(data)
end
end
@doc """
Encodes a message by concatenating its `message_id` with the encoded data of
the message itself.
## Examples
iex> ExWire.Message.encode(
...> %ExWire.Message.Ping{
...> version: 1,
...> from: %ExWire.Struct.Endpoint{ip: [1, 2, 3, 4], tcp_port: 5, udp_port: nil},
...> to: %ExWire.Struct.Endpoint{ip: [5, 6, 7, 8], tcp_port: nil, udp_port: 6},
...> timestamp: 4
...> }
...> )
<<1, 214, 1, 201, 132, 1, 2, 3, 4, 128, 130, 0, 5, 201, 132, 5, 6, 7, 8, 130, 0, 6, 128, 4>>
iex> ExWire.Message.encode(%ExWire.Message.Pong{to: %ExWire.Struct.Endpoint{ip: [5, 6, 7, 8], tcp_port: nil, udp_port: 6}, hash: <<2>>, timestamp: 3})
<<2, 204, 201, 132, 5, 6, 7, 8, 130, 0, 6, 128, 2, 3>>
"""
@spec encode(t) :: binary()
def encode(message) do
<<message.__struct__.message_id()>> <> message.__struct__.encode(message)
end
@doc """
Recovers the public key of a signer, given a message and its
signature.
## Examples
iex> pong = %ExWire.Message.Pong{
...> hash: <<186, 68, 151, 218, 158, 6, 56, 106, 29, 48, 177, 35, 29, 4, 114, 189,
...> 50, 66, 82, 184, 158, 70, 161, 192, 157, 133, 159, 214, 98, 85, 140, 136>>,
...> timestamp: 1526987786,
...> to: %ExWire.Struct.Endpoint{
...> ip: [52, 176, 100, 77],
...> tcp_port: nil,
...> udp_port: 30303
...> }
...> }
iex> signature = <<193, 30, 149, 122, 226, 192, 230, 158, 118, 204, 173, 80, 63,
...> 232, 67, 152, 216, 249, 89, 52, 162, 92, 233, 201, 177, 108, 63, 120, 152,
...> 134, 149, 220, 73, 198, 29, 93, 218, 123, 50, 70, 8, 202, 17, 171, 67, 245,
...> 70, 235, 163, 158, 201, 246, 223, 114, 168, 7, 7, 95, 9, 53, 165, 8, 177,
...> 13>>
iex> ExWire.Message.recover_public_key(pong, signature, 1)
<<134, 90, 99, 37, 91, 59, 182, 128, 35, 182, 191, 253, 80, 149, 17, 143, 204,
19, 231, 157, 207, 1, 79, 228, 228, 126, 6, 92, 53, 12, 124, 199, 42, 242,
229, 62, 255, 137, 95, 17, 186, 27, 187, 106, 43, 51, 39, 28, 17, 22, 238,
135, 15, 38, 102, 24, 234, 223, 194, 231, 138, 167, 52, 156>>
"""
@spec recover_public_key(t() | binary(), binary(), integer()) :: Key.public_key()
def recover_public_key(message, signature, recovery_id) when is_binary(message) do
message
|> Crypto.hash()
|> Crypto.recover_public_key(signature, recovery_id)
end
def recover_public_key(message, signature, recovery_id) do
message
|> encode()
|> recover_public_key(signature, recovery_id)
end
end | apps/ex_wire/lib/ex_wire/message.ex | 0.880451 | 0.410254 | message.ex | starcoder |
defmodule Snitch.Domain.Order do
@moduledoc """
Order helpers.
"""
@editable_states ~w(cart address delivery payment)a
use Snitch.Domain
import Ecto.Changeset
import Ecto.Query
alias Snitch.Data.Schema.{Order, Package, Payment}
alias Snitch.Data.Model.{Product, Image}
alias Snitch.Data.Model.GeneralConfiguration, as: GCModel
@spec validate_change(Ecto.Changeset.t()) :: Ecto.Changeset.t()
def validate_change(%{valid?: false} = changeset), do: changeset
def validate_change(%{valid?: true} = changeset) do
prepare_changes(changeset, fn changeset ->
with {_, order_id} <- fetch_field(changeset, :order_id),
%Order{state: order_state} <- Repo.get(Order, order_id) do
if order_state in @editable_states do
changeset
else
add_error(changeset, :order, "has been frozen", validation: :state, state: order_state)
end
else
_ ->
changeset
end
end)
end
@doc """
Returns the sum of all `payments` for the order that are in the supplied
payment `state`.
"""
@spec payments_total(Order.t(), String.t()) :: Money.t()
def payments_total(order, payment_state) do
currency = GCModel.fetch_currency()
query =
from(
payment in Payment,
where: payment.state == ^payment_state
)
order = Repo.preload(order, payments: query)
order.payments
|> Enum.reduce(Money.new(currency, 0), fn payment, acc ->
Money.add!(acc, payment.amount)
end)
|> Money.round(currency_digits: :cash)
end
@doc """
Returns the total cost for an `order`.
The total for an `order` depends on the `state` the order is in.
If the order is in the `cart` or `address` state, the total cost is the sum
of the individual line item costs.
If the order is in a later state such as `delivery` or `payment`, the total
also includes the shipment cost, shipping taxes, and other taxes on
individual line items.
"""
@spec total_amount(Order.t()) :: Money.t()
def total_amount(%Order{state: state} = order) when state in [:cart, :address] do
order = Repo.preload(order, :line_items)
total = line_item_total(order)
total
end
def total_amount(%Order{} = order) do
order = Repo.preload(order, [:line_items, packages: :items])
currency = GCModel.fetch_currency()
total =
Money.add!(
line_item_total(order),
packages_total_cost(order.packages, currency)
)
Money.round(total, currency_digits: :cash)
end
def line_item_total(order) do
currency = GCModel.fetch_currency()
order.line_items
|> Enum.reduce(Money.new(currency, 0), fn line_item, acc ->
{:ok, total} = Money.mult(line_item.unit_price, line_item.quantity)
{:ok, acc} = Money.add(acc, total)
acc
end)
|> Money.round(currency_digits: :cash)
end
defp packages_total_cost(packages, currency) do
zero_money = Money.new(currency, 0)
packages
|> Enum.reduce(zero_money, fn %{
items: items,
shipping_tax: shipping_tax,
cost: cost
},
acc ->
acc
|> Money.add!(shipping_tax || zero_money)
|> Money.add!(cost || zero_money)
|> Money.add!(package_items_total_cost(items, currency) || zero_money)
end)
|> Money.round(currency_digits: :cash)
end
# TODO: handle shipping tax for items.
# At present, updating taxes for package items when persisting shipping
# preferences (transition function) is not handled.
defp package_items_total_cost(package_items, currency) do
Enum.reduce(package_items, Money.new(currency, 0), fn %{shipping_tax: shipping_tax, tax: tax},
acc ->
shipping_tax = shipping_tax || Money.new!(currency, 0)
tax = tax || Money.new!(currency, 0)
acc
|> Money.add!(shipping_tax)
|> Money.add!(tax)
end)
end
def total_tax(packages) do
currency = GCModel.fetch_currency()
packages
|> Enum.reduce(Money.new(currency, 0), fn %{
items: items,
shipping_tax: shipping_tax
},
acc ->
acc
|> Money.add!(shipping_tax)
|> Money.add!(package_items_total_cost(items, currency))
end)
|> Money.round(currency_digits: :cash)
end
def shipping_total(packages) do
currency = GCModel.fetch_currency()
packages
|> Enum.reduce(Money.new(currency, 0), fn %{cost: cost}, acc ->
Money.add!(acc, cost)
end)
|> Money.round(currency_digits: :cash)
end
def fetch_image_url(line_item) do
image = line_item.product.images |> List.first()
Image.image_url(image.name, line_item.product)
end
def format_date(date) do
date
|> NaiveDateTime.to_erl()
|> form_date
end
defp form_date({{year, month, date}, _}) do
"#{date}/#{month}/#{year}"
end
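# Example: the two helpers above render dates as day/month/year with no
# zero padding:
#
#   format_date(~N[2021-05-03 10:00:00]) #=> "3/5/2021"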
def line_items_count(order) do
length(order.line_items)
end
def order_package_delivered?(order) do
order = Repo.preload(order, :packages)
Enum.all?(order.packages, fn package ->
package.state == :delivered
end)
end
end | apps/snitch_core/lib/core/domain/order/order.ex | 0.781539 | 0.445831 | order.ex | starcoder |
defmodule Mix.Tasks.NervesHub.Device do
use Mix.Task
import Mix.NervesHubCLI.Utils
alias Mix.NervesHubCLI.Shell
@shortdoc "Manages your NervesHub devices"
@moduledoc """
Manage your NervesHub devices.
## create
Create a new NervesHub device. The shell will prompt for information about the
device. This information can be passed by specifying one or all of the command
line options.
mix nerves_hub.device create
### Command-line options
* `--identifier` - (Optional) The device identifier
* `--description` - (Optional) The description of the device
* `--tag` - (Optional) Multiple tags can be set by passing this key multiple
times
## update
Update values on a device.
### Examples
List all devices
mix nerves_hub.device list
Update device tags
mix nerves_hub.device update 1234 tags dev qa
## delete
Delete a device on NervesHub
mix nerves_hub.device delete DEVICE_IDENTIFIER
## burn
Combine a firmware image with NervesHub provisioning information and burn the
result to an attached MicroSD card or file. This requires that the device
was already created. Calling burn without passing command-line options will
generate a new cert pair for the device. The command will end with calling
mix firmware.burn.
mix nerves_hub.device burn DEVICE_IDENTIFIER
### Command-line options
* `--cert` - (Optional) A path to an existing device certificate
* `--key` - (Optional) A path to an existing device private key
* `--path` - (Optional) The path to put the device certificates
## cert list
List all certificates for a device.
mix nerves_hub.device cert list DEVICE_IDENTIFIER
## cert create
Creates a new device certificate pair. The certificates will be placed in the
current working directory if no path is specified.
mix nerves_hub.device cert create DEVICE_IDENTIFIER
### Command-line options
* `--path` - (Optional) A local location for storing certificates
"""
@switches [
org: :string,
path: :string,
identifier: :string,
description: :string,
tag: :keep,
key: :string,
cert: :string
]
@data_dir "nerves-hub"
@spec run([String.t()]) :: :ok | no_return()
def run(args) do
Application.ensure_all_started(:nerves_hub_cli)
{opts, args} = OptionParser.parse!(args, strict: @switches)
show_api_endpoint()
org = org(opts)
case args do
["list"] ->
list(org, opts)
["create"] ->
create(org, opts)
["delete", identifier] ->
delete(org, identifier)
["burn", identifier] ->
burn(identifier, opts)
["cert", "list", device] ->
cert_list(org, device)
["cert", "create", device] ->
cert_create(org, device, opts)
["update", identifier | update_data] ->
update(org, identifier, update_data)
_ ->
render_help()
end
end
@spec render_help() :: no_return()
def render_help() do
Shell.raise("""
Invalid arguments to `mix nerves_hub.device`.
Usage:
mix nerves_hub.device list
mix nerves_hub.device create
mix nerves_hub.device update KEY VALUE
mix nerves_hub.device delete DEVICE_IDENTIFIER
mix nerves_hub.device burn DEVICE_IDENTIFIER
mix nerves_hub.device cert list DEVICE_IDENTIFIER
mix nerves_hub.device cert create DEVICE_IDENTIFIER
Run `mix help nerves_hub.device` for more information.
""")
end
@spec list(String.t(), keyword()) :: :ok
def list(org, _opts) do
auth = Shell.request_auth()
case NervesHubUserAPI.Device.list(org, auth) do
{:ok, %{"data" => devices}} ->
Shell.info(render_devices(org, devices))
Shell.info("Total devices: #{Enum.count(devices)}")
error ->
Shell.render_error(error)
end
end
@spec create(String.t(), keyword()) :: :ok
def create(org, opts) do
identifier = opts[:identifier] || Shell.prompt("Identifier (e.g., serial number):")
description = opts[:description] || Shell.prompt("Description:")
# Tags may be specified using multiple `--tag` options or as `--tag "a, b, c"`
tags = Keyword.get_values(opts, :tag) |> Enum.flat_map(&split_tag_string/1)
tags =
if tags == [] do
Shell.prompt("One or more comma-separated tags:")
|> split_tag_string()
else
tags
end
auth = Shell.request_auth()
case NervesHubUserAPI.Device.create(org, identifier, description, tags, auth) do
{:ok, %{"data" => %{} = _device}} ->
Shell.info("""
Device #{identifier} created.
If your device has an ATECCx08A module or NervesKey that has been
provisioned by a CA/signing certificate known to NervesHub, it is
ready to go.
If not using a hardware module to protect the device's private
key, create and register a certificate and key pair manually by
running:
mix nerves_hub.device cert create #{identifier}
""")
error ->
Shell.render_error(error)
end
end
@spec update(String.t(), String.t(), [String.t()]) :: :ok
def update(org, identifier, ["tags" | tags]) do
# Split up tags with comma separators
tags = Enum.flat_map(tags, &split_tag_string/1)
auth = Shell.request_auth()
case NervesHubUserAPI.Device.update(org, identifier, %{tags: tags}, auth) do
{:ok, %{"data" => %{} = _device}} ->
Shell.info("Device #{identifier} updated")
error ->
Shell.render_error(error)
end
end
def update(org, identifier, [key, value]) do
auth = Shell.request_auth()
case NervesHubUserAPI.Device.update(org, identifier, %{key => value}, auth) do
{:ok, %{"data" => %{} = _device}} ->
Shell.info("Device #{identifier} updated")
error ->
Shell.render_error(error)
end
end
def update(_org, _identifier, data) do
Shell.render_error("Unable to update data: #{inspect(data)}")
end
@spec delete(String.t(), String.t()) :: :ok
def delete(org, identifier) do
auth = Shell.request_auth()
case NervesHubUserAPI.Device.delete(org, identifier, auth) do
{:ok, _} ->
Shell.info("Device #{identifier} deleted")
error ->
Shell.render_error(error)
end
end
@spec burn(String.t(), keyword()) :: :ok
def burn(identifier, opts) do
path = opts[:path] || Path.join(File.cwd!(), @data_dir)
cert_path = opts[:cert]
key_path = opts[:key]
{cert_path, key_path} =
if key_path == nil and cert_path == nil do
cert_path = Path.join(path, identifier <> "-cert.pem")
key_path = Path.join(path, identifier <> "-key.pem")
unless File.exists?(key_path) and File.exists?(cert_path) do
Shell.raise("""
A private key and certificate for #{identifier}
do not exist at path #{path}.
To generate certificates for #{identifier}
mix nerves_hub.device cert create #{identifier}
""")
end
{cert_path, key_path}
else
if key_path == nil or cert_path == nil do
Shell.raise("Must specify both --key and --cert")
end
{cert_path, key_path}
end
Shell.info("Burning firmware")
System.put_env("NERVES_SERIAL_NUMBER", identifier)
System.put_env("NERVES_HUB_CERT", File.read!(cert_path))
System.put_env("NERVES_HUB_KEY", File.read!(key_path))
Mix.Task.run("burn", [])
end
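# Typical flow (illustrative):
#
#   mix nerves_hub.device cert create 1234
#   mix nerves_hub.device burn 1234
#
# With default options, the burn step reads nerves-hub/1234-cert.pem and
# nerves-hub/1234-key.pem, exports NERVES_SERIAL_NUMBER, NERVES_HUB_CERT and
# NERVES_HUB_KEY, then delegates to the "burn" Mix task.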
@spec cert_list(String.t(), String.t()) :: :ok
def cert_list(org, identifier) do
auth = Shell.request_auth()
case NervesHubUserAPI.Device.cert_list(org, identifier, auth) do
{:ok, %{"data" => certs}} ->
render_certs(identifier, certs)
error ->
Shell.render_error(error)
end
end
@spec cert_create(String.t(), String.t(), keyword(), nil | NervesHubUserAPI.Auth.t()) :: :ok
def cert_create(org, identifier, opts, auth \\ nil) do
Shell.info("Creating certificate for #{identifier}")
path = opts[:path] || Path.join(File.cwd!(), @data_dir)
File.mkdir_p(path)
auth = auth || Shell.request_auth()
key = X509.PrivateKey.new_ec(:secp256r1)
pem_key = X509.PrivateKey.to_pem(key)
csr = X509.CSR.new(key, "/O=#{org}/CN=#{identifier}")
pem_csr = X509.CSR.to_pem(csr)
with safe_csr <- Base.encode64(pem_csr),
{:ok, %{"data" => %{"cert" => cert}}} <-
NervesHubUserAPI.Device.cert_sign(org, identifier, safe_csr, auth),
:ok <- File.write(Path.join(path, "#{identifier}-cert.pem"), cert),
:ok <- File.write(Path.join(path, "#{identifier}-key.pem"), pem_key) do
Shell.info("Finished")
:ok
else
error ->
Shell.render_error(error)
end
end
defp render_certs(identifier, certs) when is_list(certs) do
Shell.info("\nDevice: #{identifier}")
Shell.info("Certificates:")
Enum.each(certs, fn params ->
Shell.info("------------")
render_certs(identifier, params)
|> String.trim_trailing()
|> Shell.info()
end)
Shell.info("------------")
Shell.info("")
end
defp render_certs(_identifier, params) do
{:ok, not_before, _} = DateTime.from_iso8601(params["not_before"])
{:ok, not_after, _} = DateTime.from_iso8601(params["not_after"])
"""
serial: #{params["serial"]}
validity: #{DateTime.to_date(not_before)} - #{DateTime.to_date(not_after)} UTC
"""
end
defp render_devices(org, devices) do
title = "Devices for #{org}"
header = ["Identifier", "Tags", "Version", "Status", "Last connected", "Description"]
rows =
Enum.map(devices, fn device ->
[
device["identifier"],
Enum.join(device["tags"], ", "),
device["version"],
device["status"],
device["last_communication"],
device["description"]
]
end)
TableRex.quick_render!(rows, header, title)
end
end | lib/mix/tasks/nerves_hub.device.ex | 0.835618 | 0.427217 | nerves_hub.device.ex | starcoder |
defmodule Serex.Matcher do
@moduledoc """
This module provides functions used to match regular expressions against text.
"""
@doc """
Searches for a regular expression match in supplied text and returns true if found.
"""
def match(regex, text) when is_binary(regex) and is_binary(text) do
tokens = Serex.Lexer.lex(regex)
chars = String.graphemes(text)
case List.first(tokens) do
{:bol, nil} ->
match_here(Enum.drop(tokens, 1), chars)
nil ->
match_here(tokens, [])
_ ->
match_incremental(tokens, chars)
end
end
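# Usage sketch (assumes the lexer emits {:char, grapheme}, {:star, nil},
# {:wildcard, nil}, {:bol, nil} and {:eol, nil} tokens, as pattern-matched
# below):
#
#   Serex.Matcher.match("ab*c", "xxabbbc") #=> true  (unanchored search)
#   Serex.Matcher.match("^a.c$", "abc")    #=> true
#   Serex.Matcher.match("^ab$", "abc")     #=> false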
# Search for a regex match incrementally through a supplied list of chars
defp match_incremental(_tokens, [] = _chars), do: false
defp match_incremental(tokens, chars) when is_list(tokens) and is_list(chars) and length(chars) > 0 do
case match_here(tokens, chars) do
true ->
true
_ ->
match_incremental(tokens, Enum.drop(chars, 1))
end
end
# Search for a regex match at the start of a supplied list of chars
defp match_here([] = _tokens, _chars), do: true
defp match_here([{:eol, nil} | []] = _tokens, chars) when is_list(chars), do: length(chars) == 0
defp match_here([_token_head | token_tail] = tokens, [] = _chars) when is_list(tokens) do
cond do
List.first(token_tail) == {:star, nil} ->
match_here(Enum.drop(token_tail, 1), [])
true ->
false
end
end
defp match_here([token_head | token_tail], [char_head | char_tail] = chars) do
{token_type, token_value} = token_head
cond do
List.first(token_tail) == {:star, nil} ->
match_star(token_head, Enum.drop(token_tail, 1), chars)
token_type == :wildcard ->
match_here(token_tail, char_tail)
token_value == char_head ->
match_here(token_tail, char_tail)
true ->
false
end
end
# Search for zero or more instances of a character at the start of a supplied list of chars
defp match_star(_token , tokens, [] = _chars) when is_list(tokens) do
match_here(tokens, [])
end
defp match_star({:wildcard, nil} = token, tokens, [_char_head | char_tail] = chars) when is_list(tokens) do
case match_here(tokens, chars) do
true ->
true
_ ->
match_star(token, tokens, char_tail)
end
end
defp match_star({:char, token_value} = token, tokens, [char_head | char_tail] = chars) when is_list(tokens) do
cond do
match_here(tokens, chars) == true ->
true
char_head == token_value ->
match_star(token, tokens, char_tail)
true ->
false
end
end
end | lib/matcher.ex | 0.656218 | 0.586168 | matcher.ex | starcoder |
defimpl Charts.StackedColumnChart, for: Charts.BaseChart do
alias Charts.BaseChart
alias Charts.StackedColumnChart.{MultiColumn, Rectangle}
alias Charts.ColumnChart.Dataset
def columns(%BaseChart{dataset: nil}), do: []
def columns(%BaseChart{dataset: dataset}), do: columns(dataset)
def columns(%Dataset{data: []}), do: []
def columns(%Dataset{data: data, axes: %{magnitude_axis: %{max: max}}}) do
width = 100.0 / Enum.count(data)
margin = width / 4.0
data
|> Enum.with_index()
|> Enum.map(fn {datum, index} ->
offset = index * width
column_height = (Map.values(datum.values) |> Enum.sum()) / max * 100
%MultiColumn{
width: width,
column_height: column_height,
offset: offset,
label: datum.name,
column_width: width / 2.0,
column_offset: offset + margin,
parts: datum.values
}
end)
end
def rectangles(chart) do
chart
|> columns()
|> rectangles_from_columns()
end
defp rectangles_from_columns([]), do: []
defp rectangles_from_columns(multi_columns) do
multi_columns
|> Enum.flat_map(&build_rectangles_for_column(&1))
end
defp build_rectangles_for_column(column) do
column.parts
|> Enum.reject(fn {_color, height} -> height == 0 end)
|> Enum.reduce([], fn {color, height}, acc ->
percentage = height / Enum.sum(Map.values(column.parts)) * 100
rectangle_height = percentage / 100 * column.column_height
case acc do
[previous | _rectangles] ->
new_rectangle = %Rectangle{
x_offset: column.column_offset,
y_offset: previous.y_offset - rectangle_height,
fill_color: color,
width: column.width,
height: rectangle_height,
label: height
}
[new_rectangle | acc]
[] ->
new_rectangle = %Rectangle{
x_offset: column.column_offset,
y_offset: 100 - rectangle_height,
fill_color: color,
width: column.width,
height: rectangle_height,
label: height
}
[new_rectangle]
end
end)
end
end
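As a concrete check of the geometry computed in `columns/1` above, here is a worked example with two data points and a magnitude-axis max of 100. The data shape is a minimal sketch of what a `Dataset` datum holds.

```elixir
data = [
  %{name: "A", values: %{red: 30, blue: 20}},
  %{name: "B", values: %{red: 60, blue: 40}}
]
max = 100

width = 100.0 / Enum.count(data)  # 50.0 — each datum gets an equal slice of the chart
margin = width / 4.0              # 12.5 — padding inside each slice

# Geometry for the second datum (index 1):
offset = 1 * width                                                       # 50.0
column_height = Enum.sum(Map.values(%{red: 60, blue: 40})) / max * 100   # 100.0
column_offset = offset + margin                                          # 62.5
column_width = width / 2.0                                               # 25.0
```

The stacked rectangles for that column then carve `column_height` proportionally: red gets 60/100 of it and blue 40/100, matching the percentage calculation in `build_rectangles_for_column/1`.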
defmodule Confispex.Schema do
@moduledoc """
## Example
defmodule MyApp.RuntimeConfigSchema do
import Confispex.Schema
@behaviour Confispex.Schema
alias Confispex.Type
defvariables(%{
"RUNTIME_CONFIG_REPORT" => %{
cast: {Type.Enum, values: ["disabled", "detailed", "brief"]},
default: "disabled",
groups: [:misc]
},
"TZDATA_AUTOUPDATE_ENABLED" => %{
doc: "Autoupdate timezones from IANA Time Zone Database",
cast: Type.Boolean,
default: "false",
groups: [:base],
context: [env: [:dev, :prod]]
},
"LOG_LEVEL" => %{
cast:
{Type.Enum,
values: [
"emergency",
"alert",
"critical",
"error",
"warning",
"notice",
"info",
"debug",
"none"
]},
default_lazy: fn
%{env: :test} -> "warning"
%{env: :dev} -> "debug"
%{env: :prod} -> "debug"
end,
groups: [:base]
},
"DATABASE_URL" => %{
aliases: ["DB_URL"],
doc: "Full DB URL",
cast: Type.URL,
context: [env: [:prod]],
groups: [:primary_db],
required: [:primary_db]
},
"DATABASE_POOL_SIZE" => %{
aliases: ["DB_POOL_SIZE", "POOL_SIZE"],
cast: {Type.Integer, scope: :positive},
default: "10",
context: [env: [:prod]],
groups: [:primary_db]
}
})
end
"""
@type variable_name :: term()
@typedoc """
A spec for a single variable
* `:cast` - describes how value should be cast.
* `:groups` - a list of groups which are affected by variable.
* `:doc` - a description about variable, shown in generated `.envrc` file.
* `:default` - default value. Must be set in raw format. Raw format was chosen to populate the `.envrc` file with default values.
* `:default_lazy` - default value based on the given context. Useful when the default value must differ between environments. Cannot be used alongside the `:default` parameter. Return `nil` if the default value should be ignored.
* `:template_value_generator` - a function that is used in `confispex.gen.envrc_template` mix task to generate a value for a variable. Such value will always be uncommented even if it is not required. This is useful for variables like "SECRET_KEY_BASE" which should be generated only once.
* `:required` - a list of groups, or a function that returns a list of groups, in which the variable is required. When all required variables of a group are cast successfully, the group is considered ready for use.
* `:context` - specifies context in which variable is used.
* `:aliases` - a list of alias names.
"""
@type variable_spec :: %{
required(:cast) => module() | {module(), opts :: keyword()},
required(:groups) => [atom()],
optional(:doc) => String.t(),
optional(:default) => String.t(),
optional(:default_lazy) => (Confispex.context() -> String.t() | nil),
optional(:template_value_generator) => (() -> String.t()),
optional(:required) => [atom()] | (Confispex.context() -> [atom()]),
optional(:context) => [{atom(), atom()}],
optional(:aliases) => [variable_name()]
}
@callback variables_schema() :: %{variable_name() => variable_spec()}
@doc """
A helper which performs basic validations of the input schema and then defines `variables_schema/0` function.
"""
defmacro defvariables(variables) do
quote do
validate_variables!(unquote(variables))
@impl unquote(__MODULE__)
def variables_schema do
unquote(variables)
end
end
end
@doc false
def validate_variables!(variables) when is_map(variables) do
Enum.each(variables, fn {variable_name, spec} ->
assert(spec[:cast], "param :cast is required", variable_name)
assert(spec[:groups], "param :groups is required", variable_name)
assert(is_list(spec[:groups]), "param :groups must be a list", variable_name)
assert(
is_nil(spec[:default]) or is_nil(spec[:default_lazy]),
"param :default cannot be used with :default_lazy",
variable_name
)
assert(
is_nil(spec[:required]) or is_nil(spec[:default]),
"param :default cannot be used with :required",
variable_name
)
assert(
not Map.has_key?(spec, :required) or is_list(spec.required) or
is_function(spec.required, 1),
"param :required must be a list or function with arity 1",
variable_name
)
assert(
not Map.has_key?(spec, :aliases) or is_list(spec.aliases),
"param :aliases must be a list",
variable_name
)
assert(
not Map.has_key?(spec, :template_value_generator) or
is_function(spec.template_value_generator, 0),
"param :template_value_generator must be a function with arity 0",
variable_name
)
assert(
not Map.has_key?(spec, :default_lazy) or
is_function(spec.default_lazy, 1),
"param :default_lazy must be a function with arity 1",
variable_name
)
end)
end
defp assert(condition, msg, variable_name) do
if condition do
:ok
else
raise ArgumentError, "Assertion failed for #{variable_name}: " <> msg
end
end
@doc false
def variable_required?(spec, group, context) do
case spec[:required] do
nil -> false
required when is_list(required) -> group in required
required when is_function(required, 1) -> group in required.(context)
end
end
@doc false
def variables_in_context(variables_schema, context) do
Enum.filter(variables_schema, fn {_variable_name, spec} -> spec_in_context?(spec, context) end)
end
defp spec_in_context?(spec, context) do
case Map.fetch(spec, :context) do
{:ok, context_spec} ->
Enum.all?(context_spec, fn {context_key, allowed_values} ->
Map.fetch!(context, context_key) in allowed_values
end)
:error ->
true
end
end
@doc false
def grouped_variables(variables_schema, context) do
variables_schema
|> variables_in_context(context)
|> Enum.flat_map(fn {_variable_name, spec} = item ->
Enum.map(spec.groups, &{&1, item})
end)
|> Enum.group_by(&elem(&1, 0), &elem(&1, 1))
end
end
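The context rule that `variables_in_context/2` applies (via the private `spec_in_context?/2`) can be sketched standalone — this anonymous function reproduces the same logic for illustration:

```elixir
# A spec with no :context always matches; otherwise every {key, allowed_values}
# pair must contain the context's value for that key.
spec_in_context? = fn spec, context ->
  case Map.fetch(spec, :context) do
    {:ok, context_spec} ->
      Enum.all?(context_spec, fn {key, allowed} -> Map.fetch!(context, key) in allowed end)

    :error ->
      true
  end
end

prod_only = spec_in_context?.(%{context: [env: [:prod]]}, %{env: :dev})         # false
dev_or_prod = spec_in_context?.(%{context: [env: [:dev, :prod]]}, %{env: :dev}) # true
no_context = spec_in_context?.(%{groups: [:base]}, %{env: :dev})                # true
```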
defmodule MssqlexV3 do
@moduledoc """
Interface for interacting with MS SQL Server via an ODBC driver for Elixir.
It implements `DBConnection` behaviour, using `:odbc` to connect to the
system's ODBC driver. Requires MS SQL Server ODBC driver, see
[README](readme.html) for installation instructions.
"""
alias MssqlexV3.Query
@type conn :: DBConnection.conn
@doc """
Connect to a MS SQL Server using ODBC.
`opts` expects a keyword list with zero or more of:
* `:odbc_driver` - The driver the adapter will use.
* environment variable: `MSSQL_DVR`
* default value: {ODBC Driver 17 for SQL Server}
* `:hostname` - The server hostname.
* environment variable: `MSSQL_HST`
* default value: localhost
* `:instance_name` - OPTIONAL. The name of the instance, if using named instances.
* environment variable: `MSSQL_IN`
* `:port` - OPTIONAL. The server port number.
* environment variable: `MSSQL_PRT`
* `:database` - The name of the database.
* environment variable: `MSSQL_DB`
* `:username` - Username.
* environment variable: `MSSQL_UID`
* `:password` - Password.
* environment variable: `MSSQL_PWD`
* `:encrypt` - Specifies whether data should be encrypted before sending it over the network.
* environment variable: `MSSQL_ENCRYPT`
* `:trust_server_certificate` - When used with Encrypt, enables encryption using a self-signed server certificate.
* environment variable: `MSSQL_TRUST_SERVER_CERT`
`MssqlexV3` uses the `DBConnection` framework and supports all `DBConnection`
options like `:idle`, `:after_connect` etc.
See `DBConnection.start_link/2` for more information.
## Examples
iex> {:ok, pid} = MssqlexV3.start_link(database: "mr_microsoft")
{:ok, #PID<0.70.0>}
"""
@spec start_link(Keyword.t()) :: {:ok, pid}
def start_link(opts) do
DBConnection.start_link(MssqlexV3.Protocol, opts)
end
@doc """
Executes a query against an MS SQL Server with ODBC.
`conn` expects a `MssqlexV3` process identifier.
`statement` expects a SQL query string.
`params` expects a list of values in one of the following formats:
* Strings with only valid ASCII characters, which will be sent to the
database as strings.
* Other binaries, which will be converted to UTF16 Little Endian binaries
(which is what SQL Server expects for its unicode fields).
* `Decimal` structs, which will be encoded as strings so they can be
sent to the database with arbitrary precision.
* Integers, which will be sent as-is if under 10 digits or encoded
as strings for larger numbers.
* Floats, which will be encoded as strings.
* Time as `{hour, minute, sec, usec}` tuples, which will be encoded as
strings.
* Dates as `{year, month, day}` tuples, which will be encoded as strings.
* Datetime as `{{hour, minute, sec, usec}, {year, month, day}}` tuples which
will be encoded as strings. Note that attempting to insert a value with
usec > 0 into a 'datetime' or 'smalldatetime' column is an error since
those column types don't have enough precision to store usec data.
`opts` expects a keyword list with zero or more of:
* `:preserve_encoding`: If `true`, doesn't convert returned binaries from
UTF16LE to UTF8. Default: `false`.
* `:mode` - set to `:savepoint` to use a savepoint to rollback to before the
query on error, otherwise set to `:transaction` (default: `:transaction`);
Result values will be encoded according to the following conversions:
* char and varchar: strings.
* nchar and nvarchar: strings unless `:preserve_encoding` is set to `true`
in which case they will be returned as UTF16 Little Endian binaries.
* int, smallint, tinyint, decimal and numeric when precision < 10 and
scale = 0 (i.e. effectively integers): integers.
* float, real, double precision, decimal and numeric when precision between
10 and 15 and/or scale between 1 and 15: `Decimal` structs.
* bigint, money, decimal and numeric when precision > 15: strings.
* date: `{year, month, day}`
* smalldatetime, datetime, datetime2: `{{YY, MM, DD}, {HH, MM, SS, 0}}` (note that fractional
second data is lost due to limitations of the ODBC adapter. To preserve it
you can convert these columns to varchar during selection.)
* uniqueidentifier, time, binary, varbinary, rowversion: not currently
supported due to adapter limitations. Select statements for columns
of these types must convert them to supported types (e.g. varchar).
"""
@spec query(conn, iodata, list, Keyword.t()) :: {:ok, MssqlexV3.Result.t()} | {:error, Exception.t()}
def query(conn, statement, params, opts \\ []) do
if name = Keyword.get(opts, :cache_statement) do
query = %Query{name: name, cache: :statement, statement: IO.iodata_to_binary(statement)}
case DBConnection.prepare_execute(conn, query, params, opts) do
{:ok, _, result} ->
{:ok, result}
{:error, %MssqlexV3.Error{mssql: %{code: :feature_not_supported}}} = error ->
with %DBConnection{} <- conn,
:error <- DBConnection.status(conn) do
error
else
_ -> query_prepare_execute(conn, query, params, opts)
end
{:error, _} = error ->
error
end
else
query_prepare_execute(conn, %Query{name: "", statement: statement}, params, opts)
end
end
@doc """
Runs an (extended) query and returns the result or raises `MssqlexV3.Error` if
there was an error. See `query/3`.
"""
@spec query!(conn, iodata, list, Keyword.t()) :: MssqlexV3.Result.t()
def query!(conn, statement, params, opts \\ []) do
case query(conn, statement, params, opts) do
{:ok, result} -> result
{:error, err} -> raise err
end
end
@doc """
Runs an (extended) prepared query.
It returns the result as `{:ok, %MssqlexV3.Query{}, %MssqlexV3.Result{}}` or
`{:error, %MssqlexV3.Error{}}` if there was an error. Parameters are given as
part of the prepared query, `%MssqlexV3.Query{}`.
See the README for information on how MssqlexV3 encodes and decodes Elixir
values by default. See `MssqlexV3.Query` for the query data and
`MssqlexV3.Result` for the result data.
## Options
* `:queue` - Whether to wait for connection in a queue (default: `true`);
* `:timeout` - Execute request timeout (default: `#{@timeout}`);
* `:decode_mapper` - Fun to map each row in the result to a term after
decoding, (default: `fn x -> x end`);
* `:mode` - set to `:savepoint` to use a savepoint to rollback to before the
execute on error, otherwise set to `:transaction` (default: `:transaction`);
## Examples
query = MssqlexV3.prepare!(conn, "", "CREATE TABLE posts (id serial, title text)")
MssqlexV3.execute(conn, query, [])
query = MssqlexV3.prepare!(conn, "", "SELECT id FROM posts WHERE title like $1")
MssqlexV3.execute(conn, query, ["%my%"])
"""
@spec execute(conn, MssqlexV3.Query.t, list, Keyword.t) ::
{:ok, MssqlexV3.Query.t, MssqlexV3.Result.t} | {:error, MssqlexV3.Error.t}
def execute(conn, query, params, opts \\ []) do
DBConnection.execute(conn, query, params, opts)
end
@doc """
Runs an (extended) prepared query and returns the result or raises
`MssqlexV3.Error` if there was an error. See `execute/4`.
"""
@spec execute!(conn, MssqlexV3.Query.t, list, Keyword.t) :: MssqlexV3.Result.t
def execute!(conn, query, params, opts \\ []) do
DBConnection.execute!(conn, query, params, opts)
end
defp query_prepare_execute(conn, query, params, opts) do
case DBConnection.prepare_execute(conn, query, params, opts) do
{:ok, _, result} -> {:ok, result}
{:error, _} = error -> error
end
end
@spec prepare_execute(conn, iodata, iodata, list, Keyword.t) ::
{:ok, MssqlexV3.Query.t, MssqlexV3.Result.t} | {:error, MssqlexV3.Error.t}
def prepare_execute(conn, name, statement, params, opts \\ []) do
query = %Query{name: name, statement: statement}
DBConnection.prepare_execute(conn, query, params, opts)
end
@doc """
Prepares and runs a query and returns the result or raises
`MssqlexV3.Error` if there was an error. See `prepare_execute/5`.
"""
@spec prepare_execute!(conn, iodata, iodata, list, Keyword.t) ::
{MssqlexV3.Query.t, MssqlexV3.Result.t}
def prepare_execute!(conn, name, statement, params, opts \\ []) do
query = %Query{name: name, statement: statement}
DBConnection.prepare_execute!(conn, query, params, opts)
end
@doc """
Returns a supervisor child specification for a DBConnection pool.
"""
@spec child_spec(Keyword.t) :: Supervisor.Spec.spec
def child_spec(opts) do
opts = MssqlexV3.Utils.default_opts(opts)
DBConnection.child_spec(MssqlexV3.Protocol, opts)
end
end
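The UTF-16 Little Endian conversion described in the `query/4` docs for non-ASCII binaries can be illustrated with the standard `:unicode` module. This is a sketch of the encoding rule, not MssqlexV3's internal code.

```elixir
# "é" is outside ASCII, so the whole binary would be sent as UTF-16LE.
utf16le = :unicode.characters_to_binary("héllo", :utf8, {:utf16, :little})
# Each code point becomes two bytes, least significant first:
# "h" -> <<0x68, 0x00>>, "é" -> <<0xE9, 0x00>>, ...
```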
defmodule Recurly.AddOn do
@moduledoc """
Module for handling plan addons in Recurly.
See the [developer docs on plan addons](https://dev.recurly.com/docs/plan-add-ons-object)
for more details
"""
use Recurly.Resource
alias Recurly.{Resource,AddOn,Money,Plan}
@endpoint "/plans/<%= plan_code %>/add_ons"
schema :add_on do
field :accounting_code, :string
field :add_on_code, :string
field :add_on_type, :string
field :created_at, :date_time, read_only: true
field :default_quantity, :integer
field :display_quantity_on_hosted_page, :boolean
field :measured_unit_id, :string
field :name, :string
field :optional, :boolean
field :plan, Plan, read_only: true
field :revenue_schedule_type, :string
field :tax_code, :string
field :unit_amount_in_cents, Money
field :updated_at, :date_time, read_only: true
field :usage_percentage, :string
field :usage_type, :string
end
@doc """
Creates a plan addon from a changeset.
## Parameters
- `changeset` Keyword list changeset. This must include a `plan_code` key,
the string plan code of the parent plan.
## Examples
```
alias Recurly.ValidationError
changeset = [
plan_code: "gold",
add_on_code: "ipaddresses",
name: "Extra IP Addresses",
unit_amount_in_cents: [
usd: 200
]
]
case Recurly.AddOn.create(changeset) do
{:ok, addon} ->
# created the addon
{:error, %ValidationError{errors: errors}} ->
# will give you a list of validation errors
end
```
"""
def create(changeset) do
plan_code = Keyword.fetch!(changeset, :plan_code)
Resource.create(%AddOn{}, changeset, path(plan_code))
end
@doc """
Generates the path to an addon given the plan code.
## Parameters
- `plan_code` String plan code
"""
def path(plan_code) do
EEx.eval_string(@endpoint, plan_code: plan_code)
end
end
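The `@endpoint` template above is interpolated with `EEx.eval_string/2`; for example:

```elixir
# What path("gold") evaluates to, given the @endpoint template above:
path = EEx.eval_string("/plans/<%= plan_code %>/add_ons", plan_code: "gold")
# => "/plans/gold/add_ons"
```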
defmodule Cldr.Interval.Backend do
@moduledoc false
def define_interval_module(config) do
backend = config.backend
config = Macro.escape(config)
quote location: :keep, bind_quoted: [config: config, backend: backend] do
defmodule Interval do
@moduledoc """
Interval formats allow for software to format intervals like "Jan 10-12, 2008" as a
shorter and more natural format than "Jan 10, 2008 - Jan 12, 2008". They are designed
to take a start and end date, time or datetime plus a formatting pattern
and use that information to produce a localized format.
The interval functions in this library will determine the calendar
field with the greatest difference between the two datetimes before using the
format pattern.
For example, the greatest difference in "Jan 10-12, 2008" is the day field, while
the greatest difference in "Jan 10 - Feb 12, 2008" is the month field. This is used to
pick the exact pattern to be used.
See `Cldr.Interval` for further detail.
"""
if Cldr.Code.ensure_compiled?(CalendarInterval) do
@doc false
def to_string(%CalendarInterval{} = interval) do
Cldr.Interval.to_string(interval, unquote(backend), [])
end
end
@doc false
def to_string(%Elixir.Date.Range{} = interval) do
Cldr.Interval.to_string(interval, unquote(backend), [])
end
@doc """
Returns a `Date.Range` or `CalendarInterval` as
a localised string.
## Arguments
* `range` is either a `Date.Range.t` returned from `Date.range/2`
or a `CalendarInterval.t`
* `options` is a keyword list of options. The default is `[]`.
## Options
* `:format` is one of `:short`, `:medium` or `:long` or a
specific format type or a string representing of an interval
format. The default is `:medium`.
* `:style` supports different formatting styles. The valid
styles depend on whether formatting is for a date, time or datetime.
Since the functions in this module will determine
which formatter to use based upon the data passed to them,
it is recommended the style option be omitted. If a style is important
then call `to_string/3` directly on `Cldr.Date.Interval`, `Cldr.Time.Interval`
or `Cldr.DateTime.Interval`.
* For a date the alternatives are `:date`, `:month_and_day`, `:month`
and `:year_and_month`. The default is `:date`.
* For a time the alternatives are `:time`, `:zone` and
`:flex`. The default is `:time`
* For a datetime there are no style options, the default
for each of the date and time part is used
* `locale` is any valid locale name returned by `Cldr.known_locale_names/0`
or a `Cldr.LanguageTag` struct. The default is `#{backend}.get_locale/0`
* `number_system:` a number system into which the formatted date digits should
be transliterated
## Returns
* `{:ok, string}` or
* `{:error, {exception, reason}}`
## Notes
* `to_string/2` will decide which formatter to call based upon
the arguments provided to it.
* A `Date.Range.t` will call `Cldr.Date.Interval.to_string/3`
* A `CalendarInterval` will call `Cldr.Date.Interval.to_string/3`
if its `:precision` is `:year`, `:month` or `:day`. Otherwise
it will call `Cldr.Time.Interval.to_string/3`
* If `from` and `to` both conform to the `Calendar.datetime()`
type then `Cldr.DateTime.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.date()`
type then `Cldr.Date.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.time()`
type then `Cldr.Time.Interval.to_string/3` is called
* `CalendarInterval` support requires adding the
dependency [calendar_interval](https://hex.pm/packages/calendar_interval)
to the `deps` configuration in `mix.exs`.
* For more information on interval format string
see `Cldr.Interval`.
* The available predefined formats that can be applied are the
keys of the map returned by `Cldr.DateTime.Format.interval_formats("en", :gregorian)`
where `"en"` can be replaced by any configured locale name and `:gregorian`
is the underlying `CLDR` calendar type.
* In the case where `from` and `to` are equal, a single
date, time or datetime is formatted instead of an interval
## Examples
iex> use CalendarInterval
iex> #{inspect(__MODULE__)}.to_string ~I"2020-01-01/12",
...> format: :long
{:ok, "Wed, Jan 1 – Sun, Jan 12, 2020"}
iex> #{inspect(__MODULE__)}.to_string Date.range(~D[2020-01-01], ~D[2020-12-31]),
...> format: :long
{:ok, "Wed, Jan 1 – Thu, Dec 31, 2020"}
"""
@spec to_string(Cldr.Interval.range(), Keyword.t()) ::
{:ok, String.t()} | {:error, {module, String.t()}}
if Cldr.Code.ensure_compiled?(CalendarInterval) do
def to_string(%CalendarInterval{} = interval, options) do
Cldr.Interval.to_string(interval, unquote(backend), options)
end
end
def to_string(%Elixir.Date.Range{} = interval, options) do
Cldr.Interval.to_string(interval, unquote(backend), options)
end
@doc """
Returns a string representing the formatted
interval formed by two dates.
## Arguments
* `from` is any map that conforms to the
any one of the `Calendar` types.
* `to` is any map that conforms to the
any one of the `Calendar` types. `to` must
occur on or after `from`.
* `options` is a keyword list of options. The default is `[]`.
## Options
* `:format` is one of `:short`, `:medium` or `:long` or a
specific format type or a string representing of an interval
format. The default is `:medium`.
* `:style` supports different formatting styles. The valid
styles depend on whether formatting is for a date, time or datetime.
Since the functions in this module will determine
which formatter to use based upon the data passed to them,
it is recommended the style option be omitted. If styling is important
then call `to_string/3` directly on `Cldr.Date.Interval`, `Cldr.Time.Interval`
or `Cldr.DateTime.Interval`.
* For a date the alternatives are `:date`, `:month_and_day`, `:month`
and `:year_and_month`. The default is `:date`.
* For a time the alternatives are `:time`, `:zone` and
`:flex`. The default is `:time`
* For a datetime there are no style options, the default
for each of the date and time part is used
* `locale` is any valid locale name returned by `Cldr.known_locale_names/0`
or a `Cldr.LanguageTag` struct. The default is `#{backend}.get_locale/0`
* `number_system:` a number system into which the formatted date digits should
be transliterated
## Returns
* `{:ok, string}` or
* `{:error, {exception, reason}}`
## Notes
* `to_string/2` will decide which formatter to call based upon
the arguments provided to it.
* A `Date.Range.t` will call `Cldr.Date.Interval.to_string/3`
* A `CalendarInterval` will call `Cldr.Date.Interval.to_string/3`
if its `:precision` is `:year`, `:month` or `:day`. Otherwise
it will call `Cldr.Time.Interval.to_string/3`
* If `from` and `to` both conform to the `Calendar.datetime()`
type then `Cldr.DateTime.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.date()`
type then `Cldr.Date.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.time()`
type then `Cldr.Time.Interval.to_string/3` is called
* `CalendarInterval` support requires adding the
dependency [calendar_interval](https://hex.pm/packages/calendar_interval)
to the `deps` configuration in `mix.exs`.
* For more information on interval format string
see `Cldr.Interval`.
* The available predefined formats that can be applied are the
keys of the map returned by `Cldr.DateTime.Format.interval_formats("en", :gregorian)`
where `"en"` can be replaced by any configured locale name and `:gregorian`
is the underlying `CLDR` calendar type.
* In the case where `from` and `to` are equal, a single
date, time or datetime is formatted instead of an interval
## Examples
iex> #{inspect(__MODULE__)}.to_string ~D[2020-01-01], ~D[2020-12-31]
{:ok, "Jan 1 – Dec 31, 2020"}
iex> #{inspect(__MODULE__)}.to_string ~D[2020-01-01], ~D[2020-01-12]
{:ok, "Jan 1 – 12, 2020"}
iex> #{inspect(__MODULE__)}.to_string ~D[2020-01-01], ~D[2020-01-12],
...> format: :long
{:ok, "Wed, Jan 1 – Sun, Jan 12, 2020"}
iex> #{inspect(__MODULE__)}.to_string ~D[2020-01-01], ~D[2020-12-01],
...> format: :long, style: :year_and_month
{:ok, "January – December 2020"}
iex> use CalendarInterval
iex> #{inspect(__MODULE__)}.to_string ~I"2020-01-01/12",
...> format: :long
{:ok, "Wed, Jan 1 – Sun, Jan 12, 2020"}
iex> #{inspect(__MODULE__)}.to_string ~U[2020-01-01 00:00:00.0Z], ~U[2020-12-01 10:05:00.0Z],
...> format: :long
{:ok, "January 1, 2020 at 12:00:00 AM UTC – December 1, 2020 at 10:05:00 AM UTC"}
iex> #{inspect(__MODULE__)}.to_string ~U[2020-01-01 00:00:00.0Z], ~U[2020-01-01 10:05:00.0Z],
...> format: :long
{:ok, "January 1, 2020 at 12:00:00 AM UTC – 10:05:00 AM UTC"}
"""
@spec to_string(Cldr.Interval.datetime(), Cldr.Interval.datetime(), Keyword.t()) ::
{:ok, String.t()} | {:error, {module, String.t()}}
def to_string(from, to, options \\ []) do
Cldr.Interval.to_string(from, to, unquote(backend), options)
end
if Cldr.Code.ensure_compiled?(CalendarInterval) do
@doc false
def to_string!(%CalendarInterval{} = interval) do
Cldr.Interval.to_string!(interval, unquote(backend), [])
end
end
@doc false
def to_string!(%Elixir.Date.Range{} = interval) do
Cldr.Interval.to_string!(interval, unquote(backend), [])
end
@doc """
Returns a `Date.Range` or `CalendarInterval` as
a localised string or raises an exception.
## Arguments
* `range` is either a `Date.Range.t` returned from `Date.range/2`
or a `CalendarInterval.t`
* `options` is a keyword list of options. The default is `[]`.
## Options
* `:format` is one of `:short`, `:medium` or `:long` or a
specific format type or a string representing of an interval
format. The default is `:medium`.
* `:style` supports different formatting styles. The valid
styles depend on whether formatting is for a date, time or datetime.
Since the functions in this module will determine
which formatter to use based upon the data passed to them,
it is recommended the style option be omitted. If a style is important
then call `to_string/3` directly on `Cldr.Date.Interval`, `Cldr.Time.Interval`
or `Cldr.DateTime.Interval`.
* For a date the alternatives are `:date`, `:month_and_day`, `:month`
and `:year_and_month`. The default is `:date`.
* For a time the alternatives are `:time`, `:zone` and
`:flex`. The default is `:time`.
* For a datetime there are no style options, the default
for each of the date and time part is used.
* `locale` is any valid locale name returned by `Cldr.known_locale_names/0`
or a `Cldr.LanguageTag` struct. The default is `#{backend}.get_locale/0`.
* `number_system:` a number system into which the formatted date digits should
be transliterated.
## Returns
* `string` or
* raises an exception
## Notes
* `to_string/3` will decide which formatter to call based upon
the arguments provided to it.
* A `Date.Range.t` will call `Cldr.Date.Interval.to_string/3`
* A `CalendarInterval` will call `Cldr.Date.Interval.to_string/3`
if its `:precision` is `:year`, `:month` or `:day`. Otherwise
it will call `Cldr.Time.Interval.to_string/3`
* If `from` and `to` both conform to the `Calendar.datetime()`
type then `Cldr.DateTime.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.date()`
type then `Cldr.Date.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.time()`
type then `Cldr.Time.Interval.to_string/3` is called
* `CalendarInterval` support requires adding the
dependency [calendar_interval](https://hex.pm/packages/calendar_interval)
to the `deps` configuration in `mix.exs`.
* For more information on interval format string
see `Cldr.Interval`.
* The available predefined formats that can be applied are the
keys of the map returned by `Cldr.DateTime.Format.interval_formats("en", :gregorian)`
where `"en"` can be replaced by any configured locale name and `:gregorian`
is the underlying `CLDR` calendar type.
* In the case where `from` and `to` are equal, a single
date, time or datetime is formatted instead of an interval
## Examples
iex> use CalendarInterval
iex> #{inspect(__MODULE__)}.to_string! ~I"2020-01-01/12",
...> format: :long
"Wed, Jan 1 – Sun, Jan 12, 2020"
iex> #{inspect(__MODULE__)}.to_string! Date.range(~D[2020-01-01], ~D[2020-12-31]),
...> format: :long
"Wed, Jan 1 – Thu, Dec 31, 2020"
"""
@spec to_string!(Cldr.Interval.range(), Keyword.t()) ::
String.t() | no_return
if Cldr.Code.ensure_compiled?(CalendarInterval) do
def to_string!(%CalendarInterval{} = interval, options) do
Cldr.Interval.to_string!(interval, unquote(backend), options)
end
end
def to_string!(%Elixir.Date.Range{} = interval, options) do
Cldr.Interval.to_string!(interval, unquote(backend), options)
end
@doc """
Returns a string representing the formatted
interval formed by two date or raises an
exception.
## Arguments
* `from` is any map that conforms to the
any one of the `Calendar` types.
* `to` is any map that conforms to the
any one of the `Calendar` types. `to` must
occur on or after `from`.
* `options` is a keyword list of options. The default is `[]`.
## Options
* `:format` is one of `:short`, `:medium` or `:long` or a
specific format type or a string representing of an interval
format. The default is `:medium`.
* `:style` supports different formatting styles. The valid
styles depend on whether formatting is for a date, time or datetime.
Since the functions in this module will determine
which formatter to use based upon the data passed to them,
it is recommended the style option be omitted. If styling is important
then call `to_string/3` directly on `Cldr.Date.Interval`, `Cldr.Time.Interval`
or `Cldr.DateTime.Interval`.
* For a date the alternatives are `:date`, `:month_and_day`, `:month`
and `:year_and_month`. The default is `:date`.
* For a time the alternatives are `:time`, `:zone` and
`:flex`. The default is `:time`.
* For a datetime there are no style options, the default
for each of the date and time part is used.
* `locale` is any valid locale name returned by `Cldr.known_locale_names/0`
or a `Cldr.LanguageTag` struct. The default is `#{backend}.get_locale/0`.
* `number_system:` a number system into which the formatted date digits should
be transliterated.
## Returns
* `string` or
* raises an exception
## Notes
* `to_string/3` will decide which formatter to call based upon
the arguments provided to it.
* A `Date.Range.t` will call `Cldr.Date.Interval.to_string/3`
* A `CalendarInterval` will call `Cldr.Date.Interval.to_string/3`
if its `:precision` is `:year`, `:month` or `:day`. Otherwise
it will call `Cldr.Time.Interval.to_string/3`
* If `from` and `to` both conform to the `Calendar.datetime()`
type then `Cldr.DateTime.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.date()`
type then `Cldr.Date.Interval.to_string/3` is called
* Otherwise if `from` and `to` conform to the `Calendar.time()`
type then `Cldr.Time.Interval.to_string/3` is called
* `CalendarInterval` support requires adding the
dependency [calendar_interval](https://hex.pm/packages/calendar_interval)
to the `deps` configuration in `mix.exs`.
* For more information on interval format strings
see `Cldr.Interval`.
* The available predefined formats that can be applied are the
keys of the map returned by `Cldr.DateTime.Format.interval_formats("en", :gregorian)`
where `"en"` can be replaced by any configured locale name and `:gregorian`
is the underlying `CLDR` calendar type.
* In the case where `from` and `to` are equal, a single
date, time or datetime is formatted instead of an interval.
## Examples
iex> #{inspect(__MODULE__)}.to_string! ~D[2020-01-01], ~D[2020-12-31]
"Jan 1 – Dec 31, 2020"
iex> #{inspect(__MODULE__)}.to_string! ~D[2020-01-01], ~D[2020-01-12]
"Jan 1 – 12, 2020"
iex> #{inspect(__MODULE__)}.to_string! ~D[2020-01-01], ~D[2020-01-12],
...> format: :long
"Wed, Jan 1 – Sun, Jan 12, 2020"
iex> #{inspect(__MODULE__)}.to_string! ~D[2020-01-01], ~D[2020-12-01],
...> format: :long, style: :year_and_month
"January – December 2020"
iex> use CalendarInterval
iex> #{inspect(__MODULE__)}.to_string! ~I"2020-01-01/12",
...> format: :long
"Wed, Jan 1 – Sun, Jan 12, 2020"
iex> #{inspect(__MODULE__)}.to_string! ~U[2020-01-01 00:00:00.0Z], ~U[2020-12-01 10:05:00.0Z],
...> format: :long
"January 1, 2020 at 12:00:00 AM UTC – December 1, 2020 at 10:05:00 AM UTC"
iex> #{inspect(__MODULE__)}.to_string! ~U[2020-01-01 00:00:00.0Z], ~U[2020-01-01 10:05:00.0Z],
...> format: :long
"January 1, 2020 at 12:00:00 AM UTC – 10:05:00 AM UTC"
"""
@spec to_string!(Cldr.Interval.datetime(), Cldr.Interval.datetime(), Keyword.t()) ::
String.t() | no_return()
def to_string!(from, to, options \\ []) do
Cldr.Interval.to_string!(from, to, unquote(backend), options)
end
end
end
end
end | lib/cldr/backend/interval.ex | 0.938287 | 0.674164 | interval.ex | starcoder |
defmodule Blockchain.TransactionIO do
@moduledoc """
Implementation of blockchain transaction input/output (IO)
"""
alias Blockchain.Hash
alias Blockchain.Wallet
@enforce_keys [:transaction_hash, :value, :owner, :timestamp]
defstruct @enforce_keys
@typedoc """
Represents transaction IO
"""
@type t :: %__MODULE__{
transaction_hash: Hash.t(),
value: number(),
owner: Wallet.t(),
timestamp: DateTime.t()
}
@doc """
Create a new transaction IO
"""
@spec new(number(), Wallet.t()) :: __MODULE__.t()
def new(value, %Wallet{} = owner) do
timestamp = DateTime.utc_now()
%__MODULE__{
transaction_hash: calculate_transaction_io_hash(value, owner, timestamp),
value: value,
owner: owner,
timestamp: timestamp
}
end
@doc """
Calculates a transaction IO hash using the SHA hashing algorithm
"""
@spec calculate_transaction_io_hash(number(), Wallet.t(), DateTime.t()) :: Hash.t()
def calculate_transaction_io_hash(value, %Wallet{} = owner, %DateTime{} = timestamp) do
# Append all data as a list of binaries or strings and then hash the list
ExCrypto.Hash.sha256!([
to_string(value),
:erlang.term_to_binary(owner),
DateTime.to_string(timestamp)
])
|> Hash.new()
end
@doc """
Calculates a transaction IO hash using the SHA hashing algorithm
"""
@spec calculate_transaction_io_hash(__MODULE__.t()) :: Hash.t()
def calculate_transaction_io_hash(%__MODULE__{} = transaction_io) do
calculate_transaction_io_hash(
transaction_io.value,
transaction_io.owner,
transaction_io.timestamp
)
end
@doc """
Determines if a transaction IO is valid or not by re-calculating the transaction IO's
hash and comparing it to the transaction IO's existing hash
"""
@spec valid?(__MODULE__.t()) :: boolean()
def valid?(%__MODULE__{} = transaction_io) do
transaction_io.transaction_hash == calculate_transaction_io_hash(transaction_io)
end
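# A minimal usage sketch (hypothetical — assumes a `Blockchain.Wallet` struct
# named `wallet` has been created elsewhere):
#
#     tx_io = Blockchain.TransactionIO.new(100, wallet)
#     Blockchain.TransactionIO.valid?(tx_io)
#     #=> true
#
# Tampering with any field invalidates the stored hash:
#
#     Blockchain.TransactionIO.valid?(%{tx_io | value: 999})
#     #=> false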
end | lib/blockchain/transaction_io.ex | 0.884831 | 0.462655 | transaction_io.ex | starcoder |
defmodule Logger.Backends.Console do
@moduledoc ~S"""
A logger backend that logs messages by printing them to the console.
## Options
* `:level` - the level to be logged by this backend.
Note that messages are filtered by the general
`:level` configuration for the `:logger` application first.
* `:format` - the format message used to print logs.
Defaults to: `"\n$time $metadata[$level] $message\n"`.
It may also be a `{module, function}` tuple that is invoked
with the log level, the message, the current timestamp and
the metadata.
* `:metadata` - the metadata to be printed by `$metadata`.
Defaults to an empty list (no metadata).
Setting `:metadata` to `:all` prints all metadata. See
the "Metadata" section for more information.
* `:colors` - a keyword list of coloring options.
* `:device` - the device to log error messages to. Defaults to
`:user` but can be changed to something else such as `:standard_error`.
* `:max_buffer` - maximum events to buffer while waiting
for a confirmation from the IO device (default: 32).
Once the buffer is full, the backend will block until
a confirmation is received.
The supported keys in the `:colors` keyword list are:
* `:enabled` - boolean value that allows for switching the
coloring on and off. Defaults to: `IO.ANSI.enabled?/0`
* `:debug` - color for debug messages. Defaults to: `:cyan`
* `:info` - color for info and notice messages. Defaults to: `:normal`
* `:warn` - color for warning messages. Defaults to: `:yellow`
* `:error` - color for error and higher messages. Defaults to: `:red`
See the `IO.ANSI` module for a list of colors and attributes.
Here is an example of how to configure the `:console` backend in a
`config/config.exs` file:
config :logger, :console,
format: "\n$time $metadata[$level] $message\n",
metadata: [:user_id]
## Custom formatting
The console backend allows you to customize the format of your
log messages with the `:format` option.
You may set `:format` to either a string or a `{module, function}`
tuple if you wish to provide your own format function. Here is an
example of how to configure the `:console` backend in a
`config/config.exs` file:
config :logger, :console,
format: {MyConsoleLogger, :format}
And here is an example of how you can define `MyConsoleLogger.format/4`
from the above configuration:
defmodule MyConsoleLogger do
def format(level, message, timestamp, metadata) do
# Custom formatting logic...
end
end
It is extremely important that **the formatting function does
not fail**, as it will bring that particular logger instance down,
causing your system to temporarily lose messages. If necessary,
wrap the function in a `rescue` and log a default message instead:
defmodule MyConsoleLogger do
def format(level, message, timestamp, metadata) do
# Custom formatting logic...
rescue
_ -> "could not format: #{inspect({level, message, metadata})}"
end
end
The `{module, function}` will be invoked with four arguments:
* the log level: an atom
* the message: this is usually chardata, but in some cases it
may contain invalid data. Since the formatting function should
*never* fail, you need to prepare for the message being anything
* the current timestamp: a term of type `t:Logger.Formatter.time/0`
* the metadata: a keyword list
You can read more about formatting in `Logger.Formatter`, especially
if you want to support custom formatting in a custom backend.
"""
@behaviour :gen_event
defstruct buffer: [],
buffer_size: 0,
colors: nil,
device: nil,
format: nil,
level: nil,
max_buffer: nil,
metadata: nil,
output: nil,
ref: nil
@impl true
def init(:console) do
config = Application.get_env(:logger, :console)
device = Keyword.get(config, :device, :user)
if Process.whereis(device) do
{:ok, init(config, %__MODULE__{})}
else
{:error, :ignore}
end
end
def init({__MODULE__, opts}) when is_list(opts) do
config = configure_merge(Application.get_env(:logger, :console), opts)
{:ok, init(config, %__MODULE__{})}
end
@impl true
def handle_call({:configure, options}, state) do
{:ok, :ok, configure(options, state)}
end
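# The {:configure, options} call above is what backs runtime reconfiguration
# of this backend, e.g. (hypothetical values):
#
#     Logger.configure_backend(:console, level: :warning, metadata: [:request_id])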
@impl true
def handle_event({level, _gl, {Logger, msg, ts, md}}, state) do
%{level: log_level, ref: ref, buffer_size: buffer_size, max_buffer: max_buffer} = state
{:erl_level, level} = List.keyfind(md, :erl_level, 0, {:erl_level, level})
cond do
not meet_level?(level, log_level) ->
{:ok, state}
is_nil(ref) ->
{:ok, log_event(level, msg, ts, md, state)}
buffer_size < max_buffer ->
{:ok, buffer_event(level, msg, ts, md, state)}
buffer_size === max_buffer ->
state = buffer_event(level, msg, ts, md, state)
{:ok, await_io(state)}
end
end
def handle_event(:flush, state) do
{:ok, flush(state)}
end
def handle_event(_, state) do
{:ok, state}
end
@impl true
def handle_info({:io_reply, ref, msg}, %{ref: ref} = state) do
{:ok, handle_io_reply(msg, state)}
end
def handle_info({:DOWN, ref, _, pid, reason}, %{ref: ref}) do
raise "device #{inspect(pid)} exited: " <> Exception.format_exit(reason)
end
def handle_info(_, state) do
{:ok, state}
end
@impl true
def code_change(_old_vsn, state, _extra) do
{:ok, state}
end
@impl true
def terminate(_reason, _state) do
:ok
end
## Helpers
defp meet_level?(_lvl, nil), do: true
defp meet_level?(lvl, min) do
Logger.compare_levels(lvl, min) != :lt
end
defp configure(options, state) do
config = configure_merge(Application.get_env(:logger, :console), options)
Application.put_env(:logger, :console, config)
init(config, state)
end
defp init(config, state) do
level = Keyword.get(config, :level)
device = Keyword.get(config, :device, :user)
format = Logger.Formatter.compile(Keyword.get(config, :format))
colors = configure_colors(config)
metadata = Keyword.get(config, :metadata, []) |> configure_metadata()
max_buffer = Keyword.get(config, :max_buffer, 32)
%{
state
| format: format,
metadata: metadata,
level: level,
colors: colors,
device: device,
max_buffer: max_buffer
}
end
defp configure_metadata(:all), do: :all
defp configure_metadata(metadata), do: Enum.reverse(metadata)
defp configure_merge(env, options) do
Keyword.merge(env, options, fn
:colors, v1, v2 -> Keyword.merge(v1, v2)
_, _v1, v2 -> v2
end)
end
defp configure_colors(config) do
colors = Keyword.get(config, :colors, [])
%{
emergency: Keyword.get(colors, :error, :red),
alert: Keyword.get(colors, :error, :red),
critical: Keyword.get(colors, :error, :red),
error: Keyword.get(colors, :error, :red),
warning: Keyword.get(colors, :warn, :yellow),
notice: Keyword.get(colors, :info, :normal),
info: Keyword.get(colors, :info, :normal),
debug: Keyword.get(colors, :debug, :cyan),
enabled: Keyword.get(colors, :enabled, IO.ANSI.enabled?())
}
end
defp log_event(level, msg, ts, md, %{device: device} = state) do
output = format_event(level, msg, ts, md, state)
%{state | ref: async_io(device, output), output: output}
end
defp buffer_event(level, msg, ts, md, state) do
%{buffer: buffer, buffer_size: buffer_size} = state
buffer = [buffer | format_event(level, msg, ts, md, state)]
%{state | buffer: buffer, buffer_size: buffer_size + 1}
end
defp async_io(name, output) when is_atom(name) do
case Process.whereis(name) do
device when is_pid(device) ->
async_io(device, output)
nil ->
raise "no device registered with the name #{inspect(name)}"
end
end
defp async_io(device, output) when is_pid(device) do
ref = Process.monitor(device)
send(device, {:io_request, self(), ref, {:put_chars, :unicode, output}})
ref
end
defp await_io(%{ref: nil} = state), do: state
defp await_io(%{ref: ref} = state) do
receive do
{:io_reply, ^ref, :ok} ->
handle_io_reply(:ok, state)
{:io_reply, ^ref, error} ->
handle_io_reply(error, state)
|> await_io()
{:DOWN, ^ref, _, pid, reason} ->
raise "device #{inspect(pid)} exited: " <> Exception.format_exit(reason)
end
end
defp format_event(level, msg, ts, md, state) do
%{format: format, metadata: keys, colors: colors} = state
format
|> Logger.Formatter.format(level, msg, ts, take_metadata(md, keys))
|> color_event(level, colors, md)
end
defp take_metadata(metadata, :all) do
metadata
end
defp take_metadata(metadata, keys) do
Enum.reduce(keys, [], fn key, acc ->
case Keyword.fetch(metadata, key) do
{:ok, val} -> [{key, val} | acc]
:error -> acc
end
end)
end
defp color_event(data, _level, %{enabled: false}, _md), do: data
defp color_event(data, level, %{enabled: true} = colors, md) do
color = md[:ansi_color] || Map.fetch!(colors, level)
[IO.ANSI.format_fragment(color, true), data | IO.ANSI.reset()]
end
defp log_buffer(%{buffer_size: 0, buffer: []} = state), do: state
defp log_buffer(state) do
%{device: device, buffer: buffer} = state
%{state | ref: async_io(device, buffer), buffer: [], buffer_size: 0, output: buffer}
end
defp handle_io_reply(:ok, %{ref: ref} = state) do
Process.demonitor(ref, [:flush])
log_buffer(%{state | ref: nil, output: nil})
end
defp handle_io_reply({:error, {:put_chars, :unicode, _} = error}, state) do
retry_log(error, state)
end
defp handle_io_reply({:error, :put_chars}, %{output: output} = state) do
retry_log({:put_chars, :unicode, output}, state)
end
defp handle_io_reply({:error, error}, _) do
raise "failure while logging console messages: " <> inspect(error)
end
defp retry_log(error, %{device: device, ref: ref, output: dirty} = state) do
Process.demonitor(ref, [:flush])
try do
:unicode.characters_to_binary(dirty)
rescue
ArgumentError ->
clean = ["failure while trying to log malformed data: ", inspect(dirty), ?\n]
%{state | ref: async_io(device, clean), output: clean}
else
{_, good, bad} ->
clean = [good | Logger.Formatter.prune(bad)]
%{state | ref: async_io(device, clean), output: clean}
_ ->
# A well behaved IO device should not error on good data
raise "failure while logging console messages: " <> inspect(error)
end
end
defp flush(%{ref: nil} = state), do: state
defp flush(state) do
state
|> await_io()
|> flush()
end
end | lib/logger/lib/logger/backends/console.ex | 0.819569 | 0.61086 | console.ex | starcoder |
defmodule Stache do
@moduledoc """
Mustache templates for Elixir.
`Stache` is a templating engine for compiling mustache templates into native Elixir
functions. It fully supports the features of the Mustache spec, allowing you to
easily use the logic-less mustache templates you know and love.
The API mirrors that of `EEx`.
See the [mustache spec](https://mustache.github.io/mustache.5.html) for information
about the mustache templating system itself.
"""
@doc """
Compiles and renders the template `string` with `context`.
"""
def eval_string(string, context, partials \\ %{}) do
string
|> Stache.Compiler.compile!
|> Code.eval_quoted([stache_assigns: [context], stache_partials: partials])
|> elem(0)
|> to_string
end
@doc """
Compiles and renders the template `filename` with `context`.
"""
def eval_file(filename, context, partials \\ %{}) do
File.read!(filename) |> eval_string(context, partials)
end
@doc """
Compiles `template` and defines an elixir function from it.
`kind` can be `:def` or `:defp`.
This defines a 2-arity function that takes both the context to render
along with the set of partials, if any. Both must be a `Map`.
## Examples
# templates.ex
defmodule Templates do
require Stache
def foo, do: 1
Stache.function_from_string(:def, :hello_world, "{{hello}}, world!")
end
# iex
Templates.hello_world %{hello: "Hello"} #=> "Hello, world!"
"""
defmacro function_from_string(kind, name, template) do
quote bind_quoted: binding() do
compiled = Stache.Compiler.compile!(template)
case kind do
:def ->
def(unquote(name)(context, stache_partials \\ %{})) do
var!(stache_assigns) = [context]
unquote(compiled)
end
:defp ->
defp(unquote(name)(context, stache_partials \\ %{})) do
var!(stache_assigns) = [context]
unquote(compiled)
end
end
end
end
@doc """
Compiles `file` and defines an elixir function from it.
`kind` can be `:def` or `:defp`.
This defines a 2-arity function that takes both the context to render
along with the set of partials, if any. Both must be a `Map`.
## Examples
# hello.stache
{{hello}}, world!
# templates.ex
defmodule Templates do
require Stache
def foo, do: 1
Stache.function_from_file(:def, :hello_world, "hello.stache")
end
# iex
Templates.hello_world %{hello: "Hello"} #=> "Hello, world!"
"""
defmacro function_from_file(kind, name, file) do
template = File.read!(file)
quote bind_quoted: [kind: kind, name: name, template: template] do
Stache.function_from_string(kind, name, template)
end
end
end | lib/stache.ex | 0.855926 | 0.454109 | stache.ex | starcoder |
defmodule Elixlsx do
@moduledoc ~S"""
Elixlsx is a writer for the MS Excel OpenXML format (.xlsx).
# Quick Overview
The `write_to` function takes a `Elixlsx.Workbook` object
and a filename. A Workbook is a collection of `Elixlsx.Sheet`s with
(currently only) a *creation date*.
See the example.exs file for usage instructions.
# Hacking / Technical overview
XLSX stores potentially repeating values in databases, most
notably sharedStrings.xml and styles.xml. In these databases,
each element is assigned a unique ID which is then referenced
later. IDs are consecutive and correspond to the (0-indexed)
position in the database (except for number/date formats,
where the ID is explicitly given in the attribute and needs to
be at least 164).
The sharedStrings database is built up using the
`Elixlsx.Compiler.StringDB` module. Pre-compilation, all cells
are *folded* over, producing the StringDB struct which assigns
each string a unique ID. The StringDB is part of the
`Elixlsx.Compiler.WorkbookCompInfo` struct, which is passed to
the XML generating function, which then `get_id`'s the ID
associated with the string found in the cell.
For styles.xml, the procedure is in general the same, but slightly
more complicated since elements can reference other elements in
the same file. The `Elixlsx.Style.CellStyle` element is the
combination of sub-styles (`Elixlsx.Style.Font`, `Elixlsx.Style.NumFmt`,
...). A call to register_all creates the (unique) entries in the
sub-style databases (`Elixlsx.Compiler.FontDB`, `Elixlsx.Compiler.NumFmtDB`).
Afterwards, each unique combination of substyles is assigned an ID
in `Elixlsx.Compiler.CellStyleDB`. During XML generation, the <xf>
elements reference the individual sub-style IDs, and the actual cell
element references the <xf> id.
"""
@doc ~S"""
Write a Workbook object to the given filename
"""
@spec write_to(Elixlsx.Workbook.t, String.t) :: {:ok, String.t} | {:error, any()}
def write_to(workbook, filename) do
wci = Elixlsx.Compiler.make_workbook_comp_info workbook
:zip.create(to_charlist(filename), Elixlsx.Writer.create_files(workbook, wci))
end
@doc ~S"""
Write a Workbook object to the binary
Returns a tuple containing a filename and the binary
"""
@spec write_to_memory(Elixlsx.Workbook.t, String.t) :: {:ok, {String.t, binary}} | {:error, any()}
def write_to_memory(workbook, filename) do
wci = Elixlsx.Compiler.make_workbook_comp_info workbook
:zip.create(to_charlist(filename), Elixlsx.Writer.create_files(workbook, wci), [:memory])
end
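# A minimal usage sketch (hypothetical sheet/workbook setup — cell contents
# and filename are made up for illustration):
#
#     alias Elixlsx.{Workbook, Sheet}
#     sheet = Sheet.with_name("Report") |> Sheet.set_cell("A1", "Total", bold: true)
#     Elixlsx.write_to(%Workbook{sheets: [sheet]}, "report.xlsx")
#
#     # Or keep the file in memory, e.g. to send it over HTTP:
#     {:ok, {_name, binary}} = Elixlsx.write_to_memory(%Workbook{sheets: [sheet]}, "report.xlsx")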
end | lib/elixlsx.ex | 0.823328 | 0.582254 | elixlsx.ex | starcoder |
defmodule KantanCluster.NodeConnector do
@moduledoc false
# When a server is unneeded, we want to stop it immediately.
use GenServer, restart: :transient
require Logger
@polling_interval_ms :timer.seconds(5)
## API
@doc """
Connects to a specified node and start monitoring it.
"""
@spec start_link(node) :: GenServer.on_start()
def start_link(connect_to) when is_atom(connect_to) do
if Node.self() == :nonode@nohost do
:ignore
else
case whereis(connect_to) do
nil -> GenServer.start_link(__MODULE__, connect_to, name: via(connect_to))
pid -> {:ok, pid}
end
end
end
@doc """
Disconnects from a specified node and stops monitoring it.
"""
@spec disconnect(node) :: :ok
def disconnect(node_name) when is_atom(node_name) do
Node.disconnect(node_name)
GenServer.stop(whereis(node_name), :normal)
end
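# Hypothetical usage (node names are made up; start_link/1 returns :ignore
# when the local node is unnamed, i.e. :nonode@nohost):
#
#     {:ok, _pid} = KantanCluster.NodeConnector.start_link(:"core@host")
#     KantanCluster.NodeConnector.disconnect(:"core@host")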
@spec whereis(node) :: nil | pid
def whereis(node_name) when is_atom(node_name) do
KantanCluster.ProcessRegistry.whereis(node_name)
end
defp via(connect_to) when is_atom(connect_to) do
KantanCluster.ProcessRegistry.via(connect_to)
end
## Callback
@impl GenServer
def init(connect_to) do
connect_node(connect_to)
send(self(), :tick)
{:ok, %{connect_to: connect_to}}
end
@impl GenServer
def handle_info(:tick, state) do
if Node.self() == :nonode@nohost do
# If node is stopped, there is no need for monitoring.
{:stop, :normal, state}
else
# Node.monitor/2 occasionally fails to deliver a :nodedown message. Pinging
# periodically is more reliable for monitoring connected nodes.
Process.send_after(self(), :tick, @polling_interval_ms)
if :pang == Node.ping(state.connect_to) do
connect_node(state.connect_to)
end
{:noreply, state}
end
end
@spec connect_node(node) :: boolean
defp connect_node(connect_to) do
if connected = Node.connect(connect_to) do
Logger.info("connected from #{Node.self()} to #{connect_to}")
else
Logger.warning("could not connect from #{Node.self()} to #{connect_to}")
end
connected
end
end | lib/kantan_cluster/node_connector.ex | 0.697094 | 0.441974 | node_connector.ex | starcoder |
defmodule Andy.Profiles.Rover.GMDefs.AvoidingObstacle do
@moduledoc "The GM definition for :avoiding_obstacle"
alias Andy.GM.{GenerativeModelDef, Conjecture, Prediction}
import Andy.GM.Utils
def gm_def() do
%GenerativeModelDef{
name: :avoiding_obstacle,
conjectures: [
conjecture(:obstacle_not_hit),
conjecture(:obstacle_avoided)
],
contradictions: [],
priors: %{
obstacle_not_hit: %{about: :self, values: %{is: true}},
obstacle_avoided: %{about: :self, values: %{is: true}}
},
intentions: movement_intentions()
}
end
# Conjectures
# Self-activated goal
defp conjecture(:obstacle_not_hit) do
%Conjecture{
name: :obstacle_not_hit,
self_activated: true,
activator: goal_activator(fn %{is: not_hit?} -> not_hit? end, :self),
predictors: [
no_change_predictor(:distance_to_obstacle, default: %{is: :unknown})
],
valuator: obstacle_not_hit_belief_valuator(),
intention_domain: movement_domain()
}
end
# Self-activated goal
defp conjecture(:obstacle_avoided) do
%Conjecture{
name: :obstacle_avoided,
self_activated: true,
activator: goal_activator(fn %{is: avoided?} -> avoided? end, :self),
predictors: [
distance_to_obstacle_predictor()
],
valuator: obstacle_avoided_belief_valuator(),
intention_domain: movement_domain()
}
end
# Conjecture predictors
def distance_to_obstacle_predictor() do
fn conjecture_activation, rounds ->
about = conjecture_activation.about
expectation = expected_numerical_value(rounds, :distance_to_obstacle, about, :is)
%Prediction{
conjecture_name: :distance_to_obstacle,
about: about,
expectations: Map.new([{:is, expectation}])
}
end
end
# Conjecture belief valuators
defp obstacle_not_hit_belief_valuator() do
fn conjecture_activation, [round | _previous_rounds] ->
about = conjecture_activation.about
distance_to_obstacle =
current_perceived_value(round, about, :distance_to_obstacle, :is, default: :unknown)
touched? =
distance_to_obstacle != :unknown and
less_than?(distance_to_obstacle, 10)
%{is: not touched?}
end
end
defp obstacle_avoided_belief_valuator() do
fn conjecture_activation, [round | _previous_rounds] ->
about = conjecture_activation.about
approaching_obstacle? =
current_perceived_value(round, about, :approaching_obstacle, :is, default: false)
distance_to_obstacle =
current_perceived_value(round, about, :distance_to_obstacle, :is, default: :unknown)
%{
is:
not approaching_obstacle? and
(distance_to_obstacle == :unknown or
greater_than?(distance_to_obstacle, 20))
}
end
end
end | lib/andy/profiles/rover/gm_defs/avoiding_obstacle.ex | 0.712732 | 0.455986 | avoiding_obstacle.ex | starcoder |
defmodule AdventOfCode.Day09 do
@spec problem1 :: number
def problem1 do
{width, height} = _shape = Nx.shape(input())
padded = Nx.pad(input(), 99, [{0, 0, 0}, {1, 1, 0}])
shifted = Nx.slice_axis(padded, 0, height, :x)
x1 = Nx.less(input(), shifted)
shifted = Nx.slice_axis(padded, 2, height, :x)
x2 = Nx.less(input(), shifted)
x = Nx.logical_and(x1, x2)
padded = Nx.pad(input(), 99, [{1, 1, 0}, {0, 0, 0}])
shifted = Nx.slice_axis(padded, 0, width, :y)
y1 = Nx.less(input(), shifted)
shifted = Nx.slice_axis(padded, 2, width, :y)
y2 = Nx.less(input(), shifted)
y = Nx.logical_and(y1, y2)
minimas = Nx.logical_and(x, y)
input()
|> Nx.multiply(minimas)
|> Nx.sum()
|> Nx.to_number()
end
@spec problem2 :: any
def problem2 do
{width, _height} = shape = Nx.shape(input())
input()
|> Nx.equal(10)
|> Nx.logical_not()
|> Nx.select(Nx.iota(shape), 9999)
|> Nx.to_flat_list()
|> Enum.reject(&(&1 == 9999))
|> Enum.map(fn point -> {div(point, width), rem(point, width)} end)
|> Enum.reduce([], fn {y, x} = point, basins ->
basin_left = Enum.find_index(basins, &({y, x - 1} in &1))
basin_up = Enum.find_index(basins, &({y - 1, x} in &1))
case {basin_left, basin_up} do
{nil, nil} ->
[MapSet.new([point]) | basins]
{idx, nil} ->
List.update_at(basins, idx, &MapSet.put(&1, point))
{nil, idx} ->
List.update_at(basins, idx, &MapSet.put(&1, point))
{idx, idx} ->
List.update_at(basins, idx, &MapSet.put(&1, point))
{idx1, idx2} ->
{old, basins} = List.pop_at(basins, max(idx1, idx2))
List.update_at(basins, min(idx1, idx2), &(&1 |> MapSet.union(old) |> MapSet.put(point)))
end
end)
|> Enum.map(&MapSet.size/1)
|> Enum.sort(:desc)
|> Enum.take(3)
|> Enum.reduce(&*/2)
end
defp input do
input =
File.read!("data/09/input")
|> String.split("\n", trim: true)
|> Enum.map(&String.to_charlist(String.trim(&1)))
|> Nx.tensor(names: [:y, :x])
|> Nx.subtract(?0)
|> Nx.add(1)
input
end
end | 2021/lib/aoc09.ex | 0.689515 | 0.684554 | aoc09.ex | starcoder |
defmodule Combinatorics do
# === product ===
@doc ~S"""
Cartesian Product of 2 Enumerables.
(At least 2nd Enumerable should be finite)
## Examples
iex> Combinatorics.product([1, 2, 3], 1..3) |> Enum.to_list
[{1, 1}, {1, 2}, {1, 3}, {2, 1}, {2, 2}, {2, 3}, {3, 1}, {3, 2}, {3, 3}]
iex> Stream.iterate(1, &(&1+1)) |> Combinatorics.product(1..3) |> Enum.take(10)
[{1, 1}, {1, 2}, {1, 3}, {2, 1}, {2, 2}, {2, 3}, {3, 1}, {3, 2}, {3, 3}, {4, 1}]
"""
def product(enum1, enum2) do
product([enum1, enum2])
end
@doc ~S"""
Cartesian Product of multi Enumerables.
(At least last Enumerable should be finite)
## Examples
iex> Combinatorics.product([1..2, 3..4, 5..6]) |> Enum.to_list
[{1, 3, 5}, {1, 3, 6}, {1, 4, 5}, {1, 4, 6}, {2, 3, 5}, {2, 3, 6}, {2, 4, 5}, {2, 4, 6}]
iex> Combinatorics.product([Stream.iterate(1, &(&1+1)), 1..3]) |> Enum.take(10)
[{1, 1}, {1, 2}, {1, 3}, {2, 1}, {2, 2}, {2, 3}, {3, 1}, {3, 2}, {3, 3}, {4, 1}]
"""
def product([]), do: []
def product([it|[]]), do: Stream.map(it, &{&1})
def product(its) when is_list(its) do
&do_product({its, [], its, [], []}, &1, &2)
end
defp do_product(_, {:halt, term}, _fun), do: {:halted, term}
defp do_product(v, {:suspend, term}, fun) do
{:suspended, term, &do_product(v, &1, fun)}
end
defp do_product({[], [x|ys], [], [z|ws], vals = [_|vs]}, {:cont, term}, fun) do
do_product({[x], ys, [z], ws, vs}, fun.(List.to_tuple(:lists.reverse(vals)), term), fun)
end
defp do_product({[y|xs], [], [z|zs], [], []}, acc = {:cont, term}, fun) do
case next(z) do
{:next, v, w} -> do_product({xs, [y], zs, [w], [v]}, acc, fun)
_ -> {:done, term}
end
end
defp do_product({xss = [x|xs], yss = [y|ys], [z|zs], wss = [w|ws], vals = [_|vs]}, acc = {:cont, _}, fun) do
case next(z) do
{:next, v, next_w} -> do_product({xs, [x|yss], zs, [next_w|wss], [v|vals]}, acc, fun)
_ -> do_product({[y|xss], ys, [w|[x|zs]], ws, vs}, acc, fun)
end
end
# === combinations ===
@doc ~S"""
Combinations - n-length tuples, in sorted order, no repeated elements.
## Examples
iex> Combinatorics.combinations(1..4, 2) |> Enum.to_list
[{1, 2}, {1, 3}, {1, 4}, {2, 3}, {2, 4}, {3, 4}]
iex> Combinatorics.combinations(1..4, 3) |> Enum.to_list
[{1, 2, 3}, {1, 2, 4}, {1, 3, 4}, {2, 3, 4}]
"""
def combinations(_enum, 0), do: []
def combinations(enum, 1), do: Stream.map(enum, &{&1})
def combinations(enum, n) when is_integer(n) and n > 1 do
case next(enum) do
{:next, v, fun} -> &do_combinations({[fun], [v], :next, n - 1}, &1, &2)
_ -> []
end
end
defp do_combinations(_, {:halt, term}, _fun), do: {:halted, term}
defp do_combinations(v, {:suspend, term}, fun) do
{:suspended, term, &do_combinations(v, &1, fun)}
end
defp do_combinations({[], _, _, _}, {:cont, term}, _), do: {:done, term}
defp do_combinations({fs, vals, _, 0}, {:cont, term}, fun) do
do_combinations({fs, vals, :back, 1}, fun.(List.to_tuple(:lists.reverse(vals)), term), fun)
end
defp do_combinations({funs = [f|fs], vals = [_|vs], :next, n}, acc = {:cont, _}, fun) do
case next(f) do
{:next, v, next_f} -> do_combinations({[next_f|funs], [v|vals], :next, n - 1}, acc, fun)
_ -> do_combinations({fs, vs, :back, n + 2}, acc, fun)
end
end
defp do_combinations({[f|fs], [_|vs], :back, n}, acc = {:cont, _}, fun) do
case next(f) do
{:next, v, next_f} -> do_combinations({[next_f|fs], [v|vs], :next, n - 1}, acc, fun)
_ -> do_combinations({fs, vs, :back, n + 1}, acc, fun)
end
end
# === permutations ===
@doc ~S"""
Permutations - full-length tuples, all possible orderings, no repeated elements.
Notice: parameter `enum` can be a List or a Range.
## Examples
iex> Combinatorics.permutations([1, 2, 3]) |> Enum.to_list
[{1, 2, 3}, {1, 3, 2}, {2, 1, 3}, {2, 3, 1}, {3, 1, 2}, {3, 2, 1}]
iex> Combinatorics.permutations(2..4) |> Enum.to_list
[{2, 3, 4}, {2, 4, 3}, {3, 2, 4}, {3, 4, 2}, {4, 2, 3}, {4, 3, 2}]
"""
def permutations(enum) when is_list(enum) do
permutations(enum, length(enum))
end
def permutations(enum = %Range{}) do
permutations(enum, Enum.count(enum))
end
@doc ~S"""
Permutations - n-length tuples, all possible orderings, no repeated elements.
## Examples
iex> Combinatorics.permutations(1..4, 2) |> Enum.to_list
[{1, 2}, {1, 3}, {1, 4}, {2, 1}, {2, 3}, {2, 4}, {3, 1}, {3, 2}, {3, 4}, {4, 1}, {4, 2}, {4, 3}]
iex> Combinatorics.permutations(1..3, 3) |> Enum.to_list
[{1, 2, 3}, {1, 3, 2}, {2, 1, 3}, {2, 3, 1}, {3, 1, 2}, {3, 2, 1}]
"""
def permutations(_enum, 0), do: []
def permutations(enum, 1), do: Stream.map(enum, &{&1})
def permutations(enum, n) when is_integer(n) and n > 1 do
case next(enum) do
{:next, v, rest} -> &do_permutations({[{v, [], rest}], [v], :next, n - 1}, &1, &2)
_ -> []
end
end
defp do_permutations(_, {:halt, term}, _fun), do: {:halted, term}
defp do_permutations(v, {:suspend, term}, fun) do
{:suspended, term, &do_permutations(v, &1, fun)}
end
defp do_permutations({[], _, _, _}, {:cont, term}, _), do: {:done, term}
defp do_permutations({fs, vals, _, 0}, {:cont, term}, fun) do
do_permutations({fs, vals, :back, 1}, fun.(List.to_tuple(:lists.reverse(vals)), term), fun)
end
defp do_permutations({(fs=[{_, r, s}|_]), vals, :next, n}, acc = {:cont, term}, fun) do
case next({r, s}) do
{:next, v, rest} -> do_permutations({[{v, [], rest}|fs], [v|vals], :next, n - 1}, acc, fun)
_ -> {:done, term}
end
end
defp do_permutations({[{o, r, s}|fs], [_|vs], :back, n}, acc = {:cont, _}, fun) do
case next(s) do
{:next, v, rest} -> do_permutations({[{v, {r, [o]}, rest}|fs], [v|vs], :next, n - 1}, acc, fun)
_ -> do_permutations({fs, vs, :back, n + 1}, acc, fun)
end
end
# === Common Private Functions ===
defp reducer(v, _), do: {:suspend, v}
defp next([]), do: :done
defp next([x|xs]), do: {:next, x, xs}
defp next(fun) when is_function(fun, 1) do
case fun.({:cont, nil}) do
{:suspended, v, next_fun} -> {:next, v, next_fun}
_ -> :done
end
end
defp next({a, b}) do
case next(a) do
{:next, v, as} -> {:next, v, {as, b}}
_ -> next(b)
end
end
defp next(it) do
case Enumerable.reduce(it, {:cont, nil}, &reducer/2) do
{:suspended, v, next_fun} -> {:next, v, next_fun}
_ -> :done
end
end
end | lib/combinatorics.ex | 0.807081 | 0.704583 | combinatorics.ex | starcoder |
defmodule Gnat.ConsumerSupervisor do
use GenServer
require Logger
@moduledoc """
A process that can supervise consumers for you (EXPERIMENTAL)
> Note: This module is experimental and may be removed in the 1.0 release depending on what we find as we experiment with other forms of highly available connections.
If you want to subscribe to a few topics and have that subscription last across restarts for you, then this worker can be of help. It also spawns a supervised `Task` for each message it receives. This way errors in message processing don't crash the consumers, but you will still get SASL reports that you can send to services like honeybadger.
To use this just add an entry to your supervision tree like this:
```
consumer_supervisor_settings = %{
connection_name: :name_of_supervised_connection,
consuming_function: {MyApp.RpcServer, :handle_request},
subscription_topics: [
%{topic: "rpc.MyApp.search", queue_group: "rpc.MyApp.search"},
%{topic: "rpc.MyApp.create", queue_group: "rpc.MyApp.create"},
],
}
worker(Gnat.ConsumerSupervisor, [consumer_supervisor_settings, [name: :rpc_consumer]], shutdown: 30_000)
```
The second argument is a keyword list that gets used as the GenServer options, so you can pass a name that you want to register for the consumer process if you like. The `consuming_function` specifies which module and function to call when messages arrive. The function will be called with a single argument which is a gnat message, just like you get when you call `Gnat.sub` directly.
You can have a single consumer that subscribes to multiple topics or multiple consumers that subscribe to different topics and call different consuming functions. It is recommended that your `ConsumerSupervisor`s are present later in your supervision tree than your `ConnectionSupervisor`. That way during a shutdown the `ConsumerSupervisor` can attempt a graceful shutdown of the consumer before shutting down the connection.
"""
@spec start_link(map(), keyword()) :: GenServer.on_start
def start_link(settings, options \\ []) do
GenServer.start_link(__MODULE__, settings, options)
end
@impl GenServer
def init(settings) do
Process.flag(:trap_exit, true)
{:ok, task_supervisor_pid} = Task.Supervisor.start_link()
connection_name = Map.get(settings, :connection_name)
subscription_topics = Map.get(settings, :subscription_topics)
consuming_function = Map.get(settings, :consuming_function)
send self(), :connect
state = %{
connection_name: connection_name,
connection_pid: nil,
consuming_function: consuming_function,
status: :disconnected,
subscription_topics: subscription_topics,
subscriptions: [],
task_supervisor_pid: task_supervisor_pid,
}
{:ok, state}
end
@impl GenServer
def handle_info(:connect, %{connection_name: name}=state) do
case Process.whereis(name) do
nil ->
Process.send_after(self(), :connect, 2_000)
{:noreply, state}
connection_pid ->
_ref = Process.monitor(connection_pid)
subscriptions = Enum.map(state.subscription_topics, fn(topic_and_queue_group) ->
topic = Map.fetch!(topic_and_queue_group, :topic)
queue_group = Map.get(topic_and_queue_group, :queue_group)
{:ok, subscription} = Gnat.sub(connection_pid, self(), topic, queue_group: queue_group)
subscription
end)
{:noreply, %{state | status: :connected, connection_pid: connection_pid, subscriptions: subscriptions}}
end
end
def handle_info({:DOWN, _ref, :process, connection_pid, _reason}, %{connection_pid: connection_pid}=state) do
Process.send_after(self(), :connect, 2_000)
{:noreply, %{state | status: :disconnected, connection_pid: nil, subscriptions: []}}
end
# Ignore DOWN and task result messages from the spawned tasks
def handle_info({:DOWN, _ref, :process, _task_pid, _reason}, state), do: {:noreply, state}
def handle_info({ref, _result}, state) when is_reference(ref), do: {:noreply, state}
def handle_info({:EXIT, supervisor_pid, _reason}, %{task_supervisor_pid: supervisor_pid}=state) do
{:ok, task_supervisor_pid} = Task.Supervisor.start_link()
{:noreply, Map.put(state, :task_supervisor_pid, task_supervisor_pid)}
end
def handle_info({:msg, gnat_message}, %{consuming_function: {mod, fun}}=state) do
Task.Supervisor.async_nolink(state.task_supervisor_pid, mod, fun, [gnat_message])
{:noreply, state}
end
def handle_info(other, state) do
Logger.error "#{__MODULE__} received unexpected message #{inspect other}"
{:noreply, state}
end
@impl GenServer
def terminate(:shutdown, state) do
Logger.info "#{__MODULE__} starting graceful shutdown"
Enum.each(state.subscriptions, fn(subscription) ->
:ok = Gnat.unsub(state.connection_pid, subscription)
end)
Process.sleep(500) # wait for final messages from broker
receive_final_broker_messages(state)
wait_for_empty_task_supervisor(state)
Logger.info "#{__MODULE__} finished graceful shutdown"
end
def terminate(reason, _state) do
Logger.error "#{__MODULE__} unexpected shutdown #{inspect reason}"
end
defp receive_final_broker_messages(state) do
receive do
info ->
handle_info(info, state)
receive_final_broker_messages(state)
after 0 ->
:done
end
end
defp wait_for_empty_task_supervisor(%{task_supervisor_pid: pid}=state) do
case Task.Supervisor.children(pid) do
[] -> :ok
children ->
Logger.info "#{__MODULE__}\t\t#{Enum.count(children)} tasks remaining"
Process.sleep(1_000)
wait_for_empty_task_supervisor(state)
end
end
end
# source: lib/gnat/consumer_supervisor.ex
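A handler matching the `consuming_function: {MyApp.RpcServer, :handle_request}` setting from the moduledoc might look like the sketch below. `MyApp.RpcServer` and its echo logic are illustrative, and the `:gnat`/`:reply_to`/`:body` keys are an assumption based on the message maps Gnat delivers to subscribers:

```elixir
defmodule MyApp.RpcServer do
  # Runs inside the Task spawned per message, so a crash here is isolated
  # from the consumer and the connection.
  def handle_request(%{gnat: gnat, body: body, reply_to: reply_to}) when is_binary(reply_to) do
    Gnat.pub(gnat, reply_to, handle(body))
  end

  # Plain subscription message with nothing to answer.
  def handle_request(%{body: body}), do: handle(body)

  defp handle(body), do: String.upcase(body)
end
```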
defmodule Credo.Check.Consistency.Helper do
@moduledoc """
This module contains functions that are used by several
consistency checks.
# On properties and property lists
Imagine a test that checks files for whether they use soft-tabs or hard-tabs
for indentation.
The property_values in this case might be `:spaces` and `:tabs`.
A property_value is a tuple `{value, meta}`:
`value` can be anything imaginable; `meta` should contain a filename
(optionally with a line_no, trigger, etc.) or an AST
a `property_list` is simply a list of property_values
[
{value, meta},
{value, meta},
...
]
a property_tuple is a tuple of {property_list, source_file}
So in our example a property_tuple might look like
{[{:spaces, meta}, {:tabs, meta2}], %SourceFile{}}
which would indicate that the check on that SourceFile showed that it mixes
different indentation styles within one file.
"""
alias Credo.Check.PropertyValue
alias Credo.IssueMeta
@doc """
`callback` is expected to return a tuple `{property_values, most_picked_prop_value}`.
"""
def most_picked_prop(source_files, callback) when is_list(source_files) and is_function(callback) do
properties =
source_files
|> Enum.map(callback)
|> Enum.sort
{properties, most_picked_prop_value(properties)}
end
@doc """
Returns a tuple `{most_picked_prop, picked_count, total_count}`
"""
def most_picked_prop_value(list) when is_list(list) do
all_property_values =
Enum.flat_map(list, fn({property_list, _source_file}) -> PropertyValue.get(property_list) end)
result =
all_property_values
|> Enum.map(fn(prop_val) ->
current_property_value = PropertyValue.get(prop_val)
prop_size =
all_property_values
|> Enum.filter(fn(property_value) ->
PropertyValue.get(property_value) == current_property_value
end)
|> Enum.count
case prop_size do
0 -> nil
_ -> {prop_size, current_property_value}
end
end)
|> Enum.reject(&is_nil/1)
|> Enum.sort
|> List.last
case result do
{prop_count, prop_name} -> {prop_name, prop_count, Enum.count(all_property_values)}
nil -> nil
end
end
@doc """
Runs a given set of `pattern_mods` (CodePattern modules) against a given
set of `source_files`.
Returns a tuple: {property_tuples, most_picked}
"""
def run_code_patterns(source_files, pattern_mods, params) do
most_picked_prop(source_files, &create_property_tuples(&1, pattern_mods, params))
end
@doc """
Takes all the `property_tuples` from run_code_patterns and creates issues in
all source_files that do not sport the most_picked property_value.
Calls `new_issue_fun/5` when necessary to create a new issue.
"""
def append_issues_via_issue_service({property_tuples, most_picked}, new_issue_fun, params) do
Enum.map(property_tuples, &append_issues_if_necessary(&1, most_picked, new_issue_fun, params))
end
defp append_issues_if_necessary({_prop_list, _source_file}, nil, _, _) do
nil
end
defp append_issues_if_necessary({prop_list, source_file}, most_picked, new_issue_fun, params) do
{expected_prop, picked_count, total_count} = most_picked
case prop_list |> PropertyValue.get |> Enum.uniq do
[^expected_prop] ->
nil
list ->
prop_list
|> Enum.map(fn(prop) ->
value = PropertyValue.get(prop)
if value != expected_prop && Enum.member?(list, value) do
issue_meta = IssueMeta.for(source_file, params)
new_issue_fun.(issue_meta, prop, expected_prop, picked_count, total_count)
end
end)
|> List.flatten
|> Enum.reject(&is_nil/1)
|> Enum.uniq # TODO: should we really "squash" the issues here?
|> Enum.each(fn(issue) ->
Credo.Service.SourceFileIssues.append(source_file, issue)
end)
end
end
defp create_property_tuples(source_file, pattern_mods, params) do
list = property_list_for(source_file, pattern_mods, params)
{list, source_file}
end
defp property_list_for(source_file, pattern_mods, params) do
pattern_mods
|> collect_property_values(source_file, params)
|> Enum.reject(&is_nil/1)
|> Enum.uniq
end
defp collect_property_values(pattern_mods, source_file, params) do
Enum.reduce(pattern_mods, [], fn(pattern_mod, acc) ->
result = pattern_mod.property_value_for(source_file, params)
acc ++ List.wrap(result)
end)
end
end
# source: lib/credo/check/consistency/helper.ex
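Stripped of the `PropertyValue` wrapping, the heart of `most_picked_prop_value/1` is a frequency count. The same idea in plain Elixir, using `Enum.frequencies/1` (Elixir 1.10+) instead of the manual filter-and-count:

```elixir
values = [:spaces, :tabs, :spaces, :spaces]

{most_picked, picked_count} =
  values
  |> Enum.frequencies()
  |> Enum.max_by(fn {_value, count} -> count end)

# Mirrors the {prop_name, prop_count, total} tuple returned above:
{:spaces, 3, 4} = {most_picked, picked_count, length(values)}
```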
defmodule GameApp.Round do
@moduledoc """
`GameApp.Round` defines a struct that encapsulates Round state as well as
functions that update the round state.
"""
alias __MODULE__, as: Round
alias GameApp.Player
@enforce_keys [:number]
defstruct number: nil,
prompt: nil,
winner: nil,
reactions: %{}
@type t :: %Round{
number: integer(),
prompt: String.t() | nil,
winner: Player.t() | nil,
reactions: map()
}
@doc """
Creates a round for the given round number.
## Examples
iex> Round.create(number: 1)
%Round{
number: 1,
winner: nil,
reactions: %{}
}
"""
@spec create(keyword()) :: Round.t()
def create(attrs \\ []) do
struct(Round, attrs)
end
@doc """
Sets the prompt for a round.
## Examples
iex> r = Round.create(number: 1)
iex> Round.set_prompt(r, "Wat")
%Round{
number: 1,
prompt: "Wat",
winner: nil,
reactions: %{}
}
"""
@spec set_prompt(Round.t(), String.t()) :: Round.t()
def set_prompt(round, prompt) do
Map.put(round, :prompt, prompt)
end
@doc """
Sets the reaction for a player in a round.
## Examples
iex> r = Round.create(number: 1)
iex> Round.set_reaction(r, Player.create(id: "1", name: "Gamer"), "OMG!")
%Round{
number: 1,
winner: nil,
reactions: %{
"1" => "OMG!"
}
}
"""
@spec set_reaction(Round.t(), Player.t(), String.t()) :: Round.t()
def set_reaction(%Round{reactions: reactions} = round, %Player{id: id}, reaction) do
Map.put(round, :reactions, Map.put(reactions, id, reaction))
end
@doc """
Removes the reaction for a player in a round.
## Examples
iex> r = Round.create(number: 1)
iex> p = Player.create(id: "1", name: "Gamer")
iex> r = Round.set_reaction(r, p, "OMG!")
iex> Round.remove_reaction(r, p)
%Round{
number: 1,
winner: nil,
reactions: %{}
}
"""
@spec remove_reaction(Round.t(), Player.t()) :: Round.t()
def remove_reaction(%Round{reactions: reactions} = round, %Player{id: id}) do
Map.put(round, :reactions, Map.delete(reactions, id))
end
@doc """
Sets the winner for a round.
## Examples
iex> r = Round.create(number: 1)
iex> Round.set_winner(r, Player.create(id: "1", name: "Gamer"))
%Round{
number: 1,
winner: %Player{id: "1", name: "Gamer"},
reactions: %{}
}
"""
@spec set_winner(Round.t(), Player.t() | nil) :: Round.t()
def set_winner(round, winner) do
Map.put(round, :winner, winner)
end
end
# source: apps/game/lib/game/state/round.ex
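Because every function takes and returns a `%Round{}`, round state threads naturally through a pipeline; this sketch just chains the calls shown in the doctests above:

```elixir
alias GameApp.{Player, Round}

gamer = Player.create(id: "1", name: "Gamer")

round =
  Round.create(number: 1)
  |> Round.set_prompt("Wat")
  |> Round.set_reaction(gamer, "OMG!")
  |> Round.set_winner(gamer)

# round.prompt == "Wat", round.winner == gamer, round.reactions == %{"1" => "OMG!"}
```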
defmodule Spandex.Datadog.Span do
@moduledoc """
In charge of holding the datadog span attributes, and for starting/ending
spans. This also handles serialization via `to_map/1`, and span inheritance
via `child_of/3`
"""
alias __MODULE__, as: Span
alias Spandex.Datadog.Utils
defstruct [
:id, :trace_id, :parent_id, :name, :resource,
:service, :env, :start, :completion_time, :error,
:error_message, :stacktrace, :type, :error_type,
:url, :status, :method, :user, :sql_rows, :sql_db, :sql_query,
meta: %{}
]
@type t :: %__MODULE__{}
@default "unknown"
@updateable_keys [
:name, :resource, :service, :env, :start, :completion_time, :error,
:error_message, :stacktrace, :error_type, :start, :status, :url, :method,
:user, :type
]
@doc """
Creates new struct with defaults from :spandex configuration.
"""
@spec new(map :: map) :: t
def new(map \\ %{}) do
core = %Span{
id: default_if_blank(map, :id, &Utils.next_id/0),
start: default_if_blank(map, :start, &Utils.now/0),
env: default_if_blank(map, :env, &default_env/0),
service: default_if_blank(map, :service, &default_service/0),
resource: default_if_blank(map, :resource, fn -> default_if_blank(map, :name, fn -> @default end) end),
}
core
|> Map.put(:type, default_if_blank(map, :type, fn -> default_type(core.service) end))
|> Map.merge(Map.drop(map, [:id, :start, :env, :service, :resource, :type]))
end
@doc """
Sets completion time for given span if it's missing as unix epoch in nanoseconds.
"""
@spec stop(span :: t) :: t
def stop(%Span{completion_time: nil} = span),
do: %{span | completion_time: Utils.now()}
def stop(%Span{} = span),
do: span
@doc """
Updates span with given map. Only `@updateable_keys` are allowed for updates.
"""
@spec update(span :: t, updates :: map) :: t
def update(%Span{} = span, updates) do
@updateable_keys
|> Enum.reduce(span, fn key, span ->
if Map.has_key?(updates, key) do
Map.put(span, key, updates[key])
else
span
end
end)
|> merge_meta(updates[:meta] || %{})
end
defp merge_meta(%Span{meta: meta} = span, new_meta) do
%{span | meta: Map.merge(meta, new_meta)}
end
@doc """
Creates new span based on parent span.
"""
@spec child_of(parent :: t, name :: term) :: t
def child_of(%Span{id: parent_id} = parent, name) do
%{parent | id: Utils.next_id(), start: Utils.now(), name: name, parent_id: parent_id}
end
defp duration(left, right) do
left - right
end
defp default_if_blank(map, key, fun) do
case Map.get(map, key) do
nil -> fun.()
val -> val
end
end
@doc """
Creates a final map structure suitable for datadog trace agent.
"""
@spec to_map(span :: t) :: map
def to_map(%Span{} = span) do
service = span.service || default_service()
now = Utils.now()
%{
trace_id: span.trace_id,
span_id: span.id,
name: span.name,
start: span.start || now,
duration: duration(span.completion_time || now, span.start || now),
parent_id: span.parent_id,
error: span.error || 0,
resource: span.resource || span.name || @default,
service: service,
type: span.type || default_type(service)
}
|> add_meta(span)
|> add_error_data(span)
|> add_http_data(span)
|> add_sql_data(span)
end
defp add_meta(json, %{env: env, user: user, meta: meta}) do
json
|> Map.put(:meta, %{})
|> put_in([:meta, :env], env || default_env())
|> add_if_not_nil([:meta, :user], user)
|> Map.update!(:meta, fn current_meta -> Map.merge(current_meta, meta) end)
|> filter_nils
end
defp add_http_data(json, %{url: url, status: status, method: method}) do
json
|> add_if_not_nil([:meta, "http.url"], url)
|> add_string_if_not_nil([:meta, "http.status_code"], status)
|> add_if_not_nil([:meta, "http.method"], method)
end
defp add_sql_data(json, span) do
json
|> add_if_not_nil([:meta, "sql.query"], span.sql_query)
|> add_if_not_nil([:meta, "sql.rows"], span.sql_rows)
|> add_if_not_nil([:meta, "sql.db"], span.sql_db)
end
defp add_error_data(json, %{error: 1, error_message: error_message, stacktrace: stacktrace, error_type: error_type}) do
json
|> add_if_not_nil([:meta, "error.msg"], error_message)
|> add_if_not_nil([:meta, "error.stack"], stacktrace)
|> add_if_not_nil([:meta, "error.type"], error_type)
end
defp add_error_data(json, _), do: json
defp add_if_not_nil(map, _path, nil), do: map
defp add_if_not_nil(map, path, value), do: put_in(map, path, value)
defp add_string_if_not_nil(map, _path, nil), do: map
defp add_string_if_not_nil(map, path, value), do: put_in(map, path, to_string(value))
defp filter_nils(map) when is_map(map) do
map
|> Enum.reject(fn {_key, value} -> is_nil(value) end)
|> Enum.into(%{}, fn {key, value} -> {key, filter_nils(value)} end)
end
defp filter_nils(other), do: other
defp default_service, do: Confex.get_env(:spandex, :service)
defp default_env, do: Confex.get_env(:spandex, :env)
defp default_type(service) do
:spandex
|> Confex.get_env(:datadog)
|> Keyword.get(:services, [])
|> Keyword.get(service, @default)
end
end
# source: lib/datadog/span.ex
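A sketch of the span lifecycle, assuming the `:spandex` application config provides `:service`, `:env` and a `:datadog` services keyword list (the latter is read by `default_type/1`); the trace id and names are illustrative:

```elixir
alias Spandex.Datadog.Span

parent = Span.new(%{trace_id: 123_456_789, name: "web.request", service: :web})

# Inherits trace_id and service, gets a fresh id and start, records parent_id.
child = Span.child_of(parent, "db.query")

payload =
  child
  |> Span.update(%{sql_query: "SELECT * FROM users", sql_db: "app"})
  |> Span.stop()
  |> Span.to_map()

# `payload` now has the shape the Datadog trace agent expects.
```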
defmodule GeoPotion.Vector do
alias :math, as: Math
alias GeoPotion.Distance
alias GeoPotion.Angle
alias __MODULE__
@type position_map :: %{
latitude: number,
longitude: number
}
@type t :: %__MODULE__{
azimuth: Angle.t,
distance: Distance.t
}
defstruct azimuth: Angle.new, distance: Distance.new
@moduledoc """
"""
@epsg_wgs1984 7030
@equitorial_radius_wgs1984 6378137
@polar_radius_wgs1984 6356752.3142
@flattening_wgs1984 0.0033528106718309896
@inverse_flattening_wgs1984 298.257223563
@pi 3.141592653589793
@calc_iterations 100
@spec calculate(position_map, position_map) :: t
def calculate(%{latitude: from_lat,longitude: from_lon}, %{latitude: to_lat, longitude: to_lon}) do
_vincenty_calc(from_lat, from_lon, to_lat, to_lon)
end
@spec distance_over_ground(position_map, position_map) :: Distance.t
def distance_over_ground(%{latitude: from_lat,longitude: from_lon}, %{latitude: to_lat, longitude: to_lon}) do
%{distance: result} = _vincenty_calc(from_lat, from_lon, to_lat, to_lon)
result
end
@spec bearing_to(position_map, position_map) :: Angle.t
def bearing_to(%{latitude: from_lat,longitude: from_lon}, %{latitude: to_lat, longitude: to_lon}) do
%{azimuth: result} = _vincenty_calc(from_lat, from_lon, to_lat, to_lon)
result
end
defp _distance_over_ground(lat1, lon1, lat2, lon2) do
# From: http://www.mathworks.com/matlabcentral/files/8607/vdist.m
# this implementation was ported from the DotSpatial library
# this is the approximated fast calculation rather than the more accurate slow one.
# Should come back later and implement the full accurate version as well.
# -TCM 8-16-14
{rLat1,rLon1, rLat2, rLon2} = _getRads({lat1, lon1, lat2, lon2})
dLat = abs(rLat2 - rLat1)
dLon = abs(rLon2 - rLon1)
l = (rLat1 + rLat2) * 0.5
a = @equitorial_radius_wgs1984
b = @polar_radius_wgs1984
e = Math.sqrt(1 - (b* b) / (a * a))
r1 = (a * (1 - (e * e))) / Math.pow((1 - (e * e) * (Math.sin(l) * Math.sin(l))), 3 * 0.5)
r2 = a / Math.sqrt(1 - (e * e) * (Math.sin(l) * Math.sin(l)))
ravg = (r1 * (dLat / (dLat + dLon))) + (r2 * (dLon / (dLat + dLon)))
sinlat = Math.sin(dLat * 0.5)
sinlon = Math.sin(dLon * 0.5)
a2 = Math.pow(sinlat, 2) + Math.cos(rLat1) * Math.cos(rLat2) * Math.pow(sinlon, 2)
c = 2 * Math.asin(min(1, Math.sqrt(a2)))
ravg * c |> Distance.new
end
defp _vincenty_calc(lat1, lon1, lat2, lon2) when lat1 == lat2 and lon1 == lon2 do
%__MODULE__{azimuth: Angle.new(0.0), distance: Distance.new(0.0) }
end
defp _vincenty_calc(lat1, lon1, lat2, lon2) do
radians = _getRads({lat1, lon1, lat2, lon2})
[l, u1, u2 | _rest] = prep = _itt_prep(radians)
[lamda | _] = ittResults = _itt_calc(0, l, prep, [l])
dist = _calc_distance(ittResults)
fwdAz = _calc_azimuth([lamda, u1, u2])
%__MODULE__{azimuth: fwdAz, distance: dist}
end
defp _getRads({lat1, lon1, lat2, lon2}) do
rLat1 = Angle.degrees_to_radians(lat1)
rLat2 = Angle.degrees_to_radians(lat2)
rLon1 = Angle.degrees_to_radians(lon1)
rLon2 = Angle.degrees_to_radians(lon2)
{rLat1,rLon1,rLat2,rLon2}
end
defp _itt_prep(radians) do
{lat1, lon1, lat2, lon2} = radians
u1 = Math.atan((1-@flattening_wgs1984) * Math.tan(lat1))
u2 = Math.atan((1-@flattening_wgs1984) * Math.tan(lat2))
# rLon1 = rLon1/(2*Math.pi)
# rLon2 = rLon2/(2*Math.pi)
l = abs(lon2 - lon1)
[l, u1, u2, lon1, lon2]
end
defp _itt_calc(0, _lDiff, [lamda | rest], _) when lamda > @pi do
newlamda = 2 * Math.pi - lamda
# cons onto the remaining state; `[newlamda, rest]` would nest the old state as one element
_itt_calc(0, newlamda, [newlamda | rest], [newlamda])
end
defp _itt_calc(count, lamdaDif,state, vals) when count == 0 or (lamdaDif > 0.1e-6 and count <= @calc_iterations) do
[l, u1, u2|_] = state
[lamda| _rest] = vals
a = Math.pow((Math.cos(u2) * Math.sin(lamda)), 2)
b = Math.pow((Math.cos(u1) * Math.sin(u2) - Math.sin(u1) * Math.cos(u2) * Math.cos(lamda)),2)
sinsigma = Math.sqrt(a + b)
cossigma = Math.sin(u1) * Math.sin(u2) + Math.cos(u1) * Math.cos(u2) * Math.cos(lamda)
sigma = Math.atan2(sinsigma,cossigma)
alpha = Math.cos(u1) * Math.cos(u2) * Math.sin(lamda) / Math.sin(sigma)
|> Math.asin
cos2SigmaM = Math.cos(sigma) - 2.0 * Math.sin(u1) * Math.sin(u2) / Math.pow(Math.cos(alpha), 2)
c = @flattening_wgs1984 / 16 * Math.pow(Math.cos(alpha),2) * (4 + @flattening_wgs1984 * (4 - 3 * Math.pow(Math.cos(alpha),2)))
newlamda = l + (1 - c) * @flattening_wgs1984 * Math.sin(alpha) *
(sigma + c * Math.sin(sigma) *
(cos2SigmaM + c * Math.cos(sigma) *
(-1 + 2 * Math.pow(cos2SigmaM, 2))))
# `if` cannot rebind outer variables in Elixir, so capture the clamped values
{newlamda, lamda} =
if newlamda > Math.pi do
{Math.pi, Math.pi}
else
{newlamda, lamda}
end
lamdaDiff = abs(newlamda - lamda)
_itt_calc(count + 1, lamdaDiff, state, [newlamda, alpha, sigma, cos2SigmaM])
end
defp _itt_calc(_count, _lamdaDiff, state, vals) do
[_, _, _, lon1, lon2] = state
[lamda | rest] = vals
# bind the result of `if` directly, since rebinding inside it would not leak
newlamda =
if Math.sin(lon2 - lon1) * Math.sin(lamda) < 0 do
-abs(lamda)
else
abs(lamda)
end
[newlamda | rest]
end
defp _calc_distance([_, alpha, sigma, cos2SigmaM]) do
a = @equitorial_radius_wgs1984
b = @polar_radius_wgs1984
distU2 = (alpha |> Math.cos |> Math.pow 2) * ((Math.pow(a,2) - Math.pow(b,2)) / Math.pow(b,2))
aa = 1 + distU2 / 16384 * (4096 + distU2 * (-768 + distU2 *(320 - 175 * distU2)))
bb = distU2 / 1024 * (256 + distU2 * (-128 + distU2 * (74 - 47 * distU2)))
deltaSigma = bb * Math.sin(sigma) * (cos2SigmaM + bb / 4 * (Math.cos(sigma) * (-1 + 2 *
Math.pow(cos2SigmaM, 2)) - bb / 6 * cos2SigmaM * (-3 + 4 * (sigma |> Math.sin |> Math.pow 2)) *
(-3 + 4 * Math.pow(cos2SigmaM,2))))
s = b * aa * (sigma - deltaSigma)
s |> Distance.new
end
defp _calc_azimuth([lamda, u1, u2]) do
numer = Math.cos(u2) * Math.sin(lamda)
denom = Math.cos(u1) * Math.sin(u2) - Math.sin(u1) * Math.cos(u2) * Math.cos(lamda)
a12 = Math.atan2(numer, denom)
# normalize into [0, 2*pi); bind the `if` result since rebinding inside it would not leak
a12 = if a12 < 0, do: a12 + 2 * Math.pi, else: a12
a12 |> Angle.radians_to_degrees() |> Angle.new()
end
end
# source: lib/geopotion/vector.ex
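A usage sketch based on the public API above; any map with `:latitude`/`:longitude` keys works, and the airport coordinates are illustrative:

```elixir
alias GeoPotion.Vector

sfo = %{latitude: 37.6188, longitude: -122.3758}
lax = %{latitude: 33.9425, longitude: -118.4081}

# Full result: initial bearing plus geodesic distance.
%Vector{azimuth: azimuth, distance: distance} = Vector.calculate(sfo, lax)

# Or each piece on its own:
distance = Vector.distance_over_ground(sfo, lax)
azimuth = Vector.bearing_to(sfo, lax)
```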
defmodule FastEIP55 do
@moduledoc """
Github link: https://github.com/gregors/fast_eip_55
Rust-powered Keccak version of EIP-55
Provides EIP-55 encoding and validation functions.
"""
@doc """
Encodes an Ethereum address into an EIP-55 checksummed address.
## Examples
iex> alias FastEIP55, as: EIP55
iex> EIP55.encode("0x5aaeb6053f3e94c9b9a09f33669435e7ef1beaed")
{:ok, "0x5aAeb6053F3E94C9b9A09f33669435E7Ef1BeAed"}
iex> EIP55.encode(<<90, 174, 182, 5, 63, 62, 148, 201, 185, 160, 159, 51, 102, 148, 53, 231, 239, 27, 234, 237>>)
{:ok, "0x5aAeb6053F3E94C9b9A09f33669435E7Ef1BeAed"}
iex> EIP55.encode("not an address")
{:error, :unrecognized_address_format}
"""
def encode("0x" <> address) when byte_size(address) == 40 do
address = String.downcase(address, :ascii)
hash =
address
|> ExKeccak.hash_256()
|> Base.encode16(case: :lower)
encoded = _encode(address, hash)
{:ok, encoded}
end
def encode(address) when byte_size(address) == 20 do
encode("0x" <> Base.encode16(address, case: :lower))
end
def encode(_) do
{:error, :unrecognized_address_format}
end
defp checksum(c, _) when c >= 48 and c <= 57, do: c
defp checksum(c, x) when (x >= 97 and x <= 102) or x == 56 or x == 57 do
c - 32
end
defp checksum(c, _), do: c
defp _encode(address, hash) do
address = :binary.bin_to_list(address)
hash = :binary.bin_to_list(hash)
_encode(address, hash, ['x', '0'])
|> :lists.reverse()
|> :binary.list_to_bin()
end
defp _encode([], _, acc), do: acc
defp _encode([a | address], [b | hash], acc) do
_encode(address, hash, [checksum(a, b) | acc])
end
@doc """
Determines whether the given Ethereum address has a valid EIP-55 checksum.
## Examples
iex> alias FastEIP55, as: EIP55
iex> EIP55.valid?("0x5aAeb6053F3E94C9b9A09f33669435E7Ef1BeAed")
true
iex> EIP55.valid?("0x5AAEB6053f3e94c9b9a09f33669435e7ef1beaed")
false
iex> EIP55.valid?("not an address")
false
"""
def valid?("0x" <> _ = address) when byte_size(address) == 42 do
case encode(address) do
{:ok, ^address} -> true
_ -> false
end
end
def valid?(_), do: false
end
# source: lib/fast_eip55.ex
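The `checksum/2` clauses encode the EIP-55 rule: a hex digit of the address is uppercased exactly when the matching Keccak-hash digit is `8`, `9`, or `a`–`f` (nibble >= 8), while numerals always pass through untouched. The rule in isolation, as an anonymous function:

```elixir
checksum = fn
  c, _ when c >= ?0 and c <= ?9 -> c
  c, x when x in ?a..?f or x == ?8 or x == ?9 -> c - 32
  c, _ -> c
end

?A = checksum.(?a, ?f)  # hash nibble f >= 8 -> uppercase
?a = checksum.(?a, ?0)  # hash nibble 0 < 8 -> unchanged
?7 = checksum.(?7, ?f)  # digits are never changed
```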
defmodule Prelude.Type do
defstruct [:type]
types = [
:atom,
:binary,
:bitstring,
:boolean,
:float,
:function,
:integer,
:list,
:map,
:nil,
:nonempty_list,
:number,
:pid,
:port,
:reference,
:tuple
]
for type <- types do
def unquote(type)() do
%__MODULE__{type: unquote(type)}
end
def from_test(unquote(:"is_#{type}")) do
%__MODULE__{type: unquote(type)}
end
end
defmodule Argument do
defstruct [:function, :arity, :pos]
end
defmodule Arithmetic do
defstruct [:op, :left, :right]
end
defmodule FArithmetic do
defstruct [:op, :left, :right]
end
defmodule Arity do
defstruct [:arity]
end
defmodule ExtCall do
defstruct [:module, :function, :arity, :arguments]
end
defmodule LocalCall do
defstruct [:function, :arity, :arguments]
end
defmodule BIF do
defstruct [:name, :arguments]
end
defmodule Not do
defstruct [:type]
end
defmodule And do
defstruct [:left, :right]
end
defmodule Or do
defstruct [:left, :right]
end
defmodule LT do
defstruct [:left, :right]
end
defmodule GE do
defstruct [:left, :right]
end
defmodule EQ do
defstruct [:left, :right]
end
defmodule NE do
defstruct [:left, :right]
end
defmodule EQExact do
defstruct [:left, :right]
end
defmodule NEExact do
defstruct [:left, :right]
end
defmodule Fun do
defstruct [:fun, :arity, :env]
end
defmodule Literal do
defstruct [:value]
end
defmodule Binary do
defstruct [:fields]
end
defmodule BinaryField do
defstruct [:value, :size, :type, :flags]
end
defmodule Cons do
defstruct [:head, :tail]
end
defmodule Head do
defstruct [:cons]
end
defmodule Tail do
defstruct [:cons]
end
defmodule Tuple do
defstruct [:elements]
end
defmodule TupleElement do
defstruct [:tuple, :idx]
end
defmodule Map do
defstruct [:fields]
end
defmodule MapField do
defstruct [:key, :term]
end
defmodule MapElement do
defstruct [:map, :key]
end
defmodule Message do
defstruct []
end
defmodule Exception do
defstruct [:class, :type]
end
end
# source: lib/prelude/type.ex
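The generated helpers and the nested structs compose like this (a sketch assuming the module above is compiled):

```elixir
alias Prelude.Type

%Type{type: :integer} = Type.integer()
%Type{type: :map} = Type.from_test(:is_map)

# Compound type facts are built from the nested structs:
call = %Type.ExtCall{module: :erlang, function: :abs, arity: 1, arguments: [Type.number()]}
```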
defmodule Commanded.Middleware.Pipeline do
@moduledoc """
Pipeline is a struct used as an argument in the callback functions of modules
implementing the `Commanded.Middleware` behaviour.
This struct must be returned by each function to be used in the next
middleware based on the configured middleware chain.
## Pipeline fields
- `assigns` - shared user data as a map.
- `causation_id` - an optional UUID used to identify the cause of the
command being dispatched.
- `correlation_id` - an optional UUID used to correlate related
commands/events together.
- `command` - command struct being dispatched.
- `command_uuid` - UUID assigned to the command being dispatched.
- `consistency` - requested dispatch consistency, either: `:eventual`
(default) or `:strong`.
- `halted` - flag indicating whether the pipeline was halted.
- `identity` - an atom specifying a field in the command containing the
aggregate's identity or a one-arity function that returns an identity
from the command being dispatched.
- `identity_prefix` - an optional prefix to the aggregate's identity. It may
be a string (e.g. "prefix-") or a zero arity function
(e.g. `&MyRouter.identity_prefix/0`).
- `metadata` - the metadata map to be persisted along with the events.
- `response` - sets the response to send back to the caller.
"""
defstruct assigns: %{},
causation_id: nil,
correlation_id: nil,
command: nil,
command_uuid: nil,
consistency: nil,
halted: false,
identity: nil,
identity_prefix: nil,
metadata: nil,
response: nil
alias Commanded.Middleware.Pipeline
@doc """
Puts the `key` with value equal to `value` into `assigns` map.
"""
def assign(%Pipeline{} = pipeline, key, value) when is_atom(key) do
%Pipeline{assigns: assigns} = pipeline
%Pipeline{pipeline | assigns: Map.put(assigns, key, value)}
end
@doc """
Puts the `key` with value equal to `value` into `metadata` map.
Note: Use of atom keys in metadata is deprecated in favour of binary strings.
"""
def assign_metadata(%Pipeline{} = pipeline, key, value) when is_binary(key) or is_atom(key) do
%Pipeline{metadata: metadata} = pipeline
%Pipeline{pipeline | metadata: Map.put(metadata, key, value)}
end
@doc """
Has the pipeline been halted?
"""
def halted?(%Pipeline{halted: halted}), do: halted
@doc """
Halts the pipeline by preventing further middleware downstream from being invoked.
Prevents dispatch of the command if `halt` occurs in a `before_dispatch` callback.
"""
def halt(%Pipeline{} = pipeline) do
%Pipeline{pipeline | halted: true} |> respond({:error, :halted})
end
@doc """
Extract the response from the pipeline
"""
def response(%Pipeline{response: response}), do: response
@doc """
Sets the response to be returned to the dispatch caller, unless already set.
"""
def respond(%Pipeline{response: nil} = pipeline, response) do
%Pipeline{pipeline | response: response}
end
def respond(%Pipeline{} = pipeline, _response), do: pipeline
@doc """
Executes the middleware chain.
"""
def chain(pipeline, stage, middleware)
def chain(%Pipeline{} = pipeline, _stage, []), do: pipeline
def chain(%Pipeline{halted: true} = pipeline, :before_dispatch, _middleware), do: pipeline
def chain(%Pipeline{halted: true} = pipeline, :after_dispatch, _middleware), do: pipeline
def chain(%Pipeline{} = pipeline, stage, [module | modules]) do
chain(apply(module, stage, [pipeline]), stage, modules)
end
end
# source: lib/commanded/middleware/pipeline.ex
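A minimal middleware built on this struct might look like the sketch below. The module name and assigned keys are illustrative; the three callbacks are those required by the `Commanded.Middleware` behaviour:

```elixir
defmodule MyApp.Middleware.Audit do
  @behaviour Commanded.Middleware

  alias Commanded.Middleware.Pipeline
  import Pipeline

  def before_dispatch(%Pipeline{command: command} = pipeline) do
    pipeline
    |> assign(:dispatched_at, DateTime.utc_now())
    |> assign_metadata("command", inspect(command.__struct__))
  end

  def after_dispatch(pipeline), do: pipeline

  def after_failure(pipeline), do: pipeline
end
```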
defmodule Tesla.Middleware.ConsulWatch do
@moduledoc """
Fills the request so it results in a Consul blocking query.
## Example usage
```
defmodule Myclient do
use Tesla
plug Tesla.Middleware.ConsulWatch, wait: 60_000
end
```
"""
@behaviour Tesla.Middleware
use GenServer
@default_wait 60_000
@header_x_consul_index "x-consul-index"
@index_table Module.concat(__MODULE__, "Indexes")
def reset(%{url: url}) do
GenServer.call(__MODULE__, {:reset, url})
end
def start_link(_opts \\ []) do
GenServer.start_link(__MODULE__, [], name: __MODULE__)
end
@impl GenServer
def init(_) do
@index_table = :ets.new(@index_table, [:named_table, :public])
{:ok, []}
end
@impl GenServer
def handle_call({:reset, url}, _from, state) do
:ets.delete(@index_table, url)
{:reply, :ok, state}
end
@impl Tesla.Middleware
def call(env, next, opts) do
env
|> load_index(opts)
|> Tesla.run(next)
|> store_index()
end
defp load_index(%{method: :get, url: url} = env, opts) do
case :ets.lookup(@index_table, url) do
[] ->
env
[{_, index}] ->
wait =
Keyword.get(opts, :wait, @default_wait)
|> to_gotime()
env
|> Map.update!(:query, &(&1 ++ [wait: wait, index: index]))
end
end
defp load_index(env, _opts), do: env
def to_gotime(duration) do
ms =
case rem(duration, 1_000) do
0 ->
""
ms ->
<<"0", ms::binary>> = to_string(ms / 1_000)
ms
end
duration = div(duration, 1_000)
s =
case rem(duration, 60) do
0 -> if(ms == "", do: "", else: "0#{ms}s")
s -> "#{s}#{ms}s"
end
duration = div(duration, 60)
m =
case rem(duration, 60) do
0 -> s
m -> "#{m}m#{s}"
end
case div(duration, 60) do
0 -> m
h -> "#{h}h#{m}"
end
end
def store_index({:ok, %{url: url} = env}) do
case Tesla.get_header(env, @header_x_consul_index) do
nil ->
:ets.delete(@index_table, url)
index ->
:ets.insert(@index_table, {url, index})
end
{:ok, env}
end
def store_index({:error, reason}), do: {:error, reason}
end
# source: lib/tesla/middleware/consul_watch.ex
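Tracing `to_gotime/1` by hand for a few wait values shows the Go-style duration strings Consul expects for its blocking queries:

```elixir
Tesla.Middleware.ConsulWatch.to_gotime(60_000)    # => "1m"
Tesla.Middleware.ConsulWatch.to_gotime(90_500)    # => "1m30.5s"
Tesla.Middleware.ConsulWatch.to_gotime(3_600_000) # => "1h"
Tesla.Middleware.ConsulWatch.to_gotime(500)       # => "0.5s"
```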
defmodule RDF.XSD.Utils.Regex do
@moduledoc !"""
XSD-flavoured regex matching.
This is not intended to be used directly.
Use `c:RDF.XSD.Datatype.matches?/3` implementations on the datatypes or
`RDF.Literal.matches?/3` instead.
"""
@doc """
Matches the string representation of the given value against a XPath and XQuery regular expression pattern.
The regular expression language is defined in _XQuery 1.0 and XPath 2.0 Functions and Operators_.
see <https://www.w3.org/TR/xpath-functions/#func-matches>
"""
@spec matches?(String.t(), String.t(), String.t()) :: boolean
def matches?(value, pattern, flags \\ "") do
string = to_string(value)
case xpath_pattern(pattern, flags) do
{:regex, regex} ->
Regex.match?(regex, string)
{:q, pattern} ->
String.contains?(string, pattern)
{:qi, pattern} ->
string
|> String.downcase()
|> String.contains?(String.downcase(pattern))
{:error, error} ->
raise "Invalid XQuery regex pattern or flags: #{inspect(error)}"
end
end
@spec xpath_pattern(String.t(), String.t()) ::
{:q | :qi, String.t()} | {:regex, Regex.t()} | {:error, any}
def xpath_pattern(pattern, flags)
def xpath_pattern(pattern, flags) when is_binary(pattern) and is_binary(flags) do
q_pattern(pattern, flags) || xpath_regex_pattern(pattern, flags)
end
defp q_pattern(pattern, flags) do
if String.contains?(flags, "q") and String.replace(flags, ~r/[qi]/, "") == "" do
{if(String.contains?(flags, "i"), do: :qi, else: :q), pattern}
end
end
defp xpath_regex_pattern(pattern, flags) do
with {:ok, regex} <-
pattern
|> convert_utf_escaping()
|> Regex.compile(xpath_regex_flags(flags)) do
{:regex, regex}
end
end
@spec convert_utf_escaping(String.t()) :: String.t()
def convert_utf_escaping(string) do
require Integer
xpath_unicode_regex = ~r/(\\*)\\U([0-9]|[A-F]|[a-f]){2}(([0-9]|[A-F]|[a-f]){6})/
[first | possible_matches] = Regex.split(xpath_unicode_regex, string, include_captures: true)
[
first
| Enum.map_every(possible_matches, 2, fn possible_xpath_unicode ->
[_, escapes, _, codepoint, _] = Regex.run(xpath_unicode_regex, possible_xpath_unicode)
if escapes |> String.length() |> Integer.is_odd() do
"#{escapes}\\u{#{codepoint}}"
else
"\\" <> possible_xpath_unicode
end
end)
]
|> Enum.join()
end
defp xpath_regex_flags(flags) do
String.replace(flags, "q", "") <> "u"
end
end | lib/rdf/xsd/utils/regex.ex | 0.847716 | 0.6612 | regex.ex | starcoder |
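A few illustrative calls to `matches?/3` above, tracing the pattern dispatch: no flags compiles a Unicode regex, `"q"` treats the pattern as a literal string, and `"qi"` makes the literal match case-insensitive.

```elixir
# Illustrative usage; results follow from the implementation above.
iex> RDF.XSD.Utils.Regex.matches?("foobar", "foo")
true
iex> RDF.XSD.Utils.Regex.matches?("foo.bar", "foo.bar", "q")   # "q" disables regex metacharacters
true
iex> RDF.XSD.Utils.Regex.matches?("FooBar", "foobar", "qi")    # case-insensitive literal match
true
```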
defmodule Liaison.NodeHelper do
@moduledoc """
NodeHelper converts the various supported node representations
into fully qualified nodenames
### Auto Localhost Calculation
Ever tried to connect to another local node when your machine name
is not fun to enter? Now all you need to do is enter a name for the
node and localhost is automatically appended
### Node Formats
The following is a list of valid node formats and how they are
expanded
* `name` -> `:name@localhost`
* `name@host` -> `:name@host`
* `name@a.b.c.d` -> `:"name@a.b.c.d"`
* `%{name: name}` -> `:name@localhost`
* `%{name: :random}` -> random 6-character name, eg: `:atj2ld@localhost`
* `%{name: [random: 3]}` -> random 3-character name, eg: `:d5h@localhost`
* `%{name: name, host: host}` -> `:name@host`
* `%{name: name, host: :shortname}` -> `:name@localhost`
* `%{name: name, host: :longname}` -> `:"name@<local IP>"`, eg: `:"name@192.168.0.2"`
"""
alias Liaison.IP
@type nodename :: atom()
@type node_format :: String.t() | atom() | map()
@doc """
Expands a node format into a fully qualified nodename
"""
@spec to_nodename(node_format) :: nodename
def to_nodename(format) when is_binary(format) do
case String.split(format, "@") do
[name] -> "#{name}@#{get_localhost()}"
[name, host] -> "#{name}@#{host}"
end
|> String.to_atom()
end
def to_nodename(format) when is_atom(format) do
Atom.to_string(format)
|> to_nodename()
end
def to_nodename(%{name: :random}) do
random_name()
|> to_nodename()
end
def to_nodename(%{name: [random: n]}) do
random_name(n)
|> to_nodename()
end
def to_nodename(%{name: name, host: host}) do
cond do
host == :shortname -> "#{name}"
host == :longname -> "#{name}@#{get_localhost_ip()}"
true -> "#{name}@#{host}"
end
|> to_nodename()
end
def to_nodename(%{name: name}) do
to_nodename(name)
end
@doc """
Returns the localhost name of the machine
"""
@spec get_localhost() :: charlist()
def get_localhost() do
elem(:inet.gethostname(), 1)
end
@doc """
Returns the first ip4 address for the machine
"""
@spec get_localhost_ip() :: String.t()
def get_localhost_ip() do
IP.as_string(IP.get_ipv4_addr())
end
@doc """
Returns a random alphanumeric node name of length `len`
"""
@spec random_name(integer) :: String.t()
def random_name(len \\ 6) do
opts =
Enum.to_list(?a..?z) ++
Enum.to_list(?A..?Z) ++
Enum.to_list(?1..?9)
Enum.map(1..len, fn _ -> Enum.random(opts) end)
|> List.to_string()
end
end | lib/node_helper.ex | 0.663015 | 0.603114 | node_helper.ex | starcoder |
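Two illustrative conversions for `to_nodename/1` above; the localhost portion in the other formats depends on the machine's hostname, so only host-explicit forms are shown.

```elixir
# Illustrative conversions with explicit hosts (machine-independent results).
iex> Liaison.NodeHelper.to_nodename("name@host")
:name@host
iex> Liaison.NodeHelper.to_nodename(%{name: "worker", host: "example.com"})
:"worker@example.com"
```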
defmodule Andi.InputSchemas.DatasetInput do
@moduledoc """
Module for validating Ecto.Changesets on flattened dataset input.
"""
import Ecto.Changeset
alias Andi.DatasetCache
alias Andi.InputSchemas.DatasetSchemaValidator
alias Andi.InputSchemas.Options
alias Andi.InputSchemas.KeyValue
@business_fields %{
benefitRating: :float,
contactEmail: :string,
contactName: :string,
dataTitle: :string,
description: :string,
homepage: :string,
issuedDate: :date,
keywords: {:array, :string},
language: :string,
license: :string,
modifiedDate: :date,
orgTitle: :string,
publishFrequency: :string,
riskRating: :float,
spatial: :string,
temporal: :string
}
@technical_fields %{
dataName: :string,
orgName: :string,
private: :boolean,
schema: {:array, :map},
sourceFormat: :string,
sourceHeaders: {:embed, KeyValue.relationship_definition(:sourceHeaders)},
sourceQueryParams: {:embed, KeyValue.relationship_definition(:sourceQueryParams)},
sourceType: :string,
sourceUrl: :string,
topLevelSelector: :string
}
@types %{id: :string}
|> Map.merge(@business_fields)
|> Map.merge(@technical_fields)
@key_value_type_keys [:sourceQueryParams, :sourceHeaders]
@non_embedded_types Map.drop(@types, @key_value_type_keys)
@required_fields [
:benefitRating,
:contactEmail,
:contactName,
:dataName,
:dataTitle,
:description,
:issuedDate,
:license,
:orgName,
:orgTitle,
:private,
:publishFrequency,
:riskRating,
:sourceFormat,
:sourceType,
:sourceUrl
]
@email_regex ~r/^[\w\_\~\!\$\&\'\(\)\*\+\,\;\=\:.-]+@[\w.-]+\.[\w.-]+?$/
@no_dashes_regex ~r/^[^\-]+$/
@ratings Map.keys(Options.ratings())
def business_keys(), do: Map.keys(@business_fields)
def technical_keys(), do: Map.keys(@technical_fields)
def key_value_keys(), do: @key_value_type_keys
def light_validation_changeset(changes), do: light_validation_changeset(%{}, changes)
def light_validation_changeset(schema, changes) do
{schema, @types}
|> cast(changes, Map.keys(@non_embedded_types), empty_values: [])
|> cast_embedded()
|> validate_required(@required_fields, message: "is required")
|> validate_format(:contactEmail, @email_regex)
|> validate_format(:orgName, @no_dashes_regex, message: "cannot contain dashes")
|> validate_format(:dataName, @no_dashes_regex, message: "cannot contain dashes")
|> validate_inclusion(:benefitRating, @ratings, message: "should be one of #{inspect(@ratings)}")
|> validate_inclusion(:riskRating, @ratings, message: "should be one of #{inspect(@ratings)}")
|> validate_top_level_selector()
|> validate_schema()
|> validate_key_value_parameters()
end
def full_validation_changeset(changes), do: full_validation_changeset(%{}, changes)
def full_validation_changeset(schema, changes) do
light_validation_changeset(schema, changes) |> validate_unique_system_name()
end
def add_key_value(changeset, field, %{} = param \\ %{}) do
new_key_value_changeset = KeyValue.changeset(%KeyValue{}, param)
change =
case fetch_change(changeset, field) do
{:ok, params} -> params ++ [new_key_value_changeset]
_ -> [new_key_value_changeset]
end
put_change(changeset, field, change)
end
def remove_key_value(changeset, field, id) do
update_change(changeset, field, fn params ->
Enum.filter(params, fn param -> param.changes.id != id end)
end)
|> validate_key_value_parameters()
|> update_source_url()
end
def adjust_source_url_for_query_params(changeset) do
changeset
|> update_source_url()
|> adjust_source_query_params_for_url()
end
def adjust_source_query_params_for_url(changeset) do
source_url = Ecto.Changeset.get_field(changeset, :sourceUrl)
case Andi.URI.extract_query_params(source_url) do
{:ok, params} ->
key_value_changes = Enum.map(params, &convert_param_to_kv/1)
changeset
|> put_change(:sourceQueryParams, key_value_changes)
|> validate_key_value_parameters()
_ ->
changeset
end
end
defp update_source_url(changeset) do
source_url = Ecto.Changeset.get_field(changeset, :sourceUrl)
source_query_params = Ecto.Changeset.get_field(changeset, :sourceQueryParams, [])
updated_source_url = Andi.URI.update_url_with_params(source_url, source_query_params)
put_change(changeset, :sourceUrl, updated_source_url)
end
defp convert_param_to_kv({k, v}) do
KeyValue.changeset(%KeyValue{}, %{key: k, value: v})
end
defp cast_embedded(changeset) do
Enum.reduce(@key_value_type_keys, changeset, fn key, acc_changeset -> cast_embed(acc_changeset, key) end)
end
defp validate_unique_system_name(changeset) do
if has_unique_data_and_org_name?(changeset) do
changeset
else
add_error(changeset, :dataName, "existing dataset has the same orgName and dataName")
end
end
defp has_unique_data_and_org_name?(%{changes: changes}) do
DatasetCache.get_all()
|> Enum.filter(&Map.has_key?(&1, "dataset"))
|> Enum.map(& &1["dataset"])
|> Enum.all?(fn existing_dataset ->
changes[:orgName] != existing_dataset.technical.orgName ||
changes[:dataName] != existing_dataset.technical.dataName ||
changes[:id] == existing_dataset.id
end)
end
defp validate_top_level_selector(%{changes: %{sourceFormat: source_format}} = changeset)
when source_format in ["xml", "text/xml"] do
validate_required(changeset, [:topLevelSelector], message: "is required")
end
defp validate_top_level_selector(changeset), do: changeset
defp validate_schema(%{changes: %{sourceType: source_type}} = changeset)
when source_type in ["ingest", "stream"] do
case Map.get(changeset.changes, :schema) do
[] -> add_error(changeset, :schema, "cannot be empty")
nil -> add_error(changeset, :schema, "is required", validation: :required)
_ -> validate_schema_internals(changeset)
end
end
defp validate_schema(changeset), do: changeset
defp validate_schema_internals(%{changes: changes} = changeset) do
DatasetSchemaValidator.validate(changes[:schema], changes[:sourceFormat])
|> Enum.reduce(changeset, fn error, changeset_acc -> add_error(changeset_acc, :schema, error) end)
end
defp validate_key_value_parameters(changeset) do
[:sourceQueryParams, :sourceHeaders]
|> Enum.reduce(changeset, fn field, acc_changeset ->
acc_changeset = clear_field_errors(acc_changeset, field)
if has_invalid_key_values?(acc_changeset, field) do
add_error(acc_changeset, field, "has invalid format", validation: :format)
else
acc_changeset
end
end)
end
defp has_invalid_key_values?(%{changes: changes}, field) do
case Map.get(changes, field) do
nil ->
false
key_value_changesets ->
Enum.any?(key_value_changesets, fn key_value_changeset -> not key_value_changeset.valid? end)
end
end
defp clear_field_errors(changeset, field) do
Map.update(changeset, :errors, [], fn errors -> Keyword.delete(errors, field) end)
end
end | apps/andi/lib/andi/input_schemas/dataset_input.ex | 0.69285 | 0.425546 | dataset_input.ex | starcoder |
defmodule MangoPay.Hook do
@moduledoc """
Functions for MangoPay [hook](https://docs.mangopay.com/endpoints/v2.01/hooks#e246_the-hook-object).
"""
use MangoPay.Query.Base
set_path "hooks"
@doc """
Get a hook.
## Examples
{:ok, hook} = MangoPay.Hook.get(id)
"""
def get id do
_get id
end
@doc """
Get a hook.
## Examples
hook = MangoPay.Hook.get!(id)
"""
def get! id do
_get! id
end
@doc """
Create a hook.
## Examples
params = %{
"Tag": "custom meta",
"EventType": "PAYIN_NORMAL_CREATED",
"Url": "http://www.my-site.com/hooks/"
}
{:ok, hook} = MangoPay.Hook.create(params)
"""
def create params do
_create params
end
@doc """
Create a hook.
## Examples
params = %{
"Tag": "custom meta",
"EventType": "PAYIN_NORMAL_CREATED",
"Url": "http://www.my-site.com/hooks/"
}
hook = MangoPay.Hook.create!(params)
"""
def create! params do
_create! params
end
@doc """
Update a hook.
## Examples
params = %{
"Tag": "custom meta",
"Status": "ENABLED",
"Url": "http://www.my-site.com/hooks/"
}
{:ok, hook} = MangoPay.Hook.update(id, params)
"""
def update id, params do
_update params, id
end
@doc """
Update a hook.
## Examples
params = %{
"Tag": "custom meta",
"Status": "ENABLED",
"Url": "http://www.my-site.com/hooks/"
}
hook = MangoPay.Hook.update!(id, params)
"""
def update! id, params do
_update! params, id
end
@doc """
List all hooks.
## Examples
query = %{
"Page": 1,
"Per_Page": 25,
"Sort": "CreationDate:DESC"
}
{:ok, hooks} = MangoPay.Hook.all(query)
"""
def all(query \\ %{}) do
_all(nil, query)
end
@doc """
List all hooks.
## Examples
query = %{
"Page": 1,
"Per_Page": 25,
"Sort": "CreationDate:DESC"
}
hooks = MangoPay.Hook.all!(query)
"""
def all!(query \\ %{}) do
_all!(nil, query)
end
end | lib/mango_pay/hook.ex | 0.746509 | 0.420332 | hook.ex | starcoder |
defmodule Qex do
@moduledoc ~S"""
A `:queue` wrapper with an improved API and added protocol implementations
## Protocols
`Inspect`, `Collectable` and `Enumerable` are implemented
iex> inspect Qex.new
"#Qex<[]>"
iex> Enum.count Qex.new(1..5)
5
iex> Enum.empty? Qex.new
true
iex> Enum.map Qex.new([1, 2, 3]), &(&1 + 1)
[2, 3, 4]
iex> inspect Enum.into(1..5, %Qex{})
"#Qex<[1, 2, 3, 4, 5]>"
"""
@opaque t(type) :: %__MODULE__{:data => :queue.queue(type)}
@opaque t() :: %__MODULE__{:data => :queue.queue()}
defstruct data: :queue.new
@doc """
Create a new queue from a range
iex> inspect Qex.new(1..3)
"#Qex<[1, 2, 3]>"
Create a new queue from a list
iex> inspect Qex.new([1, 2, 3])
"#Qex<[1, 2, 3]>"
"""
@spec new([term] | Range.t) :: t
def new(init_data \\ [])
def new(x..y) do
%__MODULE__{data: :queue.from_list(Enum.to_list(x..y))}
end
def new(list) do
%__MODULE__{data: :queue.from_list(list)}
end
@doc """
Add an element to the back of the queue
iex> q = Qex.new([:mid])
iex> Enum.to_list Qex.push(q, :back)
[:mid, :back]
"""
@spec push(t, term) :: t
def push(%__MODULE__{data: q}, item) do
%__MODULE__{data: :queue.in(item, q)}
end
@doc """
Add an element to the front of the queue
iex> q = Qex.new([:mid])
iex> Enum.to_list Qex.push_front(q, :front)
[:front, :mid]
"""
@spec push_front(t, term) :: t
def push_front(%__MODULE__{data: q}, item) do
%__MODULE__{data: :queue.in_r(item, q)}
end
@doc """
Get and remove an element from the front of the queue
iex> q = Qex.new([:front, :mid])
iex> {{:value, item}, _q} = Qex.pop(q)
iex> item
:front
iex> q = Qex.new
iex> {empty, _q} = Qex.pop(q)
iex> empty
:empty
"""
@spec pop(t) :: {{:value, term}, t} | {:empty, t}
def pop(%__MODULE__{data: q}) do
case :queue.out(q) do
{{:value, v}, q} -> {{:value, v}, %__MODULE__{data: q}}
{:empty, q} -> {:empty, %__MODULE__{data: q}}
end
end
@doc """
Get and remove an element from the front of the queue, raising if the queue is empty
iex> q = Qex.new([:front, :mid])
iex> {item, _q} = Qex.pop!(q)
iex> item
:front
"""
@spec pop!(t) :: {term, t} | no_return
def pop!(%__MODULE__{data: q}) do
case :queue.out(q) do
{{:value, v}, q} -> {v, %__MODULE__{data: q}}
{:empty, _q} -> raise "Queue is empty"
end
end
@doc """
Get and remove an element from the back of the queue
iex> q = Qex.new([:mid, :back])
iex> {{:value, item}, _q} = Qex.pop_back(q)
iex> item
:back
iex> q = Qex.new
iex> {empty, _q} = Qex.pop_back(q)
iex> empty
:empty
"""
@spec pop_back(t) :: {{:value, term}, t} | {:empty, t}
def pop_back(%__MODULE__{data: q}) do
case :queue.out_r(q) do
{{:value, v}, q} -> {{:value, v}, %__MODULE__{data: q}}
{:empty, q} -> {:empty, %__MODULE__{data: q}}
end
end
@doc """
Get and remove an element from the back of the queue, raising if the queue is empty
iex> q = Qex.new([:mid, :back])
iex> {item, _q} = Qex.pop_back!(q)
iex> item
:back
"""
@spec pop_back!(t) :: {term, t} | no_return
def pop_back!(%__MODULE__{data: q}) do
case :queue.out_r(q) do
{{:value, v}, q} -> {v, %__MODULE__{data: q}}
{:empty, _q} -> raise "Queue is empty"
end
end
@doc """
Reverse a queue
iex> q = Qex.new(1..3)
iex> Enum.to_list q
[1, 2, 3]
iex> Enum.to_list Qex.reverse(q)
[3, 2, 1]
"""
@spec reverse(t) :: t
def reverse(%__MODULE__{data: q}) do
%__MODULE__{data: :queue.reverse(q)}
end
@doc """
Split a queue into two, the front n items are put in the first queue
iex> q = Qex.new 1..5
iex> {q1, q2} = Qex.split(q, 3)
iex> Enum.to_list q1
[1, 2, 3]
iex> Enum.to_list q2
[4, 5]
"""
@spec split(t, pos_integer) :: {t, t}
def split(%__MODULE__{data: q}, n) do
with {q1, q2} <- :queue.split(n, q) do
{%__MODULE__{data: q1}, %__MODULE__{data: q2}}
end
end
@doc """
Join two queues together
iex> q1 = Qex.new 1..3
iex> q2 = Qex.new 4..5
iex> Enum.to_list Qex.join(q1, q2)
[1, 2, 3, 4, 5]
"""
@spec join(t, t) :: t
def join(%__MODULE__{data: q1}, %__MODULE__{data: q2}) do
%__MODULE__{data: :queue.join(q1, q2)}
end
@doc """
Return the first item in the queue in a {:value, term} tuple;
return :empty if the queue is empty
iex> q1 = Qex.new 1..3
iex> Qex.first(q1)
{:value, 1}
iex> q2 = Qex.new []
iex> Qex.first(q2)
:empty
"""
@spec first(t) :: {:value, term} | :empty
def first(%__MODULE__{data: q}) do
:queue.peek(q)
end
@doc """
Return the first item in the queue, raising if it's empty
iex> q1 = Qex.new 1..3
iex> Qex.first!(q1)
1
"""
@spec first!(t) :: term | no_return
def first!(%__MODULE__{data: q}) do
case :queue.peek(q) do
{:value, v} -> v
:empty -> raise "Queue is empty"
end
end
@doc """
Return the last item in the queue in a {:value, term} tuple;
return :empty if the queue is empty
iex> q1 = Qex.new 1..3
iex> Qex.last(q1)
{:value, 3}
iex> q2 = Qex.new []
iex> Qex.last(q2)
:empty
"""
@spec last(t) :: {:value, term} | :empty
def last(%__MODULE__{data: q}) do
:queue.peek_r(q)
end
@doc """
Return the last item in the queue, raising if it's empty
iex> q1 = Qex.new 1..3
iex> Qex.last!(q1)
3
"""
@spec last!(t) :: term | no_return
def last!(%__MODULE__{data: q}) do
case :queue.peek_r(q) do
{:value, v} -> v
:empty -> raise "Queue is empty"
end
end
end | lib/qex.ex | 0.880855 | 0.551211 | qex.ex | starcoder |
defmodule Stargate.Receiver.Dispatcher do
@moduledoc """
Defines the `Stargate.Receiver.Dispatcher` GenStage process
that functions as the producer in the pipeline, receiving messages
pushed from the reader or consumer socket and dispatching to the
rest of the pipeline.
"""
use GenStage
import Stargate.Supervisor, only: [via: 2]
defmodule State do
@moduledoc """
Defines the struct used by a `Stargate.Receiver.Dispatcher`
to store its state.
Includes the type of the receiver (consumer or reader), the name
of the process registry associated to the supervision tree, the
path parameters of the topic (tenant, namespace, topic), the atom
key of the websocket connection within the process registry, and
whether or not the receiver is in push or pull mode if it's a consumer.
"""
defstruct [
:type,
:registry,
:persistence,
:tenant,
:namespace,
:topic,
:pull_mode,
:receiver
]
end
@type raw_message :: String.t()
@doc """
Push messages received over the reader or consumer connection into the
GenStage processing pipeline for handling and acknowledgement. This is normally
handled automatically by the websocket connection but can also be called directly
for testing the receive pipeline.
"""
@spec push(GenServer.server(), [raw_message()] | raw_message()) :: :ok
def push(dispatcher, messages), do: GenServer.cast(dispatcher, {:push, messages})
@doc """
Starts a `Stargate.Receiver.Dispatcher` GenStage process and links it to
the calling process.
"""
@spec start_link(keyword()) :: GenServer.on_start()
def start_link(init_args) do
registry = Keyword.fetch!(init_args, :registry)
persistence = Keyword.get(init_args, :persistence, "persistent")
tenant = Keyword.fetch!(init_args, :tenant)
ns = Keyword.fetch!(init_args, :namespace)
topic = Keyword.fetch!(init_args, :topic)
GenStage.start_link(__MODULE__, init_args,
name: via(registry, {:dispatcher, "#{persistence}", "#{tenant}", "#{ns}", "#{topic}"})
)
end
@impl GenStage
def init(init_args) do
type = Keyword.fetch!(init_args, :type)
persistence = Keyword.get(init_args, :persistence, "persistent")
tenant = Keyword.fetch!(init_args, :tenant)
ns = Keyword.fetch!(init_args, :namespace)
topic = Keyword.fetch!(init_args, :topic)
pull =
case get_in(init_args, [:query_params, :pull_mode]) do
true -> true
_ -> false
end
state = %State{
registry: Keyword.fetch!(init_args, :registry),
persistence: persistence,
tenant: tenant,
namespace: ns,
topic: topic,
pull_mode: pull,
receiver: {:"#{type}", "#{persistence}", "#{tenant}", "#{ns}", "#{topic}"}
}
{:ok, _receiver} = Stargate.Receiver.start_link(init_args)
{:producer, state}
end
@impl GenStage
def handle_cast({:push, messages}, state) when is_list(messages) do
{:noreply, messages, state}
end
@impl GenStage
def handle_cast({:push, message}, state), do: {:noreply, [message], state}
@impl GenStage
def handle_demand(demand, %{pull_mode: true} = state) do
receiver = via(state.registry, state.receiver)
Stargate.Receiver.pull_permit(receiver, demand)
{:noreply, [], state}
end
@impl GenStage
def handle_demand(_, state), do: {:noreply, [], state}
@impl GenStage
def handle_info(_, state), do: {:noreply, [], state}
end | lib/stargate/receiver/dispatcher.ex | 0.842604 | 0.486758 | dispatcher.ex | starcoder |
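As the `push/2` docs above note, messages can be injected directly for testing the receive pipeline. A hypothetical usage sketch (the registry name and topic coordinates are placeholders from your own supervision tree):

```elixir
# Hypothetical test usage; the via tuple mirrors how start_link/1 registers the dispatcher.
dispatcher =
Stargate.Supervisor.via(MyApp.Registry, {:dispatcher, "persistent", "tenant", "ns", "topic"})
:ok = Stargate.Receiver.Dispatcher.push(dispatcher, ["message-1", "message-2"])
```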
defmodule Kayrock.Convenience do
@moduledoc """
Convenience functions for working with Kayrock / Kafka API data
"""
alias Kayrock.ErrorCode
def topic_exists?(pid, topic) when is_pid(pid) do
{:ok, [topic]} = Kayrock.topics_metadata(pid, [topic])
topic[:error_code] != ErrorCode.unknown_topic()
end
def partition_last_offset(client_pid, topic, partition) do
%{^partition => offset} = partitions_last_offset(client_pid, topic, [partition])
offset
end
@doc """
Returns a map of partition to offset for partitions on the same leader
All partitions must be on the same leader or you will get -1 for the offset
"""
def partitions_last_offset(client_pid, topic, partitions) do
[first_partition | _] = partitions
partition_requests =
for partition <- partitions do
%{partition: partition, timestamp: -1}
end
request = %Kayrock.ListOffsets.V1.Request{
replica_id: -1,
topics: [%{topic: topic, partitions: partition_requests}]
}
{:ok, resp} =
Kayrock.client_call(client_pid, request, {:topic_partition, topic, first_partition})
[topic_resp] = resp.responses
Enum.into(topic_resp.partition_responses, %{}, fn %{partition: partition, offset: offset} ->
{partition, offset}
end)
end
def topic_last_offsets(client_pid, topic) do
partition_leaders = get_partition_leaders(client_pid, topic)
partitions_by_node =
Enum.reduce(partition_leaders, %{}, fn {partition, node}, acc ->
Map.update(acc, node, [partition], fn existing_partitions ->
[partition | existing_partitions]
end)
end)
Enum.reduce(partitions_by_node, %{}, fn {_, partitions}, acc ->
Map.merge(acc, partitions_last_offset(client_pid, topic, partitions))
end)
end
def get_partition_leaders(client_pid, topic) do
{:ok, [metadata]} = Kayrock.topics_metadata(client_pid, [topic])
Enum.into(metadata.partition_metadata, %{}, fn partition_metadata ->
{partition_metadata.partition, partition_metadata.leader}
end)
end
end | lib/kayrock/convenience.ex | 0.757166 | 0.428293 | convenience.ex | starcoder |
defmodule AwsExRay.Config do
@moduledoc """
Config Accessor Module
## Configuration
```elixir
config :aws_ex_ray,
sampling_rate: 0.1,
default_annotation: %{foo: "bar"},
default_metadata: %{bar: "buz"}
```
|key|default|description|
|:--|:--|:--|
|sampling_rate|0.1|set number between 0.0 - 1.0. recommended that set 0.0 for 'test' environment|
|default_annotation|%{}|annotation parameters automatically put into segment/subsegment|
|default_metadata|%{}|metadata parameters automatically put into segment/subsegment|
|daemon_address|127.0.0.1|your xray daemon's IP address. typically, you don't need to customize this.|
|daemon_port|2000|your xray daemon's port. typically, you don't need to customize this.|
|default_client_pool_size|10|number of UDP client which connects to xray daemon|
|default_client_pool_overflow|100|overflow capacity size of UDP client|
|default_store_monitor_pool_size|10|number of tracing-process-monitor|
"""
@default_sampling_rate 0.1
@default_daemon_address "127.0.0.1"
@default_daemon_port 2000
@default_client_pool_size 10
@default_client_pool_overflow 100
@default_store_monitor_pool_size 10
@default_client_module AwsExRay.Client.UDPClientSupervisor
@default_sandbox_sink_module AwsExRay.Client.Sandbox.Sink.Ignore
@spec library_name() :: String.t
def library_name(), do: "aws-ex-ray"
@spec library_version() :: String.t
def library_version(), do: "0.0.1"
@spec get(atom, any) :: any
def get(key, default) do
Application.get_env(:aws_ex_ray, key, default)
end
@spec sampling_rate() :: float
def sampling_rate() do
get(:sampling_rate,
@default_sampling_rate)
end
@spec daemon_address() :: :inet.ip_address()
def daemon_address() do
address = get(:daemon_address,
@default_daemon_address)
charlist_address = address |> String.to_charlist()
{:ok, ip_address} =
case :inet.getaddr(charlist_address, :inet) do
{:ok, ip_address} ->
{:ok, ip_address}
{:error, _} ->
:inet.getaddr(charlist_address, :inet6)
end
ip_address
end
@spec daemon_port() :: non_neg_integer
def daemon_port() do
get(:daemon_port,
@default_daemon_port)
end
@spec default_annotation() :: map
def default_annotation() do
get(:default_annotation, %{})
end
@spec default_metadata() :: map
def default_metadata() do
get(:default_metadata, %{})
end
@spec store_monitor_pool_size() :: non_neg_integer
def store_monitor_pool_size() do
get(:store_monitor_pool_size,
@default_store_monitor_pool_size)
end
@spec client_pool_size() :: non_neg_integer
def client_pool_size() do
get(:client_pool_size,
@default_client_pool_size)
end
@spec client_pool_overflow() :: non_neg_integer
def client_pool_overflow() do
get(:client_pool_overflow,
@default_client_pool_overflow)
end
@spec client_module() :: module
def client_module() do
get(:client_module,
@default_client_module)
end
@spec sandbox_sink_module() :: module
def sandbox_sink_module() do
get(:sandbox_sink_module,
@default_sandbox_sink_module)
end
@spec service_version() :: String.t
def service_version() do
get(:service_version, "")
end
end | lib/aws_ex_ray/config.ex | 0.836154 | 0.707329 | config.ex | starcoder |
defmodule Authoritex.Mock do
@moduledoc """
Mock Authority for testing Authoritex consumers
Examples:
```
# In test.exs:
# config :authoritex, authorities: [Authoritex.Mock]
# In test_helper.exs:
# Authoritex.Mock.init()
# In test case:
iex> Authoritex.Mock.set_data([
%{id: "mock:result1", label: "First Result", qualified_label: "First Result (1)", hint: "(1)"},
%{id: "mock:result2", label: "Second Result", qualified_label: "Second Result (2)", hint: "(2)"},
%{id: "mock:result3", label: "Third Result", qualified_label: "Third Result (3)", hint: "(3)"}])
:ok
iex> Authoritex.fetch("mock:result2")
{:ok, %{id: "mock:result2", label: "Second Result", qualified_label: "Second Result (2)", hint: "(2)"}}
iex> Authoritex.fetch("missing_id_authority:anything")
{:error, 404}
iex> Authoritex.fetch("wrong")
{:error, :unknown_authority}
iex> Authoritex.search("mock", "test")
{:ok, [
%{id: "mock:result1", label: "First Result", hint: "(1)"},
%{id: "mock:result2", label: "Second Result", hint: "(2)"},
%{id: "mock:result3", label: "Third Result", hint: "(3)"}
]}
iex> Authoritex.search("mock", :no_results)
{:ok, []}
iex> Authoritex.search("mock", :error)
{:error, 500}
```
"""
@behaviour Authoritex
@impl Authoritex
def code, do: "mock"
@impl Authoritex
def description, do: "Authoritex Mock Authority for Test Suites"
@impl Authoritex
def can_resolve?(_id), do: true
@impl Authoritex
def fetch("missing_id_authority:" <> _id) do
{:error, 404}
end
def fetch(id) do
case Enum.find(get_data(), &(&1.id == id)) do
nil -> {:error, :unknown_authority}
record -> {:ok, record}
end
end
@impl Authoritex
def search(query, max_results \\ 5)
def search(:no_results, _), do: {:ok, []}
def search(:error, _), do: {:error, 500}
def search(_query, max_results) do
{:ok,
get_data()
|> Enum.map(&Map.delete(&1, :qualified_label))
|> Enum.take(max_results)}
rescue
ArgumentError -> {:error, 500}
end
def init do
:ets.new(__MODULE__, [:set, :named_table, :public])
rescue
ArgumentError -> __MODULE__
end
def set_data(data) when is_list(data) do
:ets.insert(Authoritex.Mock, {Kernel.inspect(self()), data})
:ok
end
defp get_data do
case :ets.lookup(Authoritex.Mock, Kernel.inspect(self())) do
[] -> []
[{_, data}] -> data
end
end
end | lib/authoritex/mock.ex | 0.826677 | 0.855187 | mock.ex | starcoder |
defmodule GGity.Scale.Linetype.Discrete do
@moduledoc false
alias GGity.{Draw, Labels}
alias GGity.Scale.Linetype
# solid: "",
# dashed: "4",
# dotted: "1",
# longdash: "6 2",
# dotdash: "1 2 3 2",
# twodash: "2 2 6 2"
@palette_values [
"",
"4",
"1",
"6 2",
"1 2 3 2",
"2 2 6 2"
]
defstruct transform: nil,
levels: nil,
labels: :waivers,
guide: :legend
@type t() :: %__MODULE__{}
@spec new(keyword()) :: Linetype.Discrete.t()
def new(options \\ []), do: struct(Linetype.Discrete, options)
@spec train(Linetype.Discrete.t(), list(binary())) :: Linetype.Discrete.t()
def train(scale, [level | _other_levels] = levels) when is_list(levels) and is_binary(level) do
transform = GGity.Scale.Discrete.transform(levels, palette(levels))
struct(scale, levels: levels, transform: transform)
end
defp palette(levels) do
@palette_values
|> Stream.cycle()
|> Enum.take(length(levels))
end
@spec draw_legend(Linetype.Discrete.t(), binary(), atom(), number(), keyword()) :: iolist()
def draw_legend(
%Linetype.Discrete{guide: :none},
_label,
_key_glyph,
_key_height,
_fixed_aesthetics
),
do: []
def draw_legend(
%Linetype.Discrete{levels: [_]},
_label,
_key_glyph,
_key_height,
_fixed_aesthetics
),
do: []
def draw_legend(
%Linetype.Discrete{levels: levels} = scale,
label,
key_glyph,
key_height,
fixed_aesthetics
) do
[
Draw.text(
"#{label}",
x: "0",
y: "-5",
class: "gg-text gg-legend-title",
text_anchor: "left"
),
Stream.with_index(levels)
|> Enum.map(fn {level, index} ->
draw_legend_item(scale, {level, index}, key_glyph, key_height, fixed_aesthetics)
end)
]
end
defp draw_legend_item(scale, {level, index}, key_glyph, key_height, fixed_aesthetics) do
[
Draw.rect(
x: "0",
y: "#{key_height * index}",
height: key_height,
width: key_height,
class: "gg-legend-key"
),
draw_key_glyph(scale, level, index, key_glyph, key_height, fixed_aesthetics),
Draw.text(
"#{Labels.format(scale, level)}",
x: "#{key_height + 5}",
y: "#{10 + key_height * index}",
class: "gg-text gg-legend-text",
text_anchor: "left"
)
]
end
defp draw_key_glyph(scale, level, index, :path, key_height, fixed_aesthetics) do
Draw.line(
x1: 1,
y1: key_height / 2 + key_height * index,
x2: key_height - 1,
y2: key_height / 2 + key_height * index,
stroke: fixed_aesthetics[:color],
stroke_dasharray: "#{scale.transform.(level)}",
stroke_opacity: fixed_aesthetics[:alpha]
)
end
defp draw_key_glyph(scale, level, index, :timeseries, key_height, fixed_aesthetics) do
offset = key_height * index
Draw.polyline(
[
{1, key_height - 1 + offset},
{key_height / 5 * 2, key_height / 5 * 2 + offset},
{key_height / 5 * 3, key_height / 5 * 3 + offset},
{key_height - 1, 1 + offset}
],
fixed_aesthetics[:color],
fixed_aesthetics[:size],
fixed_aesthetics[:linetype],
scale.transform.(level)
)
end
end | lib/ggity/scale/linetype_discrete.ex | 0.784443 | 0.435061 | linetype_discrete.ex | starcoder |
defmodule SanbaseWeb.Graphql.Resolvers.HistoricalBalanceResolver do
import Sanbase.Utils.ErrorHandling,
only: [maybe_handle_graphql_error: 2, handle_graphql_error: 3, handle_graphql_error: 4]
import SanbaseWeb.Graphql.Helpers.Utils, only: [calibrate_interval: 7]
alias Sanbase.Clickhouse.HistoricalBalance
# Return this number of datapoints if the provided interval is an empty string
@datapoints 300
def assets_held_by_address(_root, args, _resolution) do
selector =
case Map.get(args, :selector) do
nil -> %{infrastructure: "ETH", address: Map.fetch!(args, :address)}
selector -> selector
end
HistoricalBalance.assets_held_by_address(selector)
|> case do
{:ok, result} ->
# We do this, because many contracts emit a transfer
# event when minting new tokens by setting 0x00...000
# as the from address, hence 0x00...000 is "sending"
# tokens it does not have, which leads to a "negative" balance
result =
result
|> Enum.reject(fn %{balance: balance} -> balance < 0 end)
{:ok, result}
{:error, error} ->
{:error,
handle_graphql_error("Assets held by address", selector.address, error,
description: "address"
)}
end
end
def historical_balance(
_root,
%{selector: selector, from: from, to: to, interval: interval, address: address},
_resolution
) do
HistoricalBalance.historical_balance(
selector,
address,
from,
to,
interval
)
|> maybe_handle_graphql_error(fn error ->
handle_graphql_error(
"Historical Balances",
inspect(selector),
error,
description: "selector"
)
end)
end
def historical_balance(
_root,
%{slug: slug, from: from, to: to, interval: interval, address: address},
_resolution
) do
HistoricalBalance.historical_balance(
%{infrastructure: "ETH", slug: slug},
address,
from,
to,
interval
)
|> maybe_handle_graphql_error(fn error ->
handle_graphql_error("Historical Balances", slug, error)
end)
end
def miners_balance(
_root,
%{slug: slug, from: from, to: to, interval: interval},
_resolution
) do
with {:ok, from, to, interval} <-
calibrate_interval(
HistoricalBalance.MinersBalance,
slug,
from,
to,
interval,
86_400,
@datapoints
),
{:ok, balance} <-
HistoricalBalance.MinersBalance.historical_balance(slug, from, to, interval) do
{:ok, balance}
else
{:error, error} ->
{:error, handle_graphql_error("Miners Balance", slug, error)}
end
end
end | lib/sanbase_web/graphql/resolvers/historical_balance_resolver.ex | 0.799364 | 0.402979 | historical_balance_resolver.ex | starcoder |
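The negative-balance filter in `assets_held_by_address/3` above boils down to a single `Enum.reject/2` pass. A minimal, self-contained sketch of the same cleanup step (the sample balances here are hypothetical, not real resolver output):

```elixir
# Reject entries whose balance went "negative" because of mint
# transfers from the zero address, mirroring the resolver's cleanup.
balances = [
  %{slug: "eth", balance: 10.0},
  %{slug: "zero", balance: -3.5},
  %{slug: "dai", balance: 0.0}
]

cleaned = Enum.reject(balances, fn %{balance: balance} -> balance < 0 end)

IO.inspect(cleaned)
```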
defmodule Listerine.Channels do
@moduledoc """
This module provides functions to create and remove private channels.
"""
use Coxir
# Reads the channels from the config.json file.
defp get_courses() do
case File.read("config.json") do
{:ok, ""} -> nil
{:ok, body} -> Poison.decode!(body)["courses"]
_ -> nil
end
end
# Updates the config.json with the new courses.
defp save_courses(courses) do
new_map =
case File.read("config.json") do
{:ok, ""} ->
%{"courses" => courses}
{:ok, body} ->
map = Poison.decode!(body)
Map.put(map, "courses", courses)
_ ->
%{"courses" => courses}
end
File.write("config.json", Poison.encode!(new_map), [:binary])
end
@doc """
Generates an array with embed fields that contain all available courses.
"""
def generate_courses_embed_fields() do
possible_embed_fields =
for(year <- Map.keys(get_courses()), do: generate_courses_embed_field(year))
Enum.filter(possible_embed_fields, fn x -> Map.get(x, :value) != "" end)
end
# Generates a formatted field to use in an embed with all courses in the given year.
defp generate_courses_embed_field(year) do
%{
name: year <> "º ano",
value: get_courses_year(year),
inline: true
}
end
# Generates a string with all the courses in a year separated by a newline.
defp get_courses_year(year) do
courses_year_arr = Map.keys(Map.get(get_courses(), year))
Enum.join(courses_year_arr, "\n")
end
@doc """
Adds courses to the guild and registers them to the config.json file.
Returns the list of added courses or `nil` if none were added.
"""
def add_courses(guild, year, courses) do
courses = Enum.uniq(courses)
map_zeros = fn x -> Map.new(x, fn e -> {e, %{"role" => 0, "channels" => []}} end) end
{courses, new_map} =
case get_courses() do
nil ->
{courses, %{year => map_zeros.(courses)}}
map ->
case map[year] do
nil -> {courses, put_in(map[year], map_zeros.(courses))}
crs -> {courses -- Map.keys(crs), map}
end
end
added = create_course_channels(guild, courses)
new_map = update_in(new_map[year], fn cl -> Map.merge(cl, added) end)
save_courses(new_map)
Map.keys(added)
end
# Creates the private channels, corresponding roles and sets the permissions.
defp create_course_channels(_, []), do: %{}
defp create_course_channels(guild, [course | others]) do
role =
Guild.create_role(
guild,
%{:name => course, :hoist => false, :mentionable => true}
)
ow = [
%{id: get_role(Guild.get_roles(guild), "@everyone").id, type: "role", deny: 1024},
%{id: role.id, type: "role", allow: 1024}
]
cat = Guild.create_channel(guild, %{name: course, type: 4, permission_overwrites: ow})
ch1 = Guild.create_channel(guild, %{name: "duvidas", type: 0, parent_id: cat.id})
ch2 = Guild.create_channel(guild, %{name: "anexos", type: 0, parent_id: cat.id})
Map.put(create_course_channels(guild, others), cat.name, %{
"role" => role.id,
"channels" => [cat.id, ch1.id, ch2.id]
})
end
# Returns a role with a given name or `nil` if none are found.
defp get_role(l, name), do: Enum.find(l, fn e -> e[:name] == name end)
@doc """
Removes courses from the config.json file and the corresponding channels and roles
from the guild.
Returns the list of removed channels or `nil` if none were removed.
"""
def remove_courses(courses) do
case get_courses() do
nil ->
nil
map ->
# Only let registered channels be deleted.
valid_courses =
map
|> Map.values()
|> Enum.reduce([], fn x, acc -> acc ++ Map.keys(x) end)
|> Listerine.Helpers.intersect(courses)
removed =
map
|> Map.values()
|> Enum.reduce(
[],
fn x, ac -> ac ++ (Map.take(x, valid_courses) |> Map.values()) end
)
|> remove_course_channels()
new_map =
Enum.reduce(Map.keys(map), map, fn x, acc ->
Map.put(acc, x, Map.drop(map[x], valid_courses))
end)
save_courses(new_map)
removed
end
end
# Removes the channels and roles from the guild.
defp remove_course_channels([]), do: []
defp remove_course_channels([course | others]) do
do_or_nil = fn
nil, _ -> nil
x, f -> f.(x)
end
Role.get(course["role"]) |> do_or_nil.(&Role.delete/1)
prepend = fn x, l -> [x | l] end
course["channels"]
|> Enum.reduce(
[],
fn c, ac -> [Channel.get(c) |> do_or_nil.(&Channel.delete/1) | ac] end
)
|> (fn
nil -> nil
a -> Enum.find(a, fn x -> x.type == 4 end).name
end).()
|> prepend.(remove_course_channels(others))
end
@doc """
Adds the roles passed in the `courses` list to the author of the `message`.
Returns a list of added roles.
"""
def add_roles(message, courses) do
guild = message.channel.guild_id
member = Guild.get_member(guild, message.author.id)
case get_courses() do
nil ->
nil
roles ->
roles = Enum.reduce(Map.keys(roles), %{}, fn x, acc -> Map.merge(acc, roles[x]) end)
Enum.filter(courses, fn x -> x in Map.keys(roles) end)
|> Enum.reduce([], fn x, acc ->
case Member.add_role(member, roles[x]["role"]) do
:ok -> [x | acc]
_ -> acc
end
end)
end
end
def get_roles_year(year) do
Map.keys(get_courses()[year])
end
@doc """
Removes the roles passed in the `courses` list from the author of the `message`.
Returns a list of removed roles.
"""
def rm_role(message, courses) do
guild = message.channel.guild_id
member = Guild.get_member(guild, message.author.id)
case get_courses() do
nil ->
nil
roles ->
roles = Enum.reduce(Map.keys(roles), %{}, fn x, acc -> Map.merge(acc, roles[x]) end)
Enum.filter(courses, fn x -> x in Map.keys(roles) end)
|> Enum.reduce([], fn x, acc ->
case Member.remove_role(member, roles[x]["role"]) do
:ok -> [x | acc]
_ -> acc
end
end)
end
end
end | lib/Listerine/channels.ex | 0.84039 | 0.505859 | channels.ex | starcoder |
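The `do_or_nil` helper inside `remove_course_channels/1` is a small nil-safe apply: it only calls the function when the value is non-nil, and propagates `nil` otherwise. The same pattern as a standalone sketch:

```elixir
# Apply a function only when the value is non-nil; otherwise return nil.
do_or_nil = fn
  nil, _fun -> nil
  value, fun -> fun.(value)
end

IO.inspect(do_or_nil.("hello", &String.upcase/1))
IO.inspect(do_or_nil.(nil, &String.upcase/1))
```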
defmodule Tradehub.Stream do
@moduledoc ~S"""
This module enables the power to interact with the Tradehub Demex socket.
Behind the scenes, this module relies on two dependencies to communicate with the socket server and to broadcast
the received messages to listeners: WebSockex and Phoenix PubSub.
To subscribe to any topic of the websocket server, you can simply do:
``` elixir
defmodule MyApp.Client do
use Tradehub.Stream, topics: ["market_stats", "recent_trades.swth_eth1"]
def handle_info(message, state) do
# Handle you messages here
IO.puts(message)
{:ok, state}
end
end
```
`Tradehub.Stream` built based on `GenServer` so you can easily fit into any supervision tree.
``` elixir
defmodule MyApp.Application do
use Application
def start(_opts, _args) do
children = [
MyApp.Client
]
Supervisor.start_link(children, strategy: :one_for_one)
end
end
```
Or, if you want to manage everything manually, implement your own process and manually subscribe
to any topics you want by using the utility functions of `Tradehub.Stream`.
```
defmodule MyApp.Client do
use GenServer
def start_link(state) do
GenServer.start_link(__MODULE__, state, name: __MODULE__)
end
## Callbacks
def init(stack) do
# Start listening on the `market_stats` topic
Tradehub.Stream.market_stats()
{:ok, stack}
end
# Handle latest message from the `market_stats` channel
def handle_info(msg, state) do
IO.puts("Receive message -- #{msg}")
{:noreply, state}
end
end
```
"""
use WebSockex
require Logger
@ws Tradehub.config(Application.fetch_env!(:tradehub, :network))[:ws]
@typedoc "The channel ID"
@type channel_id :: String.t()
defmacro __using__(opts) do
topics = Keyword.get(opts, :topics)
quote do
use GenServer
def start_link(_opts) do
GenServer.start_link(__MODULE__, %{}, name: __MODULE__)
end
def init(state) do
unquote(topics)
|> Enum.each(fn topic -> Tradehub.Stream.subscribe(topic) end)
{:ok, state}
end
def terminate(_reason, state) do
IO.puts("#{__MODULE__} going down...")
unquote(topics)
|> Enum.each(fn topic -> Tradehub.Stream.unsubscribe(topic) end)
:normal
end
end
end
@doc false
@spec start_link(any) :: {:error, any} | {:ok, pid}
def start_link(state) do
WebSockex.start_link(@ws, __MODULE__, state, name: Stream)
end
def handle_frame({type, msg}, state) do
Logger.debug("#{__MODULE__} received #{type} - #{msg}")
decode_msg = Jason.decode!(msg, keys: :atoms)
case Map.has_key?(decode_msg, :channel) do
true ->
Logger.debug("#{__MODULE__} broadcast the message to observers")
Phoenix.PubSub.broadcast(Tradehub.PubSub, decode_msg.channel, msg)
{:ok, state}
false ->
{:ok, state}
end
end
def handle_info(msg, state) do
Logger.debug("#{__MODULE__} received non-websocket message: #{inspect(msg)}")
{:ok, state}
end
def handle_cast({:send, {type, msg} = frame}, state) do
Logger.debug("#{__MODULE__} sending #{type} frame with payload: #{inspect(msg)}")
{:reply, frame, state}
end
def handle_cast(msg, state) do
Logger.debug("#{__MODULE__} sending frame with payload: #{inspect(msg)}")
{:reply, msg, state}
end
def handle_ping(ping_frame, state) do
Logger.debug("#{__MODULE__} received a ping frame: #{inspect(ping_frame)}")
{:ok, state}
end
def handle_pong(pong_frame, state) do
Logger.debug("#{__MODULE__} received a pong frame: #{inspect(pong_frame)}")
{:ok, state}
end
@doc """
The utility function to subscribe to a channel and broadcast any received messages to `Tradehub.PubSub`
under the given topic name.
## Examples
iex> Tradehub.Stream.subscribe("market_stats")
"""
@spec subscribe(String.t()) :: channel_id() | {:error, reason :: any}
def subscribe(topic) do
frame_content = %{
id: topic,
method: "subscribe",
params: %{channels: [topic]}
}
frame = {:text, Jason.encode!(frame_content)}
case WebSockex.send_frame(Stream, frame) do
:ok ->
Phoenix.PubSub.subscribe(Tradehub.PubSub, topic)
Logger.debug("#{__MODULE__} starting subscribe messages on channel #{topic}")
topic
other ->
other
end
end
@doc """
The utility function to unsubscribe from a channel.
## Examples
iex> Tradehub.Stream.unsubscribe("market_stats")
"""
@spec unsubscribe(topic :: String.t()) :: :ok | {:error, reason :: any}
def unsubscribe(topic) do
frame_content = %{
id: topic,
method: "unsubscribe",
params: %{channels: [topic]}
}
frame = {:text, Jason.encode!(frame_content)}
case WebSockex.send_frame(Stream, frame) do
:ok ->
Logger.debug("Stopping subscribe messages on channel #{topic}")
:ok
other ->
other
end
end
@doc """
Subscribes to the `account_trades.[account]` channel to request up to 100 trades of the given account.
## Examples
iex> topic = Tradehub.Stream.account_trades("swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec account_trades(String.t()) :: channel_id() | {:error, reason :: any}
def account_trades(account), do: subscribe("account_trades.#{account}")
@doc """
Subscribes to the `account_trades_by_market.[market].[account]` channel to request trades of the given account within a market.
## Examples
iex> topic = Tradehub.Stream.account_trades_by_market("swth_eth1", "swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec account_trades_by_market(String.t(), String.t()) :: channel_id() | {:error, reason :: any}
def account_trades_by_market(market, account), do: subscribe("account_trades_by_market.#{market}.#{account}")
@doc """
Subscribes to the `balances.[account]` channel of the given account to request latest balance updates.
## Examples
iex> topic = Tradehub.Stream.balances("swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec balances(String.t()) :: channel_id() | {:error, reason :: any}
def balances(account), do: subscribe("balances.#{account}")
@doc """
Subscribes to the `candlesticks.[market].[resolution]` channel to request latest candlesticks of the market in the resolution timeframe.
## Examples
iex> topic = Tradehub.Stream.candlesticks("swth_eth1", 30)
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec candlesticks(String.t(), String.t()) :: channel_id() | {:error, reason :: any}
def candlesticks(market, resolution), do: subscribe("candlesticks.#{market}.#{resolution}")
@doc """
Subscribes to the `leverages_by_market.[market].[account]` channel to request latest leverages information of the
given account within a market.
## Examples
iex> topic = Tradehub.Stream.leverages_by_market("swth_eth1", "swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec leverages_by_market(String.t(), String.t()) :: channel_id() | {:error, reason :: any}
def leverages_by_market(market, account), do: subscribe("leverages_by_market.#{market}.#{account}")
@doc """
Subscribes to the `leverages.[account]` channel to request latest leverages information of the given account.
## Examples
iex> topic = Tradehub.Stream.leverages("swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec leverages(String.t()) :: channel_id() | {:error, reason :: any}
def leverages(account), do: subscribe("leverages.#{account}")
@doc """
Subscribes to the `market_stats` channel to request the latest statistics of the market.
## Examples
iex> topic = Tradehub.Stream.market_stats
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec market_stats() :: channel_id() | {:error, reason :: any}
def market_stats, do: subscribe("market_stats")
@doc """
Subscribes to the `books.[market]` channel to request the latest orderbook of the given market.
## Examples
iex> topic = Tradehub.Stream.books("swth_eth1")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec books(String.t()) :: channel_id() | {:error, reason :: any}
def books(market), do: subscribe("books.#{market}")
@doc """
Subscribes to the `orders_by_market.[market].[account]` channel to request the latest orders of the given account within
a specific market.
## Examples
iex> topic = Tradehub.Stream.orders_by_market("swth_eth1", "swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec orders_by_market(String.t(), String.t()) :: channel_id() | {:error, reason :: any}
def orders_by_market(market, account), do: subscribe("orders_by_market.#{market}.#{account}")
@doc """
Subscribes to the `orders.[account]` channel to request the latest orders of the given account.
## Examples
iex> topic = Tradehub.Stream.orders("swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec orders(String.t()) :: channel_id() | {:error, reason :: any}
def orders(account), do: subscribe("orders.#{account}")
@doc """
Subscribes to the `positions_by_market.[market].[account]` channel to request the latest positions of the given account
within a particular market.
## Examples
iex> topic = Tradehub.Stream.positions_by_market("swth_eth1", "swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec positions_by_market(String.t(), String.t()) :: channel_id() | {:error, reason :: any}
def positions_by_market(market, account), do: subscribe("positions_by_market.#{market}.#{account}")
@doc """
Subscribes to the `positions.[account]` channel to request the latest positions of the given account.
## Examples
iex> topic = Tradehub.Stream.positions("swth1fdqkq5gc5x8h6a0j9hamc30stlvea6zldprt6q")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec positions(String.t()) :: channel_id() | {:error, reason :: any}
def positions(account), do: subscribe("positions.#{account}")
@doc """
Subscribes to the `recent_trades.[market]` channel to request the recent trades of the given market.
## Examples
iex> topic = Tradehub.Stream.recent_trades("swth_eth1")
iex> Process.info(self(), :messages)
iex> Tradehub.Stream.unsubscribe topic
"""
@spec recent_trades(String.t()) :: channel_id() | {:error, reason :: any}
def recent_trades(market), do: subscribe("recent_trades.#{market}")
end | lib/tradehub/stream.ex | 0.897434 | 0.805785 | stream.ex | starcoder |
defmodule FaultTree.Parser.XML do
@moduledoc """
Parse an XML document into a FaultTree.
The XML schema expected is mostly for opsa-mef, defined at:
[https://github.com/rakhimov/scram/blob/master/share/input.rng](https://github.com/rakhimov/scram/blob/master/share/input.rng)
"""
import SweetXml
alias FaultTree.Node
@doc """
Parse the XML document into a FaultTree.
Steps:
- create a new tree
- parse xml and add all events/gates to the tree
- walk the tree and set parent field for anything showing up as a child of another node
- for duplicates, set all references after the first as a transfer gate
- walk the tree again and remove original child references
Things missing:
- parameter parsing
- only the first defined fault-tree will be parsed
- unsupported gate types
- nested gates are not supported, each gate must be defined with `define-gate` and only use event refs
- boolean events
- a bunch of other things from https://raw.githubusercontent.com/rakhimov/scram/master/share/input.rng
"""
@spec parse(String.t()) :: map()
def parse(doc) do
tree = %FaultTree{}
root = doc |> xpath(~x"/opsa-mef"e)
mapped = root |> xmap(
gates: [
~x"define-fault-tree/define-gate"l,
name: ~x"@name | name/text()"s,
description: ~x"@label | label/text()"os,
type: ~x"name(or|and|atleast)"s |> transform_by(&String.to_atom/1),
atleast_min: ~x"atleast/@min"io,
atleast_total: ~x"atleast/@total"io,
children: ~x"*/event"l |> transform_by(&transform_events/1),
],
events: [
~x"define-fault-tree/define-basic-event | model-data/define-basic-event"l,
name: ~x"@name"s,
description: ~x"@label | label/text()"os,
probability: ~x"float/@value | int/@value"s |> transform_by(&Decimal.new/1),
]
)
# Add all the basic events to the tree, without parents
tree = mapped
|> Map.get(:events)
|> Stream.map(&convert_event/1)
|> Enum.reduce(tree, fn node, tree -> FaultTree.add_node(tree, node) end)
# Add all gates
tree = mapped
|> Map.get(:gates)
|> Stream.map(&convert_gate/1)
|> Enum.reduce(tree, fn node, tree -> FaultTree.add_node(tree, node) end)
# Set parent attributes
tree.nodes
|> Enum.reduce(tree, fn node, tree ->
parents = tree.nodes |> Enum.filter(fn %{children: children} -> children != nil and node.name in children end)
case parents do
[] -> tree
[first | rest] -> tree |> set_parent(node, first) |> add_duplicate_children(node, first) |> add_transfers(node, rest)
end
end)
# Clear the children field, which at this point still holds lists of name strings
|> Map.update!(:nodes, fn nodes -> nodes |> Enum.map(fn n -> Map.put(n, :children, []) end) end)
end
defp convert_event(event), do: Node |> struct(event) |> Map.put(:type, :basic)
defp convert_gate(gate) do
{min, gate} = Map.pop(gate, :atleast_min)
{total, gate} = Map.pop(gate, :atleast_total)
case {min, total} do
{nil, _} -> struct(Node, gate)
{_, nil} -> struct(Node, gate)
params -> Node |> struct(gate) |> Map.put(:atleast, params)
end
end
defp set_parent(tree, child, parent) do
Map.update!(tree, :nodes, fn nodes ->
Enum.map(nodes, fn node ->
if node.id == child.id, do: Map.put(node, :parent, parent.name), else: node
end)
end)
end
defp add_duplicate_children(tree, child, parent) do
count = parent.children |> Enum.filter(fn c -> c == child.name end) |> Enum.count()
1..count
|> Enum.drop(1)
|> Enum.reduce(tree, fn _, tree -> FaultTree.add_transfer(tree, parent.name, child.name) end)
end
defp add_transfers(tree, _node, []), do: tree
defp add_transfers(tree, node, [parent | rest]) do
tree
|> FaultTree.add_transfer(parent.name, node.name)
|> add_duplicate_children(node, parent)
|> add_transfers(node, rest)
end
defp transform_events(nodes), do: nodes |> Enum.flat_map(&add_duplicate_events/1)
defp add_duplicate_events(node) do
event = node
|> xpath(~x".", name: ~x"./@name"s, count: ~x"./@count"io |> transform_by(&one_if_nil/1))
1..event[:count]
|> Enum.map(fn _ -> event[:name] end)
end
defp one_if_nil(nil), do: 1
defp one_if_nil(x), do: x
end | lib/fault_tree/parser/xml.ex | 0.868102 | 0.67298 | xml.ex | starcoder |
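The parent-assignment pass in `parse/1` scans every node for others that list it as a child. A simplified, dependency-free sketch of that lookup (the node maps below are hypothetical stand-ins for the parsed gates and events):

```elixir
nodes = [
  %{name: "root", children: ["a", "b"], parent: nil},
  %{name: "a", children: [], parent: nil},
  %{name: "b", children: [], parent: nil}
]

with_parents =
  Enum.map(nodes, fn node ->
    # A node's parent is any node that names it among its children.
    case Enum.find(nodes, fn %{children: children} -> node.name in children end) do
      nil -> node
      parent -> %{node | parent: parent.name}
    end
  end)

IO.inspect(with_parents)
```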
defmodule Day22 do
use Bitwise
def read_file(path) do
File.stream!(path)
|> Enum.to_list
|> parse_input
end
def parse_input(rows) do
map = rows
|> Enum.reverse
|> Enum.with_index
|> Enum.map(&parse_row/1)
|> Enum.reduce(%{}, &Map.merge/2)
size = rows |> Enum.count
transform = -div(size, 2)
map |> transform({transform, transform})
end
def parse_row({row, y}) do
row
|> String.graphemes
|> Enum.with_index
|> Enum.filter(fn {char, _} -> char == "#" end)
|> Enum.map(fn {_, x} -> {x, y} end)
|> Enum.map(fn coord -> {coord, :infected} end)
|> Enum.into(%{})
end
def transform(map, {dy, dx}) do
map |> Map.to_list |> Enum.map(fn {{y, x}, val} -> {{y + dy, x + dx}, val} end) |> Map.new
end
def turn(current_direction, :left) do
case current_direction do
:north -> :west
:west -> :south
:south -> :east
:east -> :north
end
end
def turn(current_direction, :right) do
case current_direction do
:north -> :east
:east -> :south
:south -> :west
:west -> :north
end
end
def turn(current_direction, :reverse) do
case current_direction do
:north -> :south
:east -> :west
:south -> :north
:west -> :east
end
end
def move({x, y}, direction) do
case direction do
:north -> {x, y + 1}
:east -> {x + 1, y}
:south -> {x, y - 1}
:west -> {x - 1, y}
end
end
def step(_, _, _, infected, steps) when steps == 1, do: infected
def step(map, current_position, direction, infected, steps) do
status = Map.get(map, current_position, :clean)
is_infected = status == :infected
new_direction = if is_infected do
turn(direction, :right)
else
turn(direction, :left)
end
new_map = map |> Map.put(current_position, (if is_infected, do: :clean, else: :infected))
new_position = move(current_position, new_direction)
infected = if is_infected, do: infected, else: infected + 1
step(new_map, new_position, new_direction, infected, steps - 1)
end
def caused_infection(map, steps) do
step(map, {0, 0}, :north, 0, steps)
end
def new_direction(status, direction) do
case status do
:clean -> turn(direction, :left)
:weakened -> direction
:infected -> turn(direction, :right)
:flagged -> turn(direction, :reverse)
end
end
def new_status(status) do
case status do
:clean -> :weakened
:weakened -> :infected
:infected -> :flagged
:flagged -> :clean
end
end
def step2(_, _, _, infected, steps) when steps == 0, do: infected
def step2(map, current_position, direction, infected, steps) do
status = Map.get(map, current_position, :clean)
new_direction = new_direction(status, direction)
new_map = map |> Map.put(current_position, new_status(status))
new_position = move(current_position, new_direction)
infected = if status == :weakened, do: infected + 1, else: infected
step2(new_map, new_position, new_direction, infected, steps - 1)
end
def caused_infection_part2(map, steps) do
step2(map, {0, 0}, :north, 0, steps)
end
end | lib/day22.ex | 0.540196 | 0.647359 | day22.ex | starcoder |
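The `turn/2` clauses above form a four-step cycle: turning the same way four times returns to the starting direction. A self-contained check of that invariant, using an anonymous function with the same left-turn table:

```elixir
# Same left-turn mapping as Day22.turn(direction, :left).
turn_left = fn
  :north -> :west
  :west -> :south
  :south -> :east
  :east -> :north
end

# Four left turns bring the carrier back to its original heading.
after_four = Enum.reduce(1..4, :north, fn _, dir -> turn_left.(dir) end)
IO.inspect(after_four)
```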
defmodule Forma.Parser do
def parse!(input, parsers, {:struct, name, fields}) do
case input do
input when is_map(input) -> map_exact_fields!(input, parsers, struct(name), fields)
_ -> raise "not a map #{inspect input}"
end
end
def parse!(input, parsers, {:exact_map, fields}) do
case input do
input when is_map(input) -> map_exact_fields!(input, parsers, %{}, fields)
_ -> raise "not a map #{inspect input}"
end
end
def parse!(input, parsers, {:assoc_map, fields}) do
case input do
input when is_map(input) -> map_assoc_fields!(input, parsers, %{}, fields)
_ -> raise "not a map #{inspect input}"
end
end
def parse!(input, parsers, {:union, possible}) do
ret = Enum.reduce_while(possible, :not_found, fn (candidate, _) ->
try do
{:halt, {:ok, parse!(input, parsers, candidate)}}
rescue
_ -> {:cont, {:error, :no_value}}
end
end)
case ret do
{:ok, v} -> v
{:error, :no_value} -> raise "#{inspect input} doesn't match any of: #{inspect possible}"
end
end
def parse!(input, parsers, {:range, [{type, _, from}, {_, _, to}]}) do
input = parse!(input, parsers, {type, []})
case from <= input && input <= to do
true -> input
_ -> raise "#{input} is not between #{from} and #{to}"
end
end
def parse!(input, parsers, {:list, type}) do
case input do
xs when is_list(xs) -> Enum.map(xs, &parse!(&1, parsers, type))
x -> raise "can't convert #{inspect x} to a list"
end
end
def parse!(input, _parsers, {:binary, []}) do
case input do
x when is_binary(x) -> x
x -> raise "can't convert #{inspect x} to binary"
end
end
def parse!(input, _parsers, {:integer, []}) do
case input do
x when is_number(x) -> trunc(x)
x -> raise "can't convert #{inspect x} to an integer"
end
end
def parse!(input, _parsers, {:neg_integer, []}) do
case input do
x when is_number(x) and x < 0 -> trunc(x)
x -> raise "can't convert #{inspect x} to a negative integer"
end
end
def parse!(input, _parsers, {:non_neg_integer, []}) do
case input do
x when is_number(x) and x > -1 -> trunc(x)
x -> raise "can't convert #{inspect x} to a non-negative integer"
end
end
def parse!(input, _parsers, {:pos_integer, []}) do
case input do
x when is_number(x) and x > 0 -> trunc(x)
x -> raise "can't convert #{inspect x} to a positive integer"
end
end
def parse!(input, _parsers, {:atom, []}) do
case input do
x when is_binary(x) -> String.to_atom(x)
x when is_atom(x) -> x
x -> raise "can't convert #{inspect x} to an atom"
end
end
def parse!(input, _parsers, {:atom, nil}) do
case input do
nil -> nil
"" -> nil
x -> raise "can't convert #{inspect x} to nil"
end
end
def parse!(input, _parsers, {:atom, val}) do
val_string = Atom.to_string(val)
case input do
^val -> val
^val_string -> val
x -> raise "can't convert #{inspect x} into #{inspect val}"
end
end
def parse!(input, _parsers, {:boolean, []}) do
case input do
x when is_boolean(x) -> x
x -> raise "can't convert #{inspect x} to a boolean"
end
end
def parse!(input, _parsers, {:float, []}) do
case input do
x when is_number(x) -> x
x -> raise "can't convert #{inspect x} to a float"
end
end
def parse!(input, parsers, {{module, type}, params}) do
cond do
f = Map.get(parsers, {module, type}) -> apply(f, [input | params])
function_exported?(module, :__forma__, 2) -> apply(module, :__forma__, [type, input] ++ params)
true ->
case Forma.Types.for(module, type) do
:opaque -> raise "{#{module}, #{type}} is opaque and no parser or parser behaviour is defined"
typ -> parse!(input, parsers, typ)
end
end
end
def parse!(input, _parsers, {:any, []}), do: input
def parse!(_input, _parsers, type) do
raise "type #{inspect type} is not implemented yet"
end
defp map_assoc_fields!(input, parsers, acc, {key, value}) do
Enum.reduce(input, acc, fn {k, v}, acc ->
Map.put(acc, parse!(k, parsers, key), parse!(v, parsers, value))
end)
end
defp map_exact_fields!(input, parsers, acc, {key, value}) do
Enum.reduce(input, acc, fn {k, v}, acc ->
Map.put(acc, parse!(k, parsers, key), parse!(v, parsers, value))
end)
end
defp map_exact_fields!(input, parsers, acc, fields) do
Enum.reduce(input, acc, fn {key, value}, acc ->
case Map.get(fields, key) do
{field, type} -> Map.put(acc, field, parse!(value, parsers, type))
nil -> acc
end
end)
end
end | lib/forma/parser.ex | 0.662906 | 0.65474 | parser.ex | starcoder |
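The union clause of `parse!/3` tries each candidate type in order and halts on the first success, using `Enum.reduce_while/3`. The same pattern, stripped of the parser specifics:

```elixir
# Return the first candidate the function accepts, or the error accumulator.
first_match = fn candidates, fun ->
  Enum.reduce_while(candidates, {:error, :no_value}, fn candidate, acc ->
    case fun.(candidate) do
      {:ok, value} -> {:halt, {:ok, value}}
      _ -> {:cont, acc}
    end
  end)
end

IO.inspect(first_match.([1, 2, 3], fn x -> if x > 1, do: {:ok, x}, else: :skip end))
```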
defmodule Elixir99.Lists2 do
def encode_modified(list, current \\ nil, acc \\ [], counter \\ 0) do
new_acc = case counter do
0 -> acc
1 -> acc ++ [current]
_ -> acc ++ [{counter, current}]
end
case list do
[] -> new_acc
[head | tail] ->
if head == current do
encode_modified(tail, current, acc, counter+1)
else
encode_modified(tail, head, new_acc, 1)
end
end
end
defp duplicate(element, count, acc \\ []) do
case count do
0 -> []
1 -> acc ++ [element]
_ -> duplicate(element, count - 1, acc ++ [element])
end
end
defp decode_tuple(tuple) do
elem(tuple, 1) |> duplicate(elem(tuple, 0))
end
def decode_modified(list, acc \\ []) do
case list do
[] -> acc
[head | tail] ->
case is_tuple(head) do
false -> decode_modified(tail, acc ++ [head])
true -> decode_modified(tail, acc ++ decode_tuple(head))
end
end
end
def encode_directly(list) do
encode_modified(list)
end
def dupli(list, acc \\ []) do
case list do
[] -> acc
[head | tail] -> dupli(tail, acc ++ [head, head])
end
end
def repli(list, n, acc \\ []) do
case list do
[] -> acc
[head | tail] -> repli(tail, n, acc ++ duplicate(head, n))
end
end
def drop_every(list, n, acc \\ [], count \\ 1) do
case list do
[] -> acc
[head | tail] ->
case count == n do
true -> drop_every(tail, n, acc, 1)
false -> drop_every(tail, n, acc++[head], count+1)
end
end
end
def split(list, n, acc \\ [], count \\ 1) do
case length(list) > n-count do
true ->
[head | tail] = list
case count == n do
true -> {acc ++ [head], tail}
false -> split(tail, n, acc ++ [head], count+1)
end
false -> {:error, "n cannot be bigger than list length"}
end
end
@spec slice([any], any, any) :: :error | [...] | {:error, <<_::328>>}
def slice(list, n, k) do
case length(list) < k or n>k do
true -> {:error, "length of a list cannot be smaller than k"}
false -> list
|> split(n)
|> elem(1)
|> split(k-n)
|> elem(0)
end
end
def mod(n, k) do
r = rem(n, k)
case r < 0 do
true -> k+r
false -> r
end
end
def rotate(list, n) do
k = length(list)
i = mod(n, k)
splitted = list |> split(i)
elem(splitted, 1) ++ elem(splitted, 0)
end
def remove_at(list, k, count \\ 1, acc \\ []) do
[head | tail] = list
case count == k do
true -> {head, acc++tail}
false -> remove_at(tail, k, count+1, acc ++ [head])
end
end
end | lib/elixir99_lists2.ex | 0.603231 | 0.606964 | elixir99_lists2.ex | starcoder |
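The hand-rolled recursion in `encode_modified/4` is equivalent to chunking runs with `Enum.chunk_by/2` and then collapsing each run. A compact sketch of the same modified run-length encoding:

```elixir
# Runs of length 1 stay bare; longer runs become {count, element} tuples.
encode_modified = fn list ->
  list
  |> Enum.chunk_by(& &1)
  |> Enum.map(fn
    [single] -> single
    [element | _] = run -> {length(run), element}
  end)
end

IO.inspect(encode_modified.([:a, :a, :a, :b, :c, :c]))
```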
defmodule Rayray.Renderings.Stl do
alias Rayray.Camera
alias Rayray.Canvas
alias Rayray.Lights
alias Rayray.Material
alias Rayray.Matrix
alias Rayray.Plane
alias Rayray.Transformations
alias Rayray.Tuple
alias Rayray.World
def do_it() do
floor = Plane.new()
floor_material = Material.new()
floor_material = %{floor_material | color: Tuple.color(1, 0.9, 0.9), specular: 0}
floor = %{floor | material: floor_material}
world = World.new()
triangle_material = Material.new()
triangle_material = %{
triangle_material
| color: Tuple.color(0.0196, 0.65, 0.874),
diffuse: 0.7,
specular: 0.3
}
mesh = Rayray.Xtl.parse("/Users/clark/Downloads/Moon_binary.stl")
triangle_transform =
Matrix.rotation_z(:math.pi() / 2)
|> Matrix.multiply(Matrix.rotation_x(-1 * :math.pi() / 3))
|> Matrix.multiply(Matrix.scaling(2, 2, 2))
triangles =
mesh.triangles
|> Enum.map(fn triangle ->
%{triangle | material: triangle_material, transform: triangle_transform}
end)
# triangles |> Enum.take(5) |> IO.inspect()
# t = Rayray.Triangle.new(Tuple.point(1, 0, 1), Tuple.point(0, 1, 0), Tuple.point(1, 1, 0))
# %{t | material: triangle_material}
# triangles = [t]
world = %{
world
| light: Lights.point_light(Tuple.point(-10, 10, -10), Tuple.color(1, 1, 1)),
objects: triangles ++ [floor]
}
camera = Camera.new(600, 600, :math.pi() / 2)
camera = %{
camera
| transform:
Transformations.view_transform(
Tuple.point(0, 1.5, -5),
Tuple.point(0, 1, 0),
Tuple.vector(0, 1, 0)
)
}
IO.puts("started rendering")
canvas = Camera.render(camera, world)
IO.puts("done rendering")
ppm = Canvas.canvas_to_ppm(canvas)
IO.puts("Done ppm")
File.write!("mesh_600x600.ppm", ppm)
end
end
defmodule Rayray.Xtl do
alias Rayray.Triangle
alias Rayray.Tuple
def parse(file) do
# parse_string(File.read!(file))
{:ok, stl} =
File.open(file, [:read, :raw], fn handle ->
str = IO.binread(handle, :all)
parse_string(str)
# IO.binstream(handle, 2048) |> Enum.map(fn x -> String.length(x) end)
end)
stl
end
def parse_string(str) when is_binary(str) do
{headers, rest} = parse_headers(str)
triangles = parse_triangles(rest)
Map.put(headers, :triangles, triangles)
end
def parse_headers(<<
_header::bytes-size(80),
number_of_triangles::unsigned-little-integer-size(32),
rest::binary()
>>) do
{%{number_of_triangles: number_of_triangles}, rest}
end
def parse_triangles(rest) do
do_parse_triangles(rest, [])
end
def do_parse_triangles(
<<
_nx::little-float-size(32),
_ny::little-float-size(32),
_nz::little-float-size(32),
v1x::little-float-size(32),
v1y::little-float-size(32),
v1z::little-float-size(32),
v2x::little-float-size(32),
v2y::little-float-size(32),
v2z::little-float-size(32),
v3x::little-float-size(32),
v3y::little-float-size(32),
v3z::little-float-size(32),
_abc::bytes-size(2),
rest::binary()
>>,
triangles
) do
do_parse_triangles(
rest,
[
Triangle.new(
Tuple.point(v1x, v1y, v1z),
Tuple.point(v2x, v2y, v2z),
Tuple.point(v3x, v3y, v3z)
)
| triangles
# %{
# normal: %{x: nx, y: ny, z: nz},
# v1: %{x: v1x, y: v1y, z: v1z},
# v2: %{x: v2x, y: v2y, z: v2z},
# v3: %{x: v3x, y: v3y, z: v3z}
# }
]
)
end
def do_parse_triangles(_rest, triangles) do
triangles
end
end
# file: lib/rayray/renderings/stl.ex
defmodule ChoreRunner.Input do
@valid_types ~w(string int float file bool)a
@type input_type :: :string | :int | :float | :file | :bool
@type reason :: atom() | String.t()
@type validator_function ::
(term() -> {:ok, term()} | :ok | true | {:error, reason} | nil | false)
@type input_options :: [
validators: [validator_function],
description: String.t()
]
@type t :: {input_type, atom, input_options}
defguard valid_type(type) when type in @valid_types
for type <- @valid_types do
@spec unquote(type)(atom(), input_options) :: t
def unquote(type)(name, opts \\ []) do
{unquote(type), name, opts}
end
end
def types, do: @valid_types
def validate_field(type, value) when valid_type(type) do
do_validate(type, do_cast(value, type))
end
defp do_cast(value, :string), do: to_string(value)
defp do_cast(value, :int) when is_binary(value) do
case Integer.parse(value) do
{int, _} -> int
_ -> value
end
end
defp do_cast(value, :float) when is_binary(value) do
case Float.parse(value) do
{float, _} -> float
_ -> value
end
end
defp do_cast(value, :bool) when is_binary(value) do
case String.downcase(value) do
"true" -> true
"false" -> false
_ -> value
end
end
defp do_cast(value, :int) when is_integer(value), do: value
defp do_cast(value, :float) when is_float(value), do: value
defp do_cast(value, :bool) when is_boolean(value), do: value
defp do_cast(value, _), do: value
defp do_validate(:string, value) when is_binary(value), do: {:ok, value}
defp do_validate(:int, value) when is_integer(value), do: {:ok, value}
defp do_validate(:float, value) when is_float(value), do: {:ok, value}
defp do_validate(:bool, value) when is_boolean(value), do: {:ok, value}
defp do_validate(:file, %module{} = value) when module == Plug.Upload, do: {:ok, value}
defp do_validate(:file, path) when is_binary(path) do
if File.exists?(path), do: {:ok, path}, else: {:error, :does_not_exist}
end
defp do_validate(_, _), do: {:error, :invalid}
end
# file: lib/chore_runner/input.ex
defmodule Solana.CompactArray do
@moduledoc false
use Bitwise, skip_operators: true
# https://docs.solana.com/developing/programming-model/transactions#compact-array-format
@spec to_iolist(arr :: iolist | nil) :: iolist
def to_iolist(nil), do: []
def to_iolist(arr) when is_list(arr) do
[encode_length(length(arr)) | arr]
end
def to_iolist(bin) when is_binary(bin) do
[encode_length(byte_size(bin)) | [bin]]
end
@spec encode_length(length :: non_neg_integer) :: list
def encode_length(length) when bsr(length, 7) == 0, do: [encode_bits(length)]
def encode_length(length) do
[bor(encode_bits(length), 0x80) | encode_length(bsr(length, 7))]
end
defp encode_bits(bits), do: band(bits, 0x7F)
@spec decode_and_split(encoded :: binary) :: {binary, non_neg_integer} | :error
def decode_and_split(""), do: :error
def decode_and_split(encoded) do
count = decode_length(encoded)
count_size = compact_length_bytes(count)
case encoded do
<<length::count_size*8, rest::binary>> -> {rest, length}
_ -> :error
end
end
@spec decode_and_split(encoded :: binary, item_size :: non_neg_integer) ::
{[binary], binary, non_neg_integer} | :error
def decode_and_split("", _), do: :error
def decode_and_split(encoded, item_size) do
count = decode_length(encoded)
count_size = compact_length_bytes(count)
data_size = count * item_size
case encoded do
<<length::count_size*8, data::binary-size(data_size), rest::binary>> ->
{Solana.Helpers.chunk(data, item_size), rest, length}
_ ->
:error
end
end
def decode_length(bytes), do: decode_length(bytes, 0)
def decode_length(<<elem, _::binary>>, size) when band(elem, 0x80) == 0 do
decode_bits(elem, size)
end
def decode_length([elem | _], size) when band(elem, 0x80) == 0 do
decode_bits(elem, size)
end
def decode_length(<<elem, rest::binary>>, size) do
bor(decode_bits(elem, size), decode_length(rest, size + 1))
end
def decode_length([elem | rest], size) do
bor(decode_bits(elem, size), decode_length(rest, size + 1))
end
defp decode_bits(bits, size), do: bits |> band(0x7F) |> bsl(7 * size)
# Lengths 0..0x7F fit in one 7-bit group, 0x80..0x3FFF in two, larger in three.
# Note the inclusive bounds: 0x7F itself encodes in a single byte, and 0x3FFF
# in two, matching what encode_length/1 produces.
defp compact_length_bytes(length) when length <= 0x7F, do: 1
defp compact_length_bytes(length) when length <= 0x3FFF, do: 2
defp compact_length_bytes(_), do: 3
end
# file: lib/solana/compact_array.ex
% Licensed under the Apache License, Version 2.0 (the "License"); you may not
% use this file except in compliance with the License. You may obtain a copy of
% the License at
%
% http://www.apache.org/licenses/LICENSE-2.0
%
% Unless required by applicable law or agreed to in writing, software
% distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
% WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
% License for the specific language governing permissions and limitations under
% the License.
-module(rexi_utils).
-export([server_id/1, server_pid/1, send/2, recv/6]).
%% @doc Return a rexi_server id for the given node.
server_id(Node) ->
case config:get("rexi", "server_per_node", "false") of
"true" ->
list_to_atom("rexi_server_" ++ atom_to_list(Node));
_ ->
rexi_server
end.
%% @doc Return a {server_id(node()), Node} Pid name for the given Node.
server_pid(Node) ->
{server_id(node()), Node}.
%% @doc send a message as quickly as possible
send(Dest, Msg) ->
case erlang:send(Dest, Msg, [noconnect, nosuspend]) of
ok ->
ok;
_ ->
% treat nosuspend and noconnect the same
rexi_buffer:send(Dest, Msg)
end.
%% @doc set up the receive loop with an overall timeout
-spec recv([any()], integer(), function(), any(), timeout(), timeout()) ->
{ok, any()} | {timeout, any()} | {error, atom()} | {error, atom(), any()}.
recv(Refs, Keypos, Fun, Acc0, infinity, PerMsgTO) ->
process_mailbox(Refs, Keypos, Fun, Acc0, nil, PerMsgTO);
recv(Refs, Keypos, Fun, Acc0, GlobalTimeout, PerMsgTO) ->
TimeoutRef = erlang:make_ref(),
TRef = erlang:send_after(GlobalTimeout, self(), {timeout, TimeoutRef}),
try
process_mailbox(Refs, Keypos, Fun, Acc0, TimeoutRef, PerMsgTO)
after
erlang:cancel_timer(TRef)
end.
process_mailbox(RefList, Keypos, Fun, Acc0, TimeoutRef, PerMsgTO) ->
case process_message(RefList, Keypos, Fun, Acc0, TimeoutRef, PerMsgTO) of
{ok, Acc} ->
process_mailbox(RefList, Keypos, Fun, Acc, TimeoutRef, PerMsgTO);
{new_refs, NewRefList, Acc} ->
process_mailbox(NewRefList, Keypos, Fun, Acc, TimeoutRef, PerMsgTO);
{stop, Acc} ->
{ok, Acc};
Error ->
Error
end.
process_message(RefList, Keypos, Fun, Acc0, TimeoutRef, PerMsgTO) ->
receive
{timeout, TimeoutRef} ->
{timeout, Acc0};
{rexi, Ref, Msg} ->
case lists:keyfind(Ref, Keypos, RefList) of
false ->
{ok, Acc0};
Worker ->
Fun(Msg, Worker, Acc0)
end;
{rexi, Ref, From, Msg} ->
case lists:keyfind(Ref, Keypos, RefList) of
false ->
{ok, Acc0};
Worker ->
Fun(Msg, {Worker, From}, Acc0)
end;
{Ref, Msg} ->
case lists:keyfind(Ref, Keypos, RefList) of
false ->
% this was some non-matching message which we will ignore
{ok, Acc0};
Worker ->
Fun(Msg, Worker, Acc0)
end;
{Ref, From, Msg} ->
case lists:keyfind(Ref, Keypos, RefList) of
false ->
{ok, Acc0};
Worker ->
Fun(Msg, {Worker, From}, Acc0)
end;
{rexi_DOWN, _, _, _} = Msg ->
Fun(Msg, nil, Acc0)
after PerMsgTO ->
{timeout, Acc0}
end.
%% file: src/rexi/src/rexi_utils.erl
%%
%% Copyright (c) 2018 <NAME>
%% All rights reserved.
%% Distributed under the terms of the MIT License. See the LICENSE file.
%%
%% SIP qvalue
%% (used in Accept and Contact headers)
%%
-module(ersip_qvalue).
-export([make/1,
parse/1,
assemble/1
]).
-export_type([qvalue/0]).
%%%===================================================================
%%% Types
%%%===================================================================
-type qvalue() :: {qvalue, 0..1000}.
-type parse_result() :: {ok, qvalue()}
| {error, {invalid_qvalue, binary()}}.
%%%===================================================================
%%% API
%%%===================================================================
-spec make(binary()) -> qvalue().
make(Bin) ->
case parse(Bin) of
{ok, QValue} ->
QValue;
{error, Reason} ->
error(Reason)
end.
%% qvalue = ( "0" [ "." 0*3DIGIT ] )
%% / ( "1" [ "." 0*3("0") ] )
-spec parse(binary()) -> parse_result().
parse(Bin) ->
parse_impl(Bin).
-spec assemble(qvalue()) -> binary().
assemble({qvalue, 1000}) ->
<<"1">>;
assemble({qvalue, Value}) ->
ValueBin = integer_to_binary(Value),
case byte_size(ValueBin) of
1 -> <<"0.00", ValueBin/binary>>;
2 -> <<"0.0", ValueBin/binary>>;
3 -> <<"0.", ValueBin/binary>>
end.
%%%===================================================================
%%% Internal implementation
%%%===================================================================
%% qvalue = ( "0" [ "." 0*3DIGIT ] )
%% / ( "1" [ "." 0*3("0") ] )
-spec parse_impl(binary()) -> parse_result().
parse_impl(<<"0">>) ->
{ok, {qvalue, 0}};
parse_impl(<<"1">>) ->
{ok, {qvalue, 1000}};
parse_impl(<<"1.", AllZeroes/binary>> = QVal) ->
case all_zeroes(AllZeroes) of
true ->
{ok, {qvalue, 1000}};
false ->
{error, {invalid_qvalue, QVal}}
end;
parse_impl(<<"0.", Rest/binary>> = QVal) when byte_size(Rest) =< 3 ->
ValueBin = add_zeroes(Rest),
try
{ok, {qvalue, binary_to_integer(ValueBin)}}
catch
error:badarg ->
{error, {invalid_qvalue, QVal}}
end;
parse_impl(QVal) ->
{error, {invalid_qvalue, QVal}}.
-spec all_zeroes(binary()) -> boolean().
all_zeroes(<<>>) ->
true;
all_zeroes(<<"0", Rest/binary>>) ->
all_zeroes(Rest);
all_zeroes(_) ->
false.
-spec add_zeroes(binary()) -> binary().
add_zeroes(X) when byte_size(X) == 3 -> X;
add_zeroes(X) when byte_size(X) == 2 -> <<X/binary, "0">>;
add_zeroes(X) when byte_size(X) == 1 -> <<X/binary, "00">>;
add_zeroes(X) when byte_size(X) == 0 -> <<"000">>.
%% file: src/ersip_qvalue.erl
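The parser maps the textual q-value range 0–1 onto an internal integer 0–1000 with three fixed decimal places, and `assemble/1` reverses the mapping. A quick Erlang shell sketch of the round trip (assuming the module is compiled and on the code path):

```erlang
1> ersip_qvalue:parse(<<"0.7">>).
{ok,{qvalue,700}}
2> ersip_qvalue:parse(<<"1.000">>).
{ok,{qvalue,1000}}
3> ersip_qvalue:assemble({qvalue, 700}).
<<"0.700">>
4> ersip_qvalue:parse(<<"1.5">>).
{error,{invalid_qvalue,<<"1.5">>}}
```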
-module(math_functions).
-export([even/1, odd/1, filter/2, split1/1, split2/1]).
even(Number) ->
% Use `rem` to check if the remainder is 0
% if it is return true, otherwise false
0 =:= Number rem 2.
odd(Number) ->
% Use `rem` to check if the remainder is 1;
% if it is, return true, otherwise false.
% (For non-negative odd numbers the remainder of
% division by 2 is always 1; negative odd numbers
% yield -1 with `rem`, so this returns false for them.)
1 =:= Number rem 2.
filter(Fun, List) ->
% Invoke filter/3, which does the real work
% then reverse the resulting list, as
% filter/3 returns a list in reverse order.
lists:reverse(filter(Fun, List, [])).
filter(_Fun, [], Result) ->
% If there are no more items in the list
% return the result
Result;
filter(Fun, [Item|Remaining], Result) ->
% If another item still exists in the list
% Apply `Fun` to it and check the result,
% if true, add the item to the result.
case Fun(Item) of
true ->
filter(Fun, Remaining, [Item|Result]);
_ ->
filter(Fun, Remaining, Result)
end.
split1(List) ->
% For the first version of split we use an
% accumulator. So we defined split/2 to pass
% along the accumulator to each recursive call
split(List, {[], []}).
split([], {Even, Odd}) ->
% The Even and Odd lists were constructed in
% reverse so we reverse them to correct the
% order.
{lists:reverse(Even), lists:reverse(Odd)};
split([Item|List], {Even, Odd}) ->
% In order to determine what list an item
% should be added to we pass it to even/1. If
% it returns true we add it to the Even list,
% otherwise we add it to the Odd list.
case ?MODULE:even(Item) of
true ->
split(List, {[Item|Even], Odd});
false ->
split(List, {Even, [Item|Odd]})
end.
split2(List) ->
% In the second version of the split function
% we simply invoke the filter/2 function and
% pass in a reference to the even/1 or odd/1
% functions. The first call returns all the even
% items and the second returns all the odd items.
Even = filter(fun even/1, List),
Odd = filter(fun odd/1, List),
% Then we simply return both lists
{Even, Odd}.
%% file: chapter_4/exercise_7/math_functions.erl
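Both split implementations return the same result; a shell check of the exported functions:

```erlang
1> math_functions:split1([1,2,3,4,5,6]).
{[2,4,6],[1,3,5]}
2> math_functions:split2([1,2,3,4,5,6]).
{[2,4,6],[1,3,5]}
3> math_functions:filter(fun math_functions:odd/1, [1,2,3,4]).
[1,3]
```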
%%%------------------------------------------------------------------------
%% Copyright 2017, OpenCensus Authors
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%
%% @doc ocp uses the pdict instead of a ctx variable for tracking context.
%% The functions fetch the current span context from the pdict and
%% passes it through to the oc_trace function of the same name.
%% @end
%%%-----------------------------------------------------------------------
-module(ocp).
-export([with_tags/1,
update_tags/1,
with_span_ctx/1,
with_child_span/1,
with_child_span/2,
with_child_span/3,
current_span_ctx/0,
current_tags/0,
finish_span/0,
unsample_span/0,
record/2,
put_attribute/2,
put_attributes/1,
add_time_event/1,
add_time_event/2,
add_link/1,
set_status/2,
spawn/1,
spawn/2,
spawn/3,
spawn/4,
spawn_link/1,
spawn_link/2,
spawn_link/3,
spawn_link/4,
spawn_monitor/1,
spawn_monitor/3,
spawn_opt/2,
spawn_opt/3,
spawn_opt/4,
spawn_opt/5]).
-include("opencensus.hrl").
-include("oc_logger.hrl").
-define(FUN_WITH_CTX(Fun),
begin
Context = current_span_ctx(),
Tags = current_tags(),
fun () ->
Mfa = erlang:fun_info_mfa(Fun),
put('$initial_call', Mfa),
ocp:with_span_ctx(Context),
ocp:with_tags(Tags),
erlang:apply(Fun, [])
end
end).
-define(MFA_WITH_CTX(M, F, A),
begin
Context = current_span_ctx(),
Tags = current_tags(),
fun () ->
put('$initial_call', {M, F, length(A)}),
ocp:with_span_ctx(Context),
ocp:with_tags(Tags),
erlang:apply(M, F, A)
end
end).
%%--------------------------------------------------------------------
%% @doc
%% Replaces the tags in the current context.
%% @end
%%--------------------------------------------------------------------
-spec with_tags(opencensus:tags()) -> maybe(opencensus:tags()).
with_tags(Map) ->
put(?TAG_CTX, Map).
%%--------------------------------------------------------------------
%% @doc
%% Merges the tags in the current context with a map of tags.
%% @end
%%--------------------------------------------------------------------
-spec update_tags(maps:map()) -> opencensus:tags().
update_tags(Map) ->
put(?TAG_CTX, oc_tags:update(current_tags(), Map)).
%%--------------------------------------------------------------------
%% @doc
%% Replaces the span in the current context.
%% @end
%%--------------------------------------------------------------------
-spec with_span_ctx(opencensus:span_ctx()) -> maybe(opencensus:span_ctx()).
with_span_ctx(SpanCtx) ->
?SET_LOG_METADATA(SpanCtx),
put(?SPAN_CTX, SpanCtx).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new span as a child of the current span and replaces it.
%% @end
%%--------------------------------------------------------------------
-spec with_child_span(unicode:unicode_binary()) -> opencensus:maybe(opencensus:span_ctx()).
with_child_span(Name) ->
with_span_ctx(oc_trace:start_span(Name, current_span_ctx(), #{})).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new span with attributes as a child of the current span
%% and replaces it.
%% @end
%%--------------------------------------------------------------------
-spec with_child_span(unicode:unicode_binary(), opencensus:attributes()) -> opencensus:maybe(opencensus:span_ctx()).
with_child_span(Name, Attributes) ->
with_span_ctx(oc_trace:start_span(Name, current_span_ctx(), #{attributes => Attributes})).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new span as a child of the current span and uses it as the
%% current span while running the function `Fun', finishing the span
%% and resetting the current span context after the function finishes.
%% @end
%%--------------------------------------------------------------------
-spec with_child_span(unicode:unicode_binary(), opencensus:attributes(), fun(() -> A)) -> A.
with_child_span(Name, Attributes, Fun) ->
CurrentSpanCtx = current_span_ctx(),
NewSpanCtx = oc_trace:start_span(Name, CurrentSpanCtx, #{attributes => Attributes}),
with_span_ctx(NewSpanCtx),
try Fun()
after
oc_trace:finish_span(current_span_ctx()),
with_span_ctx(CurrentSpanCtx)
end.
-spec current_span_ctx() -> maybe(opencensus:span_ctx()).
current_span_ctx() ->
get(?SPAN_CTX).
-spec current_tags() -> opencensus:tags().
current_tags() ->
case get(?TAG_CTX) of
undefined ->
oc_tags:new();
Tags ->
Tags
end.
%%--------------------------------------------------------------------
%% @doc
%% Finishes the span in the current pdict context.
%% @end
%%--------------------------------------------------------------------
-spec finish_span() -> boolean().
finish_span() ->
CurrentCtx = current_span_ctx(),
ParentCtx = oc_trace:parent_span_ctx(CurrentCtx),
Ret = oc_trace:finish_span(CurrentCtx),
with_span_ctx(ParentCtx),
Ret.
%%--------------------------------------------------------------------
%% @doc
%% Records a measurement with tags from the pdict context.
%%
%% Raises `{unknown_measure, MeasureName}' if measure doesn't exist.
%% @end
%%--------------------------------------------------------------------
-spec record(oc_stat_measure:name(), number()) -> ok.
record(MeasureName, Value) ->
oc_stat:record(current_tags(), MeasureName, Value).
%%--------------------------------------------------------------------
%% @doc
%% Put an attribute (a key/value pair) in the attribute map of a span.
%% If the attribute already exists it is overwritten with the new value.
%% @end
%%--------------------------------------------------------------------
-spec put_attribute(unicode:unicode_binary(), opencensus:attribute_value()) -> boolean() | {error, invalid_attribute}.
put_attribute(Key, Value) when is_binary(Key)
, (is_binary(Value) orelse is_integer(Value) orelse is_boolean(Value)) ->
oc_trace:put_attribute(Key, Value, current_span_ctx());
put_attribute(_Key, _Value) ->
{error, invalid_attribute}.
-spec unsample_span() -> boolean().
unsample_span() ->
CurrentCtx = current_span_ctx(),
case oc_trace:unsample_span(CurrentCtx) of
undefined ->
false;
NewCtx ->
with_span_ctx(NewCtx),
true
end.
%%--------------------------------------------------------------------
%% @doc
%% Merge a map of attributes with the attributes of current span.
%% The new values overwrite the old if any keys are the same.
%% @end
%%--------------------------------------------------------------------
-spec put_attributes(#{unicode:unicode_binary() => opencensus:attribute_value()}) -> boolean().
put_attributes(NewAttributes) ->
oc_trace:put_attributes(NewAttributes, current_span_ctx()).
%%--------------------------------------------------------------------
%% @doc
%% Add an Annotation or MessageEvent to the list of TimeEvents in the
%% current span.
%% @end
%%--------------------------------------------------------------------
-spec add_time_event(opencensus:annotation() | opencensus:message_event()) -> boolean().
add_time_event(TimeEvent) ->
oc_trace:add_time_event(TimeEvent, current_span_ctx()).
-spec add_time_event(wts:timestamp(), opencensus:annotation() | opencensus:message_event()) -> boolean().
add_time_event(Timestamp, TimeEvent) ->
oc_trace:add_time_event(Timestamp, TimeEvent, current_span_ctx()).
%%--------------------------------------------------------------------
%% @doc
%% Set Status of current span.
%% @end
%%--------------------------------------------------------------------
-spec set_status(integer(), unicode:unicode_binary()) -> boolean().
set_status(Code, Message) ->
oc_trace:set_status(Code, Message, current_span_ctx()).
%%--------------------------------------------------------------------
%% @doc
%% Add a Link to the list of Links in the current span.
%% @end
%%--------------------------------------------------------------------
-spec add_link(opencensus:link()) -> boolean().
add_link(Link) ->
oc_trace:add_link(Link, current_span_ctx()).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn/1' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn(Fun) ->
erlang:spawn(?FUN_WITH_CTX(Fun)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn/2' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn(Node, Fun) ->
erlang:spawn(Node, ?FUN_WITH_CTX(Fun)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn/3' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn(M, F, A) ->
erlang:spawn(?MFA_WITH_CTX(M, F, A)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn/4' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn(Node, M, F, A) ->
erlang:spawn(Node, ?MFA_WITH_CTX(M, F, A)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_link/1' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_link(Fun) ->
erlang:spawn_link(?FUN_WITH_CTX(Fun)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_link/2' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_link(Node, Fun) ->
erlang:spawn_link(Node, ?FUN_WITH_CTX(Fun)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_link/3' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_link(M, F, A) ->
erlang:spawn_link(?MFA_WITH_CTX(M, F, A)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_link/4' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_link(Node, M, F, A) ->
erlang:spawn_link(Node, ?MFA_WITH_CTX(M, F, A)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_monitor/1' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_monitor(Fun) ->
erlang:spawn_monitor(?FUN_WITH_CTX(Fun)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_monitor/3' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_monitor(M, F, A) ->
erlang:spawn_monitor(?MFA_WITH_CTX(M, F, A)).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_opt/2' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_opt(Fun, Opt) ->
erlang:spawn_opt(?FUN_WITH_CTX(Fun), Opt).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_opt/3' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_opt(Node, Fun, Opt) ->
erlang:spawn_opt(Node, ?FUN_WITH_CTX(Fun), Opt).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_opt/4' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_opt(M, F, A, Opt) ->
erlang:spawn_opt(?MFA_WITH_CTX(M, F, A), Opt).
%%--------------------------------------------------------------------
%% @doc
%% Starts a new process using `erlang:spawn_opt/5' with current_span_ctx
%% and current_tags from the calling process.
%% @end
%%--------------------------------------------------------------------
spawn_opt(Node, M, F, A, Opt) ->
erlang:spawn_opt(Node, ?MFA_WITH_CTX(M, F, A), Opt).
%% file: src/ocp.erl
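A usage sketch for the span helpers above: open a child span around a unit of work, annotate it, and let `with_child_span/3` restore the previous span context when the fun returns. The span name, attribute keys, and `handle_request/0` are illustrative only, not part of the module.

```erlang
%% Sketch only: names and values below are invented for illustration.
trace_request() ->
    ocp:with_child_span(<<"handle_request">>, #{<<"component">> => <<"api">>},
                        fun() ->
                            ocp:put_attribute(<<"request.id">>, <<"abc123">>),
                            handle_request()
                        end).
```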
%%%-------------------------------------------------------------------
%%% Copyright 2014 The RySim Authors. All rights reserved.
%%%
%%% Licensed under the Apache License, Version 2.0 (the "License");
%%% you may not use this file except in compliance with the License.
%%% You may obtain a copy of the License at
%%%
%%% http://www.apache.org/licenses/LICENSE-2.0
%%%
%%% Unless required by applicable law or agreed to in writing, software
%%% distributed under the License is distributed on an "AS IS" BASIS,
%%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%%% See the License for the specific language governing permissions and
%%% limitations under the License.
%%%-------------------------------------------------------------------
-module(event_queue).
-include("rysim.hrl").
%% API
-export([create/0, top_event/1, pop_event/1, push_event/2]).
%% ===================================================================
%% API
%% ===================================================================
%%--------------------------------------------------------------------
%% @doc
%% Factory function that creates an empty queue and returns it the caller. The
%% returned value needs to be passed into other API functions, but should be
%% treated as opaque outside this module.
%% @spec create() -> {ok, Data}
%% Data = record(event_queue_data)
%% @end
%% --------------------------------------------------------------------
create() ->
{ok, #event_queue_data{body = gb_trees:empty()}}.
%%--------------------------------------------------------------------
%% @doc
%% Return first entry of the queue. In the event the queue is empty,
%% undefined is returned.
%% @spec top_event(Data) -> {ok, undefined} | {ok, Event}
%% Data = record(event_queue_data)
%% Event = record(sim_event)
%% @end
%% --------------------------------------------------------------------
top_event(Data) when is_record(Data, event_queue_data) ->
Queue = Data#event_queue_data.body,
case gb_trees:is_empty(Queue) of
true ->
{ok, undefined};
_ ->
{_, [Event|_]} = gb_trees:smallest(Queue),
{ok, Event}
end;
top_event(Data) ->
error_logger:error_msg("~p:~p Unable to top_event, from ~p, due to bad type!",
[?MODULE, ?LINE, Data]),
throw(badarg).
%%--------------------------------------------------------------------
%% @doc
%% Remove the first entry from the queue and returns the new queue.
%% Pass through to the underlying behaviour implementation.
%% @spec pop_event(Data) -> {ok, NewData}
%% Data = record(event_queue_data)
%% NewData = record(event_queue_data)
%% @end
%% --------------------------------------------------------------------
pop_event(Data) when is_record(Data, event_queue_data) ->
Queue = Data#event_queue_data.body,
case gb_trees:is_empty(Queue) of
true ->
{ok, Data};
_ ->
remove_event(Data)
end;
pop_event(Data) ->
error_logger:error_msg("~p:~p Unable to pop_event, from ~p, due to bad type!",
[?MODULE, ?LINE, Data]),
throw(badarg).
%%--------------------------------------------------------------------
%% @doc
%% Adds a new sim_event to the queue using its timestamp's absolute
%% value as the key.
%% @spec push_event(Event, Data) -> {ok, NewData}
%% Event = record(sim_event)
%% Data = record(event_queue_data)
%% NewData = record(event_queue_data)
%% @end
%% --------------------------------------------------------------------
push_event(Event, Data) when is_record(Event, sim_event),
is_record(Data, event_queue_data) ->
Key = erlang:abs(Event#sim_event.timestamp),
OldQueue = Data#event_queue_data.body,
NewQueue = case gb_trees:lookup(Key, OldQueue) of
none ->
gb_trees:insert(Key, [Event], OldQueue);
{value, Events} ->
gb_trees:update(Key, [Event|Events], OldQueue)
end,
NewData = Data#event_queue_data{body = NewQueue},
{ok, NewData};
push_event(Event, Data) ->
error_logger:error_msg("~p:~p Unable to push_event, ~p to ~p, due to bad type!",
[?MODULE, ?LINE, Event, Data]),
throw(badarg).
%% ===================================================================
%% Private
%% ===================================================================
%%--------------------------------------------------------------------
%% @doc
%% Removes the top event from the queue. This call will fail if the tree
%% is empty.
%% @spec remove_event(Data) -> {ok, NewData}
%% Data = record(event_queue_data)
%% NewData = record(event_queue_data)
%% @end
%% --------------------------------------------------------------------
remove_event(Data) when is_record(Data, event_queue_data)->
OldQueue = Data#event_queue_data.body,
{Key, [_|Events]} = gb_trees:smallest(OldQueue),
NewQueue = case Events == [] of
true ->
gb_trees:delete(Key, OldQueue);
_ ->
gb_trees:update(Key, Events, OldQueue)
end,
NewData = Data#event_queue_data{body = NewQueue},
{ok, NewData}.
%% file: erlang/rysim_des/src/event_queue.erl
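A usage sketch, assuming the `#sim_event{}` record from `rysim.hrl` carries at least a `timestamp` field (any other fields keep their defaults here):

```erlang
%% Sketch: relies on the record definitions in rysim.hrl.
-include("rysim.hrl").

example() ->
    {ok, Q0} = event_queue:create(),
    {ok, Q1} = event_queue:push_event(#sim_event{timestamp = 5.0}, Q0),
    {ok, Q2} = event_queue:push_event(#sim_event{timestamp = 2.0}, Q1),
    %% The event with the smallest absolute timestamp comes out first.
    {ok, #sim_event{timestamp = 2.0}} = event_queue:top_event(Q2),
    {ok, _Q3} = event_queue:pop_event(Q2).
```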
% The Computer Language Benchmarks Game
% https://salsa.debian.org/benchmarksgame-team/benchmarksgame/
% contributed by <NAME>
%% erlc spectralnorm.erl
%% erl -smp enable -noshell -run spectralnorm main 5500
-module(spectralnorm).
-export([main/1]).
-compile( [ inline, { inline_size, 1000 } ] ).
main([Arg]) ->
register(server, self()),
N = list_to_integer(Arg),
{U, V} = power_method(N, 10, erlang:make_tuple(N, 1), []),
io:format("~.9f\n", [ eigen(N, U, V, 0, 0) ]),
erlang:halt(0).
% eigenvalue of V
eigen(0, _, _, VBV, VV) when VV /= 0 -> math:sqrt(VBV / VV);
eigen(I, U, V, VBV, VV) when I /= 0 ->
VI = element(I, V),
eigen(I-1, U, V, VBV + element(I, U)*VI, VV + VI*VI).
% 2I steps of the power method
power_method(_, 0, A, B) -> {A, B};
power_method(N, I, A, _B) ->
V = atav(N, A),
U = atav(N, V),
power_method(N, I-1, U, V).
% return element i,j of infinite matrix A
a(II,JJ) -> 1/((II+JJ-2)*(II-1+JJ)/2+II).
% multiply vector v by matrix A
av(N, V) -> pmap(N, fun(Begin, End) -> av(N, Begin, End, V) end).
av(N, Begin, End, V) -> server ! { self(), [ avloop(N, I, V, 0.0) || I <- lists:seq(Begin, End) ]}.
avloop(0, _, _, X) -> X;
avloop(J, I, V, X) -> avloop(J-1, I, V, X + a(I, J)*element(J, V) ).
% multiply vector v by matrix A transposed
atv(N, V) -> pmap(N, fun(Begin, End)-> atv(N, Begin, End, V) end).
atv(N, Begin, End, V) -> server ! { self(), [ atvloop(N, I, V, 0.0) || I <- lists:seq(Begin, End) ]}.
atvloop(0, _, _, X) -> X;
atvloop(J, I, V, X) -> atvloop(J-1, I, V, X + a(J, I)*element(J, V) ).
% multiply vector v by matrix A and then by matrix A transposed
atav(N, V) -> atv(N, av(N, V)).
% Helper function for multicore: parallel map over index chunks
pmap(N, F) ->
Chunks = chunks(0, erlang:system_info(logical_processors), N, []),
Pids = [spawn(fun()-> F(Begin, End) end) || {Begin, End} <- Chunks],
Res = [ receive {Pid, X} -> X end || Pid <- Pids],
list_to_tuple(lists:flatten(Res)).
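% Illustration (computed by hand): with 4 logical processors and N = 100,
% chunks/4 below yields [{1,25},{26,50},{51,75},{76,100}].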
chunks(I, P, N, A) when I == P-1 -> lists:reverse([{I*(N div P)+1, N} | A ]);
chunks(I, P, N, A) -> chunks(I+1, P, N, [{ I*(N div P)+1, (I+1)*(N div P)} | A ]). | bench/spectralnorm.erl | 0.626353 | 0.527134 | spectralnorm.erl | starcoder |
%%
%% %CopyrightBegin%
%%
%% Copyright Ericsson AB 1996-2020. All Rights Reserved.
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%
%% %CopyrightEnd%
%%
-module(erl_pp).
%%% Pretty printer for Erlang code in the same format as returned from
%%% the parser. It does not always produce pretty code.
-export([form/1,form/2,
attribute/1,attribute/2,function/1,function/2,
guard/1,guard/2,exprs/1,exprs/2,exprs/3,expr/1,expr/2,expr/3,expr/4]).
-import(lists, [append/1,foldr/3,map/2,mapfoldl/3,reverse/1,reverse/2]).
-import(io_lib, [write/1,format/2]).
-import(erl_parse, [inop_prec/1,preop_prec/1,func_prec/0,max_prec/0,
type_inop_prec/1, type_preop_prec/1]).
-define(MAXLINE, 72).
-type(hook_function() :: none
| fun((Expr :: erl_parse:abstract_expr(),
CurrentIndentation :: integer(),
CurrentPrecedence :: non_neg_integer(),
Options :: options()) ->
io_lib:chars())).
-type(option() :: {hook, hook_function()}
| {encoding, latin1 | unicode | utf8}
| {quote_singleton_atom_types, boolean()}).
-type(options() :: hook_function() | [option()]).
-record(pp, {value_fun, singleton_atom_type_fun, string_fun, char_fun}).
-record(options, {hook, encoding, opts}).
%-define(DEBUG, true).
-ifdef(DEBUG).
-define(FORM_TEST(T),
_ = case T of
{eof, _Line} -> ok;
{warning, _W} -> ok;
{error, _E} -> ok;
_ -> ?TEST(T)
end).
-define(EXPRS_TEST(L),
_ = [?TEST(E) || E <- L]).
-define(TEST(T),
%% Assumes that erl_anno has been compiled with DEBUG=true.
%% erl_pp does not use the annotations, but tests them anyway.
%% Note: hooks are not handled.
_ = try
erl_parse:map_anno(fun(A) when is_list(A) -> A end, T)
catch
_:_ ->
erlang:error(badarg, [T])
end).
-else.
-define(FORM_TEST(T), ok).
-define(EXPRS_TEST(T), ok).
-define(TEST(T), ok).
-endif.
%%%
%%% Exported functions
%%%
-spec(form(Form) -> io_lib:chars() when
Form :: erl_parse:abstract_form() | erl_parse:form_info()).
form(Thing) ->
form(Thing, none).
-spec(form(Form, Options) -> io_lib:chars() when
Form :: erl_parse:abstract_form() | erl_parse:form_info(),
Options :: options()).
form(Thing, Options) ->
?FORM_TEST(Thing),
State = state(Options),
frmt(lform(Thing, options(Options)), State).
-spec(attribute(Attribute) -> io_lib:chars() when
Attribute :: erl_parse:abstract_form()).
attribute(Thing) ->
attribute(Thing, none).
-spec(attribute(Attribute, Options) -> io_lib:chars() when
Attribute :: erl_parse:abstract_form(),
Options :: options()).
attribute(Thing, Options) ->
?TEST(Thing),
State = state(Options),
frmt(lattribute(Thing, options(Options)), State).
-spec(function(Function) -> io_lib:chars() when
Function :: erl_parse:abstract_form()).
function(F) ->
function(F, none).
-spec(function(Function, Options) -> io_lib:chars() when
Function :: erl_parse:abstract_form(),
Options :: options()).
function(F, Options) ->
?TEST(F),
frmt(lfunction(F, options(Options)), state(Options)).
-spec(guard(Guard) -> io_lib:chars() when
Guard :: [erl_parse:abstract_expr()]).
guard(Gs) ->
guard(Gs, none).
-spec(guard(Guard, Options) -> io_lib:chars() when
Guard :: [erl_parse:abstract_expr()],
Options :: options()).
guard(Gs, Options) ->
?EXPRS_TEST(Gs),
frmt(lguard(Gs, options(Options)), state(Options)).
-spec(exprs(Expressions) -> io_lib:chars() when
Expressions :: [erl_parse:abstract_expr()]).
exprs(Es) ->
exprs(Es, 0, none).
-spec(exprs(Expressions, Options) -> io_lib:chars() when
Expressions :: [erl_parse:abstract_expr()],
Options :: options()).
exprs(Es, Options) ->
exprs(Es, 0, Options).
-spec(exprs(Expressions, Indent, Options) -> io_lib:chars() when
Expressions :: [erl_parse:abstract_expr()],
Indent :: integer(),
Options :: options()).
exprs(Es, I, Options) ->
?EXPRS_TEST(Es),
frmt({seq,[],[],[$,],lexprs(Es, options(Options))}, I, state(Options)).
-spec(expr(Expression) -> io_lib:chars() when
Expression :: erl_parse:abstract_expr()).
expr(E) ->
?TEST(E),
frmt(lexpr(E, 0, options(none)), state(none)).
-spec(expr(Expression, Options) -> io_lib:chars() when
Expression :: erl_parse:abstract_expr(),
Options :: options()).
expr(E, Options) ->
?TEST(E),
frmt(lexpr(E, 0, options(Options)), state(Options)).
-spec(expr(Expression, Indent, Options) -> io_lib:chars() when
Expression :: erl_parse:abstract_expr(),
Indent :: integer(),
Options :: options()).
expr(E, I, Options) ->
?TEST(E),
frmt(lexpr(E, 0, options(Options)), I, state(Options)).
-spec(expr(Expression, Indent, Precedence, Options) -> io_lib:chars() when
Expression :: erl_parse:abstract_expr(),
Indent :: integer(),
Precedence :: non_neg_integer(),
Options :: options()).
expr(E, I, P, Options) ->
?TEST(E),
frmt(lexpr(E, P, options(Options)), I, state(Options)).
%%%
%%% Local functions
%%%
options(Options) when is_list(Options) ->
Hook = proplists:get_value(hook, Options, none),
Encoding = encoding(Options),
#options{hook = Hook, encoding = Encoding, opts = Options};
options(Hook) ->
#options{hook = Hook, encoding = encoding([]), opts = Hook}.
state(Options) when is_list(Options) ->
Quote = proplists:get_bool(quote_singleton_atom_types, Options),
case encoding(Options) of
latin1 -> latin1_state(Quote);
unicode -> unicode_state(Quote)
end;
state(_Hook) ->
latin1_state(false).
latin1_state(Quote) ->
Options = [{encoding,latin1}],
ValueFun = fun(V) -> io_lib_pretty:print(V, Options) end,
SingletonFun =
case Quote of
true ->
fun(A) ->
io_lib:write_string_as_latin1(atom_to_list(A), $')
end; %'
false ->
ValueFun
end,
#pp{value_fun = ValueFun,
singleton_atom_type_fun = SingletonFun,
string_fun = fun io_lib:write_string_as_latin1/1,
char_fun = fun io_lib:write_char_as_latin1/1}.
unicode_state(Quote) ->
Options = [{encoding,unicode}],
ValueFun = fun(V) -> io_lib_pretty:print(V, Options) end,
SingletonFun =
case Quote of
true ->
fun(A) -> io_lib:write_string(atom_to_list(A), $') end; %'
false ->
ValueFun
end,
#pp{value_fun = ValueFun,
singleton_atom_type_fun = SingletonFun,
string_fun = fun io_lib:write_string/1,
char_fun = fun io_lib:write_char/1}.
encoding(Options) ->
case proplists:get_value(encoding, Options, epp:default_encoding()) of
latin1 -> latin1;
utf8 -> unicode;
unicode -> unicode
end.
lform({attribute,Line,Name,Arg}, Opts) ->
lattribute({attribute,Line,Name,Arg}, Opts);
lform({function,Line,Name,Arity,Clauses}, Opts) ->
lfunction({function,Line,Name,Arity,Clauses}, Opts);
%% These are specials to make it easier for the compiler.
lform({error,_}=E, Opts) ->
message(E, Opts);
lform({warning,_}=W, Opts) ->
message(W, Opts);
lform({eof,_Line}, _Opts) ->
$\n.
message(M, #options{encoding = Encoding}) ->
F = case Encoding of
latin1 -> "~p\n";
unicode -> "~tp\n"
end,
leaf(format(F, [M])).
lattribute({attribute,_Line,type,Type}, Opts) ->
[typeattr(type, Type, Opts),leaf(".\n")];
lattribute({attribute,_Line,opaque,Type}, Opts) ->
[typeattr(opaque, Type, Opts),leaf(".\n")];
lattribute({attribute,_Line,spec,Arg}, _Opts) ->
[specattr(spec, Arg),leaf(".\n")];
lattribute({attribute,_Line,callback,Arg}, _Opts) ->
[specattr(callback, Arg),leaf(".\n")];
lattribute({attribute,_Line,Name,Arg}, Opts) ->
[lattribute(Name, Arg, Opts),leaf(".\n")].
lattribute(module, {M,Vs}, _Opts) ->
A = a0(),
attr(module,[{var,A,pname(M)},
foldr(fun(V, C) -> {cons,A,{var,A,V},C}
end, {nil,A}, Vs)]);
lattribute(module, M, _Opts) ->
attr(module, [{var,a0(),pname(M)}]);
lattribute(export, Falist, _Opts) ->
attrib(export, falist(Falist));
lattribute(import, Name, _Opts) when is_list(Name) ->
attr(import, [{var,a0(),pname(Name)}]);
lattribute(import, {From,Falist}, _Opts) ->
attrib(import, [leaf(pname(From)),falist(Falist)]);
lattribute(export_type, Talist, _Opts) ->
attrib(export_type, falist(Talist));
lattribute(optional_callbacks, Falist, Opts) ->
try attrib(optional_callbacks, falist(Falist))
catch _:_ -> attr(optional_callbacks, [abstract(Falist, Opts)])
end;
lattribute(file, {Name,Line}, _Opts) ->
attr(file, [{string,a0(),Name},{integer,a0(),Line}]);
lattribute(record, {Name,Is}, Opts) ->
Nl = [leaf("-record("),{atom,Name},$,],
[{first,Nl,record_fields(Is, Opts)},$)];
lattribute(Name, Arg, Options) ->
attr(Name, [abstract(Arg, Options)]).
abstract(Arg, #options{encoding = Encoding}) ->
erl_parse:abstract(Arg, [{encoding,Encoding}]).
typeattr(Tag, {TypeName,Type,Args}, _Opts) ->
{first,leaf("-"++atom_to_list(Tag)++" "),
typed(call({atom,a0(),TypeName}, Args, 0, options(none)), Type)}.
ltype(T) ->
ltype(T, 0).
ltype({ann_type,_Line,[V,T]}, Prec) ->
{L,P,R} = type_inop_prec('::'),
Vl = ltype(V, L),
Tr = ltype(T, R),
El = {list,[{cstep,[Vl,' ::'],Tr}]},
maybe_paren(P, Prec, El);
ltype({paren_type,_Line,[T]}, P) ->
%% Generated before Erlang/OTP 18.
ltype(T, P);
ltype({type,_Line,union,Ts}, Prec) ->
{_L,P,R} = type_inop_prec('|'),
E = {seq,[],[],[' |'],ltypes(Ts, R)},
maybe_paren(P, Prec, E);
ltype({type,_Line,list,[T]}, _) ->
{seq,$[,$],$,,[ltype(T)]};
ltype({type,_Line,nonempty_list,[T]}, _) ->
{seq,$[,$],[$,],[ltype(T),leaf("...")]};
ltype({type,Line,nil,[]}, _) ->
lexpr({nil,Line}, options(none));
ltype({type,Line,map,any}, _) ->
simple_type({atom,Line,map}, []);
ltype({type,_Line,map,Pairs}, Prec) ->
{P,_R} = type_preop_prec('#'),
E = map_type(Pairs),
maybe_paren(P, Prec, E);
ltype({type,Line,tuple,any}, _) ->
simple_type({atom,Line,tuple}, []);
ltype({type,_Line,tuple,Ts}, _) ->
tuple_type(Ts, fun ltype/2);
ltype({type,_Line,record,[{atom,_,N}|Fs]}, Prec) ->
{P,_R} = type_preop_prec('#'),
E = record_type(N, Fs),
maybe_paren(P, Prec, E);
ltype({type,_Line,range,[_I1,_I2]=Es}, Prec) ->
{_L,P,R} = type_inop_prec('..'),
F = fun(E, Opts) -> lexpr(E, R, Opts) end,
E = expr_list(Es, '..', F, options(none)),
maybe_paren(P, Prec, E);
ltype({type,_Line,binary,[I1,I2]}, _) ->
binary_type(I1, I2); % except binary()
ltype({type,_Line,'fun',[]}, _) ->
leaf("fun()");
ltype({type,_,'fun',[{type,_,any},_]}=FunType, _) ->
[fun_type(['fun',$(], FunType),$)];
ltype({type,_Line,'fun',[{type,_,product,_},_]}=FunType, _) ->
[fun_type(['fun',$(], FunType),$)];
ltype({type,Line,T,Ts}, _) ->
simple_type({atom,Line,T}, Ts);
ltype({user_type,Line,T,Ts}, _) ->
simple_type({atom,Line,T}, Ts);
ltype({remote_type,Line,[M,F,Ts]}, _) ->
simple_type({remote,Line,M,F}, Ts);
ltype({atom,_,T}, _) ->
{singleton_atom_type,T};
ltype(E, P) ->
lexpr(E, P, options(none)).
binary_type(I1, I2) ->
B = [[] || {integer,_,0} <- [I1]] =:= [],
U = [[] || {integer,_,0} <- [I2]] =:= [],
P = max_prec(),
E1 = [[leaf("_:"),lexpr(I1, P, options(none))] || B],
E2 = [[leaf("_:_*"),lexpr(I2, P, options(none))] || U],
case E1++E2 of
[] ->
leaf("<<>>");
Es ->
{seq,'<<','>>',[$,],Es}
end.
map_type(Fs) ->
{first,[$#],map_pair_types(Fs)}.
map_pair_types(Fs) ->
tuple_type(Fs, fun map_pair_type/2).
map_pair_type({type,_Line,map_field_assoc,[KType,VType]}, Prec) ->
{list,[{cstep,[ltype(KType, Prec),leaf(" =>")],ltype(VType, Prec)}]};
map_pair_type({type,_Line,map_field_exact,[KType,VType]}, Prec) ->
{list,[{cstep,[ltype(KType, Prec),leaf(" :=")],ltype(VType, Prec)}]}.
record_type(Name, Fields) ->
{first,[record_name(Name)],field_types(Fields)}.
field_types(Fs) ->
tuple_type(Fs, fun field_type/2).
field_type({type,_Line,field_type,[Name,Type]}, _Prec) ->
typed(lexpr(Name, options(none)), Type).
typed(B, Type) ->
{list,[{cstep,[B,' ::'],ltype(Type)}]}.
tuple_type([], _) ->
leaf("{}");
tuple_type(Ts, F) ->
{seq,${,$},[$,],ltypes(Ts, F, 0)}.
specattr(SpecKind, {FuncSpec,TypeSpecs}) ->
Func = case FuncSpec of
{F,_A} ->
{atom,F};
{M,F,_A} ->
[{atom,M},$:,{atom,F}]
end,
{first,leaf(lists:concat(["-", SpecKind, " "])),
{list,[{first,Func,spec_clauses(TypeSpecs)}]}}.
spec_clauses(TypeSpecs) ->
{prefer_nl,[$;],[sig_type(T) || T <- TypeSpecs]}.
sig_type({type,_Line,bounded_fun,[T,Gs]}) ->
guard_type(fun_type([], T), Gs);
sig_type(FunType) ->
fun_type([], FunType).
guard_type(Before, Gs) ->
Opts = options(none),
Gl = {list,[{step,'when',expr_list(Gs, [$,], fun constraint/2, Opts)}]},
{list,[{step,Before,Gl}]}.
constraint({type,_Line,constraint,[{atom,_,is_subtype},[{var,_,_}=V,Type]]},
_Opts) ->
typed(lexpr(V, options(none)), Type);
constraint({type,_Line,constraint,[Tag,As]}, _Opts) ->
simple_type(Tag, As).
fun_type(Before, {type,_,'fun',[FType,Ret]}) ->
{first,Before,{step,[type_args(FType),' ->'],ltype(Ret)}}.
type_args({type,_Line,any}) ->
leaf("(...)");
type_args({type,_line,product,Ts}) ->
targs(Ts).
simple_type(Tag, Types) ->
{first,lexpr(Tag, options(none)),targs(Types)}.
targs(Ts) ->
{seq,$(,$),[$,],ltypes(Ts, 0)}.
ltypes(Ts, Prec) ->
ltypes(Ts, fun ltype/2, Prec).
ltypes(Ts, F, Prec) ->
[F(T, Prec) || T <- Ts].
attr(Name, Args) ->
{first,[$-,{atom,Name}],args(Args, options(none))}.
attrib(Name, Args) ->
{first,[$-,{atom,Name}],[{seq,$(,$),[$,],Args}]}.
pname(['' | As]) ->
[$. | pname(As)];
pname([A]) ->
write(A);
pname([A | As]) ->
[write(A),$.|pname(As)];
pname(A) when is_atom(A) ->
write(A).
falist([]) ->
['[]'];
falist(Falist) ->
L = [begin
{Name,Arity} = Fa,
[{atom,Name},leaf(format("/~w", [Arity]))]
end || Fa <- Falist],
[{seq,$[,$],$,,L}].
lfunction({function,_Line,Name,_Arity,Cs}, Opts) ->
Cll = nl_clauses(fun (C, H) -> func_clause(Name, C, H) end, $;, Opts, Cs),
[Cll,leaf(".\n")].
func_clause(Name, {clause,Line,Head,Guard,Body}, Opts) ->
Hl = call({atom,Line,Name}, Head, 0, Opts),
Gl = guard_when(Hl, Guard, Opts),
Bl = body(Body, Opts),
{step,Gl,Bl}.
guard_when(Before, Guard, Opts) ->
guard_when(Before, Guard, Opts, ' ->').
guard_when(Before, Guard, Opts, After) ->
Gl = lguard(Guard, Opts),
[{list,[{step,Before,Gl}]},After].
lguard([E|Es], Opts) when is_list(E) ->
{list,[{step,'when',expr_list([E|Es], [$;], fun guard0/2, Opts)}]};
lguard([E|Es], Opts) -> % before R6
lguard([[E|Es]], Opts);
lguard([], _) ->
[].
guard0(Es, Opts) ->
expr_list(Es, [$,], fun lexpr/2, Opts).
%% body(Before, Es, Opts) -> [Char].
body([E], Opts) ->
lexpr(E, Opts);
body(Es, Opts) ->
{prefer_nl,[$,],lexprs(Es, Opts)}.
lexpr(E, Opts) ->
lexpr(E, 0, Opts).
lexpr({var,_,V}, _, _) when is_integer(V) -> %Special hack for Robert
leaf(format("_~w", [V]));
lexpr({var,_,V}, _, _) -> leaf(format("~ts", [V]));
lexpr({char,_,C}, _, _) -> {char,C};
lexpr({integer,_,N}, _, _) -> leaf(write(N));
lexpr({float,_,F}, _, _) -> leaf(write(F));
lexpr({atom,_,A}, _, _) -> {atom,A};
lexpr({string,_,S}, _, _) -> {string,S};
lexpr({nil,_}, _, _) -> '[]';
lexpr({cons,_,H,T}, _, Opts) ->
list(T, [H], Opts);
lexpr({lc,_,E,Qs}, _Prec, Opts) ->
P = max_prec(),
Lcl = {list,[{step,[lexpr(E, P, Opts),leaf(" ||")],lc_quals(Qs, Opts)}]},
{list,[{seq,$[,[],[[]],[{force_nl,leaf(" "),[Lcl]}]},$]]};
%% {list,[{step,$[,Lcl},$]]};
lexpr({bc,_,E,Qs}, _Prec, Opts) ->
P = max_prec(),
Lcl = {list,[{step,[lexpr(E, P, Opts),leaf(" ||")],lc_quals(Qs, Opts)}]},
{list,[{seq,'<<',[],[[]],[{force_nl,leaf(" "),[Lcl]}]},'>>']};
%% {list,[{step,'<<',Lcl},'>>']};
lexpr({tuple,_,Elts}, _, Opts) ->
tuple(Elts, Opts);
%%lexpr({struct,_,Tag,Elts}, _, Opts) ->
%% {first,format("~w", [Tag]),tuple(Elts, Opts)};
lexpr({record_index, _, Name, F}, Prec, Opts) ->
{P,R} = preop_prec('#'),
Nl = record_name(Name),
El = [Nl,$.,lexpr(F, R, Opts)],
maybe_paren(P, Prec, El);
lexpr({record, _, Name, Fs}, Prec, Opts) ->
{P,_R} = preop_prec('#'),
Nl = record_name(Name),
El = {first,Nl,record_fields(Fs, Opts)},
maybe_paren(P, Prec, El);
lexpr({record_field, _, Rec, Name, F}, Prec, Opts) ->
{L,P,R} = inop_prec('#'),
Rl = lexpr(Rec, L, Opts),
Sep = hash_after_integer(Rec, [$#]),
Nl = [Sep,{atom,Name},$.],
El = [Rl,Nl,lexpr(F, R, Opts)],
maybe_paren(P, Prec, El);
lexpr({record, _, Rec, Name, Fs}, Prec, Opts) ->
{L,P,_R} = inop_prec('#'),
Rl = lexpr(Rec, L, Opts),
Sep = hash_after_integer(Rec, []),
Nl = record_name(Name),
El = {first,[Rl,Sep,Nl],record_fields(Fs, Opts)},
maybe_paren(P, Prec, El);
lexpr({record_field, _, {atom,_,''}, F}, Prec, Opts) ->
{_L,P,R} = inop_prec('.'),
El = [$.,lexpr(F, R, Opts)],
maybe_paren(P, Prec, El);
lexpr({record_field, _, Rec, F}, Prec, Opts) ->
{L,P,R} = inop_prec('.'),
El = [lexpr(Rec, L, Opts),$.,lexpr(F, R, Opts)],
maybe_paren(P, Prec, El);
lexpr({map, _, Fs}, Prec, Opts) ->
{P,_R} = preop_prec('#'),
El = {first,$#,map_fields(Fs, Opts)},
maybe_paren(P, Prec, El);
lexpr({map, _, Map, Fs}, Prec, Opts) ->
{L,P,_R} = inop_prec('#'),
Rl = lexpr(Map, L, Opts),
Sep = hash_after_integer(Map, [$#]),
El = {first,[Rl|Sep],map_fields(Fs, Opts)},
maybe_paren(P, Prec, El);
lexpr({block,_,Es}, _, Opts) ->
{list,[{step,'begin',body(Es, Opts)},{reserved,'end'}]};
lexpr({'if',_,Cs}, _, Opts) ->
{list,[{step,'if',if_clauses(Cs, Opts)},{reserved,'end'}]};
lexpr({'case',_,Expr,Cs}, _, Opts) ->
{list,[{step,{list,[{step,'case',lexpr(Expr, Opts)},{reserved,'of'}]},
cr_clauses(Cs, Opts)},
{reserved,'end'}]};
lexpr({'cond',_,Cs}, _, Opts) ->
{list,[{step,leaf("cond"),cond_clauses(Cs, Opts)},{reserved,'end'}]};
lexpr({'receive',_,Cs}, _, Opts) ->
{list,[{step,'receive',cr_clauses(Cs, Opts)},{reserved,'end'}]};
lexpr({'receive',_,Cs,To,ToOpt}, _, Opts) ->
Al = {list,[{step,[lexpr(To, Opts),' ->'],body(ToOpt, Opts)}]},
{list,[{step,'receive',cr_clauses(Cs, Opts)},
{step,'after',Al},
{reserved,'end'}]};
lexpr({'fun',_,{function,F,A}}, _Prec, _Opts) ->
[leaf("fun "),{atom,F},leaf(format("/~w", [A]))];
lexpr({'fun',L,{function,_,_}=Func,Extra}, Prec, Opts) ->
{force_nl,fun_info(Extra),lexpr({'fun',L,Func}, Prec, Opts)};
lexpr({'fun',L,{function,M,F,A}}, Prec, Opts)
when is_atom(M), is_atom(F), is_integer(A) ->
%% For backward compatibility with pre-R15 abstract format.
Mod = erl_parse:abstract(M),
Fun = erl_parse:abstract(F),
Arity = erl_parse:abstract(A),
lexpr({'fun',L,{function,Mod,Fun,Arity}}, Prec, Opts);
lexpr({'fun',_,{function,M,F,A}}, _Prec, Opts) ->
%% New format in R15.
NameItem = lexpr(M, Opts),
CallItem = lexpr(F, Opts),
ArityItem = lexpr(A, Opts),
["fun ",NameItem,$:,CallItem,$/,ArityItem];
lexpr({'fun',_,{clauses,Cs}}, _Prec, Opts) ->
{list,[{first,'fun',fun_clauses(Cs, Opts, unnamed)},{reserved,'end'}]};
lexpr({named_fun,_,Name,Cs}, _Prec, Opts) ->
{list,[{first,['fun', " "],fun_clauses(Cs, Opts, {named, Name})},
{reserved,'end'}]};
lexpr({'fun',_,{clauses,Cs},Extra}, _Prec, Opts) ->
{force_nl,fun_info(Extra),
{list,[{first,'fun',fun_clauses(Cs, Opts, unnamed)},{reserved,'end'}]}};
lexpr({named_fun,_,Name,Cs,Extra}, _Prec, Opts) ->
{force_nl,fun_info(Extra),
{list,[{first,['fun', " "],fun_clauses(Cs, Opts, {named, Name})},
{reserved,'end'}]}};
lexpr({call,_,{remote,_,{atom,_,M},{atom,_,F}=N}=Name,Args}, Prec, Opts) ->
case erl_internal:bif(M, F, length(Args)) of
true ->
call(N, Args, Prec, Opts);
false ->
call(Name, Args, Prec, Opts)
end;
lexpr({call,_,Name,Args}, Prec, Opts) ->
call(Name, Args, Prec, Opts);
lexpr({'try',_,Es,Scs,Ccs,As}, _, Opts) ->
{list,[if
Scs =:= [] ->
{step,'try',body(Es, Opts)};
true ->
{step,{list,[{step,'try',body(Es, Opts)},{reserved,'of'}]},
cr_clauses(Scs, Opts)}
end] ++
if
Ccs =:= [] ->
[];
true ->
[{step,'catch',try_clauses(Ccs, Opts)}]
end ++
if
As =:= [] ->
[];
true ->
[{step,'after',body(As, Opts)}]
end ++
[{reserved,'end'}]};
lexpr({'catch',_,Expr}, Prec, Opts) ->
{P,R} = preop_prec('catch'),
El = {list,[{step,'catch',lexpr(Expr, R, Opts)}]},
maybe_paren(P, Prec, El);
lexpr({match,_,Lhs,Rhs}, Prec, Opts) ->
{L,P,R} = inop_prec('='),
Pl = lexpr(Lhs, L, Opts),
Rl = lexpr(Rhs, R, Opts),
El = {list,[{cstep,[Pl,' ='],Rl}]},
maybe_paren(P, Prec, El);
lexpr({op,_,Op,Arg}, Prec, Opts) ->
{P,R} = preop_prec(Op),
Ol = {reserved, leaf(format("~s ", [Op]))},
El = [Ol,lexpr(Arg, R, Opts)],
maybe_paren(P, Prec, El);
lexpr({op,_,Op,Larg,Rarg}, Prec, Opts) when Op =:= 'orelse';
Op =:= 'andalso' ->
%% Breaks lines since R12B.
{L,P,R} = inop_prec(Op),
Ll = lexpr(Larg, L, Opts),
Ol = {reserved, leaf(format("~s", [Op]))},
Lr = lexpr(Rarg, R, Opts),
El = {prefer_nl,[[]],[Ll,Ol,Lr]},
maybe_paren(P, Prec, El);
lexpr({op,_,Op,Larg,Rarg}, Prec, Opts) ->
{L,P,R} = inop_prec(Op),
Ll = lexpr(Larg, L, Opts),
Ol = {reserved, leaf(format("~s", [Op]))},
Lr = lexpr(Rarg, R, Opts),
El = {list,[Ll,Ol,Lr]},
maybe_paren(P, Prec, El);
%% Special expressions which are not really legal everywhere.
lexpr({remote,_,M,F}, Prec, Opts) ->
{L,P,R} = inop_prec(':'),
NameItem = lexpr(M, L, Opts),
CallItem = lexpr(F, R, Opts),
maybe_paren(P, Prec, [NameItem,$:,CallItem]);
%% BIT SYNTAX:
lexpr({bin,_,Fs}, _, Opts) ->
bit_grp(Fs, Opts);
%% Special case for straight values.
lexpr({value,_,Val}, _,_) ->
{value,Val};
%% Now do the hook.
lexpr(Other, _Precedence, #options{hook = none}) ->
leaf(format("INVALID-FORM:~w:",[Other]));
lexpr(HookExpr, Precedence, #options{hook = {Mod,Func,Eas}})
when Mod =/= 'fun' ->
{ehook,HookExpr,Precedence,{Mod,Func,Eas}};
lexpr(HookExpr, Precedence, #options{hook = Func, opts = Options}) ->
{hook,HookExpr,Precedence,Func,Options}.
%% An integer is separated from a following '#' by a space so that
%% erl_scan can handle the result.
hash_after_integer({integer, _, _}, C) ->
[$\s|C];
hash_after_integer({'fun',_,{function, _, _}}, C) ->
[$\s|C];
hash_after_integer({'fun',_,{function, _, _, _}}, C) ->
[$\s|C];
hash_after_integer(_, C) ->
C.
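%% Example (hand-written sketch, not from the test suite): pretty
%% printing field access on the integer 7 emits "7 #r.f" rather than
%% "7#r.f", which erl_scan would otherwise try to read as a
%% based-integer prefix such as "16#ff".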
call(Name, Args, Prec, Opts) ->
{F,P} = func_prec(),
Item = {first,lexpr(Name, F, Opts),args(Args, Opts)},
maybe_paren(P, Prec, Item).
fun_info(Extra) ->
[leaf("% fun-info: "),{value,Extra}].
%% BITS:
bit_grp([], _Opts) ->
leaf("<<>>");
bit_grp(Fs, Opts) ->
append([['<<'], [bit_elems(Fs, Opts)], ['>>']]).
bit_elems(Es, Opts) ->
expr_list(Es, $,, fun bit_elem/2, Opts).
bit_elem({bin_element,_,Expr,Sz,Types}, Opts) ->
P = max_prec(),
VChars = lexpr(Expr, P, Opts),
SChars = if
Sz =/= default ->
[VChars,$:,lexpr(Sz, P, Opts)];
true ->
VChars
end,
if
Types =/= default ->
[SChars,$/|bit_elem_types(Types)];
true ->
SChars
end.
bit_elem_types([T]) ->
[bit_elem_type(T)];
bit_elem_types([T | Rest]) ->
[bit_elem_type(T), $-|bit_elem_types(Rest)].
bit_elem_type({A,B}) ->
[lexpr(erl_parse:abstract(A), options(none)),
$:,
lexpr(erl_parse:abstract(B), options(none))];
bit_elem_type(T) ->
lexpr(erl_parse:abstract(T), options(none)).
%% end of BITS
record_name(Name) ->
[$#,{atom,Name}].
record_fields(Fs, Opts) ->
tuple(Fs, fun record_field/2, Opts).
record_field({record_field,_,F,Val}, Opts) ->
{L,_P,R} = inop_prec('='),
Fl = lexpr(F, L, Opts),
Vl = lexpr(Val, R, Opts),
{list,[{cstep,[Fl,' ='],Vl}]};
record_field({typed_record_field,{record_field,_,F,Val},Type}, Opts) ->
{L,_P,R} = inop_prec('='),
Fl = lexpr(F, L, Opts),
Vl = typed(lexpr(Val, R, Opts), Type),
{list,[{cstep,[Fl,' ='],Vl}]};
record_field({typed_record_field,Field,Type}, Opts) ->
typed(record_field(Field, Opts), Type);
record_field({record_field,_,F}, Opts) ->
lexpr(F, 0, Opts).
map_fields(Fs, Opts) ->
tuple(Fs, fun map_field/2, Opts).
map_field({map_field_assoc,_,K,V}, Opts) ->
Pl = lexpr(K, 0, Opts),
{list,[{step,[Pl,leaf(" =>")],lexpr(V, 0, Opts)}]};
map_field({map_field_exact,_,K,V}, Opts) ->
Pl = lexpr(K, 0, Opts),
{list,[{step,[Pl,leaf(" :=")],lexpr(V, 0, Opts)}]}.
list({cons,_,H,T}, Es, Opts) ->
list(T, [H|Es], Opts);
list({nil,_}, Es, Opts) ->
proper_list(reverse(Es), Opts);
list(Other, Es, Opts) ->
improper_list(reverse(Es, [Other]), Opts).
%% if_clauses(Clauses, Opts) -> [Char].
%% Print 'if' clauses.
if_clauses(Cs, Opts) ->
clauses(fun if_clause/2, Opts, Cs).
if_clause({clause,_,[],G,B}, Opts) ->
Gl = [guard_no_when(G, Opts),' ->'],
{step,Gl,body(B, Opts)}.
guard_no_when([E|Es], Opts) when is_list(E) ->
expr_list([E|Es], $;, fun guard0/2, Opts);
guard_no_when([E|Es], Opts) -> % before R6
guard_no_when([[E|Es]], Opts);
guard_no_when([], _) -> % cannot happen
leaf("true").
%% cr_clauses(Clauses, Opts) -> [Char].
%% Print 'case'/'receive' clauses.
cr_clauses(Cs, Opts) ->
clauses(fun cr_clause/2, Opts, Cs).
cr_clause({clause,_,[T],G,B}, Opts) ->
El = lexpr(T, 0, Opts),
Gl = guard_when(El, G, Opts),
Bl = body(B, Opts),
{step,Gl,Bl}.
%% try_clauses(Clauses, Opts) -> [Char].
%% Print 'try' clauses.
try_clauses(Cs, Opts) ->
clauses(fun try_clause/2, Opts, Cs).
try_clause({clause,_,[{tuple,_,[C,V,S]}],G,B}, Opts) ->
Cs = lexpr(C, 0, Opts),
El = lexpr(V, 0, Opts),
CsEl = [Cs,$:,El],
Sl = stack_backtrace(S, CsEl, Opts),
Gl = guard_when(Sl, G, Opts),
Bl = body(B, Opts),
{step,Gl,Bl}.
stack_backtrace({var,_,'_'}, El, _Opts) ->
El;
stack_backtrace(S, El, Opts) ->
El++[$:,lexpr(S, 0, Opts)].
%% fun_clauses(Clauses, Opts) -> [Char].
%% Print 'fun' clauses.
fun_clauses(Cs, Opts, unnamed) ->
nl_clauses(fun fun_clause/2, [$;], Opts, Cs);
fun_clauses(Cs, Opts, {named, Name}) ->
nl_clauses(fun (C, H) ->
{step,Gl,Bl} = fun_clause(C, H),
{step,[atom_to_list(Name),Gl],Bl}
end, [$;], Opts, Cs).
fun_clause({clause,_,A,G,B}, Opts) ->
El = args(A, Opts),
Gl = guard_when(El, G, Opts),
Bl = body(B, Opts),
{step,Gl,Bl}.
%% cond_clauses(Clauses, Opts) -> [Char].
%% Print 'cond' clauses.
cond_clauses(Cs, Opts) ->
clauses(fun cond_clause/2, Opts, Cs).
cond_clause({clause,_,[],[[E]],B}, Opts) ->
{step,[lexpr(E, Opts),' ->'],body(B, Opts)}.
%% nl_clauses(Type, Opts, Clauses) -> [Char].
%% Generic clause printing function (always breaks lines).
nl_clauses(Type, Sep, Opts, Cs) ->
{prefer_nl,Sep,lexprs(Cs, Type, Opts)}.
%% clauses(Type, Opts, Clauses) -> [Char].
%% Generic clause printing function (breaks lines since R12B).
clauses(Type, Opts, Cs) ->
{prefer_nl,[$;],lexprs(Cs, Type, Opts)}.
%% lc_quals(Qualifiers, After, Opts)
%% List comprehension qualifiers (breaks lines since R12B).
lc_quals(Qs, Opts) ->
{prefer_nl,[$,],lexprs(Qs, fun lc_qual/2, Opts)}.
lc_qual({b_generate,_,Pat,E}, Opts) ->
Pl = lexpr(Pat, 0, Opts),
{list,[{step,[Pl,leaf(" <=")],lexpr(E, 0, Opts)}]};
lc_qual({generate,_,Pat,E}, Opts) ->
Pl = lexpr(Pat, 0, Opts),
{list,[{step,[Pl,leaf(" <-")],lexpr(E, 0, Opts)}]};
lc_qual(Q, Opts) ->
lexpr(Q, 0, Opts).
proper_list(Es, Opts) ->
{seq,$[,$],[$,],lexprs(Es, Opts)}.
improper_list(Es, Opts) ->
{seq,$[,$],[{$,,' |'}],lexprs(Es, Opts)}.
tuple(L, Opts) ->
tuple(L, fun lexpr/2, Opts).
tuple([], _F, _Opts) ->
leaf("{}");
tuple(Es, F, Opts) ->
{seq,${,$},[$,],lexprs(Es, F, Opts)}.
args(As, Opts) ->
{seq,$(,$),[$,],lexprs(As, Opts)}.
expr_list(Es, Sep, F, Opts) ->
{seq,[],[],Sep,lexprs(Es, F, Opts)}.
lexprs(Es, Opts) ->
lexprs(Es, fun lexpr/2, Opts).
lexprs(Es, F, Opts) ->
[F(E, Opts) || E <- Es].
maybe_paren(P, Prec, Expr) when P < Prec ->
[$(,Expr,$)];
maybe_paren(_P, _Prec, Expr) ->
Expr.
leaf(S) ->
{leaf,string:length(S),S}.
%%% Do the formatting. Currently nothing fancy. Could probably have
%%% done it in one single pass.
frmt(Item, PP) ->
frmt(Item, 0, PP).
frmt(Item, I, PP) ->
ST = spacetab(),
WT = wordtable(),
{Chars,_Length} = f(Item, I, ST, WT, PP),
[Chars].
%%% What the tags mean:
%%% - C: a character
%%% - [I|Is]: Is follow after I without newline or space
%%% - {list,IPs}: try to put all IPs on one line, if that fails newlines
%%% and indentation are inserted between IPs.
%%% - {first,I,IP2}: IP2 follows after I, and is output with an indentation
%%% updated with the width of I.
%%% - {seq,Before,After,Separator,IPs}: a sequence of Is separated by
%%% Separator. Before is output before IPs, and the indentation of IPs
%%% is updated with the width of Before. After follows after IPs.
%%% - {force_nl,ExtraInfo,I}: fun-info (a comment) forces linebreak before I.
%%% - {prefer_nl,Sep,IPs}: forces linebreak between Is unless negative
%%% indentation.
%%% - {atom,A}: an atom
%%% - {singleton_atom_type,A}: a singleton atom type
%%% - {char,C}: a character
%%% - {string,S}: a string.
%%% - {value,T}: a term.
%%% - {hook,...}, {ehook,...}: hook expressions.
%%%
%%% list, first, seq, force_nl, and prefer_nl all accept IPs, where each
%%% element is either an item or a tuple {step|cstep,I1,I2}. step means
%%% that I2 is output after linebreak and an incremented indentation.
%%% cstep works similarly, but no linebreak if the width of I1 is less
%%% than the indentation (this is for "A = <expression over several lines>").
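%%% For example, leaf("foo") (leaf/1 above) yields {leaf,3,"foo"}: the
%%% printable characters paired with their width.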
f([]=Nil, _I0, _ST, _WT, _PP) ->
{Nil,0};
f(C, _I0, _ST, _WT, _PP) when is_integer(C) ->
{C,1};
f({leaf,Length,Chars}, _I0, _ST, _WT, _PP) ->
{Chars,Length};
f([Item|Items], I0, ST, WT, PP) ->
consecutive(Items, f(Item, I0, ST, WT, PP), I0, ST, WT, PP);
f({list,Items}, I0, ST, WT, PP) ->
f({seq,[],[],[[]],Items}, I0, ST, WT, PP);
f({first,E,Item}, I0, ST, WT, PP) ->
f({seq,E,[],[[]],[Item]}, I0, ST, WT, PP);
f({seq,Before,After,Sep,LItems}, I0, ST, WT, PP) ->
BCharsSize = f(Before, I0, ST, WT, PP),
I = indent(BCharsSize, I0),
CharsSizeL = fl(LItems, Sep, I, After, ST, WT, PP),
{CharsL,SizeL} = unz(CharsSizeL),
{BCharsL,BSizeL} = unz1([BCharsSize]),
Sizes = BSizeL ++ SizeL,
NSepChars = if
is_list(Sep), Sep =/= [] ->
erlang:max(0, length(CharsL)-1); % not string:length
true ->
0
end,
case same_line(I0, Sizes, NSepChars) of
{yes,Size} ->
Chars = if
NSepChars > 0 -> insert_sep(CharsL, $\s);
true -> CharsL
end,
{BCharsL++Chars,Size};
no ->
CharsList = handle_step(CharsSizeL, I, ST),
{LChars, LSize} =
maybe_newlines(CharsList, LItems, I, NSepChars, ST),
{[BCharsL,LChars],nsz(LSize, I0)}
end;
f({force_nl,_ExtraInfoItem,Item}, I, ST, WT, PP) when I < 0 ->
%% Extra info is a comment; cannot have that on the same line
f(Item, I, ST, WT, PP);
f({force_nl,ExtraInfoItem,Item}, I, ST, WT, PP) ->
f({prefer_nl,[],[ExtraInfoItem,Item]}, I, ST, WT, PP);
f({prefer_nl,Sep,LItems}, I, ST, WT, PP) when I < 0 ->
f({seq,[],[],Sep,LItems}, I, ST, WT, PP);
f({prefer_nl,Sep,LItems}, I0, ST, WT, PP) ->
CharsSize2L = fl(LItems, Sep, I0, [], ST, WT, PP),
{_CharsL,Sizes} = unz(CharsSize2L),
if
Sizes =:= [] ->
{[], 0};
true ->
{insert_newlines(CharsSize2L, I0, ST),
nsz(lists:last(Sizes), I0)}
end;
f({value,V}, I, ST, WT, PP) ->
f(write_a_value(V, PP), I, ST, WT, PP);
f({atom,A}, I, ST, WT, PP) ->
f(write_an_atom(A, PP), I, ST, WT, PP);
f({singleton_atom_type,A}, I, ST, WT, PP) ->
f(write_a_singleton_atom_type(A, PP), I, ST, WT, PP);
f({char,C}, I, ST, WT, PP) ->
f(write_a_char(C, PP), I, ST, WT, PP);
f({string,S}, I, ST, WT, PP) ->
f(write_a_string(S, I, PP), I, ST, WT, PP);
f({reserved,R}, I, ST, WT, PP) ->
f(R, I, ST, WT, PP);
f({hook,HookExpr,Precedence,Func,Options}, I, _ST, _WT, _PP) ->
Chars = Func(HookExpr, I, Precedence, Options),
{Chars,indentation(Chars, I)};
f({ehook,HookExpr,Precedence,{Mod,Func,Eas}=ModFuncEas}, I, _ST, _WT, _PP) ->
Chars = apply(Mod, Func, [HookExpr,I,Precedence,ModFuncEas|Eas]),
{Chars,indentation(Chars, I)};
f(WordName, _I, _ST, WT, _PP) when is_atom(WordName) ->
word(WordName, WT).
-define(IND, 4).
%% fl(ListItems, Sep, I0, After, ST, WT, PP) -> [[CharsSize1,CharsSize2]]
%% ListItems = [{Item,Items}|Item]
fl([], _Sep, I0, After, ST, WT, PP) ->
[[f(After, I0, ST, WT, PP),{[],0}]];
fl(CItems, Sep0, I0, After, ST, WT, PP) ->
F = fun({step,Item1,Item2}, S) ->
[f(Item1, I0, ST, WT, PP),
f([Item2,S], incr(I0, ?IND), ST, WT, PP)];
({cstep,Item1,Item2}, S) ->
{_,Sz1} = CharSize1 = f(Item1, I0, ST, WT, PP),
if
is_integer(Sz1), Sz1 < ?IND ->
Item2p = [leaf("\s"),Item2,S],
[consecutive(Item2p, CharSize1, I0, ST, WT, PP),{[],0}];
true ->
[CharSize1,f([Item2,S], incr(I0, ?IND), ST, WT, PP)]
end;
({reserved,Word}, S) ->
[f([Word,S], I0, ST, WT, PP),{[],0}];
(Item, S) ->
[f([Item,S], I0, ST, WT, PP),{[],0}]
end,
{Sep,LastSep} = sep(Sep0),
fl1(CItems, F, Sep, LastSep, After).
sep([{S,LS}]) -> {[S],[LS]};
sep({_,_}=Sep) -> Sep;
sep(S) -> {S, S}.
fl1([CItem], F, _Sep, _LastSep, After) ->
[F(CItem,After)];
fl1([CItem1,CItem2], F, _Sep, LastSep, After) ->
[F(CItem1, LastSep),F(CItem2, After)];
fl1([CItem|CItems], F, Sep, LastSep, After) ->
[F(CItem, Sep)|fl1(CItems, F, Sep, LastSep, After)].
consecutive(Items, CharSize1, I0, ST, WT, PP) ->
{CharsSizes,_Length} =
mapfoldl(fun(Item, Len) ->
CharsSize = f(Item, Len, ST, WT, PP),
{CharsSize,indent(CharsSize, Len)}
end, indent(CharSize1, I0), Items),
{CharsL,SizeL} = unz1([CharSize1|CharsSizes]),
{CharsL,line_size(SizeL)}.
unz(CharsSizesL) ->
unz1(append(CharsSizesL)).
unz1(CharSizes) ->
lists:unzip(nonzero(CharSizes)).
nonzero(CharSizes) ->
lists:filter(fun({_,Sz}) -> Sz =/= 0 end, CharSizes).
maybe_newlines([{Chars,Size}], [], _I, _NSepChars, _ST) ->
{Chars,Size};
maybe_newlines(CharsSizeList, Items, I, NSepChars, ST) when I >= 0 ->
maybe_sep(CharsSizeList, Items, I, NSepChars, nl_indent(I, ST)).
maybe_sep([{Chars1,Size1}|CharsSizeL], [Item|Items], I0, NSepChars, Sep) ->
I1 = case classify_item(Item) of
atomic ->
I0 + Size1;
_ ->
?MAXLINE+1
end,
maybe_sep1(CharsSizeL, Items, I0, I1, Sep, NSepChars, Size1, [Chars1]).
maybe_sep1([{Chars,Size}|CharsSizeL], [Item|Items],
I0, I, Sep, NSepChars, Sz0, A) ->
case classify_item(Item) of
atomic when is_integer(Size) ->
Size1 = Size + 1,
I1 = I + Size1,
if
I1 =< ?MAXLINE ->
A1 = if
NSepChars > 0 -> [Chars,$\s|A];
true -> [Chars|A]
end,
maybe_sep1(CharsSizeL, Items, I0, I1, Sep, NSepChars,
Sz0 + Size1, A1);
true ->
A1 = [Chars,Sep|A],
maybe_sep1(CharsSizeL, Items, I0, I0 + Size, Sep,
NSepChars, Size1, A1)
end;
_ ->
A1 = [Chars,Sep|A],
maybe_sep1(CharsSizeL, Items, I0, ?MAXLINE+1, Sep, NSepChars,
0, A1)
end;
maybe_sep1(_CharsSizeL, _Items, _Io, _I, _Sep, _NSepChars, Sz, A) ->
{lists:reverse(A), Sz}.
insert_newlines(CharsSizesL, I, ST) when I >= 0 ->
{CharsL, _} = unz1(handle_step(CharsSizesL, I, ST)),
insert_nl(CharsL, I, ST).
handle_step(CharsSizesL, I, ST) ->
map(fun([{_C1,0},{_C2,0}]) ->
{[], 0};
([{C1,Sz1},{_C2,0}]) ->
{C1, Sz1};
([{C1,Sz1},{C2,Sz2}]) when Sz2 > 0 ->
{insert_nl([C1,C2], I+?IND, ST),line_size([Sz1,Sz2])}
end, CharsSizesL).
insert_nl(CharsL, I, ST) ->
insert_sep(CharsL, nl_indent(I, ST)).
insert_sep([Chars1|CharsL], Sep) ->
[Chars1 | [[Sep,Chars] || Chars <- CharsL]].
nl_indent(0, _T) ->
$\n;
nl_indent(I, T) when I > 0 ->
[$\n|spaces(I, T)].
classify_item({atom, _}) -> atomic;
classify_item({singleton_atom_type, _}) -> atomic;
classify_item(Atom) when is_atom(Atom) -> atomic;
classify_item({leaf, _, _}) -> atomic;
classify_item(_) -> complex.
same_line(I0, SizeL, NSepChars) ->
try
Size = lists:sum(SizeL) + NSepChars,
true = incr(I0, Size) =< ?MAXLINE,
{yes,Size}
catch _:_ ->
no
end.
line_size(SizeL) ->
line_size(SizeL, 0, false).
line_size([], Size, false) ->
Size;
line_size([], Size, true) ->
{line,Size};
line_size([{line,Len}|SizeL], _, _) ->
line_size(SizeL, Len, true);
line_size([Sz|SizeL], SizeSoFar, LF) ->
line_size(SizeL, SizeSoFar+Sz, LF).
nsz({line,_Len}=Sz, _I) ->
Sz;
nsz(Size, I) when I >= 0 ->
{line,Size+I}.
indent({_Chars,{line,Len}}, _I) ->
Len;
indent({_Chars,Size}, I) ->
incr(I, Size).
incr(I, _Incr) when I < 0 ->
I;
incr(I, Incr) ->
I+Incr.
indentation(E, I) when I < 0 ->
string:length(E);
indentation(E, I0) ->
I = io_lib_format:indentation(E, I0),
case has_nl(E) of
true -> {line,I};
false -> I
end.
has_nl([$\n|_]) ->
true;
has_nl([C|Cs]) when is_integer(C) ->
has_nl(Cs);
has_nl([C|Cs]) ->
has_nl(C) orelse has_nl(Cs);
has_nl([]) ->
false.
write_a_value(V, PP) ->
flat_leaf(write_value(V, PP)).
write_an_atom(A, PP) ->
flat_leaf(write_atom(A, PP)).
write_a_singleton_atom_type(A, PP) ->
flat_leaf(write_singleton_atom_type(A, PP)).
write_a_char(C, PP) ->
flat_leaf(write_char(C, PP)).
-define(MIN_SUBSTRING, 5).
write_a_string(S, I, PP) when I < 0; S =:= [] ->
flat_leaf(write_string(S, PP));
write_a_string(S, I, PP) ->
Len = erlang:max(?MAXLINE-I, ?MIN_SUBSTRING),
{list,write_a_string(S, Len, Len, PP)}.
write_a_string([], _N, _Len, _PP) ->
[];
write_a_string(S, N, Len, PP) ->
SS = string:slice(S, 0, N),
Sl = write_string(SS, PP),
case (string:length(Sl) > Len) and (N > ?MIN_SUBSTRING) of
true ->
write_a_string(S, N-1, Len, PP);
false ->
[flat_leaf(Sl) |
write_a_string(string:slice(S, string:length(SS)), Len, Len, PP)]
end.
flat_leaf(S) ->
L = lists:flatten(S),
{leaf,string:length(L),L}.
write_value(V, PP) ->
(PP#pp.value_fun)(V).
write_atom(A, PP) ->
(PP#pp.value_fun)(A).
write_singleton_atom_type(A, PP) ->
(PP#pp.singleton_atom_type_fun)(A).
write_string(S, PP) ->
(PP#pp.string_fun)(S).
write_char(C, PP) ->
(PP#pp.char_fun)(C).
%%
%% Utilities
%%
a0() ->
erl_anno:new(0).
-define(N_SPACES, 30).
spacetab() ->
{[_|L],_} = mapfoldl(fun(_, A) -> {A,[$\s|A]}
end, [], lists:seq(0, ?N_SPACES)),
list_to_tuple(L).
spaces(N, T) when N =< ?N_SPACES ->
element(N, T);
spaces(N, T) ->
[element(?N_SPACES, T)|spaces(N-?N_SPACES, T)].
wordtable() ->
L = [begin {leaf,Sz,S} = leaf(W), {S,Sz} end ||
W <- [" ->"," =","<<",">>","[]","after","begin","case","catch",
"end","fun","if","of","receive","try","when"," ::","..",
" |"]],
list_to_tuple(L).
word(' ->', WT) -> element(1, WT);
word(' =', WT) -> element(2, WT);
word('<<', WT) -> element(3, WT);
word('>>', WT) -> element(4, WT);
word('[]', WT) -> element(5, WT);
word('after', WT) -> element(6, WT);
word('begin', WT) -> element(7, WT);
word('case', WT) -> element(8, WT);
word('catch', WT) -> element(9, WT);
word('end', WT) -> element(10, WT);
word('fun', WT) -> element(11, WT);
word('if', WT) -> element(12, WT);
word('of', WT) -> element(13, WT);
word('receive', WT) -> element(14, WT);
word('try', WT) -> element(15, WT);
word('when', WT) -> element(16, WT);
word(' ::', WT) -> element(17, WT);
word('..', WT) -> element(18, WT);
word(' |', WT) -> element(19, WT). | lib/stdlib/src/erl_pp.erl | 0.50415 | 0.410993 | erl_pp.erl | starcoder |
%%
%% @doc <NAME>'s 5-card Poker hand evaluator, ported to Erlang.
%%
-module(poker).
-export([hand_rank/1, sort_hands/1, winners/1]).
%% Return a list of the ranks, sorted with higher first.
%% Hand is a list of 2-char strings, i.e. ["6H","3D","AS","TH","JC"].
card_ranks(Hand) ->
Ranks = [string:str("-23456789TJQKA", [R]) || [R,_] <- Hand],
case lists:sort(fun erlang:'>'/2, Ranks) of
[14,5,4,3,2] -> [5,4,3,2,1];
SortedRanks -> SortedRanks
end.
%% Return a value indicating the ranking of a hand.
hand_rank(Hand) ->
CardRanks = card_ranks(Hand),
HandRanks = [R || F <- poker_hands(),
begin R = F(CardRanks, Hand), R /= undefined end],
hd(HandRanks).
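%% A hedged illustration of the ranking values (the hand strings are made
%% up; the results follow from the clauses below):
%%
%%   1> poker:hand_rank(["TH","JH","QH","KH","AH"]).
%%   [8,14]                 % straight flush, ace high
%%   2> poker:hand_rank(["9H","9D","9S","9C","2H"]).
%%   [7,9,2]                % four nines, deuce kicker
%%
%% Because [8,14] > [7,9,2] in Erlang's term order, the straight flush
%% wins when the two rank lists are compared.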
%% http://en.wikipedia.org/wiki/List_of_poker_hands
poker_hands() ->
[
fun straight_flush/2,
fun four_of_kind/2,
fun full_house/2,
fun flush/2,
fun straight/2,
fun three_of_kind/2,
fun two_pair/2,
fun pair/2,
fun high_card/2
].
straight_flush(Ranks, Hand) ->
case {straight(Ranks,Hand), flush(Ranks,Hand)} of
{[4,R], [5|_]} -> [8, R];
_ -> undefined
end.
four_of_kind([H,L,L,L,L], _) -> [7, L, H];
four_of_kind([H,H,H,H,L], _) -> [7, H, L];
four_of_kind(_,_) -> undefined.
full_house([H,H,H,L,L], _) -> [6, H, L];
full_house([H,H,L,L,L], _) -> [6, L, H];
full_house(_,_) -> undefined.
flush(Ranks, [[_,S], [_,S], [_,S], [_,S], [_,S]]) -> [5 | Ranks];
flush(_,_) -> undefined.
straight([R1, R2, R3, R4, R5], _)
when R1 == R2+1, R2 == R3+1, R3 == R4+1, R4 == R5+1 -> [4, R1];
straight(_,_) -> undefined.
three_of_kind([R,R,R,_,_] = Ranks, _) -> [3, R | Ranks];
three_of_kind([_,R,R,R,_] = Ranks, _) -> [3, R | Ranks];
three_of_kind([_,_,R,R,R] = Ranks, _) -> [3, R | Ranks];
three_of_kind(_,_) -> undefined.
two_pair([H,H,L,L,_] = Ranks, _) -> [2, H, L | Ranks];
two_pair([H,H,_,L,L] = Ranks, _) -> [2, H, L | Ranks];
two_pair([_,H,H,L,L] = Ranks, _) -> [2, H, L | Ranks];
two_pair(_,_) -> undefined.
pair([R,R,_,_,_] = Ranks, _) -> [1, R | Ranks];
pair([_,R,R,_,_] = Ranks, _) -> [1, R | Ranks];
pair([_,_,R,R,_] = Ranks, _) -> [1, R | Ranks];
pair([_,_,_,R,R] = Ranks, _) -> [1, R | Ranks];
pair(_,_) -> undefined.
high_card(Ranks, _) -> [0 | Ranks].
%% To find winners, we just sort hands according to their rankings
sort_hands(Hands) ->
lists:sort(fun(H1, H2) -> hand_rank(H2) =< hand_rank(H1) end, Hands).
winners(Hands) ->
SortedHands = sort_hands(Hands),
HighestRank = hand_rank(hd(SortedHands)),
[H || H <- SortedHands, hand_rank(H) == HighestRank].
%% ============================================================================
%% Unit tests
%% ============================================================================
-include_lib("eunit/include/eunit.hrl").
list_to_hand(HandStr) ->
re:split(HandStr, " ", [{return,list}]).
card_ranks_test() ->
?assertEqual([6,5,4,3,2], card_ranks(list_to_hand("2H 3H 4H 5H 6H"))),
?assertEqual([5,4,3,2,1], card_ranks(list_to_hand("2S 3H 4C 5D AH"))).
hand_rank_result(HandStr) ->
Hand = list_to_hand(HandStr),
hand_rank(Hand).
straight_flush_test() ->
?assertEqual([8,6], hand_rank_result("2H 4H 6H 3H 5H")),
?assertEqual([8,5], hand_rank_result("2H 4H AH 3H 5H")),
?assertEqual([8,14], hand_rank_result("TH QH AH KH JH")).
four_of_kind_test() ->
?assertEqual([7,2,6], hand_rank_result("2H 2D 6H 2S 2C")),
?assertEqual([7,6,2], hand_rank_result("2H 6D 6H 6S 6C")).
full_house_test() ->
?assertEqual([6,6,2], hand_rank_result("2H 6D 6H 2S 6C")),
?assertEqual([6,2,6], hand_rank_result("2H 2D 6H 2S 6C")).
flush_test() ->
?assertEqual([5,6,6,5,4,2], hand_rank_result("2H 4H 6H 6H 5H")).
straight_test() ->
?assertEqual([4,6], hand_rank_result("2H 4H 6D 3H 5H")).
three_of_kind_test() ->
?assertEqual([3,2,14,4,2,2,2], hand_rank_result("2H 4H 2D AH 2S")).
two_pair_test() ->
?assertEqual([2,6,2,13,6,6,2,2], hand_rank_result("2H KD 6H 2S 6C")).
pair_test() ->
?assertEqual([1,2,13,12,6,2,2], hand_rank_result("2H KD QH 2S 6C")).
high_card_test() ->
?assertEqual([0,14,13,12,6,2], hand_rank_result("AH KD QH 2S 6C")).
sort_hands_test() ->
RoyalFlushSpades = list_to_hand("AS QS TS JS KS"),
RoyalFlushClubs = list_to_hand("AC JC TC QC KC"),
StraightFlushTenHigh = list_to_hand("7C 6C TC 8C 9C"),
StraightFlushSevenHigh = list_to_hand("7D 4D 5D 3D 6D"),
SteelWheel = list_to_hand("3H 4H 5H AH 2H"),
NineQuadsJackKicker = list_to_hand("9H JD 9S 9C 9D"),
NineQuadsFiveKicker = list_to_hand("9H 5S 9S 9C 9D"),
QueensOverNines = list_to_hand("9H QS QD QC 9D"),
QueensOverThrees = list_to_hand("QH 3S QD QC 3D"),
KingFlush = list_to_hand("JS 2S 4S KS 7S"),
QueenFlushJackHigh = list_to_hand("JC 2C QC 5C 7C"),
QueenFlushEightHigh = list_to_hand("8C 2C QC 5C 7C"),
QueenFlushEightHighSix = list_to_hand("8C 2C QC 5C 6C"),
StraightNineHigh = list_to_hand("7C 6S 5D 8H 9C"),
StraightSixHigh = list_to_hand("2D 4C 5S 3D 6H"),
Wheel = list_to_hand("3H 4H 5D AD 2S"),
ThreeQueens = list_to_hand("5H 4C QD QC QS"),
ThreeTensSixKicker = list_to_hand("5H TH 6D TC TS"),
ThreeTensFiveKicker = list_to_hand("5H TH 4D TC TS"),
ThreeTensFiveKicker3 = list_to_hand("5H TH 3D TC TS"),
KingsOverNines = list_to_hand("9H KH 9D KC 2S"),
JackOverFoursAceKicker = list_to_hand("4D 4H JD JH AC"),
JackOverFoursTenKicker = list_to_hand("4D 4H JD JH TS"),
TwoAces = list_to_hand("AD 4H JD 7H AS"),
TwoTensAceHigh = list_to_hand("AD TH TD 7H 8S"),
TwoTensSixHigh = list_to_hand("6D TH TD 4H 5S"),
TwoTensSixHigh4 = list_to_hand("6D TH TD 4H 3S"),
TwoTensSixHigh42 = list_to_hand("6D TH TD 4H 2S"),
AceHigh = list_to_hand("6D AH TD 4H 2S"),
JackHigh8 = list_to_hand("8D 5H 7D JH 2S"),
JackHigh86 = list_to_hand("8D 5H 6D JH 2S"),
JackHigh864 = list_to_hand("8D 4H 6D JH 2S"),
?assertEqual(
[
RoyalFlushSpades,
RoyalFlushClubs,
StraightFlushTenHigh,
StraightFlushSevenHigh,
SteelWheel,
NineQuadsJackKicker,
NineQuadsFiveKicker,
QueensOverNines,
QueensOverThrees,
KingFlush,
QueenFlushJackHigh,
QueenFlushEightHigh,
QueenFlushEightHighSix,
StraightNineHigh,
StraightSixHigh,
Wheel,
ThreeQueens,
ThreeTensSixKicker,
ThreeTensFiveKicker,
ThreeTensFiveKicker3,
KingsOverNines,
JackOverFoursAceKicker,
JackOverFoursTenKicker,
TwoAces,
TwoTensAceHigh,
TwoTensSixHigh,
TwoTensSixHigh4,
TwoTensSixHigh42,
AceHigh,
JackHigh8,
JackHigh86,
JackHigh864
],
sort_hands([
StraightNineHigh,
ThreeQueens,
AceHigh,
QueenFlushEightHighSix,
TwoTensSixHigh4,
ThreeTensFiveKicker3,
JackHigh864,
QueensOverNines,
KingFlush,
ThreeTensFiveKicker,
NineQuadsFiveKicker,
TwoAces,
StraightFlushSevenHigh,
JackHigh86,
TwoTensSixHigh,
TwoTensAceHigh,
JackHigh8,
QueenFlushEightHigh,
QueensOverThrees,
JackOverFoursTenKicker,
Wheel,
SteelWheel,
NineQuadsJackKicker,
StraightSixHigh,
StraightFlushTenHigh,
ThreeTensSixKicker,
RoyalFlushSpades,
TwoTensSixHigh42,
QueenFlushJackHigh,
JackOverFoursAceKicker,
KingsOverNines,
RoyalFlushClubs
])
),
?assertEqual([SteelWheel], winners([Wheel, SteelWheel])),
?assertEqual([RoyalFlushSpades, RoyalFlushClubs], winners([RoyalFlushSpades, RoyalFlushClubs])). | lib/examples/src/poker.erl | 0.580233 | 0.588416 | poker.erl | starcoder |
% ==============================================================================
% Uniform distribution
% ==============================================================================
-module(uniform).
-export([pdf/3, cdf/3, invcdf/3, rnd/3]).
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
-endif.
% ------------------------------------------------------------------------------
% pdf - Uniform probability density function
% ------------------------------------------------------------------------------
pdf(X,A,_) when X < A -> 0.0;
pdf(X,_,B) when X > B -> 0.0;
pdf(_,A,B) ->
1/(B-A).
% ------------------------------------------------------------------------------
% cdf - Uniform cumulative distribution function
% ------------------------------------------------------------------------------
cdf(X,A,_) when X < A -> 0.0;
cdf(X,_,B) when X > B -> 1.0;
cdf(X,A,B) ->
(X-A)/(B-A).
% ------------------------------------------------------------------------------
% invcdf - Inverse uniform distribution function
% ------------------------------------------------------------------------------
invcdf(P,_,_) when P < 0 orelse P > 1 -> {error,"Invalid probability"};
invcdf(P,A,B) ->
A + P*(B-A).
% ------------------------------------------------------------------------------
% rnd - RNG function
% ------------------------------------------------------------------------------
rnd(N,A,B) ->
lists:map(fun(_) -> invcdf(rand:uniform(),A,B) end, lists:seq(1,N)).
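% A hedged usage sketch (actual draws are random, so only structural
% properties are shown):
%
%   1> Xs = uniform:rnd(3, 0, 1).
%   2> length(Xs).
%   3
%   3> lists:all(fun(X) -> X >= 0 andalso X =< 1 end, Xs).
%   true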
% ==============================================================================
% EUnit tests
% ------------------------------------------------------------------------------
-ifdef(TEST).
% ------------------------------------------------------------------------------
% pdf tests
% ------------------------------------------------------------------------------
pdf_test() ->
?assertEqual(0.0, pdf(-0.999,0,10)),
?assertEqual(0.1, pdf(0,0,10)),
?assertEqual(0.1, pdf(2,0,10)),
?assertEqual(0.1, pdf(5,0,10)),
?assertEqual(0.1, pdf(7,0,10)),
?assertEqual(0.1, pdf(10,0,10)),
?assertEqual(0.0, pdf(10.001,0,10)).
% ------------------------------------------------------------------------------
% cdf tests
% ------------------------------------------------------------------------------
cdf_test() ->
?assertEqual(0.0, cdf(-0.999,0,10)),
?assertEqual(0.0, cdf(0,0,10)),
?assertEqual(0.2, cdf(2,0,10)),
?assertEqual(0.5, cdf(5,0,10)),
?assertEqual(0.7, cdf(7,0,10)),
?assertEqual(1.0, cdf(10,0,10)),
?assertEqual(1.0, cdf(10.001,0,10)).
% ------------------------------------------------------------------------------
% invcdf tests
% ------------------------------------------------------------------------------
invcdf_test() ->
?assertEqual(1.0, invcdf(0.1,0,10)),
?assertEqual(5.0, invcdf(0.5,0,10)),
?assertEqual(10.0, invcdf(1.0,0,10)).
invcdf_error_test() ->
?assertEqual({error,"Invalid probability"}, invcdf(-0.1,0,10)),
?assertEqual({error,"Invalid probability"}, invcdf(1.1,0,10)).
% ------------------------------------------------------------------------------
% rng tests
% ------------------------------------------------------------------------------
rnd_positive_test() ->
[X] = rnd(1,0,10),
?assert(X >= 0.0 andalso X =< 10.0).
rnd_length_test() ->
Xs = rnd(23,1,3),
?assert(length(Xs) =:= 23).
-endif. | src/uniform.erl | 0.513181 | 0.521288 | uniform.erl | starcoder |
%% -------------------------------------------------------------------
%% luwak_mr: utilities for map/reducing on Luwak data
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% -------------------------------------------------------------------
%% @doc Tools for map/reducing Luwak data.
%%
%% The primary tool in this module is a function that conforms to
%% the interface for "dynamic map/reduce inputs." This function
%% will allow you to set up a map/reduce process for running a
%% computation across the blocks of a Luwak file.
%%
%% To use the function via the Erlang client:
%%```
%% C:mapred({modfun, luwak_mr, file, <<"my_file_name">>},
%% [... your query ...]).
%%'''
%% Over HTTP, structure your JSON query like:
%%```
%% {"inputs":{"module":"luwak_mr",
%% "function":"file",
%% "arg":"my_file_name"},
%% "query":[... your query ...]}
%%'''
%%
%% The luwak_mr:file/3 function will send an input to the
%% map/reduce query for each block in the file. The "KeyData"
%% for the block will be its offset in the file. As a trivial
%% example, you might use this to get an ordered list of the
%% first byte of each block like so:
%%```
%% F = fun(B, O, _) ->
%% <<Y, _/binary>> = luwak_block:data(B),
%% [{Y, O}]
%% end,
%% {ok, Bytes} = C:mapred({modfun,luwak_mr,file,<<"name">>},
%% [{map, {qfun, F}, none, true}]),
%% OrderedBytes = lists:keysort(2, Bytes),
%% [ Y || {Y, _} <- OrderedBytes ].
%%'''
-module(luwak_mr).
-export([file/3]).
-include("luwak.hrl").
%% @spec file(pid(), binary(), integer()) -> ok
%% @doc Sends the bucket-keys for the blocks of a Luwak file as
%% map/reduce inputs to the specified FlowPid. Use it by
%% specifying the map/reduce input as:
%%```
%% {modfun, luwak_mr, file, <<"file_name">>}
%%'''
file(FlowPid, Filename, _Timeout) when is_binary(Filename) ->
{ok, Client} = riak:local_client(),
{ok, File} = luwak_file:get(Client, Filename),
V = riak_object:get_value(File),
{block_size, BlockSize} = lists:keyfind(block_size, 1, V),
case lists:keyfind(root, 1, V) of
{root, RootKey} -> tree(FlowPid, Client, BlockSize, RootKey, 0);
false -> ok
end,
luke_flow:finish_inputs(FlowPid).
%% @spec tree(pid(), riak_client(), integer(), binary(), integer())
%% -> integer()
%% @doc Recursive tree walker used by file/3. This function assumes
%% that a child link in a tree is a data block if the size it
%% lists is less than or equal to the specified BlockSize, and
%% that it is a subtree if the size is greater than BlockSize.
%%
%% The result is the offset of the byte that would immediately
%% follow all of the bytes in this tree. This fact is unused,
%% but *could* be used for testing an invariant.
tree(FlowPid, Client, BlockSize, Key, Offset) ->
{ok, #n{children=Children}} = luwak_tree:get(Client, Key),
lists:foldl(
fun({SubTree, Size}, SubOffset) when Size > BlockSize ->
tree(FlowPid, Client, BlockSize, SubTree, SubOffset),
SubOffset+Size;
({Leaf, Size}, LeafOffset) ->
luke_flow:add_inputs(
FlowPid, [{{?N_BUCKET, Leaf}, LeafOffset}]),
LeafOffset+Size
end,
Offset,
Children). | mapreduce/erlang/luwak_mr.erl | 0.676299 | 0.787686 | luwak_mr.erl | starcoder |
%%
%% %CopyrightBegin%
%%
%% Copyright Ericsson AB 2002-2009. All Rights Reserved.
%%
%% The contents of this file are subject to the Erlang Public License,
%% Version 1.1, (the "License"); you may not use this file except in
%% compliance with the License. You should have received a copy of the
%% Erlang Public License along with this software. If not, it can be
%% retrieved online at http://www.erlang.org/.
%%
%% Software distributed under the License is distributed on an "AS IS"
%% basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
%% the License for the specific language governing rights and limitations
%% under the License.
%%
%% %CopyrightEnd%
%%
-module(packages).
-export([to_string/1, concat/1, concat/2, is_valid/1, is_segmented/1,
split/1, last/1, first/1, strip_last/1, find_modules/1,
find_modules/2]).
%% A package name (or a package-qualified module name) may be an atom or
%% a string (list of nonnegative integers) - not a deep list, and not a
%% list containing atoms. A name may be empty, but may not contain two
%% consecutive period (`.') characters or end with a period character.
-type package_name() :: atom() | string().
-spec to_string(package_name()) -> string().
to_string(Name) when is_atom(Name) ->
atom_to_list(Name);
to_string(Name) ->
Name.
%% `concat' does not insert a leading period if the first segment is
%% empty. However, the result may contain leading, consecutive or
%% dangling period characters, if any of the segments after the first
%% are empty. Use 'is_valid' to check the result if necessary.
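%% A hedged example (the names are made up); note how atoms and strings
%% mix, and how an empty trailing segment produces a dangling period:
%%
%%   1> packages:concat([foo, "bar", baz]).
%%   "foo.bar.baz"
%%   2> packages:concat("foo", "").
%%   "foo."
%%   3> packages:is_valid("foo.").
%%   false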
-spec concat(package_name(), package_name()) -> string().
concat(A, B) ->
concat([A, B]).
-spec concat([package_name()]) -> string().
concat([H | T]) when is_atom(H) ->
concat([atom_to_list(H) | T]);
concat(["" | T]) ->
concat_1(T);
concat(L) ->
concat_1(L).
concat_1([H | T]) when is_atom(H) ->
concat_1([atom_to_list(H) | T]);
concat_1([H]) ->
H;
concat_1([H | T]) ->
H ++ "." ++ concat_1(T);
concat_1([]) ->
"";
concat_1(Name) ->
erlang:error({badarg, Name}).
-spec is_valid(package_name()) -> boolean().
is_valid(Name) when is_atom(Name) ->
is_valid_1(atom_to_list(Name));
is_valid([$. | _]) ->
false;
is_valid(Name) ->
is_valid_1(Name).
is_valid_1([$.]) -> false;
is_valid_1([$., $. | _]) -> false;
is_valid_1([H | T]) when is_integer(H), H >= 0 ->
is_valid_1(T);
is_valid_1([]) -> true;
is_valid_1(_) -> false.
-spec split(package_name()) -> [string()].
split(Name) when is_atom(Name) ->
split_1(atom_to_list(Name), []);
split(Name) ->
split_1(Name, []).
split_1([$. | T], Cs) ->
[lists:reverse(Cs) | split_1(T, [])];
split_1([H | T], Cs) when is_integer(H), H >= 0 ->
split_1(T, [H | Cs]);
split_1([], Cs) ->
[lists:reverse(Cs)];
split_1(_, _) ->
erlang:error(badarg).
%% This is equivalent to testing if `split(Name)' yields a list of
%% length larger than one (i.e., if the name can be split into two or
%% more segments), but is cheaper.
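%% A hedged illustration of that equivalence (the example name is made up):
%%
%%   1> packages:is_segmented("kernel.stdlib").
%%   true
%%   2> length(packages:split("kernel.stdlib")) > 1.
%%   true
%%   3> packages:is_segmented(kernel).
%%   false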
-spec is_segmented(package_name()) -> boolean().
is_segmented(Name) when is_atom(Name) ->
is_segmented_1(atom_to_list(Name));
is_segmented(Name) ->
is_segmented_1(Name).
is_segmented_1([$. | _]) -> true;
is_segmented_1([H | T]) when is_integer(H), H >= 0 ->
is_segmented_1(T);
is_segmented_1([]) -> false;
is_segmented_1(_) ->
erlang:error(badarg).
-spec last(package_name()) -> string().
last(Name) ->
last_1(split(Name)).
last_1([H]) -> H;
last_1([_ | T]) -> last_1(T).
-spec first(package_name()) -> [string()].
first(Name) ->
first_1(split(Name)).
first_1([H | T]) when T =/= [] -> [H | first_1(T)];
first_1(_) -> [].
-spec strip_last(package_name()) -> string().
strip_last(Name) ->
concat(first(Name)).
%% This finds all modules available for a given package, using the
%% current code server search path. (There is no guarantee that the
%% modules are loadable; only that the object files exist.)
-spec find_modules(package_name()) -> [string()].
find_modules(P) ->
find_modules(P, code:get_path()).
-spec find_modules(package_name(), [string()]) -> [string()].
find_modules(P, Paths) ->
P1 = filename:join(packages:split(P)),
find_modules(P1, Paths, code:objfile_extension(), sets:new()).
find_modules(P, [Path | Paths], Ext, S0) ->
case file:list_dir(filename:join(Path, P)) of
{ok, Fs} ->
Fs1 = [F || F <- Fs, filename:extension(F) =:= Ext],
S1 = lists:foldl(fun (F, S) ->
F1 = filename:rootname(F, Ext),
sets:add_element(F1, S)
end,
S0, Fs1),
find_modules(P, Paths, Ext, S1);
_ ->
find_modules(P, Paths, Ext, S0)
end;
find_modules(_P, [], _Ext, S) ->
sets:to_list(S). | dependencies/otp/r15b03-1/lib/kernel/src/packages.erl | 0.549761 | 0.525795 | packages.erl | starcoder |
% Licensed under the Apache License, Version 2.0 (the "License"); you may not
% use this file except in compliance with the License. You may obtain a copy of
% the License at
%
% http://www.apache.org/licenses/LICENSE-2.0
%
% Unless required by applicable law or agreed to in writing, software
% distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
% WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
% License for the specific language governing permissions and limitations under
% the License.
% This module implements the choose subtree algorithm for the vtree. It is an
% implementation of the choose subtree algorithm described in:
% A Revised R * -tree in Comparison with Related Index Structures
% by <NAME>, <NAME>
-module(vtree_choose).
-include("vtree.hrl").
-export([choose_subtree/3]).
-ifdef(makecheck).
-compile(nowarn_export_all).
-compile(export_all).
-endif.
-spec choose_subtree(Nodes :: [split_node()], NewMbb :: mbb(),
Less :: lessfun()) -> split_node().
choose_subtree(Nodes, NewMbb, Less) ->
% Return all nodes that completely enclose the node that will be inserted,
% hence don't need any expansion.
Cov = lists:filter(fun({Mbb, _}) ->
vtree_util:within_mbb(NewMbb, Mbb, Less)
end, Nodes),
case Cov of
% Any node needs to be expanded to include the new one
[] ->
Sorted =
lists:sort(
fun({MbbA, _}, {MbbB, _}) ->
DeltaA = calc_delta_perimeter(MbbA, NewMbb, Less),
DeltaB = calc_delta_perimeter(MbbB, NewMbb, Less),
DeltaA =< DeltaB
end, Nodes),
case limit_nodes(Sorted, NewMbb, Less) of
[] ->
hd(Sorted);
Limited ->
process_limited(Limited, NewMbb, Less)
end;
% There are nodes that don't need any expension to include the
% newly added node
Cov ->
min_size(Cov)
end.
-spec limit_nodes(Nodes :: [split_node()], NewMbb :: mbb(), Less :: lessfun())
-> [split_node()].
limit_nodes([{FirstMbb, _}|Nodes], NewMbb, Less) ->
% Compute, for every other node, the perimetric overlap delta that results
% if the new node is assigned to the first entry.
OverlapFirst = [{calc_delta_common_overlap_perimeter(
FirstMbb, J, NewMbb, Less), Node}
|| {J, _}=Node <- Nodes],
% Extract the first p nodes, where the last kept node is the one whose
% perimetric overlap would increase from the assignment of the new
% node to the first node. I.e. all nodes at the end of the list that
% don't increase the perimetric overlap are dropped.
Limited = lists:reverse(
lists:dropwhile(fun({Overlap, _}) ->
Overlap == 0
end, lists:reverse(OverlapFirst))),
% Strip off the overlap and return a plain list of nodes
[Node || {_, Node} <- Limited].
% Go on with finding an optimum candidate with a limited subset of the nodes
-spec process_limited(Limited :: [split_node()], NewMbb :: mbb(),
Less :: lessfun()) -> split_node().
process_limited(Limited, NewMbb, Less) ->
CheckComp = case any_zero_volume(Limited, NewMbb, Less) of
true -> check_comp(perimeter, Limited, NewMbb, Less);
false -> check_comp(volume, Limited, NewMbb, Less)
end,
case CheckComp of
{success, FinalNode} ->
FinalNode;
% All nodes lead to additional overlap with the other
% nodes, when the new node is assigned to one of them,
% hence use the one that leads to the minimum overlap
{overlap, Candidates} ->
{_, {_Overlap, FinalNode}} = vtree_util:find_min_value(
fun({Overlap, _}) ->
Overlap
end, Candidates),
FinalNode
end.
% It loops through the `Nodes` to find an ideal candidate that the `NewMbb`
% can be assigned to. Take the node where the assignment of the new MBB leads
% to minimal overlap with the other nodes.
% The whole algorithm generally follows a depth first approach but adds
% a lot of complexity due to optimizations in case there is no node that
% doesn't lead to increased intersection with other nodes.
% For more details see the corresponding function CheckComp, in Section 3
% of the RR*-tree paper.
-spec check_comp(Aggregate :: perimeter | volume, Nodes :: [split_node()],
NewMbb :: mbb(), Less :: lessfun()) ->
{overlap, [{number(), split_node()}]} |
{success, split_node()}.
check_comp(Aggregate, Nodes, NewMbb, Less) ->
case check_comp(Aggregate, 1, NewMbb, [], [], Less, Nodes) of
{overlap, OverlapAcc, _} ->
{overlap, OverlapAcc};
{success, _} = Success ->
Success
end.
-spec check_comp(Aggregate :: perimeter | volume, Index :: pos_integer(),
NewMbb :: mbb(), OverlapAcc :: [{number(), split_node()}],
CandAcc :: [pos_integer()], Less :: lessfun(),
OrigNodes :: [split_node()]) ->
{overlap, [{number(), split_node()}], [pos_integer()]}
| {success, split_node()}.
check_comp(Aggregate, Index, NewMbb, OverlapAcc, CandAcc, Less, OrigNodes) ->
CandAcc2 = [Index|CandAcc],
T = lists:nth(Index, OrigNodes),
Result = check_comp0(Aggregate, OrigNodes, 0, 1, Index, NewMbb, OverlapAcc,
CandAcc2, Less, T, OrigNodes),
case Result of
{overlap, Overlap, OverlapAcc2, CandAcc3} ->
OverlapAcc3 = [{Overlap, T}|OverlapAcc2],
{overlap, OverlapAcc3, CandAcc3};
{success, _} = Success ->
Success
end.
% `check_comp0` corresponds to the for-loop in CheckComp, Section 3.
% Loop complete, the total overlap is still 0, hence use this node
-spec check_comp0(Aggregate :: perimeter | volume, Nodes :: [split_node()],
Overlap :: number(), Counter :: pos_integer(),
Index :: pos_integer(), NewMbb :: mbb(),
OverlapAcc :: [{number(), split_node()}],
CandAcc :: [pos_integer()], Less :: lessfun(),
T :: split_node(), OrigNodes :: [split_node()]) ->
{overlap, number(), [{number(), split_node()}],
[pos_integer()]} |
{success, split_node()}.
check_comp0(_Aggregate, [], Overlap, _Counter, _Index, _NewMbb, _OverlapAcc,
_CandAcc, _Less, T, _OrigNodes) when Overlap == 0 ->
{success, T};
% Loop complete, there is some overlap, hence report it back
check_comp0(_Aggregate, [], Overlap, _Counter, _Index, _NewMbb, OverlapAcc,
CandAcc, _Less, _T, _OrigNodes) ->
{overlap, Overlap, OverlapAcc, CandAcc};
% Skip the case when the current item would be the one we are currently
% comparing to
check_comp0(Aggregate, [_J|Nodes], Overlap, Counter, Index, NewMbb, OverlapAcc,
CandAcc, Less, T, OrigNodes) when Index == Counter ->
check_comp0(Aggregate, Nodes, Overlap, Counter+1, Index, NewMbb, OverlapAcc,
CandAcc, Less, T, OrigNodes);
% The normal case
check_comp0(Aggregate, [J|Nodes], Overlap, Counter, Index, NewMbb, OverlapAcc,
CandAcc, Less, T, OrigNodes) ->
{TMbb, _} = T,
{JMbb, _} = J,
NewOverlap = case Aggregate of
perimeter -> calc_delta_common_overlap_perimeter(
TMbb, JMbb, NewMbb, Less);
volume -> calc_delta_common_overlap_volume(
TMbb, JMbb, NewMbb, Less)
end,
Overlap2 = Overlap + NewOverlap,
Result = case NewOverlap /= 0 andalso not lists:member(Counter, CandAcc) of
true ->
check_comp(Aggregate, Counter, NewMbb, OverlapAcc,
CandAcc, Less, OrigNodes);
false ->
% Just keep on looping with the current index
{overlap, OverlapAcc, CandAcc}
end,
case Result of
{success, _} = Success ->
Success;
{overlap, OverlapAcc2, CandAcc2} ->
check_comp0(Aggregate, Nodes, Overlap2, Counter+1, Index, NewMbb,
OverlapAcc2, CandAcc2, Less, T, OrigNodes)
end.
% Return true if the assignment of the `NewMbb` to the supplied nodes would
% still lead to at least one zero volume node
-spec any_zero_volume(Nodes :: [split_node()], NewMbb :: mbb(),
Less :: lessfun()) -> boolean().
any_zero_volume([], _NewMbb, _Less) ->
false;
any_zero_volume([{Mbb,_}|T], NewMbb, Less) ->
Merged = vtree_util:calc_mbb([Mbb, NewMbb], Less),
Volume = vtree_util:calc_volume(Merged),
case Volume == 0 of
true ->
true;
_ ->
any_zero_volume(T, NewMbb, Less)
end.
% Calculates the delta perimeter of two MBBs, i.e. the expansion of the
% perimeter when you merge the `NewMbb` with the `OriginalMbb`.
-spec calc_delta_perimeter(OriginalMbb :: mbb(), NewMbb :: mbb(),
Less :: lessfun()) -> number().
calc_delta_perimeter(OriginalMbb, NewMbb, Less) ->
Merged = vtree_util:calc_mbb([OriginalMbb, NewMbb], Less),
vtree_util:calc_perimeter(Merged) - vtree_util:calc_perimeter(OriginalMbb).
% The common overlap (by perimeter) of MBB `T` and MBB `J` after the new MBB
% `NewMbb` was added, i.e. the difference between the intersection of `J`
% with (`T` merged with `NewMbb`) and the intersection of `J` with `T`.
% See section 3, Definition 1.
-spec calc_delta_common_overlap_perimeter(T :: mbb(), J :: mbb(),
NewMbb :: mbb(), Less :: lessfun())
-> number().
calc_delta_common_overlap_perimeter(T, J, NewMbb, Less) ->
Merged = vtree_util:calc_mbb([T, NewMbb], Less),
MergedPerimeter = case vtree_util:intersect_mbb(Merged, J, Less) of
overlapfree -> 0;
MergedMbb -> vtree_util:calc_perimeter(MergedMbb)
end,
OldPerimeter = case vtree_util:intersect_mbb(T, J, Less) of
overlapfree -> 0;
OldMbb -> vtree_util:calc_perimeter(OldMbb)
end,
MergedPerimeter - OldPerimeter.
% The same as `calc_delta_common_overlap_perimeter`, but using the volume
% instead of the perimeter.
% See section 3, Definition 1.
-spec calc_delta_common_overlap_volume(T :: mbb(), J :: mbb(),
NewMbb :: mbb(), Less :: lessfun()) ->
number().
calc_delta_common_overlap_volume(T, J, NewMbb, Less) ->
Merged = vtree_util:calc_mbb([T, NewMbb], Less),
MergedVolume = case vtree_util:intersect_mbb(Merged, J, Less) of
overlapfree -> 0;
MergedMbb -> vtree_util:calc_volume(MergedMbb)
end,
OldVolume = case vtree_util:intersect_mbb(T, J, Less) of
overlapfree -> 0;
OldMbb -> vtree_util:calc_volume(OldMbb)
end,
MergedVolume - OldVolume.
% Returns the node (from a list of nodes) that has the least volume. If at
% least one node has zero volume, the one with the least perimeter is
% returned instead.
-spec min_size(Nodes :: [split_node()]) -> split_node().
min_size(Nodes) ->
{MinVolume, MinNode} = vtree_util:find_min_value(
fun({Mbb, _}) ->
vtree_util:calc_volume(Mbb)
end, Nodes),
case MinVolume == 0 of
% There's at least one node with zero volume, hence use the
% perimeter instead
true ->
{_MinPerim, MinNode2} = vtree_util:find_min_value(
fun({Mbb, _}) ->
vtree_util:calc_perimeter(Mbb)
end, Nodes),
MinNode2;
false ->
MinNode
end.

%% File: vtree/src/vtree_choose.erl
%%% @author <NAME> <<EMAIL>>
%%% @copyright (C) 2021, <NAME>
%%% @doc
%%% A particle for a minimizing particle swarm optimization problem.
%%% @end
%%% Created : 25 Apr 2021 by <NAME> <<EMAIL>>
-module(particle).
-export([new/3, new/5, step/2, position/1, value/1, state/1]).
-export_type([particle/0, position/0, velocity/0,
value/0, objective/0, state/0]).
-type vector() :: [float()].
-type position() :: vector().
-type velocity() :: vector().
-type value() :: term().
-type objective() :: fun((position()) -> value()).
-type state() :: {position(), value()}.
-record(particle, {position = [0.0] :: position(),
velocity = [0.0] :: velocity(),
value :: value(),
objective :: objective(),
best_position = [0.0] :: position(),
best_value = 0.0 :: value(),
max_position :: undefined | position(),
min_position :: undefined | position()}).
-opaque particle() :: #particle{}.
-define(PHI, 1.49618).
-define(OMEGA, 0.7298).
%% @doc
%% Initialize a new particle at `Position'. The particle's initial
%% velocity is `Velocity' and its objective function is computed by
%% `ObjectiveFun'. Immediately upon creation (before returning) the
%% particle's initial value is calculated by calling
%% ``ObjectiveFun(Position)''.
%% @end
-spec new(Position :: position(),
Velocity :: velocity(),
ObjectiveFun :: fun((position()) -> value())) -> particle().
new(Position, Velocity, ObjectiveFun) ->
Value = ObjectiveFun(Position),
#particle{position = Position,
velocity = Velocity,
value = Value,
objective = ObjectiveFun,
best_position = Position,
best_value = Value}.
%% @doc Create a new particle with upper and lower bounds on its position.
-spec new(Position :: position(),
Velocity :: velocity(),
ObjectiveFun :: fun((position()) -> value()),
MinPosition :: position(),
MaxPosition :: position()) -> particle().
new(Position, Velocity, ObjectiveFun, MinPosition, MaxPosition) ->
Particle = new(Position, Velocity, ObjectiveFun),
Particle#particle{min_position = MinPosition, max_position = MaxPosition}.
%% @doc
%% Apply the original PSO acceleration algorithm to `Velocity'.
%% @end
-spec accelerate(Velocity :: velocity(),
Position :: position(),
PersonalBest :: position(),
Neighbors :: [{position(), value()}]) -> velocity().
accelerate(Velocity, CurrentPosition, BestPosition, Neighbors) ->
[{GlobalBest, _}|_] = lists:keysort(2, Neighbors),
SelfAcceleration = vector:multiply(
?PHI * rand:uniform_real(),
vector:subtract(BestPosition, CurrentPosition)),
GlobalAcceleration = vector:multiply(
?PHI * rand:uniform_real(),
vector:subtract(GlobalBest, CurrentPosition)),
vector:add(
vector:multiply(?OMEGA, Velocity),
vector:add(SelfAcceleration, GlobalAcceleration)).
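%% The computation above implements the standard constricted PSO velocity
%% update,
%%
%%   V' = ?OMEGA * V + ?PHI * R1 * (PersonalBest - X) + ?PHI * R2 * (GlobalBest - X),
%%
%% where R1 and R2 are fresh uniform random numbers in (0, 1) drawn via
%% rand:uniform_real/0, and ?OMEGA/?PHI are Clerc's constriction coefficients.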
%% @doc Return the current position of `Particle'.
-spec position(Particle :: particle()) -> position().
position(#particle{position = Position}) -> Position.
%% @doc Return the particle's value.
-spec value(Particle :: particle()) -> value().
value(#particle{value = Value}) -> Value.
%% @doc Return the position and value of `Particle'.
-spec state(Particle :: particle()) -> {position(), value()}.
state(Particle) ->
{position(Particle), value(Particle)}.
%% @doc Evaluate the objective function at `Position' and update the
%% particle's value, position and personal best accordingly.
-spec eval(Particle :: particle(), Position :: position()) -> particle().
eval(Particle = #particle{best_value = BestValue,
objective = ObjectiveFun},
Position) ->
NewValue = ObjectiveFun(Position),
if
NewValue < BestValue ->
Particle#particle{best_value = NewValue,
best_position = Position,
value = NewValue,
position = Position};
NewValue >= BestValue ->
Particle#particle{value = NewValue, position = Position}
end.
%% @doc
%% Evaluate a single iteration of the PSO algorithm. The velocity of
%% `Particle' is updated under the influence of `Neighbors' and the
%% new velocity is applied to update the particle's position.
%%
%% `Neighbors' should contain the best solution found so far by each
%% of the particles that influences this particle.
%% @end
-spec step(Particle :: particle(), Neighbors :: [{position(), value()}]) -> particle().
step(Particle, Neighbors) ->
NewVelocity = limit_velocity(
Particle#particle.max_position,
Particle#particle.min_position,
accelerate(Particle#particle.velocity,
Particle#particle.position,
Particle#particle.best_position,
Neighbors)),
NewPosition = limit_position(
Particle#particle.max_position,
Particle#particle.min_position,
vector:add(Particle#particle.position, NewVelocity)),
eval(Particle#particle{velocity = NewVelocity}, NewPosition).
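%% A minimal usage sketch (mirroring the eunit tests below): create a
%% particle minimizing abs/1 at position 10.0 and step it once towards its
%% only neighbor's best position 0.0:
%%
%%   P0 = particle:new([10.0], [0.0], fun([X]) -> abs(X) end),
%%   P1 = particle:step(P0, [{[0.0], 0.0}]),
%%   %% position(P1) has decreased from 10.0 towards the neighbor's best.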
sign(X) when X < 0 -> -1;
sign(X) when X > 0 -> 1;
sign(X) when X == 0 -> 0.
limit_velocity(undefined, undefined, Velocity) ->
Velocity;
limit_velocity(Max, Min, Velocity)
when (Max =/= undefined) and (Min =/= undefined) ->
Bounds = [abs(X) || X <- vector:to_list(vector:subtract(Max, Min))],
vector:from_list([if abs(X) > Limit -> sign(X) * Limit;
abs(X) =< Limit -> X
end || {X, Limit} <- lists:zip(Velocity, Bounds)]).
limit_position(undefined, undefined, Velocity) ->
Velocity;
limit_position(Max, Min, Velocity)
when (Max =/= undefined) and (Min =/= undefined) ->
vector:from_list(
[if X > XMax -> XMax;
X < XMin -> XMin;
true -> X
end || {X, XMax, XMin} <- lists:zip3(Velocity, Max, Min)]).
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
step_test_() ->
Particle = particle:new([10.0], [0.0], fun([X]) -> abs(X) end),
[{"Particle moves towards its only neighbor to the right",
fun() ->
[NewPosition] = particle:position(
particle:step(Particle, [{[30.0], 30.0}])),
?assert(NewPosition > 10.0)
end},
{"Particle moves towards its only neighbor to the left",
fun() ->
[NewPosition] = particle:position(
particle:step(Particle, [{[-10.0], 10.0}])),
?assert(NewPosition < 10.0)
end},
{"Particle moves towards its best neighbor (to the right)",
fun() ->
[NewPosition] =
particle:position(
particle:step(
Particle,
[{[-10.0], 10.0}, {[30.0], 1.0}, {[0.0], 2.0}])),
?assert(NewPosition > 10.0)
end},
{"Particle moves towards its best neighbor (to the left)",
fun() ->
[NewPosition] =
particle:position(
particle:step(
Particle,
[{[-10.0], 0.0}, {[30.0], 1.0}, {[10.0], 10.0}])),
?assert(NewPosition < 10.0)
end}].
limit_position_test_() ->
Max = [1.0, 0.0, -1.0],
Min = [0.0, -1.0, -2.0],
[?_assertEqual([0.5, -0.5, -1.5],
limit_position(Max, Min, [0.5, -0.5, -1.5])),
?_assertEqual(Max, limit_position(Max, Min, Max)),
?_assertEqual(Min, limit_position(Max, Min, Min)),
?_assertEqual(Max, limit_position(Max, Min, [X + 1 || X <- Max])),
?_assertEqual(Min, limit_position(Max, Min, [X - 1 || X <- Min]))].
limit_velocity_test_() ->
Max = [1.0, 0.0, -1.0, 1.0],
Min = [0.0, -1.0, -2.0, -1.0],
[[?_assertEqual(Expected, X)
|| {X, Expected} <- lists:zip(
vector:to_list(
limit_velocity(
Max, Min,
vector:from_list([2.0, 1.1, -1.1, 3.0]))),
[1.0, 1.0, -1.0, 2.0])],
[?_assertEqual(Expected, X)
|| {X, Expected} <- lists:zip(
vector:to_list(
limit_velocity(
Max, Min,
vector:from_list([-2.0, 0.0, 101, -3.0]))),
[-1.0, 0.0, 1.0, -2.0])],
?_assertEqual(vector:from_list([1.0, 1.0, 1.0, 1.0]),
limit_velocity(Max, Min,
vector:from_list([1.0, 1.0, 1.0, 1.0]))),
?_assertEqual(vector:from_list([0.0, 0.0, 0.0, 0.0]),
limit_velocity(Max, Min,
vector:from_list([0.0, 0.0, 0.0, 0.0])))].
-endif.

%% File: src/particle.erl
%%======================================================================
%%
%% Leo Commons
%%
%% Copyright (c) 2012-2015 Rakuten, Inc.
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% ---------------------------------------------------------------------
%% Leo Commons - Miscellaneous
%%
%% @doc leo_misc is miscellaneous utilities
%% @reference https://github.com/leo-project/leo_commons/blob/master/src/leo_misc.erl
%% @end
%%======================================================================
-module(leo_misc).
-author('<NAME>').
-export([node_existence/1, node_existence/2,
get_value/2, get_value/3,
binary_tokens/2,
init_env/0, get_env/2, get_env/3, set_env/3,
any_to_binary/1
]).
-include("leo_commons.hrl").
-include_lib("eunit/include/eunit.hrl").
%% @doc check a node existence.
%%
-spec(node_existence(Node) ->
Existence::boolean() when Node::atom()).
node_existence(Node) ->
node_existence(Node, 5000).
%% @doc check a node existence.
%%
-spec(node_existence(Node, Timeout) ->
Existence::boolean() when Node::atom(),
Timeout::pos_integer()).
node_existence(Node, Timeout) ->
(Node == rpc:call(Node, erlang, node, [], Timeout)).
%% @doc Retrieve a value from prop-lists
%%
-spec(get_value(Key, Props) ->
undefined | any() when Key::any(),
Props::[tuple()]).
get_value(Key, Props) ->
get_value(Key, Props, undefined).
%% @doc Retrieve a value from prop-lists
%%
-spec(get_value(Key, Props, Default) ->
undefined | any() when Key::any(),
Props::[tuple()],
Default::any()).
get_value(Key, Props, Default) ->
case lists:keyfind(Key, 1, Props) of
false ->
Default;
{_, Value} ->
Value
end.
%% @doc Split binary data into tokens by a delimiter (a leading empty token is dropped)
%%
-spec(binary_tokens(Bin, Delimiter) ->
Tokens::[binary()] when Bin::binary(),
Delimiter::binary()).
binary_tokens(Bin, Delimiter) ->
case binary:split(Bin, Delimiter, [global,trim]) of
[<<>>|Rest] ->
Rest;
Tokens ->
Tokens
end.
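%% Example (illustrative):
%%
%%   1> leo_misc:binary_tokens(<<"a/b/c">>, <<"/">>).
%%   [<<"a">>,<<"b">>,<<"c">>]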
%% @doc Initialize table of env
%%
-spec(init_env() ->
ok).
init_env() ->
case ets:info(?ETS_ENV_TABLE) of
undefined ->
case ets:new(?ETS_ENV_TABLE,
[named_table, set, public, {read_concurrency, true}]) of
?ETS_ENV_TABLE ->
ok;
_ ->
erlang:error({error, 'could_not_create_ets_table'})
end;
_ ->
ok
end.
%% @doc Returns the value of the configuration parameter Par application from ETS
%%
-spec(get_env(AppName, Key) ->
{ok, any()} | undefined when AppName::atom(),
Key::any()).
get_env(AppName, Key) ->
get_env(AppName, Key, undefined).
%% @doc Returns the value of the configuration parameter Par application from ETS
%%
-spec(get_env(AppName, Key, Default) ->
{ok, any()} | undefined when AppName::atom(),
Key::any(),
Default::any()).
get_env(AppName, Key, Default) ->
case ets:lookup(?ETS_ENV_TABLE, {env, AppName, Key}) of
[{_, Val}] ->
{ok, Val};
_ ->
Default
end.
%% @doc Sets the value of the configuration parameter Par for Application to ETS
%%
-spec(set_env(AppName, Key, Val) ->
ok when AppName::atom(),
Key::any(),
Val::any()).
set_env(AppName, Key, Val) ->
_ = ets:insert(?ETS_ENV_TABLE, {{env, AppName, Key}, Val}),
ok.
%% @doc Convert value from any-type to binary
%%
-spec(any_to_binary(V) ->
binary() when V::any()).
any_to_binary(V) when is_binary(V) ->
V;
any_to_binary(V) when is_atom(V) ->
list_to_binary(atom_to_list(V));
any_to_binary(V) when is_list(V) ->
list_to_binary(V);
any_to_binary(V) when is_integer(V) ->
list_to_binary(integer_to_list(V));
any_to_binary(V) ->
term_to_binary(V).

%% File: deps/leo_commons/src/leo_misc.erl
%% Copyright Ericsson AB 1996-2020. All Rights Reserved.
%% Copyright (c) 2020 Facebook, Inc. and its affiliates.
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
-module(erlt_shape).
-export([parse_transform/2]).
parse_transform(Forms, _Options) ->
erlt_ast:postwalk(Forms, fun(Node, Ctx) -> rewrite(Node, Ctx) end).
rewrite({type, Line, closed_shape, Fields}, type) ->
MapTypeFields = to_typed_map_fields(Fields),
{type, Line, map, MapTypeFields};
rewrite({type, Line, open_shape, Fields, _Var}, type) ->
MapTypeFields = to_typed_map_fields(Fields),
{type, Line, map, MapTypeFields ++ [generic_open_shape_field_type(Line)]};
rewrite({shape, Line, Fields}, pattern) ->
MapFields = to_map_fields(map_field_exact, Fields),
{map, Line, MapFields};
rewrite({shape, Line, Fields}, Ctx) when Ctx =:= expr orelse Ctx =:= guard ->
MapFields = to_map_fields(map_field_assoc, Fields),
{map, Line, MapFields};
rewrite({shape_update, Line, Expr, Fields}, Ctx) when Ctx =:= expr orelse Ctx =:= guard ->
MapFields = to_map_fields(map_field_assoc, Fields),
{map, Line, Expr, MapFields};
rewrite({shape_field, Line, Expr, Field}, Ctx) when Ctx =:= expr orelse Ctx =:= guard ->
{call, Line, {remote, Line, {atom, Line, erlang}, {atom, Line, map_get}}, [Field, Expr]};
rewrite(Other, _) ->
Other.
generic_open_shape_field_type(Line) ->
{type, Line, map_field_assoc, [{type, Line, atom, []}, {type, Line, any, []}]}.
to_typed_map_fields(Fields) ->
[
{type, Line, map_field_exact, [Key, Value]}
|| {field_definition, Line, Key, undefined, Value} <- Fields
].
to_map_fields(TypeOfMapField, Fields) ->
[{TypeOfMapField, Line, Key, Value} || {field, Line, Key, Value} <- Fields]. | erltc/src/erlt_shape.erl | 0.668447 | 0.416797 | erlt_shape.erl | starcoder |
% @doc API for the
% <a href="https://reference.digilentinc.com/reference/pmod/pmodhygro/start">
% PmodHYGRO
% </a>.
%
% Start the driver with
% ```
% 1> grisp:add_device(i2c, pmod_hygro).
% '''
% @end
-module(pmod_hygro).
-behaviour(gen_server).
% API
-export([start_link/2]).
-export([temp/0]).
-export([humid/0]).
-export([measurements/0]).
% Callbacks
-export([init/1]).
-export([handle_call/3]).
-export([handle_cast/2]).
-export([handle_info/2]).
-export([code_change/3]).
-export([terminate/2]).
-define(DEVICE_ADR, 16#40).
-define(TEMP_REGISTER, 16#00).
-define(DELAY_TIME, 15).
%
%--- Records -------------------------------------------------------------------
%
-record(state, {}).
%--- API -----------------------------------------------------------------------
% @private
start_link(Slot, _Opts) ->
gen_server:start_link({local, ?MODULE}, ?MODULE, Slot, []).
% @doc Measure the temperature in °C.
%
% === Example ===
% ```
% 2> pmod_hygro:temp().
% [{temp,24.6746826171875}]
% '''
-spec temp() -> [{temp, float()}].
temp() ->
gen_server:call(?MODULE, temp).
% @doc Measure the humidity in %.
%
% === Example ===
% ```
% 2> pmod_hygro:humid().
% [{humid,50.225830078125}]
% '''
-spec humid() -> [{humid, float()}].
humid() ->
gen_server:call(?MODULE, humid).
% @doc Measure the temperature and humidity.
%
% === Example ===
% ```
% 2> pmod_hygro:measurements().
% [{temp,24.52362060546875},{humid,50.823974609375}]
% '''
-spec measurements() -> [{temp, float()}|{humid, float()}].
measurements() ->
gen_server:call(?MODULE, measurements).
%--- Callbacks -----------------------------------------------------------------
% @private
init(Slot) ->
grisp_devices:register(Slot, ?MODULE),
{ok, #state{}}.
% @private
handle_call(temp, _From, State) ->
{ok, <<T:14/unsigned-big, _:2>>} = device_request(2),
Temp = evaluate_temp(T),
{reply, [{temp, Temp}], State};
handle_call(humid, _From, State) ->
{ok, <<_:14, _:2, H:14/unsigned-big, _:2>>} = device_request(4),
Humid = evaluate_humid(H),
{reply, [{humid, Humid}], State};
handle_call(measurements, _From, State) ->
{ok, <<T:14/unsigned-big, _:2, H:14/unsigned-big, _:2>>} = device_request(4),
Temp = evaluate_temp(T),
Humid = evaluate_humid(H),
{reply, [{temp, Temp}, {humid, Humid}], State}.
% @private
handle_cast(Request, _State) -> error({unknown_cast, Request}).
% @private
handle_info(Info, _State) -> error({unknown_info, Info}).
% @private
code_change(_OldVsn, State, _Extra) -> {ok, State}.
% @private
terminate(_Reason, _State) -> ok.
%--- Internal ------------------------------------------------------------------
device_request(BytesToRead) ->
Response = grisp_i2c:msgs([?DEVICE_ADR, {write, <<?TEMP_REGISTER>>},
{sleep, ?DELAY_TIME},
{read, BytesToRead}]),
{ok, Response}.
evaluate_temp(T) ->
(T / 16384) * 165 - 40.
evaluate_humid(H) ->
(H / 16384) * 100.

%% File: src/pmod_hygro.erl
-module(day3).
-behaviour(aoc).
-include_lib("eunit/include/eunit.hrl").
-export([input_type/0, parse_input/1, p1/1, p2/1]).
input_type() -> lines.
parse_input(Lines) ->
lists:map(fun binary_to_list/1, Lines).
%% @doc
%% The submarine has been making some odd creaking noises, so you ask it to produce
%% a diagnostic report just in case.
%%
%% The diagnostic report (your puzzle input) consists of a list of binary numbers
%% which, when decoded properly, can tell you many useful things about the
%% conditions of the submarine. The first parameter to check is the power
%% consumption.
%%
%% You need to use the binary numbers in the diagnostic report to generate two new
%% binary numbers (called the gamma rate and the epsilon rate). The power
%% consumption can then be found by multiplying the gamma rate by the epsilon rate.
%%
%% Each bit in the gamma rate can be determined by finding the most common bit in
%% the corresponding position of all numbers in the diagnostic report. For example,
%% given the following diagnostic report:
%%
%% 00100
%% 11110
%% 10110
%% 10111
%% 10101
%% 01111
%% 00111
%% 11100
%% 10000
%% 11001
%% 00010
%% 01010
%%
%% Considering only the first bit of each number, there are five 0 bits and seven 1
%% bits. Since the most common bit is 1, the first bit of the gamma rate is 1.
%%
%% The most common second bit of the numbers in the diagnostic report is 0, so the
%% second bit of the gamma rate is 0.
%%
%% The most common value of the third, fourth, and fifth bits are 1, 1, and 0,
%% respectively, and so the final three bits of the gamma rate are 110.
%%
%% So, the gamma rate is the binary number 10110, or 22 in decimal.
%%
%% The epsilon rate is calculated in a similar way; rather than use the most common
%% bit, the least common bit from each position is used. So, the epsilon rate is
%% 01001, or 9 in decimal. Multiplying the gamma rate (22) by the epsilon rate (9)
%% produces the power consumption, 198.
%%
%% Use the binary numbers in your diagnostic report to calculate the gamma rate and
%% epsilon rate, then multiply them together. What is the power consumption of the
%% submarine? (Be sure to represent your answer in decimal, not binary.)
p1(Readings) ->
{Gamma, Epsilon} = decode_power(mcbs(Readings)),
Gamma * Epsilon.
mcbs([First | _] = Readings) ->
[
case O >= Z of true -> $1; false -> $0 end
||
{Z, O} <- lists:foldl(fun append/2, [{0, 0} || _ <- First], Readings)
].
append(Reading, Counts) ->
append(Reading, Counts, []).
append([], [], Next) ->
lists:reverse(Next);
append([$0 | Bits], [ { Z, O } | Prev ], Next) ->
append(Bits, Prev, [ { Z + 1, O } | Next ] );
append([$1 | Bits], [ { Z, O } | Prev ], Next) ->
append(Bits, Prev, [ { Z, O + 1 } | Next ] ).
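%% For instance, mcbs(["110", "100", "101"]) yields "100": the first column
%% is all ones, while the other two columns are majority zero. Ties resolve
%% to $1 (O >= Z), matching the oxygen-generator criterion in part two.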
decode_power(MCBs) ->
decode_power(lists:reverse(MCBs), {0, 0}, 1).
decode_power([], {Gamma, Epsilon}, _Exponent) -> {Gamma, Epsilon};
decode_power([$0 | MCBs], {G, E}, Exp) -> decode_power(MCBs, {G, E + Exp}, Exp * 2);
decode_power([$1 | MCBs], {G, E}, Exp) -> decode_power(MCBs, {G + Exp, E}, Exp * 2).
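%% decode_power/1 walks the bit string from the least significant bit,
%% adding the current power of two to gamma on a 1 and to epsilon on a 0,
%% e.g. decode_power("10110") yields {22, 9}, matching the walkthrough above.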
-ifdef(EUNIT).
example() -> [
"00100",
"11110",
"10110",
"10111",
"10101",
"01111",
"00111",
"11100",
"10000",
"11001",
"00010",
"01010"
].
p1_test() ->
?assertEqual(198, p1(example())).
-endif.
%% @doc
%% Next, you should verify the life support rating, which can be determined by
%% multiplying the oxygen generator rating by the CO2 scrubber rating.
%%
%% Both the oxygen generator rating and the CO2 scrubber rating are values that can
%% be found in your diagnostic report - finding them is the tricky part. Both
%% values are located using a similar process that involves filtering out values
%% until only one remains. Before searching for either rating value, start with the
%% full list of binary numbers from your diagnostic report and consider just the
%% first bit of those numbers. Then:
%%
%% - Keep only numbers selected by the bit criteria for the type of rating value for
%% which you are searching. Discard numbers which do not match the bit criteria.
%%
%% - If you only have one number left, stop; this is the rating value for which
%% you are searching.
%%
%% - Otherwise, repeat the process, considering the next
%% bit to the right.
%%
%% The bit criteria depends on which type of rating value you want to find:
%%
%% - To find oxygen generator rating, determine the most common value (0 or 1) in
%% the current bit position, and keep only numbers with that bit in that position.
%% If 0 and 1 are equally common, keep values with a 1 in the position being
%% considered.
%% - To find CO2 scrubber rating, determine the least common value (0
%% or 1) in the current bit position, and keep only numbers with that bit in that
%% position. If 0 and 1 are equally common, keep values with a 0 in the position
%% being considered.
%%
%% For example, to determine the oxygen generator rating value using the same
%% example diagnostic report from above:
%%
%% - Start with all 12 numbers and consider only the first bit of each number.
%% There are more 1 bits (7) than 0 bits (5), so keep only the 7 numbers with a 1
%% in the first position: 11110, 10110, 10111, 10101, 11100, 10000, and 11001.
%%
%% - Then, consider the second bit of the 7 remaining numbers: there are more 0
%% bits (4) than 1 bits (3), so keep only the 4 numbers with a 0 in the second
%% position: 10110, 10111, 10101, and 10000.
%%
%% - In the third position, three of the four numbers have a 1, so keep those
%% three: 10110, 10111, and 10101.
%%
%% - In the fourth position, two of the three numbers have a 1, so keep those
%% two: 10110 and 10111.
%%
%% - In the fifth position, there are an equal number of 0 bits and 1 bits (one
%% each). So, to find the oxygen generator rating, keep the number with a 1 in
%% that position: 10111.
%%
%% - As there is only one number left, stop; the oxygen generator rating is 10111,
%% or 23 in decimal.
%%
%% Then, to determine the CO2 scrubber rating value from the same example above:
%%
%% - Start again with all 12 numbers and consider only the first bit of each
%% number. There are fewer 0 bits (5) than 1 bits (7), so keep only the 5 numbers
%% with a 0 in the first position: 00100, 01111, 00111, 00010, and 01010.
%%
%% - Then, consider the second bit of the 5 remaining numbers: there are fewer 1
%% bits (2) than 0 bits (3), so keep only the 2 numbers with a 1 in the second
%% position: 01111 and 01010.
%%
%% - In the third position, there are an equal number of 0 bits and 1 bits (one
%% each). So, to find the CO2 scrubber rating, keep the number with a 0 in that
%% position: 01010.
%%
%% - As there is only one number left, stop; the CO2 scrubber rating is 01010, or
%% 10 in decimal.
%%
%% Finally, to find the life support rating, multiply the oxygen generator rating
%% (23) by the CO2 scrubber rating (10) to get 230.
%%
%% Use the binary numbers in your diagnostic report to calculate the oxygen
%% generator rating and CO2 scrubber rating, then multiply them together. What is
%% the life support rating of the submarine? (Be sure to represent your answer in
%% decimal, not binary.)
p2(Readings) ->
decode_life_support(mcbs(Readings), Readings).
decode_life_support(MCBs, Readings) ->
Pairs = lists:map(fun (A) -> {A, A} end, Readings),
o2(MCBs, Pairs) * co2(MCBs, Pairs).
decode_gas(_Matches, _MCBs, [{_, Reading}]) ->
dec2bin(Reading);
decode_gas({H, L}, [Bit | _MCBs], Readings) ->
Matching = lists:filtermap(
fun ({[D | Ds], Reading}) -> case {Bit, D} of
{$1, H} -> {true, {Ds, Reading}};
{$0, L} -> {true, {Ds, Reading}};
_ -> false
end end,
Readings
),
MCBs = mcbs(lists:map(fun fst/1, Matching)),
decode_gas({H, L}, MCBs, Matching).
o2(MCBs, Reading) -> decode_gas({$1, $0}, MCBs, Reading).
co2(MCBs, Reading) -> decode_gas({$0, $1}, MCBs, Reading).
-ifdef(EUNIT).
p2_test() ->
?assertEqual(230, p2(example())).
-endif.
%%% Helpers
dec2bin(S) -> dec2bin(lists:reverse(S), 0, 1).
dec2bin([], V, _E) -> V;
dec2bin([$0 | S], V, E) -> dec2bin(S, V, E * 2);
dec2bin([$1 | S], V, E) -> dec2bin(S, V + E, E * 2).
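%% Despite its name, dec2bin/1 converts a binary digit string to its decimal
%% value, e.g. dec2bin("10110") yields 22.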
fst({S, _}) -> S.

%% File: src/day3.erl
%% Helper module for binaries/bitstrings
-module(gradualizer_bin).
-export([compute_type/1]).
-include("gradualizer.hrl").
%% Computes the type of a bitstring expression or pattern based on the sizes
%% of the elements. The returned type is a normalized bitstring type.
-spec compute_type(ExprOrPat) -> gradualizer_type:abstract_type()
when ExprOrPat :: {bin, _, _}.
compute_type(Bin) ->
View = bin_view(Bin),
bitstr_view_to_type(View).
%% <<_:B, _:_*U>> is represented as {B, U} (fixed base + multiple of unit)
-type bitstr_view() :: {non_neg_integer(), non_neg_integer()} | none.
bitstr_concat({B1, U1}, {B2, U2}) ->
{B1 + B2, gcd(U1, U2)};
bitstr_concat(none, _) -> none;
bitstr_concat(_, none) -> none.
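%% For example, bitstr_concat({8, 0}, {0, 4}) yields {8, 4}: a fixed 8-bit
%% prefix followed by any multiple of 4 bits, i.e. <<_:8, _:_*4>>.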
-spec bitstr_view_to_type(bitstr_view()) -> gradualizer_type:abstract_type().
bitstr_view_to_type({B, U}) ->
Anno = erl_anno:new(0),
{type, Anno, binary, [{integer, Anno, B}, {integer, Anno, U}]};
bitstr_view_to_type(none) ->
{type, erl_anno:new(0), none, []}.
%% Returns the view of a bit expression or pattern, i.e. computes its size
-spec bin_view({bin, _, _}) -> bitstr_view().
bin_view({bin, _, BinElements}) ->
ElementViews = [bin_element_view(E) || E <- BinElements],
lists:foldl(fun bitstr_concat/2, {0, 0}, ElementViews).
bin_element_view({bin_element, Anno, {Lit, _, _}, default, _Spec} = BinElem)
when Lit == integer; Lit == char; Lit == string ->
%% Literal with default size, i.e. no variables to consider.
%% Size is not allowed for utf8/utf16/utf32.
Bin = {bin, Anno, [BinElem]},
{value, Value, []} = erl_eval:expr(Bin, []),
{bit_size(?assert_type(Value, bitstring())), 0};
bin_element_view({bin_element, Anno, {string, _, Chars}, Size, Spec}) ->
%% Expand <<"ab":32/float>> to <<$a:32/float, $b:32/float>>
%% FIXME: Not true for float, integer
Views = [bin_element_view({bin_element, Anno, {char, Anno, Char}, Size, Spec})
|| Char <- Chars],
lists:foldl(fun bitstr_concat/2, {0, 0}, Views);
bin_element_view({bin_element, _Anno, _Expr, default, Specifiers}) ->
%% Default size
%% <<1/integer-unit:2>> gives the following error:
%% * 1: a bit unit size must not be specified unless a size is specified too
%% However <<(<<9:9>>)/binary-unit:3>> gives no error.
%% The type specifier 'binary' seems to be the only exception though.
case get_type_specifier(Specifiers) of
integer -> {8, 0};
float -> {64, 0};
binary -> {0, get_unit(Specifiers)};
bytes -> {0, 8};
bitstring -> {0, 1};
bits -> {0, 1};
utf8 -> {0, 8}; %% 1-4 bytes
utf16 -> {0, 16}; %% 2-4 bytes
utf32 -> {32, 0} %% 4 bytes, fixed
end;
bin_element_view({bin_element, _Anno, _Expr, SizeSpec, Specifiers}) ->
%% Non-default size, possibly a constant expression
try erl_eval:expr(SizeSpec, []) of
{value, Sz, _VarBinds} ->
{Sz * get_unit(Specifiers), 0}
catch
error:{unbound_var, _} ->
%% Variable size
U = get_unit(Specifiers),
case get_type_specifier(Specifiers) of
float when U == 64 -> {64, 0}; %% size must be 1 in this case
float -> {32, 32}; %% a float must be 32 or 64 bits
_OtherType -> {0, U} %% any multiple of the unit
end
end.
-spec get_type_specifier(Specifiers :: [atom() | {unit, non_neg_integer()}] |
default) -> atom().
get_type_specifier(Specifiers) when is_list(Specifiers) ->
case [S || S <- Specifiers,
S == integer orelse S == float orelse
S == binary orelse S == bytes orelse
S == bitstring orelse S == bits orelse
S == utf8 orelse S == utf16 orelse
S == utf32] of
[S|_] -> ?assert_type(S, atom());
[] -> integer %% default
end;
get_type_specifier(default) -> integer.
get_unit(Specifiers) when is_list(Specifiers) ->
case [U || {unit, U} <- Specifiers] of
[U|_] -> U;
[] -> get_default_unit(Specifiers)
end;
get_unit(default) -> 1.
get_default_unit(Specifiers) when is_list(Specifiers) ->
case get_type_specifier(Specifiers) of
binary -> 8;
bytes -> 8;
_Other -> 1
end.
-spec gcd(non_neg_integer(), non_neg_integer()) -> non_neg_integer().
gcd(A, B) when B > A -> gcd1(B, A);
gcd(A, B) -> gcd1(A, B).
-spec gcd1(non_neg_integer(), non_neg_integer()) -> non_neg_integer().
gcd1(A, 0) -> A;
gcd1(A, B) ->
case A rem B of
0 -> B;
X -> gcd1(B, X)
end.

%% File: src/gradualizer_bin.erl
-module(sv).
-export([timestamp/0, new/1, new/2, destroy/1, ask/2, done/3]).
-export([run/2]).
%% Internal API
-export([report/2]).
%% @doc Creates a new queue
%% @end
-spec new(Conf) -> {ok, pid()}
when
Conf :: proplists:proplist().
new(Conf) ->
new(undefined, Conf).
-spec new(Queue, Conf) -> {ok, pid()}
when
Queue :: undefined | atom(),
Conf :: proplists:proplist().
new(Queue, Conf) ->
{ok, Pid} = safetyvalve_sup:start_queue(Queue, Conf),
{ok, Pid}.
%% @doc Destroys a previously created queue
%% @end
-spec destroy(Queue) -> ok | {error, not_found | simple_one_for_one}
when
Queue :: undefined | pid() | atom().
destroy(Queue) ->
safetyvalve_sup:stop_queue(Queue).
%% @doc Enqueue a job on a queue
%% <p>Try to run `Fun' on queue `Name'.
%% Either the
%% function will run straight away, or it will be queued for some time until
%% it is allowed to run (in case of an overload scenario). The
%% function will return either the result of `Fun' or an `{error,
%% Reason}' error term, describing the overload situation encountered.</p>
%% @end
-spec run(Name, Fun) -> {ok, Result} | {error, Reason}
when
Name :: atom() | pid(),
Fun :: fun(() -> term()),
Result :: term(),
Reason :: term().
run(Name, Fun) ->
StartPoint = timestamp(),
case sv_queue:ask(Name, StartPoint) of
{go, Ref} ->
Res = Fun(),
EndPoint = timestamp(),
sv_queue:done(Name, Ref, EndPoint),
{ok, Res};
{error, Reason} ->
{error, Reason}
end.
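%% Illustrative use (assumes a queue named `my_q' was created earlier
%% with sv:new/2; the queue name is hypothetical, not from this module):
%%   > sv:run(my_q, fun() -> 1 + 1 end).
%%   {ok, 2}
%% Under overload the same call returns `{error, Reason}' instead.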
%% @doc ask/2 requests the use of a resource in safetyvalve
%% <p>Ask for the use of a `Queue' at timepoint `T'. Returns either `{go, Ref}' if
%% you are allowed to use the resource or `{error, Reason}' in case of an error</p>
%% <p>The timepoint `T' should be generated via a call to `sv:timestamp()'. Also, note
%% that this call will block until the resource is either given, or the system gives
%% up on processing the request because it has exceeded some queueing threshold.</p>
%% <p>When you are done processing, you are obliged to call `sv:done(Queue, Ref, TE)'
%% where `Ref' is the given reference and `TE' is a time endpoint as given by
%% a call to `sv:timestamp()'.</p>
%% @end
-spec ask(Queue, T) -> {go, Ref} | {error, Reason}
when
Queue :: atom() | pid(),
T :: integer(),
Ref :: term(), % Opaque
Reason :: term().
ask(QN, T) ->
sv_queue:ask(QN, T).
%% @doc done/3 relinquishes a resource back to the queue
%% <p>Call this function when you are done with using a resource. See the
%% documentation of ask/2 for how to obtain the `Ref' to pass here.</p>
%% @end
-spec done(Queue, Ref, TE) -> ok
when
Queue :: atom(),
Ref :: term(),
TE :: integer().
done(QN, R, TE) ->
sv_queue:done(QN, R, TE).
%% @private
report(_T, _Event) ->
hopefully_traced.
%% @doc Construct a timestamp in a canonical way for Safetyvalve.
%% The important rule here is that timestamps are used as unique time
%% representations, which in turn means we have to create a timestamp
%% and attach a unique integer to it.
%% @end
-spec timestamp() -> {integer(), integer()}.
timestamp() ->
T = sv_time:monotonic_time(micro_seconds),
U = sv_time:unique_integer(),
{T, U}.

%% (end of file: src/sv.erl)
%%% vim:ts=2:sw=2:et
%%%-----------------------------------------------------------------------------
%%% @doc Erlang map-reduce parse transform
%%%
%%% This transform introduces two modifications of the list comprehension syntax
%%% that allow to perform a fold and mapfold on a list.
%%%
%%% ==== Indexed List Comprehension ====
%%%
%%% This extension of a list comprehension, passes an additional argument to the
%%% left hand side of the comprehension, which is the index of the current item
%%% in the list:
%%% ```
%%% [ io:format("Rec#~w: ~p\n", [I, N]) || I, N <- L]
%%% ^^
%%% '''
%%% The index is defined by a variable listed after the `||' operator.
%%% This is equivalent to the following:
%%% ```
%%% element(1, lists:mapfoldl(
%%%   fun(N, I) ->
%%%     {io:format("Rec#~w: ~p\n", [I, N]), I+1}
%%%   end, 1, L))
%%% '''
%%%
%%% === Fold Comprehension ===
%%%
%%% To invoke the fold comprehension transform include the initial state
%%% assignment into a comprehension that returns a non-tuple expression:
%%% ```
%%% [S+N || S = 1, N <- L].
%%% ^^^ ^^^^^
%%% '''
%%%
%%% In this example the `S' variable gets assigned the initial state `1', and
%%% the `S+N' expression represents the body of the fold function that
%%% is passed the iteration variable `N' and the state variable `S':
%%% ```
%%% lists:foldl(fun(N, S) -> S+N end, 1, L).
%%% '''
%%%
%%% Fold comprehension can be combined with the indexed list comprehension:
%%% ```
%%% [running_sum(I, N, S+N) || I, S=5, N <- L].
%%%
%%% running_sum(I, N, RunningSum) ->
%%% io:format("Rec#~w: ~p (~w)\n", [I, N, RunningSum]),
%%% RunningSum.
%%% '''
%%%
%%% In this case the definition of the indexed fold comprehension would be
%%% transformed to:
%%% ```
%%% element(2, lists:foldl(fun(N, {I, S}) ->
%%%    {I+1, running_sum(I, N, S+N)} end, {1, 5}, L))
%%% '''
%%%
%%% == Compilation ==
%%%
%%% When using this as a parse transform, include the
%%% `{parse_transform,listcomp}' compiler option.
%%%
%%% For debugging the AST of the resulting transform, pass the following
%%% options to the `erlc' compiler:
%%% <dl>
%%% <li>`-Dlistcomp_orig' - print the original AST before the transform</li>
%%% <li>`-Dlistcomp_ast' - print the transformed AST</li>
%%% <li>`-Dlistcomp_src' - print the resulting source code after the transform</li>
%%% </dl>
%%%
%%% @author <NAME> <saleyn(at)gmail(dot)com>
%%% @end
%%%-----------------------------------------------------------------------------
%%% Copyright (c) 2021 <NAME>
%%%
%%% Permission is hereby granted, free of charge, to any person
%%% obtaining a copy of this software and associated documentation
%%% files (the "Software"), to deal in the Software without restriction,
%%% including without limitation the rights to use, copy, modify, merge,
%%% publish, distribute, sublicense, and/or sell copies of the Software,
%%% and to permit persons to whom the Software is furnished to do
%%% so, subject to the following conditions:
%%%
%%% The above copyright notice and this permission notice shall be included
%%% in all copies or substantial portions of the Software.
%%%
%%% THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
%%% EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
%%% MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
%%% IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
%%% CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
%%% TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
%%% SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
%%%-----------------------------------------------------------------------------
-module(listcomp).
-export([parse_transform/2]).
-export([foldl/3, foldr/3]).
-import(etran_util, [transform/2]).
%% @doc parse_transform entry point
parse_transform(AST, Options) ->
etran_util:apply_transform(?MODULE, fun replace/1, AST, Options).
%%------------------------------------------------------------------------------
%% @doc Fold over a list by additionally passing the list's current item number
%% to the folding fun. This function is similar to lists:foldl/3, except
%% that the fun takes the extra second integer argument that represents
%% the sequential number of the item from the list.
%% @end
%%------------------------------------------------------------------------------
-spec foldl(fun((Item::term(), Position::integer(), Acc::term()) -> NewAcc::term()),
Init::term(), list()) -> term().
foldl(Fun, Init, List) when is_function(Fun, 3) ->
element(2, lists:foldl(fun(V, {I, S}) -> R = Fun(V, I, S), {I+1, R} end, {1, Init}, List)).
%%------------------------------------------------------------------------------
%% @doc Fold over a list by additionally passing the list's current item number
%% to the folding fun. This function is similar to lists:foldr/3, except
%% that the fun takes the extra second integer argument that represents
%% the sequential number of the item from the list.
%% @end
%%------------------------------------------------------------------------------
-spec foldr(fun((Item::term(), Position::integer(), Acc::term()) -> NewAcc::term()),
Init::term(), list()) -> term().
foldr(Fun, Init, List) when is_function(Fun, 3) ->
N = length(List),
element(2, lists:foldr(fun(V, {I, S}) -> R = Fun(V, I, S), {I-1, R} end, {N, Init}, List)).
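%% Illustrative use of foldl/3 and foldr/3 (the fun receives the item
%% first and its 1-based position second):
%%   > listcomp:foldl(fun(Item, Pos, Acc) -> Acc + Item * Pos end, 0, [10, 20, 30]).
%%   140        % 10*1 + 20*2 + 30*3
%%   > listcomp:foldr(fun(Item, Pos, Acc) -> Acc + Item * Pos end, 0, [10, 20, 30]).
%%   140        % same positions; only the traversal order differs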
%% Fold Transform
%% ==============
%% Example:
%% L = [1,2,3]
%% [S+I || S = 0, I <- L]. %% Returns: 6
%% ^^^^^
%% [do(Index,S+I) || Index, S = 0, I <- L]. %% Returns: 6, Uses Index as the item index
%% ^^^^^ ^^^^^
%% do(N, Sum) ->
%% io:format("Item#~w running sum: ~w\n", [N, Sum]),
%% Sum.
%%
%% [S+I || S = 0, I <- L]
%% Rewrite: lists:foldl(fun(I, S) -> S+I end, 0, L).
%% [S+I || N, S = 0, I <- L]
%% Rewrite: element(2, lists:foldl(fun(I, {N, S}) -> {N+1, S+I} end, {1, 0}, L)).
%% [S+I || S = 1, I <- L, I > 10]
%% Rewrite: lists:foldl(fun(I, S) -> S+I end, 1, [_I || _I <- L, _I > 10]).
%% [S+I+J || S = 1, I <- L1, J <- L2]
%% Rewrite: lists:foldl(fun({I,J}, S) -> S+I+J end, 1, [{I,J} || I <- L1, J <- L2]).
%%
replace({lc,Loc,ResBody0,
[{var, _, V}=Var,
{match,_,{var,_,_},_StateInit0}=Match | Generators]})
when is_atom(V)
, element(1, ResBody0) /= tuple
, element(1, hd(Generators)) == generate ->
replace2(Loc, ResBody0, Var, Match, Generators);
replace({lc,Loc,ResBody0,
[{match,_,{var,_,_},_StateInit0}=Match | Generators]})
when element(1, ResBody0) /= tuple
, element(1, hd(Generators)) == generate ->
replace2(Loc, ResBody0, undefined, Match, Generators);
%% Indexed list comprehension
%% ==========================
%% [I || _Index, I <- L1]
%% Rewrite: element(1, lists:mapfoldl(fun(I, _Index) -> {I, _Index+1} end, 1, L1)).
replace({lc,Loc,ResBody0, [{var, _, V}=Var | Generators]})
when is_atom(V)
, element(1, hd(Generators)) == generate ->
replace2(Loc, ResBody0, Var, undefined, Generators);
replace(_Exp) ->
continue.
maybe_make_var(_, {var,_,_} = Arg) ->
{Arg, Arg};
maybe_make_var({Ln,Pos}=Loc, Arg) ->
Var = {var,Loc,list_to_atom("_I@"++integer_to_list(Ln)++"_"++integer_to_list(Pos))},
Match = {match, Loc, Var, Arg},
{Var, Match}.
replace2(Loc, ResBody0, Index, Match, Generators) ->
[FunBody] = transform(fun replace/1, [ResBody0]),
{Init,StateVar} =
case {Index, Match} of
{{var,_,_}, {match,_,{var,VLoc,_}=StateVar0,StateInit0}} ->
%% [S+I || N, S = 5, I <- L, I > 10]
%% Rewrite fold init, and state passed to fun:
%% Init: {1, 5}
%% State: {N, S}
[Init0] = transform(fun replace/1, [StateInit0]),
{{tuple, VLoc, [{integer, VLoc, 1}, Init0]},
{tuple, VLoc, [Index, StateVar0]}};
{undefined, {match,_,{var, _, _}=StateVar0,StateInit0}} ->
%% [S+I || S = 0, I <- L]
%% Rewrite:
%% Init: 0
%% State: S
[Init0] = transform(fun replace/1, [StateInit0]),
{Init0, StateVar0};
{{var,VLoc,_}, undefined} ->
%% [I || N, I <- L]
%% Rewrite:
%% Init: 1
%% State: N
{{integer, VLoc, 1}, undefined}
end,
% Split generators from filters
{Gens, Filters} =
lists:splitwith(fun(G) -> element(1, G) == generate end, Generators),
{GLoc, FunArgs, ListOfLCs} =
case Gens of
[{generate, Loc0, FunArg0, List0}] when Filters == [] ->
% Simple case with one generator and no filters:
% [S+I || S = 1, I <- L]
{Loc0, FunArg0, List0};
[{generate, Loc0, FunArg0, List0}] when Filters /= [] ->
% Simple case with one generator and filters:
% [S+I || S = 1, {I,_} <- L, I > 1, I < 10]
% Convert the comprehension with filter into:
% lists:mapfoldl(fun({I,_}, S) -> {I,S+I} end, 1, [_V || _V = {I,_} <- L, I > 1, I < 10])
{VarForm, MatchForm} = maybe_make_var(Loc0, FunArg0),
{Loc0, FunArg0, {lc, Loc0, VarForm,
[{generate, Loc0, MatchForm, List0}| Filters]}};
_ ->
% More than one generator:
% [S+I || S = 1, I = FunArg1 <- List1, J = FunArg2 <- List2, ...]
% - Make a list:
% [{I,FunArg1,LCList1}, {J,FunArg2,LCList2}, ...]
VarsList = lists:reverse(
lists:foldl(fun({generate, GLoc, FunArg0, List0}, ALists) ->
{VarForm,MatchForm} = maybe_make_var(GLoc, FunArg0),
[{VarForm, MatchForm, List0}|ALists]
end, [], Gens)),
% - Create a new list comprehension:
% [{I,J,...} || I <- L1, J <- L2, ..., Filters]
Vars = [V || {V,_,_} <- VarsList],
ArgVars = {tuple, Loc, Vars},
FArgs = {tuple, Loc, [A || {_,A,_} <- VarsList]},
ListLCs = {lc, Loc, ArgVars,
[{generate, GLoc, _Match, LList}
|| {{var,GLoc,_}, _Match, LList} <- VarsList] ++ Filters},
{Loc, FArgs, ListLCs}
end,
CallMapFoldl = fun(InitForm, StateForm, FoldFun, FunBodyForm) ->
% lists:FoldFun(fun(FunArgs, StateForm) -> FunBodyForm end,
% InitForm, _ListOfLCs = [{I,J,...} || I <- L1, J <- L2, ...])
{call, GLoc,
{remote,GLoc,{atom,GLoc,lists},{atom,GLoc,FoldFun}},
[{'fun', GLoc,
{clauses,
[{clause, Loc,
[FunArgs, StateForm], % The fun has 2 arguments: ({I,J, ...}, S) -> ...
[], % No guards
[FunBodyForm] % Body
}]}},
InitForm,
ListOfLCs]}
end,
% Finally, rewrite the call:
case {Index,Match} of
{{var,_,_}, {match,_,{var,_VLoc,_}, _}} ->
% For [FunBody || N, StateVar = Init, ...]
% produce: element(2, lists:foldl(fun(I, {N,StateVar}) -> {N+1,FunBody} end, {1,Init}, L)).
FunBodyForm =
{tuple, _VLoc, [{op, _VLoc, '+', Index, {integer, _VLoc, 1}}, FunBody]},
{call, GLoc,
{atom, GLoc, element},
[{integer, GLoc, 2}, CallMapFoldl(Init, StateVar, foldl, FunBodyForm)]};
{undefined, {match,_,{var,_,_}, _}} ->
% For [FunBody || StateVar = Init, I <- L, ...]
% produce: lists:foldl(fun(I, S) -> FunBody end, 1, L).
CallMapFoldl(Init, StateVar, foldl, FunBody);
{{var, _VLoc, _}, undefined} ->
% For [FunBody || N, I <- L]
% produce: element(1, lists:mapfoldl(fun(I, N) -> {FunBody, N+1} end, 1, L)).
FunBodyForm =
{tuple, _VLoc, [FunBody, {op, _VLoc, '+', Index, {integer, _VLoc, 1}}]},
{call, GLoc,
{atom, GLoc, element},
[{integer, GLoc, 1},
CallMapFoldl(Init, Index, mapfoldl, FunBodyForm)]}
end.

%% (end of file: src/listcomp.erl)
%% -------------------------------------------------------------------
%%
%% riak_kv_mapper: Executes map functions on input batches
%%
%% Copyright (c) 2007-2010 Basho Technologies, Inc. All Rights Reserved.
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% -------------------------------------------------------------------
-module(riak_kv_mapred_filters).
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
-endif.
-export([build_filter/1]).
-export([int_to_string/1,
string_to_int/1,
float_to_string/1,
string_to_float/1,
to_upper/1,
to_lower/1,
tokenize/1,
urldecode/1]).
-export([greater_than/1,
less_than/1,
greater_than_eq/1,
less_than_eq/1,
between/1,
matches/1,
neq/1,
eq/1,
set_member/1,
similar_to/1,
starts_with/1,
ends_with/1]).
-export([logical_and/1,
logical_or/1,
logical_not/1]).
-export([compose/1, resolve_name/1, build_exprs/1]).
-export([to_string/1, levenshtein/2]).
%% @doc Convert a list of Filter expressions into a list of MFA tuples.
-spec build_filter([[string()]]) ->
{ok, [{Module::atom(), Function::atom(), Args::list()}]}
|{error, {bad_filter, string()}}.
build_filter(Exprs) ->
build_filter(Exprs, []).
build_filter([], Accum) ->
{ok, lists:reverse(Accum)};
build_filter([[FunName|Args]|T], Accum) ->
case resolve_name(FunName) of
error ->
{error, {bad_filter, FunName}};
Fun ->
build_filter(T, [{?MODULE, Fun, Args}|Accum])
end.
%% Transform functions
int_to_string(_) ->
fun(V) -> list_to_binary(integer_to_list(V)) end.
string_to_int(_) ->
fun(V) -> list_to_integer(to_string(V)) end.
float_to_string(_) ->
fun(V) -> list_to_binary(float_to_list(V)) end.
string_to_float(_) ->
fun(V) -> list_to_float(to_string(V)) end.
to_upper(_) ->
fun(V) -> string:to_upper(to_string(V)) end.
to_lower(_) ->
fun(V) -> string:to_lower(to_string(V)) end.
tokenize(Args) ->
Seps = riak_kv_mapred_filters:to_string(lists:nth(1, Args)),
TokenNum = lists:nth(2, Args),
fun(V) -> list_to_binary(lists:nth(TokenNum, string:tokens(to_string(V), Seps))) end.
urldecode(_) ->
fun(V) -> {_, V1} = hd(mochiweb_util:parse_qs("x=" ++ to_string(V))), V1 end.
%% Filter functions
greater_than([Value|_]) ->
fun(V) -> V > Value end.
less_than([Value|_]) ->
fun(V) -> V < Value end.
greater_than_eq([Value|_]) ->
fun(V) -> V >= Value end.
less_than_eq([Value|_]) ->
fun(V) -> V =< Value end.
between(Args) ->
LowValue = lists:nth(1, Args),
HighValue = lists:nth(2, Args),
Inclusive = case Args of
[_, _, Incl] -> Incl;
_ -> true
end,
case Inclusive of
true ->
fun(V) -> V >= LowValue andalso V =< HighValue end;
false ->
fun(V) -> V > LowValue andalso V < HighValue end
end.
matches([MS|_]) ->
{ok, CMS} = re:compile(MS),
fun(V) ->
case re:run(V, CMS) of
nomatch -> false;
{match, _} -> true
end
end.
neq([Value|_]) ->
fun(V) -> V =/= Value end.
eq([Value|_]) ->
fun(V) -> V =:= Value end.
set_member(Args) ->
Args1 = lists:map(fun(Arg) -> riak_kv_mapred_filters:to_string(Arg) end, Args),
VSet = sets:from_list(Args1),
fun(V) -> sets:is_element(riak_kv_mapred_filters:to_string(V), VSet) end.
similar_to(Args) ->
Str = riak_kv_mapred_filters:to_string(lists:nth(1, Args)),
MaxDist = lists:nth(2, Args),
fun(V) -> riak_kv_mapred_filters:levenshtein(
riak_kv_mapred_filters:to_string(V), Str) =< MaxDist end.
starts_with([Str|_]) ->
fun(V) -> lists:prefix(riak_kv_mapred_filters:to_string(Str), riak_kv_mapred_filters:to_string(V)) end.
ends_with([Str|_]) ->
Str1 = lists:reverse(riak_kv_mapred_filters:to_string(Str)),
fun(V) -> lists:prefix(Str1, lists:reverse(riak_kv_mapred_filters:to_string(V))) end.
%% Logical functions
logical_and(Args) ->
Funs = lists:map(fun(FilterDef) ->
{ok, FilterDef1} = build_exprs(FilterDef),
compose(FilterDef1) end, Args),
fun(V) ->
lists:all(fun(Fun) -> Fun(V) end, Funs) end.
logical_or(Args) ->
Funs = lists:map(fun(FilterDef) ->
{ok, FilterDef1} = build_exprs(FilterDef),
compose(FilterDef1) end, Args),
fun(V) ->
lists:any(fun(Fun) -> Fun(V) end, Funs) end.
logical_not([FilterDef|_]) ->
{ok, FilterDef1} = build_exprs(FilterDef),
FilterFun = compose(FilterDef1),
fun(V) -> case FilterFun(V) of
true -> false;
false -> true end end.
%% Composer
compose([]) ->
fun(_) -> true end;
compose(Filters) ->
compose(Filters, fun(V) -> V end).
compose([], F0) -> F0;
compose([Filter1|Filters], F0) ->
{FilterMod, FilterFun, Args} = Filter1,
Fun1 = FilterMod:FilterFun(Args),
F1 = fun(CArgs) -> Fun1(F0(CArgs)) end,
compose(Filters, F1).
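%% Illustrative use of compose/1 (mirrors the pattern in compose2_test
%% below): filters apply in list order, so the value is lowercased
%% before the prefix check.
%%   F = compose([{riak_kv_mapred_filters, to_lower, []},
%%                {riak_kv_mapred_filters, starts_with, ["hello"]}]),
%%   F("HELLO dolly") %=> true
%%   F("dolly hello") %=> false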
build_exprs(Exprs) ->
build_exprs(Exprs, []).
build_exprs([], Accum) ->
{ok, lists:reverse(Accum)};
build_exprs([[FunName|Args]|T], Accum) ->
case riak_kv_mapred_filters:resolve_name(FunName) of
error ->
{error, {bad_filter, FunName}};
Fun ->
build_exprs(T, [{riak_kv_mapred_filters, Fun, Args}|Accum])
end.
%% Resolver
resolve_name(<<"int_to_string">>) ->
int_to_string;
resolve_name(<<"string_to_int">>) ->
string_to_int;
resolve_name(<<"string_to_float">>) ->
string_to_float;
resolve_name(<<"float_to_string">>) ->
float_to_string;
resolve_name(<<"greater_than">>) ->
greater_than;
resolve_name(<<"less_than">>) ->
less_than;
resolve_name(<<"greater_than_eq">>) ->
greater_than_eq;
resolve_name(<<"less_than_eq">>) ->
less_than_eq;
resolve_name(<<"between">>) ->
between;
resolve_name(<<"similar_to">>) ->
similar_to;
resolve_name(<<"matches">>) ->
matches;
resolve_name(<<"to_upper">>) ->
to_upper;
resolve_name(<<"to_lower">>) ->
to_lower;
resolve_name(<<"set_member">>) ->
set_member;
resolve_name(<<"neq">>) ->
neq;
resolve_name(<<"eq">>) ->
eq;
resolve_name(<<"tokenize">>) ->
tokenize;
resolve_name(<<"urldecode">>) ->
urldecode;
resolve_name(<<"starts_with">>) ->
starts_with;
resolve_name(<<"ends_with">>) ->
ends_with;
resolve_name(<<"and">>) ->
logical_and;
resolve_name(<<"or">>) ->
logical_or;
resolve_name(<<"not">>) ->
logical_not;
resolve_name(_Err) ->
error.
%% Internal
to_string(V) when is_binary(V) ->
binary_to_list(V);
to_string(V) when is_list(V) ->
V;
to_string(V) when is_integer(V) ->
integer_to_list(V);
to_string(V) when is_float(V) ->
float_to_list(V);
to_string(V) ->
io_lib:format("~p", [V]).
%%
%% Levenshtein code by <NAME>, <NAME> via
%% http://www.trapexit.org/String_similar_to_(Levenshtein)
%%
%%------------------------------------------------------------------------------
%% @spec levenshtein(StringA :: string(), StringB :: string()) -> integer()
%% @doc Calculates the Levenshtein distance between two strings
%% @end
%%------------------------------------------------------------------------------
levenshtein(Samestring, Samestring) -> 0;
levenshtein(String, []) -> length(String);
levenshtein([], String) -> length(String);
levenshtein(Source, Target) ->
levenshtein_rec(Source, Target, lists:seq(0, length(Target)), 1).
%% Recurses over every character in the source string and calculates a list of distances
levenshtein_rec([SrcHead|SrcTail], Target, DistList, Step) ->
levenshtein_rec(SrcTail, Target, levenshtein_distlist(Target, DistList, SrcHead, [Step], Step), Step + 1);
levenshtein_rec([], _, DistList, _) ->
lists:last(DistList).
%% Generates a distance list with distance values for every character in the target string
levenshtein_distlist([TargetHead|TargetTail], [DLH|DLT], SourceChar, NewDistList, LastDist) when length(DLT) > 0 ->
Min = lists:min([LastDist + 1, hd(DLT) + 1, DLH + dif(TargetHead, SourceChar)]),
levenshtein_distlist(TargetTail, DLT, SourceChar, NewDistList ++ [Min], Min);
levenshtein_distlist([], _, _, NewDistList, _) ->
NewDistList.
% Calculates the difference between two characters or other values
dif(C, C) -> 0;
dif(_, _) -> 1.
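%% Example: levenshtein("kitten", "sitting") =:= 3
%% (substitution k->s, substitution e->i, insertion of g).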
-ifdef(TEST).
% transforms
int_to_string_test() ->
F = compose([{riak_kv_mapred_filters, int_to_string, []}]),
F(42) =:= <<"42">>.
string_to_int_test() ->
F = compose([{riak_kv_mapred_filters, string_to_int, []}]),
F("42") =:= 42.
to_upper_test() ->
F = compose([{riak_kv_mapred_filters, to_upper, []}]),
F("basho") =:= "BASHO" andalso F("BASHO") =:= "BASHO".
to_lower_test() ->
F = compose([{riak_kv_mapred_filters, to_lower, []}]),
F("BASHO") =:= "basho" andalso F("basho") =:= "basho".
tokenize_test() ->
F = compose([{riak_kv_mapred_filters, tokenize, ["-", 3]}]),
F("12-20-2010") =:= <<"2010">>.
% filters
greater_than_test() ->
F = compose([{riak_kv_mapred_filters, greater_than, [2009]}]),
F(2010) =:= true andalso F(2000) =:= false.
less_than_test() ->
F = compose([{riak_kv_mapred_filters, less_than, [2009]}]),
F(2008) =:= true andalso F(2020) =:= false.
greater_than_eq_test() ->
F = compose([{riak_kv_mapred_filters, greater_than_eq, [2009]}]),
F(2009) =:= true andalso F(2000) =:= false.
less_than_eq_test() ->
F = compose([{riak_kv_mapred_filters, less_than_eq, [2009]}]),
F(2009) =:= true andalso F(2020) =:= false.
between_implicit_inclusive_test() ->
F = compose([{riak_kv_mapred_filters, between, [2009, 2010]}]),
F(2009) =:= true andalso F(2020) =:= false.
between_inclusive_test() ->
F = compose([{riak_kv_mapred_filters, between, [2009, 2010, true]}]),
F(2009) =:= true andalso F(2020) =:= false.
between_exclusive_test() ->
F = compose([{riak_kv_mapred_filters, between, [2008, 2010, false]}]),
F(2009) =:= true andalso F(2010) =:= false.
matches_test() ->
F = compose([{riak_kv_mapred_filters, matches, ["johnm.*"]}]),
F("<EMAIL>") =:= true andalso F("<EMAIL>") =:= false.
neq_test() ->
F = compose([{riak_kv_mapred_filters, neq, ["12-20-2010"]}]),
F("01-08-1901") =:= true andalso F("12-20-2010") =:= false.
eq_test() ->
F = compose([{riak_kv_mapred_filters, eq, ["12-20-2010"]}]),
F("01-08-1901") =:= false andalso F("12-20-2010") =:= true.
set_member_test() ->
F = compose([{riak_kv_mapred_filters, set_member, ["12-20-2010", "12-21-2010"]}]),
F("12-21-2010") =:= true andalso F("12-22-2010") =:= false.
similar_to_test() ->
F = compose([{riak_kv_mapred_filters, similar_to, ["basho", 1]}]),
F("baysho") =:= true andalso F("baysho!") =:= false.
starts_with_test() ->
F = compose([{riak_kv_mapred_filters, starts_with, ["12-20"]}]),
F("12-20-2010") =:= true andalso F("11-20-2010") =:= false.
ends_with_test() ->
F = compose([{riak_kv_mapred_filters, ends_with, ["2010"]}]),
F("12-20-2010") =:= true andalso F("12-20-2011") =:= false.
compose2_test() ->
F = compose([{riak_kv_mapred_filters, to_lower, []},
{riak_kv_mapred_filters, similar_to, ["basho", 3]}]),
F("BAYSHO") =:= true andalso F("lolcaTS") =:= false.
compose3_test() ->
F = compose([{riak_kv_mapred_filters, tokenize, ["-", 3]},
{riak_kv_mapred_filters, string_to_int, []},
{riak_kv_mapred_filters, between, [2009, 2010]}]),
F("12-20-2009") =:= true andalso F("12-20-2011") =:= false.
build_exprs_test() ->
Expr = [[<<"tokenize">>, "-", 3]],
{ok, FilterDef} = build_exprs(Expr),
FilterDef =:= [{riak_kv_mapred_filters, tokenize, ["-", 3]}].
logical_and_test() ->
F = compose([{riak_kv_mapred_filters, logical_and, [
[[<<"to_lower">>],
[<<"starts_with">>, "hello"]],
[[<<"to_lower">>],
[<<"ends_with">>, "dolly"]]]}]),
F("hello dolly") =:= true andalso F("dolly hello") =:= false.
logical_or_test() ->
F = compose([{riak_kv_mapred_filters, logical_or, [
[[<<"to_lower">>],
[<<"starts_with">>, "hello"]],
[[<<"to_lower">>],
[<<"starts_with">>, "cloned"]]]}]),
F("hello dolly") =:= true andalso F("cloned sheep") =:= true.
logical_not_test() ->
F = compose([{riak_kv_mapred_filters, logical_not, [
[[<<"ends_with">>, "2010"]]]}]),
F("12-20-2010") =:= false andalso F("12-20-2011") =:= true.
-endif.

%% (end of file: deps/riak_kv/src/riak_kv_mapred_filters.erl)
%% @author jstypka <<EMAIL>>
%% @version 1.0
%% @doc This module handles the logic of a single island in hybrid model
-module(mas_hybrid_island).
-export([start/2, close/1, sendAgent/2]).
-include ("mas.hrl").
-type agent() :: mas:agent().
-type sim_params() :: mas:sim_params().
%% ====================================================================
%% API functions
%% ====================================================================
%% @doc Generates initial data and starts the computation
-spec start(sim_params(), config()) -> no_return().
start(SP, Cf) ->
mas_misc_util:seed_random(),
Agents = mas_misc_util:generate_population(SP, Cf),
timer:send_interval(Cf#config.write_interval, write),
Result = loop(Agents,
mas_misc_util:create_new_counter(Cf),
SP,
Cf),
mas_hybrid:send_result(Result).
-spec close(pid()) -> {finish, pid()}.
close(Pid) ->
Pid ! {finish, self()}.
%% @doc Asynchronusly sends an agent immigrating to this island
-spec sendAgent(pid(), agent()) -> {agent, pid(), agent()}.
sendAgent(Pid, Agent) ->
Pid ! {agent, self(), Agent}.
%% ====================================================================
%% Internal functions
%% ====================================================================
%% @doc The main island process loop.
%% A new generation of the population is created in every iteration.
-spec loop([agent()], counter(), sim_params(), config()) -> [agent()].
loop(Agents, InteractionCounter, SP, Cf) ->
receive
write ->
[exometer:update([self(), Interaction], Val)
|| {Interaction, Val} <- dict:to_list(InteractionCounter)],
loop(Agents,
mas_misc_util:create_new_counter(Cf),
SP,
Cf);
{agent, _Pid, A} ->
loop([A | Agents], InteractionCounter, SP, Cf);
{finish, _Pid} ->
Agents
after 0 ->
Tagged = [{mas_misc_util:behaviour_proxy(A, SP, Cf), A}
|| A <- Agents ],
Groups = mas_misc_util:group_by(Tagged),
NewGroups = [mas_misc_util:meeting_proxy(G, mas_hybrid, SP, Cf)
|| G <- Groups],
NewAgents = mas_misc_util:shuffle(lists:flatten(NewGroups)),
NewCounter =
mas_misc_util:add_interactions_to_counter(Groups,
InteractionCounter),
loop(NewAgents, NewCounter, SP, Cf)
end. | src/hybrid/mas_hybrid_island.erl | 0.510985 | 0.446133 | mas_hybrid_island.erl | starcoder |
%% -------------------------------------------------------------------
%% Copyright (c) 2015-2016 <NAME>. All Rights Reserved.
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% -------------------------------------------------------------------
%% @doc LWWRegister.
%% We assume timestamp are unique, totally ordered and consistent
%% with causal order. We use integers as timestamps.
%% When using this, make sure you provide globally unique
%% timestamps.
%%
%% @reference <NAME>, <NAME>, <NAME> and <NAME>
%% A comprehensive study of Convergent and Commutative Replicated Data Types (2011)
%% [http://hal.upmc.fr/file/index/docid/555588/filename/techreport.pdf]
%%
%% @reference <NAME>
%% delta-enabled-crdts C++ library
%% [https://github.com/CBaquero/delta-enabled-crdts]
-module(state_lwwregister).
-author("<NAME> <<EMAIL>>").
-behaviour(type).
-behaviour(state_type).
-define(TYPE, ?MODULE).
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
-endif.
-export([new/0, new/1]).
-export([mutate/3, delta_mutate/3, merge/2]).
-export([query/1, equal/2, is_bottom/1,
is_inflation/2, is_strict_inflation/2,
irreducible_is_strict_inflation/2]).
-export([join_decomposition/1, delta/2, digest/1]).
-export([encode/2, decode/2]).
-export_type([state_lwwregister/0, state_lwwregister_op/0]).
-opaque state_lwwregister() :: {?TYPE, payload()}.
-type payload() :: {timestamp(), value()}.
-type timestamp() :: non_neg_integer().
-type value() :: term().
-type state_lwwregister_op() :: {set, timestamp(), value()}.
%% @doc Create a new, empty `state_lwwregister()'
-spec new() -> state_lwwregister().
new() ->
{?TYPE, {0, undefined}}.
%% @doc Create a new, empty `state_lwwregister()'
-spec new([term()]) -> state_lwwregister().
new([]) ->
new().
%% @doc Mutate a `state_lwwregister()'.
-spec mutate(state_lwwregister_op(), type:id(), state_lwwregister()) ->
{ok, state_lwwregister()}.
mutate(Op, Actor, {?TYPE, _Register}=CRDT) ->
state_type:mutate(Op, Actor, CRDT).
%% @doc Delta-mutate a `state_lwwregister()'.
%% The first argument can only be `{set, timestamp(), value()}'.
%% The second argument is the replica id (unused).
%% The third argument is the `state_lwwregister()' to be inflated.
-spec delta_mutate(state_lwwregister_op(), type:id(), state_lwwregister()) ->
{ok, state_lwwregister()}.
delta_mutate({set, Timestamp, Value}, _Actor, {?TYPE, _Register}) when is_integer(Timestamp) ->
{ok, {?TYPE, {Timestamp, Value}}}.
%% @doc Returns the value of the `state_lwwregister()'.
-spec query(state_lwwregister()) -> value().
query({?TYPE, {_Timestamp, Value}}) ->
Value.
%% @doc Merge two `state_lwwregister()'.
%% The result is the set union of both sets in the
%% `state_lwwregister()' passed as argument.
-spec merge(state_lwwregister(), state_lwwregister()) -> state_lwwregister().
merge({?TYPE, {Timestamp1, Value1}}, {?TYPE, {Timestamp2, Value2}}) ->
Register = case Timestamp1 > Timestamp2 of
true ->
{Timestamp1, Value1};
false ->
{Timestamp2, Value2}
end,
{?TYPE, Register}.
%% @doc Equality for `state_lwwregister()'.
-spec equal(state_lwwregister(), state_lwwregister()) -> boolean().
equal({?TYPE, Register1}, {?TYPE, Register2}) ->
Register1 == Register2.
%% @doc Check if a Register is bottom.
-spec is_bottom(state_lwwregister()) -> boolean().
is_bottom({?TYPE, _}=CRDT) ->
CRDT == new().
%% @doc Given two `state_lwwregister()', check if the second is an inflation
%% of the first.
%% The second `state_lwwregister()' has a higher timestamp, or the same
%% timestamp (and in that case, they are equal).
-spec is_inflation(state_lwwregister(), state_lwwregister()) -> boolean().
is_inflation({?TYPE, {Timestamp1, _}}, {?TYPE, {Timestamp2, _}}) ->
Timestamp2 >= Timestamp1.
%% @doc Check for strict inflation.
-spec is_strict_inflation(state_lwwregister(), state_lwwregister()) -> boolean().
is_strict_inflation({?TYPE, _}=CRDT1, {?TYPE, _}=CRDT2) ->
state_type:is_strict_inflation(CRDT1, CRDT2).
%% @doc Check for irreducible strict inflation.
-spec irreducible_is_strict_inflation(state_lwwregister(),
state_type:digest()) ->
boolean().
irreducible_is_strict_inflation({?TYPE, _}=A, B) ->
state_type:irreducible_is_strict_inflation(A, B).
-spec digest(state_lwwregister()) -> state_type:digest().
digest({?TYPE, _}=CRDT) ->
{state, CRDT}.
%% @doc Join decomposition for `state_lwwregister()'.
-spec join_decomposition(state_lwwregister()) -> [state_lwwregister()].
join_decomposition({?TYPE, _}=CRDT) ->
[CRDT].
%% @doc Delta calculation for `state_lwwregister()'.
-spec delta(state_lwwregister(), state_type:digest()) -> state_lwwregister().
delta({?TYPE, _}=A, B) ->
state_type:delta(A, B).
-spec encode(state_type:format(), state_lwwregister()) -> binary().
encode(erlang, {?TYPE, _}=CRDT) ->
erlang:term_to_binary(CRDT).
-spec decode(state_type:format(), binary()) -> state_lwwregister().
decode(erlang, Binary) ->
{?TYPE, _} = CRDT = erlang:binary_to_term(Binary),
CRDT.
%% ===================================================================
%% EUnit tests
%% ===================================================================
-ifdef(TEST).
new_test() ->
Bottom = {0, undefined},
?assertEqual({?TYPE, Bottom}, new()).
query_test() ->
Register0 = new(),
Register1 = {?TYPE, {1234, "a"}},
?assertEqual(undefined, query(Register0)),
?assertEqual("a", query(Register1)).
delta_set_test() ->
Actor = 1,
Register0 = new(),
{ok, {?TYPE, Delta1}} = delta_mutate({set, 1234, "a"}, Actor, Register0),
Register1 = merge({?TYPE, Delta1}, Register0),
{ok, {?TYPE, Delta2}} = delta_mutate({set, 1235, "b"}, Actor, Register1),
Register2 = merge({?TYPE, Delta2}, Register1),
?assertEqual({?TYPE, {1234, "a"}}, {?TYPE, Delta1}),
?assertEqual({?TYPE, {1234, "a"}}, Register1),
?assertEqual({?TYPE, {1235, "b"}}, {?TYPE, Delta2}),
?assertEqual({?TYPE, {1235, "b"}}, Register2).
set_test() ->
Actor = 1,
Register0 = new(),
{ok, Register1} = mutate({set, 1234, "a"}, Actor, Register0),
{ok, Register2} = mutate({set, 1235, "b"}, Actor, Register1),
?assertEqual({?TYPE, {1234, "a"}}, Register1),
?assertEqual({?TYPE, {1235, "b"}}, Register2).
merge_idempotent_test() ->
Register1 = {?TYPE, {1234, "a"}},
Register2 = {?TYPE, {1235, "b"}},
Register3 = merge(Register1, Register1),
Register4 = merge(Register2, Register2),
?assertEqual(Register1, Register3),
?assertEqual(Register2, Register4).
merge_commutative_test() ->
Register1 = {?TYPE, {1234, "a"}},
Register2 = {?TYPE, {1235, "b"}},
Register3 = merge(Register1, Register2),
Register4 = merge(Register2, Register1),
?assertEqual(Register2, Register3),
?assertEqual(Register2, Register4).
equal_test() ->
Register1 = {?TYPE, {1234, "a"}},
Register2 = {?TYPE, {1235, "b"}},
?assert(equal(Register1, Register1)),
?assertNot(equal(Register1, Register2)).
is_bottom_test() ->
Register0 = new(),
Register1 = {?TYPE, {1234, "a"}},
?assert(is_bottom(Register0)),
?assertNot(is_bottom(Register1)).
is_inflation_test() ->
Register1 = {?TYPE, {1234, "a"}},
Register2 = {?TYPE, {1235, "b"}},
?assert(is_inflation(Register1, Register1)),
?assert(is_inflation(Register1, Register2)),
?assertNot(is_inflation(Register2, Register1)),
%% check inflation with merge
?assert(state_type:is_inflation(Register1, Register1)),
?assert(state_type:is_inflation(Register1, Register2)),
?assertNot(state_type:is_inflation(Register2, Register1)).
is_strict_inflation_test() ->
Register1 = {?TYPE, {1234, "a"}},
Register2 = {?TYPE, {1235, "b"}},
?assertNot(is_strict_inflation(Register1, Register1)),
?assert(is_strict_inflation(Register1, Register2)),
?assertNot(is_strict_inflation(Register2, Register1)).
join_decomposition_test() ->
Register = {?TYPE, {1234, "a"}},
Decomp = join_decomposition(Register),
?assertEqual([Register], Decomp).
encode_decode_test() ->
Register = {?TYPE, {1234, "a"}},
Binary = encode(erlang, Register),
ERegister = decode(erlang, Binary),
?assertEqual(Register, ERegister).
-endif.
%%% End of file: src/state_lwwregister.erl
%% -------------------------------------------------------------------
%%
%% Copyright (c) 2013, 2014 Basho Technologies, Inc.
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% -------------------------------------------------------------------
-module(clique_table).
%% API
-export([print/2, print/3,
create_table/2,
autosize_create_table/2, autosize_create_table/3]).
-include("clique_status_types.hrl").
-define(MAX_LINE_LEN, 100).
-define(else, true).
-define(MINWIDTH(W),
if W =< 0 ->
1;
?else ->
W
end).
-spec print(list(), list()) -> ok.
print(_Spec, []) ->
ok;
%% Explicit sizes were not given. This is called using the new status types.
print(Schema, Rows) when is_list(hd(Schema)) ->
Table = autosize_create_table(Schema, Rows),
io:format("~n~ts~n", [Table]);
print(Spec, Rows) ->
Table = create_table(Spec, Rows),
io:format("~n~ts~n", [Table]).
-spec print(list(), list(), list()) -> ok.
print(_Hdr, _Spec, []) ->
ok;
print(Header, Spec, Rows) ->
Table = create_table(Spec, Rows),
io:format("~ts~n~n~ts~n", [Header, Table]).
-spec autosize_create_table([any()], [[any()]]) -> iolist().
autosize_create_table(Schema, Rows) ->
autosize_create_table(Schema, Rows, []).
%% Currently the only constraint supported in the proplist is
%% `fixed_width' with a list of columns that *must not* be shrunk
%% (e.g., integer values). First column is 0.
-spec autosize_create_table([any()], [[any()]], [tuple()]) -> iolist().
autosize_create_table(Schema, Rows, Constraints) ->
BorderSize = 1 + length(hd(Rows)),
MaxLineLen = case io:columns() of
%% Leaving an extra space seems to work better
{ok, N} -> N - 1;
{error, enotsup} ->
                     %% see if the calling environment has exported an env var with the information we need
case rpc:call(node(erlang:group_leader()), os, getenv, ["CLIQUE_COLUMNS"]) of
false ->
?MAX_LINE_LEN;
Value ->
try list_to_integer(Value) of
Res -> Res - 1
catch _:_ -> ?MAX_LINE_LEN
end
end
end,
Sizes = get_field_widths(MaxLineLen - BorderSize, [Schema | Rows],
proplists:get_value(fixed_width, Constraints, [])),
Spec = lists:zip(Schema, Sizes),
create_table(Spec, Rows, MaxLineLen, []).
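%% As a sketch, the `fixed_width' constraint described above can be used to
%% pin a column (the schema and rows here are made up for illustration):
%%
%%   1> clique_table:autosize_create_table(["node", "port"],
%%                                         [["riak@host", 8087]],
%%                                         [{fixed_width, [1]}]).
%%
%% Column 1 (zero-based, i.e. "port") is then never narrowed when the table
%% is shrunk to fit the terminal.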
-spec create_table(list(), list()) -> iolist().
create_table(Spec, Rows) ->
Lengths = get_row_length(Spec, Rows),
Length = lists:sum(Lengths)+2,
AdjustedSpec = [{Field, NewLength} || {{Field, _DefaultLength}, NewLength}
<- lists:zip(Spec, Lengths)],
create_table(AdjustedSpec, Rows, Length, []).
-spec create_table(list(), list(), non_neg_integer(), iolist()) -> iolist().
create_table(Spec, Rows, Length, []) ->
FirstThreeRows = [vertical_border(Spec), titles(Spec),
vertical_border(Spec)],
create_table(Spec, Rows, Length, FirstThreeRows);
create_table(_Spec, [], _Length, IoList) when length(IoList) == 3 ->
%% table had no rows, no final row needed
lists:reverse(IoList);
create_table(Spec, [], _Length, IoList) ->
BottomBorder = vertical_border(Spec),
%% There are no more rows to print so return the table
lists:reverse([BottomBorder | IoList]);
create_table(Spec, [Row | Rows], Length, IoList) ->
create_table(Spec, Rows, Length, [row(Spec, Row) | IoList]).
%% Measure and shrink table width as necessary to fit the console
-spec get_field_widths(pos_integer(), [term()], [non_neg_integer()]) -> [non_neg_integer()].
get_field_widths(MaxLineLen, Rows, Unshrinkable) ->
Widths = max_widths(Rows),
fit_widths_to_terminal(MaxLineLen, Widths, Unshrinkable).
fit_widths_to_terminal(MaxWidth, Widths, Unshrinkable) ->
Sum = lists:sum(Widths),
Weights = calculate_field_weights(Sum, Widths, Unshrinkable),
MustRemove = Sum - MaxWidth,
calculate_new_widths(MaxWidth, MustRemove, Widths, Weights).
%% Determine field weighting as proportion of total width of the
%% table. Fields which were flagged as unshrinkable will be given a
%% weight of 0.
-spec calculate_field_weights(pos_integer(), list(pos_integer()),
list(non_neg_integer())) ->
list(number()).
calculate_field_weights(Sum, Widths, []) ->
%% If no fields are constrained as unshrinkable, simply divide
%% each width by the sum of all widths for our proportions
lists:map(fun(X) -> X / Sum end, Widths);
calculate_field_weights(_Sum, Widths, Unshrinkable) ->
TaggedWidths = flag_unshrinkable_widths(Widths, Unshrinkable),
ShrinkableWidth = lists:sum(lists:filter(fun({_X, noshrink}) -> false;
(_X) -> true end,
TaggedWidths)),
lists:map(fun({_X, noshrink}) -> 0;
(X) -> X / ShrinkableWidth end,
TaggedWidths).
%% Takes a list of column widths and a list of (zero-based) index
%% values of the columns that must not shrink. Returns a mixed list of
%% widths and `noshrink' tuples.
flag_unshrinkable_widths(Widths, NoShrink) ->
{_, NewWidths} =
lists:foldl(fun(X, {Idx, Mapped}) ->
case lists:member(Idx, NoShrink) of
true ->
{Idx + 1, [{X, noshrink}|Mapped]};
false ->
{Idx + 1, [X|Mapped]}
end
end, {0, []}, Widths),
lists:reverse(NewWidths).
%% Calculate the proportional weight for each column for shrinking.
%% Zip the results into a `{Width, Weight, Index}' tuple list.
column_zip(Widths, Weights, ToNarrow) ->
column_zip(Widths, Weights, ToNarrow, 0, []).
column_zip([], [], _ToNarrow, _Index, Accum) ->
lists:reverse(Accum);
column_zip([Width|Widths], [Weight|Weights], ToNarrow, Index, Accum) ->
NewWidth = ?MINWIDTH(Width - round(ToNarrow * Weight)),
column_zip(Widths, Weights, ToNarrow, Index+1,
[{NewWidth, Weight, Index}] ++ Accum).
%% Given the widths based on data to be displayed, return widths
%% necessary to narrow the table to fit the console.
calculate_new_widths(_Max, ToNarrow, Widths, _Weights) when ToNarrow =< 0 ->
%% Console is wide enough, no need to narrow
Widths;
calculate_new_widths(MaxWidth, ToNarrow, Widths, Weights) ->
fix_rounding(MaxWidth, column_zip(Widths, Weights, ToNarrow)).
%% Rounding may introduce an error. If so, remove the requisite number
%% of spaces from the widest field
fix_rounding(Target, Cols) ->
Widths = lists:map(fun({Width, _Weight, _Idx}) -> Width end,
Cols),
SumWidths = lists:sum(Widths),
shrink_widest(Target, SumWidths, Widths, Cols).
%% Determine whether our target table width is wider than the terminal
%% due to any rounding error and find columns eligible to be shrunk.
shrink_widest(Target, Current, Widths, _Cols) when Target =< Current ->
Widths;
shrink_widest(Target, Current, Widths, Cols) ->
Gap = Current - Target,
NonZeroWeighted = lists:dropwhile(fun({_Width, 0, _Idx}) -> true;
(_) -> false end,
Cols),
shrink_widest_weighted(Gap, NonZeroWeighted, Widths).
%% Take the widest column with a non-zero weight and reduce it by the
%% amount necessary to compensate for any rounding error.
shrink_widest_weighted(_Gap, [], Widths) ->
Widths; %% All columns constrained to fixed widths, nothing we can do
shrink_widest_weighted(Gap, Cols, Widths) ->
SortedCols = lists:sort(
fun({WidthA, _WeightA, _IdxA}, {WidthB, _WeightB, _IdxB}) ->
WidthA > WidthB
end, Cols),
{OldWidth, _Weight, Idx} = hd(SortedCols),
NewWidth = ?MINWIDTH(OldWidth - Gap),
replace_list_element(Idx, NewWidth, Widths).
%% Replace the item at `Index' in `List' with `Element'.
%% Zero-based indexing.
-spec replace_list_element(non_neg_integer(), term(), list()) -> list().
replace_list_element(Index, Element, List) ->
{Prefix, Suffix} = lists:split(Index, List),
Prefix ++ [Element] ++ tl(Suffix).
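%% A worked example of the zero-based replacement:
%% replace_list_element(1, x, [a, b, c]) splits the list into
%% {[a], [b, c]} and returns [a, x, c].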
get_row_length(Spec, Rows) ->
Res = lists:foldl(fun({_Name, MinSize}, Total) ->
Longest = find_longest_field(Rows, length(Total)+1),
Size = erlang:max(MinSize, Longest),
[Size | Total]
end, [], Spec),
lists:reverse(Res).
-spec find_longest_field(list(), pos_integer()) -> non_neg_integer().
find_longest_field(Rows, ColumnNo) ->
lists:foldl(fun(Row, Longest) ->
erlang:max(Longest,
field_length(lists:nth(ColumnNo,Row)))
end, 0, Rows).
-spec max_widths([term()]) -> list(pos_integer()).
max_widths([Row]) ->
field_lengths(Row);
max_widths([Row1 | Rest]) ->
Row1Lengths = field_lengths(Row1),
lists:foldl(fun(Row, Acc) ->
Lengths = field_lengths(Row),
[max(A, B) || {A, B} <- lists:zip(Lengths, Acc)]
end, Row1Lengths, Rest).
-spec row(list(), list(string())) -> iolist().
row(Spec, Row0) ->
%% handle multiline fields
Rows = expand_row(Row0),
[
[ $| | lists:reverse(
["\n" | lists:foldl(fun({{_, Size}, Str}, Acc) ->
[align(Str, Size) | Acc]
end, [], lists:zip(Spec, Row))])] || Row <- Rows].
-spec titles(list()) -> iolist().
titles(Spec) ->
[ $| | lists:reverse(
["\n" | lists:foldl(fun({Title, Size}, TitleRow) ->
[align(Title, Size) | TitleRow]
end, [], Spec)])].
-spec align(string(), non_neg_integer()) -> iolist().
align(undefined, Size) ->
align("", Size);
align(Str, Size) when is_integer(Str) ->
align(integer_to_list(Str), Size);
align(Str, Size) when is_float(Str) ->
%% cuttlefish provides 6 decimals; we will too.
align(float_to_list(Str, [{decimals, 6}, compact]), Size);
align(Str, Size) when is_binary(Str) ->
align(unicode:characters_to_list(Str, utf8), Size);
align(Str, Size) when is_atom(Str) ->
align(atom_to_list(Str), Size);
align(Str, Size) when is_list(Str), length(Str) >= Size ->
Truncated = lists:sublist(Str, Size),
Truncated ++ "|";
align(Str, Size) when is_list(Str) ->
string:centre(Str, Size) ++ "|";
align(Term, Size) ->
Str = lists:flatten(io_lib:format("~p", [Term])),
align(Str, Size).
-spec vertical_border(list(tuple())) -> string().
vertical_border(Spec) ->
lists:reverse([$\n, [[char_seq(Length, $-), $+] ||
{_Name, Length} <- Spec], $+]).
-spec char_seq(non_neg_integer(), char()) -> string().
char_seq(Length, Char) ->
[Char || _ <- lists:seq(1, Length)].
field_lengths(Row) ->
[field_length(Field) || Field <- Row].
field_length(Field) when is_atom(Field) ->
field_length(atom_to_list(Field));
field_length(Field) when is_binary(Field) ->
field_length(unicode:characters_to_list(Field, utf8));
field_length(Field) when is_list(Field) ->
Lines = string:tokens(lists:flatten(Field), "\n"),
lists:foldl(fun(Line, Longest) ->
erlang:max(Longest,
length(Line))
end, 0, Lines);
field_length(Field) ->
field_length(io_lib:format("~p", [Field])).
expand_field(Field) when is_atom(Field) ->
expand_field(atom_to_list(Field));
expand_field(Field) when is_binary(Field) ->
expand_field(unicode:characters_to_list(Field, utf8));
expand_field(Field) when is_list(Field) ->
string:tokens(lists:flatten(Field), "\n");
expand_field(Field) ->
expand_field(io_lib:format("~p", [Field])).
expand_row(Row) ->
{ExpandedRow, MaxHeight} = lists:foldl(fun(Field, {Fields, Max}) ->
EF = expand_field(Field),
{[EF|Fields], erlang:max(Max, length(EF))}
end, {[], 0}, lists:reverse(Row)),
PaddedRow = [pad_field(Field, MaxHeight) || Field <- ExpandedRow],
[ [ lists:nth(N, Field) || Field <- PaddedRow]
|| N <- lists:seq(1, MaxHeight)].
pad_field(Field, MaxHeight) when length(Field) < MaxHeight ->
Field ++ ["" || _ <- lists:seq(1, MaxHeight - length(Field))];
pad_field(Field, _MaxHeight) ->
    Field.
%%% End of file: src/clique_table.erl
-module(day8).
-export([solve_part1/1, solve_part2/1]).
% for tests
-export([parse/1, detect_loop/1, fix/1]).
%%% solution
solve_part1(Text) ->
Instructions = parse(Text),
{loops, Acc} = detect_loop(Instructions),
Acc.
solve_part2(Text) ->
Instructions = parse(Text),
fix(Instructions).
%%% internals
%%% Parsing Input
%% Transform the input text into a non-empty list of instructions.
-type operation() :: nop | acc | jmp.
-type argument() :: integer().
-type instruction() :: {operation(), argument()}.
-type instructions() :: [instruction()].
-spec parse(Text) ->
Instructions when
Text :: string(),
Instructions :: instructions().
parse(Text) ->
Lines = string:lexemes(Text, "\n"),
[parse_line(Line) || Line <- Lines].
%% Convert a single line of input into an instruction
-spec parse_line(Line) ->
Instruction when
Line :: string(),
Instruction :: instruction().
parse_line(Line) ->
[OperationStr, ArgumentStr] = string:lexemes(Line, " "),
ArgumentInt = list_to_integer(ArgumentStr),
case OperationStr of
"nop" -> {nop, ArgumentInt};
"acc" -> {acc, ArgumentInt};
"jmp" -> {jmp, ArgumentInt}
end.
%%% Part 1
%% detects the value of the accumulator right before an instruction would be
%% executed a second time
-type behaviour() :: halts | loops.
-spec detect_loop(Instructions) ->
DetectionResult when
Instructions :: instructions(),
DetectionResult :: {behaviour(), integer()}.
detect_loop(Instructions) ->
detect_loop(Instructions, _CurrentInstruction=1,
_Visited=sets:new(), _Acc=0).
-spec detect_loop(Instructions, CurrentInstruction, Visited, Acc) ->
DetectionResult when
Instructions :: instructions(),
CurrentInstruction :: pos_integer(),
Visited :: sets:set(),
Acc :: integer(),
DetectionResult :: {behaviour(), integer()}.
detect_loop(Instructions, CurrentInstruction, _, Acc)
when CurrentInstruction =:= length(Instructions) + 1 ->
{halts, Acc};
detect_loop(Instructions, CurrentInstruction, Visited, Acc) ->
case sets:is_element(CurrentInstruction, Visited) of
true -> {loops, Acc};
false ->
NewVisited = sets:add_element(CurrentInstruction, Visited),
{Operation, Argument} = lists:nth(CurrentInstruction,
Instructions),
case Operation of
nop -> detect_loop(Instructions, CurrentInstruction + 1,
NewVisited, Acc);
acc -> detect_loop(Instructions, CurrentInstruction + 1,
NewVisited, Acc + Argument);
jmp -> detect_loop(Instructions,
CurrentInstruction + Argument,
NewVisited, Acc)
end
end.
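%% A small sketch of detect_loop/1 on hand-written instruction lists (the
%% same shape parse/1 produces):
%%
%%   1> day8:detect_loop([{nop, 0}, {jmp, -1}]).
%%   {loops, 0}
%%   2> day8:detect_loop([{acc, 5}, {nop, 0}]).
%%   {halts, 5}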
%%% Part 2
%% Changes a nop to jmp, or the other way around, to make the program
%% terminate; outputs the value of Acc after the last instruction
-spec fix(Instructions) ->
Acc when
Instructions :: instructions(),
Acc :: integer().
fix(Instructions) ->
fix(Instructions, _Current=1).
-spec fix(Instructions, Current) ->
Acc when
Instructions :: instructions(),
Current :: pos_integer(),
Acc :: integer().
fix(Instructions, Current) ->
Mutated = replace(Instructions, Current),
case detect_loop(Mutated) of
{halts, Val} -> Val;
{loops, _Val} -> fix(Instructions, Current + 1)
end.
replace(Instructions, Index) ->
Instruction = lists:nth(Index, Instructions),
Replacement = case Instruction of
{nop, Arg} -> {jmp, Arg};
{jmp, Arg} -> {nop, Arg};
Any -> Any
end,
    lists:sublist(Instructions, Index - 1) ++ [Replacement] ++ lists:nthtail(Index, Instructions).
%%% End of file: src/day8.erl
%%%-------------------------------------------------------------------
%%% @doc A static cache storage for Erlang.
%%% Currently consists of two compilers:
%%% ASM - Used for large lists as value. Read the module docs
%%% for more info.
%%%
%%% Syntax - Used for all other cases.
%%%
%%% BEAM - This is experimental and only works for OTP20+
%%% @end
%%%-------------------------------------------------------------------
-module(haki).
-include("internal.hrl").
-include("types.hrl").
-export([
cache/2,
cache/3,
cache_bucket/2,
cache_bucket/3,
get/1,
get/2,
load_snapshot/1,
load_snapshot/2
]).
%% @doc Creates a new module with given Key and stores the Value
%% @end
-spec cache(cache_key(), cache_value()) -> ok | {error, any()}.
cache(Key, Val) ->
cache(Key, Val, ?DEFAULT_CACHE_OPTIONS).
%% @doc Creates a new module with given Key and stores the Value while
%% forcing the compiler that is used to create the module.
%% @end
-spec cache(cache_key(), cache_value(), cache_options()) -> ok | {error, any()}.
cache(Key, Val, Options) ->
?timed(cache,
begin
FilledOptions = maps:merge(?DEFAULT_CACHE_OPTIONS, Options),
haki_compiler:compile(Key, Val, FilledOptions)
end
).
%% @doc Creates a new module named after the bucket, each key/value pair of the
%% given map will be separately retrievable by get/2.
%% @end
-spec cache_bucket(cache_bucket_name(), cache_bucket_value()) -> ok | {error, any()}.
cache_bucket(Bucket, Map) ->
cache_bucket(Bucket, Map, ?DEFAULT_CACHE_OPTIONS).
%% @doc Creates a new module named after the bucket, each key/value pair of the
%% given map will be separately retrievable by get/2.
%% @end
-spec cache_bucket(cache_bucket_name(), cache_bucket_value(), cache_options()) -> ok | {error, any()}.
cache_bucket(Bucket, Map, Options) ->
?timed(cache,
begin
FilledOptions = maps:merge(?DEFAULT_CACHE_OPTIONS, Options),
haki_compiler:compile_bucket(Bucket, Map, FilledOptions)
end
).
%% @doc Retrieves the value for the given Key, by finding the module name
%% and calling get/0 on it.
%% @end
-spec get(cache_key()) -> cache_value() | bad_key.
get(Key) ->
?timed(get,
begin
Mod = haki_compiler:mod_name(Key),
case module_loaded(Mod) of
false ->
bad_key;
_ ->
Mod:get()
end
end
).
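%% A typical round trip, assuming compilation succeeds (the module that
%% backs the key is generated internally via haki_compiler:mod_name/1):
%%
%%   1> haki:cache(config, #{port => 8080}).
%%   ok
%%   2> haki:get(config).
%%   #{port => 8080}
%%   3> haki:get(missing).
%%   bad_key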
%% @doc Retrieves the value for the given Key in the given bucket, by finding the module
%% name and calling get/1 on it.
%% @end
-spec get(cache_bucket_name(), cache_key()) -> cache_value() | bad_key | bad_bucket.
get(Bucket, Key) ->
?timed(get,
begin
Mod = haki_compiler:mod_name(Bucket),
case module_loaded(Mod) of
false ->
bad_bucket;
_ ->
Mod:get(Key)
end
end
).
%% @doc Loads a cached key snapshot from the binary file.
%% @end
-spec load_snapshot(cache_key()) -> {module, module()} | {error, any()}.
load_snapshot(Key) ->
?timed(load_snapshot,
begin
ModName = haki_compiler:mod_name(Key),
code:load_file(ModName)
end
).
%% @doc Loads a cached key snapshot from the binary file given a path.
%% @end
-spec load_snapshot(string(), cache_key()) -> {module, module()} | {error, any()}.
load_snapshot(Path, Key) ->
?timed(load_snapshot,
begin
code:add_pathz(Path),
ModName = haki_compiler:mod_name(Key),
code:load_file(ModName)
end
    ).
%%% End of file: src/haki.erl
%% Copyright (c) 2011-2013 <NAME>
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%% File : lfe_bits.erl
%%% Author : <NAME>
%%% Purpose : Lisp Flavoured Erlang common functions for binaries.
-module(lfe_bits).
-export([parse_bitspecs/1,get_bitspecs/1]).
%% The standard imports
-import(lists, [foldl/3]).
%% Everything default'ed.
-record(spec, {type=default,size=default,unit=default,
sign=default,endian=default}).
%% get_bitspecs(Specs) -> {ok,Size,{Type,Unit,Sign,End}} | {error,Error}.
%% Parse a bitspec, apply defaults and return data. The size field is
%% unevaluated. We only return the first error found.
get_bitspecs(Specs) ->
try
#spec{type=Ty0,size=Sz0,unit=Un0,sign=Si0,endian=En0} =
parse_bitspecs(Specs, #spec{}),
{Ty,Sz,Un,Si,En} = apply_defaults(Ty0, Sz0, Un0, Si0, En0),
{ok,Sz,{Ty,Un,Si,En}}
catch
throw:Error -> Error
end.
%% parse_bitspecs(Specs) -> {ok,Size,{Type,Unit,Sign,End}} | {error,Error}.
%% Parse a bitspec and return data. Unmentioned fields get the value
%% default. We only return the first error found.
parse_bitspecs(Specs) ->
case catch parse_bitspecs(Specs, #spec{}) of
#spec{type=Ty,size=Sz,unit=Un,sign=Si,endian=En} ->
{ok,Sz,{Ty,Un,Si,En}};
Error -> Error
end.
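%% For example (the size field is returned unevaluated and the unit is left
%% as `default' when unspecified):
%%
%%   1> lfe_bits:parse_bitspecs([integer, [size, 16], little, signed]).
%%   {ok, 16, {integer, default, signed, little}}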
%% parse_bitspecs(Specs, #spec{}) -> #spec{}.
%% Parse a bitspec and return a #spec{} record. Unmentioned fields get
%% the value default. Errors throw the tuple {error,Error} and must be
%% caught.
parse_bitspecs(Ss, Sp0) ->
foldl(fun (S, Sp) -> parse_bitspec(S, Sp) end, Sp0, Ss).
%% parse_bitspec(Spec, #spec{}) -> #spec{}.
%% We also convert synonyms to the standard value.
%% Types.
parse_bitspec(integer, Sp) -> Sp#spec{type=integer};
parse_bitspec(float, Sp) -> Sp#spec{type=float};
parse_bitspec(binary, Sp) -> Sp#spec{type=binary,unit=8};
parse_bitspec(bytes, Sp) -> Sp#spec{type=binary,unit=8};
parse_bitspec(bitstring, Sp) -> Sp#spec{type=binary,unit=1};
parse_bitspec(bits, Sp) -> Sp#spec{type=binary,unit=1};
%% Unicode types.
parse_bitspec('utf8', Sp) -> Sp#spec{type=utf8};
parse_bitspec('utf-8', Sp) -> Sp#spec{type=utf8};
parse_bitspec('utf16', Sp) -> Sp#spec{type=utf16};
parse_bitspec('utf-16', Sp) -> Sp#spec{type=utf16};
parse_bitspec('utf32', Sp) -> Sp#spec{type=utf32};
parse_bitspec('utf-32', Sp) -> Sp#spec{type=utf32};
%% Endianness.
parse_bitspec('big', Sp) -> Sp#spec{endian=big};
parse_bitspec('big-endian', Sp) -> Sp#spec{endian=big};
parse_bitspec('little', Sp) -> Sp#spec{endian=little};
parse_bitspec('little-endian', Sp) -> Sp#spec{endian=little};
parse_bitspec('native', Sp) -> Sp#spec{endian=native};
parse_bitspec('native-endian', Sp) -> Sp#spec{endian=native};
%% Sign.
parse_bitspec(signed, Sp) -> Sp#spec{sign=signed};
parse_bitspec(unsigned, Sp) -> Sp#spec{sign=unsigned};
%% Size and unit, return these as is.
parse_bitspec([size,S], Sp) -> Sp#spec{size=S};
parse_bitspec([unit,U], Sp) when is_integer(U), U > 0, U =< 256 ->
Sp#spec{unit=U};
parse_bitspec(Spec, _) -> throw({error,{undefined_bittype,Spec}}).
%% apply_defaults(Type, Size, Unit, Sign, Endian) ->
%% {Type,Size,Unit,Sign,Endian}.
%% This is taken almost directly from erl_bits.erl.
%% Default type.
apply_defaults(default, Sz, Un, Si, En) ->
apply_defaults(integer, Sz, Un, Si, En);
%% Default size.
apply_defaults(binary, default, Un, Si, En) ->
apply_defaults(binary, all, Un, Si, En);
apply_defaults(integer, default, Un, Si, En) ->
check_unit(Un),
apply_defaults(integer, 8, Un, Si, En);
apply_defaults(utf8, default, Un, Si, En) ->
apply_defaults(utf8, undefined, Un, Si, En);
apply_defaults(utf16, default, Un, Si, En) ->
apply_defaults(utf16, undefined, Un, Si, En);
apply_defaults(utf32, default, Un, Si, En) ->
apply_defaults(utf32, undefined, Un, Si, En);
apply_defaults(float, default, Un, Si, En) ->
check_unit(Un),
apply_defaults(float, 64, 1, Si, En);
%% Default unit.
apply_defaults(binary, Sz, default, Si, En) ->
apply_defaults(binary, Sz, 8, Si, En);
apply_defaults(integer, Sz, default, Si, En) ->
apply_defaults(integer, Sz, 1, Si, En);
apply_defaults(float, Sz, default, Si, En) ->
apply_defaults(float, Sz, 1, Si, En);
%% Default sign.
apply_defaults(Ty, Sz, Un, default, En) ->
apply_defaults(Ty, Sz, Un, unsigned, En);
%% Default endian.
apply_defaults(Ty, Sz, Un, Si, default) ->
apply_defaults(Ty, Sz, Un, Si, big);
%% Done.
apply_defaults(Ty, Sz, Un, Si, En) ->
{Ty,Sz,Un,Si,En}.
check_unit(default) -> ok;
check_unit(_) -> throw({error,bittype_unit}).
%%% End of file: src/lfe_bits.erl
-module(graph_traversal).
-include("records.hrl").
%% API
-export([traverse/2, temp_traverse/2]).
%% @doc
%% traverse/2 represents the main line of looking for a match - the one that was spawned first in a given file
%% part and will try to find match starting with each text character as the first character of the potential match.
%% If a match on traverse/2 fails, the matcher is reset and traversing continues.
traverse(Matcher, <<>>) -> {ok, end_of_input, Matcher};
traverse(Matcher, Text = <<Char, _Rest/bitstring>>) ->
searcher:notify_if_match_found(Matcher),
Matchers = traverse_step(Matcher, Char),
case length(Matchers) of
0 ->
{ResetMatcher, NextText} = reset(Matcher, Text),
traverse(ResetMatcher, NextText);
_More ->
diverge(tl(Matchers), Text),
traverse(hd(Matchers), next_text(hd(Matchers), Text))
end.
%% @doc
%% temp_traverse/2 represents the diverged lines of looking for a match - the ones that were spawned in states with
%% more than 1 emanating edge. If a match on spawned traversal fails, the process job is finished.
temp_traverse(Matcher, <<>>) -> {ok, end_of_input, Matcher};
temp_traverse(Matcher, Text = <<Char, _Rest/bitstring>>) ->
searcher:notify_if_match_found(Matcher),
Matchers = traverse_step(Matcher, Char),
case length(Matchers) of
0 -> {ok, finished, Matcher};
_More ->
diverge(tl(Matchers), Text),
temp_traverse(hd(Matchers), next_text(hd(Matchers), Text))
end.
%% @doc
%% Generates the list of matchers that can be created from current state by travelling along one of the emanating paths.
traverse_step(M, Char) ->
Graph = (M#matcher.graph)#graph.nfa,
RawEdges = digraph:out_edges(Graph, {vertex, M#matcher.current_state}),
Edges = lists:map(fun(Edge) -> digraph:edge(Graph, Edge) end, RawEdges),
PossibleTransitions = lists:filtermap(fun(Edge) -> is_traversable(Edge, Char) end, Edges),
_Matchers = lists:map(fun(Transition) -> create_next_state_matcher(M, Transition) end, PossibleTransitions).
%% @doc
%% Checks if given edge can be traversed on given character (Label).
is_traversable({_Edge, _From, {vertex, ToState}, eps}, _Label) -> {true, {eps, ToState}};
is_traversable({_Edge, _From, {vertex, ToState}, Label}, Label) -> {true, {Label, ToState}};
is_traversable(_Edge, _Label) -> false.
%% @doc
%% Creates next state of matcher after it traverses along the given edge.
create_next_state_matcher(M, {eps, NextState}) ->
#matcher{graph = M#matcher.graph,
current_state = NextState,
matched = M#matcher.matched,
read = false,
part_id = M#matcher.part_id};
create_next_state_matcher(M, {AcceptedChar, NextState}) ->
#matcher{graph = M#matcher.graph,
current_state = NextState,
matched = [AcceptedChar | M#matcher.matched],
read = true,
part_id = M#matcher.part_id}.
%% @doc
%% Resets the matcher when matching failed, so it resumes searching from the
%% character after the one at which the failed match started.
reset(M, Text = <<_Char, Rest/bitstring>>) ->
Matched = lists:reverse(M#matcher.matched),
case length(Matched) of
0 -> NextText = Rest;
_More ->
<<_H, RestoredInput/bitstring>> = list_to_binary(Matched),
NextText = <<RestoredInput/bitstring, Text/bitstring>>
end,
ResetMatcher = M#matcher{current_state = (M#matcher.graph)#graph.entry,
matched = [],
read = false},
{ResetMatcher, NextText}.
%% @doc
%% Spawns searching processes on divergence points of the graph.
diverge(Matchers, Text) ->
lists:foreach(fun(Matcher) -> workers_manager:spawn_worker(Matcher, next_text(Matcher, Text)) end, Matchers).
%% @doc
%% Drops first character from the analyzed text if the matcher recently read it.
next_text(M, Text = <<_H, Rest/bitstring>>) ->
case M#matcher.read of
true -> Rest;
false -> Text
    end.
%%% End of file: apps/grepper/src/graph_traversal.erl
%%%-------------------------------------------------------------------
%%% @author <NAME> <<EMAIL>>
%%% @copyright (c) WhatsApp Inc. and its affiliates. All rights reserved.
%%% @doc
%%% Implements Erlang interpreter via eval/3 function.
%%%
%%% Allows to `call' functions that are not exported by interpreting
%%% function code.
%%%
%%% For cases when evaluation is too slow, it's possible to recompile
%%% a module and hot-code-load it using `export/1,2,3', and later
%%% revert the change with `revert/1'.
%%% @end
%%%-------------------------------------------------------------------
-module(power_shell).
-author("<EMAIL>").
%% API
-export([
eval/3,
eval/4,
export/1,
export/2,
export/3,
revert/1
]).
%% Internal exports
-export([sentinel/3]).
%%--------------------------------------------------------------------
%% API
%% @doc Performs erlang:apply(Module, Fun, Args) by evaluating AST
%% of Module:Fun.
%% @param Module Module name, must be either loaded or discoverable with code:which() or filelib:find_source()
%% @param Fun function name, may not be exported
%% @param Args List of arguments
-spec eval( Module :: module(), Fun :: atom(), Args :: [term()]) ->
term().
eval(Mod, Fun, Args) when is_atom(Mod), is_atom(Fun), is_list(Args) ->
eval_apply(erlang:is_builtin(Mod, Fun, length(Args)), Mod, Fun, Args, undefined).
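%% Example call, where my_mod:private_fun/1 is a hypothetical unexported
%% function (the module only needs to be discoverable, not loaded):
%%
%%   1> power_shell:eval(my_mod, private_fun, [42]).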
%% @doc Performs erlang:apply(Module, Fun, Args) by evaluating AST
%% of Module:Fun.
%% @param Module Module name, must be either loaded or discoverable with code:which() or filelib:find_source()
%% @param Fun function name, may not be exported
%% @param Args List of arguments
%% @param FunMap AST of all functions defined in Mod, as returned by power_shell_cache:get_module(Mod)
%% Should be used if starting power_shell_cache gen_server is undesirable.
-spec eval(Module :: module(), Fun :: atom(), Args :: [term()], power_shell_cache:function_map()) ->
term().
eval(Mod, Fun, Args, FunMap) ->
eval_apply(erlang:is_builtin(Mod, Fun, length(Args)), Mod, Fun, Args, FunMap).
%% @equiv export(Module, all, #{})
-spec export(Module :: module()) -> pid().
export(Mod) ->
export(Mod, all).
%% @equiv export(Module, Export, #{})
-spec export(Module :: module(), Export :: all | [{Fun :: atom(), Arity :: non_neg_integer()}]) -> pid().
export(Mod, Export) ->
export(Mod, Export, #{}).
-type export_options() :: #{
link => boolean()
}.
%% @doc Retrieves code (AST) of the `Module' from the debug information chunk. Exports all or selected
%% functions and reloads the module.
%% A sentinel process is created for every `export' call, linked to the caller process. When
%% the calling process terminates, linked sentinel loads the original module back and
%% terminates. Sentinel process can be stopped gracefully by calling `revert(Sentinel)'.
%% There is no protection against multiple `export' calls interacting, and the caller is
%% responsible for proper synchronisation. Use `eval' when no safe sequence can be found.
%% @param Module Module name, must be either loaded or discoverable with code:which()
%% @param Export list of tuples `{Fun, Arity}' to be exported, or `all' to export all functions.
%% @param Options use `#{link => false}' to start the sentinel process unlinked, assuming manual
%% `revert' call.
-spec export(Module :: module(), all | [{Fun :: atom(), Arity :: non_neg_integer()}],
export_options()) -> pid().
export(Mod, Export, #{link := false}) ->
proc_lib:start(?MODULE, sentinel, [self(), Mod, Export]);
export(Mod, Export, _Options) ->
proc_lib:start_link(?MODULE, sentinel, [self(), Mod, Export]).
%% @doc Gracefully stops the sentinel process, causing the original module to be loaded back.
%% @param Sentinel process to stop.
-spec revert(Sentinel :: pid()) -> ok.
revert(Sentinel) ->
proc_lib:stop(Sentinel).
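%% Example workflow (a sketch; `mymod' and its non-exported `priv_fun/1'
%% are hypothetical placeholders):
%%
%% Sentinel = power_shell:export(mymod),
%% Result = mymod:priv_fun(42), %% callable directly while exported
%% ok = power_shell:revert(Sentinel). %% original module is loaded back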
%%--------------------------------------------------------------------
%% Internal functions
%% Sentinel process
sentinel(Parent, Mod, Export) ->
Options = proplists:get_value(options, Mod:module_info(compile), []),
{Mod, Binary, File} = code:get_object_code(Mod),
{ok, {Mod, [{abstract_code, {_, Forms}}]}} = beam_lib:chunks(Binary, [abstract_code]),
Expanded = erl_expand_records:module(Forms, [strict_record_tests]),
{ok, Mod, Bin} = make_export(Expanded, Options, Export),
%% trap exits before loading the code
erlang:process_flag(trap_exit, true),
{module, Mod} = code:load_binary(Mod, File, Bin),
proc_lib:init_ack(Parent, self()),
receive
{'EXIT', Parent, _Reason} ->
{module, Mod} = code:load_binary(Mod, File, Binary);
{system, Reply, {terminate, _Reason}} ->
{module, Mod} = code:load_binary(Mod, File, Binary),
gen:reply(Reply, ok)
end.
make_export(Forms, Options, all) ->
compile:forms(Forms, [export_all, binary, debug_info] ++ Options);
make_export(Forms, Options, Exports) ->
Exported = insert_export(Forms, [], Exports),
compile:forms(Exported, [binary, debug_info] ++ Options).
%% Don't handle modules that export nothing; crash instead, since such
%% modules aren't useful anyway.
insert_export([{attribute, Anno, export, _} = First | Tail], Passed, Exports) ->
ExAnno = erl_anno:new(erl_anno:line(Anno)),
lists:reverse(Passed) ++ [First, {attribute, ExAnno, export, Exports} | Tail];
insert_export([Form | Tail], Passed, Exports) ->
insert_export(Tail, [Form | Passed], Exports).
%%--------------------------------------------------------------------
%% Evaluator
-define (STACK_TOKEN, '$power_shell_stack_trace').
eval_apply(true, Mod, Fun, Args, _FunMap) ->
erlang:apply(Mod, Fun, Args);
eval_apply(false, Mod, Fun, Args, FunMap0) ->
put(?STACK_TOKEN, []),
try
FunMap = if FunMap0 =:= undefined -> power_shell_cache:get_module(Mod); true -> FunMap0 end,
eval_impl(Mod, Fun, Args, FunMap)
catch
error:enoent ->
{current_stacktrace, Trace} = process_info(self(), current_stacktrace),
erlang:raise(error, undef, [{Mod, Fun, Args, []}] ++ tl(Trace));
throw:Reason ->
{current_stacktrace, Trace} = process_info(self(), current_stacktrace),
% recover stack from process dictionary
pop_stack(),
erlang:raise(throw, Reason, get_stack() ++ tl(Trace));
Class:Reason ->
{current_stacktrace, Trace} = process_info(self(), current_stacktrace),
% recover stack from process dictionary
erlang:raise(Class, Reason, get_stack() ++ tl(Trace))
after
erase(?STACK_TOKEN)
end.
push_stack(MFA) ->
put(?STACK_TOKEN, [MFA | get(?STACK_TOKEN)]).
pop_stack() ->
put(?STACK_TOKEN, tl(get(?STACK_TOKEN))).
get_stack() ->
get(?STACK_TOKEN).
%[{M, F, length(Args), Dbg} || {M, F, Args, Dbg} <- Stack].
%[hd(Stack) | [{M, F, length(Args)} || {M, F, Args} <- tl(Stack)]].
eval_impl(Mod, Fun, Args, FunMap) ->
Arity = length(Args),
case maps:get({Fun, Arity}, FunMap, undefined) of
undefined ->
push_stack({Mod, Fun, Args, []}),
erlang:raise(error, undef, [{Mod, Fun, Args, []}]);
{function, _, Fun, Arity, Clauses} ->
case power_shell_eval:match_clause(Clauses, Args, power_shell_eval:new_bindings(), local_fun_handler(Mod, FunMap)) of
{Body, Bindings} ->
% find line number by reverse scan of clauses, in {clause,_,H,G,B}
%%% UGLY %%% - See TODO for eval_exprs
{clause, _Line, _, _, Body} = lists:keyfind(Body, 5, Clauses),
%NewLocalStack = [{Mod, Fun, Arity, [{line, Line}]}],
push_stack({Mod, Fun, length(Args), []}),
% power_shell_eval:exprs() does not allow getting the value of the last expr only
% power_shell_eval:exprs(Body, Bindings, local_fun_handler(Mod, FunMap)),
R = eval_exprs(Body, Bindings,
local_fun_handler(Mod, FunMap),
non_local_fun_handler()),
pop_stack(),
R;
nomatch ->
push_stack({Mod, Fun, Args, []}),
erlang:raise(error, function_clause, [{Mod, Fun, Args, []}])
end
end.
%% TODO: find a way to supply Line Number (every Expr has it) for Stack purposes
eval_exprs([Expr], Bindings, LocalFun, NonLocalFun) ->
power_shell_eval:expr(Expr, Bindings, LocalFun, NonLocalFun, value);
eval_exprs([Expr | Exprs], Bindings, LocalFun, NonLocalFun) ->
{value, _Value, NewBindings} = power_shell_eval:expr(Expr, Bindings, LocalFun, NonLocalFun),
eval_exprs(Exprs, NewBindings, LocalFun, NonLocalFun).
local_fun_handler(Mod, FunMap) ->
{
value,
fun ({function, Fun, Arity}, []) ->
% extract body for this locally-defined function
% requires patched erl_eval, thus a need for power_shell_eval.
% we rely on the fact that AST has been compiled without errors,
% which means local functions are surely defined
FunArity = {Fun, Arity},
#{FunArity := {function, _, Fun, Arity, Clauses}} = FunMap,
Clauses;
(Fun, Args) ->
eval_impl(Mod, Fun, Args, FunMap)
end
}.
%% The non-local handler serves the following purpose:
% * catch exceptions thrown and fix up the stack
non_local_fun_handler() ->
{
value, fun do_apply/2
}.
do_apply({M, F}, A) ->
push_stack({M, F, length(A), []}),
Ret = try
erlang:apply(M, F, A)
catch
error:undef:Stack ->
pop_stack(),
push_stack({M, F, A, []}),
erlang:raise(error, undef, Stack)
end,
pop_stack(),
Ret;
do_apply(Fun, A) ->
[{Mod, Name, _, _} | _] = get_stack(), % assume that anonymous fun comes from the same module & function
Lambda = list_to_atom(atom_to_list(Name) ++ "-fun-$"),
push_stack({Mod, Lambda, length(A), []}),
Ret = erlang:apply(Fun, A),
pop_stack(),
Ret.
%% (source: src/power_shell.erl)
%% -----------------------------------------------------------------------------
%%
%% The MIT License (MIT)
%%
%% Copyright (c) 2015 <NAME>
%%
%% Permission is hereby granted, free of charge, to any person obtaining a copy
%% of this software and associated documentation files (the "Software"), to deal
%% in the Software without restriction, including without limitation the rights
%% to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
%% copies of the Software, and to permit persons to whom the Software is
%% furnished to do so, subject to the following conditions:
%%
%% The above copyright notice and this permission notice shall be included in all
%% copies or substantial portions of the Software.
%%
%% THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
%% IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
%% FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
%% AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
%% LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
%% OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
%% SOFTWARE.
%%
%% -----------------------------------------------------------------------------
%%
%% @author <NAME> <<EMAIL>>
%% @doc Example digraph implementation as list of edges
%%
%% This example module shows how to implement a custom digraph. It is
%% meant to be the simplest possible digraph module.
%%
%% @copyright 2015 <NAME>
%% @end
%%
%% -----------------------------------------------------------------------------
-module(list_digraph).
-behaviour(gen_digraph).
-export([new/0]).
-export([ from_list/1
, to_list/1
, edges/1
, no_edges/1
, vertices/1
, no_vertices/1
, in_neighbours/2
, out_neighbours/2
, in_degree/2
, out_degree/2
, sources/1
, sinks/1
, delete/1
, is_edge/3
, is_path/2
, get_path/3
, get_cycle/2
, get_short_path/3
, get_short_cycle/2
, has_path/3
, has_cycle/2
, reachable/2
, reachable_neighbours/2
, reaching/2
, reaching_neighbours/2
, components/1
, strong_components/1
, preorder/1
, is_acyclic/1
, postorder/1
, topsort/1
, condensation/1
]).
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
-endif.
%% -----------------------------------------------------------------------------
%% API
%% -----------------------------------------------------------------------------
new() -> {?MODULE, {[], []}}.
%% -----------------------------------------------------------------------------
%% Callbacks
%% -----------------------------------------------------------------------------
from_list(L) ->
Vs = [ V || E <- L, V <- case E of
{V} -> [V];
{V1, V2} -> [V1, V2];
_ -> error(badarg)
end ],
Es = [ E || {_, _} = E <- L ],
{?MODULE, {lists:usort(Vs), lists:sort(Es)}}.
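%% Example (vertices are collected both from edge tuples and from
%% singleton `{V}' tuples):
%%
%% 1> G = list_digraph:from_list([{a, b}, {b, c}, {d}]).
%% 2> list_digraph:vertices(G).
%% [a,b,c,d]
%% 3> list_digraph:edges(G).
%% [{a,b},{b,c}]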
to_list(G) -> gen_digraph:gen_to_list(G).
edges({_, {_, Es}}) -> Es.
no_edges(G) -> gen_digraph:gen_no_edges(G).
vertices({_, {Vs, _}}) -> Vs.
no_vertices(G) -> gen_digraph:gen_no_vertices(G).
in_neighbours(G, V) -> gen_digraph:gen_in_neighbours(G, V).
out_neighbours(G, V) -> gen_digraph:gen_out_neighbours(G, V).
in_degree(G, V) -> gen_digraph:gen_in_degree(G, V).
out_degree(G, V) -> gen_digraph:gen_out_degree(G, V).
sources(G) -> gen_digraph:gen_sources(G).
sinks(G) -> gen_digraph:gen_sinks(G).
delete(_) -> true.
is_edge(G, V1, V2) -> gen_digraph:gen_is_edge(G, V1, V2).
is_path(G, P) -> gen_digraph:gen_is_path(G, P).
get_path(G, V1, V2) -> gen_digraph:gen_get_path(G, V1, V2).
get_cycle(G, V) -> gen_digraph:gen_get_cycle(G, V).
get_short_path(G, V1, V2) -> gen_digraph:gen_get_short_path(G, V1, V2).
get_short_cycle(G, V) -> gen_digraph:gen_get_short_cycle(G, V).
has_path(G, V1, V2) -> gen_digraph:gen_has_path(G, V1, V2).
has_cycle(G, V) -> gen_digraph:gen_has_cycle(G, V).
reachable(G, Vs) -> gen_digraph:gen_reachable(G, Vs).
reachable_neighbours(G, Vs) -> gen_digraph:gen_reachable_neighbours(G, Vs).
reaching(G, Vs) -> gen_digraph:gen_reaching(G, Vs).
reaching_neighbours(G, Vs) -> gen_digraph:gen_reaching_neighbours(G, Vs).
components(G) -> gen_digraph:gen_components(G).
strong_components(G) -> gen_digraph:gen_strong_components(G).
preorder(G) -> gen_digraph:gen_preorder(G).
is_acyclic(G) -> gen_digraph:gen_is_acyclic(G).
postorder(G) -> gen_digraph:gen_postorder(G).
topsort(G) -> gen_digraph:gen_topsort(G).
condensation(G) -> gen_digraph:gen_condensation(G).
%% -----------------------------------------------------------------------------
%% Tests
%% -----------------------------------------------------------------------------
-ifdef(TEST).
gen_properties_test_() ->
gen_digraph:gen_properties_tests(?MODULE).
gen_tests_test_() ->
gen_digraph:gen_tests(?MODULE).
-endif. %% TEST
%% (source: src/list_digraph.erl)
% Licensed under the Apache License, Version 2.0 (the "License"); you may not
% use this file except in compliance with the License. You may obtain a copy of
% the License at
%
% http://www.apache.org/licenses/LICENSE-2.0
%
% Unless required by applicable law or agreed to in writing, software
% distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
% WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
% License for the specific language governing permissions and limitations under
% the License.
% This module implements rate limiting based on a variation of the additive
% increase / multiplicative decrease feedback control algorithm.
%
% https://en.wikipedia.org/wiki/Additive_increase/multiplicative_decrease
%
% This is an adaptive algorithm which converges on the available channel
% capacity, where no participant (client) knows the capacity a priori, and
% participants don't communicate or know about each other (so they
% don't coordinate to divide the capacity among themselves).
%
% The algorithm referenced above estimates a rate, whereas the implemented
% algorithm uses an interval (in milliseconds). It preserves the original
% semantics, that is the failure part is multiplicative and the success part is
% additive. The relationship between rate and interval is: rate = 1000 /
% interval.
%
% There are two main API functions:
%
% success(Key) -> IntervalInMilliseconds
% failure(Key) -> IntervalInMilliseconds
%
% Key is any term, typically something like {Method, Url}. The result from the
% function is the current period value. Caller then might decide to sleep for
% that amount of time before or after each request.
-module(couch_replicator_rate_limiter).
-behaviour(gen_server).
-export([
start_link/0
]).
-export([
init/1,
terminate/2,
handle_call/3,
handle_info/2,
handle_cast/2,
code_change/3
]).
-export([
interval/1,
max_interval/0,
failure/1,
success/1
]).
% Types
-type key() :: any().
-type interval() :: non_neg_integer().
-type msec() :: non_neg_integer().
% Definitions
% Main parameters of the algorithm. The factor is the multiplicative part and
% base interval is the additive.
-define(BASE_INTERVAL, 20).
-define(BACKOFF_FACTOR, 1.2).
% If estimated period exceeds a limit, it is clipped to this value. This
% defines a practical limit of this algorithm. This is driven by real world
% concerns such as having a connection which sleeps for too long and ends up
% with socket timeout errors, or replication jobs which occupy a scheduler
% slot without making any progress.
-define(MAX_INTERVAL, 25000).
% Specify when (threshold) and how much (factor) to decay the estimated period.
% If there is a long pause between consecutive updates, the estimated period
% would become less accurate as more time passes. In that case, choose to
% optimistically decay the estimated value; that is, assume that a certain
% rate of successful requests happened in the meantime. (For reference, the
% TCP congestion algorithm also handles a variation of this in RFC 5681
% under the "Restarting Idle Connections" section).
-define(TIME_DECAY_FACTOR, 2).
-define(TIME_DECAY_THRESHOLD, 1000).
% Limit the rate of updates applied. This controls the rate of change of the
% estimated value. In colloquial terms it defines how "twitchy" the algorithm
% is. Put another way, it acts as a crude low-pass filter. (Some
% alternative TCP congestion control algorithms, like Westwood+,
% use something similar to solve the ACK compression problem).
-define(SENSITIVITY_TIME_WINDOW, 80).
-record(state, {timer}).
-record(rec, {id, backoff, ts}).
-spec start_link() -> {ok, pid()} | ignore | {error, term()}.
start_link() ->
gen_server:start_link({local, ?MODULE}, ?MODULE, [], []).
-spec interval(key()) -> interval().
interval(Key) ->
{Interval, _Timestamp} = interval_and_timestamp(Key),
Interval.
-spec max_interval() -> interval().
max_interval() ->
?MAX_INTERVAL.
-spec failure(key()) -> interval().
failure(Key) ->
{Interval, Timestamp} = interval_and_timestamp(Key),
update_failure(Key, Interval, Timestamp, now_msec()).
-spec success(key()) -> interval().
success(Key) ->
{Interval, Timestamp} = interval_and_timestamp(Key),
update_success(Key, Interval, Timestamp, now_msec()).
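%% Example caller loop (a sketch; assumes this gen_server is running and
%% `do_request/1' is the caller's own, hypothetical request function):
%%
%% request(Url) ->
%% Key = {get, Url},
%% case do_request(Url) of
%% {ok, _} = Ok ->
%% timer:sleep(success(Key)),
%% Ok;
%% {error, _} = Err ->
%% timer:sleep(failure(Key)),
%% Err
%% end.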
% gen_server callbacks
init([]) ->
couch_replicator_rate_limiter_tables:create(#rec.id),
{ok, #state{timer = new_timer()}}.
terminate(_Reason, _State) ->
ok.
handle_call(_Msg, _From, State) ->
{reply, invalid, State}.
handle_cast(_, State) ->
{noreply, State}.
handle_info(cleanup, #state{timer = Timer}) ->
erlang:cancel_timer(Timer),
TIds = couch_replicator_rate_limiter_tables:tids(),
[cleanup_table(TId, now_msec() - ?MAX_INTERVAL) || TId <- TIds],
{noreply, #state{timer = new_timer()}}.
code_change(_OldVsn, State, _Extra) ->
{ok, State}.
% Private functions
-spec update_success(any(), interval(), msec(), msec()) -> interval().
update_success(_Key, _Interval, _Timestamp = 0, _Now) ->
0; % No ets entry. Keep it that way and don't insert a new one.
update_success(_Key, Interval, Timestamp, Now)
when Now - Timestamp =< ?SENSITIVITY_TIME_WINDOW ->
Interval; % Ignore too frequent updates.
update_success(Key, Interval, Timestamp, Now) ->
DecayedInterval = time_decay(Now - Timestamp, Interval),
AdditiveFactor = additive_factor(DecayedInterval),
NewInterval = DecayedInterval - AdditiveFactor,
if
NewInterval =< 0 ->
Table = couch_replicator_rate_limiter_tables:term_to_table(Key),
ets:delete(Table, Key),
0;
NewInterval =< ?BASE_INTERVAL ->
insert(Key, ?BASE_INTERVAL, Now);
NewInterval > ?BASE_INTERVAL ->
insert(Key, NewInterval, Now)
end.
-spec update_failure(any(), interval(), msec(), msec()) -> interval().
update_failure(_Key, Interval, Timestamp, Now)
when Now - Timestamp =< ?SENSITIVITY_TIME_WINDOW ->
Interval; % Ignore too frequent updates.
update_failure(Key, Interval, _Timestamp, Now) ->
Interval1 = erlang:max(Interval, ?BASE_INTERVAL),
Interval2 = round(Interval1 * ?BACKOFF_FACTOR),
Interval3 = erlang:min(Interval2, ?MAX_INTERVAL),
insert(Key, Interval3, Now).
-spec insert(any(), interval(), msec()) -> interval().
insert(Key, Interval, Timestamp) ->
Entry = #rec{id = Key, backoff = Interval, ts = Timestamp},
Table = couch_replicator_rate_limiter_tables:term_to_table(Key),
ets:insert(Table, Entry),
Interval.
-spec interval_and_timestamp(key()) -> {interval(), msec()}.
interval_and_timestamp(Key) ->
Table = couch_replicator_rate_limiter_tables:term_to_table(Key),
case ets:lookup(Table, Key) of
[] ->
{0, 0};
[#rec{backoff = Interval, ts = Timestamp}] ->
{Interval, Timestamp}
end.
-spec time_decay(msec(), interval()) -> interval().
time_decay(Dt, Interval) when Dt > ?TIME_DECAY_THRESHOLD ->
DecayedInterval = Interval - ?TIME_DECAY_FACTOR * Dt,
erlang:max(round(DecayedInterval), 0);
time_decay(_Dt, Interval) ->
Interval.
% Calculate additive factor. Ideally it would be a constant but in this case
% it is a step function to help handle larger values as they are approaching
% the backoff limit. Large success values closer to the limit add some
% pressure against the limit, which is useful, as at the backoff limit the
% whole replication job is killed which can be costly in time and temporary work
% lost by those jobs.
-spec additive_factor(interval()) -> interval().
additive_factor(Interval) when Interval > 10000 ->
?BASE_INTERVAL * 50;
additive_factor(Interval) when Interval > 1000 ->
?BASE_INTERVAL * 5;
additive_factor(Interval) when Interval > 100 ->
?BASE_INTERVAL * 2;
additive_factor(_Interval) ->
?BASE_INTERVAL.
-spec new_timer() -> reference().
new_timer() ->
erlang:send_after(?MAX_INTERVAL * 2, self(), cleanup).
-spec now_msec() -> msec().
now_msec() ->
{Mega, Sec, Micro} = os:timestamp(),
((Mega * 1000000) + Sec) * 1000 + Micro div 1000.
-spec cleanup_table(atom(), msec()) -> non_neg_integer().
cleanup_table(Tid, LimitMSec) ->
Head = #rec{ts = '$1', _ = '_'},
Guard = {'<', '$1', LimitMSec},
ets:select_delete(Tid, [{Head, [Guard], [true]}]).
%% (source: src/couch_replicator/src/couch_replicator_rate_limiter.erl)
%%--------------------------------------------------------------------
%% Copyright (c) 2020-2021 DGIOT Technologies Co., Ltd. All Rights Reserved.
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%--------------------------------------------------------------------
-module(dgiot_logger).
-author("johnliu").
%% Logs
-export([ debug/1
, debug/2
, debug/3
, info/1
, info/2
, info/3
, warning/1
, warning/2
, warning/3
, error/1
, error/2
, error/3
, critical/1
, critical/2
, critical/3
]).
%%--------------------------------------------------------------------
%% APIs
%%--------------------------------------------------------------------
-spec(debug(unicode:chardata()) -> ok).
debug(Msg) ->
logger:debug(Msg).
-spec(debug(io:format(), [term()]) -> ok).
debug(Format, Args) ->
logger:debug(Format, Args).
-spec(debug(logger:metadata(), io:format(), [term()]) -> ok).
debug(Metadata, Format, Args) when is_map(Metadata) ->
logger:debug(Format, Args, Metadata).
-spec(info(unicode:chardata()) -> ok).
info(Msg) ->
logger:info(Msg).
-spec(info(io:format(), [term()]) -> ok).
info(Format, Args) ->
logger:info(Format, Args).
-spec(info(logger:metadata(), io:format(), [term()]) -> ok).
info(Metadata, Format, Args) when is_map(Metadata) ->
logger:info(Format, Args, Metadata).
-spec(warning(unicode:chardata()) -> ok).
warning(Msg) ->
logger:warning(Msg).
-spec(warning(io:format(), [term()]) -> ok).
warning(Format, Args) ->
logger:warning(Format, Args).
-spec(warning(logger:metadata(), io:format(), [term()]) -> ok).
warning(Metadata, Format, Args) when is_map(Metadata) ->
logger:warning(Format, Args, Metadata).
-spec(error(unicode:chardata()) -> ok).
error(Msg) ->
logger:error(Msg).
-spec(error(io:format(), [term()]) -> ok).
error(Format, Args) ->
logger:error(Format, Args).
-spec(error(logger:metadata(), io:format(), [term()]) -> ok).
error(Metadata, Format, Args) when is_map(Metadata) ->
logger:error(Format, Args, Metadata).
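%% Usage sketch (`DevAddr' and `Reason' are placeholder bindings):
%%
%% dgiot_logger:info("server started"),
%% dgiot_logger:warning("retrying: ~p", [Reason]),
%% dgiot_logger:error(#{device => DevAddr}, "offline: ~p", [Reason]).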
-spec(critical(unicode:chardata()) -> ok).
critical(Msg) ->
logger:critical(Msg).
-spec(critical(io:format(), [term()]) -> ok).
critical(Format, Args) ->
logger:critical(Format, Args).
-spec(critical(logger:metadata(), io:format(), [term()]) -> ok).
critical(Metadata, Format, Args) when is_map(Metadata) ->
logger:critical(Format, Args, Metadata).
%% (source: apps/dgiot/src/otp/dgiot_logger.erl)
%% -------------------------------------------------------------------
%%
%% riak_kv_bucket: bucket validation functions
%%
%% Copyright (c) 2007-2011 Basho Technologies, Inc. All Rights Reserved.
%%
%% This file is provided to you under the Apache License,
%% Version 2.0 (the "License"); you may not use this file
%% except in compliance with the License. You may obtain
%% a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing,
%% software distributed under the License is distributed on an
%% "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
%% KIND, either express or implied. See the License for the
%% specific language governing permissions and limitations
%% under the License.
%%
%% -------------------------------------------------------------------
%% @doc KV Bucket validation functions
-module(riak_kv_bucket).
-export([validate/4]).
-include("riak_kv_types.hrl").
-ifdef(TEST).
-ifdef(EQC).
-compile([export_all]).
-include_lib("eqc/include/eqc.hrl").
-endif.
-include_lib("eunit/include/eunit.hrl").
-endif.
-type prop() :: {PropName::atom(), PropValue::any()}.
-type error_reason() :: atom() | string().
-type error() :: {PropName::atom(), ErrorReason::error_reason()}.
-type props() :: [prop()].
-type errors() :: [error()].
-export_type([props/0]).
%% @doc called by riak_core in a few places to ensure bucket
%% properties are sane. The arguments combinations have the following
%% meanings:-
%%
%% The first argument is the `Phase' of the bucket/bucket type
%% mutation and can be either `create' or `update'.
%%
%% `create' always means that we are creating a new bucket type or
%% updating an inactive bucket type. In the first case `Existing' is
%% the atom `undefined', in the second it is a list of the valid
%% properties returned from the first invocation of `validate/4'. The
%% value of `Bucket' will only ever be a two-tuple of `{binary(),
%% undefined}' for create, as it is only used on bucket types. The
%% final argument `BucketProps' is a list of the properties the user
%% provided for type creation merged with the default properties
%% defined in `riak_core_bucket_type:defaults/0' The job of the
%% function is to validate the given `BucketProps' and return a two
%% tuple `{Good, Bad}' where the first element is the list of valid
%% properties and the second a list of `error()' tuples. Riak_Core
%% will store the `Good' list in metadata iff the `Bad' list is the
%% empty list. It is worth noting that on `create' we must ignore the
%% `Existing' argument altogether.
%%
%% `update' means that we are either updating a bucket type or a
%% bucket. If `Bucket' is a `binary()' or a tuple `{binary(),
%% binary()}' then, a bucket is being updated. If `bucket' is a two
%% tuple of `{binary(), undefined}' then a bucket type is being
%% updated. When `validate/4' is called with `update' as the phase
%% then `Existing' will be the set of properties stored in metadata
%% for this bucket (the set returned as `Good' from the `create'
%% phase) and `BucketProps' will be ONLY the properties that user has
%% supplied as those to update (note: update may mean adding new
%% properties.) The job of `validate/4' in this case is to validate
%% the new properties and return a complete set of bucket properties
%% (i.e. the new properties merged with the existing properties) in
%% `Good', riak will then persist these `Good' properties, providing
%% `Bad' is empty.
%%
%% `validate/4' can be used to enforce immutable or co-invariant bucket
%% properties, like "only non-default bucket types can have a
%% `datatype' property", and that "`datatype' buckets must be
%% allow_mult" and "once set, `datatype' cannot be changed".
%%
%% There is no way to _remove_ a property
%%
%% @see validate_dt_props/3, assert_no_datatype/1
-spec validate(create | update,
{riak_core_bucket_type:bucket_type(), undefined | binary()} | binary(),
undefined | props(),
props()) -> {props(), errors()}.
validate(create, _Bucket, _Existing, BucketProps) when is_list(BucketProps) ->
validate_create_bucket_type(BucketProps);
validate(update, {_TypeName, undefined}, Existing, New) when is_list(Existing),
is_list(New) ->
validate_update_bucket_type(Existing, New);
validate(update, {Type, Name}, Existing, New) when is_list(Existing),
is_list(New),
is_binary(Name),
Type /= <<"default">> ->
validate_update_typed_bucket(Existing, New);
validate(update, _Bucket, Existing, New) when is_list(Existing),
is_list(New) ->
validate_default_bucket(Existing, New).
%% @private bucket creation time validation
-spec validate_create_bucket_type(props()) -> {props(), errors()}.
validate_create_bucket_type(BucketProps) ->
case proplists:get_value(consistent, BucketProps) of
%% type is explicitly or implicitly not intended to be consistent
Consistent when Consistent =:= false orelse
Consistent =:= undefined ->
{Unvalidated, Valid, Errors} = validate_create_dt_props(BucketProps);
%% type may be consistent (the value may not be valid)
Consistent ->
{Unvalidated, Valid, Errors} = validate_create_consistent_props(Consistent, BucketProps)
end,
validate(Unvalidated, Valid, Errors).
%% @private update phase of bucket type. Merges properties from
%% existing with valid new properties
-spec validate_update_bucket_type(props(), props()) -> {props(), errors()}.
validate_update_bucket_type(Existing, New) ->
case proplists:get_value(consistent, Existing) of
%% type is explicitly or implicitly not already consistent
Consistent when Consistent =:= false orelse
Consistent =:= undefined ->
{Unvalidated, Valid, Errors} = validate_update_dt_props(Existing, New);
_Consistent ->
{Unvalidated, Valid, Errors} = validate_update_consistent_props(Existing, New)
end,
{Good, Bad} = validate(Unvalidated, Valid, Errors),
{merge(Good, Existing), Bad}.
%% @private just delegates, but I added it to illustrate the many
%% possible type of validation.
-spec validate_update_typed_bucket(props(), props()) -> {props(), errors()}.
validate_update_typed_bucket(Existing, New) ->
validate_update_bucket_type(Existing, New).
%% @private as far as datatypes go, default buckets are free to do as
%% they please, the datatypes API only works on typed buckets. Go
%% wild!
-spec validate_default_bucket(props(), props()) -> {props(), errors()}.
validate_default_bucket(Existing, New) ->
Unvalidated = merge(New, Existing),
validate(Unvalidated, [], []).
%% @private properties in new overwrite those in old
-spec merge(props(), props()) -> props().
merge(New, Old) ->
riak_core_bucket_props:merge(New, Old).
%% @private general property validation
-spec validate(InProps::props(), ValidProps::props(), Errors::errors()) ->
{props(), errors()}.
validate([], ValidProps, Errors) ->
{ValidProps, Errors};
validate([{BoolProp, MaybeBool}|T], ValidProps, Errors) when is_atom(BoolProp), BoolProp =:= allow_mult
orelse BoolProp =:= basic_quorum
orelse BoolProp =:= last_write_wins
orelse BoolProp =:= notfound_ok
orelse BoolProp =:= stat_tracked ->
case coerce_bool(MaybeBool) of
error ->
validate(T, ValidProps, [{BoolProp, not_boolean}|Errors]);
Bool ->
validate(T, [{BoolProp, Bool}|ValidProps], Errors)
end;
validate([{consistent, Value}|T], ValidProps, Errors) ->
case Value of
false -> validate(T, [{consistent, false} | ValidProps], Errors);
_ -> validate(T, ValidProps, [{consistent, "cannot update consistent property"}|Errors])
end;
validate([{IntProp, MaybeInt}=Prop | T], ValidProps, Errors) when IntProp =:= big_vclock
orelse IntProp =:= n_val
orelse IntProp =:= old_vclock
orelse IntProp =:= small_vclock ->
case is_integer(MaybeInt) of
true when MaybeInt > 0 ->
validate(T, [Prop | ValidProps], Errors);
_ ->
validate(T, ValidProps, [{IntProp, not_integer} | Errors])
end;
validate([{QProp, MaybeQ}=Prop | T], ValidProps, Errors) when QProp =:= r
orelse QProp =:= rw
orelse QProp =:= w ->
case is_quorum(MaybeQ) of
true ->
validate(T, [Prop | ValidProps], Errors);
false ->
validate(T, ValidProps, [{QProp, not_valid_quorum} | Errors])
end;
validate([{QProp, MaybeQ}=Prop | T], ValidProps, Errors) when QProp =:= dw
orelse QProp =:= pw
orelse QProp =:= pr ->
case is_opt_quorum(MaybeQ) of
true ->
validate(T, [Prop | ValidProps], Errors);
false ->
validate(T, ValidProps, [{QProp, not_valid_quorum} | Errors])
end;
validate([Prop|T], ValidProps, Errors) ->
validate(T, [Prop|ValidProps], Errors).
-spec is_quorum(term()) -> boolean().
is_quorum(Q) when is_integer(Q), Q > 0 ->
true;
is_quorum(Q) when Q =:= quorum
orelse Q =:= one
orelse Q =:= all
orelse Q =:= <<"quorum">>
orelse Q =:= <<"one">>
orelse Q =:= <<"all">> ->
true;
is_quorum(_) ->
false.
%% @private some quorum options can be zero
-spec is_opt_quorum(term()) -> boolean().
is_opt_quorum(Q) when is_integer(Q), Q >= 0 ->
true;
is_opt_quorum(Q) ->
is_quorum(Q).
-spec coerce_bool(any()) -> boolean() | error.
coerce_bool(true) ->
true;
coerce_bool(false) ->
false;
coerce_bool(MaybeBool) when is_atom(MaybeBool) ->
coerce_bool(atom_to_list(MaybeBool));
coerce_bool(MaybeBool) when is_binary(MaybeBool) ->
coerce_bool(binary_to_list(MaybeBool));
coerce_bool(Int) when is_integer(Int), Int =< 0 ->
false;
coerce_bool(Int) when is_integer(Int) , Int > 0 ->
true;
coerce_bool(MaybeBool) when is_list(MaybeBool) ->
Lower = string:to_lower(MaybeBool),
Atom = (catch list_to_existing_atom(Lower)),
case Atom of
true -> true;
false -> false;
_ -> error
end;
coerce_bool(_) ->
error.
%% @private riak consistent object support requires a bucket type
%% where `consistent' is defined and not `false' to have `consistent'
%% set to true. this function validates that property.
%%
%% We take the indication of a value other than `false' to mean the user
%% intended to create a consistent type. We validate that the value is actually
%% something Riak can understand -- `true'. Why don't we just convert any other
%% value to true? Well, the user may have typed "fals", so let's be careful.
-spec validate_create_consistent_props(any(), props()) -> {props(), props(), errors()}.
validate_create_consistent_props(true, New) ->
{lists:keydelete(consistent, 1, New), [{consistent, true}], []};
validate_create_consistent_props(false, New) ->
{lists:keydelete(consistent, 1, New), [{consistent, false}], []};
validate_create_consistent_props(undefined, New) ->
{New, [], []};
validate_create_consistent_props(Invalid, New) ->
Err = lists:flatten(io_lib:format("~p is not a valid value for consistent. Use \"true\" or \"false\"", [Invalid])),
{lists:keydelete(consistent, 1, New), [], [{consistent, Err}]}.
%% @private riak datatype support requires a bucket type of `datatype'
%% and `allow_mult' set to `true'. This function enforces those
%% properties.
%%
%% We take the presence of a `datatype' property as indication that
%% this bucket type is a special type, somewhere to store CRDTs. I
%% realise this slightly undermines the reason for bucket types (no
%% magic names) but there has to be some way to indicate intent, and
%% that way is the "special" property name `datatype'.
%%
%% Since we don't ever want sibling CRDT types (though we can handle
%% them @see riak_kv_crdt), `datatype' is an immutable property. Once
%% you create a bucket with a certain datatype you can't change
%% it. The `update' bucket type path enforces this. It doesn't
%% validate the correctness of the type, since it assumes that was
%% done at creation, only that it is either the same as existing or
%% not present.
-spec validate_create_dt_props(props()) -> {props(), props(), errors()}.
validate_create_dt_props(New) ->
validate_create_dt_props(proplists:get_value(datatype, New), New).
%% @private validate the datatype, if present
-spec validate_create_dt_props(undefined | atom(), props()) -> {props(), props(), errors()}.
validate_create_dt_props(undefined, New) ->
{New, [], []};
validate_create_dt_props(DataType, New) ->
Unvalidated = lists:keydelete(datatype, 1, New),
Mod = riak_kv_crdt:to_mod(DataType),
case lists:member(Mod, ?V2_TOP_LEVEL_TYPES) of
true ->
validate_create_dt_props(Unvalidated, [{datatype, DataType}], []);
false ->
Err = lists:flatten(io_lib:format("~p not supported for bucket datatype property", [DataType])),
validate_create_dt_props(Unvalidated, [], [{datatype, Err}])
end.
%% @private validate the boolean property, if `datatype' was present,
%% require `allow_mult=true' even if `datatype' was invalid, as we
%% assume the user meant to create `datatype' bucket
-spec validate_create_dt_props(props(), props(), errors()) -> {props(), props(), errors()}.
validate_create_dt_props(Unvalidated0, Valid, Invalid) ->
Unvalidated = lists:keydelete(allow_mult, 1, Unvalidated0),
case allow_mult(Unvalidated0) of
true ->
{Unvalidated, [{allow_mult, true} | Valid], Invalid};
_ ->
Err = io_lib:format("Data Type buckets must be allow_mult=true", []),
{Unvalidated, Valid, [{allow_mult, Err} | Invalid]}
end.
%% @private validate that strongly-consistent types and buckets do not
%% have their n_val changed, nor become eventually consistent
-spec validate_update_consistent_props(props(), props()) -> {props(), props(), errors()}.
validate_update_consistent_props(Existing, New) ->
Unvalidated = lists:keydelete(n_val, 1, lists:keydelete(consistent, 1, New)),
OldNVal = proplists:get_value(n_val, Existing),
NewNVal = proplists:get_value(n_val, New, OldNVal),
NewConsistent = proplists:get_value(consistent, New),
CErr = "cannot update consistent property",
NErr = "n_val cannot be modified for existing consistent type",
case {NewConsistent, OldNVal, NewNVal} of
{undefined, _, undefined} ->
{Unvalidated, [], []};
{undefined, _N, _N} ->
{Unvalidated, [{n_val, NewNVal}], []};
{true, _N, _N} ->
{Unvalidated, [{n_val, NewNVal}, {consistent, true}], []};
{C, _N, _N} when C =/= undefined andalso
C =/= true ->
{Unvalidated, [{n_val, NewNVal}], [{consistent, CErr}]};
{undefined, _OldN, _NewN} ->
{Unvalidated, [], [{n_val, NErr}]};
{true, _OldN, _NewN} ->
{Unvalidated, [{consistent, true}], [{n_val, NErr}]};
{_, _, _} ->
{Unvalidated, [], [{n_val, NErr}, {consistent, CErr}]}
end.
%% @private somewhat duplicates the create path, but easier to read
%% this way, and chars are free
-spec validate_update_dt_props(props(), props()) -> {props(), props(), errors()}.
validate_update_dt_props(Existing, New0) ->
New = lists:keydelete(datatype, 1, New0),
case {proplists:get_value(datatype, Existing), proplists:get_value(datatype, New0)} of
{undefined, undefined} ->
{New, [], []};
{undefined, _Datatype} ->
{New, [], [{datatype, "Cannot add datatype to existing bucket"}]};
{_Datatype, undefined} ->
validate_update_dt_props(New, [] , []);
{Datatype, Datatype} ->
validate_update_dt_props(New, [{datatype, Datatype}] , []);
{_Datatype, _Datatype2} ->
validate_update_dt_props(New, [] , [{datatype, "Cannot update datatype on existing bucket"}])
end.
%% @private check that allow_mult is correct
-spec validate_update_dt_props(props(), props(), errors()) -> {props(), props(), errors()}.
validate_update_dt_props(New, Valid, Invalid) ->
Unvalidated = lists:keydelete(allow_mult, 1, New),
case allow_mult(New) of
undefined ->
{Unvalidated, Valid, Invalid};
true ->
{Unvalidated, [{allow_mult, true} | Valid], Invalid};
_ ->
{Unvalidated, Valid, [{allow_mult, "Cannot change datatype bucket from allow_mult=true"} | Invalid]}
end.
%% @private just grab the allow_mult value if it exists
-spec allow_mult(props()) -> boolean() | 'undefined' | 'error'.
allow_mult(Props) ->
case proplists:get_value(allow_mult, Props) of
undefined ->
undefined;
MaybeBool ->
coerce_bool(MaybeBool)
end.
%%
%% EUNIT tests...
%%
-ifdef(TEST).
coerce_bool_test_() ->
[?_assertEqual(false, coerce_bool(false)),
?_assertEqual(true, coerce_bool(true)),
?_assertEqual(true, coerce_bool("True")),
?_assertEqual(false, coerce_bool("fAlSE")),
?_assertEqual(false, coerce_bool(<<"FAlse">>)),
?_assertEqual(true, coerce_bool(<<"trUe">>)),
?_assertEqual(true, coerce_bool(1)),
?_assertEqual(true, coerce_bool(234567)),
?_assertEqual(false, coerce_bool(0)),
?_assertEqual(false, coerce_bool(-1234)),
?_assertEqual(false, coerce_bool('FALSE')),
?_assertEqual(true, coerce_bool('TrUe')),
?_assertEqual(error, coerce_bool("Purple")),
?_assertEqual(error, coerce_bool(<<"frangipan">>)),
?_assertEqual(error, coerce_bool(erlang:make_ref()))
].
-ifdef(EQC).
-define(QC_OUT(P),
eqc:on_output(fun(Str, Args) ->
io:format(user, Str, Args) end, P)).
-define(TEST_TIME_SECS, 10).
immutable_test_() ->
{timeout, ?TEST_TIME_SECS+5, [?_assert(test_immutable() =:= true)]}.
valid_test_() ->
{timeout, ?TEST_TIME_SECS+5, [?_assert(test_create() =:= true)]}.
merges_props_test_() ->
{timeout, ?TEST_TIME_SECS+5, [?_assert(test_merges() =:= true)]}.
test_immutable() ->
test_immutable(?TEST_TIME_SECS).
test_immutable(TestTimeSecs) ->
eqc:quickcheck(eqc:testing_time(TestTimeSecs, ?QC_OUT(prop_immutable()))).
test_create() ->
test_create(?TEST_TIME_SECS).
test_create(TestTimeSecs) ->
eqc:quickcheck(eqc:testing_time(TestTimeSecs, ?QC_OUT(prop_create_valid()))).
test_merges() ->
test_merges(?TEST_TIME_SECS).
test_merges(TestTimeSecs) ->
eqc:quickcheck(eqc:testing_time(TestTimeSecs, ?QC_OUT(prop_merges()))).
%% Props
%% When validating:
%% * Once the datatype has been set, it cannot be unset or changed and
%% allow_mult must remain true
%% * the consistent property cannot change and neither can the n_val if
%% the type is consistent
prop_immutable() ->
?FORALL(Args, gen_args(no_default_buckets),
begin
Result = erlang:apply(?MODULE, validate, Args),
Phase = lists:nth(1, Args),
Existing = lists:nth(3, Args),
New = lists:nth(4, Args),
?WHENFAIL(
begin
io:format("Phase: ~p~n", [Phase]),
io:format("Bucket ~p~n", [lists:nth(2, Args)]),
io:format("Existing ~p~n", [Existing]),
io:format("New ~p~n", [New]),
io:format("Result ~p~n", [Result]),
io:format("{allow_mult, valid_dt, valid_consistent, n_val_changed}~n"),
io:format("{~p,~p,~p,~p}~n~n",
[allow_mult(New), valid_datatype(New), valid_consistent(New), n_val_changed(Existing, New)])
end,
collect(with_title("{allow_mult, valid_dt, valid_consistent, n_val_changed}"),
{allow_mult(New), valid_datatype(New), valid_consistent(New), n_val_changed(Existing, New)},
immutable(Phase, New, Existing, Result)))
end).
%% When creating a bucket type:
%% * for datatypes, the datatype must be
%% valid, and allow mult must be true
%% * for consistent data, the consistent property must be valid
prop_create_valid() ->
?FORALL({Bucket, Existing, New}, {gen_bucket(create, bucket_types),
gen_existing(), gen_new(create)},
begin
Result = validate(create, Bucket, Existing, New),
?WHENFAIL(
begin
io:format("Bucket ~p~n", [Bucket]),
io:format("Existing ~p~n", [Existing]),
io:format("New ~p~n", [New]),
io:format("Result ~p~n", [Result]),
io:format("{has_datatype, valid_datatype, allow_mult, has_consistent, valid_consistent}~n"),
io:format("{~p,~p,~p,~p,~p}~n~n",
[has_datatype(New), valid_datatype(New), allow_mult(New), has_consistent(New), valid_consistent(New)])
end,
collect(with_title("{has_datatype, valid_datatype, allow_mult, has_consistent, valid_consistent}"),
{has_datatype(New), valid_datatype(New), allow_mult(New), has_consistent(New), valid_consistent(New)},
only_create_if_valid(Result, New)))
end).
%% As of 2.0pre? validate/4 must merge the new and existing props,
%% verify that.
prop_merges() ->
?FORALL({Bucket, Existing0, New}, {gen_bucket(update, any),
gen_existing(), gen_new(update)},
begin
%% ensure default buckets are not marked consistent since that is invalid
Existing = case default_bucket(Bucket) of
true -> lists:keydelete(consistent, 1, Existing0);
false -> Existing0
end,
Result={Good, _Bad} = validate(update, Bucket, Existing, New),
DefaultBucket = default_bucket(Bucket),
HasAllowMult = has_allow_mult(New),
AllowMult = allow_mult(New),
HasDatatype = has_datatype(Existing),
NValChanged = n_val_changed(Existing, New),
IsConsistent = is_consistent(Existing),
NewConsistent = proplists:get_value(consistent, New),
Expected = case {DefaultBucket, HasAllowMult, AllowMult, HasDatatype,
NValChanged, IsConsistent, NewConsistent} of
%% default bucket, attempted to change consistent to invalid value
%% allow_mult may be invalid too
{true,_, Mult, _, _, false, notvalid} ->
maybe_bad_mult(Mult, merge(lists:keydelete(consistent, 1, New), Existing));
%% default bucket, attempted to change consistent to true
%% allow_mult may be invalid too
{true, _, Mult, _, _, false, true} ->
maybe_bad_mult(Mult, merge(lists:keydelete(consistent, 1, New), Existing));
%% all valid for default type buckets
{true, _, Mult, _, _, _, _} when Mult /= error ->
merge(New, Existing);
%% default bucket: allow mult is invalid
{true, true, _, _, _, _, _} ->
maybe_bad_mult(error, merge(New, Existing));
%% typed bucket, allow mult changed but not consistent or data type
{false, true, Mult, false, _, false, _} when Mult /= error ->
%% the n_val is the only valid change we generate in this case. can't change
%% data type or consistent property
NVal = proplists:get_value(n_val, New, proplists:get_value(n_val, Existing)),
merge([proplists:lookup(allow_mult, New), {n_val, NVal}], Existing);
%% typed bucket, allow_mult change is invalid. n_val has changed. not a datatype or
%% consistent
{false, true, error, false, true, false, _} ->
merge([proplists:lookup(n_val, New)], Existing);
%% typed bucket, allow_mult change is invalid and n_val hasn't changed
{false, true, error, false, false, false, _} ->
Existing;
%% typed bucket, strongly-consistent, both n_val and consistent value are invalid changes
{false, _, Mult,_,true,true,false} ->
maybe_bad_mult(Mult,
merge(lists:keydelete(consistent, 1, lists:keydelete(n_val, 1, New)), Existing));
%% typed bucket, strongly-consistent, both n_val and consistent value are invalid changes
{false, _, Mult,_,true,true,notvalid} ->
maybe_bad_mult(Mult,
merge(lists:keydelete(consistent, 1, lists:keydelete(n_val, 1, New)), Existing));
%% typed bucket, strongly-consistent, n_val change is invalid
{false, _, Mult,_,true,true,_} ->
maybe_bad_mult(Mult, merge(lists:keydelete(n_val, 1, New), Existing));
%% typed bucket, strongly-consistent, consistent value change is invalid
{false, _, Mult, _, _, true, false} ->
maybe_bad_mult(Mult, merge(lists:keydelete(consistent, 1, New), Existing));
%% typed bucket, strongly-consistent, consistent value change is invalid
{false, _, Mult, _, _, true, notvalid} ->
maybe_bad_mult(Mult, merge(lists:keydelete(consistent, 1, New), Existing));
%% typed bucket, strongly-consistent, all good (except maybe allow_mult)
{false, _, Mult, _, _, true, _} ->
maybe_bad_mult(Mult, merge(New, Existing));
%% typed bucket, strongly-consistent, all good (except maybe allow_mult)
{false, true, _Mult, true,_, _, _} ->
NVal = proplists:get_value(n_val, New, proplists:get_value(n_val, Existing)),
merge([{n_val, NVal}], Existing);
%% typed bucket, bucket not strongly consistent or a data type, all valid
{false,_,_,_,_,false,_} ->
merge(New, Existing)
end,
?WHENFAIL(
begin
io:format("Bucket ~p~n", [Bucket]),
io:format("Existing ~p~n", [lists:sort(Existing)]),
io:format("New ~p~n", [New]),
io:format("Result ~p~n", [Result]),
io:format("Diff ~p~n", [sets:to_list(sets:subtract(sets:from_list(Expected), sets:from_list(Good)))])
end,
sets:is_subset(sets:from_list(Expected), sets:from_list(Good)))
end).
%% Generators
gen_args(GenDefBucket) ->
?LET(Phase, gen_phase(), [Phase, gen_bucket(Phase, GenDefBucket),
gen_existing(), gen_new(update)]).
gen_phase() ->
oneof([create, update]).
gen_bucket(create, _) ->
gen_bucket_type();
gen_bucket(update, no_default_buckets) ->
oneof([gen_bucket_type(), gen_typed_bucket()]);
gen_bucket(update, _) ->
oneof([gen_bucket_type(), gen_typed_bucket(), gen_bucket()]).
gen_bucket_type() ->
{binary(20), undefined}.
gen_typed_bucket() ->
{binary(20), binary(20)}.
gen_bucket() ->
oneof([{<<"default">>, binary(20)}, binary(20)]).
gen_existing() ->
Defaults0 = riak_core_bucket_type:defaults(),
Defaults = lists:keydelete(allow_mult, 1, Defaults0),
?LET({MultDT, Consistent}, {gen_valid_mult_dt(), gen_maybe_consistent()},
Defaults ++ MultDT ++ Consistent).
gen_maybe_consistent() ->
oneof([[], gen_valid_consistent()]).
gen_maybe_bad_consistent() ->
oneof([gen_valid_consistent(), [{consistent, notvalid}]]).
gen_valid_consistent() ->
?LET(Consistent, bool(), [{consistent, Consistent}]).
gen_valid_mult_dt() ->
?LET(Mult, bool(), gen_valid_mult_dt(Mult)).
gen_valid_mult_dt(false) ->
?LET(AllowMult, bool(), [{allow_mult, AllowMult}]);
gen_valid_mult_dt(true) ->
?LET(Datatype, gen_datatype(), [{allow_mult, true}, {datatype, Datatype}]).
gen_new(update) ->
?LET({Mult, Datatype, Consistent, NVal},
{gen_allow_mult(), oneof([[], gen_datatype_property()]),
oneof([[], gen_maybe_bad_consistent()]), oneof([[], [{n_val, choose(1, 10)}]])},
Mult ++ Datatype ++ Consistent ++ NVal);
gen_new(create) ->
Defaults0 = riak_core_bucket_type:defaults(),
Defaults = lists:keydelete(allow_mult, 1, Defaults0),
?LET({Mult, DatatypeOrConsistent}, {gen_allow_mult(), frequency([{5, gen_datatype_property()},
{5, gen_maybe_bad_consistent()},
{5, []}])},
Defaults ++ Mult ++ DatatypeOrConsistent).
gen_allow_mult() ->
?LET(Mult, frequency([{9, bool()}, {1, binary()}]), [{allow_mult, Mult}]).
gen_datatype_property() ->
?LET(Datatype, oneof([gen_datatype(), notadatatype]), [{datatype, Datatype}]).
gen_datatype() ->
?LET(Datamod, oneof(?V2_TOP_LEVEL_TYPES), riak_kv_crdt:from_mod(Datamod)).
%% helpers
immutable(create, _, _, _) ->
true;
immutable(_, _New, undefined, _) ->
true;
immutable(update, New, Existing, {_Good, Bad}) ->
case proplists:get_value(consistent, Existing) of
Consistent when Consistent =:= false orelse
Consistent =:= undefined ->
%% not an existing consistent type (or bucket) so validate it
%% as a datatype (immutable_dt also covers the case where it was
%% not a datatype, yes I know this is kind of weird when you read the
%% test, sorry...)
NewDT = proplists:get_value(datatype, New),
NewAM = proplists:get_value(allow_mult, New),
ExistingDT = proplists:get_value(datatype, Existing),
immutable_dt(NewDT, NewAM, ExistingDT, Bad);
true ->
%% existing type (or bucket) is consistent
immutable_consistent(New, Existing, Bad)
end.
immutable_consistent(New, Existing, Bad) ->
NewCS = proplists:get_value(consistent, New),
OldN = proplists:get_value(n_val, Existing),
NewN = proplists:get_value(n_val, New),
immutable_consistent(NewCS, OldN, NewN, Bad).
%% Consistent properties must remain consistent and
%% the n_val must not change. This function assumes the
%% existing value for consistent is true.
immutable_consistent(undefined, _N, undefined, _Bad) ->
%% consistent and n_val not modified
true;
immutable_consistent(true, _N, undefined, _Bad) ->
%% consistent still set to true and n_val not modified
true;
immutable_consistent(Consistent, _N, _N, _Bad) when Consistent =:= undefined orelse
Consistent =:= true ->
%% consistent not modified or still set to true and n_val
%% modified but set to same value
true;
immutable_consistent(Consistent, _OldN, _NewN, Bad) when Consistent =:= undefined orelse
Consistent =:= true ->
%% consistent not modified or still set to true but n_val modified
has_n_val(Bad);
immutable_consistent(_Consistent, OldN, NewN, Bad) when OldN =:= NewN orelse
NewN =:= undefined ->
%% consistent modified but set to invalid value or false, n_val not modified
%% or set to existing value
has_consistent(Bad);
immutable_consistent(_Consistent, _OldN, _NewN, Bad) ->
has_consistent(Bad) andalso has_n_val(Bad).
%% If data type and allow mult and are in New they must match what is in existing
%% or be in Bad
immutable_dt(undefined, undefined, _Meh1, _Bad) ->
%% datatype and allow_mult are not being modified, so its valid
true;
immutable_dt(undefined, _, undefined, _Bad) ->
%% data type not in new or existing, so its valid
true;
immutable_dt(_Datatype, undefined, _Datatype, _Bad) ->
%% data types from new and existing match and allow mult not modified, valid
true;
immutable_dt(_Datatype, true, _Datatype, _Bad) ->
%% data type from new and existing match and allow mult still set to true, valid
true;
immutable_dt(undefined, true, _Datatype, _Bad) ->
%% data type not modified and allow_mult still set to true, valid
true;
immutable_dt(_Datatype, undefined, _Datatype2, Bad) ->
%% data types do not match, allow_mult not modified
has_datatype(Bad);
immutable_dt(_Datatype, true, _Datatype2, Bad) ->
%% data types do not match, allow_mult still set to true
has_datatype(Bad);
immutable_dt(_Datatype, false, undefined, Bad) ->
%% datatype defined when it wasn't before
has_datatype(Bad);
immutable_dt(_Datatype, false, _Datatype, Bad) ->
%% attempt to set allow_mult to false when data type set is invalid, datatype not modified
has_allow_mult(Bad);
immutable_dt(undefined, false, _Meh, Bad) ->
%% data type not modified but exists and allow_mult set to false is invalid
has_allow_mult(Bad);
immutable_dt(_Datatype, false, _Datatype2, Bad) ->
%% data type changed and allow mult modified to be false, both are invalid
has_allow_mult(Bad) andalso has_datatype(Bad);
immutable_dt(undefined, _, _Datatype, Bad) ->
%% datatype not modified but allow_mult is invalid
has_allow_mult(Bad);
immutable_dt(_Datatype, _, _Datatype, Bad) ->
%% allow mult is invalid but data types still match
has_allow_mult(Bad);
immutable_dt(_, _, _, Bad) ->
%% allow_mult and data type are invalid
has_allow_mult(Bad) andalso has_datatype(Bad).
only_create_if_valid({Good, Bad}, New) ->
DT = proplists:get_value(datatype, New),
AM = proplists:get_value(allow_mult, New),
CS = proplists:get_value(consistent, New),
case {DT, AM, CS} of
%% if consistent or datatype properties are not defined then properties should be
%% valid since no other properties generated can be invalid
{undefined, _AllowMult, Consistent} when Consistent =:= false orelse
Consistent =:= undefined ->
true;
%% if datatype is defined, its not a consistent type and allow_mult=true
%% then the datatype must be valid
{Datatype, true, Consistent} when Consistent =:= false orelse
Consistent =:= undefined ->
case lists:member(riak_kv_crdt:to_mod(Datatype), ?V2_TOP_LEVEL_TYPES) of
true ->
has_datatype(Good) andalso has_allow_mult(Good);
false ->
has_datatype(Bad) andalso has_allow_mult(Good)
end;
%% if the datatype is defined, the type is not consistent and allow_mult is false
%% then allow_mult should be in the Bad list and the datatype may be depending on if it
%% is valid
{Datatype, _, Consistent} when Consistent =:= false orelse
Consistent =:= undefined->
case lists:member(riak_kv_crdt:to_mod(Datatype), ?V2_TOP_LEVEL_TYPES) of
true ->
has_allow_mult(Bad) andalso has_datatype(Good);
false ->
has_datatype(Bad) andalso has_allow_mult(Bad)
end;
%% the type is consistent, whether it has a datatype or allow_mult set is irrelevant (for now
%% at least)
{_, _, true} ->
has_consistent(Good);
%% the type was not inconsistent (explicitly or implicitly) but the value is invalid
{_, _, _Consistent} ->
has_consistent(Bad)
end.
has_datatype(Props) ->
proplists:get_value(datatype, Props) /= undefined.
has_allow_mult(Props) ->
proplists:get_value(allow_mult, Props) /= undefined.
valid_datatype(Props) ->
Datatype = proplists:get_value(datatype, Props),
lists:member(riak_kv_crdt:to_mod(Datatype), ?V2_TOP_LEVEL_TYPES).
has_consistent(Props) ->
proplists:get_value(consistent, Props) /= undefined.
valid_consistent(Props) ->
case proplists:get_value(consistent, Props) of
true ->
true;
false ->
true;
_ ->
false
end.
is_consistent(Props) ->
proplists:get_value(consistent, Props) =:= true.
has_n_val(Props) ->
proplists:get_value(n_val, Props) /= undefined.
n_val_changed(Existing, New) ->
NewN = proplists:get_value(n_val, New),
proplists:get_value(n_val, Existing) =/= NewN andalso
NewN =/= undefined.
default_bucket({<<"default">>, _}) ->
true;
default_bucket(B) when is_binary(B) ->
true;
default_bucket(_) ->
false.
maybe_bad_mult(error, Props) ->
lists:keydelete(allow_mult, 1, Props);
maybe_bad_mult(_, Props) ->
Props.
-endif.
-endif.
%% Copyright 2019, JobTeaser
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
-module(hotp).
-export([generate/3]).
-export_type([counter/0]).
-type counter() :: non_neg_integer().
%% A counter used as moving factor.
%%
%% While it is described as a 8 byte binary value, it is always used to
%% represent an integer.
%%
%% Defined in <a href="https://tools.ietf.org/html/rfc4226#section-5.1">RFC
%% 4226 5.1</a>.
-type sha1_hmac() :: <<_:160>>.
%% A HMAC-SHA1 binary value.
%% @doc Generate a HMAC-based one-time password.
%%
%% See <a href="https://tools.ietf.org/html/rfc4226#section-5.3">RFC 4226
%% 5.3</a>.
-spec generate(Key, Counter, NbDigits) -> Password when
Key :: binary(),
Counter :: counter(),
NbDigits :: pos_integer(),
Password :: non_neg_integer().
generate(Key, Counter, NbDigits) ->
%% Note: crypto:hmac/3 was deprecated in OTP 23 and removed in OTP 24; on
%% newer releases use crypto:mac(hmac, sha, Key, <<Counter:64>>) instead.
truncate(crypto:hmac(sha, Key, <<Counter:64>>), NbDigits).
%% @doc Truncate a SHA1 HMAC and reduce it to a numeric password containing
%% `NbDigits' digits.
%%
%% See <a href="https://tools.ietf.org/html/rfc4226#section-5.3">RFC 4226
%% 5.3</a>.
-spec truncate(HMAC, NbDigits) -> Password when
HMAC :: sha1_hmac(),
NbDigits :: pos_integer(),
Password :: non_neg_integer().
truncate(HMAC, NbDigits) when byte_size(HMAC) == 20 ->
Offset = binary:at(HMAC, 19) band 16#0f,
C0 = (binary:at(HMAC, Offset) band 16#7f) bsl 24,
C1 = (binary:at(HMAC, Offset + 1) band 16#ff) bsl 16,
C2 = (binary:at(HMAC, Offset + 2) band 16#ff) bsl 8,
C3 = (binary:at(HMAC, Offset + 3) band 16#ff),
P = C0 bor C1 bor C2 bor C3,
P rem pow10(NbDigits).
%% @doc Compute 10 to the power of `N'.
%%
%% The function is only defined for positive integers.
%%
%% We do not use `math:pow' since it uses floating point numbers.
-spec pow10(pos_integer()) -> pos_integer().
pow10(N) when N > 0 ->
pow10(N, 1).
pow10(0, Acc) ->
Acc;
pow10(N, Acc) ->
pow10(N - 1, Acc * 10).
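%% As a sanity check (not part of the original module), the HOTP values for
%% the 20-byte ASCII secret "12345678901234567890" published in RFC 4226
%% Appendix D can be expressed as EUnit tests. This sketch assumes EUnit is
%% available as a test dependency.
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
generate_rfc4226_test_() ->
Key = <<"12345678901234567890">>,
[?_assertEqual(755224, generate(Key, 0, 6)),
?_assertEqual(287082, generate(Key, 1, 6)),
?_assertEqual(359152, generate(Key, 2, 6))].
-endif.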
%%%-------------------------------------------------------------------
%% @author <NAME> <<EMAIL>>
%% @copyright (C) 2017, <NAME>
%% @doc erl_naive_bayes.erl
%% Naive Bayes makes the simplifying assumption that the value of
%% a particular feature is independent of the values of other features.
%% Learning in NaiveBayes essentially means to construct the Frequency-table,
%% the likelihood-table and the conditional-probability table (based on the observations in likelihood table)
%% the conditional probability contains probabilities P(attribute|class)
%% To generalize/classify: calculate posterior probability based on the "evidence" of the data to be classified.
%% Pick the classification with the highest posterior probability.
%% To compute the posterior probability we use the frequency table, the likelihood table and the conditional
%% probability table which all were constructed during training based on training data.
%% The naive thing with the classifier is that we assume that all atrtributes are independent and thus to
%% calculate the posterior probability given a set of evidence-attributes we can take the product. This is
%% naive because it is likely that the attributes are not independent in reality, but it makes it much simpler
%% to compute.
%% FrequencyTable is a table of {attribute, AttributeTable} where AttributeTable is a list of all observed values of
%% the attribute and its corresponding classification in the training set.
%% LikelihoodTable is a table of {{attribute, Value}, Probability} or {{classification}, Probability} based on the training data.
%% ConditionalProbabilityTable is a table of {{attribute, value}, Class, Probability}, i.e the probabiliy of attribute value given
%% classification. Based on training data.
%% Example use-case:
%% > c(erl_naive_bayes).
%% > Examples = erl_naive_bayes:examples_play_tennis().
%% > Model = erl_naive_bayes:learn_model(Examples).
%% > erl_naive_bayes:classify(Model, [{outlook, overcast}, {temperature, hot}, {humidity, high}, {windy, true}]).
%% > erl_naive_bayes:classify(Model, [{outlook, overcast}]).
%% > erl_naive_bayes:classify(Model, [{outlook, sunny}, {humidity, high}]).
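%% A worked instance of the posterior computation (using the classic
%% play-tennis frequencies, assumed here purely for illustration): with
%% P(yes) = 9/14, P(sunny|yes) = 2/9 and P(sunny) = 5/14, the posterior
%% score for the single evidence item [{outlook, sunny}] is
%% (2/9 * 9/14) / (5/14) = (2/14) / (5/14) = 0.4.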
%% @end
%%%-------------------------------------------------------------------
-module(erl_naive_bayes).
-author('<NAME> <<EMAIL>>').
%% API
-export([learn_model/1, examples_play_tennis/0, classify/2]).
%% types
-type model()::{frequency_table(), likelihood_table(), conditional_probability_table(), classes()}.
-type conditional_probability_table()::list({attribute_value_pair(), classification(), float()}).
-type likelihood_table()::list({attribute_value_pair() | classification(), float()}).
-type frequency_table()::list(frequency_table_row()).
-type frequency_table_row()::{attribute(), attribute_table()}.
-type attribute_table()::list(attribute_table_row()).
-type attribute_table_row()::{attribute_value(), list({classification(), integer()})}.
-type attribute_value_pairs()::list(attribute_value_pair()).
-type attribute_value_pair()::{attribute(), attribute_value()}.
-type attribute():: atom().
-type attributes():: list(attribute()).
-type attribute_value():: atom().
-type values():: list(attribute_value()).
-type classification():: atom().
-type classes():: list(classification()).
-type example():: {attribute_value_pairs(), classification()}.
-type examples() :: list(example()).
%%====================================================================
%% API functions
%%====================================================================
%% Learn the naive-bayes model given a set of examples
-spec learn_model(Examples :: examples()) -> model().
learn_model(Examples) ->
FreqTable = create_frequency_table(Examples),
LikelihoodTable = create_likelihood_table(Examples),
ConditionalProbabilityTable = create_conditional_probability_table(Examples, FreqTable),
Classes = sets:to_list(sets:from_list(classes(Examples))),
{FreqTable, LikelihoodTable, ConditionalProbabilityTable, Classes}.
%% Classify a set of attributes to a given class given a Model
-spec classify(Model::model(), Evidence::attribute_value_pairs())-> {ClassProbability :: float(), Class :: classification()}.
classify({_,LikelihoodTable,ConditionalProbabilityTable,Classes}, Evidence)->
PosteriorProbabilities = lists:map(fun(C) -> {posterior_probability(C, Evidence, LikelihoodTable, ConditionalProbabilityTable), C} end, Classes),
{ClassProb, Class} = lists:max(PosteriorProbabilities),
{ClassProb, Class}.
%%====================================================================
%% Internal functions
%%====================================================================
%% Calculate Posterior probability of Class given evidence and Conditional probabilities
-spec posterior_probability(Class::classification(), Evidence::attribute_value_pairs(), LikelihoodTable::likelihood_table(), ConditionalProbabilityTable::conditional_probability_table()) -> PosteriorProbability::float().
posterior_probability(Class, Evidence, LikelihoodTable, ConditionalProbabilityTable)->
{Class, ClassProb} = lists:keyfind(Class, 1, LikelihoodTable),
ConditionalProbList = lists:map(fun(AV) ->
{AV, Class, AttributeValueProb} = lists:keyfind(Class, 2, lists:filter(fun({AV2, _, _})-> AV =:= AV2 end, ConditionalProbabilityTable)),
AttributeValueProb
end, Evidence),
ConditionalProb = lists:foldl(fun(P, Acc) ->
P*Acc
end, ClassProb, ConditionalProbList),
EvidenceProbList = lists:map(fun(AV) ->
{AV, AVProb} = lists:keyfind(AV, 1, LikelihoodTable),
AVProb
end, Evidence),
EvidenceProb = lists:foldl(fun(P, Acc) ->
P*Acc
end, 1, EvidenceProbList),
ConditionalProb/EvidenceProb.
%% Create conditional probability table for given exampels and frequency table
-spec create_conditional_probability_table(Examples:: examples(), FreqTable::frequency_table()) -> conditional_probability_table().
create_conditional_probability_table(Examples, FreqTable)->
AttributeValues = sets:to_list(sets:from_list(attribute_value_pairs(Examples))),
Classes = sets:to_list(sets:from_list(classes(Examples))),
lists:flatten(lists:foldl(fun(AV, Acc) ->
ConditionalProbabilityList = lists:foldl(fun(C, Acc2) ->
[{AV, C, conditional_probability(AV, C, FreqTable)}|Acc2]
end, [], Classes),
[ConditionalProbabilityList|Acc]
end, [], AttributeValues)).
%% Calculate conditional probability for Attribute-Value pair given Class
-spec conditional_probability(attribute_value_pair(), classification(), frequency_table()) -> ConditionalProbability::float().
conditional_probability({Attribute, Value}, Class, FreqTable)->
{all, L} = lists:keyfind(all, 1, FreqTable),
{Class, [{Class, ClassFreq}]} = lists:keyfind(Class, 1, L),
{Attribute, AttributeTable} = lists:keyfind(Attribute, 1, FreqTable),
{Value, ValueClassCountList} = lists:keyfind(Value, 1, AttributeTable),
{Class, AttributeValueGivenClassFreq} = lists:keyfind(Class, 1, ValueClassCountList),
AttributeValueGivenClassFreq/ClassFreq.
%% Create the likelihood table (probabilities of classes and attribute-value pairs) given a set of examples
-spec create_likelihood_table(Examples :: examples()) -> likelihood_table().
create_likelihood_table(Examples) ->
AttributeValues = attribute_value_pairs(Examples),
Classes = classes(Examples),
ClassesAttributes = Classes ++ AttributeValues,
ClassesAttributesSet = sets:to_list(sets:from_list(ClassesAttributes)),
LikelihoodTable = lists:map(fun(CA) ->
Count = lists:foldl(fun(CA2, Acc) ->
case CA2 of
CA ->
Acc + 1;
_ ->
Acc
end
end, 0, ClassesAttributes),
Probability = Count / length(Examples),
{CA, Probability}
end, ClassesAttributesSet),
LikelihoodTable.
%% Create frequency table given a set of examples
-spec create_frequency_table(Examples :: examples()) -> frequency_table().
create_frequency_table(Examples)->
Attributes = attributes(Examples),
FreqAttributes = lists:foldl(fun(A, Acc) ->
[{A, create_attribute_table(A,Examples)}|Acc]
end, [], sets:to_list(sets:from_list(Attributes))),
Classes = classes(Examples),
FreqTotal = {all, lists:map(fun(C) ->
Count = length(lists:filter(fun(C2) -> C2 =:= C end, Classes)),
{C, [{C, Count}]}
end, sets:to_list(sets:from_list(Classes)))},
FreqAttributes ++ [FreqTotal].
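%% For the play-tennis examples below, the resulting table has this shape
%% (outlook shown in full, other attributes elided; tuple order may vary,
%% and the `all` row holds the raw class counts):
%%   [{outlook, [{sunny,    [{play,2},{not_play,3}]},
%%               {overcast, [{play,4},{not_play,0}]},
%%               {rain,     [{play,3},{not_play,2}]}]},
%%    ...,
%%    {all, [{play,[{play,9}]},{not_play,[{not_play,5}]}]}]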
%% Create attribute table for a given attribute and set of examples
-spec create_attribute_table(Attribute :: attribute(), Examples :: examples()) -> attribute_table().
create_attribute_table(Attribute, Examples)->
Values = values(Examples, Attribute),
lists:foldl(fun(V, Acc) ->
[{V, count_classifications_for_attribute_value(Attribute, V, Examples)}|Acc]
end, [], sets:to_list(sets:from_list(Values))).
%% Count classifications per value for a given attribute
-spec count_classifications_for_attribute_value(Attribute::attribute(), Value :: attribute_value(), Examples :: examples()) -> attribute_table_row().
count_classifications_for_attribute_value(Attribute, Value, Examples)->
Classes = classes(Examples),
CountedClasses = lists:map(fun(C) ->
Count = lists:foldl(fun
({AVs, C2}, Acc) ->
case C2 of
C ->
case lists:member({Attribute, Value}, AVs) of
true ->
Acc + 1;
false ->
Acc
end;
_ ->
Acc
end
end, 0, Examples),
{C, Count}
end, sets:to_list(sets:from_list(Classes))),
CountedClasses.
%% Extract list of classes from Examples
-spec classes(Examples :: examples()) -> Classes::classes().
classes(Examples)->
lists:map(fun({_, C}) -> C end, Examples).
%% Extract the values observed for a given attribute across all Examples
-spec values(Examples :: examples(), Attribute :: attribute()) -> Values::values().
values(Examples, Attribute)->
lists:map(fun({_,V}) -> V end,
lists:filter(fun({A,_}) -> A =:= Attribute end,
attribute_value_pairs(Examples))).
%% Extract attribute-value-pairs from Examples
-spec attribute_value_pairs(Examples :: examples()) -> AttributeValues::attribute_value_pairs().
attribute_value_pairs(Examples)->
lists:flatten(lists:map(fun({A, _}) -> A end, Examples)).
%% Extract attributes from Examples
-spec attributes(Examples :: examples()) -> Attributes::attributes().
attributes(Examples)->
    lists:map(fun({A,_}) -> A end, attribute_value_pairs(Examples)).
%%====================================================================
%% Example Data
%%====================================================================
%% Sample set of examples
-spec examples_play_tennis() -> examples().
examples_play_tennis()->
[
{[
{outlook,sunny},{temperature,hot},{humidity,high},{windy,false}
], not_play
},
{[
{outlook,sunny},{temperature,hot},{humidity,high},{windy,true}
], not_play
},
{[
{outlook,overcast},{temperature,hot},{humidity,high},{windy,false}
], play
},
{[
{outlook,rain},{temperature,mild},{humidity,high},{windy,false}
], play
},
{[
{outlook,rain},{temperature,cool},{humidity,normal},{windy,false}
], play
},
{[
{outlook,rain},{temperature,cool},{humidity,normal},{windy,true}
], not_play
},
{[
{outlook,overcast},{temperature,cool},{humidity,normal},{windy,true}
], play
},
{[
{outlook,sunny},{temperature,mild},{humidity,high},{windy,false}
], not_play
},
{[
{outlook,sunny},{temperature,cool},{humidity,normal},{windy,false}
], play
},
{[
{outlook,rain},{temperature,mild},{humidity,normal},{windy,false}
], play
},
{[
{outlook,sunny},{temperature,mild},{humidity,normal},{windy,true}
], play
},
{[
{outlook,overcast},{temperature,mild},{humidity,high},{windy,true}
], play
},
{[
{outlook,overcast},{temperature,hot},{humidity,normal},{windy,false}
], play
},
{[
{outlook,rain},{temperature,mild},{humidity,high},{windy,true}
], not_play
}
].
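%% Example usage (hypothetical helper, not part of the original API):
%% builds all tables from the sample data and computes the posterior
%% probability that tennis is not played on a sunny, humid day.
%% Export it (or compile with export_all) to call it from the shell.
-spec demo_classify() -> float().
demo_classify() ->
    Examples = examples_play_tennis(),
    LikelihoodTable = create_likelihood_table(Examples),
    FreqTable = create_frequency_table(Examples),
    CondTable = create_conditional_probability_table(Examples, FreqTable),
    Evidence = [{outlook, sunny}, {humidity, high}],
    posterior_probability(not_play, Evidence, LikelihoodTable, CondTable).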