content_type stringclasses 8 values | main_lang stringclasses 7 values | message stringlengths 1 50 | sha stringlengths 40 40 | patch stringlengths 52 962k | file_count int64 1 300 |
|---|---|---|---|---|---|
PHP | PHP | revert one change | a2f859fb3636850ae49cc35fe245d7ac54586bf3 | <ide><path>src/Cache/Cache.php
<ide> *
<ide> * ```
<ide> * Cache::config('shared', [
<del> * 'className' => \Cake\Cache\Engine\ApcuEngine::class,
<add> * 'className' => Cake\Cache\Engine\ApcuEngine::class,
<ide> * 'prefix' => 'my_app_'
<ide> * ]);
<ide> * ``` | 1 |
Text | Text | add missing readme descriptions | 339230eebf973796cdc7ac8b313a97e01aff8294 | <ide><path>README.md
<ide> running TensorFlow 0.12 or earlier, please
<ide> - [neural_programmer](neural_programmer): neural network augmented with logic and mathematic operations.
<ide> - [next_frame_prediction](next_frame_prediction): probabilistic future frame synthesis via cross convolutional networks.
<ide> - [object_detection](object_detection): localizing and identifying multiple objects in a single image.
<add>- [pcl_rl](pcl_rl): code for several reinforcement learning algorithms, including Path Consistency Learning.
<add>- [ptn](ptn): perspective transformer nets for 3D object reconstruction.
<add>- [qa_kg](qa_kg): module networks for question answering on knowledge graphs.
<ide> - [real_nvp](real_nvp): density estimation using real-valued non-volume preserving (real NVP) transformations.
<ide> - [rebar](rebar): low-variance, unbiased gradient estimates for discrete latent variable models.
<ide> - [resnet](resnet): deep and wide residual networks. | 1 |
Javascript | Javascript | fix eventlistener fork | ebbe599a25f4d751787ad49a9026215f14dc6d03 | <ide><path>scripts/rollup/forks.js
<ide> const forks = Object.freeze({
<ide> },
<ide>
<ide> // We wrap top-level listeners into guards on www.
<del> './packages/react-dom/src/events/EventListener.js': (bundleType, entry) => {
<add> './packages/react-dom-bindings/src/events/EventListener.js': (
<add> bundleType,
<add> entry
<add> ) => {
<ide> switch (bundleType) {
<ide> case FB_WWW_DEV:
<ide> case FB_WWW_PROD:
<ide> const forks = Object.freeze({
<ide> return null;
<ide> } else {
<ide> // Use the www fork which is integrated with TimeSlice profiling.
<del> return './packages/react-dom/src/events/forks/EventListener-www.js';
<add> return './packages/react-dom-bindings/src/events/forks/EventListener-www.js';
<ide> }
<ide> default:
<ide> return null; | 1 |
PHP | PHP | add comment explaining behavior | b430cd2d4895b9c0794edb7dee2f99285b5aed48 | <ide><path>src/Controller/Controller.php
<ide> protected function chooseViewClass(): ?string
<ide> if (empty($possibleViewClasses)) {
<ide> return null;
<ide> }
<add> // Controller or component has already made a view class decision.
<add> // That decision should overwrite the framework behavior.
<ide> if ($this->viewBuilder()->getClassName() !== null) {
<ide> return null;
<ide> } | 1 |
Ruby | Ruby | fix arg destructure | b1164adda12268b38bba9b0d81c0d26b7251b8bb | <ide><path>activesupport/lib/active_support/cache.rb
<ide> require 'benchmark'
<add>require 'active_support/core_ext/array/wrap'
<ide> require 'active_support/core_ext/benchmark'
<ide> require 'active_support/core_ext/exception'
<ide> require 'active_support/core_ext/class/attribute_accessors'
<ide> module Strategy
<ide> # ActiveSupport::Cache.lookup_store(MyOwnCacheStore.new)
<ide> # # => returns MyOwnCacheStore.new
<ide> def self.lookup_store(*store_option)
<del> store = store_option.shift
<del> parameters = store_option
<add> store, *parameters = *Array.wrap(store_option).flatten
<ide>
<ide> case store
<ide> when Symbol | 1 |
Javascript | Javascript | update validation snapshot | 8af4c0800c5b29e00a12c0445f8490fa1c432f3f | <ide><path>test/Validation.test.js
<ide> describe("Validation", () => {
<ide> expect(msg).toMatchInlineSnapshot(`
<ide> "Invalid configuration object. Webpack has been initialised using a configuration object that does not match the API schema.
<ide> - configuration has an unknown property 'postcss'. These properties are valid:
<del> object { amd?, bail?, cache?, context?, dependencies?, devServer?, devtool?, entry?, externals?, loader?, mode?, module?, name?, node?, optimization?, output?, parallelism?, performance?, plugins?, profile?, recordsInputPath?, recordsOutputPath?, recordsPath?, resolve?, resolveLoader?, serve?, stats?, target?, watch?, watchOptions? }
<add> object { amd?, bail?, cache?, context?, dependencies?, devServer?, devtool?, entry?, experiments?, externals?, loader?, mode?, module?, name?, node?, optimization?, output?, parallelism?, performance?, plugins?, profile?, recordsInputPath?, recordsOutputPath?, recordsPath?, resolve?, resolveLoader?, serve?, stats?, target?, watch?, watchOptions? }
<ide> For typos: please correct them.
<ide> For loader options: webpack >= v2.0.0 no longer allows custom properties in configuration.
<ide> Loaders should be updated to allow passing options via loader options in module.rules. | 1 |
Ruby | Ruby | move `.tar.xz` logic from `xz` to `tar` | f8dc9eff5898d4b9d75c409954540777ab844c23 | <ide><path>Library/Homebrew/unpack_strategy/tar.rb
<ide> def self.can_extract?(path)
<ide> private
<ide>
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<del> system_command! "tar", args: ["xf", path, "-C", unpack_dir]
<add> Dir.mktmpdir do |tmpdir|
<add> tar_path = path
<add>
<add> if DependencyCollector.tar_needs_xz_dependency? && Xz.can_extract?(path)
<add> tmpdir = Pathname(tmpdir)
<add> Xz.new(path).extract(to: tmpdir)
<add> tar_path = tmpdir.children.first
<add> end
<add>
<add> system_command! "tar", args: ["xf", tar_path, "-C", unpack_dir]
<add> end
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/xz.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command! "unxz",
<ide> args: [*quiet_flags, "-T0", "--", unpack_dir/basename],
<ide> env: { "PATH" => PATH.new(Formula["xz"].opt_bin, ENV["PATH"]) }
<del> extract_nested_tar(unpack_dir)
<del> end
<del>
<del> def extract_nested_tar(unpack_dir)
<del> return unless DependencyCollector.tar_needs_xz_dependency?
<del> return if (children = unpack_dir.children).count != 1
<del> return if (tar = children.first).extname != ".tar"
<del>
<del> Dir.mktmpdir do |tmpdir|
<del> tmpdir = Pathname(tmpdir)
<del> FileUtils.mv tar, tmpdir/tar.basename
<del> Tar.new(tmpdir/tar.basename).extract(to: unpack_dir)
<del> end
<ide> end
<ide> end
<ide> end | 2 |
Javascript | Javascript | fix bug when applying common.glsl | ea188856bf35610a2e968a79c9e4ea5136fd3397 | <ide><path>src/renderers/shaders/ShaderLib.js
<ide> THREE.ShaderLib = {
<ide>
<ide> "void main() {",
<ide>
<del> " vec4 worldPosition = transformNormal( position, modelMatrix );",
<add> " vWorldPosition = transformNormal( position, modelMatrix );",
<ide>
<ide> " gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
<ide>
<ide> THREE.ShaderLib = {
<ide>
<ide> "void main() {",
<ide>
<del> " vec4 worldPosition = transformNormal( position, modelMatrix );",
<add> " vWorldPosition = transformNormal( position, modelMatrix );",
<ide>
<ide> " gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
<ide> | 1 |
Javascript | Javascript | reduce forwardref(view) noise in systrace | 3aea678c38b3f2f208c039a22fc9c6f841826e81 | <ide><path>Libraries/Components/View/View.js
<ide> export type Props = ViewProps;
<ide>
<ide> let ViewToExport = ViewNativeComponent;
<ide> if (__DEV__) {
<del> const View = (
<del> props: Props,
<del> forwardedRef: React.Ref<typeof ViewNativeComponent>,
<del> ) => {
<del> return (
<del> <TextAncestor.Consumer>
<del> {hasTextAncestor => {
<del> invariant(
<del> !hasTextAncestor,
<del> 'Nesting of <View> within <Text> is not currently supported.',
<del> );
<del> return <ViewNativeComponent {...props} ref={forwardedRef} />;
<del> }}
<del> </TextAncestor.Consumer>
<del> );
<del> };
<del> // $FlowFixMe - TODO T29156721 `React.forwardRef` is not defined in Flow, yet.
<del> ViewToExport = React.forwardRef(View);
<add> if (!global.__RCTProfileIsProfiling) {
<add> const View = (
<add> props: Props,
<add> forwardedRef: React.Ref<typeof ViewNativeComponent>,
<add> ) => {
<add> return (
<add> <TextAncestor.Consumer>
<add> {hasTextAncestor => {
<add> invariant(
<add> !hasTextAncestor,
<add> 'Nesting of <View> within <Text> is not currently supported.',
<add> );
<add> return <ViewNativeComponent {...props} ref={forwardedRef} />;
<add> }}
<add> </TextAncestor.Consumer>
<add> );
<add> };
<add> // $FlowFixMe - TODO T29156721 `React.forwardRef` is not defined in Flow, yet.
<add> ViewToExport = React.forwardRef(View);
<add> }
<ide> }
<ide>
<ide> module.exports = ((ViewToExport: $FlowFixMe): typeof ViewNativeComponent); | 1 |
PHP | PHP | fix bad merge | ac6fc32c2933d5868a2c0d2aea4af9405d710378 | <ide><path>src/Illuminate/Http/Concerns/InteractsWithInput.php
<ide> public function bearerToken()
<ide> if ($position !== false) {
<ide> $header = substr($header, $position + 7);
<ide>
<del><<<<<<< HEAD
<del> return str_contains($header, ',') ? strstr(',', $header, true) : $header;
<del>=======
<del> return strpos($header, ',') !== false ? strstr($header, ',', true) : $header;
<del>>>>>>>> 8.x
<add> return str_contains($header, ',') ? strstr($header, ',', true) : $header;
<ide> }
<ide> }
<ide> | 1 |
Python | Python | remove redundant parenthesis | 60cf315d1b61e823ebc73714800bc9e1e3833c12 | <ide><path>airflow/executors/celery_executor.py
<ide> def trigger_tasks(self, open_slots: int) -> None:
<ide>
<ide> task_tuples_to_send: List[TaskInstanceInCelery] = []
<ide>
<del> for _ in range(min((open_slots, len(self.queued_tasks)))):
<add> for _ in range(min(open_slots, len(self.queued_tasks))):
<ide> key, (command, _, queue, simple_ti) = sorted_queue.pop(0)
<ide> task_tuples_to_send.append((key, simple_ti, command, queue, execute_command))
<ide> | 1 |
Python | Python | move pickle import to numpy.compat | b6dc039961768bd5f3a3d7f57e8c396f8fa02815 | <ide><path>numpy/compat/py3k.py
<ide> 'unicode', 'asunicode', 'asbytes_nested', 'asunicode_nested',
<ide> 'asstr', 'open_latin1', 'long', 'basestring', 'sixu',
<ide> 'integer_types', 'is_pathlib_path', 'npy_load_module', 'Path',
<del> 'contextlib_nullcontext', 'os_fspath', 'os_PathLike']
<add> 'pickle', 'contextlib_nullcontext', 'os_fspath', 'os_PathLike']
<ide>
<ide> import sys
<ide> try:
<ide> if sys.version_info[0] >= 3:
<ide> import io
<ide>
<add> try:
<add> import pickle5 as pickle
<add> except ImportError:
<add> import pickle
<add>
<ide> long = int
<ide> integer_types = (int,)
<ide> basestring = str
<ide> def sixu(s):
<ide>
<ide> strchar = 'U'
<ide>
<del>
<ide> else:
<add> import cpickle as pickle
<add>
<ide> bytes = str
<ide> long = long
<ide> basestring = basestring
<ide> def open_latin1(filename, mode='r'):
<ide> def sixu(s):
<ide> return unicode(s, 'unicode_escape')
<ide>
<del>
<ide> def getexception():
<ide> return sys.exc_info()[1]
<ide>
<ide><path>numpy/core/numeric.py
<ide> import contextlib
<ide>
<ide> import numpy as np
<add>from numpy.compat import pickle, basestring
<ide> from . import multiarray
<ide> from .multiarray import (
<ide> _fastCopyAndTranspose as fastCopyAndTranspose, ALLOW_THREADS,
<ide> newaxis = None
<ide>
<ide> if sys.version_info[0] >= 3:
<del> if sys.version_info[1] in (6, 7):
<del> try:
<del> import pickle5 as pickle
<del> except ImportError:
<del> import pickle
<del> else:
<del> import pickle
<del> basestring = str
<ide> import builtins
<ide> else:
<del> import cPickle as pickle
<ide> import __builtin__ as builtins
<ide>
<ide>
<ide><path>numpy/core/tests/test_datetime.py
<ide> assert_, assert_equal, assert_raises, assert_warns, suppress_warnings,
<ide> assert_raises_regex,
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> # Use pytz to test out various time zones if available
<ide> try:
<ide><path>numpy/core/tests/test_dtype.py
<ide> from numpy.core._rational_tests import rational
<ide> from numpy.testing import (
<ide> assert_, assert_equal, assert_array_equal, assert_raises, HAS_REFCOUNT)
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> def assert_dtype_equal(a, b):
<ide> assert_equal(a, b)
<ide><path>numpy/core/tests/test_multiarray.py
<ide> import pytest
<ide> from contextlib import contextmanager
<ide>
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> if sys.version_info[0] >= 3:
<ide> import builtins
<ide><path>numpy/core/tests/test_overrides.py
<ide> from numpy.core.overrides import (
<ide> _get_implementing_args, array_function_dispatch,
<ide> verify_matching_signatures)
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide> import pytest
<ide>
<ide>
<ide><path>numpy/core/tests/test_records.py
<ide> assert_, assert_equal, assert_array_equal, assert_array_almost_equal,
<ide> assert_raises, temppath
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide>
<ide> class TestFromrecords(object):
<ide><path>numpy/core/tests/test_regression.py
<ide> assert_raises_regex, assert_warns, suppress_warnings,
<ide> _assert_valid_refcount, HAS_REFCOUNT,
<ide> )
<del>from numpy.compat import asbytes, asunicode, long
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import asbytes, asunicode, long, pickle
<ide>
<ide> try:
<ide> RecursionError
<ide><path>numpy/core/tests/test_ufunc.py
<ide> assert_almost_equal, assert_array_almost_equal, assert_no_warnings,
<ide> assert_allclose,
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide>
<ide> class TestUfuncKwargs(object):
<ide><path>numpy/lib/format.py
<ide> import warnings
<ide> from numpy.lib.utils import safe_eval
<ide> from numpy.compat import (
<del> asbytes, asstr, isfileobj, long, os_fspath
<add> asbytes, asstr, isfileobj, long, os_fspath, pickle
<ide> )
<del>from numpy.core.numeric import pickle
<ide>
<ide>
<ide> MAGIC_PREFIX = b'\x93NUMPY'
<ide><path>numpy/lib/npyio.py
<ide>
<ide> from numpy.compat import (
<ide> asbytes, asstr, asunicode, asbytes_nested, bytes, basestring, unicode,
<del> os_fspath, os_PathLike
<add> os_fspath, os_PathLike, pickle
<ide> )
<del>from numpy.core.numeric import pickle
<ide>
<ide> if sys.version_info[0] >= 3:
<ide> from collections.abc import Mapping
<ide><path>numpy/ma/core.py
<ide> from numpy.core.multiarray import normalize_axis_index
<ide> from numpy.core.numeric import normalize_axis_tuple
<ide> from numpy.core._internal import recursive
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide>
<ide> __all__ = [
<ide><path>numpy/ma/tests/test_core.py
<ide> ravel, repeat, reshape, resize, shape, sin, sinh, sometrue, sort, sqrt,
<ide> subtract, sum, take, tan, tanh, transpose, where, zeros,
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> pi = np.pi
<ide>
<ide><path>numpy/ma/tests/test_mrecords.py
<ide> assert_, assert_equal,
<ide> assert_equal_records,
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide>
<ide> class TestMRecords(object):
<ide><path>numpy/ma/tests/test_old_ma.py
<ide> repeat, resize, shape, sin, sinh, sometrue, sort, sqrt, subtract, sum,
<ide> take, tan, tanh, transpose, where, zeros,
<ide> )
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> pi = np.pi
<ide>
<ide><path>numpy/matrixlib/tests/test_masked_matrix.py
<ide> MaskType, getmask, MaskedArray, nomask,
<ide> log, add, hypot, divide)
<ide> from numpy.ma.extras import mr_
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide>
<ide> class MMatrix(MaskedArray, np.matrix,):
<ide><path>numpy/tests/test_reloading.py
<ide> import sys
<ide>
<ide> from numpy.testing import assert_raises, assert_, assert_equal
<del>from numpy.core.numeric import pickle
<add>from numpy.compat import pickle
<ide>
<ide> if sys.version_info[:2] >= (3, 4):
<ide> from importlib import reload | 17 |
Javascript | Javascript | export more helpers in videojs object | 9d832eca5f997e040a554cd2e44c11b3800079a1 | <ide><path>src/js/video.js
<ide> import * as Fn from './utils/fn.js';
<ide> import TextTrack from './tracks/text-track.js';
<ide> import AudioTrack from './tracks/audio-track.js';
<ide> import VideoTrack from './tracks/video-track.js';
<del>
<ide> import { createTimeRanges } from './utils/time-ranges.js';
<ide> import formatTime, { setFormatTime, resetFormatTime } from './utils/format-time.js';
<ide> import log, { createLogger } from './utils/log.js';
<ide> import * as Dom from './utils/dom.js';
<ide> import * as browser from './utils/browser.js';
<ide> import * as Url from './utils/url.js';
<del>import {isObject} from './utils/obj';
<add>import * as Obj from './utils/obj';
<add>import clamp from './utils/clamp';
<add>import { isPromise, silencePromise } from './utils/promise';
<add>import * as StringCases from './utils/string-cases';
<ide> import computedStyle from './utils/computed-style.js';
<ide> import extend from './extend.js';
<ide> import xhr from '@videojs/xhr';
<ide> function videojs(id, options, ready) {
<ide> hooks('beforesetup').forEach((hookFunction) => {
<ide> const opts = hookFunction(el, mergeOptions(options));
<ide>
<del> if (!isObject(opts) || Array.isArray(opts)) {
<add> if (!Obj.isObject(opts) || Array.isArray(opts)) {
<ide> log.error('please return an object in beforesetup hooks');
<ide> return;
<ide> }
<ide> videojs.defineLazyProperty = defineLazyProperty;
<ide> // In a major update this could become the default text and key.
<ide> videojs.addLanguage('en', {'Non-Fullscreen': 'Exit Fullscreen'});
<ide>
<add>videojs.clamp = clamp;
<add>videojs.fn = Fn;
<add>videojs.obj = Obj;
<add>videojs.isPromise = isPromise;
<add>videojs.silencePromise = silencePromise;
<add>videojs.strings = StringCases;
<add>
<ide> export default videojs;
<ide> | 1 |
Text | Text | add note on defaults block option | 6a116bd9091fc68dc26317b398ebcbdfe8c34155 | <ide><path>guides/source/routing.md
<ide> get 'photos/:id', to: 'photos#show', defaults: { format: 'jpg' }
<ide>
<ide> Rails would match `photos/12` to the `show` action of `PhotosController`, and set `params[:format]` to `"jpg"`.
<ide>
<add>You can also use `defaults` in a block format to define the defaults for multiple items:
<add>
<add>```ruby
<add>defaults format: :json do
<add> resources :photos
<add>end
<add>```
<add>
<ide> NOTE: You cannot override defaults via query parameters - this is for security reasons. The only defaults that can be overridden are dynamic segments via substitution in the URL path.
<ide>
<ide> ### Naming Routes | 1 |
Ruby | Ruby | move dependencies to softwarespec | 5511a8b3f528445a79ae04138698e43620400640 | <ide><path>Library/Homebrew/formula.rb
<del>require 'dependency_collector'
<ide> require 'formula_support'
<ide> require 'formula_lock'
<ide> require 'formula_pin'
<ide> def resources
<ide> active_spec.resources.values
<ide> end
<ide>
<add> def deps
<add> active_spec.deps
<add> end
<add>
<add> def requirements
<add> active_spec.requirements
<add> end
<add>
<ide> # if the dir is there, but it's empty we consider it not installed
<ide> def installed?
<ide> (dir = installed_prefix).directory? && dir.children.length > 0
<ide> def self.path name
<ide> Pathname.new("#{HOMEBREW_REPOSITORY}/Library/Formula/#{name.downcase}.rb")
<ide> end
<ide>
<del> def deps; self.class.dependencies.deps; end
<del> def requirements; self.class.dependencies.requirements; end
<del>
<ide> def env
<ide> @env ||= self.class.env
<ide> end
<ide> def resource name, &block
<ide> end
<ide> end
<ide>
<del> def dependencies
<del> @dependencies ||= DependencyCollector.new
<del> end
<del>
<ide> def depends_on dep
<del> d = dependencies.add(dep)
<del> build.add_dep_option(d) unless d.nil?
<add> specs.each { |spec| spec.depends_on(dep) }
<ide> end
<ide>
<ide> def option name, description=nil
<ide><path>Library/Homebrew/software_spec.rb
<ide> require 'checksum'
<ide> require 'version'
<ide> require 'build_options'
<add>require 'dependency_collector'
<ide>
<ide> class SoftwareSpec
<ide> extend Forwardable
<ide>
<ide> attr_reader :build, :resources, :owner
<add> attr_reader :dependency_collector
<ide>
<ide> def_delegators :@resource, :stage, :fetch
<ide> def_delegators :@resource, :download_strategy, :verify_download_integrity
<ide> def initialize url=nil, version=nil
<ide> @resource = Resource.new(:default, url, version)
<ide> @resources = {}
<ide> @build = BuildOptions.new(ARGV.options_only)
<add> @dependency_collector = DependencyCollector.new
<ide> end
<ide>
<ide> def owner= owner
<ide> def option name, description=nil
<ide> raise "Options should not start with dashes." if name[0, 1] == "-"
<ide> build.add(name, description)
<ide> end
<add>
<add> def depends_on spec
<add> dep = dependency_collector.add(spec)
<add> build.add_dep_option(dep) if dep
<add> end
<add>
<add> def deps
<add> dependency_collector.deps
<add> end
<add>
<add> def requirements
<add> dependency_collector.requirements
<add> end
<ide> end
<ide>
<ide> class HeadSoftwareSpec < SoftwareSpec
<ide><path>Library/Homebrew/test/test_formula.rb
<ide> def initialize(*args)
<ide> ensure
<ide> path.unlink
<ide> end
<del>
<del> def test_dependency_option_integration
<del> f = formula do
<del> url 'foo-1.0'
<del> depends_on 'foo' => :optional
<del> depends_on 'bar' => :recommended
<del> end
<del>
<del> assert f.build.has_option?('with-foo')
<del> assert f.build.has_option?('without-bar')
<del> end
<del>
<del> def test_explicit_options_override_default_dep_option_description
<del> f = formula do
<del> url 'foo-1.0'
<del> option 'with-foo', 'blah'
<del> depends_on 'foo' => :optional
<del> end
<del>
<del> assert_equal 'blah', f.build.first.description
<del> end
<ide> end
<ide><path>Library/Homebrew/test/test_software_spec.rb
<ide> def test_option_accepts_symbols
<ide> @spec.option(:foo)
<ide> assert @spec.build.has_option? 'foo'
<ide> end
<add>
<add> def test_depends_on
<add> @spec.depends_on('foo')
<add> assert_equal 'foo', @spec.deps.first.name
<add> end
<add>
<add> def test_dependency_option_integration
<add> @spec.depends_on 'foo' => :optional
<add> @spec.depends_on 'bar' => :recommended
<add> assert @spec.build.has_option?('with-foo')
<add> assert @spec.build.has_option?('without-bar')
<add> end
<add>
<add> def test_explicit_options_override_default_dep_option_description
<add> @spec.option('with-foo', 'blah')
<add> @spec.depends_on('foo' => :optional)
<add> assert_equal 'blah', @spec.build.first.description
<add> end
<ide> end
<ide>
<ide> class HeadSoftwareSpecTests < Test::Unit::TestCase | 4 |
Java | Java | refine resolvabletype class | 3337fd32cba66aee549e4dddae61b86fe80832e3 | <ide><path>spring-core/src/main/java/org/springframework/core/GenericTypeResolver.java
<ide> public static Class[] resolveTypeArguments(Class<?> clazz, Class<?> genericIfc)
<ide> * @deprecated as of Spring 4.0 in favor of {@link ResolvableType}
<ide> */
<ide> @Deprecated
<del> public static Class<?> resolveType(Type genericType, Map<TypeVariable, Type> typeVariableMap) {
<del> TypeVariableResolver variableResolver = new TypeVariableMapResolver(typeVariableMap);
<del> Class<?> resolved = ResolvableType.forType(genericType, variableResolver).resolve();
<del> return (resolved == null ? Object.class : resolved);
<add> public static Class<?> resolveType(Type genericType, final Map<TypeVariable, Type> typeVariableMap) {
<add>
<add> ResolvableType.VariableResolver variableResolver = new ResolvableType.VariableResolver() {
<add> @Override
<add> public ResolvableType resolveVariable(TypeVariable<?> variable) {
<add> Type type = typeVariableMap.get(variable);
<add> return (type == null ? null : ResolvableType.forType(type));
<add> }
<add>
<add> @Override
<add> public Object getSource() {
<add> return typeVariableMap;
<add> }
<add> };
<add>
<add> return ResolvableType.forType(genericType, variableResolver).resolve(Object.class);
<ide> }
<ide>
<ide> /**
<ide> private static void buildTypeVariableMap(ResolvableType type, Map<TypeVariable,
<ide> }
<ide> }
<ide>
<del>
<del> /**
<del> * Adapts a {@code typeVariableMap} to a {@link TypeVariableResolver}.
<del> */
<del> private static class TypeVariableMapResolver implements TypeVariableResolver {
<del>
<del> private final Map<TypeVariable, Type> typeVariableMap;
<del>
<del> public TypeVariableMapResolver(Map<TypeVariable, Type> typeVariableMap) {
<del> Assert.notNull("TypeVariableMap must not be null");
<del> this.typeVariableMap = typeVariableMap;
<del> }
<del>
<del> @Override
<del> public Type resolveVariable(TypeVariable typeVariable) {
<del> return this.typeVariableMap.get(typeVariable);
<del> }
<del>
<del> @Override
<del> public boolean equals(Object obj) {
<del> if (this == obj) {
<del> return true;
<del> }
<del> if (obj instanceof TypeVariableMapResolver) {
<del> TypeVariableMapResolver other = (TypeVariableMapResolver) obj;
<del> return this.typeVariableMap.equals(other.typeVariableMap);
<del> }
<del> return false;
<del> }
<del>
<del> @Override
<del> public int hashCode() {
<del> return this.typeVariableMap.hashCode();
<del> }
<del> }
<del>
<ide> }
<ide><path>spring-core/src/main/java/org/springframework/core/ResolvableType.java
<ide>
<ide> package org.springframework.core;
<ide>
<add>import java.io.ObjectStreamException;
<add>import java.io.Serializable;
<ide> import java.lang.reflect.Array;
<ide> import java.lang.reflect.Constructor;
<ide> import java.lang.reflect.Field;
<ide> *
<ide> * <p>{@code ResolvableTypes} may be obtained from {@link #forField(Field) fields},
<ide> * {@link #forMethodParameter(Method, int) method parameters},
<del> * {@link #forMethodReturnType(Method) method returns}, {@link #forClass(Class) classes}, or
<del> * directly from a {@link #forType(Type) java.lang.reflect.Type}. Most methods on this class
<del> * will themselves return {@link ResolvableType}s, allowing easy navigation. For example:
<add> * {@link #forMethodReturnType(Method) method returns} or
<add> * {@link #forClass(Class) classes}. Most methods on this class will themselves return
<add> * {@link ResolvableType}s, allowing easy navigation. For example:
<ide> * <pre class="code">
<ide> * private HashMap<Integer, List<String>> myMap;
<ide> *
<ide> * @author Phillip Webb
<ide> * @author Juergen Hoeller
<ide> * @since 4.0
<del> * @see TypeVariableResolver
<ide> * @see #forField(Field)
<ide> * @see #forMethodParameter(Method, int)
<ide> * @see #forMethodReturnType(Method)
<ide> * @see #forConstructorParameter(Constructor, int)
<ide> * @see #forClass(Class)
<ide> * @see #forType(Type)
<ide> */
<del>public final class ResolvableType implements TypeVariableResolver {
<add>public final class ResolvableType implements Serializable {
<ide>
<ide> private static ConcurrentReferenceHashMap<ResolvableType, ResolvableType> cache =
<ide> new ConcurrentReferenceHashMap<ResolvableType, ResolvableType>();
<ide> public final class ResolvableType implements TypeVariableResolver {
<ide> * {@code ResolvableType} returned when no value is available. {@code NONE} is used
<ide> * in preference to {@code null} so that multiple method calls can be safely chained.
<ide> */
<del> public static final ResolvableType NONE = new ResolvableType(null, null);
<add> public static final ResolvableType NONE = new ResolvableType(null, null, null);
<ide>
<ide>
<ide> private static final ResolvableType[] EMPTY_TYPES_ARRAY = new ResolvableType[0];
<ide> public final class ResolvableType implements TypeVariableResolver {
<ide> private final Type type;
<ide>
<ide> /**
<del> * The {@link TypeVariableResolver} to use or {@code null} if no resolver is available.
<add> * The {@link VariableResolver} to use or {@code null} if no resolver is available.
<ide> */
<del> private final TypeVariableResolver variableResolver;
<add> private final VariableResolver variableResolver;
<ide>
<ide> /**
<ide> * Stored copy of the resolved value or {@code null} if the resolve method has not
<ide> * yet been called. {@code void.class} is used when the resolve method failed.
<ide> */
<ide> private Class<?> resolved;
<ide>
<add> /**
<add> * The component type for an array or {@code null} if the type should be deduced.
<add> */
<add> private final ResolvableType componentType;
<add>
<add>
<ide>
<ide> /**
<ide> * Private constructor used to create a new {@link ResolvableType}.
<ide> * @param type the underlying java type (may only be {@code null} for {@link #NONE})
<ide> * @param variableResolver the resolver used for {@link TypeVariable}s (may be {@code null})
<add> * @param componentType an option declared component type for arrays (may be {@code null})
<ide> */
<del> private ResolvableType(Type type, TypeVariableResolver variableResolver) {
<add> private ResolvableType(Type type, VariableResolver variableResolver, ResolvableType componentType) {
<ide> this.type = type;
<ide> this.variableResolver = variableResolver;
<add> this.componentType = componentType;
<ide> }
<ide>
<ide>
<ide> public ResolvableType getComponentType() {
<ide> if (this == NONE) {
<ide> return NONE;
<ide> }
<add> if (this.componentType != null) {
<add> return this.componentType;
<add> }
<ide> if (this.type instanceof Class) {
<ide> Class<?> componentType = ((Class<?>) this.type).getComponentType();
<del> return (componentType == null ? NONE : forType(componentType,
<del> this.variableResolver));
<add> return forType(componentType, this.variableResolver);
<ide> }
<ide> if (this.type instanceof GenericArrayType) {
<ide> return forType(((GenericArrayType) this.type).getGenericComponentType(),
<ide> public ResolvableType as(Class<?> type) {
<ide> * @see #getInterfaces()
<ide> */
<ide> public ResolvableType getSuperType() {
<del> Class<?> resolved = resolve();
<add> final Class<?> resolved = resolve();
<ide> if (resolved == null || resolved.getGenericSuperclass() == null) {
<ide> return NONE;
<ide> }
<del> return forType(resolved.getGenericSuperclass(), this);
<add> return forType(SerializableTypeWrapper.forGenericSuperclass(resolved), asVariableResolver());
<ide> }
<ide>
<ide> /**
<ide> public ResolvableType getSuperType() {
<ide> * @see #getSuperType()
<ide> */
<ide> public ResolvableType[] getInterfaces() {
<del> Class<?> resolved = resolve();
<add> final Class<?> resolved = resolve();
<ide> if (resolved == null || ObjectUtils.isEmpty(resolved.getGenericInterfaces())) {
<ide> return EMPTY_TYPES_ARRAY;
<ide> }
<del> Type[] interfaceTypes = resolved.getGenericInterfaces();
<del> ResolvableType[] interfaces = new ResolvableType[interfaceTypes.length];
<del> for (int i = 0; i < interfaceTypes.length; i++) {
<del> interfaces[i] = forType(interfaceTypes[i], this);
<del> }
<del> return interfaces;
<add> return forTypes(SerializableTypeWrapper.forGenericInterfaces(resolved), asVariableResolver());
<ide> }
<ide>
<ide> /**
<ide> public ResolvableType[] getGenerics() {
<ide> if (this == NONE) {
<ide> return EMPTY_TYPES_ARRAY;
<ide> }
<add> if (this.type instanceof Class<?>) {
<add> Class<?> typeClass = (Class<?>) this.type;
<add> return forTypes(SerializableTypeWrapper.forTypeParameters(typeClass), this.variableResolver);
<add> }
<ide> if (this.type instanceof ParameterizedType) {
<del> Type[] genericTypes = ((ParameterizedType) getType()).getActualTypeArguments();
<del> ResolvableType[] generics = new ResolvableType[genericTypes.length];
<del> for (int i = 0; i < genericTypes.length; i++) {
<del> generics[i] = forType(genericTypes[i], this);
<add> Type[] actualTypeArguments = ((ParameterizedType) this.type).getActualTypeArguments();
<add> ResolvableType[] generics = new ResolvableType[actualTypeArguments.length];
<add> for (int i = 0; i < actualTypeArguments.length; i++) {
<add> generics[i] = forType(actualTypeArguments[i], this.variableResolver);
<ide> }
<ide> return generics;
<ide> }
<ide> public Class<?> resolve() {
<ide> */
<ide> public Class<?> resolve(Class<?> fallback) {
<ide> if (this.resolved == null) {
<del> synchronized (this) {
<del> this.resolved = resolveClass();
<del> this.resolved = (this.resolved == null ? void.class : this.resolved);
<del> }
<add> Class<?> resolvedClass = resolveClass();
<add> this.resolved = (resolvedClass == null ? void.class : resolvedClass);
<ide> }
<ide> return (this.resolved == void.class ? fallback : this.resolved);
<ide> }
<ide> private Class<?> resolveClass() {
<ide>
<ide> /**
<ide> * Resolve this type by a single level, returning the resolved value or {@link #NONE}.
<add> * NOTE: the returned {@link ResolvableType} should only be used as an intermediary as
<add> * it cannot be serialized.
<ide> */
<ide> ResolvableType resolveType() {
<del> Type resolved = null;
<add>
<ide> if (this.type instanceof ParameterizedType) {
<del> resolved = ((ParameterizedType) this.type).getRawType();
<add> return forType(((ParameterizedType) this.type).getRawType(),
<add> this.variableResolver);
<ide> }
<del> else if (this.type instanceof WildcardType) {
<del> resolved = resolveBounds(((WildcardType) this.type).getUpperBounds());
<add>
<add> if (this.type instanceof WildcardType) {
<add> Type resolved = resolveBounds(((WildcardType) this.type).getUpperBounds());
<ide> if (resolved == null) {
<ide> resolved = resolveBounds(((WildcardType) this.type).getLowerBounds());
<ide> }
<add> return forType(resolved, this.variableResolver);
<ide> }
<del> else if (this.type instanceof TypeVariable) {
<add>
<add> if (this.type instanceof TypeVariable) {
<add> TypeVariable<?> variable = (TypeVariable<?>) this.type;
<add>
<add> // Try default variable resolution
<ide> if (this.variableResolver != null) {
<del> resolved = this.variableResolver.resolveVariable((TypeVariable<?>) this.type);
<del> }
<del> if (resolved == null) {
<del> resolved = resolveBounds(((TypeVariable<?>) this.type).getBounds());
<add> ResolvableType resolved = this.variableResolver.resolveVariable(variable);
<add> if(resolved != null) {
<add> return resolved;
<add> }
<ide> }
<add>
<add> // Fallback to bounds
<add> return forType(resolveBounds(variable.getBounds()), this.variableResolver);
<ide> }
<del> return (resolved == null ? NONE : forType(resolved, this.variableResolver));
<add>
<add> return NONE;
<ide> }
<ide>
<ide> private Type resolveBounds(Type[] bounds) {
<ide> private Type resolveBounds(Type[] bounds) {
<ide> return bounds[0];
<ide> }
<ide>
<del> public Type resolveVariable(TypeVariable<?> variable) {
<del> Assert.notNull("Variable must not be null");
<add> private ResolvableType resolveVariable(TypeVariable<?> variable) {
<add>
<add> if (this.type instanceof TypeVariable) {
<add> return resolveType().resolveVariable(variable);
<add> }
<add>
<ide> if (this.type instanceof ParameterizedType) {
<add>
<ide> ParameterizedType parameterizedType = (ParameterizedType) this.type;
<del> Type owner = parameterizedType.getOwnerType();
<ide> if (parameterizedType.getRawType().equals(variable.getGenericDeclaration())) {
<ide> TypeVariable<?>[] variables = resolve().getTypeParameters();
<ide> for (int i = 0; i < variables.length; i++) {
<ide> if (ObjectUtils.nullSafeEquals(variables[i].getName(), variable.getName())) {
<del> return parameterizedType.getActualTypeArguments()[i];
<add> Type actualType = parameterizedType.getActualTypeArguments()[i];
<add> return forType(actualType, this.variableResolver);
<ide> }
<ide> }
<ide> }
<del> Type resolved = null;
<del> if (this.variableResolver != null) {
<del> resolved = this.variableResolver.resolveVariable(variable);
<del> }
<del> if (resolved == null && owner != null) {
<del> resolved = forType(owner, this.variableResolver).resolveVariable(variable);
<add>
<add> if (parameterizedType.getOwnerType() != null) {
<add> return forType(parameterizedType.getOwnerType(),
<add> this.variableResolver).resolveVariable(variable);
<ide> }
<del> return resolved;
<ide> }
<del> if (this.type instanceof TypeVariable) {
<del> return resolveType().resolveVariable(variable);
<add>
<add> if (this.variableResolver != null) {
<add> return this.variableResolver.resolveVariable(variable);
<ide> }
<add>
<ide> return null;
<ide> }
<ide>
<ide> public boolean equals(Object obj) {
<ide> }
<ide> if (obj instanceof ResolvableType) {
<ide> ResolvableType other = (ResolvableType) obj;
<del> return ObjectUtils.nullSafeEquals(this.type, other.type) &&
<del> ObjectUtils.nullSafeEquals(this.variableResolver, other.variableResolver);
<add> boolean equals = ObjectUtils.nullSafeEquals(this.type, other.type);
<add> equals &= variableResolverSourceEquals(this.variableResolver, other.variableResolver);
<add> equals &= ObjectUtils.nullSafeEquals(this.componentType, other.componentType);
<add> return equals;
<ide> }
<ide> return false;
<ide> }
<ide>
<ide> @Override
<ide> public int hashCode() {
<del> return ObjectUtils.nullSafeHashCode(this.type);
<add> int hashCode = ObjectUtils.nullSafeHashCode(this.type);
<add> hashCode = hashCode * 31 + ObjectUtils.nullSafeHashCode(this.variableResolver);
<add> hashCode = hashCode * 31 + ObjectUtils.nullSafeHashCode(this.componentType);
<add> return hashCode;
<ide> }
<ide>
<add> /**
<add> * Custom serialization support for {@value #NONE}.
<add> */
<add> private Object readResolve() throws ObjectStreamException {
<add> return (this.type == null ? NONE : this);
<add> }
<add>
<add> /**
<add> * Adapts this {@link ResolvableType} to a {@link VariableResolver}.
<add> */
<add> VariableResolver asVariableResolver() {
<add> if (this == NONE) {
<add> return null;
<add> }
<add>
<add> return new VariableResolver() {
<add> @Override
<add> public ResolvableType resolveVariable(TypeVariable<?> variable) {
<add> return ResolvableType.this.resolveVariable(variable);
<add> }
<add>
<add> @Override
<add> public Object getSource() {
<add> return ResolvableType.this;
<add> }
<add> };
<add> }
<add>
<add> private static boolean variableResolverSourceEquals(VariableResolver o1, VariableResolver o2) {
<add> Object s1 = (o1 == null ? null : o1.getSource());
<add> Object s2 = (o2 == null ? null : o2.getSource());
<add> return ObjectUtils.nullSafeEquals(s1,s2);
<add> }
<add>
<add> private static ResolvableType[] forTypes(Type[] types, VariableResolver owner) {
<add> ResolvableType[] result = new ResolvableType[types.length];
<add> for (int i = 0; i < types.length; i++) {
<add> result[i] = forType(types[i], owner);
<add> }
<add> return result;
<add> }
<ide>
<ide> /**
<ide> * Return a {@link ResolvableType} for the specified {@link Class}. For example
<ide> * {@code ResolvableType.forClass(MyArrayList.class)}.
<ide> * @param sourceClass the source class (must not be {@code null}
<ide> * @return a {@link ResolvableType} for the specified class
<ide> * @see #forClass(Class, Class)
<add> * @see #forClassWithGenerics(Class, Class...)
<ide> */
<ide> public static ResolvableType forClass(Class<?> sourceClass) {
<ide> Assert.notNull(sourceClass, "Source class must not be null");
<ide> public static ResolvableType forClass(Class<?> sourceClass) {
<ide> * implementation. For example
<ide> * {@code ResolvableType.forClass(List.class, MyArrayList.class)}.
<ide> * @param sourceClass the source class (must not be {@code null}
<del> * @param implementationClass the implementation class (must not be {@code null})
<add> * @param implementationClass the implementation class
<ide> * @return a {@link ResolvableType} for the specified class backed by the given
<ide> * implementation class
<ide> * @see #forClass(Class)
<add> * @see #forClassWithGenerics(Class, Class...)
<ide> */
<ide> public static ResolvableType forClass(Class<?> sourceClass, Class<?> implementationClass) {
<ide> Assert.notNull(sourceClass, "Source class must not be null");
<del> ResolvableType asType = (implementationClass != null ? forType(implementationClass).as(sourceClass) : NONE);
<add> ResolvableType asType = forType(implementationClass).as(sourceClass);
<ide> return (asType == NONE ? forType(sourceClass) : asType);
<ide> }
<ide>
<ide> public static ResolvableType forClass(Class<?> sourceClass, Class<?> implementat
<ide> */
<ide> public static ResolvableType forField(Field field) {
<ide> Assert.notNull(field, "Field must not be null");
<del> return forType(field.getGenericType());
<add> return forType(SerializableTypeWrapper.forField(field));
<ide> }
<ide>
<ide> /**
<ide> public static ResolvableType forField(Field field) {
<ide> * <p>Use this variant when the class that declares the field includes generic
<ide> * parameter variables that are satisfied by the implementation class.
<ide> * @param field the source field
<del> * @param implementationClass the implementation class (must not be {@code null})
<add> * @param implementationClass the implementation class
<ide> * @return a {@link ResolvableType} for the specified field
<ide> * @see #forField(Field)
<ide> */
<ide> public static ResolvableType forField(Field field, Class<?> implementationClass) {
<ide> Assert.notNull(field, "Field must not be null");
<del> TypeVariableResolver variableResolver = (implementationClass != null ?
<del> forType(implementationClass).as(field.getDeclaringClass()) : null);
<del> return forType(field.getGenericType(), variableResolver);
<add> ResolvableType owner = forType(implementationClass).as(field.getDeclaringClass());
<add> return forType(SerializableTypeWrapper.forField(field), owner.asVariableResolver());
<ide> }
<ide>
<ide> /**
<ide> public static ResolvableType forField(Field field, Class<?> implementationClass)
<ide> */
<ide> public static ResolvableType forField(Field field, int nestingLevel) {
<ide> Assert.notNull(field, "Field must not be null");
<del> return forType(field.getGenericType()).getNested(nestingLevel);
<add> return forType(SerializableTypeWrapper.forField(field)).getNested(nestingLevel);
<ide> }
<ide>
<ide> /**
<ide> public static ResolvableType forField(Field field, int nestingLevel) {
<ide> * @param field the source field
<ide> * @param nestingLevel the nesting level (1 for the outer level; 2 for a nested
<ide> * generic type; etc)
<del> * @param implementationClass the implementation class (must not be {@code null})
<add> * @param implementationClass the implementation class
<ide> * @return a {@link ResolvableType} for the specified field
<ide> * @see #forField(Field)
<ide> */
<ide> public static ResolvableType forField(Field field, int nestingLevel, Class<?> implementationClass) {
<ide> Assert.notNull(field, "Field must not be null");
<del> TypeVariableResolver variableResolver = (implementationClass != null ?
<del> forType(implementationClass).as(field.getDeclaringClass()) : null);
<del> return forType(field.getGenericType(), variableResolver).getNested(nestingLevel);
<add> ResolvableType owner = forType(implementationClass).as(field.getDeclaringClass());
<add> return forType(SerializableTypeWrapper.forField(field), owner.asVariableResolver()).getNested(nestingLevel);
<ide> }
<ide>
<ide> /**
<ide> public static ResolvableType forConstructorParameter(Constructor<?> constructor,
<ide> * implementation class.
<ide> * @param constructor the source constructor (must not be {@code null})
<ide> * @param parameterIndex the parameter index
<del> * @param implementationClass the implementation class (must not be {@code null})
<add> * @param implementationClass the implementation class
<ide> * @return a {@link ResolvableType} for the specified constructor parameter
<ide> * @see #forConstructorParameter(Constructor, int)
<ide> */
<ide> public static ResolvableType forConstructorParameter(Constructor<?> constructor, int parameterIndex,
<ide> Class<?> implementationClass) {
<del>
<ide> Assert.notNull(constructor, "Constructor must not be null");
<ide> MethodParameter methodParameter = new MethodParameter(constructor, parameterIndex);
<ide> methodParameter.setContainingClass(implementationClass);
<ide> return forMethodParameter(methodParameter);
<ide> }
<ide>
<add> /**
<add> * Return a {@link ResolvableType} for the specified {@link Method} return type.
<add> * @param method the source for the method return type
<add> * @return a {@link ResolvableType} for the specified method return
<add> * @see #forMethodReturnType(Method, Class)
<add> */
<add> public static ResolvableType forMethodReturnType(Method method) {
<add> Assert.notNull(method, "Method must not be null");
<add> return forMethodParameter(MethodParameter.forMethodOrConstructor(method, -1));
<add> }
<add>
<add> /**
<add> * Return a {@link ResolvableType} for the specified {@link Method} return type.
<add> * Use this variant when the class that declares the method includes generic
<add> * parameter variables that are satisfied by the implementation class.
<add> * @param method the source for the method return type
<add> * @param implementationClass the implementation class
<add> * @return a {@link ResolvableType} for the specified method return
<add> * @see #forMethodReturnType(Method)
<add> */
<add> public static ResolvableType forMethodReturnType(Method method, Class<?> implementationClass) {
<add> Assert.notNull(method, "Method must not be null");
<add> MethodParameter methodParameter = MethodParameter.forMethodOrConstructor(method, -1);
<add> methodParameter.setContainingClass(implementationClass);
<add> return forMethodParameter(methodParameter);
<add> }
<add>
<ide> /**
<ide> * Return a {@link ResolvableType} for the specified {@link Method} parameter.
<ide> * @param method the source method (must not be {@code null})
<ide> public static ResolvableType forMethodParameter(Method method, int parameterInde
<ide> * includes generic parameter variables that are satisfied by the implementation class.
<ide> * @param method the source method (must not be {@code null})
<ide> * @param parameterIndex the parameter index
<del> * @param implementationClass the implementation class (must not be {@code null})
<add> * @param implementationClass the implementation class
<ide> * @return a {@link ResolvableType} for the specified method parameter
<ide> * @see #forMethodParameter(Method, int, Class)
<ide> * @see #forMethodParameter(MethodParameter)
<ide> */
<del> public static ResolvableType forMethodParameter(Method method, int parameterIndex, Class<?> implementationClass) {
<add> public static ResolvableType forMethodParameter(Method method, int parameterIndex,
<add> Class<?> implementationClass) {
<ide> Assert.notNull(method, "Method must not be null");
<ide> MethodParameter methodParameter = new MethodParameter(method, parameterIndex);
<ide> methodParameter.setContainingClass(implementationClass);
<ide> public static ResolvableType forMethodParameter(Method method, int parameterInde
<ide> */
<ide> public static ResolvableType forMethodParameter(MethodParameter methodParameter) {
<ide> Assert.notNull(methodParameter, "MethodParameter must not be null");
<del> TypeVariableResolver variableResolver = (methodParameter.getContainingClass() != null ?
<del> forType(methodParameter.getContainingClass()).as(methodParameter.getDeclaringClass()) : null);
<del> return forType(methodParameter.getGenericParameterType(), variableResolver).getNested(
<del> methodParameter.getNestingLevel(), methodParameter.typeIndexesPerLevel);
<add> ResolvableType owner = forType(methodParameter.getContainingClass()).as(
<add> methodParameter.getDeclaringClass());
<add> return forType(SerializableTypeWrapper.forMethodParameter(methodParameter),
<add> owner.asVariableResolver()).getNested(methodParameter.getNestingLevel(),
<add> methodParameter.typeIndexesPerLevel);
<ide> }
<ide>
<ide> /**
<del> * Return a {@link ResolvableType} for the specified {@link Method} return type.
<del> * @param method the source for the method return type
<del> * @return a {@link ResolvableType} for the specified method return
<del> * @see #forMethodReturnType(Method, Class)
<add> * Return a {@link ResolvableType} as a array of the specified {@code componentType}.
<add> * @param componentType the component type
<add> * @return a {@link ResolvableType} as an array of the specified component type
<ide> */
<del> public static ResolvableType forMethodReturnType(Method method) {
<del> Assert.notNull(method, "Method must not be null");
<del> return forType(method.getGenericReturnType());
<add> public static ResolvableType forArrayComponent(final ResolvableType componentType) {
<add> Assert.notNull(componentType, "ComponentType must not be null");
<add> Class<?> arrayClass = Array.newInstance(componentType.resolve(), 0).getClass();
<add> return new ResolvableType(arrayClass, null, componentType);
<ide> }
<ide>
<ide> /**
<del> * Return a {@link ResolvableType} for the specified {@link Method} return type.
<del> * Use this variant when the class that declares the method includes generic
<del> * parameter variables that are satisfied by the implementation class.
<del> * @param method the source for the method return type
<del> * @param implementationClass the implementation class (must not be {@code null})
<del> * @return a {@link ResolvableType} for the specified method return
<del> * @see #forMethodReturnType(Method)
<add> * Return a {@link ResolvableType} for the specified {@link Class} with pre-declared
<add> * generics.
<add> * @param sourceClass the source class
<add> * @param generics the generics of the class
<add> * @return a {@link ResolvableType} for the specific class and generics
<add> * @see #forClassWithGenerics(Class, ResolvableType...)
<ide> */
<del> public static ResolvableType forMethodReturnType(Method method, Class<?> implementationClass) {
<del> Assert.notNull(method, "Method must not be null");
<del> TypeVariableResolver variableResolver = (implementationClass != null ?
<del> forType(implementationClass).as(method.getDeclaringClass()) : null);
<del> return forType(method.getGenericReturnType(), variableResolver);
<add> public static ResolvableType forClassWithGenerics(Class<?> sourceClass,
<add> Class<?>... generics) {
<add> Assert.notNull(sourceClass, "Source class must not be null");
<add> Assert.notNull(generics, "Generics must not be null");
<add> ResolvableType[] resolvableGenerics = new ResolvableType[generics.length];
<add> for (int i = 0; i < generics.length; i++) {
<add> resolvableGenerics[i] = forClass(generics[i]);
<add> }
<add> return forClassWithGenerics(sourceClass, resolvableGenerics);
<add> }
<add>
<add> /**
<add> * Return a {@link ResolvableType} for the specified {@link Class} with pre-declared
<add> * generics.
<add> * @param sourceClass the source class
<add> * @param generics the generics of the class
<add> * @return a {@link ResolvableType} for the specific class and generics
<add> * @see #forClassWithGenerics(Class, Class...)
<add> */
<add> public static ResolvableType forClassWithGenerics(Class<?> sourceClass,
<add> final ResolvableType... generics) {
<add> Assert.notNull(sourceClass, "Source class must not be null");
<add> Assert.notNull(generics, "Generics must not be null");
<add> final TypeVariable<?>[] typeVariables = sourceClass.getTypeParameters();
<add> Assert.isTrue(typeVariables.length == generics.length,
<add> "Missmatched number of generics specified");
<add> VariableResolver variableResolver = new VariableResolver() {
<add> @Override
<add> public ResolvableType resolveVariable(TypeVariable<?> variable) {
<add> for (int i = 0; i < typeVariables.length; i++) {
<add> if(typeVariables[i].equals(variable)) {
<add> return generics[i];
<add> }
<add> }
<add> return null;
<add> }
<add>
<add> @Override
<add> public Object getSource() {
<add> return generics;
<add> }
<add> };
<add>
<add> return forType(sourceClass, variableResolver);
<ide> }
<ide>
<ide> /**
<del> * Return a {@link ResolvableType} for the specified {@link java.lang.reflect.Type}.
<del> * @param type the source type (must not be {@code null})
<del> * @return a {@link ResolvableType} for the specified {@link java.lang.reflect.Type}
<add> * Return a {@link ResolvableType} for the specified {@link Type}. NOTE: The resulting
<add> * {@link ResolvableType} may not be {@link Serializable}.
<add> * @param type the source type or {@code null}
<add> * @return a {@link ResolvableType} for the specified {@link Type}
<add> * @see #forType(Type, ResolvableType)
<ide> */
<ide> public static ResolvableType forType(Type type) {
<del> return forType(type, null);
<add> return forType(type, (VariableResolver) null);
<ide> }
<ide>
<ide> /**
<del> * Return a {@link ResolvableType} for the specified {@link java.lang.reflect.Type}
<del> * backed by a given {@link TypeVariableResolver}.
<del> * @param type the source type (must not be {@code null})
<del> * @param variableResolver the variable resolver
<del> * @return a {@link ResolvableType} for the specified {@link java.lang.reflect.Type}
<del> * and {@link TypeVariableResolver}
<add> * Return a {@link ResolvableType} for the specified {@link Type} backed by the
<add> * given owner type. NOTE: The resulting {@link ResolvableType} may not be
<add> * {@link Serializable}.
<add> * @param type the source type or {@code null}
<add> * @param owner the owner type used to resolve variables
<add> * @return a {@link ResolvableType} for the specified {@link Type} and owner
<add> * @see #forType(Type)
<ide> */
<del> public static ResolvableType forType(Type type, TypeVariableResolver variableResolver) {
<del> ResolvableType key = new ResolvableType(type, variableResolver);
<add> public static ResolvableType forType(Type type, ResolvableType owner) {
<add> VariableResolver variableResolver = null;
<add> if (owner != null) {
<add> variableResolver = owner.asVariableResolver();
<add> }
<add> return forType(type, variableResolver);
<add> }
<add>
<add> /**
<add> * Return a {@link ResolvableType} for the specified {@link Type} backed by a given
<add> * {@link VariableResolver}.
<add> * @param type the source type or {@code null}
<add> * @param variableResolver the variable resolver or {@code null}
<add> * @return a {@link ResolvableType} for the specified {@link Type} and {@link VariableResolver}
<add> */
<add> static ResolvableType forType(Type type, VariableResolver variableResolver) {
<add> if(type == null) {
<add> return NONE;
<add> }
<ide> // Check the cache, we may have a ResolvableType that may have already been resolved
<add> ResolvableType key = new ResolvableType(type, variableResolver, null);
<ide> ResolvableType resolvableType = cache.get(key);
<ide> if (resolvableType == null) {
<ide> resolvableType = key;
<ide> public static ResolvableType forType(Type type, TypeVariableResolver variableRes
<ide> }
<ide>
<ide>
<add> /**
<add> * Strategy interface used to resolve {@link TypeVariable}s.
<add> */
<add> static interface VariableResolver extends Serializable {
<add>
<add> /**
<add> * Return the source of the resolver (used for hashCode and equals).
<add> */
<add> Object getSource();
<add>
<add> /**
<add> * Resolve the specified varaible.
<add> * @param variable the variable to resolve
<add> * @return the resolved variable or {@code null}
<add> */
<add> ResolvableType resolveVariable(TypeVariable<?> variable);
<add>
<add> }
<add>
<add>
<ide> /**
<ide> * Internal helper to handle bounds from {@link WildcardType}s.
<ide> */
<ide> public static WildcardBounds get(ResolvableType type) {
<ide> Type[] bounds = boundsType == Kind.UPPER ? wildcardType.getUpperBounds() : wildcardType.getLowerBounds();
<ide> ResolvableType[] resolvableBounds = new ResolvableType[bounds.length];
<ide> for (int i = 0; i < bounds.length; i++) {
<del> resolvableBounds[i] = forType(bounds[i], type.variableResolver);
<add> resolvableBounds[i] = ResolvableType.forType(bounds[i],
<add> type.variableResolver);
<ide> }
<ide> return new WildcardBounds(boundsType, resolvableBounds);
<ide> }
<ide><path>spring-core/src/main/java/org/springframework/core/SerializableTypeWrapper.java
<add>/*
<add> * Copyright 2002-2013 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.core;
<add>
<add>import java.io.IOException;
<add>import java.io.ObjectInputStream;
<add>import java.io.Serializable;
<add>import java.lang.reflect.Field;
<add>import java.lang.reflect.GenericArrayType;
<add>import java.lang.reflect.InvocationHandler;
<add>import java.lang.reflect.Method;
<add>import java.lang.reflect.ParameterizedType;
<add>import java.lang.reflect.Proxy;
<add>import java.lang.reflect.Type;
<add>import java.lang.reflect.TypeVariable;
<add>import java.lang.reflect.WildcardType;
<add>
<add>import org.springframework.util.Assert;
<add>import org.springframework.util.ReflectionUtils;
<add>
<add>/**
<add> * Internal utility class that can be used to obtain wrapped {@link Serializable} variants
<add> * of {@link java.lang.reflect.Type}s.
<add> *
<add> * <p>{@link #forField(Field) Fields} or {@link #forMethodParameter(MethodParameter)
<add> * MethodParameters} can be used as the root source for a serializable type. Alternatively
<add> * the {@link #forGenericSuperclass(Class) superclass},
<add> * {@link #forGenericInterfaces(Class) interfaces} or {@link #forTypeParameters(Class)
<add> * type parameters} or a regular {@link Class} can also be used as source.
<add> *
<add> * <p>The returned type will either be a {@link Class} or a serializable proxy of
<add> * {@link GenericArrayType}, {@link ParameterizedType}, {@link TypeVariable} or
<add> * {@link WildcardType}. With the exception of {@link Class} (which is final) calls to
<add> * methods that return further {@link Type}s (for example
<add> * {@link GenericArrayType#getGenericComponentType()}) will be automatically wrapped.
<add> *
<add> * @author Phillip Webb
<add> * @since 4.0
<add> */
<add>abstract class SerializableTypeWrapper {
<add>
<add> private static final Class<?>[] SUPPORTED_SERIALAZABLE_TYPES = { GenericArrayType.class,
<add> ParameterizedType.class, TypeVariable.class, WildcardType.class };
<add>
<add>
<add> /**
<add> * Return a {@link Serializable} variant of {@link Field#getGenericType()}.
<add> */
<add> public static Type forField(Field field) {
<add> Assert.notNull(field, "Field must not be null");
<add> return forTypeProvider(new FieldTypeProvider(field));
<add> }
<add>
<add> /**
<add> * Return a {@link Serializable} variant of
<add> * {@link MethodParameter#getGenericParameterType()}.
<add> */
<add> public static Type forMethodParameter(MethodParameter methodParameter) {
<add> return forTypeProvider(new MethodParameterTypeProvider(methodParameter));
<add> }
<add>
<add> /**
<add> * Return a {@link Serializable} variant of {@link Class#getGenericSuperclass()}.
<add> */
<add> public static Type forGenericSuperclass(final Class<?> type) {
<add> return forTypeProvider(new TypeProvider() {
<add> @Override
<add> public Type getType() {
<add> return type.getGenericSuperclass();
<add> }
<add> });
<add> }
<add>
<add> /**
<add> * Return a {@link Serializable} variant of {@link Class#getGenericInterfaces()}.
<add> */
<add> public static Type[] forGenericInterfaces(final Class<?> type) {
<add> Type[] result = new Type[type.getGenericInterfaces().length];
<add> for (int i = 0; i < result.length; i++) {
<add> final int index = i;
<add> result[i] = forTypeProvider(new TypeProvider() {
<add> @Override
<add> public Type getType() {
<add> return type.getGenericInterfaces()[index];
<add> }
<add> });
<add> }
<add> return result;
<add> }
<add>
<add> /**
<add> * Return a {@link Serializable} variant of {@link Class#getTypeParameters()}.
<add> */
<add> public static Type[] forTypeParameters(final Class<?> type) {
<add> Type[] result = new Type[type.getTypeParameters().length];
<add> for (int i = 0; i < result.length; i++) {
<add> final int index = i;
<add> result[i] = forTypeProvider(new TypeProvider() {
<add> @Override
<add> public Type getType() {
<add> return type.getTypeParameters()[index];
<add> }
<add> });
<add> }
<add> return result;
<add> }
<add>
<add>
<add> /**
<add> * Return a {@link Serializable} {@link Type} backed by a {@link TypeProvider} .
<add> */
<add> private static Type forTypeProvider(final TypeProvider provider) {
<add> Assert.notNull(provider, "Provider must not be null");
<add> if (provider.getType() instanceof Serializable || provider.getType() == null) {
<add> return provider.getType();
<add> }
<add> for (Class<?> type : SUPPORTED_SERIALAZABLE_TYPES) {
<add> if (type.isAssignableFrom(provider.getType().getClass())) {
<add> ClassLoader classLoader = provider.getClass().getClassLoader();
<add> Class<?>[] interfaces = new Class<?>[] { type, Serializable.class };
<add> InvocationHandler handler = new TypeProxyInvocationHandler(provider);
<add> return (Type) Proxy.newProxyInstance(classLoader, interfaces, handler);
<add> }
<add> }
<add> throw new IllegalArgumentException("Unsupported Type class "
<add> + provider.getType().getClass().getName());
<add> }
<add>
<add>
<add> /**
<add> * A {@link Serializable} interface providing access to a {@link Type}.
<add> */
<add> private static interface TypeProvider extends Serializable {
<add>
<add> /**
<add> * Return the (possibly non {@link Serializable}) {@link Type}.
<add> */
<add> Type getType();
<add>
<add> }
<add>
<add>
<add> /**
<add> * {@link Serializable} {@link InvocationHandler} used by the Proxied {@link Type}.
<add> * Provides serialization support and enhances any methods that return {@code Type}
<add> * or {@code Type[]}.
<add> */
<add> private static class TypeProxyInvocationHandler implements InvocationHandler,
<add> Serializable {
<add>
<add> private final TypeProvider provider;
<add>
<add>
<add> public TypeProxyInvocationHandler(TypeProvider provider) {
<add> this.provider = provider;
<add> }
<add>
<add>
<add> @Override
<add> public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
<add> if (Type.class.equals(method.getReturnType()) && args == null) {
<add> return forTypeProvider(new MethodInvokeTypeProvider(this.provider, method, -1));
<add> }
<add> if (Type[].class.equals(method.getReturnType()) && args == null) {
<add> Type[] result = new Type[((Type[]) method.invoke(this.provider.getType(), args)).length];
<add> for (int i = 0; i < result.length; i++) {
<add> result[i] = forTypeProvider(new MethodInvokeTypeProvider(this.provider, method, i));
<add> }
<add> return result;
<add> }
<add> return method.invoke(this.provider.getType(), args);
<add> }
<add>
<add> }
<add>
<add>
<add>
<add> /**
<add> * {@link TypeProvider} for {@link Type}s obtained from a {@link Field}.
<add> */
<add> private static class FieldTypeProvider implements TypeProvider {
<add>
<add> private final String fieldName;
<add>
<add> private final Class<?> declaringClass;
<add>
<add> private transient Field field;
<add>
<add>
<add> public FieldTypeProvider(Field field) {
<add> this.fieldName = field.getName();
<add> this.declaringClass = field.getDeclaringClass();
<add> this.field = field;
<add> }
<add>
<add>
<add> @Override
<add> public Type getType() {
<add> return this.field.getGenericType();
<add> }
<add>
<add> private void readObject(ObjectInputStream inputStream) throws IOException,
<add> ClassNotFoundException {
<add> inputStream.defaultReadObject();
<add> try {
<add> this.field = this.declaringClass.getDeclaredField(this.fieldName);
<add> }
<add> catch (Throwable ex) {
<add> throw new IllegalStateException(
<add> "Could not find original class structure", ex);
<add> }
<add> }
<add>
<add> }
<add>
<add>
<add> /**
<add> * {@link TypeProvider} for {@link Type}s obtained from a {@link MethodParameter}.
<add> */
<add> private static class MethodParameterTypeProvider implements TypeProvider {
<add>
<add> private final String methodName;
<add>
<add> private final Class<?>[] parameterTypes;
<add>
<add> private final Class<?> declaringClass;
<add>
<add> private final int parameterIndex;
<add>
<add> private transient MethodParameter methodParameter;
<add>
<add>
<add> public MethodParameterTypeProvider(MethodParameter methodParameter) {
<add> if (methodParameter.getMethod() != null) {
<add> this.methodName = methodParameter.getMethod().getName();
<add> this.parameterTypes = methodParameter.getMethod().getParameterTypes();
<add> }
<add> else {
<add> this.methodName = null;
<add> this.parameterTypes = methodParameter.getConstructor().getParameterTypes();
<add> }
<add> this.declaringClass = methodParameter.getDeclaringClass();
<add> this.parameterIndex = methodParameter.getParameterIndex();
<add> this.methodParameter = methodParameter;
<add> }
<add>
<add>
<add> @Override
<add> public Type getType() {
<add> return this.methodParameter.getGenericParameterType();
<add> }
<add>
<add> private void readObject(ObjectInputStream inputStream) throws IOException,
<add> ClassNotFoundException {
<add> inputStream.defaultReadObject();
<add> try {
<add> if (this.methodName != null) {
<add> this.methodParameter = new MethodParameter(
<add> this.declaringClass.getDeclaredMethod(this.methodName,
<add> this.parameterTypes), this.parameterIndex);
<add> }
<add> else {
<add> this.methodParameter = new MethodParameter(
<add> this.declaringClass.getDeclaredConstructor(this.parameterTypes),
<add> this.parameterIndex);
<add> }
<add> }
<add> catch (Throwable ex) {
<add> throw new IllegalStateException(
<add> "Could not find original class structure", ex);
<add> }
<add> }
<add>
<add> }
<add>
<add>
<add> /**
<add> * {@link TypeProvider} for {@link Type}s obtained by invoking a no-arg method.
<add> */
<add> private static class MethodInvokeTypeProvider implements TypeProvider {
<add>
<add> private final TypeProvider provider;
<add>
<add> private final String methodName;
<add>
<add> private final int index;
<add>
<add> private transient Object result;
<add>
<add>
<add> public MethodInvokeTypeProvider(TypeProvider provider, Method method, int index) {
<add> this.provider = provider;
<add> this.methodName = method.getName();
<add> this.index = index;
<add> this.result = ReflectionUtils.invokeMethod(method, provider.getType());
<add> }
<add>
<add>
<add> @Override
<add> public Type getType() {
<add> if (this.result instanceof Type || this.result == null) {
<add> return (Type) this.result;
<add> }
<add> return ((Type[])this.result)[this.index];
<add> }
<add>
<add> private void readObject(ObjectInputStream inputStream) throws IOException,
<add> ClassNotFoundException {
<add> inputStream.defaultReadObject();
<add> Method method = ReflectionUtils.findMethod(
<add> this.provider.getType().getClass(), this.methodName);
<add> this.result = ReflectionUtils.invokeMethod(method, this.provider.getType());
<add> }
<add>
<add> }
<add>}
<ide><path>spring-core/src/main/java/org/springframework/core/TypeVariableResolver.java
<del>/*
<del> * Copyright 2002-2013 the original author or authors.
<del> *
<del> * Licensed under the Apache License, Version 2.0 (the "License");
<del> * you may not use this file except in compliance with the License.
<del> * You may obtain a copy of the License at
<del> *
<del> * http://www.apache.org/licenses/LICENSE-2.0
<del> *
<del> * Unless required by applicable law or agreed to in writing, software
<del> * distributed under the License is distributed on an "AS IS" BASIS,
<del> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<del> * See the License for the specific language governing permissions and
<del> * limitations under the License.
<del> */
<del>
<del>package org.springframework.core;
<del>
<del>import java.lang.reflect.Type;
<del>import java.lang.reflect.TypeVariable;
<del>
<del>/**
<del> * Strategy interface that can be used to resolve {@link java.lang.reflect.TypeVariable}s.
<del> *
<del> * @author Phillip Webb
<del> * @since 4.0
<del> * @see ResolvableType
<del> * @see GenericTypeResolver
<del> */
<del>interface TypeVariableResolver {
<del>
<del> /**
<del> * Resolve the specified type variable.
<del> * @param typeVariable the type variable to resolve (must not be {@code null})
<del> * @return the resolved {@link java.lang.reflect.Type} for the variable or
<del> * {@code null} if the variable cannot be resolved.
<del> */
<del> Type resolveVariable(TypeVariable<?> typeVariable);
<del>
<del>}
<ide><path>spring-core/src/test/java/org/springframework/core/ResolvableTypeTests.java
<ide>
<ide> package org.springframework.core;
<ide>
<add>import java.io.ByteArrayInputStream;
<add>import java.io.ByteArrayOutputStream;
<add>import java.io.ObjectInputStream;
<add>import java.io.ObjectOutputStream;
<ide> import java.io.Serializable;
<ide> import java.lang.reflect.Constructor;
<ide> import java.lang.reflect.Field;
<ide> import org.mockito.ArgumentCaptor;
<ide> import org.mockito.Captor;
<ide> import org.mockito.runners.MockitoJUnitRunner;
<del>
<add>import org.springframework.core.ResolvableType.VariableResolver;
<ide> import org.springframework.util.MultiValueMap;
<ide>
<add>import static org.mockito.BDDMockito.*;
<add>
<add>import static org.mockito.Mockito.*;
<ide> import static org.hamcrest.Matchers.*;
<ide> import static org.junit.Assert.*;
<del>import static org.mockito.BDDMockito.*;
<ide>
<ide> /**
<ide> * Tests for {@link ResolvableType}.
<ide> public void noneReturnValues() throws Exception {
<ide> assertThat(none.resolve(String.class), equalTo((Class) String.class));
<ide> assertThat(none.resolveGeneric(0), nullValue());
<ide> assertThat(none.resolveGenerics().length, equalTo(0));
<del> assertThat(none.resolveVariable(mock(TypeVariable.class)), nullValue());
<ide> assertThat(none.toString(), equalTo("?"));
<ide> assertThat(none.isAssignableFrom(ResolvableType.forClass(Object.class)), equalTo(false));
<ide> }
<ide> public void getGenericOfGeneric() throws Exception {
<ide> assertThat(type.getGeneric().getGeneric().getType(), equalTo((Type) String.class));
<ide> }
<ide>
<add> @Test
<add> public void genericOfGenericWithAs() throws Exception {
<add> ResolvableType type = ResolvableType.forField(Fields.class.getField("stringListList")).asCollection();
<add> assertThat(type.toString(), equalTo("java.util.Collection<java.util.List<java.lang.String>>"));
<add> assertThat(type.getGeneric().asCollection().toString(), equalTo("java.util.Collection<java.lang.String>"));
<add> }
<add>
<ide> @Test
<ide> public void getGenericOfGenericByIndexes() throws Exception {
<ide> ResolvableType type = ResolvableType.forField(Fields.class.getField("stringListList"));
<ide> public void hasGenerics() throws Exception {
<ide> }
<ide>
<ide> @Test
<del> public void getGenerics() throws Exception {
<add> public void getGenericsFromParameterizedType() throws Exception {
<ide> ResolvableType type = ResolvableType.forClass(List.class, ExtendsList.class);
<ide> ResolvableType[] generics = type.getGenerics();
<ide> assertThat(generics.length, equalTo(1));
<ide> assertThat(generics[0].resolve(), equalTo((Class) CharSequence.class));
<ide> }
<ide>
<add> @Test
<add> public void getGenericsFromClass() throws Exception {
<add> ResolvableType type = ResolvableType.forClass(List.class);
<add> ResolvableType[] generics = type.getGenerics();
<add> assertThat(generics.length, equalTo(1));
<add> assertThat(generics[0].getType().toString(), equalTo("E"));
<add> }
<add>
<ide> @Test
<ide> public void noGetGenerics() throws Exception {
<ide> ResolvableType type = ResolvableType.forClass(ExtendsList.class);
<ide> public void resolveVariableFromInheritedFieldSwitched() throws Exception {
<ide> public void doesResolveFromOuterOwner() throws Exception {
<ide> ResolvableType type = ResolvableType.forField(
<ide> Fields.class.getField("listOfListOfUnknown")).as(Collection.class);
<del> ResolvableType generic = type.getGeneric(0);
<del> assertThat(generic.resolve(), equalTo((Class) List.class));
<del> assertThat(generic.as(Collection.class).getGeneric(0).as(Collection.class).resolve(), nullValue());
<add> assertThat(type.getGeneric(0).resolve(), equalTo((Class) List.class));
<add> assertThat(type.getGeneric(0).as(Collection.class).getGeneric(0).as(Collection.class).resolve(), nullValue());
<ide> }
<ide>
<ide> @Test
<ide> public void resolveTypeVariableFromType() throws Exception {
<ide> public void resolveTypeVariableFromTypeWithVariableResolver() throws Exception {
<ide> Type sourceType = Methods.class.getMethod("typedReturn").getGenericReturnType();
<ide> ResolvableType type = ResolvableType.forType(
<del> sourceType, ResolvableType.forClass(TypedMethods.class).as(Methods.class));
<add> sourceType, ResolvableType.forClass(TypedMethods.class).as(Methods.class).asVariableResolver());
<ide> assertThat(type.resolve(), equalTo((Class) String.class));
<ide> assertThat(type.getType().toString(), equalTo("T"));
<ide> }
<ide>
<ide> @Test
<ide> public void resolveTypeWithCustomVariableResolver() throws Exception {
<del> TypeVariableResolver variableResolver = mock(TypeVariableResolver.class);
<del> given(variableResolver.resolveVariable((TypeVariable<?>) anyObject())).willReturn(Long.class);
<add> VariableResolver variableResolver = mock(VariableResolver.class);
<add> ResolvableType longType = ResolvableType.forClass(Long.class);
<add> given(variableResolver.resolveVariable((TypeVariable<?>) anyObject())).willReturn(longType);
<ide>
<ide> ResolvableType variable = ResolvableType.forType(
<ide> Fields.class.getField("typeVariableType").getGenericType(), variableResolver);
<ide> public void resolveTypeWithCustomVariableResolver() throws Exception {
<ide> public void toStrings() throws Exception {
<ide> assertThat(ResolvableType.NONE.toString(), equalTo("?"));
<ide>
<del> assertFieldToStringValue("classType", "java.util.List");
<add> assertFieldToStringValue("classType", "java.util.List<?>");
<ide> assertFieldToStringValue("typeVariableType", "?");
<ide> assertFieldToStringValue("parameterizedType", "java.util.List<?>");
<del> assertFieldToStringValue("arrayClassType", "java.util.List[]");
<add> assertFieldToStringValue("arrayClassType", "java.util.List<?>[]");
<ide> assertFieldToStringValue("genericArrayType", "java.util.List<java.lang.String>[]");
<ide> assertFieldToStringValue("genericMultiArrayType", "java.util.List<java.lang.String>[][][]");
<ide> assertFieldToStringValue("wildcardType", "java.util.List<java.lang.Number>");
<ide> public void toStrings() throws Exception {
<ide> assertFieldToStringValue("stringArrayList", "java.util.List<java.lang.String[]>");
<ide> assertFieldToStringValue("stringIntegerMultiValueMap", "org.springframework.util.MultiValueMap<java.lang.String, java.lang.Integer>");
<ide> assertFieldToStringValue("stringIntegerMultiValueMapSwitched", VariableNameSwitch.class.getName() + "<java.lang.Integer, java.lang.String>");
<del> assertFieldToStringValue("listOfListOfUnknown", "java.util.List<java.util.List>");
<add> assertFieldToStringValue("listOfListOfUnknown", "java.util.List<java.util.List<?>>");
<ide>
<ide> assertTypedFieldToStringValue("typeVariableType", "java.lang.String");
<ide> assertTypedFieldToStringValue("parameterizedType", "java.util.List<java.lang.String>");
<ide> public void resolveFromOuterClass() throws Exception {
<ide> assertThat(type.resolve(), equalTo((Type) Integer.class));
<ide> }
<ide>
<add> @Test
<add> public void resolveFromClassWithGenerics() throws Exception {
<add> ResolvableType type = ResolvableType.forClassWithGenerics(List.class, ResolvableType.forClassWithGenerics(List.class, String.class));
<add> assertThat(type.asCollection().toString(), equalTo("java.util.Collection<java.util.List<java.lang.String>>"));
<add> assertThat(type.asCollection().getGeneric().toString(), equalTo("java.util.List<java.lang.String>"));
<add> assertThat(type.asCollection().getGeneric().asCollection().toString(), equalTo("java.util.Collection<java.lang.String>"));
<add> assertThat(type.toString(), equalTo("java.util.List<java.util.List<java.lang.String>>"));
<add> assertThat(type.asCollection().getGeneric().getGeneric().resolve(), equalTo((Type) String.class));
<add> }
<add>
<add>
<ide> @Test
<ide> public void isAssignableFromMustNotBeNull() throws Exception {
<ide> this.thrown.expect(IllegalArgumentException.class);
<ide> public void isAssignableFromForComplexWildcards() throws Exception {
<ide> public void hashCodeAndEquals() throws Exception {
<ide> ResolvableType forClass = ResolvableType.forClass(List.class);
<ide> ResolvableType forFieldDirect = ResolvableType.forField(Fields.class.getDeclaredField("stringList"));
<del> ResolvableType forFieldViaType = ResolvableType.forType(Fields.class.getDeclaredField("stringList").getGenericType());
<add> ResolvableType forFieldViaType = ResolvableType.forType(Fields.class.getDeclaredField("stringList").getGenericType(), (VariableResolver) null);
<ide> ResolvableType forFieldWithImplementation = ResolvableType.forField(Fields.class.getDeclaredField("stringList"), TypedFields.class);
<ide>
<ide> assertThat(forClass, equalTo(forClass));
<ide> public void javaDocSample() throws Exception {
<ide> assertThat(t.resolveGeneric(1, 0), equalTo((Class) String.class));
<ide> }
<ide>
<add> @Test
<add> public void forClassWithGenerics() throws Exception {
<add> ResolvableType elementType = ResolvableType.forClassWithGenerics(Map.class, Integer.class, String.class);
<add> ResolvableType listType = ResolvableType.forClassWithGenerics(List.class, elementType);
<add> assertThat(listType.toString(), equalTo("java.util.List<java.util.Map<java.lang.Integer, java.lang.String>>"));
<add> }
<add>
<add> @Test
<add> public void classWithGenericsAs() throws Exception {
<add> ResolvableType type = ResolvableType.forClassWithGenerics(MultiValueMap.class, Integer.class, String.class);
<add> assertThat(type.asMap().toString(), equalTo("java.util.Map<java.lang.Integer, java.util.List<java.lang.String>>"));
<add> }
<add>
<add> @Test
<add> public void forClassWithMismatchedGenerics() throws Exception {
<add> thrown.expect(IllegalArgumentException.class);
<add> thrown.expectMessage("Missmatched number of generics specified");
<add> ResolvableType.forClassWithGenerics(Map.class, Integer.class);
<add> }
<add>
<add> @Test
<add> public void forArrayComponent() throws Exception {
<add> ResolvableType elementType = ResolvableType.forField(Fields.class.getField("stringList"));
<add> ResolvableType type = ResolvableType.forArrayComponent(elementType);
<add> assertThat(type.toString(), equalTo("java.util.List<java.lang.String>[]"));
<add> assertThat(type.resolve(), equalTo((Class) List[].class));
<add> }
<add>
<add> @Test
<add> public void serialize() throws Exception {
<add> testSerialization(ResolvableType.forClass(List.class));
<add> testSerialization(ResolvableType.forField(Fields.class.getField("charSequenceList")));
<add> testSerialization(ResolvableType.forMethodParameter(Methods.class.getMethod("charSequenceParameter", List.class), 0));
<add> testSerialization(ResolvableType.forMethodReturnType(Methods.class.getMethod("charSequenceReturn")));
<add> testSerialization(ResolvableType.forConstructorParameter(Constructors.class.getConstructor(List.class), 0));
<add> testSerialization(ResolvableType.forField(Fields.class.getField("charSequenceList")).getGeneric());
<add> testSerialization(ResolvableType.forField(Fields.class.getField("charSequenceList")).asCollection());
<add> testSerialization(ResolvableType.forClass(ExtendsMap.class).getSuperType());
<add> ResolvableType deserializedNone = testSerialization(ResolvableType.NONE);
<add> assertThat(deserializedNone, sameInstance(ResolvableType.NONE));
<add> }
<add>
<add> private ResolvableType testSerialization(ResolvableType type) throws Exception {
<add> ByteArrayOutputStream bos = new ByteArrayOutputStream();
<add> ObjectOutputStream oos = new ObjectOutputStream(bos);
<add> oos.writeObject(type);
<add> oos.close();
<add> ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
<add> ResolvableType read = (ResolvableType) ois.readObject();
<add> assertThat(read, equalTo(type));
<add> assertThat(read.getType(), equalTo(type.getType()));
<add> assertThat(read.resolve(), equalTo((Class) type.resolve()));
<add> return read;
<add> }
<ide>
<ide> private static AssertAssignbleMatcher assertAssignable(final ResolvableType type,
<ide> final ResolvableType... fromTypes) {
<ide><path>spring-core/src/test/java/org/springframework/core/SerializableTypeWrapperTests.java
<add>/*
<add> * Copyright 2002-2013 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.core;
<add>
<add>import java.io.ByteArrayInputStream;
<add>import java.io.ByteArrayOutputStream;
<add>import java.io.ObjectInputStream;
<add>import java.io.ObjectOutputStream;
<add>import java.lang.reflect.Constructor;
<add>import java.lang.reflect.GenericArrayType;
<add>import java.lang.reflect.Method;
<add>import java.lang.reflect.ParameterizedType;
<add>import java.lang.reflect.Type;
<add>import java.lang.reflect.TypeVariable;
<add>import java.lang.reflect.WildcardType;
<add>import java.util.ArrayList;
<add>import java.util.List;
<add>
<add>import org.junit.Test;
<add>
<add>import static org.hamcrest.Matchers.*;
<add>import static org.junit.Assert.*;
<add>
<add>
<add>/**
<add> * Tests for {@link SerializableTypeWrapper}.
<add> *
<add> * @author Phillip Webb
<add> */
<add>public class SerializableTypeWrapperTests {
<add>
<add> @Test
<add> public void forField() throws Exception {
<add> Type type = SerializableTypeWrapper.forField(Fields.class.getField("parameterizedType"));
<add> assertThat(type.toString(), equalTo("java.util.List<java.lang.String>"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void forMethodParameter() throws Exception {
<add> Method method = Methods.class.getDeclaredMethod("method", Class.class, Object.class);
<add> Type type = SerializableTypeWrapper.forMethodParameter(MethodParameter.forMethodOrConstructor(method, 0));
<add> assertThat(type.toString(), equalTo("java.lang.Class<T>"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void forConstructor() throws Exception {
<add> Constructor<?> constructor = Constructors.class.getDeclaredConstructor(List.class);
<add> Type type = SerializableTypeWrapper.forMethodParameter(MethodParameter.forMethodOrConstructor(constructor, 0));
<add> assertThat(type.toString(), equalTo("java.util.List<java.lang.String>"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void forGenericSuperClass() throws Exception {
<add> Type type = SerializableTypeWrapper.forGenericSuperclass(ArrayList.class);
<add> assertThat(type.toString(), equalTo("java.util.AbstractList<E>"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void forGenericInterfaces() throws Exception {
<add> Type type = SerializableTypeWrapper.forGenericInterfaces(List.class)[0];
<add> assertThat(type.toString(), equalTo("java.util.Collection<E>"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void forTypeParamters() throws Exception {
<add> Type type = SerializableTypeWrapper.forTypeParameters(List.class)[0];
<add> assertThat(type.toString(), equalTo("E"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void classType() throws Exception {
<add> Type type = SerializableTypeWrapper.forField(Fields.class.getField("classType"));
<add> assertThat(type.toString(), equalTo("class java.lang.String"));
<add> assertSerialzable(type);
<add> }
<add>
<add> @Test
<add> public void genericArrayType() throws Exception {
<add> GenericArrayType type = (GenericArrayType) SerializableTypeWrapper.forField(Fields.class.getField("genericArrayType"));
<add> assertThat(type.toString(), equalTo("java.util.List<java.lang.String>[]"));
<add> assertSerialzable(type);
<add> assertSerialzable(type.getGenericComponentType());
<add> }
<add>
<add> @Test
<add> public void parameterizedType() throws Exception {
<add> ParameterizedType type = (ParameterizedType) SerializableTypeWrapper.forField(Fields.class.getField("parameterizedType"));
<add> assertThat(type.toString(), equalTo("java.util.List<java.lang.String>"));
<add> assertSerialzable(type);
<add> assertSerialzable(type.getOwnerType());
<add> assertSerialzable(type.getRawType());
<add> assertSerialzable(type.getActualTypeArguments());
<add> assertSerialzable(type.getActualTypeArguments()[0]);
<add> }
<add>
<add> @Test
<add> public void typeVariableType() throws Exception {
<add> TypeVariable<?> type = (TypeVariable<?>) SerializableTypeWrapper.forField(Fields.class.getField("typeVariableType"));
<add> assertThat(type.toString(), equalTo("T"));
<add> assertSerialzable(type);
<add> assertSerialzable(type.getBounds());
<add> }
<add>
<add> @Test
<add> public void wildcardType() throws Exception {
<add> ParameterizedType typeSource = (ParameterizedType) SerializableTypeWrapper.forField(Fields.class.getField("wildcardType"));
<add> WildcardType type = (WildcardType) typeSource.getActualTypeArguments()[0];
<add> assertThat(type.toString(), equalTo("? extends java.lang.CharSequence"));
<add> assertSerialzable(type);
<add> assertSerialzable(type.getLowerBounds());
<add> assertSerialzable(type.getUpperBounds());
<add> }
<add>
<add>
<add> private void assertSerialzable(Object source) throws Exception {
<add> ByteArrayOutputStream bos = new ByteArrayOutputStream();
<add> ObjectOutputStream oos = new ObjectOutputStream(bos);
<add> oos.writeObject(source);
<add> oos.close();
<add> ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
<add> assertThat(ois.readObject(), equalTo(source));
<add> }
<add>
<add>
<add> static class Fields<T> {
<add>
<add> public String classType;
<add>
<add> public List<String>[] genericArrayType;
<add>
<add> public List<String> parameterizedType;
<add>
<add> public T typeVariableType;
<add>
<add> public List<? extends CharSequence> wildcardType;
<add>
<add> }
<add>
<add> static interface Methods {
<add>
<add> <T> List<T> method(Class<T> p1, T p2);
<add>
<add> }
<add>
<add> static class Constructors {
<add>
<add> public Constructors(List<String> p) {
<add> }
<add>
<add> }
<add>} | 6 |
Ruby | Ruby | use underscore in notification namespaces | 3990310a2bedd0dff5753e3e9b1282e686cff0cc | <ide><path>actionmailer/lib/action_mailer/base.rb
<ide> def deliver!(mail = @mail)
<ide> logger.debug "\n#{mail.encoded}"
<ide> end
<ide>
<del> ActiveSupport::Notifications.instrument("actionmailer.deliver", :mail => mail) do
<add> ActiveSupport::Notifications.instrument("action_mailer.deliver", :mail => mail) do
<ide> begin
<ide> self.delivery_method.perform_delivery(mail) if perform_deliveries
<ide> rescue Exception => e # Net::SMTP errors or sendmail pipe errors
<ide><path>actionpack/lib/action_controller/caching.rb
<ide> def cache_configured?
<ide> end
<ide>
<ide> def log_event(name, before, after, instrumenter_id, payload)
<del> if name.to_s =~ /actioncontroller\.((read|write|expire|exist)_(fragment|page)\??)/
<add> if name.to_s =~ /action_controller\.((read|write|expire|exist)_(fragment|page)\??)/
<ide> key_or_path = payload[:key] || payload[:path]
<ide> human_name = $1.humanize
<ide> duration = (after - before) * 1000
<ide><path>actionpack/lib/action_controller/caching/fragments.rb
<ide> def expire_fragment(key, options = nil)
<ide> end
<ide>
<ide> def instrument_fragment_cache(name, key)
<del> ActiveSupport::Notifications.instrument("actioncontroller.#{name}", :key => key){ yield }
<add> ActiveSupport::Notifications.instrument("action_controller.#{name}", :key => key){ yield }
<ide> end
<ide> end
<ide> end
<ide><path>actionpack/lib/action_controller/caching/pages.rb
<ide> def page_cache_path(path)
<ide> end
<ide>
<ide> def instrument_page_cache(name, path)
<del> ActiveSupport::Notifications.instrument("actioncontroller.#{name}", :path => path){ yield }
<add> ActiveSupport::Notifications.instrument("action_controller.#{name}", :path => path){ yield }
<ide> end
<ide> end
<ide>
<ide><path>actionpack/lib/action_controller/metal/logger.rb
<ide> module Logger
<ide> attr_internal :view_runtime
<ide>
<ide> def process_action(action)
<del> ActiveSupport::Notifications.instrument("actioncontroller.process_action",
<add> ActiveSupport::Notifications.instrument("action_controller.process_action",
<ide> :controller => self, :action => action) do
<ide> super
<ide> end
<ide> module ClassMethods
<ide> # This is the hook invoked by ActiveSupport::Notifications.subscribe.
<ide> # If you need to log any event, overwrite the method and do it here.
<ide> def log_event(name, before, after, instrumenter_id, payload) #:nodoc:
<del> if name == "actioncontroller.process_action"
<add> if name == "action_controller.process_action"
<ide> duration = [(after - before) * 1000, 0.01].max
<ide> controller = payload[:controller]
<ide> request = controller.request
<ide> def log_event(name, before, after, instrumenter_id, payload) #:nodoc:
<ide> message << " [#{request.request_uri rescue "unknown"}]"
<ide>
<ide> logger.info(message)
<del> elsif name == "actionview.render_template"
<add> elsif name == "action_view.render_template"
<ide> # TODO Make render_template logging work if you are using just ActionView
<ide> duration = (after - before) * 1000
<ide> message = "Rendered #{payload[:identifier]}"
<ide><path>actionpack/lib/action_view/render/partials.rb
<ide> def render
<ide> options = @options
<ide>
<ide> if @collection
<del> ActiveSupport::Notifications.instrument("actionview.render_collection",
<add> ActiveSupport::Notifications.instrument("action_view.render_collection",
<ide> :path => @path, :count => @collection.size) do
<ide> render_collection
<ide> end
<ide> else
<del> content = ActiveSupport::Notifications.instrument("actionview.render_partial",
<add> content = ActiveSupport::Notifications.instrument("action_view.render_partial",
<ide> :path => @path) do
<ide> render_partial
<ide> end
<ide><path>actionpack/lib/action_view/render/rendering.rb
<ide> def render_template(options)
<ide> def _render_template(template, layout = nil, options = {})
<ide> locals = options[:locals] || {}
<ide>
<del> content = ActiveSupport::Notifications.instrument("actionview.render_template",
<add> content = ActiveSupport::Notifications.instrument("action_view.render_template",
<ide> :identifier => template.identifier, :layout => (layout ? layout.identifier : nil)) do
<ide> template.render(self, locals)
<ide> end
<ide> def _render_template(template, layout = nil, options = {})
<ide> end
<ide>
<ide> def _render_layout(layout, locals, &block)
<del> ActiveSupport::Notifications.instrument("actionview.render_layout",
<add> ActiveSupport::Notifications.instrument("action_view.render_layout",
<ide> :identifier => layout.identifier) do
<ide> layout.render(self, locals){ |*name| _layout_for(*name, &block) }
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/abstract_adapter.rb
<ide> def log_info(sql, name, ms)
<ide> protected
<ide> def log(sql, name)
<ide> result = nil
<del> ActiveSupport::Notifications.instrument("activerecord.sql", :sql => sql, :name => name) do
<add> ActiveSupport::Notifications.instrument("active_record.sql", :sql => sql, :name => name) do
<ide> @runtime += Benchmark.ms { result = yield }
<ide> end
<ide> result
<ide><path>activerecord/lib/active_record/railtie.rb
<ide> class Railtie < Rails::Railtie
<ide> initializer "active_record.notifications" do
<ide> require 'active_support/notifications'
<ide>
<del> ActiveSupport::Notifications.subscribe("activerecord.sql") do |name, before, after, instrumenter_id, payload|
<add> ActiveSupport::Notifications.subscribe("active_record.sql") do |name, before, after, instrumenter_id, payload|
<ide> ActiveRecord::Base.connection.log_info(payload[:sql], payload[:name], (after - before) * 1000)
<ide> end
<ide> end
<ide><path>activesupport/lib/active_support/cache.rb
<ide> def instrument(operation, key, options)
<ide> if self.class.instrument
<ide> payload = { :key => key }
<ide> payload.merge!(options) if options.is_a?(Hash)
<del> ActiveSupport::Notifications.instrument("activesupport.cache_#{operation}", payload){ yield }
<add> ActiveSupport::Notifications.instrument("active_support.cache_#{operation}", payload){ yield }
<ide> else
<ide> yield
<ide> end | 10 |
Text | Text | add 0.69.6 changelog | e78a495900607a91270bbb0f9e5ec88e511c5f59 | <ide><path>CHANGELOG.md
<ide> - Add GitHub token permissions for workflows ([3da3d82320](https://github.com/facebook/react-native/commit/3da3d82320bd035c6bd361a82ea12a70dba4e851) by [@varunsh-coder](https://github.com/varunsh-coder))
<ide> - Bump RCT-Folly to 2021-07-22 ([68f3a42fc7](https://github.com/facebook/react-native/commit/68f3a42fc7380051714253f43b42175de361f8bd) by [@luissantana](https://github.com/luissantana))
<ide>
<add>## v0.69.6
<add>
<add>### Changed
<add>
<add>- Bump version of `promise` from 8.0.3 to 8.2.0, enabling `Promise.allSettled` ([951538c080](https://github.com/facebook/react-native/commit/951538c080ef745da304fb308fa91d597e0dd98a) by [@retyui](https://github.com/retyui))
<add>
<add>### Fixed
<add>
<add>- Fix hermes profiler ([81564c1a3d](https://github.com/facebook/react-native/commit/81564c1a3dae4222858de2a9a34089097f665e82) by [@janicduplessis](https://github.com/janicduplessis))
<add>
<add>#### Android specific
<add>
<add>- Correctly resolve classes with FindClass(..) ([361b310bcc](https://github.com/facebook/react-native/commit/361b310bcc8dddbff42cf63495649291c894d661) by [@evancharlton](https://github.com/evancharlton))
<add>
<add>#### iOS specific
<add>
<add>- Fix the way the orientation events are published, to avoid false publish on orientation change when app changes state to inactive ([7d42106d4c](https://github.com/facebook/react-native/commit/7d42106d4ce20c644bda4d928fb0abc163580cee) by [@lbaldy](https://github.com/lbaldy))
<add>- Fix React module build error with swift integration on new architecture mode ([3afef3c167](https://github.com/facebook/react-native/commit/3afef3c16702cefa5115b059a08741fba255b2db) by [@Kudo](https://github.com/Kudo))
<add>
<ide> ## v0.69.5
<ide>
<ide> ### Changed | 1 |
Python | Python | change eager retry behaviour | f4e674334d23db2727a2ab8ed9b582cd0246c21a | <ide><path>celery/app/task.py
<ide> def retry(self, args=None, kwargs=None, exc=None, throw=True,
<ide> ), task_args=S.args, task_kwargs=S.kwargs
<ide> )
<ide>
<del> ret = Retry(exc=exc, when=eta or countdown)
<add> ret = Retry(exc=exc, when=eta or countdown, is_eager=is_eager, sig=S)
<ide>
<ide> if is_eager:
<ide> # if task was executed eagerly using apply(),
<del> # then the retry must also be executed eagerly.
<del> retry_ret = S.apply().get()
<add> # then the retry must also be executed eagerly in apply method
<ide> if throw:
<ide> raise ret
<del> return retry_ret
<add> return ret
<ide>
<ide> try:
<ide> S.apply_async()
<ide> def apply(self, args=None, kwargs=None,
<ide> retval = ret.retval
<ide> if isinstance(retval, ExceptionInfo):
<ide> retval, tb = retval.exception, retval.traceback
<add> if isinstance(retval, Retry) and retval.sig is not None:
<add> return retval.sig.apply(retries=retries + 1)
<ide> state = states.SUCCESS if ret.info is None else ret.info.state
<ide> return EagerResult(task_id, retval, state, traceback=tb)
<ide>
<ide><path>celery/exceptions.py
<ide> class Retry(TaskPredicate):
<ide> #: :class:`~datetime.datetime`.
<ide> when = None
<ide>
<del> def __init__(self, message=None, exc=None, when=None, **kwargs):
<add> def __init__(self, message=None, exc=None, when=None, is_eager=False, sig=None, **kwargs):
<ide> from kombu.utils.encoding import safe_repr
<ide> self.message = message
<ide> if isinstance(exc, string_t):
<ide> self.exc, self.excs = None, exc
<ide> else:
<ide> self.exc, self.excs = exc, safe_repr(exc) if exc else None
<ide> self.when = when
<add> self.is_eager = is_eager
<add> self.sig = sig
<ide> super(Retry, self).__init__(self, exc, when, **kwargs)
<ide>
<ide> def humanize(self):
<ide><path>t/unit/tasks/test_tasks.py
<ide> def test_retry_kwargs_can_be_empty(self):
<ide> finally:
<ide> self.retry_task_mockapply.pop_request()
<ide>
<del> def test_retry_eager(self):
<add> def test_retry_without_throw_eager(self):
<ide> assert self.retry_task_without_throw.apply().get() == 42
<ide>
<add> def test_retry_eager_should_return_value(self):
<add> self.retry_task.max_retries = 3
<add> self.retry_task.iterations = 0
<add> assert self.retry_task.apply([0xFF, 0xFFFF]).get() == 0xFF
<add> assert self.retry_task.iterations == 4
<add>
<ide> def test_retry_not_eager(self):
<ide> self.retry_task_mockapply.push_request()
<ide> try: | 3 |
Python | Python | remove re-defination of flaxwav2vec2forctcmodule | 60b81dfa6faae3aa90c34a7df9304036f513d055 | <ide><path>src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py
<ide> class FlaxWav2Vec2ForCTC(FlaxWav2Vec2PreTrainedModel):
<ide> append_replace_return_docstrings(FlaxWav2Vec2ForCTC, output_type=FlaxCausalLMOutput, config_class=Wav2Vec2Config)
<ide>
<ide>
<del>class FlaxWav2Vec2ForCTCModule(nn.Module):
<del> config: Wav2Vec2Config
<del>
<del>
<ide> class FlaxWav2Vec2ForPreTrainingModule(nn.Module):
<ide> config: Wav2Vec2Config
<ide> dtype: jnp.dtype = jnp.float32
<ide> def __call__(
<ide> append_replace_return_docstrings(
<ide> FlaxWav2Vec2ForPreTraining, output_type=FlaxWav2Vec2ForPreTrainingOutput, config_class=Wav2Vec2Config
<ide> )
<del>
<del>
<del>class FlaxWav2Vec2ForCTCModule(nn.Module):
<del> config: Wav2Vec2Config | 1 |
PHP | PHP | add binary protocol option in memcachedengine | a5843dbe875a5b5b6b63a6e569a274327566d3ee | <ide><path>lib/Cake/Cache/Engine/MemcachedEngine.php
<ide> public function init($settings = array()) {
<ide> __d('cake_dev', 'Memcached extension is not build with SASL support')
<ide> );
<ide> }
<add> $this->_Memcached->setOption(Memcached::OPT_BINARY_PROTOCOL, true);
<ide> $this->_Memcached->setSaslAuthData($this->settings['login'], $this->settings['password']);
<ide> }
<ide> | 1 |
PHP | PHP | add missing return docblocks | 57d31b1a209b3ac74959d93b79612c96a4d58386 | <ide><path>src/Illuminate/Auth/Access/Response.php
<ide> class Response
<ide> * Create a new response.
<ide> *
<ide> * @param string|null $message
<add> * @return void
<ide> */
<ide> public function __construct($message = null)
<ide> {
<ide><path>src/Illuminate/Auth/Events/Attempting.php
<ide> class Attempting
<ide> *
<ide> * @param array $credentials
<ide> * @param bool $remember
<add> * @return void
<ide> */
<ide> public function __construct($credentials, $remember)
<ide> {
<ide><path>src/Illuminate/Auth/Events/Failed.php
<ide> class Failed
<ide> *
<ide> * @param \Illuminate\Contracts\Auth\Authenticatable|null $user
<ide> * @param array $credentials
<add> * @return void
<ide> */
<ide> public function __construct($user, $credentials)
<ide> {
<ide><path>src/Illuminate/Database/Console/Migrations/StatusCommand.php
<ide> class StatusCommand extends BaseCommand
<ide> * Create a new migration rollback command instance.
<ide> *
<ide> * @param \Illuminate\Database\Migrations\Migrator $migrator
<del> * @return \Illuminate\Database\Console\Migrations\StatusCommand
<add> * @return void
<ide> */
<ide> public function __construct(Migrator $migrator)
<ide> {
<ide><path>src/Illuminate/Foundation/AliasLoader.php
<ide> class AliasLoader
<ide> * Create a new AliasLoader instance.
<ide> *
<ide> * @param array $aliases
<add> * @return void
<ide> */
<ide> private function __construct($aliases)
<ide> {
<ide><path>src/Illuminate/Http/JsonResponse.php
<ide> class JsonResponse extends BaseJsonResponse
<ide> * @param int $status
<ide> * @param array $headers
<ide> * @param int $options
<add> * @return void
<ide> */
<ide> public function __construct($data = null, $status = 200, $headers = [], $options = 0)
<ide> {
<ide><path>src/Illuminate/Queue/ListenerOptions.php
<ide> class ListenerOptions extends WorkerOptions
<ide> * @param int $sleep
<ide> * @param int $maxTries
<ide> * @param bool $force
<add> * @return void
<ide> */
<ide> public function __construct($environment = null, $delay = 0, $memory = 128, $timeout = 60, $sleep = 3, $maxTries = 0, $force = false)
<ide> {
<ide><path>src/Illuminate/Queue/WorkerOptions.php
<ide> class WorkerOptions
<ide> * @param int $sleep
<ide> * @param int $maxTries
<ide> * @param bool $force
<add> * @return void
<ide> */
<ide> public function __construct($delay = 0, $memory = 128, $timeout = 60, $sleep = 3, $maxTries = 0, $force = false)
<ide> {
<ide><path>src/Illuminate/Redis/RedisManager.php
<ide> class RedisManager implements Factory
<ide> *
<ide> * @param string $driver
<ide> * @param array $config
<add> * @return void
<ide> */
<ide> public function __construct($driver, array $config)
<ide> { | 9 |
Javascript | Javascript | fix linting issues/errors | 8abdabb66d6913c7df43bae398e57907ffbd10e0 | <ide><path>src/package-transpilation-registry.js
<ide> Object.assign(PackageTranspilationRegistry.prototype, {
<ide> // This means searching for a config for `/path/to/file/here.js` only
<ide> // only iterates four times, even if there are hundreds of configs registered.
<ide> while (thisPath !== lastPath) { // until we reach the root
<del> if (config = this.configByPackagePath[thisPath]) {
<add> if (config = this.configByPackagePath[thisPath]) { // // eslint-disable-line no-cond-assign
<ide> for (var i = 0; i < config.specs.length; i++) {
<ide> spec = config.specs[i]
<ide> if (minimatch(filePath, path.join(config.path, spec.glob))) {
<ide> Object.assign(PackageTranspilationRegistry.prototype, {
<ide> var transpilerSource = spec._transpilerSource || fs.readFileSync(transpilerPath, 'utf8')
<ide> spec._transpilerSource = transpilerSource
<ide> return path.join(
<del> "package-transpile",
<add> 'package-transpile',
<ide> crypto
<ide> .createHash('sha1')
<ide> .update(JSON.stringify(spec.options || {}))
<ide> Object.assign(PackageTranspilationRegistry.prototype, {
<ide> return result
<ide> }
<ide> } else {
<del> var err = new Error("Could not resolve transpiler '" + spec.transpiler + "' from '" + config.path + "'")
<add> var err = new Error("Could not resolve transpiler '" + spec.transpiler + "' from '" + spec._config.path + "'")
<ide> console.error(err)
<ide> }
<ide> } | 1 |
Python | Python | update old link to new website | 06141bd524b23f402417af64415f6c8d94aad789 | <ide><path>celery/result.py
<ide>
<ide> E_WOULDBLOCK = """\
<ide> Never call result.get() within a task!
<del>See http://docs.celeryq.org/en/latest/userguide/tasks.html\
<del>#task-synchronous-subtasks
<add>See https://docs.celeryq.dev/en/latest/userguide/tasks.html\
<add>#avoid-launching-synchronous-subtasks
<ide> """
<ide>
<ide> | 1 |
Javascript | Javascript | fix removelistener for symbols | 1b3dbc9635d45c83e355d312b6114d58664b1e7a | <ide><path>lib/events.js
<ide> const {
<ide> ObjectDefineProperty,
<ide> ObjectGetPrototypeOf,
<ide> ObjectSetPrototypeOf,
<del> ObjectKeys,
<ide> Promise,
<ide> PromiseReject,
<ide> PromiseResolve,
<ide> EventEmitter.prototype.removeAllListeners =
<ide>
<ide> // Emit removeListener for all listeners on all events
<ide> if (arguments.length === 0) {
<del> for (const key of ObjectKeys(events)) {
<add> for (const key of ReflectOwnKeys(events)) {
<ide> if (key === 'removeListener') continue;
<ide> this.removeAllListeners(key);
<ide> }
<ide><path>test/parallel/test-event-emitter-remove-all-listeners.js
<ide> function expect(expected) {
<ide> ee._events = undefined;
<ide> assert.strictEqual(ee, ee.removeAllListeners());
<ide> }
<add>
<add>{
<add> const ee = new events.EventEmitter();
<add> const symbol = Symbol('symbol');
<add> const noop = common.mustNotCall();
<add> ee.on(symbol, noop);
<add>
<add> ee.on('removeListener', common.mustCall((...args) => {
<add> assert.deepStrictEqual(args, [symbol, noop]);
<add> }));
<add>
<add> ee.removeAllListeners();
<add>} | 2 |
Mixed | Python | restore resnet distribution strategies | 18d05ad3df0b6bd5f386d3373f59561f5fa004b1 | <ide><path>official/mnist/mnist.py
<ide> def create_model(data_format):
<ide>
<ide>
<ide> def define_mnist_flags():
<del> flags_core.define_base()
<add> flags_core.define_base(multi_gpu=True, num_gpu=False)
<ide> flags_core.define_image()
<ide> flags.adopt_module_key_flags(flags_core)
<ide> flags_core.set_defaults(data_dir='/tmp/mnist_data',
<ide><path>official/resnet/README.md
<ide> Other versions and formats:
<ide> * [ResNet-v2-ImageNet SavedModel](http://download.tensorflow.org/models/official/resnet_v2_imagenet_savedmodel.tar.gz)
<ide> * [ResNet-v1-ImageNet Checkpoint](http://download.tensorflow.org/models/official/resnet_v1_imagenet_checkpoint.tar.gz)
<ide> * [ResNet-v1-ImageNet SavedModel](http://download.tensorflow.org/models/official/resnet_v1_imagenet_savedmodel.tar.gz)
<add>
<add>## Compute Devices
<add>Training is accomplished using the DistributionStrategies API. (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/distribute/README.md)
<add>
<add>The appropriate distribution strategy is chosen based on the `--num_gpus` flag. By default this flag is one if TensorFlow is compiled with CUDA, and zero otherwise.
<add>
<add>num_gpus:
<add>+ 0: Use OneDeviceStrategy and train on CPU.
<add>+ 1: Use OneDeviceStrategy and train on GPU.
<add>+ 2+: Use MirroredStrategy (data parallelism) to distribute a batch between devices.
<ide><path>official/resnet/cifar10_main.py
<ide> def preprocess_image(image, is_training):
<ide> return image
<ide>
<ide>
<del>def input_fn(is_training, data_dir, batch_size, num_epochs=1,
<del> num_parallel_calls=1, multi_gpu=False):
<add>def input_fn(is_training, data_dir, batch_size, num_epochs=1):
<ide> """Input_fn using the tf.data input pipeline for CIFAR-10 dataset.
<ide>
<ide> Args:
<ide> is_training: A boolean denoting whether the input is for training.
<ide> data_dir: The directory containing the input data.
<ide> batch_size: The number of samples per batch.
<ide> num_epochs: The number of epochs to repeat the dataset.
<del> num_parallel_calls: The number of records that are processed in parallel.
<del> This can be optimized per data set but for generally homogeneous data
<del> sets, should be approximately the number of available CPU cores.
<del> multi_gpu: Whether this is run multi-GPU. Note that this is only required
<del> currently to handle the batch leftovers, and can be removed
<del> when that is handled directly by Estimator.
<ide>
<ide> Returns:
<ide> A dataset that can be used for iteration.
<ide> """
<ide> filenames = get_filenames(is_training, data_dir)
<ide> dataset = tf.data.FixedLengthRecordDataset(filenames, _RECORD_BYTES)
<ide>
<del> num_images = is_training and _NUM_IMAGES['train'] or _NUM_IMAGES['validation']
<del>
<ide> return resnet_run_loop.process_record_dataset(
<ide> dataset, is_training, batch_size, _NUM_IMAGES['train'],
<del> parse_record, num_epochs, num_parallel_calls,
<del> examples_per_epoch=num_images, multi_gpu=multi_gpu)
<add> parse_record, num_epochs,
<add> )
<ide>
<ide>
<ide> def get_synth_input_fn():
<ide> def loss_filter_fn(_):
<ide> version=params['version'],
<ide> loss_scale=params['loss_scale'],
<ide> loss_filter_fn=loss_filter_fn,
<del> multi_gpu=params['multi_gpu'],
<ide> dtype=params['dtype']
<ide> )
<ide>
<ide><path>official/resnet/cifar10_test.py
<ide> def test_dataset_input_fn(self):
<ide> for pixel in row:
<ide> self.assertAllClose(pixel, np.array([-1.225, 0., 1.225]), rtol=1e-3)
<ide>
<del> def _cifar10_model_fn_helper(self, mode, version, dtype, multi_gpu=False):
<del> with tf.Graph().as_default() as g:
<del> input_fn = cifar10_main.get_synth_input_fn()
<del> dataset = input_fn(True, '', _BATCH_SIZE)
<del> iterator = dataset.make_one_shot_iterator()
<del> features, labels = iterator.get_next()
<del> spec = cifar10_main.cifar10_model_fn(
<del> features, labels, mode, {
<del> 'dtype': dtype,
<del> 'resnet_size': 32,
<del> 'data_format': 'channels_last',
<del> 'batch_size': _BATCH_SIZE,
<del> 'version': version,
<del> 'loss_scale': 128 if dtype == tf.float16 else 1,
<del> 'multi_gpu': multi_gpu
<del> })
<del>
<del> predictions = spec.predictions
<del> self.assertAllEqual(predictions['probabilities'].shape,
<del> (_BATCH_SIZE, 10))
<del> self.assertEqual(predictions['probabilities'].dtype, tf.float32)
<del> self.assertAllEqual(predictions['classes'].shape, (_BATCH_SIZE,))
<del> self.assertEqual(predictions['classes'].dtype, tf.int64)
<del>
<del> if mode != tf.estimator.ModeKeys.PREDICT:
<del> loss = spec.loss
<del> self.assertAllEqual(loss.shape, ())
<del> self.assertEqual(loss.dtype, tf.float32)
<del>
<del> if mode == tf.estimator.ModeKeys.EVAL:
<del> eval_metric_ops = spec.eval_metric_ops
<del> self.assertAllEqual(eval_metric_ops['accuracy'][0].shape, ())
<del> self.assertAllEqual(eval_metric_ops['accuracy'][1].shape, ())
<del> self.assertEqual(eval_metric_ops['accuracy'][0].dtype, tf.float32)
<del> self.assertEqual(eval_metric_ops['accuracy'][1].dtype, tf.float32)
<del>
<del> for v in tf.trainable_variables():
<del> self.assertEqual(v.dtype.base_dtype, tf.float32)
<del>
<del> tensors_to_check = ('initial_conv:0', 'block_layer1:0', 'block_layer2:0',
<del> 'block_layer3:0', 'final_reduce_mean:0',
<del> 'final_dense:0')
<del>
<del> for tensor_name in tensors_to_check:
<del> tensor = g.get_tensor_by_name('resnet_model/' + tensor_name)
<del> self.assertEqual(tensor.dtype, dtype,
<del> 'Tensor {} has dtype {}, while dtype {} was '
<del> 'expected'.format(tensor, tensor.dtype,
<del> dtype))
<del>
<del> def cifar10_model_fn_helper(self, mode, version, multi_gpu=False):
<del> self._cifar10_model_fn_helper(mode=mode, version=version, dtype=tf.float32,
<del> multi_gpu=multi_gpu)
<del> self._cifar10_model_fn_helper(mode=mode, version=version, dtype=tf.float16,
<del> multi_gpu=multi_gpu)
<add> def cifar10_model_fn_helper(self, mode, version, dtype):
<add> input_fn = cifar10_main.get_synth_input_fn()
<add> dataset = input_fn(True, '', _BATCH_SIZE)
<add> iterator = dataset.make_one_shot_iterator()
<add> features, labels = iterator.get_next()
<add> spec = cifar10_main.cifar10_model_fn(
<add> features, labels, mode, {
<add> 'dtype': dtype,
<add> 'resnet_size': 32,
<add> 'data_format': 'channels_last',
<add> 'batch_size': _BATCH_SIZE,
<add> 'version': version,
<add> 'loss_scale': 128 if dtype == tf.float16 else 1,
<add> })
<add>
<add> predictions = spec.predictions
<add> self.assertAllEqual(predictions['probabilities'].shape,
<add> (_BATCH_SIZE, 10))
<add> self.assertEqual(predictions['probabilities'].dtype, tf.float32)
<add> self.assertAllEqual(predictions['classes'].shape, (_BATCH_SIZE,))
<add> self.assertEqual(predictions['classes'].dtype, tf.int64)
<add>
<add> if mode != tf.estimator.ModeKeys.PREDICT:
<add> loss = spec.loss
<add> self.assertAllEqual(loss.shape, ())
<add> self.assertEqual(loss.dtype, tf.float32)
<add>
<add> if mode == tf.estimator.ModeKeys.EVAL:
<add> eval_metric_ops = spec.eval_metric_ops
<add> self.assertAllEqual(eval_metric_ops['accuracy'][0].shape, ())
<add> self.assertAllEqual(eval_metric_ops['accuracy'][1].shape, ())
<add> self.assertEqual(eval_metric_ops['accuracy'][0].dtype, tf.float32)
<add> self.assertEqual(eval_metric_ops['accuracy'][1].dtype, tf.float32)
<ide>
<ide> def test_cifar10_model_fn_train_mode_v1(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=1)
<del>
<del> def test_cifar10_model_fn_trainmode__v2(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=2)
<del>
<del> def test_cifar10_model_fn_train_mode_multi_gpu_v1(self):
<ide> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=1,
<del> multi_gpu=True)
<add> dtype=tf.float32)
<ide>
<del> def test_cifar10_model_fn_train_mode_multi_gpu_v2(self):
<add> def test_cifar10_model_fn_trainmode__v2(self):
<ide> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=2,
<del> multi_gpu=True)
<add> dtype=tf.float32)
<ide>
<ide> def test_cifar10_model_fn_eval_mode_v1(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=1)
<add> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=1,
<add> dtype=tf.float32)
<ide>
<ide> def test_cifar10_model_fn_eval_mode_v2(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=2)
<add> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=2,
<add> dtype=tf.float32)
<ide>
<ide> def test_cifar10_model_fn_predict_mode_v1(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=1)
<add> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=1,
<add> dtype=tf.float32)
<ide>
<ide> def test_cifar10_model_fn_predict_mode_v2(self):
<del> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=2)
<add> self.cifar10_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=2,
<add> dtype=tf.float32)
<ide>
<ide> def _test_cifar10model_shape(self, version):
<ide> batch_size = 135
<ide><path>official/resnet/imagenet_main.py
<ide> def parse_record(raw_record, is_training):
<ide> return image, label
<ide>
<ide>
<del>def input_fn(is_training, data_dir, batch_size, num_epochs=1,
<del> num_parallel_calls=1, multi_gpu=False):
<add>def input_fn(is_training, data_dir, batch_size, num_epochs=1):
<ide> """Input function which provides batches for train or eval.
<ide>
<ide> Args:
<ide> is_training: A boolean denoting whether the input is for training.
<ide> data_dir: The directory containing the input data.
<ide> batch_size: The number of samples per batch.
<ide> num_epochs: The number of epochs to repeat the dataset.
<del> num_parallel_calls: The number of records that are processed in parallel.
<del> This can be optimized per data set but for generally homogeneous data
<del> sets, should be approximately the number of available CPU cores.
<del> multi_gpu: Whether this is run multi-GPU. Note that this is only required
<del> currently to handle the batch leftovers, and can be removed
<del> when that is handled directly by Estimator.
<ide>
<ide> Returns:
<ide> A dataset that can be used for iteration.
<ide> def input_fn(is_training, data_dir, batch_size, num_epochs=1,
<ide> # Shuffle the input files
<ide> dataset = dataset.shuffle(buffer_size=_NUM_TRAIN_FILES)
<ide>
<del> num_images = is_training and _NUM_IMAGES['train'] or _NUM_IMAGES['validation']
<del>
<ide> # Convert to individual records
<ide> dataset = dataset.flat_map(tf.data.TFRecordDataset)
<ide>
<ide> return resnet_run_loop.process_record_dataset(
<ide> dataset, is_training, batch_size, _SHUFFLE_BUFFER, parse_record,
<del> num_epochs, num_parallel_calls, examples_per_epoch=num_images,
<del> multi_gpu=multi_gpu)
<add> num_epochs
<add> )
<ide>
<ide>
<ide> def get_synth_input_fn():
<ide> def imagenet_model_fn(features, labels, mode, params):
<ide> version=params['version'],
<ide> loss_scale=params['loss_scale'],
<ide> loss_filter_fn=None,
<del> multi_gpu=params['multi_gpu'],
<ide> dtype=params['dtype']
<ide> )
<ide>
<ide><path>official/resnet/imagenet_test.py
<ide> def test_tensor_shapes_resnet_200_with_gpu_v1(self):
<ide> def test_tensor_shapes_resnet_200_with_gpu_v2(self):
<ide> self.tensor_shapes_helper(200, version=2, with_gpu=True)
<ide>
<del> def _resnet_model_fn_helper(self, mode, version, dtype, multi_gpu):
<add> def resnet_model_fn_helper(self, mode, version, dtype):
<ide> """Tests that the EstimatorSpec is given the appropriate arguments."""
<del> with tf.Graph().as_default() as g:
<del> tf.train.create_global_step()
<del>
<del> input_fn = imagenet_main.get_synth_input_fn()
<del> dataset = input_fn(True, '', _BATCH_SIZE)
<del> iterator = dataset.make_one_shot_iterator()
<del> features, labels = iterator.get_next()
<del> spec = imagenet_main.imagenet_model_fn(
<del> features, labels, mode, {
<del> 'dtype': dtype,
<del> 'resnet_size': 50,
<del> 'data_format': 'channels_last',
<del> 'batch_size': _BATCH_SIZE,
<del> 'version': version,
<del> 'loss_scale': 128 if dtype == tf.float16 else 1,
<del> 'multi_gpu': multi_gpu,
<del> })
<del>
<del> predictions = spec.predictions
<del> self.assertAllEqual(predictions['probabilities'].shape,
<del> (_BATCH_SIZE, _LABEL_CLASSES))
<del> self.assertEqual(predictions['probabilities'].dtype, tf.float32)
<del> self.assertAllEqual(predictions['classes'].shape, (_BATCH_SIZE,))
<del> self.assertEqual(predictions['classes'].dtype, tf.int64)
<del>
<del> if mode != tf.estimator.ModeKeys.PREDICT:
<del> loss = spec.loss
<del> self.assertAllEqual(loss.shape, ())
<del> self.assertEqual(loss.dtype, tf.float32)
<del>
<del> if mode == tf.estimator.ModeKeys.EVAL:
<del> eval_metric_ops = spec.eval_metric_ops
<del> self.assertAllEqual(eval_metric_ops['accuracy'][0].shape, ())
<del> self.assertAllEqual(eval_metric_ops['accuracy'][1].shape, ())
<del> self.assertEqual(eval_metric_ops['accuracy'][0].dtype, tf.float32)
<del> self.assertEqual(eval_metric_ops['accuracy'][1].dtype, tf.float32)
<del>
<del> tensors_to_check = ('initial_conv:0', 'initial_max_pool:0',
<del> 'block_layer1:0', 'block_layer2:0',
<del> 'block_layer3:0', 'block_layer4:0',
<del> 'final_reduce_mean:0', 'final_dense:0')
<del>
<del> for tensor_name in tensors_to_check:
<del> tensor = g.get_tensor_by_name('resnet_model/' + tensor_name)
<del> self.assertEqual(tensor.dtype, dtype,
<del> 'Tensor {} has dtype {}, while dtype {} was '
<del> 'expected'.format(tensor, tensor.dtype,
<del> dtype))
<del>
<del> def resnet_model_fn_helper(self, mode, version, multi_gpu=False):
<del> self._resnet_model_fn_helper(mode=mode, version=version, dtype=tf.float32,
<del> multi_gpu=multi_gpu)
<del> self._resnet_model_fn_helper(mode=mode, version=version, dtype=tf.float16,
<del> multi_gpu=multi_gpu)
<add> tf.train.create_global_step()
<add>
<add> input_fn = imagenet_main.get_synth_input_fn()
<add> dataset = input_fn(True, '', _BATCH_SIZE)
<add> iterator = dataset.make_one_shot_iterator()
<add> features, labels = iterator.get_next()
<add> spec = imagenet_main.imagenet_model_fn(
<add> features, labels, mode, {
<add> 'dtype': dtype,
<add> 'resnet_size': 50,
<add> 'data_format': 'channels_last',
<add> 'batch_size': _BATCH_SIZE,
<add> 'version': version,
<add> 'loss_scale': 128 if dtype == tf.float16 else 1,
<add> })
<add>
<add> predictions = spec.predictions
<add> self.assertAllEqual(predictions['probabilities'].shape,
<add> (_BATCH_SIZE, _LABEL_CLASSES))
<add> self.assertEqual(predictions['probabilities'].dtype, tf.float32)
<add> self.assertAllEqual(predictions['classes'].shape, (_BATCH_SIZE,))
<add> self.assertEqual(predictions['classes'].dtype, tf.int64)
<add>
<add> if mode != tf.estimator.ModeKeys.PREDICT:
<add> loss = spec.loss
<add> self.assertAllEqual(loss.shape, ())
<add> self.assertEqual(loss.dtype, tf.float32)
<add>
<add> if mode == tf.estimator.ModeKeys.EVAL:
<add> eval_metric_ops = spec.eval_metric_ops
<add> self.assertAllEqual(eval_metric_ops['accuracy'][0].shape, ())
<add> self.assertAllEqual(eval_metric_ops['accuracy'][1].shape, ())
<add> self.assertEqual(eval_metric_ops['accuracy'][0].dtype, tf.float32)
<add> self.assertEqual(eval_metric_ops['accuracy'][1].dtype, tf.float32)
<ide>
<ide> def test_resnet_model_fn_train_mode_v1(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=1)
<del>
<del> def test_resnet_model_fn_train_mode_v2(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=2)
<del>
<del> def test_resnet_model_fn_train_mode_multi_gpu_v1(self):
<ide> self.resnet_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=1,
<del> multi_gpu=True)
<add> dtype=tf.float32)
<ide>
<del> def test_resnet_model_fn_train_mode_multi_gpu_v2(self):
<add> def test_resnet_model_fn_train_mode_v2(self):
<ide> self.resnet_model_fn_helper(tf.estimator.ModeKeys.TRAIN, version=2,
<del> multi_gpu=True)
<add> dtype=tf.float32)
<ide>
<ide> def test_resnet_model_fn_eval_mode_v1(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=1)
<add> self.resnet_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=1,
<add> dtype=tf.float32)
<ide>
<ide> def test_resnet_model_fn_eval_mode_v2(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=2)
<add> self.resnet_model_fn_helper(tf.estimator.ModeKeys.EVAL, version=2,
<add> dtype=tf.float32)
<ide>
<ide> def test_resnet_model_fn_predict_mode_v1(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=1)
<add> self.resnet_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=1,
<add> dtype=tf.float32)
<ide>
<ide> def test_resnet_model_fn_predict_mode_v2(self):
<del> self.resnet_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=2)
<add> self.resnet_model_fn_helper(tf.estimator.ModeKeys.PREDICT, version=2,
<add> dtype=tf.float32)
<ide>
<ide> def _test_imagenetmodel_shape(self, version):
<ide> batch_size = 135
<ide><path>official/resnet/resnet_run_loop.py
<ide> from official.utils.misc import model_helpers
<ide>
<ide>
<del>FLAGS = flags.FLAGS
<del>
<del>
<ide> ################################################################################
<ide> # Functions for input processing.
<ide> ################################################################################
<ide> def process_record_dataset(dataset, is_training, batch_size, shuffle_buffer,
<del> parse_record_fn, num_epochs=1, num_parallel_calls=1,
<del> examples_per_epoch=0, multi_gpu=False):
<add> parse_record_fn, num_epochs=1):
<ide> """Given a Dataset with raw records, return an iterator over the records.
<ide>
<ide> Args:
<ide> def process_record_dataset(dataset, is_training, batch_size, shuffle_buffer,
<ide> parse_record_fn: A function that takes a raw record and returns the
<ide> corresponding (image, label) pair.
<ide> num_epochs: The number of epochs to repeat the dataset.
<del> num_parallel_calls: The number of records that are processed in parallel.
<del> This can be optimized per data set but for generally homogeneous data
<del> sets, should be approximately the number of available CPU cores.
<del> examples_per_epoch: The number of examples in the current set that
<del> are processed each epoch. Note that this is only used for multi-GPU mode,
<del> and only to handle what will eventually be handled inside of Estimator.
<del> multi_gpu: Whether this is run multi-GPU. Note that this is only required
<del> currently to handle the batch leftovers (see below), and can be removed
<del> when that is handled directly by Estimator.
<ide>
<ide> Returns:
<ide> Dataset of (image, label) pairs ready for iteration.
<ide> """
<add>
<ide> # We prefetch a batch at a time, This can help smooth out the time taken to
<ide> # load input files as we go through shuffling and processing.
<ide> dataset = dataset.prefetch(buffer_size=batch_size)
<ide> def process_record_dataset(dataset, is_training, batch_size, shuffle_buffer,
<ide> # dataset for the appropriate number of epochs.
<ide> dataset = dataset.repeat(num_epochs)
<ide>
<del> # Currently, if we are using multiple GPUs, we can't pass in uneven batches.
<del> # (For example, if we have 4 GPUs, the number of examples in each batch
<del> # must be divisible by 4.) We already ensured this for the batch_size, but
<del> # we have to additionally ensure that any "leftover" examples-- the remainder
<del> # examples (total examples % batch_size) that get called a batch for the very
<del> # last batch of an epoch-- do not raise an error when we try to split them
<del> # over the GPUs. This will likely be handled by Estimator during replication
<del> # in the future, but for now, we just drop the leftovers here.
<del> if multi_gpu:
<del> total_examples = num_epochs * examples_per_epoch
<del> dataset = dataset.take(batch_size * (total_examples // batch_size))
<del>
<del> # Parse the raw records into images and labels
<del> dataset = dataset.map(lambda value: parse_record_fn(value, is_training),
<del> num_parallel_calls=num_parallel_calls)
<del>
<del> dataset = dataset.batch(batch_size)
<add> # Parse the raw records into images and labels. Testing has shown that setting
<add> # num_parallel_batches > 1 produces no improvement in throughput, since
<add> # batch_size is almost always much greater than the number of CPU cores.
<add> dataset = dataset.apply(
<add> tf.contrib.data.map_and_batch(
<add> lambda value: parse_record_fn(value, is_training),
<add> batch_size=batch_size,
<add> num_parallel_batches=1))
<ide>
<ide> # Operations between the final prefetch and the get_next call to the iterator
<ide> # will happen synchronously during run time. We prefetch here again to
<ide> # background all of the above processing work and keep it out of the
<del> # critical training path.
<del> dataset = dataset.prefetch(1)
<add> # critical training path. Setting buffer_size to tf.contrib.data.AUTOTUNE
<add> # allows DistributionStrategies to adjust how many batches to fetch based
<add> # on how many devices are present.
<add> dataset.prefetch(buffer_size=tf.contrib.data.AUTOTUNE)
<ide>
<ide> return dataset
<ide>
<ide> def get_synth_input_fn(height, width, num_channels, num_classes):
<ide> An input_fn that can be used in place of a real one to return a dataset
<ide> that can be used for iteration.
<ide> """
<del> def input_fn(is_training, data_dir, batch_size, *args): # pylint: disable=unused-argument
<add> def input_fn(is_training, data_dir, batch_size, *args, **kwargs): # pylint: disable=unused-argument
<ide> images = tf.zeros((batch_size, height, width, num_channels), tf.float32)
<ide> labels = tf.zeros((batch_size, num_classes), tf.int32)
<ide> return tf.data.Dataset.from_tensors((images, labels)).repeat()
<ide> def learning_rate_fn(global_step):
<ide>
<ide> def resnet_model_fn(features, labels, mode, model_class,
<ide> resnet_size, weight_decay, learning_rate_fn, momentum,
<del> data_format, version, loss_scale,
<del> loss_filter_fn=None, multi_gpu=False,
<add> data_format, version, loss_scale, loss_filter_fn=None,
<ide> dtype=resnet_model.DEFAULT_DTYPE):
<ide> """Shared functionality for different resnet model_fns.
<ide>
<ide> def resnet_model_fn(features, labels, mode, model_class,
<ide> True if the var should be included in loss calculation, and False
<ide> otherwise. If None, batch_normalization variables will be excluded
<ide> from the loss.
<del> multi_gpu: If True, wrap the optimizer in a TowerOptimizer suitable for
<del> data-parallel distribution across multiple GPUs.
<ide> dtype: the TensorFlow dtype to use for calculations.
<ide>
<ide> Returns:
<ide> def exclude_batch_norm(name):
<ide>
<ide> optimizer = tf.train.MomentumOptimizer(
<ide> learning_rate=learning_rate,
<del> momentum=momentum)
<del>
<del> # If we are running multi-GPU, we need to wrap the optimizer.
<del> if multi_gpu:
<del> optimizer = tf.contrib.estimator.TowerOptimizer(optimizer)
<add> momentum=momentum
<add> )
<ide>
<ide> if loss_scale != 1:
<ide> # When computing fp16 gradients, often intermediate tensor values are
<ide> def exclude_batch_norm(name):
<ide> else:
<ide> train_op = None
<ide>
<del> accuracy = tf.metrics.accuracy(
<del> tf.argmax(labels, axis=1), predictions['classes'])
<add> if not tf.contrib.distribute.has_distribution_strategy():
<add> accuracy = tf.metrics.accuracy(
<add> tf.argmax(labels, axis=1), predictions['classes'])
<add> else:
<add> # Metrics are currently not compatible with distribution strategies during
<add> # training. This does not affect the overall performance of the model.
<add> accuracy = (tf.no_op(), tf.constant(0))
<add>
<ide> metrics = {'accuracy': accuracy}
<ide>
<ide> # Create a tensor named train_accuracy for logging purposes
<ide> def exclude_batch_norm(name):
<ide> eval_metric_ops=metrics)
<ide>
<ide>
<del>def validate_batch_size_for_multi_gpu(batch_size):
<add>def per_device_batch_size(batch_size, num_gpus):
<ide> """For multi-gpu, batch-size must be a multiple of the number of GPUs.
<ide>
<del> Note that this should eventually be handled by replicate_model_fn
<add> Note that this should eventually be handled by DistributionStrategies
<ide> directly. Multi-GPU support is currently experimental, however,
<ide> so doing the work here until that feature is in place.
<ide>
<ide> Args:
<del> batch_size: the number of examples processed in each training batch.
<add> batch_size: Global batch size to be divided among devices. This should be
<add> equal to num_gpus times the single-GPU batch_size for multi-gpu training.
<add> num_gpus: How many GPUs are used with DistributionStrategies.
<add>
<add> Returns:
<add> Batch size per device.
<ide>
<ide> Raises:
<del> ValueError: if no GPUs are found, or selected batch_size is invalid.
<add> ValueError: if batch_size is not divisible by number of devices
<ide> """
<del> from tensorflow.python.client import device_lib # pylint: disable=g-import-not-at-top
<del>
<del> local_device_protos = device_lib.list_local_devices()
<del> num_gpus = sum([1 for d in local_device_protos if d.device_type == 'GPU'])
<del> if not num_gpus:
<del> raise ValueError('Multi-GPU mode was specified, but no GPUs '
<del> 'were found. To use CPU, run without --multi_gpu.')
<add> if num_gpus <= 1:
<add> return batch_size
<ide>
<ide> remainder = batch_size % num_gpus
<ide> if remainder:
<ide> err = ('When running with multiple GPUs, batch size '
<del> 'must be a multiple of the number of available GPUs. '
<del> 'Found {} GPUs with a batch size of {}; try --batch_size={} instead.'
<add> 'must be a multiple of the number of available GPUs. Found {} '
<add> 'GPUs with a batch size of {}; try --batch_size={} instead.'
<ide> ).format(num_gpus, batch_size, batch_size - remainder)
<ide> raise ValueError(err)
<add> return int(batch_size / num_gpus)
<ide>
<ide>
<ide> def resnet_main(flags_obj, model_function, input_function, shape=None):
<ide> def resnet_main(flags_obj, model_function, input_function, shape=None):
<ide> dataset that the estimator can train on. This will be wrapped with
<ide> all the relevant flags for running and passed to estimator.
<ide> shape: list of ints representing the shape of the images used for training.
<del> This is only used if flags.export_dir is passed.
<add> This is only used if flags_obj.export_dir is passed.
<ide> """
<ide>
<ide> # Using the Winograd non-fused algorithms provides a small performance boost.
<ide> os.environ['TF_ENABLE_WINOGRAD_NONFUSED'] = '1'
<ide>
<del> if flags_obj.multi_gpu:
<del> validate_batch_size_for_multi_gpu(flags_obj.batch_size)
<del>
<del> # There are two steps required if using multi-GPU: (1) wrap the model_fn,
<del> # and (2) wrap the optimizer. The first happens here, and (2) happens
<del> # in the model_fn itself when the optimizer is defined.
<del> model_function = tf.contrib.estimator.replicate_model_fn(
<del> model_function,
<del> loss_reduction=tf.losses.Reduction.MEAN)
<del>
<ide> # Create session config based on values of inter_op_parallelism_threads and
<ide> # intra_op_parallelism_threads. Note that we default to having
<ide> # allow_soft_placement = True, which is required for multi-GPU and not
<ide> def resnet_main(flags_obj, model_function, input_function, shape=None):
<ide> intra_op_parallelism_threads=flags_obj.intra_op_parallelism_threads,
<ide> allow_soft_placement=True)
<ide>
<del> # Set up a RunConfig to save checkpoint and set session config.
<del> run_config = tf.estimator.RunConfig().replace(save_checkpoints_secs=1e9,
<del> session_config=session_config)
<add> if flags_core.get_num_gpus(flags_obj) == 0:
<add> distribution = tf.contrib.distribute.OneDeviceStrategy('device:CPU:0')
<add> elif flags_core.get_num_gpus(flags_obj) == 1:
<add> distribution = tf.contrib.distribute.OneDeviceStrategy('device:GPU:0')
<add> else:
<add> distribution = tf.contrib.distribute.MirroredStrategy(
<add> num_gpus=flags_core.get_num_gpus(flags_obj)
<add> )
<add>
<add> run_config = tf.estimator.RunConfig(train_distribute=distribution,
<add> session_config=session_config)
<add>
<ide> classifier = tf.estimator.Estimator(
<ide> model_fn=model_function, model_dir=flags_obj.model_dir, config=run_config,
<ide> params={
<ide> 'resnet_size': int(flags_obj.resnet_size),
<ide> 'data_format': flags_obj.data_format,
<ide> 'batch_size': flags_obj.batch_size,
<del> 'multi_gpu': flags_obj.multi_gpu,
<ide> 'version': int(flags_obj.version),
<ide> 'loss_scale': flags_core.get_loss_scale(flags_obj),
<ide> 'dtype': flags_core.get_tf_dtype(flags_obj)
<ide> def resnet_main(flags_obj, model_function, input_function, shape=None):
<ide> benchmark_log_dir=flags_obj.benchmark_log_dir)
<ide>
<ide> def input_fn_train():
<del> return input_function(True, flags_obj.data_dir, flags_obj.batch_size,
<del> flags_obj.epochs_between_evals,
<del> flags_obj.num_parallel_calls, flags_obj.multi_gpu)
<add> return input_function(
<add> is_training=True, data_dir=flags_obj.data_dir,
<add> batch_size=per_device_batch_size(
<add> flags_obj.batch_size, flags_core.get_num_gpus(flags_obj)),
<add> num_epochs=flags_obj.epochs_between_evals)
<ide>
<ide> def input_fn_eval():
<del> return input_function(False, flags_obj.data_dir, flags_obj.batch_size,
<del> 1, flags_obj.num_parallel_calls, flags_obj.multi_gpu)
<add> return input_function(
<add> is_training=False, data_dir=flags_obj.data_dir,
<add> batch_size=per_device_batch_size(
<add> flags_obj.batch_size, flags_core.get_num_gpus(flags_obj)),
<add> num_epochs=1)
<ide>
<ide> total_training_cycle = (flags_obj.train_epochs //
<ide> flags_obj.epochs_between_evals)
<ide> def input_fn_eval():
<ide> max_steps=flags_obj.max_train_steps)
<ide>
<ide> tf.logging.info('Starting to evaluate.')
<del> # flags.max_train_steps is generally associated with testing and profiling.
<del> # As a result it is frequently called with synthetic data, which will
<del> # iterate forever. Passing steps=flags.max_train_steps allows the eval
<del> # (which is generally unimportant in those circumstances) to terminate.
<add>
<add> # flags_obj.max_train_steps is generally associated with testing and
<add> # profiling. As a result it is frequently called with synthetic data, which
<add> # will iterate forever. Passing steps=flags_obj.max_train_steps allows the
<add> # eval (which is generally unimportant in those circumstances) to terminate.
<ide> # Note that eval will run for max_train_steps each loop, regardless of the
<ide> # global_step count.
<ide> eval_results = classifier.evaluate(input_fn=input_fn_eval,
<ide> def input_fn_eval():
<ide> break
<ide>
<ide> if flags_obj.export_dir is not None:
<del> warn_on_multi_gpu_export(flags_obj.multi_gpu)
<del>
<ide> # Exports a saved model for the given classifier.
<ide> input_receiver_fn = export.build_tensor_serving_input_receiver_fn(
<ide> shape, batch_size=flags_obj.batch_size)
<ide> classifier.export_savedmodel(flags_obj.export_dir, input_receiver_fn)
<ide>
<ide>
<del>def warn_on_multi_gpu_export(multi_gpu=False):
<del> """For the time being, multi-GPU mode does not play nicely with exporting."""
<del> if multi_gpu:
<del> tf.logging.warning(
<del> 'You are exporting a SavedModel while in multi-GPU mode. Note that '
<del> 'the resulting SavedModel will require the same GPUs be available.'
<del> 'If you wish to serve the SavedModel from a different device, '
<del> 'try exporting the SavedModel with multi-GPU mode turned off.')
<del>
<del>
<ide> def define_resnet_flags(resnet_size_choices=None):
<ide> """Add flags and validators for ResNet."""
<ide> flags_core.define_base()
<del> flags_core.define_performance()
<add> flags_core.define_performance(num_parallel_calls=False)
<ide> flags_core.define_image()
<ide> flags_core.define_benchmark()
<ide> flags.adopt_module_key_flags(flags_core)
<ide><path>official/utils/flags/_base.py
<ide> from __future__ import print_function
<ide>
<ide> from absl import flags
<add>import tensorflow as tf
<ide>
<ide> from official.utils.flags._conventions import help_wrap
<ide> from official.utils.logs import hooks_helper
<ide>
<ide>
<ide> def define_base(data_dir=True, model_dir=True, train_epochs=True,
<ide> epochs_between_evals=True, stop_threshold=True, batch_size=True,
<del> multi_gpu=True, hooks=True, export_dir=True):
<add> multi_gpu=False, num_gpu=True, hooks=True, export_dir=True):
<ide> """Register base flags.
<ide>
<ide> Args:
<ide> def define_base(data_dir=True, model_dir=True, train_epochs=True,
<ide> eval metric which should trigger the end of training.
<ide> batch_size: Create a flag to specify the batch size.
<ide> multi_gpu: Create a flag to allow the use of all available GPUs.
<add> num_gpu: Create a flag to specify the number of GPUs used.
<ide> hooks: Create a flag to specify hooks for logging.
<ide> export_dir: Create a flag to specify where a SavedModel should be exported.
<ide>
<ide> def define_base(data_dir=True, model_dir=True, train_epochs=True,
<ide> help=help_wrap("Batch size for training and evaluation."))
<ide> key_flags.append("batch_size")
<ide>
<add> assert not (multi_gpu and num_gpu)
<add>
<ide> if multi_gpu:
<ide> flags.DEFINE_bool(
<ide> name="multi_gpu", default=False,
<ide> help=help_wrap("If set, run across all available GPUs."))
<ide> key_flags.append("multi_gpu")
<ide>
<add> if num_gpu:
<add> flags.DEFINE_integer(
<add> name="num_gpus", short_name="ng",
<add> default=1 if tf.test.is_gpu_available() else 0,
<add> help=help_wrap(
<add> "How many GPUs to use with the DistributionStrategies API. The "
<add> "default is 1 if TensorFlow can detect a GPU, and 0 otherwise."))
<add>
<ide> if hooks:
<ide> # Construct a pretty summary of hooks.
<ide> hook_list_str = (
<ide> def define_base(data_dir=True, model_dir=True, train_epochs=True,
<ide> key_flags.append("export_dir")
<ide>
<ide> return key_flags
<add>
<add>
<add>def get_num_gpus(flags_obj):
<add> """Treat num_gpus=-1 as 'use all'."""
<add> if flags_obj.num_gpus != -1:
<add> return flags_obj.num_gpus
<add>
<add> from tensorflow.python.client import device_lib # pylint: disable=g-import-not-at-top
<add> local_device_protos = device_lib.list_local_devices()
<add> return sum([1 for d in local_device_protos if d.device_type == "GPU"])
<ide><path>official/utils/flags/core.py
<ide> def core_fn(*args, **kwargs):
<ide> help_wrap = _conventions.help_wrap
<ide>
<ide>
<add>get_num_gpus = _base.get_num_gpus
<ide> get_tf_dtype = _performance.get_tf_dtype
<ide> get_loss_scale = _performance.get_loss_scale
<ide><path>official/utils/flags/flags_test.py
<ide>
<ide>
<ide> def define_flags():
<del> flags_core.define_base()
<add> flags_core.define_base(multi_gpu=True, num_gpu=False)
<ide> flags_core.define_performance()
<ide> flags_core.define_image()
<ide> flags_core.define_benchmark() | 10 |
Ruby | Ruby | remove dead code, and the tests for it | ef21e013338461f33bf85f5cf6edd84b5ce9b6fe | <ide><path>actionpack/lib/action_view/compiled_templates.rb
<del>module ActionView
<del>
<del> # CompiledTemplates modules hold methods that have been compiled.
<del> # Templates are compiled into these methods so that they do not need to be
<del> # read and parsed for each request.
<del> #
<del> # Each template may be compiled into one or more methods. Each method accepts a given
<del> # set of parameters which is used to implement local assigns passing.
<del> #
<del> # To use a compiled template module, create a new instance and include it into the class
<del> # in which you want the template to be rendered.
<del> class CompiledTemplates < Module
<del> attr_reader :method_names
<del>
<del> def initialize
<del> @method_names = Hash.new do |hash, key|
<del> hash[key] = "__compiled_method_#{(hash.length + 1)}"
<del> end
<del> @mtimes = {}
<del> end
<del>
<del> # Return the full key for the given identifier and argument names
<del> def full_key(identifier, arg_names)
<del> [identifier, arg_names]
<del> end
<del>
<del> # Return the selector for this method or nil if it has not been compiled
<del> def selector(identifier, arg_names)
<del> key = full_key(identifier, arg_names)
<del> method_names.key?(key) ? method_names[key] : nil
<del> end
<del> alias :compiled? :selector
<del>
<del> # Return the time at which the method for the given identifier and argument names was compiled.
<del> def mtime(identifier, arg_names)
<del> @mtimes[full_key(identifier, arg_names)]
<del> end
<del>
<del> # Compile the provided source code for the given argument names and with the given initial line number.
<del> # The identifier should be unique to this source.
<del> #
<del> # The file_name, if provided will appear in backtraces. If not provided, the file_name defaults
<del> # to the identifier.
<del> #
<del> # This method will return the selector for the compiled version of this method.
<del> def compile_source(identifier, arg_names, source, initial_line_number = 0, file_name = nil)
<del> file_name ||= identifier
<del> name = method_names[full_key(identifier, arg_names)]
<del> arg_desc = arg_names.empty? ? '' : "(#{arg_names * ', '})"
<del> fake_file_name = "#{file_name}#{arg_desc}" # Include the arguments for this version (for now)
<del>
<del> method_def = wrap_source(name, arg_names, source)
<del>
<del> begin
<del> module_eval(method_def, fake_file_name, initial_line_number)
<del> @mtimes[full_key(identifier, arg_names)] = Time.now
<del> rescue Exception => e # errors from compiled source
<del> e.blame_file! identifier
<del> raise
<del> end
<del> name
<del> end
<del>
<del> # Wrap the provided source in a def ... end block.
<del> def wrap_source(name, arg_names, source)
<del> "def #{name}(#{arg_names * ', '})\n#{source}\nend"
<del> end
<del> end
<del>end
<ide><path>actionpack/test/template/compiled_templates_test.rb
<del>require 'abstract_unit'
<del>require 'action_view/helpers/date_helper'
<del>require 'action_view/compiled_templates'
<del>
<del>class CompiledTemplateTests < Test::Unit::TestCase
<del> def setup
<del> @ct = ActionView::CompiledTemplates.new
<del> @v = Class.new
<del> @v.send :include, @ct
<del> @a = './test_compile_template_a.rhtml'
<del> @b = './test_compile_template_b.rhtml'
<del> @s = './test_compile_template_link.rhtml'
<del> end
<del> def teardown
<del> [@a, @b, @s].each do |f|
<del> FileUtils.rm(f) if File.exist?(f) || File.symlink?(f)
<del> end
<del> end
<del> attr_reader :ct, :v
<del>
<del> def test_name_allocation
<del> hi_world = ct.method_names['hi world']
<del> hi_sexy = ct.method_names['hi sexy']
<del> wish_upon_a_star = ct.method_names['I love seeing decent error messages']
<del>
<del> assert_equal hi_world, ct.method_names['hi world']
<del> assert_equal hi_sexy, ct.method_names['hi sexy']
<del> assert_equal wish_upon_a_star, ct.method_names['I love seeing decent error messages']
<del> assert_equal 3, [hi_world, hi_sexy, wish_upon_a_star].uniq.length
<del> end
<del>
<del> def test_wrap_source
<del> assert_equal(
<del> "def aliased_assignment(value)\nself.value = value\nend",
<del> @ct.wrap_source(:aliased_assignment, [:value], 'self.value = value')
<del> )
<del>
<del> assert_equal(
<del> "def simple()\nnil\nend",
<del> @ct.wrap_source(:simple, [], 'nil')
<del> )
<del> end
<del>
<del> def test_compile_source_single_method
<del> selector = ct.compile_source('doubling method', [:a], 'a + a')
<del> assert_equal 2, @v.new.send(selector, 1)
<del> assert_equal 4, @v.new.send(selector, 2)
<del> assert_equal -4, @v.new.send(selector, -2)
<del> assert_equal 0, @v.new.send(selector, 0)
<del> selector
<del> end
<del>
<del> def test_compile_source_two_method
<del> sel1 = test_compile_source_single_method # compile the method in the other test
<del> sel2 = ct.compile_source('doubling method', [:a, :b], 'a + b + a + b')
<del> assert_not_equal sel1, sel2
<del>
<del> assert_equal 2, @v.new.send(sel1, 1)
<del> assert_equal 4, @v.new.send(sel1, 2)
<del>
<del> assert_equal 6, @v.new.send(sel2, 1, 2)
<del> assert_equal 32, @v.new.send(sel2, 15, 1)
<del> end
<del>
<del> def test_mtime
<del> t1 = Time.now
<del>
<del> test_compile_source_single_method
<del> mtime = ct.mtime('doubling method', [:a])
<del>
<del> assert mtime < Time.now
<del> assert mtime > t1
<del> end
<del>
<del> uses_mocha 'test_compile_time' do
<del>
<del> def test_compile_time
<del> t = Time.now
<del>
<del> File.open(@a, "w"){|f| f.puts @a}
<del> File.open(@b, "w"){|f| f.puts @b}
<del> # windows doesn't support symlinks (even under cygwin)
<del> windows = (RUBY_PLATFORM =~ /win32/)
<del> `ln -s #{@a} #{@s}` unless windows
<del>
<del> v = ActionView::Base.new
<del> v.base_path = '.'
<del> v.cache_template_loading = false
<del>
<del> ta = ActionView::Template.new(v, @a, false, {})
<del> tb = ActionView::Template.new(v, @b, false, {})
<del> ts = ActionView::Template.new(v, @s, false, {})
<del>
<del> @handler_class = ActionView::Template.handler_class_for_extension(:rhtml)
<del> @handler = @handler_class.new(v)
<del>
<del> # All templates were created at t+1
<del> File::Stat.any_instance.expects(:mtime).times(windows ? 2 : 3).returns(t + 1.second)
<del>
<del> # private methods template_changed_since? and compile_template?
<del> # should report true for all since they have not been compiled
<del> assert @handler.send(:template_changed_since?, @a, t)
<del> assert @handler.send(:template_changed_since?, @b, t)
<del> assert @handler.send(:template_changed_since?, @s, t) unless windows
<del>
<del> assert @handler.send(:compile_template?, ta)
<del> assert @handler.send(:compile_template?, tb)
<del> assert @handler.send(:compile_template?, ts) unless windows
<del>
<del> # All templates are rendered at t+2
<del> Time.expects(:now).times(windows ? 2 : 3).returns(t + 2.seconds)
<del> v.send(:render_template, ta)
<del> v.send(:render_template, tb)
<del> v.send(:render_template, ts) unless windows
<del> a_n = v.method_names[@a]
<del> b_n = v.method_names[@b]
<del> s_n = v.method_names[@s] unless windows
<del> # all of the files have changed since last compile
<del> assert @handler.compile_time[a_n] > t
<del> assert @handler.compile_time[b_n] > t
<del> assert @handler.compile_time[s_n] > t unless windows
<del>
<del> # private methods template_changed_since? and compile_template?
<del> # should report false for all since none have changed since compile
<del> File::Stat.any_instance.expects(:mtime).times(windows ? 6 : 12).returns(t + 1.second)
<del> assert !@handler.send(:template_changed_since?, @a, @handler.compile_time[a_n])
<del> assert !@handler.send(:template_changed_since?, @b, @handler.compile_time[b_n])
<del> assert !@handler.send(:template_changed_since?, @s, @handler.compile_time[s_n]) unless windows
<del> assert !@handler.send(:compile_template?, ta)
<del> assert !@handler.send(:compile_template?, tb)
<del> assert !@handler.send(:compile_template?, ts) unless windows
<del> v.send(:render_template, ta)
<del> v.send(:render_template, tb)
<del> v.send(:render_template, ts) unless windows
<del> # none of the files have changed since last compile
<del> assert @handler.compile_time[a_n] < t + 3.seconds
<del> assert @handler.compile_time[b_n] < t + 3.seconds
<del> assert @handler.compile_time[s_n] < t + 3.seconds unless windows
<del>
<del> `rm #{@s}; ln -s #{@b} #{@s}` unless windows
<del> # private methods template_changed_since? and compile_template?
<del> # should report true for symlink since it has changed since compile
<del>
<del> # t + 3.seconds is for the symlink
<del> File::Stat.any_instance.expects(:mtime).times(windows ? 6 : 9).returns(
<del> *(windows ? [ t + 1.second, t + 1.second ] :
<del> [ t + 1.second, t + 1.second, t + 3.second ]) * 3)
<del> assert !@handler.send(:template_changed_since?, @a, @handler.compile_time[a_n])
<del> assert !@handler.send(:template_changed_since?, @b, @handler.compile_time[b_n])
<del> assert @handler.send(:template_changed_since?, @s, @handler.compile_time[s_n]) unless windows
<del> assert !@handler.send(:compile_template?, ta)
<del> assert !@handler.send(:compile_template?, tb)
<del> assert @handler.send(:compile_template?, ts) unless windows
<del>
<del> # Only the symlink template gets rendered at t+3
<del> Time.stubs(:now).returns(t + 3.seconds) unless windows
<del> v.send(:render_template, ta)
<del> v.send(:render_template, tb)
<del> v.send(:render_template, ts) unless windows
<del> # the symlink has changed since last compile
<del> assert @handler.compile_time[a_n] < t + 3.seconds
<del> assert @handler.compile_time[b_n] < t + 3.seconds
<del> assert_equal @handler.compile_time[s_n], t + 3.seconds unless windows
<del>
<del> FileUtils.touch @b
<del> # private methods template_changed_since? and compile_template?
<del> # should report true for symlink and file at end of symlink
<del> # since it has changed since last compile
<del> #
<del> # t+4 is for @b and also for the file that @s points to, which is @b
<del> File::Stat.any_instance.expects(:mtime).times(windows ? 6 : 12).returns(
<del> *(windows ? [ t + 1.second, t + 4.seconds ] :
<del> [ t + 1.second, t + 4.seconds, t + 3.second, t + 4.seconds ]) * 3)
<del> assert !@handler.send(:template_changed_since?, @a, @handler.compile_time[a_n])
<del> assert @handler.send(:template_changed_since?, @b, @handler.compile_time[b_n])
<del> assert @handler.send(:template_changed_since?, @s, @handler.compile_time[s_n]) unless windows
<del> assert !@handler.send(:compile_template?, ta)
<del> assert @handler.send(:compile_template?, tb)
<del> assert @handler.send(:compile_template?, ts) unless windows
<del>
<del> Time.expects(:now).times(windows ? 1 : 2).returns(t + 5.seconds)
<del> v.send(:render_template, ta)
<del> v.send(:render_template, tb)
<del> v.send(:render_template, ts) unless windows
<del> # the file at the end of the symlink has changed since last compile
<del> # both the symlink and the file at the end of it should be recompiled
<del> assert @handler.compile_time[a_n] < t + 5.seconds
<del> assert_equal @handler.compile_time[b_n], t + 5.seconds
<del> assert_equal @handler.compile_time[s_n], t + 5.seconds unless windows
<del> end
<del> end
<del>end | 2 |
PHP | PHP | fake() assertions | 9922c87369962939af7c620472fe3a82dd0f4019 | <ide><path>src/Illuminate/Support/Testing/Fakes/EventFake.php
<ide> use Closure;
<ide> use Illuminate\Contracts\Events\Dispatcher;
<ide> use Illuminate\Support\Arr;
<add>use Illuminate\Support\Traits\ReflectsClosures;
<ide> use PHPUnit\Framework\Assert as PHPUnit;
<ide>
<ide> class EventFake implements Dispatcher
<ide> {
<add> use ReflectsClosures;
<add>
<ide> /**
<ide> * The original event dispatcher.
<ide> *
<ide> public function __construct(Dispatcher $dispatcher, $eventsToFake = [])
<ide> */
<ide> public function assertDispatched($event, $callback = null)
<ide> {
<add> if ($event instanceof Closure) {
<add> [$event, $callback] = [$this->firstParameterType($event), $event];
<add> }
<add>
<ide> if (is_int($callback)) {
<ide> return $this->assertDispatchedTimes($event, $callback);
<ide> }
<ide> public function assertDispatchedTimes($event, $times = 1)
<ide> */
<ide> public function assertNotDispatched($event, $callback = null)
<ide> {
<add> if ($event instanceof Closure) {
<add> [$event, $callback] = [$this->firstParameterType($event), $event];
<add> }
<add>
<ide> PHPUnit::assertCount(
<ide> 0, $this->dispatched($event, $callback),
<ide> "The unexpected [{$event}] event was dispatched."
<ide><path>tests/Support/SupportTestingEventFakeTest.php
<ide> public function testAssertDispatched()
<ide> $this->fake->assertDispatched(EventStub::class);
<ide> }
<ide>
<add> public function testAssertDispatchedWithClosure()
<add> {
<add> $this->fake->dispatch(new EventStub);
<add>
<add> $this->fake->assertDispatched(function (EventStub $event) {
<add> return true;
<add> });
<add> }
<add>
<ide> public function testAssertDispatchedWithCallbackInt()
<ide> {
<ide> $this->fake->dispatch(EventStub::class);
<ide> public function testAssertNotDispatched()
<ide> }
<ide> }
<ide>
<add> public function testAssertNotDispatchedWithClosure()
<add> {
<add> $this->fake->dispatch(new EventStub);
<add>
<add> try {
<add> $this->fake->assertNotDispatched(function (EventStub $event) {
<add> return true;
<add> });
<add> $this->fail();
<add> } catch (ExpectationFailedException $e) {
<add> $this->assertThat($e, new ExceptionMessage('The unexpected [Illuminate\Tests\Support\EventStub] event was dispatched.'));
<add> }
<add> }
<add>
<ide> public function testAssertDispatchedWithIgnore()
<ide> {
<ide> $dispatcher = m::mock(Dispatcher::class); | 2 |
Ruby | Ruby | add nodoc to localcacheregistry | 4770b773d01d1e313373ef115da40b70244b6ff7 | <ide><path>activesupport/lib/active_support/cache/strategy/local_cache.rb
<ide> module Strategy
<ide> # in-memory cache for faster access.
<ide> module LocalCache
<ide> # Class for storing and registering the local caches.
<del> class LocalCacheRegistry
<add> class LocalCacheRegistry # :nodoc:
<ide> extend ActiveSupport::PerThreadRegistry
<ide>
<ide> def initialize | 1 |
Javascript | Javascript | add code comments | 19668677884515f3fe469dab50be8b0b03791ba3 | <ide><path>packages/ember-routing/lib/location/history_location.js
<ide> export default EmberObject.extend({
<ide> @return url {String}
<ide> */
<ide> getURL() {
<del> var rootURL = get(this, 'rootURL');
<ide> var location = get(this, 'location');
<ide> var path = location.pathname;
<add>
<add> var rootURL = get(this, 'rootURL');
<ide> var baseURL = get(this, 'baseURL');
<ide>
<add> // remove trailing slashes if they exists
<ide> rootURL = rootURL.replace(/\/$/, '');
<ide> baseURL = baseURL.replace(/\/$/, '');
<ide>
<add> // remove baseURL and rootURL from path
<ide> var url = path.replace(baseURL, '').replace(rootURL, '');
<del> var search = location.search || '';
<ide>
<add> var search = location.search || '';
<ide> url += search;
<ide> url += this.getHash();
<ide>
<ide> export default EmberObject.extend({
<ide> var baseURL = get(this, 'baseURL');
<ide>
<ide> if (url !== '') {
<add> // remove trailing slashes if they exists
<ide> rootURL = rootURL.replace(/\/$/, '');
<ide> baseURL = baseURL.replace(/\/$/, '');
<ide> } else if (baseURL.match(/^\//) && rootURL.match(/^\//)) {
<add> // if baseURL and rootURL both start with a slash
<add> // ... remove trailing slash from baseURL if it exists
<ide> baseURL = baseURL.replace(/\/$/, '');
<ide> }
<ide> | 1 |
Text | Text | update downloads page for 0.14 | e1d4668fd5453157083665dbe034590b54ac83c0 | <ide><path>docs/downloads.md
<ide> title: Downloads
<ide> layout: single
<ide> ---
<ide> Download the starter kit to get everything you need to
<del>[get started with React](/react/docs/getting-started.html). The starter kit includes React, the in-browser JSX transformer, and some simple example apps.
<add>[get started with React](/react/docs/getting-started.html). The starter kit includes React and some simple example apps.
<ide>
<ide> <div class="buttons-unit downloads">
<ide> <a href="/react/downloads/react-{{site.react_version}}.zip" class="button">
<ide> All scripts are also available via [CDNJS](https://cdnjs.com/libraries/react/).
<ide>
<ide> ## npm
<ide>
<del>If you're using an npm-compatible packaging system like browserify or webpack, you can use the `react` package. After installing it using `npm install react` or adding `react` to `package.json`, you can use React:
<add>We recommend using React from npm with a bundler like [browserify](http://browserify.org/) or [webpack](https://webpack.github.io/). You can use the `react` and `react-dom` packages. After installing it using `npm install --save react react-dom`, you can use:
<ide>
<ide> ```js
<del>var React = require('react-dom');
<del>ReactDOM.render(...);
<add>var React = require('react');
<add>var ReactDOM = require('react-dom');
<add>ReactDOM.render(<App />, ...);
<ide> ```
<ide>
<del>If you'd like to use any [add-ons](/react/docs/addons.html), use `var React = require('react/addons');` instead.
<add>Each of the [add-ons](/react/docs/addons.html) lives in its own package.
<ide>
<del>**Note:** by default, React will be in development mode. To use React in production mode, set the environment variable `NODE_ENV` to `production`. A minifier that performs dead-code elimination such as [UglifyJS](https://github.com/mishoo/UglifyJS2) is recommended to completely remove the extra code present in development mode.
<add>**Note:** by default, React will be in development mode. To use React in production mode, set the environment variable `NODE_ENV` to `production` (using envify or webpack's DefinePlugin). A minifier that performs dead-code elimination such as [UglifyJS](https://github.com/mishoo/UglifyJS2) is recommended to completely remove the extra code present in development mode.
<ide>
<ide> ## Bower
<ide> | 1 |
Go | Go | fix windows cross compile with new netlink | 7d8b5fc3aa291e1126f43157a26b9b021023d76f | <ide><path>daemon/container_windows.go
<ide> func (container *Container) unmountIpcMounts() error {
<ide> func (container *Container) ipcMounts() []execdriver.Mount {
<ide> return nil
<ide> }
<add>
<add>func getDefaultRouteMtu() (int, error) {
<add> return -1, errSystemNotSupported
<add>}
<ide><path>daemon/daemon.go
<ide> import (
<ide> "fmt"
<ide> "io"
<ide> "io/ioutil"
<del> "net"
<ide> "os"
<ide> "path/filepath"
<ide> "regexp"
<ide> import (
<ide> "github.com/docker/docker/volume/local"
<ide> "github.com/docker/docker/volume/store"
<ide> "github.com/docker/libnetwork"
<del> "github.com/vishvananda/netlink"
<ide> )
<ide>
<ide> var (
<ide> func setDefaultMtu(config *Config) {
<ide>
<ide> var errNoDefaultRoute = errors.New("no default route was found")
<ide>
<del>// getDefaultRouteMtu returns the MTU for the default route's interface.
<del>func getDefaultRouteMtu() (int, error) {
<del> routes, err := netlink.RouteList(nil, 0)
<del> if err != nil {
<del> return 0, err
<del> }
<del> for _, r := range routes {
<del> // a nil Dst means that this is the default route.
<del> if r.Dst == nil {
<del> i, err := net.InterfaceByIndex(r.LinkIndex)
<del> if err != nil {
<del> continue
<del> }
<del> return i.MTU, nil
<del> }
<del> }
<del> return 0, errNoDefaultRoute
<del>}
<del>
<ide> // verifyContainerSettings performs validation of the hostconfig and config
<ide> // structures.
<ide> func (daemon *Daemon) verifyContainerSettings(ctx context.Context, hostConfig *runconfig.HostConfig, config *runconfig.Config) ([]string, error) {
<ide><path>daemon/daemon_unix.go
<ide> import (
<ide> "github.com/docker/libnetwork/netlabel"
<ide> "github.com/docker/libnetwork/options"
<ide> "github.com/opencontainers/runc/libcontainer/label"
<add> "github.com/vishvananda/netlink"
<ide> )
<ide>
<ide> const (
<ide> func (daemon *Daemon) newBaseContainer(id string) Container {
<ide> VolumesRW: make(map[string]bool),
<ide> }
<ide> }
<add>
<add>// getDefaultRouteMtu returns the MTU for the default route's interface.
<add>func getDefaultRouteMtu() (int, error) {
<add> routes, err := netlink.RouteList(nil, 0)
<add> if err != nil {
<add> return 0, err
<add> }
<add> for _, r := range routes {
<add> // a nil Dst means that this is the default route.
<add> if r.Dst == nil {
<add> i, err := net.InterfaceByIndex(r.LinkIndex)
<add> if err != nil {
<add> continue
<add> }
<add> return i.MTU, nil
<add> }
<add> }
<add> return 0, errNoDefaultRoute
<add>} | 3 |
Python | Python | add tests for tile and fix error | c75bd2d03ccc6415bb9a0f090d7e2d80b40ea2af | <ide><path>numpy/lib/shape_base.py
<ide> def tile(A, reps):
<ide> c = _nx.array(A,copy=False,subok=True,ndmin=d)
<ide> shape = list(c.shape)
<ide> n = c.size
<del> if (d < A.ndim):
<del> tup = (1,)*(A.ndim-d) + tup
<add> if (d < c.ndim):
<add> tup = (1,)*(c.ndim-d) + tup
<ide> for i, nrep in enumerate(tup):
<ide> if nrep!=1:
<ide> c = c.reshape(-1,n).repeat(nrep,0)
<ide><path>numpy/lib/tests/test_shape_base.py
<ide> def check_rank_checking(self):
<ide> assert False, "ValueError expected"
<ide>
<ide>
<add>class test_tile(NumpyTestCase):
<add> def check_basic(self):
<add> a = array([0,1,2])
<add> b = [[1,2],[3,4]]
<add> assert_equal(tile(a,2), [0,1,2,0,1,2])
<add> assert_equal(tile(a,(2,2)), [[0,1,2,0,1,2],[0,1,2,0,1,2]])
<add> assert_equal(tile(a,(1,2)), [[0,1,2,0,1,2]])
<add> assert_equal(tile(b, 2), [[1,2,1,2],[3,4,3,4]])
<add> assert_equal(tile(b,(2,1)),[[1,2],[3,4],[1,2],[3,4]])
<add> assert_equal(tile(b,(2,2)),[[1,2,1,2],[3,4,3,4],[1,2,1,2],[3,4,3,4]])
<add>
<ide> # Utility
<ide>
<ide> def compare_results(res,desired): | 2 |
Text | Text | set missing links for d3-contour | f7b65c85def5484bc4f9c487b4924a7413e7ab2e | <ide><path>API.md
<ide> Color ramps and palettes for quantitative, ordinal and categorical scales.
<ide> Compute contour polygons using marching squares.
<ide>
<ide> * [d3.contours](https://github.com/d3/d3-contour/blob/master/README.md#contours) - create a new contour generator.
<del>* *contours* - compute the contours for a given grid of values.
<del>* *contours*.size -
<del>* *contours*.smooth -
<del>* *contours*.thresholds -
<add>* [contours](https://github.com/d3/d3-contour/blob/master/README.md#_contours) - compute the contours for a given grid of values.
<add>* [contours.contour](https://github.com/d3/d3-contour/blob/master/README.md#contours_contour) -
<add>* [contours.size](https://github.com/d3/d3-contour/blob/master/README.md#contours_size) -
<add>* [contours.smooth](https://github.com/d3/d3-contour/blob/master/README.md#contours_smooth) -
<add>* [contours.thresholds](https://github.com/d3/d3-contour/blob/master/README.md#contours_thresholds) -
<ide> * [d3.contourDensity](https://github.com/d3/d3-contour/blob/master/README.md#contourDensity) - create a new density estimator.
<del>* *density* - estimate the density of a given array of samples.
<del>* *density*.x -
<del>* *density*.y -
<del>* *density*.cellSize -
<del>* *density*.thresholds -
<del>* *density*.bandwidth -
<add>* [density](https://github.com/d3/d3-contour/blob/master/README.md#_density) - estimate the density of a given array of samples.
<add>* [density.x](https://github.com/d3/d3-contour/blob/master/README.md#density_x) -
<add>* [density.y](https://github.com/d3/d3-contour/blob/master/README.md#density_y) -
<add>* [density.cellSize](https://github.com/d3/d3-contour/blob/master/README.md#density_cellSize) -
<add>* [density.thresholds](https://github.com/d3/d3-contour/blob/master/README.md#density_thresholds) -
<add>* [density.bandwidth](https://github.com/d3/d3-contour/blob/master/README.md#density_bandwidth) -
<ide>
<ide> ## [Dispatches (d3-dispatch)](https://github.com/d3/d3-dispatch)
<ide> | 1 |
Ruby | Ruby | use helper method | 9b0c74e8780f9769320ee912e43066627602ce68 | <ide><path>activestorage/test/analyzer/video_analyzer_test.rb
<ide> class ActiveStorage::Analyzer::VideoAnalyzerTest < ActiveSupport::TestCase
<ide>
<ide> test "analyzing a video without a video stream" do
<ide> blob = create_file_blob(filename: "video_without_video_stream.mp4", content_type: "video/mp4")
<del> assert_equal({ "analyzed" => true, "identified" => true }, blob.tap(&:analyze).metadata)
<add> metadata = extract_metadata_from(blob)
<add> assert_equal({ "analyzed" => true, "identified" => true }, metadata)
<ide> end
<ide>
<ide> private | 1 |
Python | Python | add comment to explain `object.__new__` | 5391ac3d426a9cfa1317e989e1b312dbe1d7b82f | <ide><path>numpy/doc/subclassing.py
<ide> class C(object):
<ide> def __new__(cls, *args):
<ide> print('Cls in __new__:', cls)
<ide> print('Args in __new__:', args)
<add> # The `object` type __new__ method takes a single argument.
<ide> return object.__new__(cls)
<ide>
<ide> def __init__(self, *args): | 1 |
Javascript | Javascript | add support for default message | 84d80be2b4624b9fcfac81204ff047eba108d804 | <ide><path>src/ngMessages/messages.js
<ide> var jqLite;
<ide> * sequencing based on the order of how the messages are defined in the template.
<ide> *
<ide> * Currently, the ngMessages module only contains the code for the `ngMessages`, `ngMessagesInclude`
<del> * `ngMessage` and `ngMessageExp` directives.
<add> * `ngMessage`, `ngMessageExp` and `ngMessageDefault` directives.
<ide> *
<ide> * ## Usage
<ide> * The `ngMessages` directive allows keys in a key/value collection to be associated with a child element
<ide> var jqLite;
<ide> * .some-message.ng-leave.ng-leave-active {}
<ide> * ```
<ide> *
<del> * {@link ngAnimate Click here} to learn how to use JavaScript animations or to learn more about ngAnimate.
<add> * {@link ngAnimate See the ngAnimate docs} to learn how to use JavaScript animations or to learn
<add> * more about ngAnimate.
<add> *
<add> * ## Displaying a default message
<add> * If the ngMessages renders no inner ngMessage directive (i.e. when none of the truthy
<add> * keys are matched by a defined message), then it will render a default message
<add> * using the {@link ngMessageDefault} directive.
<add> * Note that matched messages will always take precedence over unmatched messages. That means
<add> * the default message will not be displayed when another message is matched. This is also
<add> * true for `ng-messages-multiple`.
<add> *
<add> * ```html
<add> * <div ng-messages="myForm.myField.$error" role="alert">
<add> * <div ng-message="required">This field is required</div>
<add> * <div ng-message="minlength">This field is too short</div>
<add> * <div ng-message-default>This field has an input error</div>
<add> * </div>
<add> * ```
<add> *
<add>
<ide> */
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> // Access helpers from AngularJS core.
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> * at a time and this depends on the prioritization of the messages within the template. (This can
<ide> * be changed by using the `ng-messages-multiple` or `multiple` attribute on the directive container.)
<ide> *
<del> * A remote template can also be used to promote message reusability and messages can also be
<del> * overridden.
<add> * A remote template can also be used (With {@link ngMessagesInclude}) to promote message
<add> * reusability and messages can also be overridden.
<add> *
<add> * A default message can also be displayed when no `ngMessage` directive is inserted, using the
<add> * {@link ngMessageDefault} directive.
<ide> *
<ide> * {@link module:ngMessages Click here} to learn more about `ngMessages` and `ngMessage`.
<ide> *
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> * <ANY ng-message="stringValue">...</ANY>
<ide> * <ANY ng-message="stringValue1, stringValue2, ...">...</ANY>
<ide> * <ANY ng-message-exp="expressionValue">...</ANY>
<add> * <ANY ng-message-default>...</ANY>
<ide> * </ANY>
<ide> *
<ide> * <!-- or by using element directives -->
<ide> * <ng-messages for="expression" role="alert">
<ide> * <ng-message when="stringValue">...</ng-message>
<ide> * <ng-message when="stringValue1, stringValue2, ...">...</ng-message>
<ide> * <ng-message when-exp="expressionValue">...</ng-message>
<add> * <ng-message-default>...</ng-message-default>
<ide> * </ng-messages>
<ide> * ```
<ide> *
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> * <div ng-message="required">You did not enter a field</div>
<ide> * <div ng-message="minlength">Your field is too short</div>
<ide> * <div ng-message="maxlength">Your field is too long</div>
<add> * <div ng-message-default>This field has an input error</div>
<ide> * </div>
<ide> * </form>
<ide> * </file>
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide>
<ide> var unmatchedMessages = [];
<ide> var matchedKeys = {};
<add> var truthyKeys = 0;
<ide> var messageItem = ctrl.head;
<ide> var messageFound = false;
<ide> var totalMessages = 0;
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> var messageUsed = false;
<ide> if (!messageFound) {
<ide> forEach(collection, function(value, key) {
<del> if (!messageUsed && truthy(value) && messageCtrl.test(key)) {
<del> // this is to prevent the same error name from showing up twice
<del> if (matchedKeys[key]) return;
<del> matchedKeys[key] = true;
<add> if (truthy(value) && !messageUsed) {
<add> truthyKeys++;
<add>
<add> if (messageCtrl.test(key)) {
<add> // this is to prevent the same error name from showing up twice
<add> if (matchedKeys[key]) return;
<add> matchedKeys[key] = true;
<ide>
<del> messageUsed = true;
<del> messageCtrl.attach();
<add> messageUsed = true;
<add> messageCtrl.attach();
<add> }
<ide> }
<ide> });
<ide> }
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> messageCtrl.detach();
<ide> });
<ide>
<del> if (unmatchedMessages.length !== totalMessages) {
<add> var messageMatched = unmatchedMessages.length !== totalMessages;
<add> var attachDefault = ctrl.default && !messageMatched && truthyKeys > 0;
<add>
<add> if (attachDefault) {
<add> ctrl.default.attach();
<add> } else if (ctrl.default) {
<add> ctrl.default.detach();
<add> }
<add>
<add> if (messageMatched || attachDefault) {
<ide> $animate.setClass($element, ACTIVE_CLASS, INACTIVE_CLASS);
<ide> } else {
<ide> $animate.setClass($element, INACTIVE_CLASS, ACTIVE_CLASS);
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> }
<ide> };
<ide>
<del> this.register = function(comment, messageCtrl) {
<del> var nextKey = latestKey.toString();
<del> messages[nextKey] = {
<del> message: messageCtrl
<del> };
<del> insertMessageNode($element[0], comment, nextKey);
<del> comment.$$ngMessageNode = nextKey;
<del> latestKey++;
<add> this.register = function(comment, messageCtrl, isDefault) {
<add> if (isDefault) {
<add> ctrl.default = messageCtrl;
<add> } else {
<add> var nextKey = latestKey.toString();
<add> messages[nextKey] = {
<add> message: messageCtrl
<add> };
<add> insertMessageNode($element[0], comment, nextKey);
<add> comment.$$ngMessageNode = nextKey;
<add> latestKey++;
<add> }
<ide>
<ide> ctrl.reRender();
<ide> };
<ide>
<del> this.deregister = function(comment) {
<del> var key = comment.$$ngMessageNode;
<del> delete comment.$$ngMessageNode;
<del> removeMessageNode($element[0], comment, key);
<del> delete messages[key];
<add> this.deregister = function(comment, isDefault) {
<add> if (isDefault) {
<add> delete ctrl.default;
<add> } else {
<add> var key = comment.$$ngMessageNode;
<add> delete comment.$$ngMessageNode;
<add> removeMessageNode($element[0], comment, key);
<add> delete messages[key];
<add> }
<ide> ctrl.reRender();
<ide> };
<ide>
<ide> angular.module('ngMessages', [], function initAngularHelpers() {
<ide> *
<ide> * @param {expression} ngMessageExp|whenExp an expression value corresponding to the message key.
<ide> */
<del> .directive('ngMessageExp', ngMessageDirectiveFactory());
<add> .directive('ngMessageExp', ngMessageDirectiveFactory())
<add>
<add> /**
<add> * @ngdoc directive
<add> * @name ngMessageDefault
<add> * @restrict AE
<add> * @scope
<add> *
<add> * @description
<add> * `ngMessageDefault` is a directive with the purpose to show and hide a default message for
<add> * {@link ngMessages}, when none of provided messages matches.
<add> *
<add> * More information about using `ngMessageDefault` can be found in the
<add> * {@link module:ngMessages `ngMessages` module documentation}.
<add> *
<add> * @usage
<add> * ```html
<add> * <!-- using attribute directives -->
<add> * <ANY ng-messages="expression" role="alert">
<add> * <ANY ng-message="stringValue">...</ANY>
<add> * <ANY ng-message="stringValue1, stringValue2, ...">...</ANY>
<add> * <ANY ng-message-default>...</ANY>
<add> * </ANY>
<add> *
<add> * <!-- or by using element directives -->
<add> * <ng-messages for="expression" role="alert">
<add> * <ng-message when="stringValue">...</ng-message>
<add> * <ng-message when="stringValue1, stringValue2, ...">...</ng-message>
<add> * <ng-message-default>...</ng-message-default>
<add> * </ng-messages>
<add> *
<add> */
<add> .directive('ngMessageDefault', ngMessageDirectiveFactory(true));
<ide>
<del>function ngMessageDirectiveFactory() {
<add>function ngMessageDirectiveFactory(isDefault) {
<ide> return ['$animate', function($animate) {
<ide> return {
<ide> restrict: 'AE',
<ide> function ngMessageDirectiveFactory() {
<ide> terminal: true,
<ide> require: '^^ngMessages',
<ide> link: function(scope, element, attrs, ngMessagesCtrl, $transclude) {
<del> var commentNode = element[0];
<del>
<del> var records;
<del> var staticExp = attrs.ngMessage || attrs.when;
<del> var dynamicExp = attrs.ngMessageExp || attrs.whenExp;
<del> var assignRecords = function(items) {
<del> records = items
<del> ? (isArray(items)
<del> ? items
<del> : items.split(/[\s,]+/))
<del> : null;
<del> ngMessagesCtrl.reRender();
<del> };
<add> var commentNode, records, staticExp, dynamicExp;
<add>
<add> if (!isDefault) {
<add> commentNode = element[0];
<add> staticExp = attrs.ngMessage || attrs.when;
<add> dynamicExp = attrs.ngMessageExp || attrs.whenExp;
<add>
<add> var assignRecords = function(items) {
<add> records = items
<add> ? (isArray(items)
<add> ? items
<add> : items.split(/[\s,]+/))
<add> : null;
<add> ngMessagesCtrl.reRender();
<add> };
<ide>
<del> if (dynamicExp) {
<del> assignRecords(scope.$eval(dynamicExp));
<del> scope.$watchCollection(dynamicExp, assignRecords);
<del> } else {
<del> assignRecords(staticExp);
<add> if (dynamicExp) {
<add> assignRecords(scope.$eval(dynamicExp));
<add> scope.$watchCollection(dynamicExp, assignRecords);
<add> } else {
<add> assignRecords(staticExp);
<add> }
<ide> }
<ide>
<ide> var currentElement, messageCtrl;
<ide> function ngMessageDirectiveFactory() {
<ide> // If the message element was removed via a call to `detach` then `currentElement` will be null
<ide> // So this handler only handles cases where something else removed the message element.
<ide> if (currentElement && currentElement.$$attachId === $$attachId) {
<del> ngMessagesCtrl.deregister(commentNode);
<add> ngMessagesCtrl.deregister(commentNode, isDefault);
<ide> messageCtrl.detach();
<ide> }
<ide> newScope.$destroy();
<ide> function ngMessageDirectiveFactory() {
<ide> $animate.leave(elm);
<ide> }
<ide> }
<del> });
<add> }, isDefault);
<ide>
<ide> // We need to ensure that this directive deregisters itself when it no longer exists
<ide> // Normally this is done when the attached element is destroyed; but if this directive
<ide> // gets removed before we attach the message to the DOM there is nothing to watch
<ide> // in which case we must deregister when the containing scope is destroyed.
<ide> scope.$on('$destroy', function() {
<del> ngMessagesCtrl.deregister(commentNode);
<add> ngMessagesCtrl.deregister(commentNode, isDefault);
<ide> });
<ide> }
<ide> };
<ide><path>test/ngMessages/messagesSpec.js
<ide> describe('ngMessages', function() {
<ide> );
<ide>
<ide>
<add> describe('default message', function() {
<add> it('should render a default message when no message matches', inject(function($rootScope, $compile) {
<add> element = $compile('<div ng-messages="col">' +
<add> ' <div ng-message="val">Message is set</div>' +
<add> ' <div ng-message-default>Default message is set</div>' +
<add> '</div>')($rootScope);
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { unexpected: false };
<add> });
<add>
<add> $rootScope.$digest();
<add>
<add> expect(element.text().trim()).toBe('');
<add> expect(element).not.toHaveClass('ng-active');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { unexpected: true };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Default message is set');
<add> expect(element).toHaveClass('ng-active');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { unexpected: false };
<add> });
<add>
<add> expect(element.text().trim()).toBe('');
<add> expect(element).not.toHaveClass('ng-active');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { val: true, unexpected: true };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Message is set');
<add> expect(element).toHaveClass('ng-active');
<add> }));
<add>
<add> it('should not render a default message with ng-messages-multiple if another error matches',
<add> inject(function($rootScope, $compile) {
<add> element = $compile('<div ng-messages="col" ng-messages-multiple>' +
<add> ' <div ng-message="val">Message is set</div>' +
<add> ' <div ng-message="other">Other message is set</div>' +
<add> ' <div ng-message-default>Default message is set</div>' +
<add> '</div>')($rootScope);
<add>
<add> expect(element.text().trim()).toBe('');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { val: true, other: false, unexpected: false };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Message is set');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { val: true, other: true, unexpected: true };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Message is set Other message is set');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { val: false, other: false, unexpected: true };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Default message is set');
<add> })
<add> );
<add>
<add> it('should handle a default message with ngIf', inject(function($rootScope, $compile) {
<add> element = $compile('<div ng-messages="col">' +
<add> ' <div ng-message="val">Message is set</div>' +
<add> ' <div ng-if="default" ng-message-default>Default message is set</div>' +
<add> '</div>')($rootScope);
<add> $rootScope.default = true;
<add> $rootScope.col = {unexpected: true};
<add> $rootScope.$digest();
<add>
<add> expect(element.text().trim()).toBe('Default message is set');
<add>
<add> $rootScope.$apply('default = false');
<add>
<add> expect(element.text().trim()).toBe('');
<add>
<add> $rootScope.$apply('default = true');
<add>
<add> expect(element.text().trim()).toBe('Default message is set');
<add>
<add> $rootScope.$apply(function() {
<add> $rootScope.col = { val: true };
<add> });
<add>
<add> expect(element.text().trim()).toBe('Message is set');
<add> }));
<add> });
<add>
<ide> describe('when including templates', function() {
<ide> they('should work with a dynamic collection model which is managed by ngRepeat',
<ide> {'<div ng-messages-include="...">': '<div ng-messages="item">' + | 2 |
Python | Python | remove code for old django versions | b82ec540ad447dcf703a9b21e9e94976109fd175 | <ide><path>rest_framework/views.py
<ide> def force_evaluation():
<ide> 'Use `.all()` or call `.get_queryset()` instead.'
<ide> )
<ide> cls.queryset._fetch_all = force_evaluation
<del> cls.queryset._result_iter = force_evaluation # Django <= 1.5
<ide>
<ide> view = super(APIView, cls).as_view(**initkwargs)
<ide> view.cls = cls
<ide><path>rest_framework/viewsets.py
<ide> def view(request, *args, **kwargs):
<ide> handler = getattr(self, action)
<ide> setattr(self, method, handler)
<ide>
<del> # Patch this in as it's otherwise only present from 1.5 onwards
<del> if hasattr(self, 'get') and not hasattr(self, 'head'):
<del> self.head = self.get
<del>
<ide> # And continue as usual
<ide> return self.dispatch(request, *args, **kwargs)
<ide> | 2 |
Python | Python | remove unused import | 1368c31a705a4892995f42cf5e0dcdcbfa13a1ce | <ide><path>rest_framework/generics.py
<ide> from django.utils import six
<ide> from django.utils.translation import ugettext as _
<ide> from rest_framework import views, mixins
<del>from rest_framework.exceptions import NotFound
<ide> from rest_framework.settings import api_settings
<ide>
<ide> | 1 |
Javascript | Javascript | expose stuff and all non-bundle modules | 2c8ea603680fb731f86db385360ff6e01a8d1b51 | <ide><path>lib/JavascriptModulesPlugin.js
<ide> class JavascriptModulesPlugin {
<ide> (source, chunk, hash, moduleTemplate, dependencyTemplates) => {
<ide> return Template.renderChunkModules(
<ide> chunk,
<del> () => true,
<add> m => typeof m.source === "function",
<ide> moduleTemplate,
<ide> dependencyTemplates,
<ide> "/******/ "
<ide> class JavascriptModulesPlugin {
<ide> renderJavascript(chunkTemplate, chunk, moduleTemplate, dependencyTemplates) {
<ide> const moduleSources = Template.renderChunkModules(
<ide> chunk,
<del> m => true,
<add> m => typeof m.source === "function",
<ide> moduleTemplate,
<ide> dependencyTemplates
<ide> );
<ide><path>lib/webpack.js
<ide> exportPlugins(exports, {
<ide> ContextExclusionPlugin: () => require("./ContextExclusionPlugin"),
<ide> ContextReplacementPlugin: () => require("./ContextReplacementPlugin"),
<ide> DefinePlugin: () => require("./DefinePlugin"),
<add> Dependency: () => require("./Dependency"),
<ide> DllPlugin: () => require("./DllPlugin"),
<ide> DllReferencePlugin: () => require("./DllReferencePlugin"),
<ide> EnvironmentPlugin: () => require("./EnvironmentPlugin"),
<ide> exportPlugins(exports, {
<ide> LoaderOptionsPlugin: () => require("./LoaderOptionsPlugin"),
<ide> LoaderTargetPlugin: () => require("./LoaderTargetPlugin"),
<ide> MemoryOutputFileSystem: () => require("./MemoryOutputFileSystem"),
<add> Module: () => require("./Module"),
<ide> ModuleFilenameHelpers: () => require("./ModuleFilenameHelpers"),
<ide> NamedChunksPlugin: () => require("./NamedChunksPlugin"),
<ide> NamedModulesPlugin: () => require("./NamedModulesPlugin"),
<ide> exportPlugins(exports, {
<ide> SingleEntryPlugin: () => require("./SingleEntryPlugin"),
<ide> SourceMapDevToolPlugin: () => require("./SourceMapDevToolPlugin"),
<ide> Stats: () => require("./Stats"),
<add> Template: () => require("./Template"),
<ide> UmdMainTemplatePlugin: () => require("./UmdMainTemplatePlugin"),
<ide> WatchIgnorePlugin: () => require("./WatchIgnorePlugin")
<ide> }); | 2 |
Javascript | Javascript | add source maps to all min files | 908071afbf32c46fe9110e4a67e104bbd4b3a56b | <ide><path>lib/grunt/utils.js
<ide> module.exports = {
<ide> },
<ide>
<ide>
<del> singleStrict: function(src, insert, newline){
<del> var useStrict = newline ? "$1\n'use strict';" : "$1'use strict';";
<add> singleStrict: function(src, insert){
<ide> return src
<ide> .replace(/\s*("|')use strict("|');\s*/g, insert) // remove all file-specific strict mode flags
<del> .replace(/(\(function\([^)]*\)\s*\{)/, useStrict); // add single strict mode flag
<add> .replace(/(\(function\([^)]*\)\s*\{)/, "$1'use strict';"); // add single strict mode flag
<add> },
<add>
<add>
<add> sourceMap: function(mapFile, fileContents) {
<add> // use the following once Chrome beta or stable supports the //# pragma
<add> // var sourceMapLine = '//# sourceMappingURL=' + mapFile + '\n';
<add> var sourceMapLine = '/*\n//@ sourceMappingURL=' + mapFile + '\n*/\n';
<add> return fileContents + sourceMapLine;
<ide> },
<ide>
<ide>
<ide> min: function(file, done) {
<ide> var minFile = file.replace(/\.js$/, '.min.js');
<add> var mapFile = minFile + '.map';
<add> var mapFileName = mapFile.match(/[^\/]+$/)[0];
<ide> shell.exec(
<ide> 'java ' +
<ide> this.java32flags() + ' ' +
<ide> '-jar components/closure-compiler/compiler.jar ' +
<ide> '--compilation_level SIMPLE_OPTIMIZATIONS ' +
<ide> '--language_in ECMASCRIPT5_STRICT ' +
<add> '--source_map_format=V3 ' +
<add> '--create_source_map ' + mapFile + ' ' +
<ide> '--js ' + file + ' ' +
<ide> '--js_output_file ' + minFile,
<ide> function(code) {
<ide> if (code !== 0) grunt.fail.warn('Error minifying ' + file);
<del> grunt.file.write(minFile, this.singleStrict(grunt.file.read(minFile), '\n'));
<add>
<add> // closure creates the source map relative to build/ folder, we need to strip those references
<add> grunt.file.write(mapFile, grunt.file.read(mapFile).replace('"file":"build/', '"file":"').
<add> replace('"sources":["build/','"sources":["'));
<add>
<add> // move add use strict into the closure + add source map pragma
<add> grunt.file.write(minFile, this.sourceMap(mapFileName, this.singleStrict(grunt.file.read(minFile), '\n')));
<ide> grunt.log.ok(file + ' minified into ' + minFile);
<ide> done();
<ide> }.bind(this)); | 1 |
Python | Python | fix ja leading spaces | 332803eda9e9999434d4da41e56d1689f353bbd8 | <ide><path>spacy/lang/ja/__init__.py
<ide> def get_dtokens_and_spaces(dtokens, text, gap_tag="空白"):
<ide> return text_dtokens, text_spaces
<ide>
<ide> # align words and dtokens by referring text, and insert gap tokens for the space char spans
<del> for word, dtoken in zip(words, dtokens):
<add> for i, (word, dtoken) in enumerate(zip(words, dtokens)):
<ide> # skip all space tokens
<ide> if word.isspace():
<ide> continue
<ide> def get_dtokens_and_spaces(dtokens, text, gap_tag="空白"):
<ide> text_spaces.append(False)
<ide> text_pos += len(word)
<ide> # poll a space char after the word
<del> if text_pos < len(text) and text[text_pos] == " ":
<add> if i + 1 < len(dtokens) and dtokens[i + 1].surface == " ":
<ide> text_spaces[-1] = True
<ide> text_pos += 1
<ide>
<ide><path>spacy/tests/tokenizer/test_naughty_strings.py
<ide> r"₀₁₂",
<ide> r"⁰⁴⁵₀₁₂",
<ide> r"ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็ ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็ ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็",
<add> r" ̄ ̄",
<ide> # Two-Byte Characters
<ide> r"田中さんにあげて下さい",
<ide> r"パーティーへ行かないか", | 2 |
Javascript | Javascript | use buffer.from instead of slice | 894ef94b86d6bda4c8c8ae35e348185faa13d3d9 | <ide><path>lib/serialization/FileMiddleware.js
<ide> const deserialize = async (middleware, name, readFile) => {
<ide> } else if (contentPosition !== 0) {
<ide> if (length <= contentItemLength - contentPosition) {
<ide> result.push(
<del> contentItem.slice(contentPosition, contentPosition + length)
<add> Buffer.from(
<add> contentItem.buffer,
<add> contentItem.byteOffset + contentPosition,
<add> length
<add> )
<ide> );
<ide> contentPosition += length;
<ide> length = 0;
<ide> } else {
<del> result.push(contentItem.slice(contentPosition));
<add> result.push(
<add> Buffer.from(
<add> contentItem.buffer,
<add> contentItem.byteOffset + contentPosition
<add> )
<add> );
<ide> length -= contentItemLength - contentPosition;
<ide> contentPosition = contentItemLength;
<ide> }
<ide> class FileMiddleware extends SerializerMiddleware {
<ide> let readBuffer = currentBuffer;
<ide> let readOffset = currentBufferUsed;
<ide> let readLength = currentBuffer.length - currentBufferUsed;
<add> // values passed to fs.read must be valid int32 values
<ide> if (readOffset > 0x7fffffff) {
<ide> readBuffer = currentBuffer.slice(readOffset);
<ide> readOffset = 0; | 1 |
Java | Java | optimize single blockingobservable operations | 3816f473cb9f7d1a691855d053720f57bf51a0cd | <ide><path>rxjava/src/main/java/rx/observables/BlockingObservable.java
<ide> /**
<ide> * Copyright 2014 Netflix, Inc.
<del> *
<add> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> * You may obtain a copy of the License at
<del> *
<add> *
<ide> * http://www.apache.org/licenses/LICENSE-2.0
<del> *
<add> *
<ide> * Unless required by applicable law or agreed to in writing, software
<ide> * distributed under the License is distributed on an "AS IS" BASIS,
<ide> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<ide> public static <T> BlockingObservable<T> from(final Observable<? extends T> o) {
<ide> * need the {@link Subscriber#onCompleted()} or {@link Subscriber#onError(Throwable)} methods.
<ide> * <p>
<ide> * <img width="640" height="330" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.forEach.png" alt="">
<del> *
<add> *
<ide> * @param onNext
<ide> * the {@link Action1} to invoke for each item emitted by the {@code BlockingObservable}
<ide> * @throws RuntimeException
<ide> public void onNext(T args) {
<ide> * Returns an {@link Iterator} that iterates over all items emitted by this {@code BlockingObservable}.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.getIterator.png" alt="">
<del> *
<add> *
<ide> * @return an {@link Iterator} that can iterate over the items emitted by this {@code BlockingObservable}
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#transformations-tofuture-toiterable-and-toiteratorgetiterator">RxJava Wiki: getIterator()</a>
<ide> */
<ide> public Iterator<T> getIterator() {
<ide> /**
<ide> * Returns the first item emitted by this {@code BlockingObservable}, or throws
<ide> * {@code NoSuchElementException} if it emits no items.
<del> *
<add> *
<ide> * @return the first item emitted by this {@code BlockingObservable}
<ide> * @throws NoSuchElementException
<ide> * if this {@code BlockingObservable} emits no items
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#first-and-firstordefault">RxJava Wiki: first()</a>
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/hh229177.aspx">MSDN: Observable.First</a>
<ide> */
<ide> public T first() {
<del> return from(o.first()).single();
<add> return blockForSingle(o.first());
<ide> }
<ide>
<ide> /**
<ide> * Returns the first item emitted by this {@code BlockingObservable} that matches a predicate, or throws
<ide> * {@code NoSuchElementException} if it emits no such item.
<del> *
<add> *
<ide> * @param predicate
<ide> * a predicate function to evaluate items emitted by this {@code BlockingObservable}
<ide> * @return the first item emitted by this {@code BlockingObservable} that matches the predicate
<ide> public T first() {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/hh229739.aspx">MSDN: Observable.First</a>
<ide> */
<ide> public T first(Func1<? super T, Boolean> predicate) {
<del> return from(o.first(predicate)).single();
<add> return blockForSingle(o.first(predicate));
<ide> }
<ide>
<ide> /**
<ide> * Returns the first item emitted by this {@code BlockingObservable}, or a default value if it emits no
<ide> * items.
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no items
<ide> * @return the first item emitted by this {@code BlockingObservable}, or the default value if it emits no
<ide> public T first(Func1<? super T, Boolean> predicate) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/hh229320.aspx">MSDN: Observable.FirstOrDefault</a>
<ide> */
<ide> public T firstOrDefault(T defaultValue) {
<del> return from(o.take(1)).singleOrDefault(defaultValue);
<add> return blockForSingle(o.map(Functions.<T>identity()).firstOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> * Returns the first item emitted by this {@code BlockingObservable} that matches a predicate, or a default
<ide> * value if it emits no such items.
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no matching items
<ide> * @param predicate
<ide> public T firstOrDefault(T defaultValue) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/hh229759.aspx">MSDN: Observable.FirstOrDefault</a>
<ide> */
<ide> public T firstOrDefault(T defaultValue, Func1<? super T, Boolean> predicate) {
<del> return from(o.filter(predicate)).firstOrDefault(defaultValue);
<add> return blockForSingle(o.filter(predicate).map(Functions.<T>identity()).firstOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> * Returns the last item emitted by this {@code BlockingObservable}, or throws
<ide> * {@code NoSuchElementException} if this {@code BlockingObservable} emits no items.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.last.png" alt="">
<del> *
<add> *
<ide> * @return the last item emitted by this {@code BlockingObservable}
<ide> * @throws NoSuchElementException
<ide> * if this {@code BlockingObservable} emits no items
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#last-and-lastordefault">RxJava Wiki: last()</a>
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.last.aspx">MSDN: Observable.Last</a>
<ide> */
<ide> public T last() {
<del> return from(o.last()).single();
<add> return blockForSingle(o.last());
<ide> }
<ide>
<ide> /**
<ide> * Returns the last item emitted by this {@code BlockingObservable} that matches a predicate, or throws
<ide> * {@code NoSuchElementException} if it emits no such items.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.last.p.png" alt="">
<del> *
<add> *
<ide> * @param predicate
<ide> * a predicate function to evaluate items emitted by the {@code BlockingObservable}
<ide> * @return the last item emitted by the {@code BlockingObservable} that matches the predicate
<ide> public T last() {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.last.aspx">MSDN: Observable.Last</a>
<ide> */
<ide> public T last(final Func1<? super T, Boolean> predicate) {
<del> return from(o.last(predicate)).single();
<add> return blockForSingle(o.last(predicate));
<ide> }
<ide>
<ide> /**
<ide> * Returns the last item emitted by this {@code BlockingObservable}, or a default value if it emits no
<ide> * items.
<ide> * <p>
<ide> * <img width="640" height="310" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.lastOrDefault.png" alt="">
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no items
<ide> * @return the last item emitted by the {@code BlockingObservable}, or the default value if it emits no
<ide> public T last(final Func1<? super T, Boolean> predicate) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.lastordefault.aspx">MSDN: Observable.LastOrDefault</a>
<ide> */
<ide> public T lastOrDefault(T defaultValue) {
<del> return from(o.takeLast(1)).singleOrDefault(defaultValue);
<add> return blockForSingle(o.map(Functions.<T>identity()).lastOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> * Returns the last item emitted by this {@code BlockingObservable} that matches a predicate, or a default
<ide> * value if it emits no such items.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.lastOrDefault.p.png" alt="">
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no matching items
<ide> * @param predicate
<ide> public T lastOrDefault(T defaultValue) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.lastordefault.aspx">MSDN: Observable.LastOrDefault</a>
<ide> */
<ide> public T lastOrDefault(T defaultValue, Func1<? super T, Boolean> predicate) {
<del> return from(o.filter(predicate)).lastOrDefault(defaultValue);
<add> return blockForSingle(o.filter(predicate).map(Functions.<T>identity()).lastOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> * Returns an {@link Iterable} that always returns the item most recently emitted by this
<ide> * {@code BlockingObservable}.
<ide> * <p>
<ide> * <img width="640" height="490" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.mostRecent.png" alt="">
<del> *
<add> *
<ide> * @param initialValue
<ide> * the initial value that the {@link Iterable} sequence will yield if this
<ide> * {@code BlockingObservable} has not yet emitted an item
<ide> public Iterable<T> mostRecent(T initialValue) {
<ide> * returns that item.
<ide> * <p>
<ide> * <img width="640" height="490" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.next.png" alt="">
<del> *
<add> *
<ide> * @return an {@link Iterable} that blocks upon each iteration until this {@code BlockingObservable} emits
<ide> * a new item, whereupon the Iterable returns that item
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#next">RxJava Wiki: next()</a>
<ide> public Iterable<T> next() {
<ide> * <p>
<ide> * Note also that an {@code onNext} directly followed by {@code onCompleted} might hide the {@code onNext}
<ide> * event.
<del> *
<add> *
<ide> * @return an Iterable that always returns the latest item emitted by this {@code BlockingObservable}
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#latest">RxJava wiki: latest()</a>
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/hh212115.aspx">MSDN: Observable.Latest</a>
<ide> public Iterable<T> latest() {
<ide> * throw a {@code NoSuchElementException}.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.single.png" alt="">
<del> *
<add> *
<ide> * @return the single item emitted by this {@code BlockingObservable}
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#single-and-singleordefault">RxJava Wiki: single()</a>
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.single.aspx">MSDN: Observable.Single</a>
<ide> */
<ide> public T single() {
<del> return from(o.single()).toIterable().iterator().next();
<add> return blockForSingle(o.single());
<ide> }
<ide>
<ide> /**
<ide> * If this {@code BlockingObservable} completes after emitting a single item that matches a given predicate,
<ide> * return that item, otherwise throw a {@code NoSuchElementException}.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.single.p.png" alt="">
<del> *
<add> *
<ide> * @param predicate
<ide> * a predicate function to evaluate items emitted by this {@link BlockingObservable}
<ide> * @return the single item emitted by this {@code BlockingObservable} that matches the predicate
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#single-and-singleordefault">RxJava Wiki: single()</a>
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.single.aspx">MSDN: Observable.Single</a>
<ide> */
<ide> public T single(Func1<? super T, Boolean> predicate) {
<del> return from(o.single(predicate)).toIterable().iterator().next();
<add> return blockForSingle(o.single(predicate));
<ide> }
<ide>
<ide> /**
<ide> public T single(Func1<? super T, Boolean> predicate) {
<ide> * value.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.singleOrDefault.png" alt="">
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no items
<ide> * @return the single item emitted by this {@code BlockingObservable}, or the default value if it emits no
<ide> public T single(Func1<? super T, Boolean> predicate) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.singleordefault.aspx">MSDN: Observable.SingleOrDefault</a>
<ide> */
<ide> public T singleOrDefault(T defaultValue) {
<del> return from(o.map(Functions.<T>identity()).singleOrDefault(defaultValue)).single();
<add> return blockForSingle(o.map(Functions.<T>identity()).singleOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> public T singleOrDefault(T defaultValue) {
<ide> * emits no items, return a default value.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.singleOrDefault.p.png" alt="">
<del> *
<add> *
<ide> * @param defaultValue
<ide> * a default value to return if this {@code BlockingObservable} emits no matching items
<ide> * @param predicate
<ide> public T singleOrDefault(T defaultValue) {
<ide> * @see <a href="http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.singleordefault.aspx">MSDN: Observable.SingleOrDefault</a>
<ide> */
<ide> public T singleOrDefault(T defaultValue, Func1<? super T, Boolean> predicate) {
<del> return from(o.filter(predicate)).singleOrDefault(defaultValue);
<add> return blockForSingle(o.filter(predicate).map(Functions.<T>identity()).singleOrDefault(defaultValue));
<ide> }
<ide>
<ide> /**
<ide> public T singleOrDefault(T defaultValue, Func1<? super T, Boolean> predicate) {
<ide> * If the {@code BlockingObservable} may emit more than one item, use {@code Observable.toList().toBlocking().toFuture()}.
<ide> * <p>
<ide> * <img width="640" height="395" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.toFuture.png" alt="">
<del> *
<add> *
<ide> * @return a {@link Future} that expects a single item to be emitted by this {@code BlockingObservable}
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#transformations-tofuture-toiterable-and-toiteratorgetiterator">RxJava Wiki: toFuture()</a>
<ide> */
<ide> public Future<T> toFuture() {
<ide> * Converts this {@code BlockingObservable} into an {@link Iterable}.
<ide> * <p>
<ide> * <img width="640" height="315" src="https://github.com/ReactiveX/RxJava/wiki/images/rx-operators/B.toIterable.png" alt="">
<del> *
<add> *
<ide> * @return an {@link Iterable} version of this {@code BlockingObservable}
<ide> * @see <a href="https://github.com/ReactiveX/RxJava/wiki/Blocking-Observable-Operators#transformations-tofuture-toiterable-and-toiteratorgetiterator">RxJava Wiki: toIterable()</a>
<ide> */
<ide> public Iterator<T> iterator() {
<ide> }
<ide> };
<ide> }
<add>
<add> /**
<add> * Helper method which handles the actual blocking for a single response.
<add> * <p>
<add> * If the {@link Observable} errors, it will be thrown right away.
<add> *
<add> * @return the actual item.
<add> */
<add> private T blockForSingle(final Observable<? extends T> observable) {
<add> final AtomicReference<T> returnItem = new AtomicReference<T>();
<add> final AtomicReference<Throwable> returnException = new AtomicReference<Throwable>();
<add> final CountDownLatch latch = new CountDownLatch(1);
<add>
<add> observable.subscribe(new Subscriber<T>() {
<add> @Override
<add> public void onCompleted() {
<add> latch.countDown();
<add> }
<add>
<add> @Override
<add> public void onError(final Throwable e) {
<add> returnException.set(e);
<add> latch.countDown();
<add> }
<add>
<add> @Override
<add> public void onNext(final T item) {
<add> returnItem.set(item);
<add> }
<add> });
<add>
<add> try {
<add> latch.await();
<add> } catch (InterruptedException e) {
<add> Thread.currentThread().interrupt();
<add> throw new RuntimeException("Interrupted while waiting for subscription to complete.", e);
<add> }
<add>
<add> if (returnException.get() != null) {
<add> if (returnException.get() instanceof RuntimeException) {
<add> throw (RuntimeException) returnException.get();
<add> } else {
<add> throw new RuntimeException(returnException.get());
<add> }
<add> }
<add>
<add> return returnItem.get();
<add> }
<ide> }
<ide><path>rxjava/src/perf/java/rx/observables/BlockingObservablePerf.java
<add>/**
<add> * Copyright 2014 Netflix, Inc.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>package rx.observables;
<add>
<add>import org.openjdk.jmh.annotations.Benchmark;
<add>import org.openjdk.jmh.annotations.BenchmarkMode;
<add>import org.openjdk.jmh.annotations.Mode;
<add>import org.openjdk.jmh.annotations.OutputTimeUnit;
<add>import org.openjdk.jmh.annotations.Param;
<add>import org.openjdk.jmh.annotations.Scope;
<add>import org.openjdk.jmh.annotations.State;
<add>import rx.Observable;
<add>import rx.functions.Func1;
<add>import rx.jmh.InputWithIncrementingInteger;
<add>
<add>import java.util.concurrent.TimeUnit;
<add>
<add>@BenchmarkMode(Mode.Throughput)
<add>@OutputTimeUnit(TimeUnit.SECONDS)
<add>@State(Scope.Thread)
<add>public class BlockingObservablePerf {
<add>
<add> @State(Scope.Thread)
<add> public static class MultiInput extends InputWithIncrementingInteger {
<add>
<add> @Param({ "1", "1000", "1000000" })
<add> public int size;
<add>
<add> @Override
<add> public int getSize() {
<add> return size;
<add> }
<add>
<add> }
<add>
<add> @State(Scope.Thread)
<add> public static class SingleInput extends InputWithIncrementingInteger {
<add>
<add> @Param({ "1" })
<add> public int size;
<add>
<add> @Override
<add> public int getSize() {
<add> return size;
<add> }
<add>
<add> }
<add>
<add> @Benchmark
<add> public int benchSingle(final SingleInput input) {
<add> return input.observable.toBlocking().single();
<add> }
<add>
<add> @Benchmark
<add> public int benchFirst(final MultiInput input) {
<add> return input.observable.toBlocking().first();
<add> }
<add>
<add> @Benchmark
<add> public int benchLast(final MultiInput input) {
<add> return input.observable.toBlocking().last();
<add> }
<add>
<add>}
<ide><path>rxjava/src/test/java/rx/observables/BlockingObservableTest.java
<ide> public Boolean call(Integer args) {
<ide> @Test
<ide> public void testLastWithPredicate() {
<ide> BlockingObservable<String> obs = BlockingObservable.from(Observable.just("one", "two", "three"));
<del>
<ide> assertEquals("two", obs.last(new Func1<String, Boolean>() {
<ide> @Override
<ide> public Boolean call(String s) {
<ide> public Boolean call(String s) {
<ide> }));
<ide> }
<ide>
<add> @Test
<ide> public void testSingle() {
<ide> BlockingObservable<String> observable = BlockingObservable.from(Observable.just("one"));
<ide> assertEquals("one", observable.single());
<ide><path>rxjava/src/test/java/rx/util/AssertObservableTest.java
<ide> public void testPassNull() {
<ide> AssertObservable.assertObservableEqualsBlocking("foo", null, null);
<ide> }
<ide>
<del> @Test(expected = AssertionError.class)
<add> @Test(expected = RuntimeException.class)
<ide> public void testFailNotNull() {
<ide> AssertObservable.assertObservableEqualsBlocking("foo", Observable.just(1, 2), Observable.just(1));
<ide> }
<ide>
<del> @Test(expected = AssertionError.class)
<add> @Test(expected = RuntimeException.class)
<ide> public void testFailNull() {
<ide> AssertObservable.assertObservableEqualsBlocking("foo", Observable.just(1, 2), null);
<ide> } | 4 |
Javascript | Javascript | use const instead of var in require | 6d21a74c5da48b36cde147993fe9d0e8154a9457 | <ide><path>gulpfile.js
<del>var elixir = require('laravel-elixir');
<add>const elixir = require('laravel-elixir');
<ide>
<ide> require('laravel-elixir-vue');
<ide> | 1 |
Javascript | Javascript | update router.js. fixes | f6e85ce7aeb4ba2bcff9d3faf0df5fffa60d1f37 | <ide><path>packages/ember-routing/lib/vendor/router.js
<ide> define("router",
<ide> */
<ide> function getMatchPoint(router, handlers, objects, inputParams) {
<ide>
<del> var objectsToMatch = objects.length,
<del> matchPoint = handlers.length,
<add> var matchPoint = handlers.length,
<ide> providedModels = {}, i,
<ide> currentHandlerInfos = router.currentHandlerInfos || [],
<ide> params = {},
<ide> oldParams = router.currentParams || {},
<ide> activeTransition = router.activeTransition,
<del> handlerParams = {};
<add> handlerParams = {},
<add> obj;
<ide>
<add> objects = slice.call(objects);
<ide> merge(params, inputParams);
<ide>
<ide> for (i = handlers.length - 1; i >= 0; i--) {
<ide> define("router",
<ide> if (handlerObj.isDynamic) {
<ide> // URL transition.
<ide>
<del> if (objectsToMatch > 0) {
<add> if (obj = getMatchPointObject(objects, handlerName, activeTransition, true)) {
<ide> hasChanged = true;
<del> providedModels[handlerName] = objects[--objectsToMatch];
<add> providedModels[handlerName] = obj;
<ide> } else {
<ide> handlerParams[handlerName] = {};
<ide> for (var prop in handlerObj.params) {
<ide> define("router",
<ide> handlerParams[handlerName][prop] = params[prop] = newParam;
<ide> }
<ide> }
<del> } else if (handlerObj.hasOwnProperty('names') && handlerObj.names.length) {
<add> } else if (handlerObj.hasOwnProperty('names')) {
<ide> // Named transition.
<ide>
<del> if (objectsToMatch > 0) {
<add> if (obj = getMatchPointObject(objects, handlerName, activeTransition, handlerObj.names.length)) {
<ide> hasChanged = true;
<del> providedModels[handlerName] = objects[--objectsToMatch];
<del> } else if (activeTransition && activeTransition.providedModels[handlerName]) {
<del>
<del> // Use model from previous transition attempt, preferably the resolved one.
<del> hasChanged = true;
<del> providedModels[handlerName] = activeTransition.providedModels[handlerName] ||
<del> activeTransition.resolvedModels[handlerName];
<add> providedModels[handlerName] = obj;
<ide> } else {
<ide> var names = handlerObj.names;
<ide> handlerParams[handlerName] = {};
<ide> for (var j = 0, len = names.length; j < len; ++j) {
<ide> var name = names[j];
<del> handlerParams[handlerName][name] = params[name] = oldParams[name];
<add> handlerParams[handlerName][name] = params[name] = oldParams[name] || params[name];
<ide> }
<ide> }
<ide> }
<ide>
<ide> if (hasChanged) { matchPoint = i; }
<ide> }
<ide>
<del> if (objectsToMatch > 0) {
<add> if (objects.length > 0) {
<ide> throw "More context objects were passed than there are dynamic segments for the route: " + handlers[handlers.length - 1].handler;
<ide> }
<ide>
<ide> return { matchPoint: matchPoint, providedModels: providedModels, params: params, handlerParams: handlerParams };
<ide> }
<ide>
<add> function getMatchPointObject(objects, handlerName, activeTransition, canUseProvidedObject) {
<add> if (objects.length && canUseProvidedObject) {
<add> return objects.pop();
<add> } else if (activeTransition) {
<add> // Use model from previous transition attempt, preferably the resolved one.
<add> return (canUseProvidedObject && activeTransition.providedModels[handlerName]) ||
<add> activeTransition.resolvedModels[handlerName];
<add> }
<add> }
<add>
<ide> /**
<ide> @private
<ide>
<ide> define("router",
<ide> handler = handlerInfo.handler,
<ide> handlerName = handlerInfo.name,
<ide> seq = transition.sequence,
<del> errorAlreadyHandled = false,
<del> resolvedModel;
<add> errorAlreadyHandled = false;
<ide>
<ide> if (index < matchPoint) {
<ide> log(router, seq, handlerName + ": using context from already-active handler");
<ide>
<ide> // We're before the match point, so don't run any hooks,
<ide> // just use the already resolved context from the handler.
<del> resolvedModel = handlerInfo.handler.context;
<add> transition.resolvedModels[handlerInfo.name] = handlerInfo.handler.context;
<ide> return proceed();
<ide> }
<ide>
<ide> define("router",
<ide> trigger(handlerInfos.slice(0, index + 1), true, ['error', reason, transition]);
<ide>
<ide> if (handler.error) {
<del> handler.error(reason, transition); }
<del>
<add> handler.error(reason, transition);
<add> }
<ide>
<ide> // Propagate the original error.
<ide> return RSVP.reject(reason);
<ide> define("router",
<ide> // want to use the value returned from `afterModel` in any way, but rather
<ide> // always resolve with the original `context` object.
<ide>
<del> resolvedModel = context;
<del> return handler.afterModel && handler.afterModel(resolvedModel, transition);
<add> transition.resolvedModels[handlerInfo.name] = context;
<add> return handler.afterModel && handler.afterModel(context, transition);
<ide> }
<ide>
<ide> function proceed() {
<ide> log(router, seq, handlerName + ": validation succeeded, proceeding");
<ide>
<del> handlerInfo.context = transition.resolvedModels[handlerInfo.name] = resolvedModel;
<add> handlerInfo.context = transition.resolvedModels[handlerInfo.name];
<ide> return validateEntry(transition, handlerInfos, index + 1, matchPoint, handlerParams);
<ide> }
<ide> }
<ide> define("router",
<ide> return handler.context;
<ide> }
<ide>
<del> if (handlerInfo.isDynamic && transition.providedModels.hasOwnProperty(handlerName)) {
<add> if (transition.providedModels.hasOwnProperty(handlerName)) {
<ide> var providedModel = transition.providedModels[handlerName];
<ide> return typeof providedModel === 'function' ? providedModel() : providedModel;
<ide> } | 1 |
Python | Python | update links from s3 to huggingface.co | 55e8d0cea25be18b044523b30f4bef58fec63289 | <ide><path>examples/seq2seq/bertabs/configuration_bertabs.py
<ide>
<ide>
<ide> BERTABS_FINETUNED_CONFIG_MAP = {
<del> "bertabs-finetuned-cnndm": "https://s3.amazonaws.com/models.huggingface.co/bert/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/config.json",
<add> "bertabs-finetuned-cnndm": "https://huggingface.co/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_albert.py
<ide>
<ide>
<ide> ALBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "albert-base-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v1-config.json",
<del> "albert-large-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v1-config.json",
<del> "albert-xlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v1-config.json",
<del> "albert-xxlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v1-config.json",
<del> "albert-base-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v2-config.json",
<del> "albert-large-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v2-config.json",
<del> "albert-xlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v2-config.json",
<del> "albert-xxlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v2-config.json",
<add> "albert-base-v1": "https://huggingface.co/albert-base-v1/resolve/main/config.json",
<add> "albert-large-v1": "https://huggingface.co/albert-large-v1/resolve/main/config.json",
<add> "albert-xlarge-v1": "https://huggingface.co/albert-xlarge-v1/resolve/main/config.json",
<add> "albert-xxlarge-v1": "https://huggingface.co/albert-xxlarge-v1/resolve/main/config.json",
<add> "albert-base-v2": "https://huggingface.co/albert-base-v2/resolve/main/config.json",
<add> "albert-large-v2": "https://huggingface.co/albert-large-v2/resolve/main/config.json",
<add> "albert-xlarge-v2": "https://huggingface.co/albert-xlarge-v2/resolve/main/config.json",
<add> "albert-xxlarge-v2": "https://huggingface.co/albert-xxlarge-v2/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_bart.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> BART_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "facebook/bart-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-base/config.json",
<del> "facebook/bart-large": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large/config.json",
<del> "facebook/bart-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-mnli/config.json",
<del> "facebook/bart-large-cnn": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/config.json",
<del> "facebook/bart-large-xsum": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-xsum/config.json",
<del> "facebook/mbart-large-en-ro": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-en-ro/config.json",
<del> "yjernite/bart_eli5": "https://s3.amazonaws.com/models.huggingface.co/bert/yjernite/bart_eli5/config.json",
<add> "facebook/bart-base": "https://huggingface.co/facebook/bart-base/resolve/main/config.json",
<add> "facebook/bart-large": "https://huggingface.co/facebook/bart-large/resolve/main/config.json",
<add> "facebook/bart-large-mnli": "https://huggingface.co/facebook/bart-large-mnli/resolve/main/config.json",
<add> "facebook/bart-large-cnn": "https://huggingface.co/facebook/bart-large-cnn/resolve/main/config.json",
<add> "facebook/bart-large-xsum": "https://huggingface.co/facebook/bart-large-xsum/resolve/main/config.json",
<add> "facebook/mbart-large-en-ro": "https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/config.json",
<add> "yjernite/bart_eli5": "https://huggingface.co/yjernite/bart_eli5/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_bert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> BERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json",
<del> "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-config.json",
<del> "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json",
<del> "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-config.json",
<del> "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-config.json",
<del> "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-config.json",
<del> "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-config.json",
<del> "bert-base-german-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-config.json",
<del> "bert-large-uncased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-config.json",
<del> "bert-large-cased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-config.json",
<del> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-config.json",
<del> "bert-large-cased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-config.json",
<del> "bert-base-cased-finetuned-mrpc": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-config.json",
<del> "bert-base-german-dbmdz-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-config.json",
<del> "bert-base-german-dbmdz-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-config.json",
<del> "cl-tohoku/bert-base-japanese": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese/config.json",
<del> "cl-tohoku/bert-base-japanese-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-whole-word-masking/config.json",
<del> "cl-tohoku/bert-base-japanese-char": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-char/config.json",
<del> "cl-tohoku/bert-base-japanese-char-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-char-whole-word-masking/config.json",
<del> "TurkuNLP/bert-base-finnish-cased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-cased-v1/config.json",
<del> "TurkuNLP/bert-base-finnish-uncased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-uncased-v1/config.json",
<del> "wietsedv/bert-base-dutch-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/wietsedv/bert-base-dutch-cased/config.json",
<add> "bert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/config.json",
<add> "bert-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/config.json",
<add> "bert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/config.json",
<add> "bert-large-cased": "https://huggingface.co/bert-large-cased/resolve/main/config.json",
<add> "bert-base-multilingual-uncased": "https://huggingface.co/bert-base-multilingual-uncased/resolve/main/config.json",
<add> "bert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/config.json",
<add> "bert-base-chinese": "https://huggingface.co/bert-base-chinese/resolve/main/config.json",
<add> "bert-base-german-cased": "https://huggingface.co/bert-base-german-cased/resolve/main/config.json",
<add> "bert-large-uncased-whole-word-masking": "https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/config.json",
<add> "bert-large-cased-whole-word-masking": "https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/config.json",
<add> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/config.json",
<add> "bert-large-cased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/config.json",
<add> "bert-base-cased-finetuned-mrpc": "https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/config.json",
<add> "bert-base-german-dbmdz-cased": "https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/config.json",
<add> "bert-base-german-dbmdz-uncased": "https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/config.json",
<add> "cl-tohoku/bert-base-japanese": "https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/config.json",
<add> "cl-tohoku/bert-base-japanese-whole-word-masking": "https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/config.json",
<add> "cl-tohoku/bert-base-japanese-char": "https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/config.json",
<add> "cl-tohoku/bert-base-japanese-char-whole-word-masking": "https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/config.json",
<add> "TurkuNLP/bert-base-finnish-cased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/config.json",
<add> "TurkuNLP/bert-base-finnish-uncased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/config.json",
<add> "wietsedv/bert-base-dutch-cased": "https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/config.json",
<ide> # See all BERT models at https://huggingface.co/models?filter=bert
<ide> }
<ide>
<ide><path>src/transformers/configuration_camembert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> CAMEMBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "camembert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/camembert-base-config.json",
<del> "umberto-commoncrawl-cased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/Musixmatch/umberto-commoncrawl-cased-v1/config.json",
<del> "umberto-wikipedia-uncased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/Musixmatch/umberto-wikipedia-uncased-v1/config.json",
<add> "camembert-base": "https://huggingface.co/camembert-base/resolve/main/config.json",
<add> "umberto-commoncrawl-cased-v1": "https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1/resolve/main/config.json",
<add> "umberto-wikipedia-uncased-v1": "https://huggingface.co/Musixmatch/umberto-wikipedia-uncased-v1/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_ctrl.py
<ide>
<ide> logger = logging.get_logger(__name__)
<ide>
<del>CTRL_PRETRAINED_CONFIG_ARCHIVE_MAP = {"ctrl": "https://s3.amazonaws.com/models.huggingface.co/bert/ctrl-config.json"}
<add>CTRL_PRETRAINED_CONFIG_ARCHIVE_MAP = {"ctrl": "https://huggingface.co/ctrl/resolve/main/config.json"}
<ide>
<ide>
<ide> class CTRLConfig(PretrainedConfig):
<ide><path>src/transformers/configuration_deberta.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> DEBERTA_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "microsoft/deberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/deberta-base/config.json",
<del> "microsoft/deberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/deberta-large/config.json",
<add> "microsoft/deberta-base": "https://huggingface.co/microsoft/deberta-base/resolve/main/config.json",
<add> "microsoft/deberta-large": "https://huggingface.co/microsoft/deberta-large/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_distilbert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> DISTILBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "distilbert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-config.json",
<del> "distilbert-base-uncased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-distilled-squad-config.json",
<del> "distilbert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-cased-config.json",
<del> "distilbert-base-cased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-cased-distilled-squad-config.json",
<del> "distilbert-base-german-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-german-cased-config.json",
<del> "distilbert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-multilingual-cased-config.json",
<del> "distilbert-base-uncased-finetuned-sst-2-english": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-finetuned-sst-2-english-config.json",
<add> "distilbert-base-uncased": "https://huggingface.co/distilbert-base-uncased/resolve/main/config.json",
<add> "distilbert-base-uncased-distilled-squad": "https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/config.json",
<add> "distilbert-base-cased": "https://huggingface.co/distilbert-base-cased/resolve/main/config.json",
<add> "distilbert-base-cased-distilled-squad": "https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json",
<add> "distilbert-base-german-cased": "https://huggingface.co/distilbert-base-german-cased/resolve/main/config.json",
<add> "distilbert-base-multilingual-cased": "https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/config.json",
<add> "distilbert-base-uncased-finetuned-sst-2-english": "https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_dpr.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> DPR_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "facebook/dpr-ctx_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-ctx_encoder-single-nq-base/config.json",
<del> "facebook/dpr-question_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-question_encoder-single-nq-base/config.json",
<del> "facebook/dpr-reader-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-reader-single-nq-base/config.json",
<del> "facebook/dpr-ctx_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-ctx_encoder-multiset-base/config.json",
<del> "facebook/dpr-question_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-question_encoder-multiset-base/config.json",
<del> "facebook/dpr-reader-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/dpr-reader-multiset-base/config.json",
<add> "facebook/dpr-ctx_encoder-single-nq-base": "https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/config.json",
<add> "facebook/dpr-question_encoder-single-nq-base": "https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/config.json",
<add> "facebook/dpr-reader-single-nq-base": "https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/config.json",
<add> "facebook/dpr-ctx_encoder-multiset-base": "https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/config.json",
<add> "facebook/dpr-question_encoder-multiset-base": "https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/config.json",
<add> "facebook/dpr-reader-multiset-base": "https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_electra.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> ELECTRA_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "google/electra-small-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-generator/config.json",
<del> "google/electra-base-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-generator/config.json",
<del> "google/electra-large-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-generator/config.json",
<del> "google/electra-small-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-discriminator/config.json",
<del> "google/electra-base-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-discriminator/config.json",
<del> "google/electra-large-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-discriminator/config.json",
<add> "google/electra-small-generator": "https://huggingface.co/google/electra-small-generator/resolve/main/config.json",
<add> "google/electra-base-generator": "https://huggingface.co/google/electra-base-generator/resolve/main/config.json",
<add> "google/electra-large-generator": "https://huggingface.co/google/electra-large-generator/resolve/main/config.json",
<add> "google/electra-small-discriminator": "https://huggingface.co/google/electra-small-discriminator/resolve/main/config.json",
<add> "google/electra-base-discriminator": "https://huggingface.co/google/electra-base-discriminator/resolve/main/config.json",
<add> "google/electra-large-discriminator": "https://huggingface.co/google/electra-large-discriminator/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_flaubert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> FLAUBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "flaubert/flaubert_small_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_small_cased/config.json",
<del> "flaubert/flaubert_base_uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_uncased/config.json",
<del> "flaubert/flaubert_base_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_cased/config.json",
<del> "flaubert/flaubert_large_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_large_cased/config.json",
<add> "flaubert/flaubert_small_cased": "https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/config.json",
<add> "flaubert/flaubert_base_uncased": "https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/config.json",
<add> "flaubert/flaubert_base_cased": "https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/config.json",
<add> "flaubert/flaubert_large_cased": "https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_funnel.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> FUNNEL_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "funnel-transformer/small": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small/config.json",
<del> "funnel-transformer/small-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small-base/config.json",
<del> "funnel-transformer/medium": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium/config.json",
<del> "funnel-transformer/medium-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium-base/config.json",
<del> "funnel-transformer/intermediate": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate/config.json",
<del> "funnel-transformer/intermediate-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate-base/config.json",
<del> "funnel-transformer/large": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large/config.json",
<del> "funnel-transformer/large-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large-base/config.json",
<del> "funnel-transformer/xlarge": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge/config.json",
<del> "funnel-transformer/xlarge-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge-base/config.json",
<add> "funnel-transformer/small": "https://huggingface.co/funnel-transformer/small/resolve/main/config.json",
<add> "funnel-transformer/small-base": "https://huggingface.co/funnel-transformer/small-base/resolve/main/config.json",
<add> "funnel-transformer/medium": "https://huggingface.co/funnel-transformer/medium/resolve/main/config.json",
<add> "funnel-transformer/medium-base": "https://huggingface.co/funnel-transformer/medium-base/resolve/main/config.json",
<add> "funnel-transformer/intermediate": "https://huggingface.co/funnel-transformer/intermediate/resolve/main/config.json",
<add> "funnel-transformer/intermediate-base": "https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/config.json",
<add> "funnel-transformer/large": "https://huggingface.co/funnel-transformer/large/resolve/main/config.json",
<add> "funnel-transformer/large-base": "https://huggingface.co/funnel-transformer/large-base/resolve/main/config.json",
<add> "funnel-transformer/xlarge": "https://huggingface.co/funnel-transformer/xlarge/resolve/main/config.json",
<add> "funnel-transformer/xlarge-base": "https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_gpt2.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> GPT2_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-config.json",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-config.json",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-config.json",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-config.json",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-config.json",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/config.json",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/config.json",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/config.json",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/config.json",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_layoutlm.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> LAYOUTLM_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "layoutlm-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/layoutlm-base-uncased/config.json",
<del> "layoutlm-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/layoutlm-large-uncased/config.json",
<add> "layoutlm-base-uncased": "https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/config.json",
<add> "layoutlm-large-uncased": "https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_longformer.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> LONGFORMER_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "allenai/longformer-base-4096": "https://s3.amazonaws.com/models.huggingface.co/bert/allenai/longformer-base-4096/config.json",
<del> "allenai/longformer-large-4096": "https://s3.amazonaws.com/models.huggingface.co/bert/allenai/longformer-large-4096/config.json",
<del> "allenai/longformer-large-4096-finetuned-triviaqa": "https://s3.amazonaws.com/models.huggingface.co/bert/allenai/longformer-large-4096-finetuned-triviaqa/config.json",
<del> "allenai/longformer-base-4096-extra.pos.embd.only": "https://s3.amazonaws.com/models.huggingface.co/bert/allenai/longformer-base-4096-extra.pos.embd.only/config.json",
<del> "allenai/longformer-large-4096-extra.pos.embd.only": "https://s3.amazonaws.com/models.huggingface.co/bert/allenai/longformer-large-4096-extra.pos.embd.only/config.json",
<add> "allenai/longformer-base-4096": "https://huggingface.co/allenai/longformer-base-4096/resolve/main/config.json",
<add> "allenai/longformer-large-4096": "https://huggingface.co/allenai/longformer-large-4096/resolve/main/config.json",
<add> "allenai/longformer-large-4096-finetuned-triviaqa": "https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/config.json",
<add> "allenai/longformer-base-4096-extra.pos.embd.only": "https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/config.json",
<add> "allenai/longformer-large-4096-extra.pos.embd.only": "https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_marian.py
<ide>
<ide>
<ide> PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "Helsinki-NLP/opus-mt-en-de": "https://s3.amazonaws.com/models.huggingface.co/bert/Helsinki-NLP/opus-mt-en-de/config.json",
<add> "Helsinki-NLP/opus-mt-en-de": "https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_mbart.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> MBART_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "facebook/mbart-large-en-ro": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-en-ro/config.json",
<del> "facebook/mbart-large-cc25": "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-cc25/config.json",
<add> "facebook/mbart-large-en-ro": "https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/config.json",
<add> "facebook/mbart-large-cc25": "https://huggingface.co/facebook/mbart-large-cc25/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_mobilebert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> MOBILEBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "mobilebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/google/mobilebert-uncased/config.json"
<add> "mobilebert-uncased": "https://huggingface.co/google/mobilebert-uncased/resolve/main/config.json"
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_openai.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> OPENAI_GPT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-config.json"
<add> "openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/config.json"
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_prophetnet.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> PROPHETNET_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "microsoft/prophetnet-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/prophetnet-large-uncased/config.json",
<add> "microsoft/prophetnet-large-uncased": "https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_retribert.py
<ide>
<ide> # TODO: upload to AWS
<ide> RETRIBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "retribert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-config.json",
<add> "retribert-base-uncased": "https://huggingface.co/distilbert-base-uncased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_roberta.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> ROBERTA_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-config.json",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-config.json",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-config.json",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-config.json",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-openai-detector-config.json",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-openai-detector-config.json",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/config.json",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/config.json",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/config.json",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/config.json",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base-openai-detector/resolve/main/config.json",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large-openai-detector/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_squeezebert.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> SQUEEZEBERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "squeezebert/squeezebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-uncased/config.json",
<del> "squeezebert/squeezebert-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli/config.json",
<del> "squeezebert/squeezebert-mnli-headless": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli-headless/config.json",
<add> "squeezebert/squeezebert-uncased": "https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/config.json",
<add> "squeezebert/squeezebert-mnli": "https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/config.json",
<add> "squeezebert/squeezebert-mnli-headless": "https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_t5.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> T5_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "t5-small": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-small-config.json",
<del> "t5-base": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-base-config.json",
<del> "t5-large": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-large-config.json",
<del> "t5-3b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-3b-config.json",
<del> "t5-11b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-11b-config.json",
<add> "t5-small": "https://huggingface.co/t5-small/resolve/main/config.json",
<add> "t5-base": "https://huggingface.co/t5-base/resolve/main/config.json",
<add> "t5-large": "https://huggingface.co/t5-large/resolve/main/config.json",
<add> "t5-3b": "https://huggingface.co/t5-3b/resolve/main/config.json",
<add> "t5-11b": "https://huggingface.co/t5-11b/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_transfo_xl.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> TRANSFO_XL_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "transfo-xl-wt103": "https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-config.json",
<add> "transfo-xl-wt103": "https://huggingface.co/transfo-xl-wt103/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_xlm.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> XLM_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "xlm-mlm-en-2048": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-config.json",
<del> "xlm-mlm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-config.json",
<del> "xlm-mlm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-config.json",
<del> "xlm-mlm-enro-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-config.json",
<del> "xlm-mlm-tlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-config.json",
<del> "xlm-mlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-config.json",
<del> "xlm-clm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-enfr-1024-config.json",
<del> "xlm-clm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-ende-1024-config.json",
<del> "xlm-mlm-17-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-config.json",
<del> "xlm-mlm-100-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-config.json",
<add> "xlm-mlm-en-2048": "https://huggingface.co/xlm-mlm-en-2048/resolve/main/config.json",
<add> "xlm-mlm-ende-1024": "https://huggingface.co/xlm-mlm-ende-1024/resolve/main/config.json",
<add> "xlm-mlm-enfr-1024": "https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/config.json",
<add> "xlm-mlm-enro-1024": "https://huggingface.co/xlm-mlm-enro-1024/resolve/main/config.json",
<add> "xlm-mlm-tlm-xnli15-1024": "https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/config.json",
<add> "xlm-mlm-xnli15-1024": "https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/config.json",
<add> "xlm-clm-enfr-1024": "https://huggingface.co/xlm-clm-enfr-1024/resolve/main/config.json",
<add> "xlm-clm-ende-1024": "https://huggingface.co/xlm-clm-ende-1024/resolve/main/config.json",
<add> "xlm-mlm-17-1280": "https://huggingface.co/xlm-mlm-17-1280/resolve/main/config.json",
<add> "xlm-mlm-100-1280": "https://huggingface.co/xlm-mlm-100-1280/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_xlm_prophetnet.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> XLM_PROPHETNET_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "microsoft/xprophetnet-large-wiki100-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/xprophetnet-large-wiki100-cased/config.json",
<add> "microsoft/xprophetnet-large-wiki100-cased": "https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_xlm_roberta.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> XLM_ROBERTA_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "xlm-roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-base-config.json",
<del> "xlm-roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-config.json",
<del> "xlm-roberta-large-finetuned-conll02-dutch": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-dutch-config.json",
<del> "xlm-roberta-large-finetuned-conll02-spanish": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-spanish-config.json",
<del> "xlm-roberta-large-finetuned-conll03-english": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-english-config.json",
<del> "xlm-roberta-large-finetuned-conll03-german": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-german-config.json",
<add> "xlm-roberta-base": "https://huggingface.co/xlm-roberta-base/resolve/main/config.json",
<add> "xlm-roberta-large": "https://huggingface.co/xlm-roberta-large/resolve/main/config.json",
<add> "xlm-roberta-large-finetuned-conll02-dutch": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/config.json",
<add> "xlm-roberta-large-finetuned-conll02-spanish": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/config.json",
<add> "xlm-roberta-large-finetuned-conll03-english": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/config.json",
<add> "xlm-roberta-large-finetuned-conll03-german": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/configuration_xlnet.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> XLNET_PRETRAINED_CONFIG_ARCHIVE_MAP = {
<del> "xlnet-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-config.json",
<del> "xlnet-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-config.json",
<add> "xlnet-base-cased": "https://huggingface.co/xlnet-base-cased/resolve/main/config.json",
<add> "xlnet-large-cased": "https://huggingface.co/xlnet-large-cased/resolve/main/config.json",
<ide> }
<ide>
<ide>
<ide><path>src/transformers/tokenization_albert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "albert-base-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v1-spiece.model",
<del> "albert-large-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v1-spiece.model",
<del> "albert-xlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v1-spiece.model",
<del> "albert-xxlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v1-spiece.model",
<del> "albert-base-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v2-spiece.model",
<del> "albert-large-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v2-spiece.model",
<del> "albert-xlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v2-spiece.model",
<del> "albert-xxlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v2-spiece.model",
<add> "albert-base-v1": "https://huggingface.co/albert-base-v1/resolve/main/spiece.model",
<add> "albert-large-v1": "https://huggingface.co/albert-large-v1/resolve/main/spiece.model",
<add> "albert-xlarge-v1": "https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model",
<add> "albert-xxlarge-v1": "https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model",
<add> "albert-base-v2": "https://huggingface.co/albert-base-v2/resolve/main/spiece.model",
<add> "albert-large-v2": "https://huggingface.co/albert-large-v2/resolve/main/spiece.model",
<add> "albert-xlarge-v2": "https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model",
<add> "albert-xxlarge-v2": "https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_albert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "albert-base-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v1-spiece.model",
<del> "albert-large-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v1-spiece.model",
<del> "albert-xlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v1-spiece.model",
<del> "albert-xxlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v1-spiece.model",
<del> "albert-base-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v2-spiece.model",
<del> "albert-large-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v2-spiece.model",
<del> "albert-xlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v2-spiece.model",
<del> "albert-xxlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v2-spiece.model",
<add> "albert-base-v1": "https://huggingface.co/albert-base-v1/resolve/main/spiece.model",
<add> "albert-large-v1": "https://huggingface.co/albert-large-v1/resolve/main/spiece.model",
<add> "albert-xlarge-v1": "https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model",
<add> "albert-xxlarge-v1": "https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model",
<add> "albert-base-v2": "https://huggingface.co/albert-base-v2/resolve/main/spiece.model",
<add> "albert-large-v2": "https://huggingface.co/albert-large-v2/resolve/main/spiece.model",
<add> "albert-xlarge-v2": "https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model",
<add> "albert-xxlarge-v2": "https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model",
<ide> },
<ide> "tokenizer_file": {
<del> "albert-base-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v1-tokenizer.json",
<del> "albert-large-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v1-tokenizer.json",
<del> "albert-xlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v1-tokenizer.json",
<del> "albert-xxlarge-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v1-tokenizer.json",
<del> "albert-base-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-base-v2-tokenizer.json",
<del> "albert-large-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-large-v2-tokenizer.json",
<del> "albert-xlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xlarge-v2-tokenizer.json",
<del> "albert-xxlarge-v2": "https://s3.amazonaws.com/models.huggingface.co/bert/albert-xxlarge-v2-tokenizer.json",
<add> "albert-base-v1": "https://huggingface.co/albert-base-v1/resolve/main/tokenizer.json",
<add> "albert-large-v1": "https://huggingface.co/albert-large-v1/resolve/main/tokenizer.json",
<add> "albert-xlarge-v1": "https://huggingface.co/albert-xlarge-v1/resolve/main/tokenizer.json",
<add> "albert-xxlarge-v1": "https://huggingface.co/albert-xxlarge-v1/resolve/main/tokenizer.json",
<add> "albert-base-v2": "https://huggingface.co/albert-base-v2/resolve/main/tokenizer.json",
<add> "albert-large-v2": "https://huggingface.co/albert-large-v2/resolve/main/tokenizer.json",
<add> "albert-xlarge-v2": "https://huggingface.co/albert-xlarge-v2/resolve/main/tokenizer.json",
<add> "albert-xxlarge-v2": "https://huggingface.co/albert-xxlarge-v2/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_bart.py
<ide>
<ide>
<ide> # vocab and merges same as roberta
<del>vocab_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json"
<del>merges_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt"
<add>vocab_url = "https://huggingface.co/roberta-large/resolve/main/vocab.json"
<add>merges_url = "https://huggingface.co/roberta-large/resolve/main/merges.txt"
<ide> _all_bart_models = [
<ide> "facebook/bart-base",
<ide> "facebook/bart-large",
<ide><path>src/transformers/tokenization_bart_fast.py
<ide>
<ide>
<ide> # vocab and merges same as roberta
<del>vocab_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json"
<del>merges_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt"
<del>tokenizer_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-tokenizer.json"
<add>vocab_url = "https://huggingface.co/roberta-large/resolve/main/vocab.json"
<add>merges_url = "https://huggingface.co/roberta-large/resolve/main/merges.txt"
<add>tokenizer_url = "https://huggingface.co/roberta-large/resolve/main/tokenizer.json"
<ide> _all_bart_models = [
<ide> "facebook/bart-base",
<ide> "facebook/bart-large",
<ide><path>src/transformers/tokenization_bert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<del> "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt",
<del> "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt",
<del> "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt",
<del> "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt",
<del> "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt",
<add> "bert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "bert-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<add> "bert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/vocab.txt",
<add> "bert-large-cased": "https://huggingface.co/bert-large-cased/resolve/main/vocab.txt",
<add> "bert-base-multilingual-uncased": "https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt",
<add> "bert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt",
<add> "bert-base-chinese": "https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt",
<ide> "bert-base-german-cased": "https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-vocab.txt",
<del> "bert-large-uncased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-vocab.txt",
<del> "bert-large-cased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-vocab.txt",
<del> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-vocab.txt",
<del> "bert-large-cased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-vocab.txt",
<del> "bert-base-cased-finetuned-mrpc": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-vocab.txt",
<del> "bert-base-german-dbmdz-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-vocab.txt",
<del> "bert-base-german-dbmdz-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-vocab.txt",
<del> "TurkuNLP/bert-base-finnish-cased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-cased-v1/vocab.txt",
<del> "TurkuNLP/bert-base-finnish-uncased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-uncased-v1/vocab.txt",
<del> "wietsedv/bert-base-dutch-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/wietsedv/bert-base-dutch-cased/vocab.txt",
<add> "bert-large-uncased-whole-word-masking": "https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt",
<add> "bert-large-cased-whole-word-masking": "https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt",
<add> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt",
<add> "bert-large-cased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt",
<add> "bert-base-cased-finetuned-mrpc": "https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt",
<add> "bert-base-german-dbmdz-cased": "https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt",
<add> "bert-base-german-dbmdz-uncased": "https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt",
<add> "TurkuNLP/bert-base-finnish-cased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt",
<add> "TurkuNLP/bert-base-finnish-uncased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt",
<add> "wietsedv/bert-base-dutch-cased": "https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_bert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<del> "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt",
<del> "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt",
<del> "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt",
<del> "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt",
<del> "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt",
<add> "bert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "bert-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<add> "bert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/vocab.txt",
<add> "bert-large-cased": "https://huggingface.co/bert-large-cased/resolve/main/vocab.txt",
<add> "bert-base-multilingual-uncased": "https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt",
<add> "bert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt",
<add> "bert-base-chinese": "https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt",
<ide> "bert-base-german-cased": "https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-vocab.txt",
<del> "bert-large-uncased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-vocab.txt",
<del> "bert-large-cased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-vocab.txt",
<del> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-vocab.txt",
<del> "bert-large-cased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-vocab.txt",
<del> "bert-base-cased-finetuned-mrpc": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-vocab.txt",
<del> "bert-base-german-dbmdz-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-vocab.txt",
<del> "bert-base-german-dbmdz-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-vocab.txt",
<del> "TurkuNLP/bert-base-finnish-cased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-cased-v1/vocab.txt",
<del> "TurkuNLP/bert-base-finnish-uncased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-uncased-v1/vocab.txt",
<del> "wietsedv/bert-base-dutch-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/wietsedv/bert-base-dutch-cased/vocab.txt",
<add> "bert-large-uncased-whole-word-masking": "https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt",
<add> "bert-large-cased-whole-word-masking": "https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt",
<add> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt",
<add> "bert-large-cased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt",
<add> "bert-base-cased-finetuned-mrpc": "https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt",
<add> "bert-base-german-dbmdz-cased": "https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt",
<add> "bert-base-german-dbmdz-uncased": "https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt",
<add> "TurkuNLP/bert-base-finnish-cased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt",
<add> "TurkuNLP/bert-base-finnish-uncased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt",
<add> "wietsedv/bert-base-dutch-cased": "https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-tokenizer.json",
<del> "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-tokenizer.json",
<del> "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-tokenizer.json",
<del> "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-tokenizer.json",
<del> "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-tokenizer.json",
<del> "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-tokenizer.json",
<add> "bert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "bert-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json",
<add> "bert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json",
<add> "bert-large-cased": "https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json",
<add> "bert-base-multilingual-uncased": "https://huggingface.co/bert-base-multilingual-uncased/resolve/main/tokenizer.json",
<add> "bert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json",
<add> "bert-base-chinese": "https://huggingface.co/bert-base-chinese/resolve/main/tokenizer.json",
<ide> "bert-base-german-cased": "https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-tokenizer.json",
<del> "bert-large-uncased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-tokenizer.json",
<del> "bert-large-cased-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-tokenizer.json",
<del> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-tokenizer.json",
<del> "bert-large-cased-whole-word-masking-finetuned-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-tokenizer.json",
<del> "bert-base-cased-finetuned-mrpc": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-tokenizer.json",
<del> "bert-base-german-dbmdz-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-tokenizer.json",
<del> "bert-base-german-dbmdz-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-tokenizer.json",
<del> "TurkuNLP/bert-base-finnish-cased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-cased-v1/tokenizer.json",
<del> "TurkuNLP/bert-base-finnish-uncased-v1": "https://s3.amazonaws.com/models.huggingface.co/bert/TurkuNLP/bert-base-finnish-uncased-v1/tokenizer.json",
<del> "wietsedv/bert-base-dutch-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/wietsedv/bert-base-dutch-cased/tokenizer.json",
<add> "bert-large-uncased-whole-word-masking": "https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/tokenizer.json",
<add> "bert-large-cased-whole-word-masking": "https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/tokenizer.json",
<add> "bert-large-uncased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json",
<add> "bert-large-cased-whole-word-masking-finetuned-squad": "https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json",
<add> "bert-base-cased-finetuned-mrpc": "https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/tokenizer.json",
<add> "bert-base-german-dbmdz-cased": "https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/tokenizer.json",
<add> "bert-base-german-dbmdz-uncased": "https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/tokenizer.json",
<add> "TurkuNLP/bert-base-finnish-cased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/tokenizer.json",
<add> "TurkuNLP/bert-base-finnish-uncased-v1": "https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/tokenizer.json",
<add> "wietsedv/bert-base-dutch-cased": "https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_bert_generation.py
<ide> VOCAB_FILES_NAMES = {"vocab_file": "spiece.model"}
<ide>
<ide> tokenizer_url = (
<del> "https://s3.amazonaws.com/models.huggingface.co/bert/google/bert_for_seq_generation_L-24_bbc_encoder/spiece.model"
<add> "https://huggingface.co/google/bert_for_seq_generation_L-24_bbc_encoder/resolve/main/spiece.model"
<ide> )
<ide>
<ide>
<ide><path>src/transformers/tokenization_bert_japanese.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "cl-tohoku/bert-base-japanese": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese/vocab.txt",
<del> "cl-tohoku/bert-base-japanese-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-whole-word-masking/vocab.txt",
<del> "cl-tohoku/bert-base-japanese-char": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-char/vocab.txt",
<del> "cl-tohoku/bert-base-japanese-char-whole-word-masking": "https://s3.amazonaws.com/models.huggingface.co/bert/cl-tohoku/bert-base-japanese-char-whole-word-masking/vocab.txt",
<add> "cl-tohoku/bert-base-japanese": "https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/vocab.txt",
<add> "cl-tohoku/bert-base-japanese-whole-word-masking": "https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/vocab.txt",
<add> "cl-tohoku/bert-base-japanese-char": "https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/vocab.txt",
<add> "cl-tohoku/bert-base-japanese-char-whole-word-masking": "https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_bertweet.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "vinai/bertweet-base": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/bertweet-base/vocab.txt",
<add> "vinai/bertweet-base": "https://huggingface.co/vinai/bertweet-base/resolve/main/vocab.txt",
<ide> },
<ide> "merges_file": {
<del> "vinai/bertweet-base": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/bertweet-base/bpe.codes",
<add> "vinai/bertweet-base": "https://huggingface.co/vinai/bertweet-base/resolve/main/bpe.codes",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_camembert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "camembert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/camembert-base-sentencepiece.bpe.model",
<add> "camembert-base": "https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_camembert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "camembert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/camembert-base-sentencepiece.bpe.model",
<add> "camembert-base": "https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model",
<ide> },
<ide> "tokenizer_file": {
<del> "camembert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/camembert-base-tokenizer.json",
<add> "camembert-base": "https://huggingface.co/camembert-base/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_deberta.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "microsoft/deberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/deberta-base/bpe_encoder.bin",
<del> "microsoft/deberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/deberta-large/bpe_encoder.bin",
<add> "microsoft/deberta-base": "https://huggingface.co/microsoft/deberta-base/resolve/main/bpe_encoder.bin",
<add> "microsoft/deberta-large": "https://huggingface.co/microsoft/deberta-large/resolve/main/bpe_encoder.bin",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_distilbert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "distilbert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "distilbert-base-uncased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<del> "distilbert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt",
<del> "distilbert-base-cased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt",
<del> "distilbert-base-german-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-german-cased-vocab.txt",
<del> "distilbert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt",
<add> "distilbert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "distilbert-base-uncased-distilled-squad": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<add> "distilbert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/vocab.txt",
<add> "distilbert-base-cased-distilled-squad": "https://huggingface.co/bert-large-cased/resolve/main/vocab.txt",
<add> "distilbert-base-german-cased": "https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt",
<add> "distilbert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_distilbert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "distilbert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "distilbert-base-uncased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<del> "distilbert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt",
<del> "distilbert-base-cased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt",
<del> "distilbert-base-german-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-german-cased-vocab.txt",
<del> "distilbert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt",
<add> "distilbert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "distilbert-base-uncased-distilled-squad": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<add> "distilbert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/vocab.txt",
<add> "distilbert-base-cased-distilled-squad": "https://huggingface.co/bert-large-cased/resolve/main/vocab.txt",
<add> "distilbert-base-german-cased": "https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt",
<add> "distilbert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "distilbert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "distilbert-base-uncased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-tokenizer.json",
<del> "distilbert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-tokenizer.json",
<del> "distilbert-base-cased-distilled-squad": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-tokenizer.json",
<del> "distilbert-base-german-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-german-cased-tokenizer.json",
<del> "distilbert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-tokenizer.json",
<add> "distilbert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "distilbert-base-uncased-distilled-squad": "https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json",
<add> "distilbert-base-cased": "https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json",
<add> "distilbert-base-cased-distilled-squad": "https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json",
<add> "distilbert-base-german-cased": "https://huggingface.co/distilbert-base-german-cased/resolve/main/tokenizer.json",
<add> "distilbert-base-multilingual-cased": "https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_dpr.py
<ide>
<ide> CONTEXT_ENCODER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-ctx_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-ctx_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-ctx_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-ctx_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-ctx_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-ctx_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-ctx_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-ctx_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> QUESTION_ENCODER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-question_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-question_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-question_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-question_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-question_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-question_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-question_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-question_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> READER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-reader-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-reader-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-reader-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-reader-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-reader-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-reader-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-reader-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-reader-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_dpr_fast.py
<ide>
<ide> CONTEXT_ENCODER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-ctx_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-ctx_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-ctx_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-ctx_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-ctx_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-ctx_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-ctx_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-ctx_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> QUESTION_ENCODER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-question_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-question_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-question_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-question_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-question_encoder-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-question_encoder-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-question_encoder-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-question_encoder-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> READER_PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "facebook/dpr-reader-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "facebook/dpr-reader-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "facebook/dpr-reader-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "facebook/dpr-reader-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "facebook/dpr-reader-single-nq-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "facebook/dpr-reader-multiset-base": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "facebook/dpr-reader-single-nq-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "facebook/dpr-reader-multiset-base": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_electra.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "google/electra-small-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-generator/vocab.txt",
<del> "google/electra-base-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-generator/vocab.txt",
<del> "google/electra-large-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-generator/vocab.txt",
<del> "google/electra-small-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-discriminator/vocab.txt",
<del> "google/electra-base-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-discriminator/vocab.txt",
<del> "google/electra-large-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-discriminator/vocab.txt",
<add> "google/electra-small-generator": "https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt",
<add> "google/electra-base-generator": "https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt",
<add> "google/electra-large-generator": "https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt",
<add> "google/electra-small-discriminator": "https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt",
<add> "google/electra-base-discriminator": "https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt",
<add> "google/electra-large-discriminator": "https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_electra_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "google/electra-small-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-generator/vocab.txt",
<del> "google/electra-base-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-generator/vocab.txt",
<del> "google/electra-large-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-generator/vocab.txt",
<del> "google/electra-small-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-discriminator/vocab.txt",
<del> "google/electra-base-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-discriminator/vocab.txt",
<del> "google/electra-large-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-discriminator/vocab.txt",
<add> "google/electra-small-generator": "https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt",
<add> "google/electra-base-generator": "https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt",
<add> "google/electra-large-generator": "https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt",
<add> "google/electra-small-discriminator": "https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt",
<add> "google/electra-base-discriminator": "https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt",
<add> "google/electra-large-discriminator": "https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "google/electra-small-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-generator/tokenizer.json",
<del> "google/electra-base-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-generator/tokenizer.json",
<del> "google/electra-large-generator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-generator/tokenizer.json",
<del> "google/electra-small-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-small-discriminator/tokenizer.json",
<del> "google/electra-base-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-base-discriminator/tokenizer.json",
<del> "google/electra-large-discriminator": "https://s3.amazonaws.com/models.huggingface.co/bert/google/electra-large-discriminator/tokenizer.json",
<add> "google/electra-small-generator": "https://huggingface.co/google/electra-small-generator/resolve/main/tokenizer.json",
<add> "google/electra-base-generator": "https://huggingface.co/google/electra-base-generator/resolve/main/tokenizer.json",
<add> "google/electra-large-generator": "https://huggingface.co/google/electra-large-generator/resolve/main/tokenizer.json",
<add> "google/electra-small-discriminator": "https://huggingface.co/google/electra-small-discriminator/resolve/main/tokenizer.json",
<add> "google/electra-base-discriminator": "https://huggingface.co/google/electra-base-discriminator/resolve/main/tokenizer.json",
<add> "google/electra-large-discriminator": "https://huggingface.co/google/electra-large-discriminator/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_flaubert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "flaubert/flaubert_small_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_small_cased/vocab.json",
<del> "flaubert/flaubert_base_uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_uncased/vocab.json",
<del> "flaubert/flaubert_base_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_cased/vocab.json",
<del> "flaubert/flaubert_large_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_large_cased/vocab.json",
<add> "flaubert/flaubert_small_cased": "https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/vocab.json",
<add> "flaubert/flaubert_base_uncased": "https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/vocab.json",
<add> "flaubert/flaubert_base_cased": "https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/vocab.json",
<add> "flaubert/flaubert_large_cased": "https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "flaubert/flaubert_small_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_small_cased/merges.txt",
<del> "flaubert/flaubert_base_uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_uncased/merges.txt",
<del> "flaubert/flaubert_base_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_base_cased/merges.txt",
<del> "flaubert/flaubert_large_cased": "https://s3.amazonaws.com/models.huggingface.co/bert/flaubert/flaubert_large_cased/merges.txt",
<add> "flaubert/flaubert_small_cased": "https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/merges.txt",
<add> "flaubert/flaubert_base_uncased": "https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/merges.txt",
<add> "flaubert/flaubert_base_cased": "https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/merges.txt",
<add> "flaubert/flaubert_large_cased": "https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/merges.txt",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_funnel.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "funnel-transformer/small": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small/vocab.txt",
<del> "funnel-transformer/small-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small-base/vocab.txt",
<del> "funnel-transformer/medium": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium/vocab.txt",
<del> "funnel-transformer/medium-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium-base/vocab.txt",
<del> "funnel-transformer/intermediate": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate/vocab.txt",
<del> "funnel-transformer/intermediate-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate-base/vocab.txt",
<del> "funnel-transformer/large": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large/vocab.txt",
<del> "funnel-transformer/large-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large-base/vocab.txt",
<del> "funnel-transformer/xlarge": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge/vocab.txt",
<del> "funnel-transformer/xlarge-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge-base/vocab.txt",
<add> "funnel-transformer/small": "https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt",
<add> "funnel-transformer/small-base": "https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt",
<add> "funnel-transformer/medium": "https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt",
<add> "funnel-transformer/medium-base": "https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt",
<add> "funnel-transformer/intermediate": "https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt",
<add> "funnel-transformer/intermediate-base": "https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt",
<add> "funnel-transformer/large": "https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt",
<add> "funnel-transformer/large-base": "https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt",
<add> "funnel-transformer/xlarge": "https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt",
<add> "funnel-transformer/xlarge-base": "https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide> PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = {f"funnel-transformer/{name}": 512 for name in _model_names}
<ide><path>src/transformers/tokenization_funnel_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "funnel-transformer/small": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small/vocab.txt",
<del> "funnel-transformer/small-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small-base/vocab.txt",
<del> "funnel-transformer/medium": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium/vocab.txt",
<del> "funnel-transformer/medium-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium-base/vocab.txt",
<del> "funnel-transformer/intermediate": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate/vocab.txt",
<del> "funnel-transformer/intermediate-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate-base/vocab.txt",
<del> "funnel-transformer/large": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large/vocab.txt",
<del> "funnel-transformer/large-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large-base/vocab.txt",
<del> "funnel-transformer/xlarge": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge/vocab.txt",
<del> "funnel-transformer/xlarge-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge-base/vocab.txt",
<add> "funnel-transformer/small": "https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt",
<add> "funnel-transformer/small-base": "https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt",
<add> "funnel-transformer/medium": "https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt",
<add> "funnel-transformer/medium-base": "https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt",
<add> "funnel-transformer/intermediate": "https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt",
<add> "funnel-transformer/intermediate-base": "https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt",
<add> "funnel-transformer/large": "https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt",
<add> "funnel-transformer/large-base": "https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt",
<add> "funnel-transformer/xlarge": "https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt",
<add> "funnel-transformer/xlarge-base": "https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "funnel-transformer/small": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small/tokenizer.json",
<del> "funnel-transformer/small-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/small-base/tokenizer.json",
<del> "funnel-transformer/medium": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium/tokenizer.json",
<del> "funnel-transformer/medium-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/medium-base/tokenizer.json",
<del> "funnel-transformer/intermediate": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate/tokenizer.json",
<del> "funnel-transformer/intermediate-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/intermediate-base/tokenizer.json",
<del> "funnel-transformer/large": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large/tokenizer.json",
<del> "funnel-transformer/large-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/large-base/tokenizer.json",
<del> "funnel-transformer/xlarge": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge/tokenizer.json",
<del> "funnel-transformer/xlarge-base": "https://s3.amazonaws.com/models.huggingface.co/bert/funnel-transformer/xlarge-base/tokenizer.json",
<add> "funnel-transformer/small": "https://huggingface.co/funnel-transformer/small/resolve/main/tokenizer.json",
<add> "funnel-transformer/small-base": "https://huggingface.co/funnel-transformer/small-base/resolve/main/tokenizer.json",
<add> "funnel-transformer/medium": "https://huggingface.co/funnel-transformer/medium/resolve/main/tokenizer.json",
<add> "funnel-transformer/medium-base": "https://huggingface.co/funnel-transformer/medium-base/resolve/main/tokenizer.json",
<add> "funnel-transformer/intermediate": "https://huggingface.co/funnel-transformer/intermediate/resolve/main/tokenizer.json",
<add> "funnel-transformer/intermediate-base": "https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/tokenizer.json",
<add> "funnel-transformer/large": "https://huggingface.co/funnel-transformer/large/resolve/main/tokenizer.json",
<add> "funnel-transformer/large-base": "https://huggingface.co/funnel-transformer/large-base/resolve/main/tokenizer.json",
<add> "funnel-transformer/xlarge": "https://huggingface.co/funnel-transformer/xlarge/resolve/main/tokenizer.json",
<add> "funnel-transformer/xlarge-base": "https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = {f"funnel-transformer/{name}": 512 for name in _model_names}
<ide><path>src/transformers/tokenization_gpt2.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-vocab.json",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-vocab.json",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-vocab.json",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-vocab.json",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/vocab.json",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/vocab.json",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/vocab.json",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/vocab.json",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-merges.txt",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-merges.txt",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-merges.txt",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-merges.txt",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/merges.txt",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/merges.txt",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/merges.txt",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/merges.txt",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/merges.txt",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_gpt2_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-vocab.json",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-vocab.json",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-vocab.json",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-vocab.json",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/vocab.json",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/vocab.json",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/vocab.json",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/vocab.json",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-merges.txt",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-merges.txt",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-merges.txt",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-merges.txt",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/merges.txt",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/merges.txt",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/merges.txt",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/merges.txt",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/merges.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "gpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-tokenizer.json",
<del> "gpt2-medium": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-tokenizer.json",
<del> "gpt2-large": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-tokenizer.json",
<del> "gpt2-xl": "https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-xl-tokenizer.json",
<del> "distilgpt2": "https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-tokenizer.json",
<add> "gpt2": "https://huggingface.co/gpt2/resolve/main/tokenizer.json",
<add> "gpt2-medium": "https://huggingface.co/gpt2-medium/resolve/main/tokenizer.json",
<add> "gpt2-large": "https://huggingface.co/gpt2-large/resolve/main/tokenizer.json",
<add> "gpt2-xl": "https://huggingface.co/gpt2-xl/resolve/main/tokenizer.json",
<add> "distilgpt2": "https://huggingface.co/distilgpt2/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_layoutlm.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "microsoft/layoutlm-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "microsoft/layoutlm-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<add> "microsoft/layoutlm-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "microsoft/layoutlm-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_layoutlm_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "microsoft/layoutlm-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<del> "microsoft/layoutlm-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
<add> "microsoft/layoutlm-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<add> "microsoft/layoutlm-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "microsoft/layoutlm-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<del> "microsoft/layoutlm-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-tokenizer.json",
<add> "microsoft/layoutlm-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<add> "microsoft/layoutlm-large-uncased": "https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_longformer.py
<ide>
<ide>
<ide> # vocab and merges same as roberta
<del>vocab_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json"
<del>merges_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt"
<add>vocab_url = "https://huggingface.co/roberta-large/resolve/main/vocab.json"
<add>merges_url = "https://huggingface.co/roberta-large/resolve/main/merges.txt"
<ide> _all_longformer_models = [
<ide> "allenai/longformer-base-4096",
<ide> "allenai/longformer-large-4096",
<ide><path>src/transformers/tokenization_longformer_fast.py
<ide>
<ide>
<ide> # vocab and merges same as roberta
<del>vocab_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json"
<del>merges_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt"
<del>tokenizer_url = "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-tokenizer.json"
<add>vocab_url = "https://huggingface.co/roberta-large/resolve/main/vocab.json"
<add>merges_url = "https://huggingface.co/roberta-large/resolve/main/merges.txt"
<add>tokenizer_url = "https://huggingface.co/roberta-large/resolve/main/tokenizer.json"
<ide> _all_longformer_models = [
<ide> "allenai/longformer-base-4096",
<ide> "allenai/longformer-large-4096",
<ide><path>src/transformers/tokenization_lxmert.py
<ide> ####################################################
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "unc-nlp/lxmert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "unc-nlp/lxmert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_lxmert_fast.py
<ide> ####################################################
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "unc-nlp/lxmert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "unc-nlp/lxmert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "unc-nlp/lxmert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "unc-nlp/lxmert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_marian.py
<ide> PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = {"Helsinki-NLP/opus-mt-en-de": 512}
<ide> PRETRAINED_INIT_CONFIGURATION = {}
<ide>
<del># Example URL https://s3.amazonaws.com/models.huggingface.co/bert/Helsinki-NLP/opus-mt-en-de/vocab.json
<add># Example URL https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/vocab.json
<ide>
<ide>
<ide> class MarianTokenizer(PreTrainedTokenizer):
<ide><path>src/transformers/tokenization_mbart.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> _all_mbart_models = ["facebook/mbart-large-en-ro", "facebook/mbart-large-cc25"]
<del>SPM_URL = "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-en-ro/sentence.bpe.model"
<add>SPM_URL = "https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentence.bpe.model"
<ide>
<ide> FAIRSEQ_LANGUAGE_CODES = [
<ide> "ar_AR",
<ide><path>src/transformers/tokenization_mbart_fast.py
<ide> logger = logging.get_logger(__name__)
<ide>
<ide> _all_mbart_models = ["facebook/mbart-large-en-ro", "facebook/mbart-large-cc25"]
<del>SPM_URL = "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-en-ro/sentence.bpe.model"
<del>tokenizer_URL = "https://s3.amazonaws.com/models.huggingface.co/bert/facebook/mbart-large-en-ro/tokenizer.json"
<add>SPM_URL = "https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentence.bpe.model"
<add>tokenizer_URL = "https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/tokenizer.json"
<ide>
<ide> FAIRSEQ_LANGUAGE_CODES = [
<ide> "ar_AR",
<ide><path>src/transformers/tokenization_mobilebert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "mobilebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/google/mobilebert-uncased/vocab.txt"
<add> "mobilebert-uncased": "https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt"
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_mobilebert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "mobilebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/google/mobilebert-uncased/vocab.txt"
<add> "mobilebert-uncased": "https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt"
<ide> },
<ide> "tokenizer_file": {
<del> "mobilebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/google/mobilebert-uncased/tokenizer.json"
<add> "mobilebert-uncased": "https://huggingface.co/google/mobilebert-uncased/resolve/main/tokenizer.json"
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_openai.py
<ide> }
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<del> "vocab_file": {"openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-vocab.json"},
<del> "merges_file": {"openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-merges.txt"},
<add> "vocab_file": {"openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/vocab.json"},
<add> "merges_file": {"openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/merges.txt"},
<ide> }
<ide>
<ide> PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = {
<ide><path>src/transformers/tokenization_openai_fast.py
<ide> VOCAB_FILES_NAMES = {"vocab_file": "vocab.json", "merges_file": "merges.txt", "tokenizer_file": "tokenizer.json"}
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<del> "vocab_file": {"openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-vocab.json"},
<del> "merges_file": {"openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-merges.txt"},
<del> "tokenizer_file": {"openai-gpt": "https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-tokenizer.json"},
<add> "vocab_file": {"openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/vocab.json"},
<add> "merges_file": {"openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/merges.txt"},
<add> "tokenizer_file": {"openai-gpt": "https://huggingface.co/openai-gpt/resolve/main/tokenizer.json"},
<ide> }
<ide>
<ide> PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = {
<ide><path>src/transformers/tokenization_phobert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "vinai/phobert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/phobert-base/vocab.txt",
<del> "vinai/phobert-large": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/phobert-large/vocab.txt",
<add> "vinai/phobert-base": "https://huggingface.co/vinai/phobert-base/resolve/main/vocab.txt",
<add> "vinai/phobert-large": "https://huggingface.co/vinai/phobert-large/resolve/main/vocab.txt",
<ide> },
<ide> "merges_file": {
<del> "vinai/phobert-base": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/phobert-base/bpe.codes",
<del> "vinai/phobert-large": "https://s3.amazonaws.com/models.huggingface.co/bert/vinai/phobert-large/bpe.codes",
<add> "vinai/phobert-base": "https://huggingface.co/vinai/phobert-base/resolve/main/bpe.codes",
<add> "vinai/phobert-large": "https://huggingface.co/vinai/phobert-large/resolve/main/bpe.codes",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_prophetnet.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "microsoft/prophetnet-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/prophetnet-large-uncased/prophetnet.tokenizer",
<add> "microsoft/prophetnet-large-uncased": "https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/prophetnet.tokenizer",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_retribert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "yjernite/retribert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "yjernite/retribert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_retribert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "yjernite/retribert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
<add> "yjernite/retribert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "yjernite/retribert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-tokenizer.json",
<add> "yjernite/retribert-base-uncased": "https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_roberta.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-vocab.json",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-vocab.json",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-vocab.json",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-vocab.json",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/vocab.json",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/vocab.json",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/vocab.json",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base/resolve/main/vocab.json",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-merges.txt",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-merges.txt",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-merges.txt",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-merges.txt",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/merges.txt",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/merges.txt",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/merges.txt",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base/resolve/main/merges.txt",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large/resolve/main/merges.txt",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_roberta_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-vocab.json",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-vocab.json",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-vocab.json",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-vocab.json",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/vocab.json",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/vocab.json",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/vocab.json",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base/resolve/main/vocab.json",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-merges.txt",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-merges.txt",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-merges.txt",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-merges.txt",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/merges.txt",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/merges.txt",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/merges.txt",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base/resolve/main/merges.txt",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large/resolve/main/merges.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-tokenizer.json",
<del> "roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-tokenizer.json",
<del> "roberta-large-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-tokenizer.json",
<del> "distilroberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/distilroberta-base-tokenizer.json",
<del> "roberta-base-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-tokenizer.json",
<del> "roberta-large-openai-detector": "https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-tokenizer.json",
<add> "roberta-base": "https://huggingface.co/roberta-base/resolve/main/tokenizer.json",
<add> "roberta-large": "https://huggingface.co/roberta-large/resolve/main/tokenizer.json",
<add> "roberta-large-mnli": "https://huggingface.co/roberta-large-mnli/resolve/main/tokenizer.json",
<add> "distilroberta-base": "https://huggingface.co/distilroberta-base/resolve/main/tokenizer.json",
<add> "roberta-base-openai-detector": "https://huggingface.co/roberta-base/resolve/main/tokenizer.json",
<add> "roberta-large-openai-detector": "https://huggingface.co/roberta-large/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_squeezebert.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "squeezebert/squeezebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-uncased/vocab.txt",
<del> "squeezebert/squeezebert-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli/vocab.txt",
<del> "squeezebert/squeezebert-mnli-headless": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli-headless/vocab.txt",
<add> "squeezebert/squeezebert-uncased": "https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt",
<add> "squeezebert/squeezebert-mnli": "https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt",
<add> "squeezebert/squeezebert-mnli-headless": "https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_squeezebert_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "squeezebert/squeezebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-uncased/vocab.txt",
<del> "squeezebert/squeezebert-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli/vocab.txt",
<del> "squeezebert/squeezebert-mnli-headless": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli-headless/vocab.txt",
<add> "squeezebert/squeezebert-uncased": "https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt",
<add> "squeezebert/squeezebert-mnli": "https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt",
<add> "squeezebert/squeezebert-mnli-headless": "https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt",
<ide> },
<ide> "tokenizer_file": {
<del> "squeezebert/squeezebert-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-uncased/tokenizer.json",
<del> "squeezebert/squeezebert-mnli": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli/tokenizer.json",
<del> "squeezebert/squeezebert-mnli-headless": "https://s3.amazonaws.com/models.huggingface.co/bert/squeezebert/squeezebert-mnli-headless/tokenizer.json",
<add> "squeezebert/squeezebert-uncased": "https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/tokenizer.json",
<add> "squeezebert/squeezebert-mnli": "https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/tokenizer.json",
<add> "squeezebert/squeezebert-mnli-headless": "https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_t5.py
<ide> ####################################################
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "t5-small": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-base": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-large": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-3b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-11b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<add> "t5-small": "https://huggingface.co/t5-small/resolve/main/spiece.model",
<add> "t5-base": "https://huggingface.co/t5-base/resolve/main/spiece.model",
<add> "t5-large": "https://huggingface.co/t5-large/resolve/main/spiece.model",
<add> "t5-3b": "https://huggingface.co/t5-3b/resolve/main/spiece.model",
<add> "t5-11b": "https://huggingface.co/t5-11b/resolve/main/spiece.model",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_t5_fast.py
<ide> ####################################################
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "t5-small": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-base": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-large": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-3b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<del> "t5-11b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-spiece.model",
<add> "t5-small": "https://huggingface.co/t5-small/resolve/main/spiece.model",
<add> "t5-base": "https://huggingface.co/t5-base/resolve/main/spiece.model",
<add> "t5-large": "https://huggingface.co/t5-large/resolve/main/spiece.model",
<add> "t5-3b": "https://huggingface.co/t5-3b/resolve/main/spiece.model",
<add> "t5-11b": "https://huggingface.co/t5-11b/resolve/main/spiece.model",
<ide> },
<ide> "tokenizer_file": {
<del> "t5-small": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-tokenizer.json",
<del> "t5-base": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-tokenizer.json",
<del> "t5-large": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-tokenizer.json",
<del> "t5-3b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-tokenizer.json",
<del> "t5-11b": "https://s3.amazonaws.com/models.huggingface.co/bert/t5-tokenizer.json",
<add> "t5-small": "https://huggingface.co/t5-small/resolve/main/tokenizer.json",
<add> "t5-base": "https://huggingface.co/t5-base/resolve/main/tokenizer.json",
<add> "t5-large": "https://huggingface.co/t5-large/resolve/main/tokenizer.json",
<add> "t5-3b": "https://huggingface.co/t5-3b/resolve/main/tokenizer.json",
<add> "t5-11b": "https://huggingface.co/t5-11b/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_transfo_xl.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "pretrained_vocab_file": {
<del> "transfo-xl-wt103": "https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-vocab.pkl",
<add> "transfo-xl-wt103": "https://huggingface.co/transfo-xl-wt103/resolve/main/vocab.pkl",
<ide> }
<ide> }
<ide>
<ide> }
<ide>
<ide> PRETRAINED_CORPUS_ARCHIVE_MAP = {
<del> "transfo-xl-wt103": "https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-corpus.bin",
<add> "transfo-xl-wt103": "https://huggingface.co/transfo-xl-wt103/resolve/main/corpus.bin",
<ide> }
<ide> CORPUS_NAME = "corpus.bin"
<ide>
<ide><path>src/transformers/tokenization_xlm.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "xlm-mlm-en-2048": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-vocab.json",
<del> "xlm-mlm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-vocab.json",
<del> "xlm-mlm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-vocab.json",
<del> "xlm-mlm-enro-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-vocab.json",
<del> "xlm-mlm-tlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-vocab.json",
<del> "xlm-mlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-vocab.json",
<del> "xlm-clm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-enfr-1024-vocab.json",
<del> "xlm-clm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-ende-1024-vocab.json",
<del> "xlm-mlm-17-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-vocab.json",
<del> "xlm-mlm-100-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-vocab.json",
<add> "xlm-mlm-en-2048": "https://huggingface.co/xlm-mlm-en-2048/resolve/main/vocab.json",
<add> "xlm-mlm-ende-1024": "https://huggingface.co/xlm-mlm-ende-1024/resolve/main/vocab.json",
<add> "xlm-mlm-enfr-1024": "https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/vocab.json",
<add> "xlm-mlm-enro-1024": "https://huggingface.co/xlm-mlm-enro-1024/resolve/main/vocab.json",
<add> "xlm-mlm-tlm-xnli15-1024": "https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/vocab.json",
<add> "xlm-mlm-xnli15-1024": "https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/vocab.json",
<add> "xlm-clm-enfr-1024": "https://huggingface.co/xlm-clm-enfr-1024/resolve/main/vocab.json",
<add> "xlm-clm-ende-1024": "https://huggingface.co/xlm-clm-ende-1024/resolve/main/vocab.json",
<add> "xlm-mlm-17-1280": "https://huggingface.co/xlm-mlm-17-1280/resolve/main/vocab.json",
<add> "xlm-mlm-100-1280": "https://huggingface.co/xlm-mlm-100-1280/resolve/main/vocab.json",
<ide> },
<ide> "merges_file": {
<del> "xlm-mlm-en-2048": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-merges.txt",
<del> "xlm-mlm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-merges.txt",
<del> "xlm-mlm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-merges.txt",
<del> "xlm-mlm-enro-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-merges.txt",
<del> "xlm-mlm-tlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-merges.txt",
<del> "xlm-mlm-xnli15-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-merges.txt",
<del> "xlm-clm-enfr-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-merges.txt",
<del> "xlm-clm-ende-1024": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-merges.txt",
<del> "xlm-mlm-17-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-merges.txt",
<del> "xlm-mlm-100-1280": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-merges.txt",
<add> "xlm-mlm-en-2048": "https://huggingface.co/xlm-mlm-en-2048/resolve/main/merges.txt",
<add> "xlm-mlm-ende-1024": "https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt",
<add> "xlm-mlm-enfr-1024": "https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt",
<add> "xlm-mlm-enro-1024": "https://huggingface.co/xlm-mlm-enro-1024/resolve/main/merges.txt",
<add> "xlm-mlm-tlm-xnli15-1024": "https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/merges.txt",
<add> "xlm-mlm-xnli15-1024": "https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/merges.txt",
<add> "xlm-clm-enfr-1024": "https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt",
<add> "xlm-clm-ende-1024": "https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt",
<add> "xlm-mlm-17-1280": "https://huggingface.co/xlm-mlm-17-1280/resolve/main/merges.txt",
<add> "xlm-mlm-100-1280": "https://huggingface.co/xlm-mlm-100-1280/resolve/main/merges.txt",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_xlm_roberta.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "xlm-roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-base-sentencepiece.bpe.model",
<del> "xlm-roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll02-dutch": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-dutch-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll02-spanish": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-spanish-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll03-english": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-english-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll03-german": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-german-sentencepiece.bpe.model",
<add> "xlm-roberta-base": "https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large": "https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll02-dutch": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll02-spanish": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll03-english": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll03-german": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_xlm_roberta_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "xlm-roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-base-sentencepiece.bpe.model",
<del> "xlm-roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll02-dutch": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-dutch-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll02-spanish": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-spanish-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll03-english": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-english-sentencepiece.bpe.model",
<del> "xlm-roberta-large-finetuned-conll03-german": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-german-sentencepiece.bpe.model",
<add> "xlm-roberta-base": "https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large": "https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll02-dutch": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll02-spanish": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll03-english": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model",
<add> "xlm-roberta-large-finetuned-conll03-german": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model",
<ide> },
<ide> "tokenizer_file": {
<del> "xlm-roberta-base": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-base-tokenizer.json",
<del> "xlm-roberta-large": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-tokenizer.json",
<del> "xlm-roberta-large-finetuned-conll02-dutch": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-dutch-tokenizer.json",
<del> "xlm-roberta-large-finetuned-conll02-spanish": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll02-spanish-tokenizer.json",
<del> "xlm-roberta-large-finetuned-conll03-english": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-english-tokenizer.json",
<del> "xlm-roberta-large-finetuned-conll03-german": "https://s3.amazonaws.com/models.huggingface.co/bert/xlm-roberta-large-finetuned-conll03-german-tokenizer.json",
<add> "xlm-roberta-base": "https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json",
<add> "xlm-roberta-large": "https://huggingface.co/xlm-roberta-large/resolve/main/tokenizer.json",
<add> "xlm-roberta-large-finetuned-conll02-dutch": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/tokenizer.json",
<add> "xlm-roberta-large-finetuned-conll02-spanish": "https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/tokenizer.json",
<add> "xlm-roberta-large-finetuned-conll03-english": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/tokenizer.json",
<add> "xlm-roberta-large-finetuned-conll03-german": "https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide>
<ide><path>src/transformers/tokenization_xlnet.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "xlnet-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-spiece.model",
<del> "xlnet-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-spiece.model",
<add> "xlnet-base-cased": "https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model",
<add> "xlnet-large-cased": "https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model",
<ide> }
<ide> }
<ide>
<ide><path>src/transformers/tokenization_xlnet_fast.py
<ide>
<ide> PRETRAINED_VOCAB_FILES_MAP = {
<ide> "vocab_file": {
<del> "xlnet-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-spiece.model",
<del> "xlnet-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-spiece.model",
<add> "xlnet-base-cased": "https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model",
<add> "xlnet-large-cased": "https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model",
<ide> },
<ide> "tokenizer_file": {
<del> "xlnet-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-tokenizer.json",
<del> "xlnet-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-tokenizer.json",
<add> "xlnet-base-cased": "https://huggingface.co/xlnet-base-cased/resolve/main/tokenizer.json",
<add> "xlnet-large-cased": "https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json",
<ide> },
<ide> }
<ide> | 81 |
Python | Python | fix typo in mark_as_failure | 98fdcd749b0c4d3ec1ad0cfae058d193595413e1 | <ide><path>celery/backends/base.py
<ide> def mark_as_failure(self, task_id, exc,
<ide> # elements of the chain. This is only truly important so
<ide> # that the last chain element which controls completion of
<ide> # the chain itself is marked as completed to avoid stalls.
<del> if self.store_result and state in states.PROPAGATE_STATES:
<add> if store_result and state in states.PROPAGATE_STATES:
<ide> try:
<ide> chained_task_id = chain_elem_opts['task_id']
<ide> except KeyError: | 1 |
Python | Python | add multiple tokenizer exceptions for danish | 056547e989ae8d2d5633cf901d9dc376d32cdcfe | <ide><path>spacy/lang/da/tokenizer_exceptions.py
<ide> # encoding: utf8
<add>"""
<add>Tokenizer Exceptions.
<add>Source: https://forkortelse.dk/ and various others.
<add>"""
<add>
<ide> from __future__ import unicode_literals
<ide>
<ide> from ...symbols import ORTH, LEMMA, NORM, TAG, ADP, PUNCT
<ide> _exc[exc_data[ORTH]] = [exc_data]
<ide>
<ide> for orth in [
<del> "A/S", "beg.", "bl.a.", "ca.", "d.s.s.", "dvs.", "f.eks.", "fr.", "hhv.",
<del> "if.", "iflg.", "m.a.o.", "mht.", "min.", "osv.", "pga.", "resp.", "self.",
<del> "t.o.m.", "vha.", ""]:
<add> "A.D.", "A/S", "aarh.", "ac.", "adj.", "adr.", "adsk.", "adv.", "afb.",
<add> "afd.", "afg.", "afk.", "afs.", "aht.", "alg.", "alk.", "alm.", "amer.",
<add> "ang.", "ank.", "anl.", "anv.", "arb.", "arr.", "att.", "B.C.", "bd.",
<add> "bdt.", "beg.", "begr.", "beh.", "bet.", "bev.", "bhk.", "bib.",
<add> "bibl.", "bidr.", "bildl.", "bill.", "bio.", "biol.", "bk.", "BK.",
<add> "bl.", "bl.a.", "borgm.", "bot.", "Boul.", "br.", "brolægn.", "bto.",
<add> "bygn.", "ca.", "cand.", "Chr.", "d.", "d.d.", "d.m.", "d.s.", "d.s.s.",
<add> "d.y.", "d.å.", "d.æ.", "da.", "dagl.", "dat.", "dav.", "def.", "dek.",
<add> "dep.", "desl.", "diam.", "dir.", "disp.", "distr.", "div.", "dkr.",
<add> "dl.", "do.", "dobb.", "Dr.", "dr.h.c", "Dronn.", "ds.", "dvs.", "e.b.",
<add> "e.l.", "e.o.", "e.v.t.", "eftf.", "eftm.", "eg.", "egl.", "eks.",
<add> "eksam.", "ekskl.", "eksp.", "ekspl.", "el.", "el.lign.", "emer.",
<add> "endv.", "eng.", "enk.", "etc.", "etym.", "eur.", "evt.", "exam.", "f.",
<add> "f.eks.", "f.m.", "f.n.", "f.o.", "f.o.m.", "f.s.v.", "f.t.", "f.v.t.",
<add> "f.å.", "fa.", "fakt.", "fam.", "fem.", "ff.", "fg.", "fhv.", "fig.",
<add> "filol.", "filos.", "fl.", "flg.", "fm.", "fmd.", "fol.", "forb.",
<add> "foreg.", "foren.", "forf.", "fork.", "form.", "forr.", "fors.",
<add> "forsk.", "forts.", "fr.", "fr.u.", "frk.", "fsva.", "fuldm.", "fung.",
<add> "fx.", "fys.", "fær.", "g.d.", "g.m.", "gd.", "gdr.", "genuds.", "gl.",
<add> "gn.", "gns.", "gr.", "grdl.", "gross.", "h.a.", "h.c.", "H.K.H.",
<add> "H.M.", "hdl.", "henv.", "Hf.", "hhv.", "hj.hj.", "hj.spl.", "hort.",
<add> "hosp.", "hpl.", "Hr.", "hr.", "hrs.", "hum.", "hvp.", "i/s", "I/S",
<add> "i.e.", "ib.", "id.", "if.", "iflg.", "ifm.", "ift.", "iht.", "ill.",
<add> "indb.", "indreg.", "inf.", "ing.", "inh.", "inj.", "inkl.", "insp.",
<add> "instr.", "isl.", "istf.", "it.", "ital.", "iv.", "jap.", "jf.", "jfr.",
<add> "jnr.", "j.nr.", "jr.", "jur.", "jvf.", "K.", "kap.", "kat.", "kbh.",
<add> "kem.", "kgl.", "kl.", "kld.", "knsp.", "komm.", "kons.", "korr.",
<add> "kp.", "Kprs.", "kr.", "kst.", "kt.", "ktr.", "kv.", "kvt.", "l.",
<add> "L.A.", "l.c.", "lab.", "lat.", "lb.m.", "lb.nr.", "lejl.", "lgd.",
<add> "lic.", "lign.", "lin.", "ling.merc.", "litt.", "Ll.", "loc.cit.",
<add> "lok.", "lrs.", "ltr.", "m/s", "M/S", "m.a.o.", "m.fl.", "m.m.", "m.v.",
<add> "m.v.h.", "Mag.", "maks.", "md.", "mdr.", "mdtl.", "mezz.", "mfl.",
<add> "m.h.p.", "m.h.t", "mht.", "mik.", "min.", "mio.", "modt.", "Mr.",
<add> "mrk.", "mul.", "mv.", "n.br.", "n.f.", "nat.", "nb.", "Ndr.",
<add> "nedenst.", "nl.", "nr.", "Nr.", "nto.", "nuv.", "o/m", "o.a.", "o.fl.",
<add> "o.h.", "o.l.", "o.lign.", "o.m.a.", "o.s.fr.", "obl.", "obs.",
<add> "odont.", "oecon.", "off.", "ofl.", "omg.", "omkr.", "omr.", "omtr.",
<add> "opg.", "opl.", "opr.", "org.", "orig.", "osv.", "ovenst.", "overs.",
<add> "ovf.", "p.", "p.a.", "p.b.a", "p.b.v", "p.c.", "p.m.", "p.m.v.",
<add> "p.n.", "p.p.", "p.p.s.", "p.s.", "p.t.", "p.v.a.", "p.v.c.", "pag.",
<add> "par.", "Pas.", "pass.", "pcs.", "pct.", "pd.", "pens.", "pers.",
<add> "pft.", "pg.", "pga.", "pgl.", "Ph.d.", "pinx.", "pk.", "pkt.",
<add> "polit.", "polyt.", "pos.", "pp.", "ppm.", "pr.", "prc.", "priv.",
<add> "prod.", "prof.", "pron.", "Prs.", "præd.", "præf.", "præt.", "psych.",
<add> "pt.", "pæd.", "q.e.d.", "rad.", "Rcp.", "red.", "ref.", "reg.",
<add> "regn.", "rel.", "rep.", "repr.", "resp.", "rest.", "rm.", "rtg.",
<add> "russ.", "s.", "s.br.", "s.d.", "s.f.", "s.m.b.a.", "s.u.", "s.å.",
<add> "sa.", "sb.", "sc.", "scient.", "scil.", "Sdr.", "sek.", "sekr.",
<add> "self.", "sem.", "sen.", "shj.", "sign.", "sing.", "sj.", "skr.",
<add> "Skt.", "slutn.", "sml.", "smp.", "sms.", "snr.", "soc.", "soc.dem.",
<add> "sort.", "sp.", "spec.", "Spl.", "spm.", "spr.", "spsk.", "statsaut.",
<add> "st.", "stk.", "str.", "stud.", "subj.", "subst.", "suff.", "sup.",
<add> "suppl.", "sv.", "såk.", "sædv.", "sø.", "t/r", "t.", "t.h.", "t.o.",
<add> "t.o.m.", "t.v.", "tab.", "tbl.", "tcp/ip", "td.", "tdl.", "tdr.",
<add> "techn.", "tekn.", "temp.", "th.", "theol.", "ti.", "tidl.", "tilf.",
<add> "tilh.", "till.", "tilsv.", "tjg.", "tkr.", "tlf.", "tlgr.", "to.",
<add> "tr.", "trp.", "tsk.", "tv.", "ty.", "u/b", "udb.", "udbet.", "ugtl.",
<add> "undt.", "v.", "v.f.", "var.", "vb.", "vedk.", "vedl.", "vedr.",
<add> "vejl.", "Vg.", "vh.", "vha.", "vs.", "vsa.", "vær.", "zool.", "ø.lgd.",
<add> "øv.", "øvr.", "årg.", "årh.", ""]:
<ide> _exc[orth] = [{ORTH: orth}]
<ide>
<ide> _custom_base_exc = {
<ide><path>spacy/tests/lang/da/test_exceptions.py
<ide>
<ide> import pytest
<ide>
<del>@pytest.mark.parametrize('text', ["ca.", "m.a.o.", "Jan.", "Dec."])
<add>@pytest.mark.parametrize('text',
<add> ["ca.", "m.a.o.", "Jan.", "Dec.", "kr.", "jf."])
<ide> def test_da_tokenizer_handles_abbr(da_tokenizer, text):
<ide> tokens = da_tokenizer(text)
<ide> assert len(tokens) == 1 | 2 |
Python | Python | add a test case for it | b9e7a62d3ecdbb36e462d5d3d24d7eb6ee80b34a | <ide><path>libcloud/test/test_httplib_ssl.py
<ide> import os
<ide> import sys
<ide> import os.path
<add>import socket
<ide>
<add>import mock
<ide> from mock import patch
<ide>
<ide> import libcloud.security
<ide> def test_setup_ca_cert(self, _):
<ide> self.assertRaisesRegexp(RuntimeError, expected_msg,
<ide> self.httplib_object._setup_ca_cert)
<ide>
<add> @mock.patch('socket.create_connection', mock.MagicMock())
<add> @mock.patch('ssl.wrap_socket')
<add> def test_connect_throws_friendly_error_message_on_ssl_wrap_connection_reset_by_peer(self, mock_wrap_socket):
<add> # Test that we re-throw a more friendly error message in case
<add> # "connection reset by peer" error occurs when trying to establish a
<add> # SSL connection
<add> libcloud.security.VERIFY_SSL_CERT = True
<add> self.httplib_object.verify = True
<add>
<add> # No connection reset by peer, original exception should be thrown
<add> mock_wrap_socket.side_effect = Exception('foo bar fail')
<add>
<add> expected_msg = 'foo bar fail'
<add> self.assertRaisesRegexp(Exception, expected_msg,
<add> self.httplib_object.connect)
<add>
<add> # Connection reset by peer, wrapped exception with friendly error
<add> # message should be thrown
<add> mock_wrap_socket.side_effect = socket.error('Connection reset by peer')
<add>
<add> expected_msg = 'Failed to establish SSL / TLS connection'
<add> self.assertRaisesRegexp(Exception, expected_msg,
<add> self.httplib_object.connect)
<add>
<ide>
<ide> if __name__ == '__main__':
<ide> sys.exit(unittest.main()) | 1 |
Javascript | Javascript | use bufferattribute as mock to supress warnings | a71785d0d3db2b2d86cb0f6a68a4d7a832382c2c | <ide><path>test/unit/core/InstancedBufferGeometry.js
<ide> test( "copy", function() {
<ide> var instanceMock1 = {};
<ide> var instanceMock2 = {};
<ide> var indexMock = createClonableMock();
<del> var attributeMock1 = {};
<del> var attributeMock2 = {};
<add> var defaultAttribute1 = new THREE.BufferAttribute([1]);
<add> var defaultAttribute2 = new THREE.BufferAttribute([2]);
<ide>
<ide> var instance = new THREE.InstancedBufferGeometry();
<ide>
<ide> instance.addGroup( 0, 10, instanceMock1 );
<ide> instance.addGroup( 10, 5, instanceMock2 );
<ide> instance.setIndex( indexMock );
<del> instance.addAttribute( 'attributeMock1', attributeMock1 );
<del> instance.addAttribute( 'attributeMock2', attributeMock2 );
<add> instance.addAttribute( 'defaultAttribute1', defaultAttribute1 );
<add> instance.addAttribute( 'defaultAttribute2', defaultAttribute2 );
<ide>
<ide> var copiedInstance = instance.copy( instance );
<ide>
<ide> test( "copy", function() {
<ide> ok( copiedInstance.index === indexMock, "index was copied" );
<ide> ok( copiedInstance.index.callCount === 1, "index.clone was called once" );
<ide>
<del> ok( copiedInstance.attributes['attributeMock1'] instanceof THREE.BufferAttribute, "attribute was created" );
<add> ok( copiedInstance.attributes['defaultAttribute1'] instanceof THREE.BufferAttribute, "attribute was created" );
<ide> // the given attribute mock was passed to the array property of the created buffer attribute
<del> ok( copiedInstance.attributes['attributeMock1'].array === attributeMock1, "attribute was copied" );
<del> ok( copiedInstance.attributes['attributeMock2'].array === attributeMock2, "attribute was copied" );
<add> ok( copiedInstance.attributes['defaultAttribute1'].array[0] === defaultAttribute1.array, "attribute was copied" );
<add> ok( copiedInstance.attributes['defaultAttribute2'].array[0] === defaultAttribute2.array, "attribute was copied" );
<ide>
<ide> ok( copiedInstance.groups[0].start === 0, "group was copied" );
<ide> ok( copiedInstance.groups[0].count === 10, "group was copied" ); | 1 |
PHP | PHP | update doc block | 3792f56eecd337aea3045b46640bd33e7f5529b2 | <ide><path>Cake/Cache/Cache.php
<ide> public static function enabled() {
<ide> *
<ide> * Examples:
<ide> *
<del> * Using a Closure to provide data, assume $this is a Model:
<add> * Using a Closure to provide data, assume `$this` is a Table object:
<ide> *
<ide> * {{{
<del> * $model = $this;
<del> * $results = Cache::remember('all_articles', function() use ($model) {
<del> * return $model->find('all');
<add> * $results = Cache::remember('all_articles', function() {
<add> * return $this->find('all');
<ide> * });
<ide> * }}}
<ide> * | 1 |
Javascript | Javascript | replace `getall` with `getkeys` in `loadtype3data` | f7f60197ce6de81fcd0b3bdc177c0e55cd39f813 | <ide><path>src/core/evaluator.js
<ide> var TranslatedFont = (function TranslatedFontClosure() {
<ide>
<ide> var translatedFont = this.font;
<ide> var loadCharProcsPromise = Promise.resolve();
<del> var charProcs = this.dict.get('CharProcs').getAll();
<add> var charProcs = this.dict.get('CharProcs');
<ide> var fontResources = this.dict.get('Resources') || resources;
<del> var charProcKeys = Object.keys(charProcs);
<add> var charProcKeys = charProcs.getKeys();
<ide> var charProcOperatorList = Object.create(null);
<ide> for (var i = 0, n = charProcKeys.length; i < n; ++i) {
<ide> loadCharProcsPromise = loadCharProcsPromise.then(function (key) {
<del> var glyphStream = charProcs[key];
<add> var glyphStream = charProcs.get(key);
<ide> var operatorList = new OperatorList();
<ide> return evaluator.getOperatorList(glyphStream, task, fontResources,
<ide> operatorList).then(function () { | 1 |
Python | Python | replace list creation with list literal | 6a8de48502a85fbd4592d7cc43cdde4f610ae894 | <ide><path>glances/core/glances_logs.py
<ide> def add(self, item_state, item_type, item_value,
<ide> # Create the new log item
<ide> # Time is stored in Epoch format
<ide> # Epoch -> DMYHMS = datetime.fromtimestamp(epoch)
<del> item = []
<del> # START DATE
<del> item.append(time.mktime(datetime.now().timetuple()))
<del> item.append(-1) # END DATE
<del> item.append(item_state) # STATE: WARNING|CRITICAL
<del> item.append(item_type) # TYPE: CPU, LOAD, MEM...
<del> item.append(item_value) # MAX
<del> item.append(item_value) # AVG
<del> item.append(item_value) # MIN
<del> item.append(item_value) # SUM
<del> item.append(1) # COUNT
<del> # Process list is sorted automaticaly
<del> # Overwrite the user choise
<del> # topprocess = sorted(proc_list, key=lambda process: process[process_auto_by],
<del> # reverse=True)
<del> # item.append(topprocess[0:3]) # TOP 3 PROCESS LIST
<del> item.append([]) # TOP 3 PROCESS LIST
<del> item.append(proc_desc) # MONITORED PROCESSES DESC
<add> item = [
<add> time.mktime(datetime.now().timetuple()), # START DATE
<add> -1, # END DATE
<add> item_state, # STATE: WARNING|CRITICAL
<add> item_type, # TYPE: CPU, LOAD, MEM...
<add> item_value, # MAX
<add> item_value, # AVG
<add> item_value, # MIN
<add> item_value, # SUM
<add> 1, # COUNT
<add> # Process list is sorted automatically
<add> # Overwrite the user choice
<add> # topprocess = sorted(proc_list, key=lambda process: process[process_auto_by],
<add> # reverse=True)
<add> # topprocess[0:3], # TOP 3 PROCESS LIST
<add> [], # TOP 3 PROCESS LIST
<add> proc_desc] # MONITORED PROCESSES DESC
<ide>
<ide> # Add the item to the list
<ide> self.logs_list.insert(0, item)
<ide><path>glances/plugins/glances_processlist.py
<ide> def add_tree_decoration(self, child_data, is_last_child, first_level):
<ide>
<ide> def get_process_curses_data(self, p, first, args):
<ide> """ Get curses data to display for a process. """
<del> ret = []
<del> ret.append(self.curse_new_line())
<add> ret = [self.curse_new_line()]
<ide> # CPU
<ide> if 'cpu_percent' in p and p['cpu_percent'] is not None and p['cpu_percent'] != '':
<ide> msg = '{0:>6.1f}'.format(p['cpu_percent'])
<ide><path>glances/plugins/glances_uptime.py
<ide> def update(self):
<ide>
<ide> if self.input_method == 'local':
<ide> # Update stats using the standard system lib
<del> uptime = datetime.now() - \
<del> datetime.fromtimestamp(psutil.boot_time())
<add> uptime = datetime.now() - datetime.fromtimestamp(psutil.boot_time())
<ide>
<ide> # Convert uptime to string (because datetime is not JSONifi)
<ide> self.stats = str(uptime).split('.')[0]
<ide> def update(self):
<ide>
<ide> def msg_curse(self, args=None):
<ide> """Return the string to display in the curse interface."""
<del> # Init the return message
<del> ret = []
<del>
<del> # Add the line with decoration
<del> ret.append(self.curse_add_line(_("Uptime: {0}").format(self.stats)))
<del>
<del> # Return the message with decoration
<del> return ret
<add> return [self.curse_add_line(_("Uptime: {0}").format(self.stats))] | 3 |
Java | Java | remove obsolete code in r2dbctransactionmanager | fae36e98b4c3d72941e3f209a87047332ff51ba1 | <ide><path>spring-r2dbc/src/main/java/org/springframework/r2dbc/connection/R2dbcTransactionManager.java
<ide> protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager
<ide> afterCleanup = afterCleanup.then(Mono.from(con.setAutoCommit(true)));
<ide> }
<ide>
<del> if (txObject.getPreviousIsolationLevel() != null) {
<del> afterCleanup = afterCleanup
<del> .then(Mono.from(con.setTransactionIsolationLevel(txObject.getPreviousIsolationLevel())));
<del> }
<del>
<ide> return afterCleanup.then(Mono.defer(() -> {
<ide> try {
<ide> if (txObject.isNewConnectionHolder()) {
<ide> private static class ConnectionFactoryTransactionObject {
<ide> @Nullable
<ide> private ConnectionHolder connectionHolder;
<ide>
<del> @Nullable
<del> private IsolationLevel previousIsolationLevel;
<del>
<ide> private boolean newConnectionHolder;
<ide>
<ide> private boolean mustRestoreAutoCommit;
<ide> public boolean hasConnectionHolder() {
<ide> return (this.connectionHolder != null);
<ide> }
<ide>
<del> public void setPreviousIsolationLevel(@Nullable IsolationLevel previousIsolationLevel) {
<del> this.previousIsolationLevel = previousIsolationLevel;
<del> }
<del>
<del> @Nullable
<del> public IsolationLevel getPreviousIsolationLevel() {
<del> return this.previousIsolationLevel;
<del> }
<del>
<ide> public void setMustRestoreAutoCommit(boolean mustRestoreAutoCommit) {
<ide> this.mustRestoreAutoCommit = mustRestoreAutoCommit;
<ide> } | 1 |
PHP | PHP | add tests for index reflection | 13f7da961fe019915a502629e6e2f62ee1dc2fed | <ide><path>lib/Cake/Database/Schema/MysqlSchema.php
<ide> public function convertIndexDescription(Table $table, $row) {
<ide> $type = strtolower($row['Index_type']);
<ide> } elseif ($row['Non_unique'] == 0 && $type !== 'primary') {
<ide> $type = 'unique';
<del> } else {
<add> } elseif ($type !== 'primary') {
<ide> $type = 'index';
<ide> }
<ide>
<ide> public function convertIndexDescription(Table $table, $row) {
<ide> // MySQL multi column indexes come back
<ide> // as multiple rows.
<ide> if (!empty($existing)) {
<del> $columns = array_merge($existing['columns'], $columns);
<add> $columns = array_unique(array_merge($existing['columns'], $columns));
<ide> $length = array_merge($existing['length'], $length);
<ide> }
<ide> if ($isIndex) {
<ide><path>lib/Cake/Test/TestCase/Database/Schema/MysqlSchemaTest.php
<ide> public function testDescribeTableIndexes() {
<ide> $schema = new SchemaCollection($connection);
<ide> $result = $schema->describe('articles');
<ide> $this->assertInstanceOf('Cake\Database\Schema\Table', $result);
<add>
<add> $this->assertCount(2, $result->constraints());
<ide> $expected = [
<ide> 'primary' => [
<ide> 'type' => 'primary',
<ide> public function testDescribeTableIndexes() {
<ide> ]
<ide> ]
<ide> ];
<del> foreach ($expected as $name => $props) {
<del> $this->assertEquals(
<del> $props,
<del> $result->constraint($name),
<del> 'Index definition does not match for ' . $name
<del> );
<del> }
<add> $this->assertEquals($expected['primary'], $result->constraint('primary'));
<add> $this->assertEquals($expected['length_idx'], $result->constraint('length_idx'));
<add>
<add> $this->assertCount(1, $result->indexes());
<add> $expected = [
<add> 'type' => 'index',
<add> 'columns' => ['author_id'],
<add> 'length' => []
<add> ];
<add> $this->assertEquals($expected, $result->index('author_idx'));
<ide> }
<ide>
<ide> /** | 2 |
Text | Text | fix headings in quic.md | 7a1220a1d745cf2f6d00ed3aa3905b604cea0952 | <ide><path>doc/api/quic.md
<ide> TBD
<ide>
<ide> ## QUIC JavaScript API
<ide>
<del>### `net.createQuicSocket(\[options\])`
<add>### `net.createQuicSocket([options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> The object will contain the properties:
<ide>
<ide> If the `QuicEndpoint` is not bound, `quicendpoint.address` is an empty object.
<ide>
<del>#### `quicendpoint.bind(\[options\])`
<add>#### `quicendpoint.bind([options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> added: REPLACEME
<ide>
<ide> Set to `true` if the `QuicEndpoint` is in the process of closing.
<ide>
<del>#### `quicendpoint.destroy(\[error\])`
<add>#### `quicendpoint.destroy([error])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> the local UDP port.
<ide> added: REPLACEME
<ide> -->
<ide>
<del>#### `quicendpoint.setBroadcast(\[on\])`
<add>#### `quicendpoint.setBroadcast([on])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> A socket's address family's ANY address (IPv4 `'0.0.0.0'` or IPv6 `'::'`)
<ide> can be used to return control of the sockets default outgoing interface to
<ide> the system for future multicast packets.
<ide>
<del>#### `quicendpoint.setMulticastLoopback(\[on\])`
<add>#### `quicendpoint.setMulticastLoopback([on])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> added: REPLACEME
<ide>
<ide> Set to `true` if the `QuicSession` is in the process of a graceful shutdown.
<ide>
<del>#### `quicsession.destroy(\[error\])`
<add>#### `quicsession.destroy([error])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> some properties corresponding to the fields of the certificate.
<ide> If there is no local certificate, or if the `QuicSession` has been destroyed,
<ide> an empty object will be returned.
<ide>
<del>#### `quicsession.getPeerCertificate(\[detailed\])`
<add>#### `quicsession.getPeerCertificate([detailed])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> added: REPLACEME
<ide>
<ide> The minimum RTT recorded so far for this `QuicSession`.
<ide>
<del>#### `quicsession.openStream(\[options\])`
<add>#### `quicsession.openStream([options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> permitted to close naturally. New `QuicClientSession` and `QuicServerSession`
<ide> instances will not be allowed. The returns `Promise` will be resolved once
<ide> the `QuicSocket` is destroyed.
<ide>
<del>#### `quicsocket.connect(\[options\])`
<add>#### `quicsocket.connect([options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> added: REPLACEME
<ide>
<ide> Returns a `Promise` that resolves a new `QuicClientSession`.
<ide>
<del>#### `quicsocket.destroy(\[error\])`
<add>#### `quicsocket.destroy([error])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> An array of `QuicEndpoint` instances associated with the `QuicSocket`.
<ide>
<ide> Read-only.
<ide>
<del>#### `quicsocket.listen(\[options\])`
<add>#### `quicsocket.listen([options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> The maximum received offset for this `QuicStream`.
<ide>
<ide> Read-only.
<ide>
<del>#### `quicstream.pushStream(headers\[, options\])`
<add>#### `quicstream.pushStream(headers[, options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> The `QuicServerSession` or `QuicClientSession` to which the
<ide>
<ide> Read-only.
<ide>
<del>#### `quicstream.sendFD(fd\[, options\])`
<add>#### `quicstream.sendFD(fd[, options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> Using the same file descriptor concurrently for multiple streams
<ide> is not supported and may result in data loss. Re-using a file descriptor
<ide> after a stream has finished is supported.
<ide>
<del>#### `quicstream.sendFile(path\[, options\])`
<add>#### `quicstream.sendFile(path[, options])`
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> --> | 1 |
Text | Text | add logo to readme | 2d039ffa29e4c4366a1028555b71cb36b7129fb1 | <ide><path>README.md
<del># Node.js
<del>
<del>[](https://gitter.im/nodejs/node?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [](https://bestpractices.coreinfrastructure.org/projects/29)
<add><p align="center">
<add> <img alt="Node.js" src="https://nodejs.org/static/images/logo-light.svg" width="400"/>
<add></p>
<add><p align="center">
<add> <a title="Gitter" href="https://gitter.im/nodejs/node?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge"><img src="https://badges.gitter.im/Join%20Chat.svg"></a>
<add> <a title="CII Best Practices" href="https://bestpractices.coreinfrastructure.org/projects/29"><img src="https://bestpractices.coreinfrastructure.org/projects/29/badge"></a>
<add></p>
<ide>
<ide> Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. Node.js
<ide> uses an event-driven, non-blocking I/O model that makes it lightweight and | 1 |
Text | Text | add grand rounds to companies list | 1e2d237389f974735517f6c6d821ae7563e6b3ec | <ide><path>README.md
<ide> Currently **officially** using Airflow:
<ide> 1. [Cotap](https://github.com/cotap/) [[@maraca](https://github.com/maraca) & [@richardchew](https://github.com/richardchew)]
<ide> 1. [Credit Karma](https://www.creditkarma.com/) [[@preete-dixit-ck](https://github.com/preete-dixit-ck) & [@harish-gaggar-ck](https://github.com/harish-gaggar-ck) & [@greg-finley-ck](https://github.com/greg-finley-ck)]
<ide> 1. [Digital First Media](http://www.digitalfirstmedia.com/) [[@duffn](https://github.com/duffn) & [@mschmo](https://github.com/mschmo) & [@seanmuth](https://github.com/seanmuth)]
<add>1. [Drivy](https://www.drivy.com) [[@AntoineAugusti](https://github.com/AntoineAugusti)]
<ide> 1. [Easy Taxi](http://www.easytaxi.com/) [[@caique-lima](https://github.com/caique-lima) & [@WesleyBatista](https://github.com/WesleyBatista)]
<ide> 1. [eRevalue](https://www.datamaran.com) [[@hamedhsn](https://github.com/hamedhsn)]
<ide> 1. [evo.company](https://evo.company/) [[@orhideous](https://github.com/orhideous)]
<ide> 1. [FreshBooks](https://github.com/freshbooks) [[@DinoCow](https://github.com/DinoCow)]
<ide> 1. [Gentner Lab](http://github.com/gentnerlab) [[@neuromusic](https://github.com/neuromusic)]
<ide> 1. [Glassdoor](https://github.com/Glassdoor) [[@syvineckruyk](https://github.com/syvineckruyk)]
<ide> 1. [GovTech GDS](https://gds-gov.tech) [[@chrissng](https://github.com/chrissng) & [@datagovsg](https://github.com/datagovsg)]
<add>1. [Grand Rounds](https://www.grandrounds.com/) [[@richddr](https://github.com/richddr), [@timz1290](https://github.com/timz1290) & [@wenever](https://github.com/@wenever)]
<ide> 1. [Groupalia](http://es.groupalia.com) [[@jesusfcr](https://github.com/jesusfcr)]
<ide> 1. [Gusto](https://gusto.com) [[@frankhsu](https://github.com/frankhsu)]
<ide> 1. [Handshake](https://joinhandshake.com/) [[@mhickman](https://github.com/mhickman)]
<ide> Currently **officially** using Airflow:
<ide> 1. [Zendesk](https://www.github.com/zendesk)
<ide> 1. [Zenly](https://zen.ly) [[@cerisier](https://github.com/cerisier) & [@jbdalido](https://github.com/jbdalido)]
<ide> 1. [99](https://99taxis.com) [[@fbenevides](https://github.com/fbenevides), [@gustavoamigo](https://github.com/gustavoamigo) & [@mmmaia](https://github.com/mmmaia)]
<del>1. [Drivy](https://www.drivy.com) [[@AntoineAugusti](https://github.com/AntoineAugusti)]
<ide>
<ide> ## Links
<ide> | 1 |
Ruby | Ruby | modify regex in tests | 1edff7934e378393ff09e0b343161c291c6744dd | <ide><path>Library/Homebrew/test/formula_spec.rb
<ide> livecheck do
<ide> skip "foo"
<ide> url "https://brew.sh/test/releases"
<del> regex(/test-(\d+(?:\.\d+)+)\.tbz/)
<add> regex(/test-v?(\d+(?:\.\d+)+)\.t/i)
<ide> end
<ide> end
<ide>
<ide> expect(f.livecheck.skip?).to be true
<ide> expect(f.livecheck.skip_msg).to eq("foo")
<ide> expect(f.livecheck.url).to eq("https://brew.sh/test/releases")
<del> expect(f.livecheck.regex).to eq(/test-(\d+(?:\.\d+)+)\.tbz/)
<add> expect(f.livecheck.regex).to eq(/test-v?(\d+(?:\.\d+)+)\.t/i)
<ide> end
<ide>
<ide> describe "#livecheckable?" do
<ide> f = formula do
<ide> url "https://brew.sh/test-1.0.tbz"
<ide> livecheck do
<del> regex(/test-(\d+(?:.\d+)+).tbz/)
<add> regex(/test-v?(\d+(?:\.\d+)+)\.t/i)
<ide> end
<ide> end
<ide>
<ide> url "https://brew.sh/test-1.0.tbz"
<ide> livecheck do
<ide> url :homepage
<del> regex(/test-(\d+(?:.\d+)+).tbz/)
<add> regex(/test-v?(\d+(?:\.\d+)+)\.t/i)
<ide> end
<ide> end
<ide> | 1 |
PHP | PHP | improve doc blocks | 2f5962ecc6ab3c7ed9d708b4c1ccc083a45952d4 | <ide><path>src/Validation/Validation.php
<ide> public static function localizedTime($check, $type = 'datetime', $format = null)
<ide> }
<ide>
<ide> /**
<del> * Boolean validation, determines if value passed is a boolean integer or true/false.
<add> * Validates if passed value is boolish.
<add> *
<ide> * The list of what is considered to be boolish values, may be set via $booleanValues.
<ide> *
<del> * @param string $check a valid boolean
<del> * @param string $booleanValues list of valid boolish values, defaults to `[true, false, 0, 1, '0', '1']`.
<add> * @param boolean|integer|string $check Value to check.
<add> * @param string $booleanValues List of valid boolish values, defaults to `[true, false, 0, 1, '0', '1']`.
<ide> * @return bool Success
<ide> */
<ide> public static function boolean($check, array $booleanValues = [])
<ide> public static function boolean($check, array $booleanValues = [])
<ide> }
<ide>
<ide> /**
<del> * Truthy validation, determines if value passed is truthy.
<add> * Validates if passed value is truthy.
<add> *
<ide> * The list of what is considered to be truthy values, may be set via $truthyValues.
<ide> *
<del> * @param string $check a valid boolean
<del> * @param array $truthyValues list of valid truthy values, defaults to `[true, 1, '1']`.
<add> * @param boolean|integer|string $check Value to check.
<add> * @param array $truthyValues List of valid truthy values, defaults to `[true, 1, '1']`.
<ide> * @return bool Success
<ide> */
<ide> public static function truthy($check, array $truthyValues = [])
<ide> public static function truthy($check, array $truthyValues = [])
<ide> }
<ide>
<ide> /**
<del> * Truthy validation, determines if value passed is falsey.
<add> * Validates if passed value is falsey.
<add> *
<ide> * The list of what is considered to be truthy values, may be set via $falseyValues.
<ide> *
<del> * @param string $check a valid boolean
<del> * @param array $falseyValues list of valid falsey values, defaults to `[false, 0, '0']`.
<add> * @param boolean|integer|string $check Value to check.
<add> * @param array $falseyValues List of valid falsey values, defaults to `[false, 0, '0']`.
<ide> * @return bool Success
<ide> */
<ide> public static function falsey($check, array $falseyValues = []) | 1 |
Text | Text | use code markup/markdown in headers | a80d4ac3debaea72a6fec4fd691a676dec2b9be7 | <ide><path>doc/api/tty.md
<ide> In most cases, there should be little to no reason for an application to
<ide> manually create instances of the `tty.ReadStream` and `tty.WriteStream`
<ide> classes.
<ide>
<del>## Class: tty.ReadStream
<add>## Class: `tty.ReadStream`
<ide> <!-- YAML
<ide> added: v0.5.8
<ide> -->
<ide> Represents the readable side of a TTY. In normal circumstances
<ide> [`process.stdin`][] will be the only `tty.ReadStream` instance in a Node.js
<ide> process and there should be no reason to create additional instances.
<ide>
<del>### readStream.isRaw
<add>### `readStream.isRaw`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide>
<ide> A `boolean` that is `true` if the TTY is currently configured to operate as a
<ide> raw device. Defaults to `false`.
<ide>
<del>### readStream.isTTY
<add>### `readStream.isTTY`
<ide> <!-- YAML
<ide> added: v0.5.8
<ide> -->
<ide>
<ide> A `boolean` that is always `true` for `tty.ReadStream` instances.
<ide>
<del>### readStream.setRawMode(mode)
<add>### `readStream.setRawMode(mode)`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide> including modifiers. Additionally, all special processing of characters by the
<ide> terminal is disabled, including echoing input characters.
<ide> `CTRL`+`C` will no longer cause a `SIGINT` when in this mode.
<ide>
<del>## Class: tty.WriteStream
<add>## Class: `tty.WriteStream`
<ide> <!-- YAML
<ide> added: v0.5.8
<ide> -->
<ide> Represents the writable side of a TTY. In normal circumstances,
<ide> `tty.WriteStream` instances created for a Node.js process and there
<ide> should be no reason to create additional instances.
<ide>
<del>### Event: 'resize'
<add>### Event: `'resize'`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide> process.stdout.on('resize', () => {
<ide> });
<ide> ```
<ide>
<del>### writeStream.clearLine(dir\[, callback\])
<add>### `writeStream.clearLine(dir[, callback])`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> changes:
<ide> changes:
<ide> `writeStream.clearLine()` clears the current line of this `WriteStream` in a
<ide> direction identified by `dir`.
<ide>
<del>### writeStream.clearScreenDown(\[callback\])
<add>### `writeStream.clearScreenDown([callback])`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> changes:
<ide> changes:
<ide> `writeStream.clearScreenDown()` clears this `WriteStream` from the current
<ide> cursor down.
<ide>
<del>### writeStream.columns
<add>### `writeStream.columns`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide>
<ide> A `number` specifying the number of columns the TTY currently has. This property
<ide> is updated whenever the `'resize'` event is emitted.
<ide>
<del>### writeStream.cursorTo(x\[, y\]\[, callback\])
<add>### `writeStream.cursorTo(x[, y][, callback])`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> changes:
<ide> changes:
<ide> `writeStream.cursorTo()` moves this `WriteStream`'s cursor to the specified
<ide> position.
<ide>
<del>### writeStream.getColorDepth(\[env\])
<add>### `writeStream.getColorDepth([env])`
<ide> <!-- YAML
<ide> added: v9.9.0
<ide> -->
<ide> To enforce a specific color support, use one of the below environment settings.
<ide> Disabling color support is also possible by using the `NO_COLOR` and
<ide> `NODE_DISABLE_COLORS` environment variables.
<ide>
<del>### writeStream.getWindowSize()
<add>### `writeStream.getWindowSize()`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide> corresponding to this `WriteStream`. The array is of the type
<ide> `[numColumns, numRows]` where `numColumns` and `numRows` represent the number
<ide> of columns and rows in the corresponding [TTY](tty.html).
<ide>
<del>### writeStream.hasColors(\[count\]\[, env\])
<add>### `writeStream.hasColors([count][, env])`
<ide> <!-- YAML
<ide> added: v11.13.0
<ide> -->
<ide> process.stdout.hasColors(2 ** 24, { TMUX: '1' });
<ide> // Returns false (the environment setting pretends to support 2 ** 8 colors).
<ide> ```
<ide>
<del>### writeStream.isTTY
<add>### `writeStream.isTTY`
<ide> <!-- YAML
<ide> added: v0.5.8
<ide> -->
<ide>
<ide> A `boolean` that is always `true`.
<ide>
<del>### writeStream.moveCursor(dx, dy\[, callback\])
<add>### `writeStream.moveCursor(dx, dy[, callback])`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> changes:
<ide> changes:
<ide> `writeStream.moveCursor()` moves this `WriteStream`'s cursor *relative* to its
<ide> current position.
<ide>
<del>### writeStream.rows
<add>### `writeStream.rows`
<ide> <!-- YAML
<ide> added: v0.7.7
<ide> -->
<ide>
<ide> A `number` specifying the number of rows the TTY currently has. This property
<ide> is updated whenever the `'resize'` event is emitted.
<ide>
<del>## tty.isatty(fd)
<add>## `tty.isatty(fd)`
<ide> <!-- YAML
<ide> added: v0.5.8
<ide> --> | 1 |
Ruby | Ruby | allow table name and direction in string order arg | 5180fe2cd8233169935065efe8762bd5d7b2709c | <ide><path>activerecord/lib/active_record/relation/calculations.rb
<ide> def pluck(*column_names)
<ide> end
<ide>
<ide> if has_include?(column_names.first)
<del> construct_relation_for_association_calculations.pluck(*column_names)
<add> relation = apply_join_dependency
<add> relation.pluck(*column_names)
<ide> else
<del> enforce_raw_sql_whitelist(column_names)
<add> enforce_raw_sql_whitelist(column_names, whitelist: allowed_pluck_columns)
<ide> relation = spawn
<ide> relation.select_values = column_names.map { |cn|
<ide> @klass.respond_to_attribute?(cn) ? arel_attribute(cn) : cn
<ide> }
<del> result = klass.connection.select_all(relation.arel, nil, bound_attributes)
<add> result = skip_query_cache_if_necessary { klass.connection.select_all(relation.arel, nil) }
<ide> result.cast_values(klass.attribute_types)
<ide> end
<ide> end
<ide> def ids
<ide>
<ide> private
<ide>
<del> def _pluck(column_names, unsafe_raw)
<del> unrecognized = column_names.reject do |cn|
<del> @klass.respond_to_attribute?(cn)
<del> end
<del>
<del> if loaded? && unrecognized.none?
<del> records.pluck(*column_names)
<del> elsif has_include?(column_names.first)
<del> relation = apply_join_dependency
<del> relation.pluck(*column_names)
<del> elsif unsafe_raw || unrecognized.none?
<del> relation = spawn
<del> relation.select_values = column_names.map { |cn|
<del> @klass.respond_to_attribute?(cn) ? arel_attribute(cn) : cn
<del> }
<del> result = skip_query_cache_if_necessary { klass.connection.select_all(relation.arel, nil) }
<del> result.cast_values(klass.attribute_types)
<del> else
<del> raise ArgumentError, "Invalid column name: #{unrecognized}"
<del> end
<add> def allowed_pluck_columns
<add> @klass.attribute_names_and_aliases.map do |name|
<add> [name, "#{table_name}.#{name}"]
<add> end.flatten
<ide> end
<ide>
<ide> def has_include?(column_name)
<ide><path>activerecord/lib/active_record/relation/query_methods.rb
<ide> def order(*args)
<ide>
<ide> # Same as #order but operates on relation in-place instead of copying.
<ide> def order!(*args) # :nodoc:
<del> @klass.enforce_raw_sql_whitelist(column_names_from_order_arguments(args))
<add> @klass.enforce_raw_sql_whitelist(
<add> column_names_from_order_arguments(args),
<add> whitelist: allowed_order_columns
<add> )
<add>
<ide> preprocess_order_args(args)
<ide>
<ide> self.order_values += args
<ide> def reorder(*args)
<ide>
<ide> # Same as #reorder but operates on relation in-place instead of copying.
<ide> def reorder!(*args) # :nodoc:
<del> @klass.enforce_raw_sql_whitelist(column_names_from_order_arguments(args))
<add> @klass.enforce_raw_sql_whitelist(
<add> column_names_from_order_arguments(args),
<add> whitelist: allowed_order_columns
<add> )
<add>
<ide> preprocess_order_args(args)
<ide>
<ide> self.reordering_value = true
<ide> def set_value(name, value) # :nodoc:
<ide>
<ide> private
<ide>
<add> def allowed_order_columns
<add> @klass.attribute_names_and_aliases.map do |name|
<add> [name, "#{table_name}.#{name}"].map do |name|
<add> [
<add> name,
<add> "#{name} asc",
<add> "#{name} ASC",
<add> "#{name} desc",
<add> "#{name} DESC"
<add> ]
<add> end
<add> end.flatten
<add> end
<add>
<ide> # Extract column names from arguments passed to #order or #reorder.
<ide> def column_names_from_order_arguments(args)
<ide> args.flat_map { |arg| arg.is_a?(Hash) ? arg.keys : arg }
<ide><path>activerecord/test/cases/associations/cascaded_eager_loading_test.rb
<ide> class CascadedEagerLoadingTest < ActiveRecord::TestCase
<ide> :categorizations, :people, :categories, :edges, :vertices
<ide>
<ide> def test_eager_association_loading_with_cascaded_two_levels
<del> authors = Author.all.merge!(includes: { posts: :comments }, order: Arel.sql("authors.id")).to_a
<add> authors = Author.all.merge!(includes: { posts: :comments }, order: "authors.id").to_a
<ide> assert_equal 3, authors.size
<ide> assert_equal 5, authors[0].posts.size
<ide> assert_equal 3, authors[1].posts.size
<ide> assert_equal 10, authors[0].posts.collect { |post| post.comments.size }.inject(0) { |sum, i| sum + i }
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_two_levels_and_one_level
<del> authors = Author.all.merge!(includes: [{ posts: :comments }, :categorizations], order: Arel.sql("authors.id")).to_a
<add> authors = Author.all.merge!(includes: [{ posts: :comments }, :categorizations], order: "authors.id").to_a
<ide> assert_equal 3, authors.size
<ide> assert_equal 5, authors[0].posts.size
<ide> assert_equal 3, authors[1].posts.size
<ide> def test_eager_association_loading_with_hmt_does_not_table_name_collide_when_joi
<ide> end
<ide>
<ide> def test_eager_association_loading_grafts_stashed_associations_to_correct_parent
<del> assert_equal people(:michael), Person.eager_load(primary_contact: :primary_contact).where("primary_contacts_people_2.first_name = ?", "Susan").order(Arel.sql("people.id")).first
<add> assert_equal people(:michael), Person.eager_load(primary_contact: :primary_contact).where("primary_contacts_people_2.first_name = ?", "Susan").order("people.id").first
<ide> end
<ide>
<ide> def test_cascaded_eager_association_loading_with_join_for_count
<ide> def test_eager_association_loading_with_join_for_count
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_two_levels_with_two_has_many_associations
<del> authors = Author.all.merge!(includes: { posts: [:comments, :categorizations] }, order: Arel.sql("authors.id")).to_a
<add> authors = Author.all.merge!(includes: { posts: [:comments, :categorizations] }, order: "authors.id").to_a
<ide> assert_equal 3, authors.size
<ide> assert_equal 5, authors[0].posts.size
<ide> assert_equal 3, authors[1].posts.size
<ide> assert_equal 10, authors[0].posts.collect { |post| post.comments.size }.inject(0) { |sum, i| sum + i }
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_two_levels_and_self_table_reference
<del> authors = Author.all.merge!(includes: { posts: [:comments, :author] }, order: Arel.sql("authors.id")).to_a
<add> authors = Author.all.merge!(includes: { posts: [:comments, :author] }, order: "authors.id").to_a
<ide> assert_equal 3, authors.size
<ide> assert_equal 5, authors[0].posts.size
<ide> assert_equal authors(:david).name, authors[0].name
<ide> assert_equal [authors(:david).name], authors[0].posts.collect { |post| post.author.name }.uniq
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_two_levels_with_condition
<del> authors = Author.all.merge!(includes: { posts: :comments }, where: "authors.id=1", order: Arel.sql("authors.id")).to_a
<add> authors = Author.all.merge!(includes: { posts: :comments }, where: "authors.id=1", order: "authors.id").to_a
<ide> assert_equal 1, authors.size
<ide> assert_equal 5, authors[0].posts.size
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_three_levels_by_ping_pong
<del> firms = Firm.all.merge!(includes: { account: { firm: :account } }, order: Arel.sql("companies.id")).to_a
<add> firms = Firm.all.merge!(includes: { account: { firm: :account } }, order: "companies.id").to_a
<ide> assert_equal 2, firms.size
<ide> assert_equal firms.first.account, firms.first.account.firm.account
<ide> assert_equal companies(:first_firm).account, assert_no_queries { firms.first.account.firm.account }
<ide> assert_equal companies(:first_firm).account.firm.account, assert_no_queries { firms.first.account.firm.account }
<ide> end
<ide>
<ide> def test_eager_association_loading_with_has_many_sti
<del> topics = Topic.all.merge!(includes: :replies, order: Arel.sql("topics.id")).to_a
<add> topics = Topic.all.merge!(includes: :replies, order: "topics.id").to_a
<ide> first, second, = topics(:first).replies.size, topics(:second).replies.size
<ide> assert_no_queries do
<ide> assert_equal first, topics[0].replies.size
<ide> def test_eager_association_loading_with_has_many_sti_and_subclasses
<ide> silly.parent_id = 1
<ide> assert silly.save
<ide>
<del> topics = Topic.all.merge!(includes: :replies, order: [Arel.sql("topics.id"), Arel.sql("replies_topics.id")]).to_a
<add> topics = Topic.all.merge!(includes: :replies, order: ["topics.id", Arel.sql("replies_topics.id")]).to_a
<ide> assert_no_queries do
<ide> assert_equal 2, topics[0].replies.size
<ide> assert_equal 0, topics[1].replies.size
<ide> def test_eager_association_loading_with_belongs_to_sti
<ide> end
<ide>
<ide> def test_eager_association_loading_with_multiple_stis_and_order
<del> author = Author.all.merge!(includes: { posts: [ :special_comments , :very_special_comment ] }, order: [Arel.sql("authors.name"), Arel.sql("comments.body"), Arel.sql("very_special_comments_posts.body")], where: "posts.id = 4").first
<add> author = Author.all.merge!(includes: { posts: [ :special_comments , :very_special_comment ] }, order: ["authors.name", Arel.sql("comments.body"), Arel.sql("very_special_comments_posts.body")], where: "posts.id = 4").first
<ide> assert_equal authors(:david), author
<ide> assert_no_queries do
<ide> author.posts.first.special_comments
<ide> def test_eager_association_loading_of_stis_with_multiple_references
<ide> end
<ide>
<ide> def test_eager_association_loading_where_first_level_returns_nil
<del> authors = Author.all.merge!(includes: { post_about_thinking: :comments }, order: Arel.sql("authors.id DESC")).to_a
<add> authors = Author.all.merge!(includes: { post_about_thinking: :comments }, order: "authors.id DESC").to_a
<ide> assert_equal [authors(:bob), authors(:mary), authors(:david)], authors
<ide> assert_no_queries do
<ide> authors[2].post_about_thinking.comments.first
<ide> end
<ide> end
<ide>
<ide> def test_eager_association_loading_with_recursive_cascading_four_levels_has_many_through
<del> source = Vertex.all.merge!(includes: { sinks: { sinks: { sinks: :sinks } } }, order: Arel.sql("vertices.id")).first
<add> source = Vertex.all.merge!(includes: { sinks: { sinks: { sinks: :sinks } } }, order: "vertices.id").first
<ide> assert_equal vertices(:vertex_4), assert_no_queries { source.sinks.first.sinks.first.sinks.first }
<ide> end
<ide>
<ide> def test_eager_association_loading_with_recursive_cascading_four_levels_has_and_belongs_to_many
<del> sink = Vertex.all.merge!(includes: { sources: { sources: { sources: :sources } } }, order: Arel.sql("vertices.id DESC")).first
<add> sink = Vertex.all.merge!(includes: { sources: { sources: { sources: :sources } } }, order: "vertices.id DESC").first
<ide> assert_equal vertices(:vertex_1), assert_no_queries { sink.sources.first.sources.first.sources.first.sources.first }
<ide> end
<ide>
<ide> def test_eager_association_loading_with_cascaded_interdependent_one_level_and_two_levels
<del> authors_relation = Author.all.merge!(includes: [:comments, { posts: :categorizations }], order: Arel.sql("authors.id"))
<add> authors_relation = Author.all.merge!(includes: [:comments, { posts: :categorizations }], order: "authors.id")
<ide> authors = authors_relation.to_a
<ide> assert_equal 3, authors.size
<ide> assert_equal 10, authors[0].comments.size
<ide><path>activerecord/test/cases/associations/eager_test.rb
<ide> def test_loading_with_scope_including_joins
<ide> end
<ide>
<ide> def test_with_ordering
<del> list = Post.all.merge!(includes: :comments, order: Arel.sql("posts.id DESC")).to_a
<add> list = Post.all.merge!(includes: :comments, order: "posts.id DESC").to_a
<ide> [:other_by_mary, :other_by_bob, :misc_by_mary, :misc_by_bob, :eager_other,
<ide> :sti_habtm, :sti_post_and_comments, :sti_comments, :authorless, :thinking, :welcome
<ide> ].each_with_index do |post, index|
<ide> def test_calculate_with_string_in_from_and_eager_loading
<ide> end
<ide>
<ide> def test_with_two_tables_in_from_without_getting_double_quoted
<del> posts = Post.select("posts.*").from("authors, posts").eager_load(:comments).where("posts.author_id = authors.id").order(Arel.sql("posts.id")).to_a
<add> posts = Post.select("posts.*").from("authors, posts").eager_load(:comments).where("posts.author_id = authors.id").order("posts.id").to_a
<ide> assert_equal 2, posts.first.comments.size
<ide> end
<ide>
<ide> def test_loading_with_multiple_associations
<del> posts = Post.all.merge!(includes: [ :comments, :author, :categories ], order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: [ :comments, :author, :categories ], order: "posts.id").to_a
<ide> assert_equal 2, posts.first.comments.size
<ide> assert_equal 2, posts.first.categories.size
<ide> assert_includes posts.first.comments, comments(:greetings)
<ide> def test_finding_with_includes_on_empty_polymorphic_type_column
<ide> end
<ide>
<ide> def test_loading_from_an_association
<del> posts = authors(:david).posts.merge(includes: :comments, order: Arel.sql("posts.id")).to_a
<add> posts = authors(:david).posts.merge(includes: :comments, order: "posts.id").to_a
<ide> assert_equal 2, posts.first.comments.size
<ide> end
<ide>
<ide> def test_nested_loading_through_has_one_association
<ide> end
<ide>
<ide> def test_nested_loading_through_has_one_association_with_order
<del> aa = AuthorAddress.all.merge!(includes: { author: :posts }, order: Arel.sql("author_addresses.id")).find(author_addresses(:david_address).id)
<add> aa = AuthorAddress.all.merge!(includes: { author: :posts }, order: "author_addresses.id").find(author_addresses(:david_address).id)
<ide> assert_equal aa.author.posts.count, aa.author.posts.length
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit
<del> comments = Comment.all.merge!(includes: :post, limit: 5, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, limit: 5, order: "comments.id").to_a
<ide> assert_equal 5, comments.length
<ide> assert_equal [1, 2, 3, 5, 6], comments.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_conditions
<del> comments = Comment.all.merge!(includes: :post, where: "post_id = 4", limit: 3, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, where: "post_id = 4", limit: 3, order: "comments.id").to_a
<ide> assert_equal 3, comments.length
<ide> assert_equal [5, 6, 7], comments.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_offset
<del> comments = Comment.all.merge!(includes: :post, limit: 3, offset: 2, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, limit: 3, offset: 2, order: "comments.id").to_a
<ide> assert_equal 3, comments.length
<ide> assert_equal [3, 5, 6], comments.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_conditions
<del> comments = Comment.all.merge!(includes: :post, where: "post_id = 4", limit: 3, offset: 1, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, where: "post_id = 4", limit: 3, offset: 1, order: "comments.id").to_a
<ide> assert_equal 3, comments.length
<ide> assert_equal [6, 7, 8], comments.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_conditions_array
<del> comments = Comment.all.merge!(includes: :post, where: ["post_id = ?", 4], limit: 3, offset: 1, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, where: ["post_id = ?", 4], limit: 3, offset: 1, order: "comments.id").to_a
<ide> assert_equal 3, comments.length
<ide> assert_equal [6, 7, 8], comments.collect(&:id)
<ide> end
<ide> def test_eager_association_loading_with_belongs_to_and_conditions_string_with_un
<ide> def test_eager_association_loading_with_belongs_to_and_conditions_hash
<ide> comments = []
<ide> assert_nothing_raised do
<del> comments = Comment.all.merge!(includes: :post, where: { posts: { id: 4 } }, limit: 3, order: Arel.sql("comments.id")).to_a
<add> comments = Comment.all.merge!(includes: :post, where: { posts: { id: 4 } }, limit: 3, order: "comments.id").to_a
<ide> end
<ide> assert_equal 3, comments.length
<ide> assert_equal [5, 6, 7], comments.collect(&:id)
<ide> def test_eager_association_loading_with_belongs_to_and_order_string_with_quoted_
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_multiple_associations
<del> posts = Post.all.merge!(includes: [:author, :very_special_comment], limit: 1, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: [:author, :very_special_comment], limit: 1, order: "posts.id").to_a
<ide> assert_equal 1, posts.length
<ide> assert_equal [1], posts.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_multiple_associations
<del> posts = Post.all.merge!(includes: [:author, :very_special_comment], limit: 1, offset: 1, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: [:author, :very_special_comment], limit: 1, offset: 1, order: "posts.id").to_a
<ide> assert_equal 1, posts.length
<ide> assert_equal [2], posts.collect(&:id)
<ide> end
<ide> def test_eager_association_loading_with_explicit_join
<ide> end
<ide>
<ide> def test_eager_with_has_many_through
<del> posts_with_comments = people(:michael).posts.merge(includes: :comments, order: Arel.sql("posts.id")).to_a
<del> posts_with_author = people(:michael).posts.merge(includes: :author, order: Arel.sql("posts.id")).to_a
<del> posts_with_comments_and_author = people(:michael).posts.merge(includes: [ :comments, :author ], order: Arel.sql("posts.id")).to_a
<add> posts_with_comments = people(:michael).posts.merge(includes: :comments, order: "posts.id").to_a
<add> posts_with_author = people(:michael).posts.merge(includes: :author, order: "posts.id").to_a
<add> posts_with_comments_and_author = people(:michael).posts.merge(includes: [ :comments, :author ], order: "posts.id").to_a
<ide> assert_equal 2, posts_with_comments.inject(0) { |sum, post| sum + post.comments.size }
<ide> assert_equal authors(:david), assert_no_queries { posts_with_author.first.author }
<ide> assert_equal authors(:david), assert_no_queries { posts_with_comments_and_author.first.author }
<ide> def test_eager_with_has_many_through_a_belongs_to_association
<ide> end
<ide>
<ide> def test_eager_with_has_many_through_an_sti_join_model
<del> author = Author.all.merge!(includes: :special_post_comments, order: Arel.sql("authors.id")).first
<add> author = Author.all.merge!(includes: :special_post_comments, order: "authors.id").first
<ide> assert_equal [comments(:does_it_hurt)], assert_no_queries { author.special_post_comments }
<ide> end
<ide>
<ide> def test_preloading_has_many_through_with_implicit_source
<ide> end
<ide>
<ide> def test_eager_with_has_many_through_an_sti_join_model_with_conditions_on_both
<del> author = Author.all.merge!(includes: :special_nonexistent_post_comments, order: Arel.sql("authors.id")).first
<add> author = Author.all.merge!(includes: :special_nonexistent_post_comments, order: "authors.id").first
<ide> assert_equal [], author.special_nonexistent_post_comments
<ide> end
<ide>
<ide> def test_eager_with_has_many_through_join_model_with_conditions
<ide> assert_equal Author.all.merge!(includes: :hello_post_comments,
<del> order: Arel.sql("authors.id")).first.hello_post_comments.sort_by(&:id),
<del> Author.all.merge!(order: Arel.sql("authors.id")).first.hello_post_comments.sort_by(&:id)
<add> order: "authors.id").first.hello_post_comments.sort_by(&:id),
<add> Author.all.merge!(order: "authors.id").first.hello_post_comments.sort_by(&:id)
<ide> end
<ide>
<ide> def test_eager_with_has_many_through_join_model_with_conditions_on_top_level
<ide> def test_eager_with_has_many_through_join_model_ignores_default_includes
<ide> end
<ide>
<ide> def test_eager_with_has_many_and_limit
<del> posts = Post.all.merge!(order: Arel.sql("posts.id asc"), includes: [ :author, :comments ], limit: 2).to_a
<add> posts = Post.all.merge!(order: "posts.id asc", includes: [ :author, :comments ], limit: 2).to_a
<ide> assert_equal 2, posts.size
<ide> assert_equal 3, posts.inject(0) { |sum, post| sum + post.comments.size }
<ide> end
<ide>
<ide> def test_eager_with_has_many_and_limit_and_conditions
<del> posts = Post.all.merge!(includes: [ :author, :comments ], limit: 2, where: "posts.body = 'hello'", order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: [ :author, :comments ], limit: 2, where: "posts.body = 'hello'", order: "posts.id").to_a
<ide> assert_equal 2, posts.size
<ide> assert_equal [4, 5], posts.collect(&:id)
<ide> end
<ide>
<ide> def test_eager_with_has_many_and_limit_and_conditions_array
<del> posts = Post.all.merge!(includes: [ :author, :comments ], limit: 2, where: [ "posts.body = ?", "hello" ], order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: [ :author, :comments ], limit: 2, where: [ "posts.body = ?", "hello" ], order: "posts.id").to_a
<ide> assert_equal 2, posts.size
<ide> assert_equal [4, 5], posts.collect(&:id)
<ide> end
<ide> def test_eager_count_performed_on_a_has_many_through_association_with_multi_tabl
<ide> end
<ide>
<ide> def test_eager_with_has_and_belongs_to_many_and_limit
<del> posts = Post.all.merge!(includes: :categories, order: Arel.sql("posts.id"), limit: 3).to_a
<add> posts = Post.all.merge!(includes: :categories, order: "posts.id", limit: 3).to_a
<ide> assert_equal 3, posts.size
<ide> assert_equal 2, posts[0].categories.size
<ide> assert_equal 1, posts[1].categories.size
<ide> def test_eager_with_has_many_and_limit_and_scoped_conditions_on_the_eagers
<ide> end
<ide>
<ide> def test_eager_association_loading_with_habtm
<del> posts = Post.all.merge!(includes: :categories, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: :categories, order: "posts.id").to_a
<ide> assert_equal 2, posts[0].categories.size
<ide> assert_equal 1, posts[1].categories.size
<ide> assert_equal 0, posts[2].categories.size
<ide> def test_limited_eager_with_multiple_order_columns
<ide> posts(:thinking, :sti_comments),
<ide> Post.all.merge!(
<ide> includes: [:author, :comments], where: { "authors.name" => "David" },
<del> order: [Arel.sql("UPPER(posts.title)"), Arel.sql("posts.id")], limit: 2, offset: 1
<add> order: [Arel.sql("UPPER(posts.title)"), "posts.id"], limit: 2, offset: 1
<ide> ).to_a
<ide> )
<ide> assert_equal(
<ide> posts(:sti_post_and_comments, :sti_comments),
<ide> Post.all.merge!(
<ide> includes: [:author, :comments], where: { "authors.name" => "David" },
<del> order: [Arel.sql("UPPER(posts.title) DESC"), Arel.sql("posts.id")], limit: 2, offset: 1
<add> order: [Arel.sql("UPPER(posts.title) DESC"), "posts.id"], limit: 2, offset: 1
<ide> ).to_a
<ide> )
<ide> end
<ide> def test_limited_eager_with_numeric_in_association
<ide> Person.references(:number1_fans_people).merge(
<ide> includes: [:readers, :primary_contact, :number1_fan],
<ide> where: "number1_fans_people.first_name like 'M%'",
<del> order: Arel.sql("people.id"), limit: 2, offset: 0
<add> order: "people.id", limit: 2, offset: 0
<ide> ).to_a
<ide> )
<ide> end
<ide> def test_eager_loading_with_order_on_joined_table_preloads
<ide>
<ide> def test_eager_loading_with_conditions_on_joined_table_preloads
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: [:comments], where: "comments.body like 'Thank you%'", order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: [:comments], where: "comments.body like 'Thank you%'", order: "posts.id").to_a
<ide> end
<ide> assert_equal [posts(:welcome)], posts
<ide> assert_equal authors(:david), assert_no_queries { posts[0].author }
<ide>
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(includes: :author, joins: { taggings: :tag }, where: "tags.name = 'General'", order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(includes: :author, joins: { taggings: :tag }, where: "tags.name = 'General'", order: "posts.id").to_a
<ide> end
<ide> assert_equal posts(:welcome, :thinking), posts
<ide>
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(includes: :author, joins: { taggings: { tag: :taggings } }, where: "taggings_tags.super_tag_id=2", order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(includes: :author, joins: { taggings: { tag: :taggings } }, where: "taggings_tags.super_tag_id=2", order: "posts.id").to_a
<ide> end
<ide> assert_equal posts(:welcome, :thinking), posts
<ide> end
<ide> def test_preload_has_many_with_association_condition_and_default_scope
<ide>
<ide> def test_eager_loading_with_conditions_on_string_joined_table_preloads
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: "INNER JOIN comments on comments.post_id = posts.id", where: "comments.body like 'Thank you%'", order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: "INNER JOIN comments on comments.post_id = posts.id", where: "comments.body like 'Thank you%'", order: "posts.id").to_a
<ide> end
<ide> assert_equal [posts(:welcome)], posts
<ide> assert_equal authors(:david), assert_no_queries { posts[0].author }
<ide>
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: ["INNER JOIN comments on comments.post_id = posts.id"], where: "comments.body like 'Thank you%'", order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(select: "distinct posts.*", includes: :author, joins: ["INNER JOIN comments on comments.post_id = posts.id"], where: "comments.body like 'Thank you%'", order: "posts.id").to_a
<ide> end
<ide> assert_equal [posts(:welcome)], posts
<ide> assert_equal authors(:david), assert_no_queries { posts[0].author }
<ide> end
<ide>
<ide> def test_eager_loading_with_select_on_joined_table_preloads
<ide> posts = assert_queries(2) do
<del> Post.all.merge!(select: "posts.*, authors.name as author_name", includes: :comments, joins: :author, order: Arel.sql("posts.id")).to_a
<add> Post.all.merge!(select: "posts.*, authors.name as author_name", includes: :comments, joins: :author, order: "posts.id").to_a
<ide> end
<ide> assert_equal "David", posts[0].author_name
<ide> assert_equal posts(:welcome).comments, assert_no_queries { posts[0].comments }
<ide> def test_include_has_many_using_primary_key
<ide>
<ide> def test_preload_has_one_using_primary_key
<ide> expected = accounts(:signals37)
<del> firm = Firm.all.merge!(includes: :account_using_primary_key, order: Arel.sql("companies.id")).first
<add> firm = Firm.all.merge!(includes: :account_using_primary_key, order: "companies.id").first
<ide> assert_no_queries do
<ide> assert_equal expected, firm.account_using_primary_key
<ide> end
<ide> def test_preloading_polymorphic_with_custom_foreign_type
<ide> end
<ide>
<ide> def test_joins_with_includes_should_preload_via_joins
<del> post = assert_queries(1) { Post.includes(:comments).joins(:comments).order(Arel.sql("posts.id desc")).to_a.first }
<add> post = assert_queries(1) { Post.includes(:comments).joins(:comments).order("posts.id desc").to_a.first }
<ide>
<ide> assert_queries(0) do
<ide> assert_not_equal 0, post.comments.to_a.count
<ide> def test_join_eager_with_empty_order_should_generate_valid_sql
<ide>
<ide> def test_deep_including_through_habtm
<ide> # warm up habtm cache
<del> posts = Post.all.merge!(includes: { categories: :categorizations }, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: { categories: :categorizations }, order: "posts.id").to_a
<ide> posts[0].categories[0].categorizations.length
<ide>
<del> posts = Post.all.merge!(includes: { categories: :categorizations }, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(includes: { categories: :categorizations }, order: "posts.id").to_a
<ide> assert_no_queries { assert_equal 2, posts[0].categories[0].categorizations.length }
<ide> assert_no_queries { assert_equal 1, posts[0].categories[1].categorizations.length }
<ide> assert_no_queries { assert_equal 2, posts[1].categories[0].categorizations.length }
<ide> def test_preloading_has_many_through_with_custom_scope
<ide>
<ide> private
<ide> def find_all_ordered(klass, include = nil)
<del> klass.order(Arel.sql("#{klass.table_name}.#{klass.primary_key}")).includes(include).to_a
<add> klass.order("#{klass.table_name}.#{klass.primary_key}").includes(include).to_a
<ide> end
<ide> end
<ide><path>activerecord/test/cases/associations/has_many_associations_test.rb
<ide> def test_should_generate_valid_sql
<ide> author = authors(:david)
<ide> # this can fail on adapters which require ORDER BY expressions to be included in the SELECT expression
<ide> # if the reorder clauses are not correctly handled
<del> assert author.posts_with_comments_sorted_by_comment_id.where("comments.id > 0").reorder(Arel.sql("posts.comments_count DESC"), Arel.sql("posts.tags_count DESC")).last
<add> assert author.posts_with_comments_sorted_by_comment_id.where("comments.id > 0").reorder("posts.comments_count DESC", "posts.tags_count DESC").last
<ide> end
<ide> end
<ide>
<ide> def test_has_many_without_counter_cache_option
<ide> end
<ide>
<ide> def test_deleting_updates_counter_cache
<del> topic = Topic.order(Arel.sql("id ASC")).first
<add> topic = Topic.order("id ASC").first
<ide> assert_equal topic.replies.to_a.size, topic.replies_count
<ide>
<ide> topic.replies.delete(topic.replies.first)
<ide> def test_counter_cache_updates_in_memory_after_create_with_array
<ide> end
<ide>
<ide> def test_pushing_association_updates_counter_cache
<del> topic = Topic.order(Arel.sql("id ASC")).first
<add> topic = Topic.order("id ASC").first
<ide> reply = Reply.create!
<ide>
<ide> assert_difference "topic.reload.replies_count", 1 do
<ide> def test_custom_named_counter_cache
<ide> end
<ide>
<ide> def test_calling_update_attributes_on_id_changes_the_counter_cache
<del> topic = Topic.order(Arel.sql("id ASC")).first
<add> topic = Topic.order("id ASC").first
<ide> original_count = topic.replies.to_a.size
<ide> assert_equal original_count, topic.replies_count
<ide>
<ide><path>activerecord/test/cases/associations/has_many_through_associations_test.rb
<ide> def test_ordered_has_many_through
<ide> def self.name; "Person"; end
<ide>
<ide> has_many :readers
<del> has_many :posts, -> { order(Arel.sql("posts.id DESC")) }, through: :readers
<add> has_many :posts, -> { order("posts.id DESC") }, through: :readers
<ide> end
<ide> posts = person_prime.includes(:posts).first.posts
<ide>
<ide> def test_joining_has_many_through_with_distinct
<ide> end
<ide>
<ide> def test_joining_has_many_through_belongs_to
<del> posts = Post.joins(:author_categorizations).order(Arel.sql("posts.id")).
<add> posts = Post.joins(:author_categorizations).order("posts.id").
<ide> where("categorizations.id" => categorizations(:mary_thinking_sti).id)
<ide>
<ide> assert_equal [posts(:eager_other), posts(:misc_by_mary), posts(:other_by_mary)], posts
<ide><path>activerecord/test/cases/associations/join_model_test.rb
<ide> def test_create_through_has_many_with_piggyback
<ide> end
<ide>
<ide> def test_include_has_many_through
<del> posts = Post.all.merge!(order: Arel.sql("posts.id")).to_a
<del> posts_with_authors = Post.all.merge!(includes: :authors, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(order: "posts.id").to_a
<add> posts_with_authors = Post.all.merge!(includes: :authors, order: "posts.id").to_a
<ide> assert_equal posts.length, posts_with_authors.length
<ide> posts.length.times do |i|
<ide> assert_equal posts[i].authors.length, assert_no_queries { posts_with_authors[i].authors.length }
<ide> def test_include_polymorphic_has_one_defined_in_abstract_parent
<ide> end
<ide>
<ide> def test_include_polymorphic_has_many_through
<del> posts = Post.all.merge!(order: Arel.sql("posts.id")).to_a
<del> posts_with_tags = Post.all.merge!(includes: :tags, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(order: "posts.id").to_a
<add> posts_with_tags = Post.all.merge!(includes: :tags, order: "posts.id").to_a
<ide> assert_equal posts.length, posts_with_tags.length
<ide> posts.length.times do |i|
<ide> assert_equal posts[i].tags.length, assert_no_queries { posts_with_tags[i].tags.length }
<ide> end
<ide> end
<ide>
<ide> def test_include_polymorphic_has_many
<del> posts = Post.all.merge!(order: Arel.sql("posts.id")).to_a
<del> posts_with_taggings = Post.all.merge!(includes: :taggings, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(order: "posts.id").to_a
<add> posts_with_taggings = Post.all.merge!(includes: :taggings, order: "posts.id").to_a
<ide> assert_equal posts.length, posts_with_taggings.length
<ide> posts.length.times do |i|
<ide> assert_equal posts[i].taggings.length, assert_no_queries { posts_with_taggings[i].taggings.length }
<ide> def test_has_many_through_with_custom_primary_key_on_belongs_to_source
<ide> end
<ide>
<ide> def test_has_many_through_with_custom_primary_key_on_has_many_source
<del> assert_equal [authors(:david), authors(:bob)], posts(:thinking).authors_using_custom_pk.order(Arel.sql("authors.id"))
<add> assert_equal [authors(:david), authors(:bob)], posts(:thinking).authors_using_custom_pk.order("authors.id")
<ide> end
<ide>
<ide> def test_belongs_to_polymorphic_with_counter_cache
<ide> def test_eager_has_many_polymorphic_with_source_type
<ide> end
<ide>
<ide> def test_has_many_through_has_many_find_all
<del> assert_equal comments(:greetings), authors(:david).comments.order(Arel.sql("comments.id")).to_a.first
<add> assert_equal comments(:greetings), authors(:david).comments.order("comments.id").to_a.first
<ide> end
<ide>
<ide> def test_has_many_through_has_many_find_all_with_custom_class
<del> assert_equal comments(:greetings), authors(:david).funky_comments.order(Arel.sql("comments.id")).to_a.first
<add> assert_equal comments(:greetings), authors(:david).funky_comments.order("comments.id").to_a.first
<ide> end
<ide>
<ide> def test_has_many_through_has_many_find_first
<del> assert_equal comments(:greetings), authors(:david).comments.order(Arel.sql("comments.id")).first
<add> assert_equal comments(:greetings), authors(:david).comments.order("comments.id").first
<ide> end
<ide>
<ide> def test_has_many_through_has_many_find_conditions
<del> options = { where: "comments.#{QUOTED_TYPE}='SpecialComment'", order: Arel.sql("comments.id") }
<add> options = { where: "comments.#{QUOTED_TYPE}='SpecialComment'", order: "comments.id" }
<ide> assert_equal comments(:does_it_hurt), authors(:david).comments.merge(options).first
<ide> end
<ide>
<ide> def test_polymorphic_belongs_to
<ide> end
<ide>
<ide> def test_preload_polymorphic_has_many_through
<del> posts = Post.all.merge!(order: Arel.sql("posts.id")).to_a
<del> posts_with_tags = Post.all.merge!(includes: :tags, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(order: "posts.id").to_a
<add> posts_with_tags = Post.all.merge!(includes: :tags, order: "posts.id").to_a
<ide> assert_equal posts.length, posts_with_tags.length
<ide> posts.length.times do |i|
<ide> assert_equal posts[i].tags.length, assert_no_queries { posts_with_tags[i].tags.length }
<ide> def test_preload_nil_polymorphic_belongs_to
<ide> end
<ide>
<ide> def test_preload_polymorphic_has_many
<del> posts = Post.all.merge!(order: Arel.sql("posts.id")).to_a
<del> posts_with_taggings = Post.all.merge!(includes: :taggings, order: Arel.sql("posts.id")).to_a
<add> posts = Post.all.merge!(order: "posts.id").to_a
<add> posts_with_taggings = Post.all.merge!(includes: :taggings, order: "posts.id").to_a
<ide> assert_equal posts.length, posts_with_taggings.length
<ide> posts.length.times do |i|
<ide> assert_equal posts[i].taggings.length, assert_no_queries { posts_with_taggings[i].taggings.length }
<ide><path>activerecord/test/cases/associations/nested_through_associations_test.rb
<ide> def test_has_many_through_has_many_with_has_many_through_source_reflection_prelo
<ide> # Through: has_many through
<ide> def test_has_many_through_has_many_through_with_has_many_source_reflection
<ide> luke, david = subscribers(:first), subscribers(:second)
<del> assert_equal [luke, david, david], authors(:david).subscribers.order(Arel.sql("subscribers.nick"))
<add> assert_equal [luke, david, david], authors(:david).subscribers.order("subscribers.nick")
<ide> end
<ide>
<ide> def test_has_many_through_has_many_through_with_has_many_source_reflection_preload
<ide> def test_has_many_through_has_one_with_has_many_through_source_reflection
<ide> groucho_details, other_details = member_details(:groucho), member_details(:some_other_guy)
<ide>
<ide> assert_equal [groucho_details, other_details],
<del> members(:groucho).organization_member_details.order(Arel.sql("member_details.id"))
<add> members(:groucho).organization_member_details.order("member_details.id")
<ide> end
<ide>
<ide> def test_has_many_through_has_one_with_has_many_through_source_reflection_preload
<ide> def test_has_many_through_has_one_through_with_has_many_source_reflection
<ide> groucho_details, other_details = member_details(:groucho), member_details(:some_other_guy)
<ide>
<ide> assert_equal [groucho_details, other_details],
<del> members(:groucho).organization_member_details_2.order(Arel.sql("member_details.id"))
<add> members(:groucho).organization_member_details_2.order("member_details.id")
<ide> end
<ide>
<ide> def test_has_many_through_has_one_through_with_has_many_source_reflection_preload
<ide> def test_has_many_through_has_one_through_with_has_many_source_reflection_preloa
<ide> def test_has_many_through_has_many_with_has_and_belongs_to_many_source_reflection
<ide> general, cooking = categories(:general), categories(:cooking)
<ide>
<del> assert_equal [general, cooking], authors(:bob).post_categories.order(Arel.sql("categories.id"))
<add> assert_equal [general, cooking], authors(:bob).post_categories.order("categories.id")
<ide> end
<ide>
<ide> def test_has_many_through_has_many_with_has_and_belongs_to_many_source_reflection_preload
<ide> def test_has_many_through_has_many_with_has_and_belongs_to_many_source_reflectio
<ide> def test_has_many_through_has_and_belongs_to_many_with_has_many_source_reflection
<ide> greetings, more = comments(:greetings), comments(:more_greetings)
<ide>
<del> assert_equal [greetings, more], categories(:technology).post_comments.order(Arel.sql("comments.id"))
<add> assert_equal [greetings, more], categories(:technology).post_comments.order("comments.id")
<ide> end
<ide>
<ide> def test_has_many_through_has_and_belongs_to_many_with_has_many_source_reflection_preload
<ide> def test_has_many_through_has_and_belongs_to_many_with_has_many_source_reflectio
<ide> Category.joins(:post_comments).first
<ide>
<ide> assert_includes_and_joins_equal(
<del> Category.where("comments.id" => comments(:more_greetings).id).order(Arel.sql("categories.id")),
<add> Category.where("comments.id" => comments(:more_greetings).id).order("categories.id"),
<ide> [categories(:general), categories(:technology)], :post_comments
<ide> )
<ide> end
<ide> def test_has_many_through_has_and_belongs_to_many_with_has_many_source_reflectio
<ide> def test_has_many_through_has_many_with_has_many_through_habtm_source_reflection
<ide> greetings, more = comments(:greetings), comments(:more_greetings)
<ide>
<del> assert_equal [greetings, more], authors(:bob).category_post_comments.order(Arel.sql("comments.id"))
<add> assert_equal [greetings, more], authors(:bob).category_post_comments.order("comments.id")
<ide> end
<ide>
<ide> def test_has_many_through_has_many_with_has_many_through_habtm_source_reflection_preload
<ide> def test_has_many_through_has_many_with_has_many_through_habtm_source_reflection
<ide> Author.joins(:category_post_comments).first
<ide>
<ide> assert_includes_and_joins_equal(
<del> Author.where("comments.id" => comments(:does_it_hurt).id).order(Arel.sql("authors.id")),
<add> Author.where("comments.id" => comments(:does_it_hurt).id).order("authors.id"),
<ide> [authors(:david), authors(:mary)], :category_post_comments
<ide> )
<ide> end
<ide> def test_has_many_through_belongs_to_with_has_many_through_source_reflection
<ide> welcome_general, thinking_general = taggings(:welcome_general), taggings(:thinking_general)
<ide>
<ide> assert_equal [welcome_general, thinking_general],
<del> categorizations(:david_welcome_general).post_taggings.order(Arel.sql("taggings.id"))
<add> categorizations(:david_welcome_general).post_taggings.order("taggings.id")
<ide> end
<ide>
<ide> def test_has_many_through_belongs_to_with_has_many_through_source_reflection_preload
<ide> def test_distinct_has_many_through_a_has_many_through_association_on_source_refl
<ide> def test_distinct_has_many_through_a_has_many_through_association_on_through_reflection
<ide> author = authors(:david)
<ide> assert_equal [subscribers(:first), subscribers(:second)],
<del> author.distinct_subscribers.order(Arel.sql("subscribers.nick"))
<add> author.distinct_subscribers.order("subscribers.nick")
<ide> end
<ide>
<ide> def test_nested_has_many_through_with_a_table_referenced_multiple_times
<ide> def test_nested_has_many_through_with_scope_on_polymorphic_reflection
<ide> end
<ide>
<ide> def test_has_many_through_with_foreign_key_option_on_through_reflection
<del> assert_equal [posts(:welcome), posts(:authorless)], people(:david).agents_posts.order(Arel.sql("posts.id"))
<add> assert_equal [posts(:welcome), posts(:authorless)], people(:david).agents_posts.order("posts.id")
<ide> assert_equal [authors(:david)], references(:david_unicyclist).agents_posts_authors
<ide>
<ide> references = Reference.joins(:agents_posts_authors).where("authors.id" => authors(:david).id)
<ide> assert_equal [references(:david_unicyclist)], references
<ide> end
<ide>
<ide> def test_has_many_through_with_foreign_key_option_on_source_reflection
<del> assert_equal [people(:michael), people(:susan)], jobs(:unicyclist).agents.order(Arel.sql("people.id"))
<add> assert_equal [people(:michael), people(:susan)], jobs(:unicyclist).agents.order("people.id")
<ide>
<ide> jobs = Job.joins(:agents)
<ide> assert_equal [jobs(:unicyclist), jobs(:unicyclist)], jobs
<ide><path>activerecord/test/cases/base_test.rb
<ide> def test_singular_table_name_guesses_for_individual_table
<ide>
<ide> if current_adapter?(:Mysql2Adapter)
<ide> def test_update_all_with_order_and_limit
<del> assert_equal 1, Topic.limit(1).order(Arel.sql("id DESC")).update_all(content: "bulk updated!")
<add> assert_equal 1, Topic.limit(1).order("id DESC").update_all(content: "bulk updated!")
<ide> end
<ide> end
<ide>
<ide> def test_no_limit_offset
<ide>
<ide> def test_find_last
<ide> last = Developer.last
<del> assert_equal last, Developer.all.merge!(order: Arel.sql("id desc")).first
<add> assert_equal last, Developer.all.merge!(order: "id desc").first
<ide> end
<ide>
<ide> def test_last
<del> assert_equal Developer.all.merge!(order: Arel.sql("id desc")).first, Developer.last
<add> assert_equal Developer.all.merge!(order: "id desc").first, Developer.last
<ide> end
<ide>
<ide> def test_all
<ide> def test_all
<ide> end
<ide>
<ide> def test_all_with_conditions
<del> assert_equal Developer.all.merge!(order: Arel.sql("id desc")).to_a, Developer.order(Arel.sql("id desc")).to_a
<add> assert_equal Developer.all.merge!(order: "id desc").to_a, Developer.order("id desc").to_a
<ide> end
<ide>
<ide> def test_find_ordered_last
<del> last = Developer.all.merge!(order: Arel.sql("developers.salary ASC")).last
<del> assert_equal last, Developer.all.merge!(order: Arel.sql("developers.salary ASC")).to_a.last
<add> last = Developer.all.merge!(order: "developers.salary ASC").last
<add> assert_equal last, Developer.all.merge!(order: "developers.salary ASC").to_a.last
<ide> end
<ide>
<ide> def test_find_reverse_ordered_last
<del> last = Developer.all.merge!(order: Arel.sql("developers.salary DESC")).last
<del> assert_equal last, Developer.all.merge!(order: Arel.sql("developers.salary DESC")).to_a.last
<add> last = Developer.all.merge!(order: "developers.salary DESC").last
<add> assert_equal last, Developer.all.merge!(order: "developers.salary DESC").to_a.last
<ide> end
<ide>
<ide> def test_find_multiple_ordered_last
<ide> def test_find_multiple_ordered_last
<ide>
<ide> def test_find_keeps_multiple_order_values
<ide> combined = Developer.all.merge!(order: Arel.sql("developers.name, developers.salary")).to_a
<del> assert_equal combined, Developer.all.merge!(order: [Arel.sql("developers.name"), Arel.sql("developers.salary")]).to_a
<add> assert_equal combined, Developer.all.merge!(order: ["developers.name", "developers.salary"]).to_a
<ide> end
<ide>
<ide> def test_find_keeps_multiple_group_values
<ide><path>activerecord/test/cases/batches_test.rb
<ide> class EachTest < ActiveRecord::TestCase
<ide> fixtures :posts, :subscribers
<ide>
<ide> def setup
<del> @posts = Post.order(Arel.sql("id asc"))
<add> @posts = Post.order("id asc")
<ide> @total = Post.count
<ide> Post.count("id") # preheat arel's table cache
<ide> end
<ide> def test_logger_not_required
<ide> previous_logger = ActiveRecord::Base.logger
<ide> ActiveRecord::Base.logger = nil
<ide> assert_nothing_raised do
<del> Post.order(Arel.sql("comments_count DESC")).find_each { |post| post }
<add> Post.order("comments_count DESC").find_each { |post| post }
<ide> end
<ide> ensure
<ide> ActiveRecord::Base.logger = previous_logger
<ide> def test_find_in_batches_should_not_modify_passed_options
<ide> end
<ide>
<ide> def test_find_in_batches_should_use_any_column_as_primary_key
<del> nick_order_subscribers = Subscriber.order(Arel.sql("nick asc"))
<add> nick_order_subscribers = Subscriber.order("nick asc")
<ide> start_nick = nick_order_subscribers.second.nick
<ide>
<ide> subscribers = []
<ide> def test_in_batches_each_record_should_return_enumerator_if_no_block_given
<ide> end
<ide>
<ide> def test_in_batches_each_record_should_be_ordered_by_id
<del> ids = Post.order(Arel.sql("id ASC")).pluck(:id)
<add> ids = Post.order("id ASC").pluck(:id)
<ide> assert_queries(6) do
<ide> Post.in_batches(of: 2).each_record.with_index do |post, i|
<ide> assert_equal ids[i], post.id
<ide> def test_in_batches_should_return_relations
<ide> end
<ide>
<ide> def test_in_batches_should_start_from_the_start_option
<del> post = Post.order(Arel.sql("id ASC")).where("id >= ?", 2).first
<add> post = Post.order("id ASC").where("id >= ?", 2).first
<ide> assert_queries(2) do
<ide> relation = Post.in_batches(of: 1, start: 2).first
<ide> assert_equal post, relation.first
<ide> end
<ide> end
<ide>
<ide> def test_in_batches_should_end_at_the_finish_option
<del> post = Post.order(Arel.sql("id DESC")).where("id <= ?", 5).first
<add> post = Post.order("id DESC").where("id <= ?", 5).first
<ide> assert_queries(7) do
<ide> relation = Post.in_batches(of: 1, finish: 5, load: true).reverse_each.first
<ide> assert_equal post, relation.last
<ide> def test_in_batches_should_not_modify_passed_options
<ide> end
<ide>
<ide> def test_in_batches_should_use_any_column_as_primary_key
<del> nick_order_subscribers = Subscriber.order(Arel.sql("nick asc"))
<add> nick_order_subscribers = Subscriber.order("nick asc")
<ide> start_nick = nick_order_subscribers.second.nick
<ide>
<ide> subscribers = []
<ide><path>activerecord/test/cases/finder_test.rb
<ide> def test_hash_condition_find_with_escaped_characters
<ide> end
<ide>
<ide> def test_hash_condition_find_with_array
<del> p1, p2 = Post.limit(2).order(Arel.sql("id asc")).to_a
<del> assert_equal [p1, p2], Post.where(id: [p1, p2]).order(Arel.sql("id asc")).to_a
<del> assert_equal [p1, p2], Post.where(id: [p1, p2.id]).order(Arel.sql("id asc")).to_a
<add> p1, p2 = Post.limit(2).order("id asc").to_a
<add> assert_equal [p1, p2], Post.where(id: [p1, p2]).order("id asc").to_a
<add> assert_equal [p1, p2], Post.where(id: [p1, p2.id]).order("id asc").to_a
<ide> end
<ide>
<ide> def test_hash_condition_find_with_nil
<ide> class << Account; self; end.send(:remove_method, :find_by_credit_limit) if Accou
<ide> end
<ide>
<ide> def test_find_by_one_attribute_with_several_options
<del> assert_equal accounts(:unknown), Account.order(Arel.sql("id DESC")).where("id != ?", 3).find_by_credit_limit(50)
<add> assert_equal accounts(:unknown), Account.order("id DESC").where("id != ?", 3).find_by_credit_limit(50)
<ide> end
<ide>
<ide> def test_find_by_one_missing_attribute
<ide> def test_find_last_with_offset
<ide>
<ide> assert_equal devs[2], Developer.offset(2).first
<ide> assert_equal devs[-3], Developer.offset(2).last
<del> assert_equal devs[-3], Developer.offset(2).order(Arel.sql("id DESC")).first
<add> assert_equal devs[-3], Developer.offset(2).order("id DESC").first
<ide> end
<ide>
<ide> def test_find_by_nil_attribute
<ide> def test_find_by_empty_in_condition
<ide> end
<ide>
<ide> def test_find_by_records
<del> p1, p2 = Post.limit(2).order(Arel.sql("id asc")).to_a
<del> assert_equal [p1, p2], Post.where(["id in (?)", [p1, p2]]).order(Arel.sql("id asc"))
<del> assert_equal [p1, p2], Post.where(["id in (?)", [p1, p2.id]]).order(Arel.sql("id asc"))
<add> p1, p2 = Post.limit(2).order("id asc").to_a
<add> assert_equal [p1, p2], Post.where(["id in (?)", [p1, p2]]).order("id asc")
<add> assert_equal [p1, p2], Post.where(["id in (?)", [p1, p2.id]]).order("id asc")
<ide> end
<ide>
<ide> def test_select_value
<ide> def test_find_with_nil_inside_set_passed_for_one_attribute
<ide> client_of = Company.
<ide> where(client_of: [2, 1, nil],
<ide> name: ["37signals", "Summit", "Microsoft"]).
<del> order(Arel.sql("client_of DESC")).
<add> order("client_of DESC").
<ide> map(&:client_of)
<ide>
<ide> assert_includes client_of, nil
<ide> def test_find_with_nil_inside_set_passed_for_one_attribute
<ide> def test_find_with_nil_inside_set_passed_for_attribute
<ide> client_of = Company.
<ide> where(client_of: [nil]).
<del> order(Arel.sql("client_of DESC")).
<add> order("client_of DESC").
<ide> map(&:client_of)
<ide>
<ide> assert_equal [], client_of.compact
<ide> def test_find_with_nil_inside_set_passed_for_attribute
<ide> def test_with_limiting_with_custom_select
<ide> posts = Post.references(:authors).merge(
<ide> includes: :author, select: 'posts.*, authors.id as "author_id"',
<del> limit: 3, order: Arel.sql("posts.id")
<add> limit: 3, order: "posts.id"
<ide> ).to_a
<ide> assert_equal 3, posts.size
<ide> assert_equal [0, 1, 1], posts.map(&:author_id).sort
<ide><path>activerecord/test/cases/relation/merging_test.rb
<ide> class RelationMergingTest < ActiveRecord::TestCase
<ide> fixtures :developers, :comments, :authors, :author_addresses, :posts
<ide>
<ide> def test_relation_merging
<del> devs = Developer.where("salary >= 80000").merge(Developer.limit(2)).merge(Developer.order(Arel.sql("id ASC")).where("id < 3"))
<add> devs = Developer.where("salary >= 80000").merge(Developer.limit(2)).merge(Developer.order("id ASC").where("id < 3"))
<ide> assert_equal [developers(:david), developers(:jamis)], devs.to_a
<ide>
<del> dev_with_count = Developer.limit(1).merge(Developer.order(Arel.sql("id DESC"))).merge(Developer.select("developers.*"))
<add> dev_with_count = Developer.limit(1).merge(Developer.order("id DESC")).merge(Developer.select("developers.*"))
<ide> assert_equal [developers(:poor_jamis)], dev_with_count.to_a
<ide> end
<ide>
<ide> def test_relation_merging_with_eager_load
<ide> end
<ide>
<ide> def test_relation_merging_with_locks
<del> devs = Developer.lock.where("salary >= 80000").order(Arel.sql("id DESC")).merge(Developer.limit(2))
<add> devs = Developer.lock.where("salary >= 80000").order("id DESC").merge(Developer.limit(2))
<ide> assert devs.locked?
<ide> end
<ide>
<ide> class MergingDifferentRelationsTest < ActiveRecord::TestCase
<ide>
<ide> test "merging where relations" do
<ide> hello_by_bob = Post.where(body: "hello").joins(:author).
<del> merge(Author.where(name: "Bob")).order(Arel.sql("posts.id")).pluck(Arel.sql("posts.id"))
<add> merge(Author.where(name: "Bob")).order("posts.id").pluck("posts.id")
<ide>
<ide> assert_equal [posts(:misc_by_bob).id,
<ide> posts(:other_by_bob).id], hello_by_bob
<ide><path>activerecord/test/cases/relation/or_test.rb
<ide> def test_or_without_right_where
<ide> end
<ide>
<ide> def test_or_preserves_other_querying_methods
<del> expected = Post.where("id = 1 or id = 2 or id = 3").order(Arel.sql("body asc")).to_a
<del> partial = Post.order(Arel.sql("body asc"))
<add> expected = Post.where("id = 1 or id = 2 or id = 3").order("body asc").to_a
<add> partial = Post.order("body asc")
<ide> assert_equal expected, partial.where("id = 1").or(partial.where(id: [2, 3])).to_a
<del> assert_equal expected, Post.order(Arel.sql("body asc")).where("id = 1").or(Post.order(Arel.sql("body asc")).where(id: [2, 3])).to_a
<add> assert_equal expected, Post.order("body asc").where("id = 1").or(Post.order("body asc").where(id: [2, 3])).to_a
<ide> end
<ide>
<ide> def test_or_with_incompatible_relations
<ide> error = assert_raises ArgumentError do
<del> Post.order(Arel.sql("body asc")).where("id = 1").or(Post.order(Arel.sql("id desc")).where(id: [2, 3])).to_a
<add> Post.order("body asc").where("id = 1").or(Post.order("id desc").where(id: [2, 3])).to_a
<ide> end
<ide>
<ide> assert_equal "Relation passed to #or must be structurally compatible. Incompatible values: [:order]", error.message
<ide> def test_or_with_unscope_where_column
<ide>
<ide> def test_or_with_unscope_order
<ide> expected = Post.where("id = 1 or id = 2")
<del> assert_equal expected, Post.order(Arel.sql("body asc")).where("id = 1").unscope(:order).or(Post.where("id = 2")).to_a
<add> assert_equal expected, Post.order("body asc").where("id = 1").unscope(:order).or(Post.where("id = 2")).to_a
<ide> end
<ide>
<ide> def test_or_with_incompatible_unscope
<ide> error = assert_raises ArgumentError do
<del> Post.order(Arel.sql("body asc")).where("id = 1").or(Post.order(Arel.sql("body asc")).where("id = 2").unscope(:order)).to_a
<add> Post.order("body asc").where("id = 1").or(Post.order("body asc").where("id = 2").unscope(:order)).to_a
<ide> end
<ide>
<ide> assert_equal "Relation passed to #or must be structurally compatible. Incompatible values: [:order]", error.message
<ide> def test_or_with_named_scope
<ide> end
<ide>
<ide> def test_or_inside_named_scope
<del> expected = Post.where("body LIKE '\%a\%' OR title LIKE ?", "%'%").order(Arel.sql("id DESC")).to_a
<add> expected = Post.where("body LIKE '\%a\%' OR title LIKE ?", "%'%").order("id DESC").to_a
<ide> assert_equal expected, Post.order(id: :desc).typographically_interesting
<ide> end
<ide>
<ide><path>activerecord/test/cases/relations_test.rb
<ide> def test_loaded_all
<ide> end
<ide>
<ide> def test_scoped_first
<del> topics = Topic.all.order(Arel.sql("id ASC"))
<add> topics = Topic.all.order("id ASC")
<ide>
<ide> assert_queries(1) do
<ide> 2.times { assert_equal "The First Topic", topics.first.title }
<ide> def test_scoped_first
<ide> end
<ide>
<ide> def test_loaded_first
<del> topics = Topic.all.order(Arel.sql("id ASC"))
<add> topics = Topic.all.order("id ASC")
<ide> topics.load # force load
<ide>
<ide> assert_no_queries do
<ide> def test_loaded_first
<ide> end
<ide>
<ide> def test_loaded_first_with_limit
<del> topics = Topic.all.order(Arel.sql("id ASC"))
<add> topics = Topic.all.order("id ASC")
<ide> topics.load # force load
<ide>
<ide> assert_no_queries do
<ide> def test_loaded_first_with_limit
<ide> end
<ide>
<ide> def test_first_get_more_than_available
<del> topics = Topic.all.order(Arel.sql("id ASC"))
<add> topics = Topic.all.order("id ASC")
<ide> unloaded_first = topics.first(10)
<ide> topics.load # force load
<ide>
<ide> def test_finding_with_assoc_order
<ide> end
<ide>
<ide> def test_finding_with_arel_assoc_order
<del> topics = Topic.order(Arel.sql("id") => :desc)
<add> topics = Topic.order("id" => :desc)
<ide> assert_equal 5, topics.to_a.size
<ide> assert_equal topics(:fifth).title, topics.first.title
<ide> end
<ide> def test_finding_with_reversed_assoc_order
<ide> end
<ide>
<ide> def test_finding_with_reversed_arel_assoc_order
<del> topics = Topic.order(Arel.sql("id") => :asc).reverse_order
<add> topics = Topic.order("id" => :asc).reverse_order
<ide> assert_equal 5, topics.to_a.size
<ide> assert_equal topics(:fifth).title, topics.first.title
<ide> end
<ide> def test_support_upper_and_lower_case_directions
<ide> end
<ide>
<ide> def test_raising_exception_on_invalid_hash_params
<del> e = assert_raise(ArgumentError) { Topic.order(Arel.sql("name"), Arel.sql("id DESC"), id: :asfsdf) }
<add> e = assert_raise(ArgumentError) { Topic.order(Arel.sql("name"), "id DESC", id: :asfsdf) }
<ide> assert_equal 'Direction "asfsdf" is invalid. Valid directions are: [:asc, :desc, :ASC, :DESC, "asc", "desc", "ASC", "DESC"]', e.message
<ide> end
<ide>
<ide> def test_finding_with_assoc_reorder_by_aliased_attributes
<ide> end
<ide>
<ide> def test_finding_with_order_and_take
<del> entrants = Entrant.order(Arel.sql("id ASC")).limit(2).to_a
<add> entrants = Entrant.order("id ASC").limit(2).to_a
<ide>
<ide> assert_equal 2, entrants.size
<ide> assert_equal entrants(:first).name, entrants.first.name
<ide> def test_finding_with_order_and_take
<ide> def test_finding_with_cross_table_order_and_limit
<ide> tags = Tag.includes(:taggings).
<ide> order(
<del> Arel.sql("tags.name asc"),
<add> "tags.name asc",
<ide> Arel.sql("taggings.taggable_id asc"),
<ide> Arel.sql("REPLACE('abc', taggings.taggable_type, taggings.taggable_type)")
<ide> ).limit(1).to_a
<ide> def test_finding_with_sanitized_order
<ide> end
<ide>
<ide> def test_finding_with_order_limit_and_offset
<del> entrants = Entrant.order(Arel.sql("id ASC")).limit(2).offset(1)
<add> entrants = Entrant.order("id ASC").limit(2).offset(1)
<ide>
<ide> assert_equal 2, entrants.to_a.size
<ide> assert_equal entrants(:second).name, entrants.first.name
<ide>
<del> entrants = Entrant.order(Arel.sql("id ASC")).limit(2).offset(2)
<add> entrants = Entrant.order("id ASC").limit(2).offset(2)
<ide> assert_equal 1, entrants.to_a.size
<ide> assert_equal entrants(:third).name, entrants.first.name
<ide> end
<ide> def test_eager_association_loading_of_stis_with_multiple_references
<ide>
<ide> def test_find_with_preloaded_associations
<ide> assert_queries(2) do
<del> posts = Post.preload(:comments).order(Arel.sql("posts.id"))
<add> posts = Post.preload(:comments).order("posts.id")
<ide> assert posts.first.comments.first
<ide> end
<ide>
<ide> assert_queries(2) do
<del> posts = Post.preload(:comments).order(Arel.sql("posts.id"))
<add> posts = Post.preload(:comments).order("posts.id")
<ide> assert posts.first.comments.first
<ide> end
<ide>
<ide> assert_queries(2) do
<del> posts = Post.preload(:author).order(Arel.sql("posts.id"))
<add> posts = Post.preload(:author).order("posts.id")
<ide> assert posts.first.author
<ide> end
<ide>
<ide> assert_queries(2) do
<del> posts = Post.preload(:author).order(Arel.sql("posts.id"))
<add> posts = Post.preload(:author).order("posts.id")
<ide> assert posts.first.author
<ide> end
<ide>
<ide> assert_queries(3) do
<del> posts = Post.preload(:author, :comments).order(Arel.sql("posts.id"))
<add> posts = Post.preload(:author, :comments).order("posts.id")
<ide> assert posts.first.author
<ide> assert posts.first.comments.first
<ide> end
<ide> def test_preload_applies_to_all_chained_preloaded_scopes
<ide>
<ide> def test_find_with_included_associations
<ide> assert_queries(2) do
<del> posts = Post.includes(:comments).order(Arel.sql("posts.id"))
<add> posts = Post.includes(:comments).order("posts.id")
<ide> assert posts.first.comments.first
<ide> end
<ide>
<ide> assert_queries(2) do
<del> posts = Post.all.includes(:comments).order(Arel.sql("posts.id"))
<add> posts = Post.all.includes(:comments).order("posts.id")
<ide> assert posts.first.comments.first
<ide> end
<ide>
<ide> assert_queries(2) do
<del> posts = Post.includes(:author).order(Arel.sql("posts.id"))
<add> posts = Post.includes(:author).order("posts.id")
<ide> assert posts.first.author
<ide> end
<ide>
<ide> assert_queries(3) do
<del> posts = Post.includes(:author, :comments).order(Arel.sql("posts.id"))
<add> posts = Post.includes(:author, :comments).order("posts.id")
<ide> assert posts.first.author
<ide> assert posts.first.comments.first
<ide> end
<ide> def test_find_id
<ide> end
<ide>
<ide> def test_find_ids
<del> authors = Author.order(Arel.sql("id ASC"))
<add> authors = Author.order("id ASC")
<ide>
<ide> results = authors.find(authors(:david).id, authors(:mary).id)
<ide> assert_kind_of Array, results
<ide> def test_count_explicit_columns
<ide> end
<ide>
<ide> def test_multiple_selects
<del> post = Post.all.select("comments_count").select("title").order(Arel.sql("id ASC")).first
<add> post = Post.all.select("comments_count").select("title").order("id ASC").first
<ide> assert_equal "Welcome to the weblog", post.title
<ide> assert_equal 2, post.comments_count
<ide> end
<ide> def test_explicit_create_with
<ide> end
<ide>
<ide> def test_except
<del> relation = Post.where(author_id: 1).order(Arel.sql("id ASC")).limit(1)
<add> relation = Post.where(author_id: 1).order("id ASC").limit(1)
<ide> assert_equal [posts(:welcome)], relation.to_a
<ide>
<ide> author_posts = relation.except(:order, :limit)
<ide> def test_except
<ide> end
<ide>
<ide> def test_only
<del> relation = Post.where(author_id: 1).order(Arel.sql("id ASC")).limit(1)
<add> relation = Post.where(author_id: 1).order("id ASC").limit(1)
<ide> assert_equal [posts(:welcome)], relation.to_a
<ide>
<ide> author_posts = relation.only(:where)
<ide> def test_only
<ide> end
<ide>
<ide> def test_anonymous_extension
<del> relation = Post.where(author_id: 1).order(Arel.sql("id ASC")).extending do
<add> relation = Post.where(author_id: 1).order("id ASC").extending do
<ide> def author
<ide> "lifo"
<ide> end
<ide> def author
<ide> end
<ide>
<ide> def test_named_extension
<del> relation = Post.where(author_id: 1).order(Arel.sql("id ASC")).extending(Post::NamedExtension)
<add> relation = Post.where(author_id: 1).order("id ASC").extending(Post::NamedExtension)
<ide> assert_equal "lifo", relation.author
<ide> assert_equal "lifo", relation.limit(1).author
<ide> end
<ide> def test_default_scope_order_with_scope_order
<ide> end
<ide>
<ide> def test_order_using_scoping
<del> car1 = CoolCar.order(Arel.sql("id DESC")).scoping do
<del> CoolCar.all.merge!(order: Arel.sql("id asc")).first
<add> car1 = CoolCar.order("id DESC").scoping do
<add> CoolCar.all.merge!(order: "id asc").first
<ide> end
<ide> assert_equal "zyke", car1.name
<ide>
<del> car2 = FastCar.order(Arel.sql("id DESC")).scoping do
<del> FastCar.all.merge!(order: Arel.sql("id asc")).first
<add> car2 = FastCar.order("id DESC").scoping do
<add> FastCar.all.merge!(order: "id asc").first
<ide> end
<ide> assert_equal "zyke", car2.name
<ide> end
<ide> def test_update_all_with_joins_and_limit
<ide> end
<ide>
<ide> def test_update_all_with_joins_and_limit_and_order
<del> comments = Comment.joins(:post).where("posts.id" => posts(:welcome).id).order(Arel.sql("comments.id")).limit(1)
<add> comments = Comment.joins(:post).where("posts.id" => posts(:welcome).id).order("comments.id").limit(1)
<ide> assert_equal 1, comments.update_all(post_id: posts(:thinking).id)
<ide> assert_equal posts(:thinking), comments(:greetings).post
<ide> assert_equal posts(:welcome), comments(:more_greetings).post
<ide> def test_update_all_with_joins_and_offset
<ide> end
<ide>
<ide> def test_update_all_with_joins_and_offset_and_order
<del> all_comments = Comment.joins(:post).where("posts.id" => posts(:welcome).id).order(Arel.sql("posts.id"), Arel.sql("comments.id"))
<add> all_comments = Comment.joins(:post).where("posts.id" => posts(:welcome).id).order(Arel.sql("posts.id"), "comments.id")
<ide> count = all_comments.count
<ide> comments = all_comments.offset(1)
<ide>
<ide> def test_presence
<ide> end
<ide>
<ide> test "joins with select" do
<del> posts = Post.joins(:author).select("id", "authors.author_address_id").order(Arel.sql("posts.id")).limit(3)
<add> posts = Post.joins(:author).select("id", "authors.author_address_id").order("posts.id").limit(3)
<ide> assert_equal [1, 2, 4], posts.map(&:id)
<ide> assert_equal [1, 1, 1], posts.map(&:author_address_id)
<ide> end
<ide><path>activerecord/test/cases/scoping/default_scoping_test.rb
<ide> class DefaultScopingTest < ActiveRecord::TestCase
<ide> fixtures :developers, :posts, :comments
<ide>
<ide> def test_default_scope
<del> expected = Developer.all.merge!(order: Arel.sql("salary DESC")).to_a.collect(&:salary)
<add> expected = Developer.all.merge!(order: "salary DESC").to_a.collect(&:salary)
<ide> received = DeveloperOrderedBySalary.all.collect(&:salary)
<ide> assert_equal expected, received
<ide> end
<ide> def test_scope_overwrites_default
<ide> end
<ide>
<ide> def test_reorder_overrides_default_scope_order
<del> expected = Developer.order(Arel.sql("name DESC")).collect(&:name)
<del> received = DeveloperOrderedBySalary.reorder(Arel.sql("name DESC")).collect(&:name)
<add> expected = Developer.order("name DESC").collect(&:name)
<add> received = DeveloperOrderedBySalary.reorder("name DESC").collect(&:name)
<ide> assert_equal expected, received
<ide> end
<ide>
<ide> def test_order_after_reorder_combines_orders
<ide> expected = Developer.order(Arel.sql("name DESC, id DESC")).collect { |dev| [dev.name, dev.id] }
<del> received = Developer.order(Arel.sql("name ASC")).reorder(Arel.sql("name DESC")).order(Arel.sql("id DESC")).collect { |dev| [dev.name, dev.id] }
<add> received = Developer.order("name ASC").reorder("name DESC").order("id DESC").collect { |dev| [dev.name, dev.id] }
<ide> assert_equal expected, received
<ide> end
<ide>
<ide> def test_unscope_overrides_default_scope
<ide>
<ide> def test_unscope_after_reordering_and_combining
<ide> expected = Developer.order(Arel.sql("id DESC, name DESC")).collect { |dev| [dev.name, dev.id] }
<del> received = DeveloperOrderedBySalary.reorder(Arel.sql("name DESC")).unscope(:order).order(Arel.sql("id DESC, name DESC")).collect { |dev| [dev.name, dev.id] }
<add> received = DeveloperOrderedBySalary.reorder("name DESC").unscope(:order).order(Arel.sql("id DESC, name DESC")).collect { |dev| [dev.name, dev.id] }
<ide> assert_equal expected, received
<ide>
<ide> expected_2 = Developer.all.collect { |dev| [dev.name, dev.id] }
<ide> received_2 = Developer.order(Arel.sql("id DESC, name DESC")).unscope(:order).collect { |dev| [dev.name, dev.id] }
<ide> assert_equal expected_2, received_2
<ide>
<ide> expected_3 = Developer.all.collect { |dev| [dev.name, dev.id] }
<del> received_3 = Developer.reorder(Arel.sql("name DESC")).unscope(:order).collect { |dev| [dev.name, dev.id] }
<add> received_3 = Developer.reorder("name DESC").unscope(:order).collect { |dev| [dev.name, dev.id] }
<ide> assert_equal expected_3, received_3
<ide> end
<ide>
<ide> def test_unscope_with_where_attributes
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.where(name: "David").unscope(where: :name).collect(&:name)
<ide> assert_equal expected, received
<ide>
<del> expected_2 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_2 = Developer.order("salary DESC").collect(&:name)
<ide> received_2 = DeveloperOrderedBySalary.select("id").where("name" => "Jamis").unscope({ where: :name }, :select).collect(&:name)
<ide> assert_equal expected_2, received_2
<ide>
<del> expected_3 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_3 = Developer.order("salary DESC").collect(&:name)
<ide> received_3 = DeveloperOrderedBySalary.select("id").where("name" => "Jamis").unscope(:select, :where).collect(&:name)
<ide> assert_equal expected_3, received_3
<ide>
<del> expected_4 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_4 = Developer.order("salary DESC").collect(&:name)
<ide> received_4 = DeveloperOrderedBySalary.where.not("name" => "Jamis").unscope(where: :name).collect(&:name)
<ide> assert_equal expected_4, received_4
<ide>
<del> expected_5 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_5 = Developer.order("salary DESC").collect(&:name)
<ide> received_5 = DeveloperOrderedBySalary.where.not("name" => ["Jamis", "David"]).unscope(where: :name).collect(&:name)
<ide> assert_equal expected_5, received_5
<ide>
<del> expected_6 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_6 = Developer.order("salary DESC").collect(&:name)
<ide> received_6 = DeveloperOrderedBySalary.where(Developer.arel_table["name"].eq("David")).unscope(where: :name).collect(&:name)
<ide> assert_equal expected_6, received_6
<ide>
<del> expected_7 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_7 = Developer.order("salary DESC").collect(&:name)
<ide> received_7 = DeveloperOrderedBySalary.where(Developer.arel_table[:name].eq("David")).unscope(where: :name).collect(&:name)
<ide> assert_equal expected_7, received_7
<ide> end
<ide>
<ide> def test_unscope_comparison_where_clauses
<ide> # unscoped for WHERE (`developers`.`id` <= 2)
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.where(id: -Float::INFINITY..2).unscope(where: :id).collect { |dev| dev.name }
<ide> assert_equal expected, received
<ide>
<ide> # unscoped for WHERE (`developers`.`id` < 2)
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.where(id: -Float::INFINITY...2).unscope(where: :id).collect { |dev| dev.name }
<ide> assert_equal expected, received
<ide> end
<ide>
<ide> def test_unscope_multiple_where_clauses
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.where(name: "Jamis").where(id: 1).unscope(where: [:name, :id]).collect(&:name)
<ide> assert_equal expected, received
<ide> end
<ide>
<ide> def test_unscope_string_where_clauses_involved
<del> dev_relation = Developer.order(Arel.sql("salary DESC")).where("created_at > ?", 1.year.ago)
<add> dev_relation = Developer.order("salary DESC").where("created_at > ?", 1.year.ago)
<ide> expected = dev_relation.collect(&:name)
<ide>
<ide> dev_ordered_relation = DeveloperOrderedBySalary.where(name: "Jamis").where("created_at > ?", 1.year.ago)
<ide> def test_unscope_string_where_clauses_involved
<ide> end
<ide>
<ide> def test_unscope_with_grouping_attributes
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.group(:name).unscope(:group).collect(&:name)
<ide> assert_equal expected, received
<ide>
<del> expected_2 = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected_2 = Developer.order("salary DESC").collect(&:name)
<ide> received_2 = DeveloperOrderedBySalary.group("name").unscope(:group).collect(&:name)
<ide> assert_equal expected_2, received_2
<ide> end
<ide>
<ide> def test_unscope_with_limit_in_query
<del> expected = Developer.order(Arel.sql("salary DESC")).collect(&:name)
<add> expected = Developer.order("salary DESC").collect(&:name)
<ide> received = DeveloperOrderedBySalary.limit(1).unscope(:limit).collect(&:name)
<ide> assert_equal expected, received
<ide> end
<ide> def test_order_to_unscope_reordering
<ide>
<ide> def test_unscope_reverse_order
<ide> expected = Developer.all.collect(&:name)
<del> received = Developer.order(Arel.sql("salary DESC")).reverse_order.unscope(:order).collect(&:name)
<add> received = Developer.order("salary DESC").reverse_order.unscope(:order).collect(&:name)
<ide> assert_equal expected, received
<ide> end
<ide>
<ide> def test_unscope_select
<del> expected = Developer.order(Arel.sql("salary ASC")).collect(&:name)
<del> received = Developer.order(Arel.sql("salary DESC")).reverse_order.select(:name).unscope(:select).collect(&:name)
<add> expected = Developer.order("salary ASC").collect(&:name)
<add> received = Developer.order("salary DESC").reverse_order.select(:name).unscope(:select).collect(&:name)
<ide> assert_equal expected, received
<ide>
<ide> expected_2 = Developer.all.collect(&:id)
<ide> def test_unscope_errors_with_invalid_value
<ide> end
<ide>
<ide> assert_raises(ArgumentError) do
<del> Developer.order(Arel.sql("name DESC")).reverse_order.unscope(:reverse_order)
<add> Developer.order("name DESC").reverse_order.unscope(:reverse_order)
<ide> end
<ide>
<ide> assert_raises(ArgumentError) do
<del> Developer.order(Arel.sql("name DESC")).where(name: "Jamis").unscope()
<add> Developer.order("name DESC").where(name: "Jamis").unscope()
<ide> end
<ide> end
<ide>
<ide> def test_unscope_merging
<ide> end
<ide>
<ide> def test_order_in_default_scope_should_not_prevail
<del> expected = Developer.all.merge!(order: Arel.sql("salary desc")).to_a.collect(&:salary)
<add> expected = Developer.all.merge!(order: "salary desc").to_a.collect(&:salary)
<ide> received = DeveloperOrderedBySalary.all.merge!(order: "salary").to_a.collect(&:salary)
<ide> assert_equal expected, received
<ide> end
<ide><path>activerecord/test/cases/scoping/named_scoping_test.rb
<ide> def private_method; end
<ide>
<ide> def test_scopes_on_relations
<ide> # Topic.replied
<del> approved_topics = Topic.all.approved.order(Arel.sql("id DESC"))
<add> approved_topics = Topic.all.approved.order("id DESC")
<ide> assert_equal topics(:fifth), approved_topics.first
<ide>
<ide> replied_approved_topics = approved_topics.replied
<ide> assert_equal topics(:third), replied_approved_topics.first
<ide> end
<ide>
<ide> def test_index_on_scope
<del> approved = Topic.approved.order(Arel.sql("id ASC"))
<add> approved = Topic.approved.order("id ASC")
<ide> assert_equal topics(:second), approved[0]
<ide> assert approved.loaded?
<ide> end
<ide><path>activerecord/test/cases/scoping/relation_scoping_test.rb
<ide> def test_scope_breaks_caching_on_collections
<ide> end
<ide>
<ide> def test_reverse_order
<del> assert_equal Developer.order(Arel.sql("id DESC")).to_a.reverse, Developer.order(Arel.sql("id DESC")).reverse_order
<add> assert_equal Developer.order("id DESC").to_a.reverse, Developer.order("id DESC").reverse_order
<ide> end
<ide>
<ide> def test_reverse_order_with_arel_node
<del> assert_equal Developer.order(Arel.sql("id DESC")).to_a.reverse, Developer.order(Developer.arel_table[:id].desc).reverse_order
<add> assert_equal Developer.order("id DESC").to_a.reverse, Developer.order(Developer.arel_table[:id].desc).reverse_order
<ide> end
<ide>
<ide> def test_reverse_order_with_multiple_arel_nodes
<del> assert_equal Developer.order(Arel.sql("id DESC")).order(Arel.sql("name DESC")).to_a.reverse, Developer.order(Developer.arel_table[:id].desc).order(Developer.arel_table[:name].desc).reverse_order
<add> assert_equal Developer.order("id DESC").order("name DESC").to_a.reverse, Developer.order(Developer.arel_table[:id].desc).order(Developer.arel_table[:name].desc).reverse_order
<ide> end
<ide>
<ide> def test_reverse_order_with_arel_nodes_and_strings
<del> assert_equal Developer.order(Arel.sql("id DESC")).order(Arel.sql("name DESC")).to_a.reverse, Developer.order(Arel.sql("id DESC")).order(Developer.arel_table[:name].desc).reverse_order
<add> assert_equal Developer.order("id DESC").order("name DESC").to_a.reverse, Developer.order("id DESC").order(Developer.arel_table[:name].desc).reverse_order
<ide> end
<ide>
<ide> def test_double_reverse_order_produces_original_order
<del> assert_equal Developer.order(Arel.sql("name DESC")), Developer.order(Arel.sql("name DESC")).reverse_order.reverse_order
<add> assert_equal Developer.order("name DESC"), Developer.order("name DESC").reverse_order.reverse_order
<ide> end
<ide>
<ide> def test_scoped_find
<ide> def test_scoped_find_first
<ide> end
<ide>
<ide> def test_scoped_find_last
<del> highest_salary = Developer.order(Arel.sql("salary DESC")).first
<add> highest_salary = Developer.order("salary DESC").first
<ide>
<ide> Developer.order("salary").scoping do
<ide> assert_equal highest_salary, Developer.last
<ide> end
<ide> end
<ide>
<ide> def test_scoped_find_last_preserves_scope
<del> lowest_salary = Developer.order(Arel.sql("salary ASC")).first
<del> highest_salary = Developer.order(Arel.sql("salary DESC")).first
<add> lowest_salary = Developer.order("salary ASC").first
<add> highest_salary = Developer.order("salary DESC").first
<ide>
<ide> Developer.order("salary").scoping do
<ide> assert_equal highest_salary, Developer.last
<ide><path>activerecord/test/cases/unsafe_raw_sql_test.rb
<ide> class UnsafeRawSqlTest < ActiveRecord::TestCase
<ide> assert_equal ids_expected, ids_disabled
<ide> end
<ide>
<add> test "order: allows table and column name" do
<add> ids_expected = Post.order(Arel.sql("title")).pluck(:id)
<add>
<add> ids_depr = with_unsafe_raw_sql_deprecated { Post.order("posts.title").pluck(:id) }
<add> ids_disabled = with_unsafe_raw_sql_disabled { Post.order("posts.title").pluck(:id) }
<add>
<add> assert_equal ids_expected, ids_depr
<add> assert_equal ids_expected, ids_disabled
<add> end
<add>
<add> test "order: allows column name and direction in string" do
<add> ids_expected = Post.order(Arel.sql("title desc")).pluck(:id)
<add>
<add> ids_depr = with_unsafe_raw_sql_deprecated { Post.order("title desc").pluck(:id) }
<add> ids_disabled = with_unsafe_raw_sql_disabled { Post.order("title desc").pluck(:id) }
<add>
<add> assert_equal ids_expected, ids_depr
<add> assert_equal ids_expected, ids_disabled
<add> end
<add>
<add> test "order: allows table name, column name and direction in string" do
<add> ids_expected = Post.order(Arel.sql("title desc")).pluck(:id)
<add>
<add> ids_depr = with_unsafe_raw_sql_deprecated { Post.order("posts.title desc").pluck(:id) }
<add> ids_disabled = with_unsafe_raw_sql_disabled { Post.order("posts.title desc").pluck(:id) }
<add>
<add> assert_equal ids_expected, ids_depr
<add> assert_equal ids_expected, ids_disabled
<add> end
<add>
<ide> test "order: disallows invalid column name" do
<ide> with_unsafe_raw_sql_disabled do
<ide> assert_raises(ActiveRecord::UnknownAttributeReference) do
<del> Post.order("title asc").pluck(:id)
<add> Post.order("foo asc").pluck(:id)
<ide> end
<ide> end
<ide> end
<ide> class UnsafeRawSqlTest < ActiveRecord::TestCase
<ide> assert_equal values_expected, values_disabled
<ide> end
<ide>
<add> test "pluck: allows table and column names" do
<add> titles_expected = Post.pluck(Arel.sql("title"))
<add>
<add> titles_depr = with_unsafe_raw_sql_deprecated { Post.pluck("posts.title") }
<add> titles_disabled = with_unsafe_raw_sql_disabled { Post.pluck("posts.title") }
<add>
<add> assert_equal titles_expected, titles_depr
<add> assert_equal titles_expected, titles_disabled
<add> end
<add>
<ide> test "pluck: disallows invalid column name" do
<ide> with_unsafe_raw_sql_disabled do
<ide> assert_raises(ActiveRecord::UnknownAttributeReference) do
<ide><path>activerecord/test/models/author.rb
<ide> class Author < ActiveRecord::Base
<ide> has_many :posts_with_comments, -> { includes(:comments) }, class_name: "Post"
<ide> has_many :popular_grouped_posts, -> { includes(:comments).group("type").having("SUM(comments_count) > 1").select("type") }, class_name: "Post"
<ide> has_many :posts_with_comments_sorted_by_comment_id, -> { includes(:comments).order(Arel.sql("comments.id")) }, class_name: "Post"
<del> has_many :posts_sorted_by_id_limited, -> { order(Arel.sql("posts.id")).limit(1) }, class_name: "Post"
<add> has_many :posts_sorted_by_id_limited, -> { order("posts.id").limit(1) }, class_name: "Post"
<ide> has_many :posts_with_categories, -> { includes(:categories) }, class_name: "Post"
<del> has_many :posts_with_comments_and_categories, -> { includes(:comments, :categories).order(Arel.sql("posts.id")) }, class_name: "Post"
<add> has_many :posts_with_comments_and_categories, -> { includes(:comments, :categories).order("posts.id") }, class_name: "Post"
<ide> has_many :posts_with_special_categorizations, class_name: "PostWithSpecialCategorization"
<ide> has_one :post_about_thinking, -> { where("posts.title like '%thinking%'") }, class_name: "Post"
<ide> has_one :post_about_thinking_with_last_comment, -> { where("posts.title like '%thinking%'").includes(:last_comment) }, class_name: "Post"
<ide><path>activerecord/test/models/car.rb
<ide> class Car < ActiveRecord::Base
<ide> scope :incl_tyres, -> { includes(:tyres) }
<ide> scope :incl_engines, -> { includes(:engines) }
<ide>
<del> scope :order_using_new_style, -> { order(Arel.sql("name asc")) }
<add> scope :order_using_new_style, -> { order("name asc") }
<ide> end
<ide>
<ide> class CoolCar < Car
<del> default_scope { order(Arel.sql("name desc")) }
<add> default_scope { order("name desc") }
<ide> end
<ide>
<ide> class FastCar < Car
<del> default_scope { order(Arel.sql("name desc")) }
<add> default_scope { order("name desc") }
<ide> end
<ide><path>activerecord/test/models/company.rb
<ide> class Firm < Company
<ide> has_many :clients, -> { order "id" }, dependent: :destroy, before_remove: :log_before_remove, after_remove: :log_after_remove
<ide> has_many :unsorted_clients, class_name: "Client"
<ide> has_many :unsorted_clients_with_symbol, class_name: :Client
<del> has_many :clients_sorted_desc, -> { order Arel.sql("id DESC") }, class_name: "Client"
<add> has_many :clients_sorted_desc, -> { order "id DESC" }, class_name: "Client"
<ide> has_many :clients_of_firm, -> { order "id" }, foreign_key: "client_of", class_name: "Client", inverse_of: :firm
<ide> has_many :clients_ordered_by_name, -> { order "name" }, class_name: "Client"
<ide> has_many :unvalidated_clients_of_firm, foreign_key: "client_of", class_name: "Client", validate: false
<ide><path>activerecord/test/models/company_in_module.rb
<ide> class Company < ActiveRecord::Base
<ide>
<ide> class Firm < Company
<ide> has_many :clients, -> { order("id") }, dependent: :destroy
<del> has_many :clients_sorted_desc, -> { order(Arel.sql("id DESC")) }, class_name: "Client"
<add> has_many :clients_sorted_desc, -> { order("id DESC") }, class_name: "Client"
<ide> has_many :clients_of_firm, -> { order "id" }, foreign_key: "client_of", class_name: "Client"
<ide> has_many :clients_like_ms, -> { where("name = 'Microsoft'").order("id") }, class_name: "Client"
<ide> has_one :account, class_name: "MyApplication::Billing::Account", dependent: :destroy
<ide><path>activerecord/test/models/developer.rb
<ide>
<ide> module DeveloperProjectsAssociationExtension2
<ide> def find_least_recent
<del> order(Arel.sql("id ASC")).first
<add> order("id ASC").first
<ide> end
<ide> end
<ide>
<ide> class Developer < ActiveRecord::Base
<ide>
<ide> has_and_belongs_to_many :projects do
<ide> def find_most_recent
<del> order(Arel.sql("id DESC")).first
<add> order("id DESC").first
<ide> end
<ide> end
<ide>
<ide> def find_most_recent
<ide> join_table: "developers_projects",
<ide> association_foreign_key: "project_id" do
<ide> def find_least_recent
<del> order(Arel.sql("id ASC")).first
<add> order("id ASC").first
<ide> end
<ide> end
<ide>
<ide> class DeveloperWithIncludes < ActiveRecord::Base
<ide>
<ide> class DeveloperFilteredOnJoins < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> def self.default_scope
<ide> joins(:projects).where(projects: { name: "Active Controller" })
<ide> def self.default_scope
<ide>
<ide> class DeveloperOrderedBySalary < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> default_scope { order(Arel.sql("salary DESC")) }
<add> default_scope { order("salary DESC") }
<ide>
<del> scope :by_name, -> { order(Arel.sql("name DESC")) }
<add> scope :by_name, -> { order("name DESC") }
<ide> end
<ide>
<ide> class DeveloperCalledDavid < ActiveRecord::Base
<ide> class ModuleIncludedPoorDeveloperCalledJamis < DeveloperCalledJamis
<ide>
<ide> class EagerDeveloperWithDefaultScope < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> default_scope { includes(:projects) }
<ide> end
<ide>
<ide> class EagerDeveloperWithClassMethodDefaultScope < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> def self.default_scope
<ide> includes(:projects)
<ide> def self.default_scope
<ide>
<ide> class EagerDeveloperWithLambdaDefaultScope < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> default_scope lambda { includes(:projects) }
<ide> end
<ide>
<ide> class EagerDeveloperWithBlockDefaultScope < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> default_scope { includes(:projects) }
<ide> end
<ide>
<ide> class EagerDeveloperWithCallableDefaultScope < ActiveRecord::Base
<ide> self.table_name = "developers"
<del> has_and_belongs_to_many :projects, -> { order(Arel.sql("projects.id")) }, foreign_key: "developer_id", join_table: "developers_projects"
<add> has_and_belongs_to_many :projects, -> { order("projects.id") }, foreign_key: "developer_id", join_table: "developers_projects"
<ide>
<ide> default_scope OpenStruct.new(call: includes(:projects))
<ide> end
<ide><path>activerecord/test/models/membership.rb
<ide> class CurrentMembership < Membership
<ide> end
<ide>
<ide> class SuperMembership < Membership
<del> belongs_to :member, -> { order(Arel.sql("members.id DESC")) }
<add> belongs_to :member, -> { order("members.id DESC") }
<ide> belongs_to :club
<ide> end
<ide>
<ide><path>activerecord/test/models/pirate.rb
<ide> class Pirate < ActiveRecord::Base
<ide> belongs_to :parrot, validate: true
<ide> belongs_to :non_validated_parrot, class_name: "Parrot"
<del> has_and_belongs_to_many :parrots, -> { order(Arel.sql("parrots.id ASC")) }, validate: true
<add> has_and_belongs_to_many :parrots, -> { order("parrots.id ASC") }, validate: true
<ide> has_and_belongs_to_many :non_validated_parrots, class_name: "Parrot"
<ide> has_and_belongs_to_many :parrots_with_method_callbacks, class_name: "Parrot",
<ide> before_add: :log_before_add,
<ide> class Pirate < ActiveRecord::Base
<ide> has_one :ship
<ide> has_one :update_only_ship, class_name: "Ship"
<ide> has_one :non_validated_ship, class_name: "Ship"
<del> has_many :birds, -> { order(Arel.sql("birds.id ASC")) }
<add> has_many :birds, -> { order("birds.id ASC") }
<ide> has_many :birds_with_method_callbacks, class_name: "Bird",
<ide> before_add: :log_before_add,
<ide> after_add: :log_after_add,
<ide><path>activerecord/test/models/post.rb
<ide> def greeting
<ide> def first_comment
<ide> super.body
<ide> end
<del> has_one :first_comment, -> { order(Arel.sql("id ASC")) }, class_name: "Comment"
<del> has_one :last_comment, -> { order(Arel.sql("id desc")) }, class_name: "Comment"
<add> has_one :first_comment, -> { order("id ASC") }, class_name: "Comment"
<add> has_one :last_comment, -> { order("id desc") }, class_name: "Comment"
<ide>
<ide> scope :with_special_comments, -> { joins(:comments).where(comments: { type: "SpecialComment" }) }
<ide> scope :with_very_special_comments, -> { joins(:comments).where(comments: { type: "VerySpecialComment" }) }
<ide> def first_comment
<ide>
<ide> has_many :comments do
<ide> def find_most_recent
<del> order(Arel.sql("id DESC")).first
<add> order("id DESC").first
<ide> end
<ide>
<ide> def newest
<ide> def arel_attribute(name, table)
<ide> def enforce_raw_sql_whitelist(*args)
<ide> # noop
<ide> end
<add>
<add> def attribute_names_and_aliases
<add> []
<add> end
<ide> end
<ide> end
<ide><path>activerecord/test/models/tag.rb
<ide> class Tag < ActiveRecord::Base
<ide> class OrderedTag < Tag
<ide> self.table_name = "tags"
<ide>
<del> has_many :taggings, -> { order(Arel.sql("taggings.id DESC")) }, foreign_key: "tag_id"
<add> has_many :taggings, -> { order("taggings.id DESC") }, foreign_key: "tag_id"
<ide> has_many :tagged_posts, through: :taggings, source: "taggable", source_type: "Post"
<ide> end | 27 |
Javascript | Javascript | update typedef to be more readable with @property | f7790f838266e248d42d66e999349a1b12b1fb0b | <ide><path>lib/util/identifier.js
<ide> "use strict";
<ide> const path = require("path");
<ide>
<del>/** @typedef {{relativePaths: Map<string, string|Map<string,string>>}} MakeRelativePathsCache */
<add>/**
<add> * @typedef {Object} MakeRelativePathsCache
<add> * @property {Map<string, Map<string, string>>=} relativePaths
<add> */
<ide>
<ide> /**
<ide> * | 1 |
PHP | PHP | add methods to cast stringables | 66d37573dd6017dba1531c8ff9f294b1898351e6 | <ide><path>src/Illuminate/Support/Stringable.php
<ide> namespace Illuminate\Support;
<ide>
<ide> use Closure;
<add>use Illuminate\Support\Facades\Date;
<ide> use Illuminate\Support\Traits\Conditionable;
<ide> use Illuminate\Support\Traits\Macroable;
<ide> use Illuminate\Support\Traits\Tappable;
<ide> public function toString()
<ide> return $this->value;
<ide> }
<ide>
<add> /**
<add> * Get the underlying string value as an integer.
<add> *
<add> * @return int
<add> */
<add> public function toInteger()
<add> {
<add> return intval($this->value);
<add> }
<add>
<add> /**
<add> * Get the underlying string value as a float.
<add> *
<add> * @return float
<add> */
<add> public function toFloat()
<add> {
<add> return floatval($this->value);
<add> }
<add>
<add> /**
<add> * Get the underlying string value as a boolean.
<add> *
<add> * Returns true when value is "1", "true", "on", and "yes". Otherwise, returns false.
<add> *
<add> * @return bool
<add> */
<add> public function toBoolean()
<add> {
<add> return filter_var($this->value, FILTER_VALIDATE_BOOLEAN);
<add> }
<add>
<add> /**
<add> * Get the underlying string value as a Carbon instance.
<add> *
<add> * @param string|null $format
<add> * @param string|null $tz
<add> * @return \Illuminate\Support\Carbon
<add> *
<add> * @throws \Carbon\Exceptions\InvalidFormatException
<add> */
<add> public function toDate($format = null, $tz = null)
<add> {
<add> if (is_null($format)) {
<add> return Date::parse($this->value, $tz);
<add> }
<add>
<add> return Date::createFromFormat($format, $this->value, $tz);
<add> }
<add>
<ide> /**
<ide> * Convert the object to a string when JSON encoded.
<ide> *
<ide><path>tests/Support/SupportStringableTest.php
<ide>
<ide> namespace Illuminate\Tests\Support;
<ide>
<add>use Illuminate\Support\Carbon;
<ide> use Illuminate\Support\Collection;
<ide> use Illuminate\Support\HtmlString;
<ide> use Illuminate\Support\Stringable;
<ide> public function testExactly()
<ide> $this->assertFalse($this->stringable('[]')->exactly([]));
<ide> $this->assertFalse($this->stringable('0')->exactly(0));
<ide> }
<add>
<add> public function testToInteger()
<add> {
<add> $this->assertSame(123, $this->stringable('123')->toInteger());
<add> $this->assertSame(456, $this->stringable(456)->toInteger());
<add> $this->assertSame(78, $this->stringable('078')->toInteger());
<add> $this->assertSame(901, $this->stringable(' 901')->toInteger());
<add> $this->assertSame(0, $this->stringable('nan')->toInteger());
<add> $this->assertSame(1, $this->stringable('1ab')->toInteger());
<add> $this->assertSame(2, $this->stringable('2_000')->toInteger());
<add> }
<add>
<add> public function testToFloat()
<add> {
<add> $this->assertSame(1.23, $this->stringable('1.23')->toFloat());
<add> $this->assertSame(45.6, $this->stringable(45.6)->toFloat());
<add> $this->assertSame(.6, $this->stringable('.6')->toFloat());
<add> $this->assertSame(0.78, $this->stringable('0.78')->toFloat());
<add> $this->assertSame(90.1, $this->stringable(' 90.1')->toFloat());
<add> $this->assertSame(0.0, $this->stringable('nan')->toFloat());
<add> $this->assertSame(1.0, $this->stringable('1.ab')->toFloat());
<add> $this->assertSame(1e3, $this->stringable('1e3')->toFloat());
<add> }
<add>
<add> public function testBooleanMethod()
<add> {
<add> $this->assertTrue($this->stringable(true)->toBoolean());
<add> $this->assertTrue($this->stringable('true')->toBoolean());
<add> $this->assertFalse($this->stringable('false')->toBoolean());
<add> $this->assertTrue($this->stringable('1')->toBoolean());
<add> $this->assertFalse($this->stringable('0')->toBoolean());
<add> $this->assertTrue($this->stringable('on')->toBoolean());
<add> $this->assertFalse($this->stringable('off')->toBoolean());
<add> $this->assertTrue($this->stringable('yes')->toBoolean());
<add> $this->assertFalse($this->stringable('no')->toBoolean());
<add> }
<add>
<add> public function testToDate()
<add> {
<add> $current = Carbon::create(2020, 1, 1, 16, 30, 25);
<add>
<add> $this->assertEquals($current, $this->stringable('20-01-01 16:30:25')->toDate());
<add> $this->assertEquals($current, $this->stringable('1577896225')->toDate('U'));
<add> $this->assertEquals($current, $this->stringable('20-01-01 13:30:25')->toDate(null, 'America/Santiago'));
<add>
<add> $this->assertTrue($this->stringable('2020-01-01')->toDate()->isSameDay($current));
<add> $this->assertTrue($this->stringable('16:30:25')->toDate()->isSameSecond('16:30:25'));
<add> }
<add>
<add> public function testToDateThrowsException()
<add> {
<add> $this->expectException(\Carbon\Exceptions\InvalidFormatException::class);
<add>
<add> $this->stringable('not a date')->toDate();
<add> }
<ide> } | 2 |
Text | Text | add 4.2.z back to the maintenance list | fdc52ddb76d69235c01494a5f19c56cb9bddff0c | <ide><path>guides/source/maintenance_policy.md
<ide> from.
<ide> In special situations, where someone from the Core Team agrees to support more series,
<ide> they are included in the list of supported series.
<ide>
<del>**Currently included series:** `5.0.Z`.
<add>**Currently included series:** `5.0.Z`, `4.2.Z`.
<ide>
<ide> Security Issues
<ide> --------------- | 1 |
Javascript | Javascript | fix inner loop of "remove" method | f4d8c5f20f99f8f8e4c54536305d5d03199d4181 | <ide><path>lib/Chunk.js
<ide> class Chunk {
<ide> chunk.addParent(parentChunk);
<ide> // add "sub chunk" to parent
<ide> parentChunk.addChunk(chunk);
<del>
<del> // remove this as parent of every "sub chunk"
<del> const idx = chunk.parents.indexOf(this);
<del> if(idx >= 0) {
<del> chunk.parents.splice(idx, 1);
<del> }
<ide> });
<ide> });
<ide>
<add> /**
<add> * we need to iterate again over the chunks
<add> * to remove this from the chunks parents.
<add> * This can not be done in the above loop
<add> * as it is not garuanteed that `this.parents` contains anything.
<add> */
<add> this.chunks.forEach(chunk => {
<add> // remove this as parent of every "sub chunk"
<add> const idx = chunk.parents.indexOf(this);
<add> if(idx >= 0) {
<add> chunk.parents.splice(idx, 1);
<add> }
<add> });
<add>
<ide> // cleanup blocks
<ide> this.blocks.forEach(block => {
<ide> const idx = block.chunks.indexOf(this); | 1 |
Javascript | Javascript | use clearinterval instead of cleartimer on a timer | b3fad60d7d49fee1ef6f2f60f6be5f520f4456a8 | <ide><path>packages/next/client/dev/error-overlay/eventsource.js
<ide> function EventSourceWrapper(options) {
<ide>
<ide> return {
<ide> close: () => {
<del> clearTimeout(timer)
<add> clearInterval(timer)
<ide> source.close()
<ide> },
<ide> addMessageListener: function(fn) { | 1 |
Javascript | Javascript | fix eslint issues | 7ade84efb733aaa677d8f005ef94c2d9b82259e8 | <ide><path>packages/@ember/-internals/metal/tests/accessors/mandatory_setters_test.js
<ide> if (DEBUG) {
<ide> ['@test watched ES5 setter should not be smashed by mandatory setter'](assert) {
<ide> let value;
<ide> let obj = {
<del> get foo() {},
<add> get foo() {
<add> return value;
<add> },
<ide> set foo(_value) {
<ide> value = _value;
<ide> },
<ide><path>packages/@ember/-internals/runtime/lib/system/core_object.js
<ide> class CoreObject {
<ide>
<ide> static get superclass() {
<ide> let c = Object.getPrototypeOf(this);
<del> if (c !== Function.prototype) return c;
<add> return c !== Function.prototype ? c : undefined;
<ide> }
<ide>
<ide> static proto() { | 2 |
Javascript | Javascript | convert the password prompt to a class | 2b7137ba0a05aede70b03860c9546adba19490a6 | <ide><path>web/app.js
<ide> var PDFViewerApplication = {
<ide> });
<ide> }
<ide>
<del> PasswordPrompt.initialize({
<add> this.passwordPrompt = new PasswordPrompt({
<ide> overlayName: 'passwordOverlay',
<del> passwordField: document.getElementById('password'),
<del> passwordText: document.getElementById('passwordText'),
<del> passwordSubmit: document.getElementById('passwordSubmit'),
<del> passwordCancel: document.getElementById('passwordCancel')
<add> label: document.getElementById('passwordText'),
<add> input: document.getElementById('password'),
<add> submitButton: document.getElementById('passwordSubmit'),
<add> cancelButton: document.getElementById('passwordCancel')
<ide> });
<del> this.passwordPrompt = PasswordPrompt;
<ide>
<ide> this.pdfOutlineViewer = new PDFOutlineViewer({
<ide> container: document.getElementById('outlineView'),
<ide> var PDFViewerApplication = {
<ide> var loadingTask = pdfjsLib.getDocument(parameters);
<ide> this.pdfLoadingTask = loadingTask;
<ide>
<del> loadingTask.onPassword = function passwordNeeded(updatePassword, reason) {
<del> PasswordPrompt.updatePassword = updatePassword;
<del> PasswordPrompt.reason = reason;
<del> PasswordPrompt.open();
<add> loadingTask.onPassword = function passwordNeeded(updateCallback, reason) {
<add> self.passwordPrompt.setUpdateCallback(updateCallback, reason);
<add> self.passwordPrompt.open();
<ide> };
<ide>
<ide> loadingTask.onProgress = function getDocumentProgress(progressData) {
<ide><path>web/password_prompt.js
<ide> var mozL10n = uiUtils.mozL10n;
<ide> var OverlayManager = overlayManager.OverlayManager;
<ide>
<del>var PasswordPrompt = {
<del> overlayName: null,
<del> updatePassword: null,
<del> reason: null,
<del> passwordField: null,
<del> passwordText: null,
<del> passwordSubmit: null,
<del> passwordCancel: null,
<del>
<del> initialize: function secondaryToolbarInitialize(options) {
<del> this.overlayName = options.overlayName;
<del> this.passwordField = options.passwordField;
<del> this.passwordText = options.passwordText;
<del> this.passwordSubmit = options.passwordSubmit;
<del> this.passwordCancel = options.passwordCancel;
<add>/**
<add> * @typedef {Object} PasswordPromptOptions
<add> * @property {string} overlayName - Name of the overlay for the overlay manager.
<add> * @property {HTMLParagraphElement} label - Label containing instructions for
<add> * entering the password.
<add> * @property {HTMLInputElement} input - Input field for entering the password.
<add> * @property {HTMLButtonElement} submitButton - Button for submitting the
<add> * password.
<add> * @property {HTMLButtonElement} cancelButton - Button for cancelling password
<add> * entry.
<add> */
<ide>
<del> // Attach the event listeners.
<del> this.passwordSubmit.addEventListener('click',
<del> this.verifyPassword.bind(this));
<add>/**
<add> * @class
<add> */
<add>var PasswordPrompt = (function PasswordPromptClosure() {
<add> /**
<add> * @constructs PasswordPrompt
<add> * @param {PasswordPromptOptions} options
<add> */
<add> function PasswordPrompt(options) {
<add> this.overlayName = options.overlayName;
<add> this.label = options.label;
<add> this.input = options.input;
<add> this.submitButton = options.submitButton;
<add> this.cancelButton = options.cancelButton;
<ide>
<del> this.passwordCancel.addEventListener('click', this.close.bind(this));
<add> this.updateCallback = null;
<add> this.reason = null;
<ide>
<del> this.passwordField.addEventListener('keydown', function (e) {
<add> // Attach the event listeners.
<add> this.submitButton.addEventListener('click', this.verify.bind(this));
<add> this.cancelButton.addEventListener('click', this.close.bind(this));
<add> this.input.addEventListener('keydown', function (e) {
<ide> if (e.keyCode === 13) { // Enter key
<del> this.verifyPassword();
<add> this.verify();
<ide> }
<ide> }.bind(this));
<ide>
<ide> OverlayManager.register(this.overlayName, this.close.bind(this), true);
<del> },
<del>
<del> open: function passwordPromptOpen() {
<del> OverlayManager.open(this.overlayName).then(function () {
<del> this.passwordField.type = 'password';
<del> this.passwordField.focus();
<del>
<del> var promptString = mozL10n.get('password_label', null,
<del> 'Enter the password to open this PDF file.');
<add> }
<ide>
<del> if (this.reason === pdfjsLib.PasswordResponses.INCORRECT_PASSWORD) {
<del> promptString = mozL10n.get('password_invalid', null,
<del> 'Invalid password. Please try again.');
<add> PasswordPrompt.prototype = {
<add> open: function PasswordPrompt_open() {
<add> OverlayManager.open(this.overlayName).then(function () {
<add> this.input.type = 'password';
<add> this.input.focus();
<add>
<add> var promptString = mozL10n.get('password_label', null,
<add> 'Enter the password to open this PDF file.');
<add>
<add> if (this.reason === pdfjsLib.PasswordResponses.INCORRECT_PASSWORD) {
<add> promptString = mozL10n.get('password_invalid', null,
<add> 'Invalid password. Please try again.');
<add> }
<add>
<add> this.label.textContent = promptString;
<add> }.bind(this));
<add> },
<add>
<add> close: function PasswordPrompt_close() {
<add> OverlayManager.close(this.overlayName).then(function () {
<add> this.input.value = '';
<add> this.input.type = '';
<add> }.bind(this));
<add> },
<add>
<add> verify: function PasswordPrompt_verify() {
<add> var password = this.input.value;
<add> if (password && password.length > 0) {
<add> this.close();
<add> return this.updateCallback(password);
<ide> }
<add> },
<ide>
<del> this.passwordText.textContent = promptString;
<del> }.bind(this));
<del> },
<del>
<del> close: function passwordPromptClose() {
<del> OverlayManager.close(this.overlayName).then(function () {
<del> this.passwordField.value = '';
<del> this.passwordField.type = '';
<del> }.bind(this));
<del> },
<del>
<del> verifyPassword: function passwordPromptVerifyPassword() {
<del> var password = this.passwordField.value;
<del> if (password && password.length > 0) {
<del> this.close();
<del> return this.updatePassword(password);
<add> setUpdateCallback:
<add> function PasswordPrompt_setUpdateCallback(updateCallback, reason) {
<add> this.updateCallback = updateCallback;
<add> this.reason = reason;
<ide> }
<del> }
<del>};
<add> };
<add>
<add> return PasswordPrompt;
<add>})();
<ide>
<ide> exports.PasswordPrompt = PasswordPrompt;
<ide> })); | 2 |
Text | Text | update code examples in domain.md | 5a42cd1505578a50faa1255c233de25bb1efe059 | <ide><path>doc/api/domain.md
<ide> For example, this is not a good idea:
<ide> ```js
<ide> // XXX WARNING! BAD IDEA!
<ide>
<del>var d = require('domain').create();
<add>const d = require('domain').create();
<ide> d.on('error', (er) => {
<ide> // The error won't crash the process, but what it does is worse!
<ide> // Though we've prevented abrupt process restarting, we are leaking
<ide> // resources like crazy if this ever happens.
<ide> // This is no better than process.on('uncaughtException')!
<del> console.log('error, but oh well', er.message);
<add> console.log(`error, but oh well ${er.message}`);
<ide> });
<ide> d.run(() => {
<ide> require('http').createServer((req, res) => {
<ide> if (cluster.isMaster) {
<ide> // worker processes to serve requests. How it works, caveats, etc.
<ide>
<ide> const server = require('http').createServer((req, res) => {
<del> var d = domain.create();
<add> const d = domain.create();
<ide> d.on('error', (er) => {
<del> console.error('error', er.stack);
<add> console.error(`error ${er.stack}`);
<ide>
<ide> // Note: we're in dangerous territory!
<ide> // By definition, something unexpected occurred,
<ide> if (cluster.isMaster) {
<ide>
<ide> try {
<ide> // make sure we close down within 30 seconds
<del> var killtimer = setTimeout(() => {
<add> const killtimer = setTimeout(() => {
<ide> process.exit(1);
<ide> }, 30000);
<ide> // But don't keep the process open just for that!
<ide> if (cluster.isMaster) {
<ide> res.end('Oops, there was a problem!\n');
<ide> } catch (er2) {
<ide> // oh well, not much we can do at this point.
<del> console.error('Error sending 500!', er2.stack);
<add> console.error(`Error sending 500! ${er2.stack}`);
<ide> }
<ide> });
<ide>
<ide> if (cluster.isMaster) {
<ide> // This part is not important. Just an example routing thing.
<ide> // You'd put your fancy application logic here.
<ide> function handleRequest(req, res) {
<del> switch(req.url) {
<add> switch (req.url) {
<ide> case '/error':
<ide> // We do some async stuff, and then...
<ide> setTimeout(() => {
<ide> serverDomain.run(() => {
<ide> // req and res are also created in the scope of serverDomain
<ide> // however, we'd prefer to have a separate domain for each request.
<ide> // create it first thing, and add req and res to it.
<del> var reqd = domain.create();
<add> const reqd = domain.create();
<ide> reqd.add(req);
<ide> reqd.add(res);
<ide> reqd.on('error', (er) => {
<ide> console.error('Error', er, req.url);
<ide> try {
<ide> res.writeHead(500);
<ide> res.end('Error occurred, sorry.');
<del> } catch (er) {
<del> console.error('Error sending 500', er, req.url);
<add> } catch (er2) {
<add> console.error('Error sending 500', er2, req.url);
<ide> }
<ide> });
<ide> }).listen(1337); | 1 |
Javascript | Javascript | upgrade abstractplugin to es6 | ccfd5a8e6974d057d9551297fdbb777d0776b450 | <ide><path>lib/AbstractPlugin.js
<ide> MIT License http://www.opensource.org/licenses/mit-license.php
<ide> Author Tobias Koppers @sokra
<ide> */
<del>function AbstractPlugin(plugins) {
<del> this._plugins = plugins || {};
<del>}
<del>module.exports = AbstractPlugin;
<add>"use strict";
<add>
<add>class AbstractPlugin {
<add> static create(plugins) {
<add> return class Plugin extends AbstractPlugin {
<add> constructor() {
<add> super(plugins);
<add> }
<add> };
<add> }
<ide>
<del>AbstractPlugin.create = function(plugins) {
<del> function Plugin() {
<del> AbstractPlugin.call(this, plugins);
<add> constructor(plugins) {
<add> this._plugins = plugins || {};
<ide> }
<del> Plugin.prototype = Object.create(AbstractPlugin.prototype);
<del> return Plugin;
<del>};
<ide>
<del>AbstractPlugin.prototype.apply = function(object) {
<del> for(var name in this._plugins) {
<del> object.plugin(name, this._plugins[name]);
<add> apply(object) {
<add> for(const name in this._plugins) {
<add> object.plugin(name, this._plugins[name]);
<add> }
<ide> }
<del>};
<add>}
<add>
<add>module.exports = AbstractPlugin; | 1 |
Ruby | Ruby | remove todos related to exceptron [ci skip] | 4327444cfdbfc8ce93ca0efa50ac6cf00940fbaa | <ide><path>actionview/lib/action_view/renderer/streaming_template_renderer.rb
<ide> module ActionView
<ide> # == TODO
<ide> #
<ide> # * Support streaming from child templates, partials and so on.
<del> # * Integrate exceptions with exceptron
<ide> # * Rack::Cache needs to support streaming bodies
<ide> class StreamingTemplateRenderer < TemplateRenderer #:nodoc:
<ide> # A valid Rack::Body (i.e. it responds to each).
<ide> def each(&block)
<ide> private
<ide>
<ide> # This is the same logging logic as in ShowExceptions middleware.
<del> # TODO Once "exceptron" is in, refactor this piece to simply re-use exceptron.
<ide> def log_error(exception)
<ide> logger = ActionView::Base.logger
<ide> return unless logger | 1 |
PHP | PHP | add testisregisteredstreamwrapper method | 5d91ab036d4414731239ed9041d95a99874f3f8f | <ide><path>tests/TestCase/Filesystem/FolderTest.php
<ide> public function testSortByName()
<ide>
<ide> $this->assertSame(['a.txt', 'b.txt', 'c.txt'], $results);
<ide> }
<add>
<add> /**
<add> * testIsRegisteredStreamWrapper
<add> *
<add> * @return void
<add> */
<add> public function testIsRegisteredStreamWrapper()
<add> {
<add> foreach (stream_get_wrappers() as $wrapper) {
<add> $this->assertTrue(Folder::isRegisteredStreamWrapper($wrapper . "://path/to/file"));
<add> $this->assertFalse(Folder::isRegisteredStreamWrapper("bad." . $wrapper . "://path/to/file"));
<add> }
<add>
<add> $wrapper = 'unit.test1-';
<add> $this->assertFalse(Folder::isRegisteredStreamWrapper($wrapper . "://path/to/file"));
<add> stream_wrapper_register($wrapper, self::class);
<add> $this->assertTrue(Folder::isRegisteredStreamWrapper($wrapper . "://path/to/file"));
<add> stream_wrapper_unregister($wrapper);
<add> }
<ide> } | 1 |
Python | Python | ensure new setuptools before building sdist | 7cbdcaddf32f460f71e2c9370d5430ff91461e6a | <ide><path>fabfile.py
<ide> def make():
<ide> def sdist():
<ide> with virtualenv(VENV_DIR) as venv_local:
<ide> with lcd(path.dirname(__file__)):
<add> local('python -m pip install -U setuptools')
<ide> local('python setup.py sdist')
<ide>
<ide> def wheel(): | 1 |
Go | Go | lock container while connecting to a new network | 4d0888e32bccfd8c0f27a7b66b2a5607d42e2698 | <ide><path>daemon/container_operations.go
<ide> func (daemon *Daemon) ConnectToNetwork(container *container.Container, idOrName
<ide> if endpointConfig == nil {
<ide> endpointConfig = &networktypes.EndpointSettings{}
<ide> }
<add> container.Lock()
<add> defer container.Unlock()
<add>
<ide> if !container.Running {
<ide> if container.RemovalInProgress || container.Dead {
<ide> return errRemovalContainer(container.ID)
<ide> func (daemon *Daemon) ConnectToNetwork(container *container.Container, idOrName
<ide> return err
<ide> }
<ide> }
<del> if err := container.ToDiskLocking(); err != nil {
<add> if err := container.ToDisk(); err != nil {
<ide> return fmt.Errorf("Error saving container to disk: %v", err)
<ide> }
<ide> return nil
<ide> func (daemon *Daemon) ConnectToNetwork(container *container.Container, idOrName
<ide> // DisconnectFromNetwork disconnects container from network n.
<ide> func (daemon *Daemon) DisconnectFromNetwork(container *container.Container, networkName string, force bool) error {
<ide> n, err := daemon.FindNetwork(networkName)
<add> container.Lock()
<add> defer container.Unlock()
<add>
<ide> if !container.Running || (err != nil && force) {
<ide> if container.RemovalInProgress || container.Dead {
<ide> return errRemovalContainer(container.ID)
<ide> func (daemon *Daemon) DisconnectFromNetwork(container *container.Container, netw
<ide> return err
<ide> }
<ide>
<del> if err := container.ToDiskLocking(); err != nil {
<add> if err := container.ToDisk(); err != nil {
<ide> return fmt.Errorf("Error saving container to disk: %v", err)
<ide> }
<ide> | 1 |
PHP | PHP | add tests for callback related features | 9291f2b7113d8127faab7aa49ff23991fd335a2a | <ide><path>Cake/ORM/Table.php
<ide> public function delete(Entity $entity, array $options = []) {
<ide> $process = function () use ($entity, $options) {
<ide> return $this->_processDelete($entity, $options);
<ide> };
<add>
<ide> if ($options['atomic']) {
<ide> $success = $this->connection()->transactional($process);
<ide> } else {
<ide><path>Cake/Test/TestCase/ORM/TableTest.php
<ide> public function testUpdateNoPrimaryButOtherKeys() {
<ide> * @return void
<ide> */
<ide> public function testDelete() {
<del> $table = TableRegistry::get('user');
<add> $table = TableRegistry::get('users');
<ide> $conditions = [
<ide> 'limit' => 1,
<ide> 'conditions' => [
<ide> public function testDeleteBelongsToMany() {
<ide> * @return void
<ide> */
<ide> public function testDeleteCallbacks() {
<del> $this->markTestIncomplete('not done');
<add> $entity = new \Cake\ORM\Entity(['id' => 1, 'name' => 'mark']);
<add> $options = new \ArrayObject(['atomic' => true, 'cascade' => true]);
<add>
<add> $mock = $this->getMock('Cake\Event\EventManager');
<add> $mock->expects($this->at(0))
<add> ->method('dispatch')
<add> ->with($this->logicalAnd(
<add> $this->attributeEqualTo('_name', 'Model.beforeDelete'),
<add> $this->attributeEqualTo(
<add> 'data',
<add> ['entity' => $entity, 'options' => $options]
<add> )
<add> ));
<add>
<add> $mock->expects($this->at(1))
<add> ->method('dispatch')
<add> ->with($this->logicalAnd(
<add> $this->attributeEqualTo('_name', 'Model.afterDelete'),
<add> $this->attributeEqualTo(
<add> 'data',
<add> ['entity' => $entity, 'options' => $options]
<add> )
<add> ));
<add>
<add> $table = TableRegistry::get('users', ['eventManager' => $mock]);
<add> $entity->isNew(false);
<add> $table->delete($entity);
<ide> }
<ide>
<ide> /**
<ide> public function testDeleteCallbacks() {
<ide> * @return void
<ide> */
<ide> public function testDeleteBeforeDeleteAbort() {
<del> $this->markTestIncomplete('not done');
<add> $entity = new \Cake\ORM\Entity(['id' => 1, 'name' => 'mark']);
<add> $options = new \ArrayObject(['atomic' => true, 'cascade' => true]);
<add>
<add> $mock = $this->getMock('Cake\Event\EventManager');
<add> $mock->expects($this->once())
<add> ->method('dispatch')
<add> ->will($this->returnCallback(function ($event) {
<add> $event->stopPropagation();
<add> }));
<add>
<add> $table = TableRegistry::get('users', ['eventManager' => $mock]);
<add> $entity->isNew(false);
<add> $result = $table->delete($entity);
<add> $this->assertNull($result);
<ide> }
<ide>
<ide> /**
<ide> public function testDeleteBeforeDeleteAbort() {
<ide> * @return void
<ide> */
<ide> public function testDeleteBeforeDeleteReturnResult() {
<del> $this->markTestIncomplete('not done');
<add> $entity = new \Cake\ORM\Entity(['id' => 1, 'name' => 'mark']);
<add> $options = new \ArrayObject(['atomic' => true, 'cascade' => true]);
<add>
<add> $mock = $this->getMock('Cake\Event\EventManager');
<add> $mock->expects($this->once())
<add> ->method('dispatch')
<add> ->will($this->returnCallback(function ($event) {
<add> $event->stopPropagation();
<add> $event->result = 'got stopped';
<add> }));
<add>
<add> $table = TableRegistry::get('users', ['eventManager' => $mock]);
<add> $entity->isNew(false);
<add> $result = $table->delete($entity);
<add> $this->assertEquals('got stopped', $result);
<ide> }
<ide>
<ide> /**
<ide> public function testDeleteBeforeDeleteReturnResult() {
<ide> * @return void
<ide> */
<ide> public function testDeleteIsNew() {
<del> $this->markTestIncomplete('not done');
<add> $entity = new \Cake\ORM\Entity(['id' => 1, 'name' => 'mark']);
<add>
<add> $table = $this->getMock(
<add> 'Cake\ORM\Table',
<add> ['_buildQuery'],
<add> [['connection' => $this->connection]]
<add> );
<add> $table->expects($this->never())
<add> ->method('_buildQuery');
<add>
<add> $entity->isNew(true);
<add> $result = $table->delete($entity);
<add> $this->assertFalse($result);
<ide> }
<ide>
<ide> } | 2 |
Javascript | Javascript | add shift+enter to go to previous search result | 6215e1c2db7e55a364a1156be38b4e3d7490d551 | <ide><path>packages/react-devtools-shared/src/devtools/views/Components/SearchInput.js
<ide> export default function SearchInput(props: Props) {
<ide> );
<ide>
<ide> const handleInputKeyPress = useCallback(
<del> ({key}) => {
<add> ({key, shiftKey}) => {
<ide> if (key === 'Enter') {
<del> dispatch({type: 'GO_TO_NEXT_SEARCH_RESULT'});
<add> if (shiftKey) {
<add> dispatch({type: 'GO_TO_PREVIOUS_SEARCH_RESULT'});
<add> } else {
<add> dispatch({type: 'GO_TO_NEXT_SEARCH_RESULT'});
<add> }
<ide> }
<ide> },
<ide> [dispatch], | 1 |
Javascript | Javascript | add setrequestheader method to fileloader | 6f68f3c93fcbdf7a79bc9e99e47a17569580c502 | <ide><path>src/loaders/FileLoader.js
<ide> Object.assign( FileLoader.prototype, {
<ide>
<ide> if ( request.overrideMimeType ) request.overrideMimeType( this.mimeType !== undefined ? this.mimeType : 'text/plain' );
<ide>
<add> if ( this.requestHeader !== undefined ) {
<add>
<add> var keys = Object.keys( this.requestHeader );
<add>
<add> for ( var i = 0, il = keys.length; i < il; i ++ ) {
<add>
<add> var header = keys[ i ];
<add> var value = this.requestHeader[ header ];
<add>
<add> request.setRequestHeader( header, value );
<add>
<add> }
<add>
<add> }
<add>
<ide> request.send( null );
<ide>
<ide> }
<ide> Object.assign( FileLoader.prototype, {
<ide> this.mimeType = value;
<ide> return this;
<ide>
<add> },
<add>
<add> setRequestHeader: function ( value ) {
<add>
<add> this.requestHeader = value;
<add> return this;
<add>
<ide> }
<ide>
<ide> } ); | 1 |
Java | Java | call soloader.init in mainapplication#oncreate | 0907ef668921a63f3ed447d442f84c54a5a20452 | <ide><path>local-cli/generator-android/templates/package/MainApplication.java
<ide> import com.facebook.react.ReactNativeHost;
<ide> import com.facebook.react.ReactPackage;
<ide> import com.facebook.react.shell.MainReactPackage;
<add>import com.facebook.soloader.SoLoader;
<ide>
<ide> import java.util.Arrays;
<ide> import java.util.List;
<ide> protected List<ReactPackage> getPackages() {
<ide>
<ide> @Override
<ide> public ReactNativeHost getReactNativeHost() {
<del> return mReactNativeHost;
<add> return mReactNativeHost;
<add> }
<add>
<add> @Override
<add> public void onCreate() {
<add> super.onCreate();
<add> SoLoader.init(this, /* native exopackage */ false);
<ide> }
<ide> } | 1 |
Python | Python | set color to operators in cloud_sql.py | 3dd7b1ddbaa3170fbda30a8323286abf075f30ba | <ide><path>airflow/providers/google/cloud/operators/cloud_sql.py
<ide> class CloudSQLCreateInstanceOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_create_template_fields]
<add> ui_color = '#FADBDA'
<ide> operator_extra_links = (CloudSQLInstanceLink(),)
<ide>
<ide> def __init__(
<ide> class CloudSQLInstancePatchOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_patch_template_fields]
<add> ui_color = '#FBDAC8'
<ide> operator_extra_links = (CloudSQLInstanceLink(),)
<ide>
<ide> def __init__(
<ide> class CloudSQLDeleteInstanceOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_delete_template_fields]
<add> ui_color = '#FEECD2'
<ide>
<ide> def execute(self, context: 'Context') -> Optional[bool]:
<ide> hook = CloudSQLHook(
<ide> class CloudSQLCreateInstanceDatabaseOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_db_create_template_fields]
<add> ui_color = '#FFFCDB'
<ide> operator_extra_links = (CloudSQLInstanceDatabaseLink(),)
<ide>
<ide> def __init__(
<ide> class CloudSQLPatchInstanceDatabaseOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_db_patch_template_fields]
<add> ui_color = '#ECF4D9'
<ide> operator_extra_links = (CloudSQLInstanceDatabaseLink(),)
<ide>
<ide> def __init__(
<ide> class CloudSQLDeleteInstanceDatabaseOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_db_delete_template_fields]
<add> ui_color = '#D5EAD8'
<ide>
<ide> def __init__(
<ide> self,
<ide> class CloudSQLExportInstanceOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_export_template_fields]
<add> ui_color = '#D4ECEA'
<ide> operator_extra_links = (CloudSQLInstanceLink(), FileDetailsLink())
<ide>
<ide> def __init__(
<ide> class CloudSQLImportInstanceOperator(CloudSQLBaseOperator):
<ide> 'impersonation_chain',
<ide> )
<ide> # [END gcp_sql_import_template_fields]
<add> ui_color = '#D3EDFB'
<ide> operator_extra_links = (CloudSQLInstanceLink(), FileDetailsLink())
<ide>
<ide> def __init__(
<ide> class CloudSQLExecuteQueryOperator(BaseOperator):
<ide> template_ext: Sequence[str] = ('.sql',)
<ide> template_fields_renderers = {'sql': 'sql'}
<ide> # [END gcp_sql_query_template_fields]
<add> ui_color = '#D3DEF1'
<ide>
<ide> def __init__(
<ide> self, | 1 |
Text | Text | adjust line breaks | 43070d73a1913691e401cece5d604b2fbc037c45 | <ide><path>docs/Homebrew-brew-Maintainer-Guide.md
<ide> # Homebrew/brew Maintainer Guide
<ide>
<del>This document describes a few components of the `Homebrew/brew` repository that are useful for maintainers to
<del>be aware of, but don't necessarily need to appear in documentation for most users and contributors.
<add>This document describes a few components of the `Homebrew/brew` repository that are useful for maintainers to be aware of, but don't necessarily need to appear in documentation for most users and contributors.
<ide>
<ide> ## Reviewing PRs
<ide>
<ide> Using `gh pr checkout NUMBER` is a super easy way to check out a PR branch using
<ide> When reviewing a PR, use "comment", "approve", or "request changes" when submitting based on the following guidelines:
<ide>
<ide> - Comment: if the PR isn't quite ready to be merged
<del>- Approve: if you feel that the PR is in a good state to be merged, even if there are
<del> non-blocking changes you'd like to be made
<del>- Request changes: if you feel strongly that the PR is likely to cause a problem for users or
<del> have another reason to oppose the PR.
<add>- Approve: if you feel that the PR is in a good state to be merged, even if there are non-blocking changes you'd like to be made
<add>- Request changes: if you feel strongly that the PR is likely to cause a problem for users or have another reason to oppose the PR.
<ide>
<ide> ## Merging PRs
<ide>
<del>Merging is done using the standard Merge button in the `Homebrew/brew` repository to preserve
<del>history and GPG commit signing. The Squash and Merge and Rebase and Merge buttons are disabled.
<add>Merging is done using the standard Merge button in the `Homebrew/brew` repository to preserve history and GPG commit signing. The Squash and Merge and Rebase and Merge buttons are disabled.
<ide>
<ide> PRs must meet the following conditions to be merged:
<ide>
<del>- Have at least one maintainer (or [@BrewTestBot](https://github.com/BrewTestBot)) approval.
<del> See the ["Automatic approvals" section below](#automatic-approvals).
<del> for more details about how [@BrewTestBot](https://github.com/BrewTestBot) approves PRs.
<del>- Have passing CI. This is a _mandatory_ step. PRs with failing CI should _never_ be merged.
<del> See the ["CI" section below](#ci) for more information about `Homebrew/brew` CI.
<add>- Have at least one maintainer (or [@BrewTestBot](https://github.com/BrewTestBot)) approval. See the ["Automatic approvals" section below](#automatic-approvals). for more details about how [@BrewTestBot](https://github.com/BrewTestBot) approves PRs.
<add>- Have passing CI. This is a _mandatory_ step. PRs with failing CI should _never_ be merged. See the ["CI" section below](#ci) for more information about `Homebrew/brew` CI.
<ide>
<del>If possible, PRs should also have GPG-signed commits (see the private `ops` repository for
<del>instructions on setting this up).
<add>If possible, PRs should also have GPG-signed commits (see the private `ops` repository for instructions on setting this up).
<ide>
<ide> ## Automatic approvals
<ide>
<del>To ensure that non-urgent PRs have the opportunity to be seen and reviewed by any other maintainers who wish
<del>to take a look, all PRs require an approval before they can be merged. However, not every PR is
<del>reviewed by another maintainer, and some PRs are urgent enough that they need to be merged without
<del>an approval by another maintainer.
<add>To ensure that non-urgent PRs have the opportunity to be seen and reviewed by any other maintainers who wish to take a look, all PRs require an approval before they can be merged. However, not every PR is reviewed by another maintainer, and some PRs are urgent enough that they need to be merged without an approval by another maintainer.
<ide>
<del>As a compromise between always needing a review and allowing maintainers to merge PRs they deem ready,
<del>the `Triage` CI job will ensure that PRs cannot be merged until they've been open for 24 hours
<del>(only counting hours that occur Monday to Friday). After the triage period has expired, the
<del>CI job will show up as "passed" and [@BrewTestBot](https://github.com/BrewTestBot) will approve the PR,
<del>allowing it to be merged. This gives all maintainers a reasonable opportunity to review every PR,
<del>but won't block any PR for lack of reviews.
<add>As a compromise between always needing a review and allowing maintainers to merge PRs they deem ready, the `Triage` CI job will ensure that PRs cannot be merged until they've been open for 24 hours (only counting hours that occur Monday to Friday). After the triage period has expired, the CI job will show up as "passed" and [@BrewTestBot](https://github.com/BrewTestBot) will approve the PR, allowing it to be merged. This gives all maintainers a reasonable opportunity to review every PR, but won't block any PR for lack of reviews.
<ide>
<del>If the PR is urgent enough that it is necessary to bypass that 24 hour window, the `critical` label
<del>should be applied to the PR. When this label is applied, the `Triage` CI job will immediately be
<del>successful and [@BrewTestBot](https://github.com/BrewTestBot) will approve the PR.
<add>If the PR is urgent enough that it is necessary to bypass that 24 hour window, the `critical` label should be applied to the PR. When this label is applied, the `Triage` CI job will immediately be successful and [@BrewTestBot](https://github.com/BrewTestBot) will approve the PR.
<ide>
<ide> ## CI
<ide>
<del>Every PR in `Homebrew/brew` runs a series of CI tests to try to prevent bugs from being introduced.
<del>**A PR _must_ have passing CI before it can be merged**.
<add>Every PR in `Homebrew/brew` runs a series of CI tests to try to prevent bugs from being introduced. **A PR _must_ have passing CI before it can be merged**.
<ide>
<ide> There are many checks that run on every PR. The following is a quick list of the various checks and what they represent:
<ide>
<del>- `Vendor Gems / vendor-gems`: This is skipped except for dependabot PRs. It updates the RBI files to match
<del> any new/changed dependencies. See [Type Checking With Sorbet](Typechecking.md) for more information about RBI files
<del> and typechecking.
<del>- `Triage / review`: This verifies that the PR has been open for long enough.
<del> See the ["Automatic approvals" section above](#automatic-approvals) for more information about automatic approvals.
<del>- `codecov/patch` and `codecov/project`: These show the Codecov report for the PR.
<del> See the ["`brew tests` and Codecov" section below](#brew-tests-and-codecov) for more info about Codecov.
<del>- `CI / vendored gems (Linux)`: This checks whether there was a change to the vendored gems on Linux that needs to be
<del> committed to the PR branch.
<add>- `Vendor Gems / vendor-gems`: This is skipped except for dependabot PRs. It updates the RBI files to match any new/changed dependencies. See [Type Checking With Sorbet](Typechecking.md) for more information about RBI files and typechecking.
<add>- `Triage / review`: This verifies that the PR has been open for long enough. See the ["Automatic approvals" section above](#automatic-approvals) for more information about automatic approvals.
<add>- `codecov/patch` and `codecov/project`: These show the Codecov report for the PR. See the ["`brew tests` and Codecov" section below](#brew-tests-and-codecov) for more info about Codecov.
<add>- `CI / vendored gems (Linux)`: This checks whether there was a change to the vendored gems on Linux that needs to be committed to the PR branch.
<ide> - `CI / test default formula (Linux)`: This runs `brew test-bot` on Linux to ensure it still works as expected.
<del>- `CI / syntax`: This is run first to check whether the PR passes `brew style` and `brew typecheck`. If this job fails the
<del> following jobs will not run.
<del>- `CI / tap syntax (Linux)`: This runs `brew style` and `brew audit` on all official taps
<del> (note that although this has Linux in its name, it does check `Homebrew/homebrew-core` and all cask repos).
<add>- `CI / syntax`: This is run first to check whether the PR passes `brew style` and `brew typecheck`. If this job fails the following jobs will not run.
<add>- `CI / tap syntax (Linux)`: This runs `brew style` and `brew audit` on all official taps (note that although this has Linux in its name, it does check `Homebrew/homebrew-core` and all cask repos).
<ide> - `CI / docker`: This builds and deploys a new Homebrew Docker image.
<del>- `CI / test everything (macOS)`: This runs several checks on macOS including `brew tests`, `brew update-tests`,
<del> `brew test-bot --only-formulae --test-default-formula`, `brew readall` and `brew doctor`.
<del>- `CI / tests (no-compatibility mode)`, `CI / tests (generic OS)` and `CI / tests (Linux)`: These run
<del> `brew tests` with various options on Linux.
<add>- `CI / test everything (macOS)`: This runs several checks on macOS including `brew tests`, `brew update-tests`, `brew test-bot --only-formulae --test-default-formula`, `brew readall` and `brew doctor`.
<add>- `CI / tests (no-compatibility mode)`, `CI / tests (generic OS)` and `CI / tests (Linux)`: These run `brew tests` with various options on Linux.
<ide>
<ide> _Note that this list is non-exhaustive and can change over time._
<ide>
<ide> ### `brew tests` and Codecov
<ide>
<del>A coverage report is generated by Codecov for every PR, and its results are shown as CI jobs.
<del>These reports are publicly viewable on [Homebrew/brew's Codecov page](https://app.codecov.io/gh/Homebrew/brew).
<del>Additionally, annotations will appear in the PR's "Files changed" tab where lines of code have been
<del>added that aren't being hit by `brew tests`. If the Codecov job fails, that's a sign that some
<del>more tests should be added to test the functionality being added in the PR.
<add>A coverage report is generated by Codecov for every PR, and its results are shown as CI jobs. These reports are publicly viewable on [Homebrew/brew's Codecov page](https://app.codecov.io/gh/Homebrew/brew). Additionally, annotations will appear in the PR's "Files changed" tab where lines of code have been added that aren't being hit by `brew tests`. If the Codecov job fails, that's a sign that some more tests should be added to test the functionality being added in the PR.
<ide>
<del>Codecov should be used as a guide to indicate when more tests are probably needed, but it's unrealistic for
<del>every line of code to have a test associated with it, especially when testing would require a slow
<del>integration test. For this reason, it's okay to merge PRs that fail the Codecov check if necessary,
<del>but this should be avoided if possible.
<add>Codecov should be used as a guide to indicate when more tests are probably needed, but it's unrealistic for every line of code to have a test associated with it, especially when testing would require a slow integration test. For this reason, it's okay to merge PRs that fail the Codecov check if necessary, but this should be avoided if possible.
<ide>
<ide> ### `brew tests` and BuildPulse
<ide>
<del>BuildPulse monitors CI jobs for every push to `Homebrew/brew` to detect flaky tests and track them over time. The
<del>reports are available to Homebrew maintainers on [buildpulse.io](https://buildpulse.io/installations) and daily
<del>summaries are published to [`#buildpulse-health`](https://machomebrew.slack.com/archives/C0268BSJBJ8) in Slack.
<add>BuildPulse monitors CI jobs for every push to `Homebrew/brew` to detect flaky tests and track them over time. The reports are available to Homebrew maintainers on [buildpulse.io](https://buildpulse.io/installations) and daily summaries are published to [`#buildpulse-health`](https://machomebrew.slack.com/archives/C0268BSJBJ8) in Slack.
<ide>
<del>BuildPulse can be used as a guide to identify which flaky tests are causing the most disruption to the CI suite. To make
<del>the biggest improvements to the reliability of the build, we can focus on the most disruptive flaky tests first (i.e.
<del>the tests causing the most intermittent failures).
<add>BuildPulse can be used as a guide to identify which flaky tests are causing the most disruption to the CI suite. To make the biggest improvements to the reliability of the build, we can focus on the most disruptive flaky tests first (i.e. the tests causing the most intermittent failures).
<ide>
<del>To help find the root cause for a particular flaky test, buildpulse.io provides links to the most recent CI job and
<del>commit where the test failed and then passed with no change to the underlying code. You may want to check out the code
<del>at that commit to attempt to reproduce the failure locally. You can also see the list of recent failures on
<del>[buildpulse.io](https://buildpulse.io) to determine if the test always fails the same way.
<add>To help find the root cause for a particular flaky test, buildpulse.io provides links to the most recent CI job and commit where the test failed and then passed with no change to the underlying code. You may want to check out the code at that commit to attempt to reproduce the failure locally. You can also see the list of recent failures on [buildpulse.io](https://buildpulse.io) to determine if the test always fails the same way.
<ide>
<ide> ## Manpages and shell completions
<ide>
<del>Homebrew's manpages and shell completions are generated automatically by the `brew generate-man-completions` command.
<del>Contributors are welcome to run this command and commit the changes in a PR, but they don't have to. If they don't,
<del>a follow-up PR to make the necessary changes will be opened automatically by [@BrewTestBot](https://github.com/BrewTestBot)
<del>once the original PR is merged. These follow-up PRs can be merged immediately if the changes seem correct.
<add>Homebrew's manpages and shell completions are generated automatically by the `brew generate-man-completions` command. Contributors are welcome to run this command and commit the changes in a PR, but they don't have to. If they don't, a follow-up PR to make the necessary changes will be opened automatically by [@BrewTestBot](https://github.com/BrewTestBot) once the original PR is merged. These follow-up PRs can be merged immediately if the changes seem correct.
<ide>
<del>An update can be requested manually by triggering the workflow from the
<del>[Update maintainers, manpage and completions](https://github.com/Homebrew/brew/actions/workflows/update-man-completions.yml)
<del>section under the "Actions" tab. Click on the "Run workflow" dropdown and then the "Run workflow" button.
<del>A PR will be opened shortly if there are any changes.
<add>An update can be requested manually by triggering the workflow from the [Update maintainers, manpage and completions](https://github.com/Homebrew/brew/actions/workflows/update-man-completions.yml) section under the "Actions" tab. Click on the "Run workflow" dropdown and then the "Run workflow" button. A PR will be opened shortly if there are any changes.
<ide><path>docs/Homebrew-homebrew-cask-Maintainer-Guide.md
<ide> # Homebrew/homebrew-cask Maintainer Guide
<ide>
<del>This guide is intended to help maintainers effectively maintain the cask repositories.
<del>It is meant to be used in conjunction with the more generic [Maintainer Guidelines](Maintainer-Guidelines.md).
<add>This guide is intended to help maintainers effectively maintain the cask repositories. It is meant to be used in conjunction with the more generic [Maintainer Guidelines](Maintainer-Guidelines.md).
<ide>
<ide> This guide applies to all four of the cask repositories:
<ide>
<ide> - [Homebrew/homebrew-cask](https://github.com/Homebrew/homebrew-cask): The main cask repository
<ide> - [Homebrew/homebrew-cask-drivers](https://github.com/Homebrew/homebrew-cask-drivers): Casks of drivers
<ide> - [Homebrew/homebrew-cask-fonts](https://github.com/Homebrew/homebrew-cask-fonts): Casks of fonts
<del>- [Homebrew/homebrew-cask-versions](https://github.com/Homebrew/homebrew-cask-versions): Alternate versions of Casks
<add>- [Homebrew/homebrew-cask-versions](https://github.com/Homebrew/homebrew-cask-versions): Alternate versions of casks
<ide>
<ide> ## Common Situations
<ide>
<ide> Here is a list of the most common situations that arise in cask PRs and how to handle them:
<ide>
<ide> - The `version` and `sha256` both change (keeping the same format): Merge.
<del>- Only the `sha256` changes: Merge unless the version needs to be updated as well.
<del> It’s not uncommon for upstream vendors to update versions in-place.
<del> However, be wary for times when e.g. upstream could have been hacked.
<del>- `livecheck` is updated: Use your best judgement and try to make sure that the changes
<del> follow the [`livecheck` guidelines](Brew-Livecheck.md).
<del>- Only the `version` changes or the `version` format changes: Use your best judgement and
<del> merge if it seems correct (this is relatively rare).
<add>- Only the `sha256` changes: Merge unless the version needs to be updated as well. It’s not uncommon for upstream vendors to update versions in-place. However, be wary for times when e.g. upstream could have been hacked.
<add>- `livecheck` is updated: Use your best judgement and try to make sure that the changes follow the [`livecheck` guidelines](Brew-Livecheck.md).
<add>- Only the `version` changes or the `version` format changes: Use your best judgement and merge if it seems correct (this is relatively rare).
<ide> - Other changes (including adding new casks): Use the [Cask Cookbook](Cask-Cookbook.md) to determine what's correct.
<ide>
<ide> If in doubt, ask another cask maintainer on GitHub or Slack.
<ide>
<del>Note that unlike formulae, casks do not consider the `sha256` stanza to be a meaningful security measure
<del>as maintainers cannot realistically check them for authenticity. Casks download from upstream; if a malicious
<del>actor compromised a URL, they could potentially compromise a version and make it look like an update.
<add>Note that unlike formulae, casks do not consider the `sha256` stanza to be a meaningful security measure as maintainers cannot realistically check them for authenticity. Casks download from upstream; if a malicious actor compromised a URL, they could potentially compromise a version and make it look like an update.
<ide>
<ide> ## Merging
<ide>
<del>In general, using GitHub's Squash and Merge button is the best way to merge a PR. This can be used when
<del>the PR modifies only one cask, regardless of the number of commits or whether the commit message
<del>format is correct. When merging using this method, the commit message can be modified if needed.
<del>Usually, version bump commit messages follow the form `Update CASK from OLD_VERSION to NEW_VERSION`.
<add>In general, using GitHub's Squash and Merge button is the best way to merge a PR. This can be used when the PR modifies only one cask, regardless of the number of commits or whether the commit message format is correct. When merging using this method, the commit message can be modified if needed. Usually, version bump commit messages follow the form `Update CASK from OLD_VERSION to NEW_VERSION`.
<ide>
<del>If the PR modifies multiple casks, use the Rebase and Merge button to merge the PR. This will use the
<del>commit messages from the PR, so make sure that they are appropriate before merging. If needed,
<del>checkout the PR, squash/reword the commits and force-push back to the PR branch to ensure the proper commit format.
<add>If the PR modifies multiple casks, use the Rebase and Merge button to merge the PR. This will use the commit messages from the PR, so make sure that they are appropriate before merging. If needed, checkout the PR, squash/reword the commits and force-push back to the PR branch to ensure the proper commit format.
<ide>
<ide> Finally, make sure to thank the contributor for submitting a PR!
<ide>
<ide> ## Other Tips
<ide>
<del>A maintainer can easily rebase a PR onto the latest `master` branch by adding a `/rebase` comment.
<del>`BrewTestBot` will automatically rebase the PR and add a reaction to
<del>the comment once the rebase is in progress and complete.
<add>A maintainer can easily rebase a PR onto the latest `master` branch by adding a `/rebase` comment. `BrewTestBot` will automatically rebase the PR and add a reaction to the comment once the rebase is in progress and complete.
<ide><path>docs/Homebrew-homebrew-core-Maintainer-Guide.md
<ide> A detailed checklist can be found [below](#detailed-merge-checklist). This is al
<ide> - Ensure the name seems reasonable.
<ide> - Add aliases.
<ide> - Ensure it uses `keg_only :provided_by_macos` if it already comes with macOS.
<del>- Ensure it is not a library that can be installed with
<del> [gem](https://en.wikipedia.org/wiki/RubyGems),
<del> [cpan](https://en.wikipedia.org/wiki/Cpan) or
<del> [pip](https://pip.pypa.io/en/stable/).
<del>- Ensure that any dependencies are accurate and minimal. We don't need to
<del> support every possible optional feature for the software.
<add>- Ensure it is not a library that can be installed with [gem](https://en.wikipedia.org/wiki/RubyGems), [cpan](https://en.wikipedia.org/wiki/Cpan) or [pip](https://pip.pypa.io/en/stable/).
<add>- Ensure that any dependencies are accurate and minimal. We don't need to support every possible optional feature for the software.
<ide> - When bottles aren't required or affected, use the GitHub squash & merge workflow for a single-formula PR or rebase & merge workflow for a multiple-formulae PR. See the ["How to merge without bottles" section below](#how-to-merge-without-bottles) for more details.
<ide> - Use `brew pr-publish` or `brew pr-pull` otherwise, which adds messages to auto-close pull requests and pull bottles built by the Brew Test Bot.
<ide> - Thank people for contributing.
<ide>
<del>Checking dependencies is important, because they will probably stick around
<del>forever. Nobody really checks if they are necessary or not.
<add>Checking dependencies is important, because they will probably stick around forever. Nobody really checks if they are necessary or not.
<ide>
<del>Depend on as little stuff as possible. Disable X11 functionality if possible.
<del>For example, we build Wireshark, but not the heavy GUI.
<add>Depend on as little stuff as possible. Disable X11 functionality if possible. For example, we build Wireshark, but not the heavy GUI.
<ide>
<del>Homebrew is about Unix software. Stuff that builds to an `.app` should
<del>be in Homebrew Cask instead.
<add>Homebrew is about Unix software. Stuff that builds to an `.app` should be in Homebrew Cask instead.
<ide>
<ide> ## Merging, rebasing, cherry-picking
<ide>
<del>For most PRs that make formula modifications, you can simply approve the PR and an automatic
<del>merge (with bottles) will be performed by [@BrewTestBot](https://github.com/BrewTestBot).
<del>See [Brew Test Bot For Core Contributors](Brew-Test-Bot-For-Core-Contributors.md) for more information.
<add>For most PRs that make formula modifications, you can simply approve the PR and an automatic merge (with bottles) will be performed by [@BrewTestBot](https://github.com/BrewTestBot). See [Brew Test Bot For Core Contributors](Brew-Test-Bot-For-Core-Contributors.md) for more information.
<ide>
<del>Certain PRs may not be merged automatically by [@BrewTestBot](https://github.com/BrewTestBot),
<del>even after they've been approved. This includes PRs with the `new formula`, `automerge-skip`,
<del>and `linux-only` labels. To trigger a merge for these PRs, run `brew pr-publish`.
<add>Certain PRs may not be merged automatically by [@BrewTestBot](https://github.com/BrewTestBot), even after they've been approved. This includes PRs with the `new formula`, `automerge-skip`, and `linux-only` labels. To trigger a merge for these PRs, run `brew pr-publish`.
<ide>
<del>PRs modifying formulae that don't need bottles or making changes that don't
<del>require new bottles to be pulled should use GitHub's squash & merge or rebase & merge workflows.
<del>See the [table below](#how-to-merge-without-bottles) for more details.
<add>PRs modifying formulae that don't need bottles or making changes that don't require new bottles to be pulled should use GitHub's squash & merge or rebase & merge workflows. See the [table below](#how-to-merge-without-bottles) for more details.
<ide>
<ide> Otherwise, you should use `brew pr-pull` (or `rebase`/`cherry-pick` contributions).
<ide>
<del>Don’t `rebase` until you finally `push`. Once `master` is pushed, you can’t
<del>`rebase`: **you’re a maintainer now!**
<add>Don’t `rebase` until you finally `push`. Once `master` is pushed, you can’t `rebase`: **you’re a maintainer now!**
<ide>
<ide> Cherry-picking changes the date of the commit, which kind of sucks.
<ide>
<del>Don’t `merge` unclean branches. So if someone is still learning `git` and
<del>their branch is filled with nonsensical merges, then `rebase` and squash
<del>the commits. Our main branch history should be useful to other people,
<del>not confusing.
<add>Don’t `merge` unclean branches. So if someone is still learning `git` and their branch is filled with nonsensical merges, then `rebase` and squash the commits. Our main branch history should be useful to other people, not confusing.
<ide>
<ide> Here’s a flowchart for managing a PR which is ready to merge:
<ide>
<ide> Here are guidelines about when to use squash & merge versus rebase & merge. Thes
<ide>
<ide> ## Naming
<ide>
<del>The name is the strictest item, because avoiding a later name change is
<del>desirable.
<add>The name is the strictest item, because avoiding a later name change is desirable.
<ide>
<del>Choose a name that’s the most common name for the project.
<del>For example, we initially chose `objective-caml` but we should have chosen `ocaml`.
<del>Choose what people say to each other when talking about the project.
<add>Choose a name that’s the most common name for the project. For example, we initially chose `objective-caml` but we should have chosen `ocaml`. Choose what people say to each other when talking about the project.
<ide>
<del>Formulae that are also packaged by other package managers (e.g. Debian, Ubuntu) should be
<del>named consistently (subject to minor differences due to Homebrew formula naming conventions).
<add>Formulae that are also packaged by other package managers (e.g. Debian, Ubuntu) should be named consistently (subject to minor differences due to Homebrew formula naming conventions).
<ide>
<del>Add other names as aliases as symlinks in `Aliases` in the tap root. Ensure the
<del>name referenced on the homepage is one of these, as it may be different and have
<del>underscores and hyphens and so on.
<add>Add other names as aliases as symlinks in `Aliases` in the tap root. Ensure the name referenced on the homepage is one of these, as it may be different and have underscores and hyphens and so on.
<ide>
<ide> We now accept versioned formulae as long as they [meet the requirements](Versions.md).
<ide>
<ide> ## Testing
<ide>
<ide> We need to at least check that it builds. Use the [Brew Test Bot](Brew-Test-Bot.md) for this.
<ide>
<del>Verify the formula works if possible. If you can’t tell (e.g. if it’s a
<del>library) trust the original contributor, it worked for them, so chances are it
<del>is fine. If you aren’t an expert in the tool in question, you can’t really
<del>gauge if the formula installed the program correctly. At some point an expert
<del>will come along, cry blue murder that it doesn’t work, and fix it. This is how
<del>open source works. Ideally, request a `test do` block to test that
<del>functionality is consistently available.
<del>
<del>If the formula uses a repository, then the `url` parameter should have a
<del>tag or revision. `url`s have versions and are stable (not yet
<del>implemented!).
<del>
<del>Don't merge any formula updates with failing `brew test`s. If a `test do` block
<del>is failing it needs to be fixed. This may involve replacing more involved tests
<del>with those that are more reliable. This is fine: false positives are better than
<del>false negatives as we don't want to teach maintainers to merge red PRs. If the
<del>test failure is believed to be due to a bug in `Homebrew/brew` or the CI system,
<del>that bug must be fixed, or worked around in the formula to yield a passing test,
<del>before the PR can be merged.
<add>Verify the formula works if possible. If you can’t tell (e.g. if it’s a library) trust the original contributor, it worked for them, so chances are it is fine. If you aren’t an expert in the tool in question, you can’t really gauge if the formula installed the program correctly. At some point an expert will come along, cry blue murder that it doesn’t work, and fix it. This is how open source works. Ideally, request a `test do` block to test that functionality is consistently available.
<add>
<add>If the formula uses a repository, then the `url` parameter should have a tag or revision. `url`s have versions and are stable (not yet implemented!).
<add>
<add>Don't merge any formula updates with failing `brew test`s. If a `test do` block is failing it needs to be fixed. This may involve replacing more involved tests with those that are more reliable. This is fine: false positives are better than false negatives as we don't want to teach maintainers to merge red PRs. If the test failure is believed to be due to a bug in `Homebrew/brew` or the CI system, that bug must be fixed, or worked around in the formula to yield a passing test, before the PR can be merged.
<ide>
<ide> ## Duplicates
<ide>
<ide> Formulae that:
<ide>
<ide> should not be removed from Homebrew. The exception to this rule are [versioned formulae](Versions.md) for which there are higher standards of usage and a maximum number of versions for a given formula.
<ide>
<del>For more information about deprecating, disabling and removing formulae, see the
<del>[Deprecating, Disabling, and Removing Formulae page](Deprecating-Disabling-and-Removing-Formulae.md).
<add>For more information about deprecating, disabling and removing formulae, see the [Deprecating, Disabling, and Removing Formulae page](Deprecating-Disabling-and-Removing-Formulae.md).
<ide>
<ide> ## Detailed merge checklist
<ide>
<del>The following checklist is intended to help maintainers decide on
<del>whether to merge, request changes or close a PR. It also brings more
<del>transparency for contributors in addition to the
<del>[Acceptable Formulae](Acceptable-Formulae.md) requirements.
<add>The following checklist is intended to help maintainers decide on whether to merge, request changes or close a PR. It also brings more transparency for contributors in addition to the [Acceptable Formulae](Acceptable-Formulae.md) requirements.
<ide>
<ide> - previously opened active PRs, as we would like to be fair to contributors who came first
<ide> - patches/`inreplace` that have been applied to upstream and can be removed
<ide><path>docs/Maintainer-Guidelines.md
<ide> # Maintainer Guidelines
<ide>
<del>**This guide is for maintainers.** These special people have **write
<del>access** to Homebrew’s repository and help merge the contributions of
<del>others. You may find what is written here interesting, but it’s
<del>definitely not a beginner’s guide.
<add>**This guide is for maintainers.** These special people have **write access** to Homebrew’s repository and help merge the contributions of others. You may find what is written here interesting, but it’s definitely not a beginner’s guide.
<ide>
<ide> Maybe you were looking for the [Formula Cookbook](Formula-Cookbook.md) or [Cask Cookbook](Cask-Cookbook.md)?
<ide>
<ide> ## Overview
<ide>
<del>All Homebrew maintainers are encouraged to contribute to all parts of the project,
<del>but there are four main teams that maintainers tend to be a part of:
<del>
<del>- `brew` maintainers: this team maintains the [`Homebrew/brew`](https://github.com/Homebrew/brew) repository.
<del> See the [Homebrew/brew Maintainer Guide](Homebrew-brew-Maintainer-Guide.md) for more details about being a `brew` maintainer.
<del>- Core maintainers: this team maintains the [`Homebrew/homebrew-core`](https://github.com/Homebrew/homebrew-core)
<del> repository. See the [Homebrew/homebrew-core Maintainer Guide](Homebrew-homebrew-core-Maintainer-Guide.md)
<del> for more details about being a core maintainer.
<del>- Linux maintainers: this team maintains the [`Homebrew/homebrew-core`](https://github.com/Homebrew/homebrew-core)
<del> repository on Linux.
<del>- Cask maintainers: this team maintains the [`Homebrew/homebrew-cask`](https://github.com/Homebrew/homebrew-cask),
<del> [`Homebrew/homebrew-cask-drivers`](https://github.com/Homebrew/homebrew-cask-drivers),
<del> [`Homebrew/homebrew-cask-fonts`](https://github.com/Homebrew/homebrew-cask-fonts) and
<del> [`Homebrew/homebrew-cask-versions`](https://github.com/Homebrew/homebrew-cask-versions) repositories.
<del> See the [Homebrew/homebrew-cask Maintainer Guide](Homebrew-homebrew-cask-Maintainer-Guide.md)
<del> for more details about being a cask maintainer.
<del>
<del>These documents are meant to serve as guiding principles. As a maintainer, you can make a call to either
<del>request changes from a contributor or help them out based on their comfort and previous contributions.
<del>Remember, as a team we [Prioritise Maintainers Over Users](Maintainers-Avoiding-Burnout.md) to avoid
<del>burnout. If you wish to change or discuss any of the guidelines: open a PR to suggest a change.
<add>All Homebrew maintainers are encouraged to contribute to all parts of the project, but there are four main teams that maintainers tend to be a part of:
<add>
<add>- `brew` maintainers: this team maintains the [`Homebrew/brew`](https://github.com/Homebrew/brew) repository. See the [Homebrew/brew Maintainer Guide](Homebrew-brew-Maintainer-Guide.md) for more details about being a `brew` maintainer.
<add>- Core maintainers: this team maintains the [`Homebrew/homebrew-core`](https://github.com/Homebrew/homebrew-core) repository. See the [Homebrew/homebrew-core Maintainer Guide](Homebrew-homebrew-core-Maintainer-Guide.md) for more details about being a core maintainer.
<add>- Linux maintainers: this team maintains the [`Homebrew/homebrew-core`](https://github.com/Homebrew/homebrew-core) repository on Linux.
<add>- Cask maintainers: this team maintains the [`Homebrew/homebrew-cask`](https://github.com/Homebrew/homebrew-cask), [`Homebrew/homebrew-cask-drivers`](https://github.com/Homebrew/homebrew-cask-drivers), [`Homebrew/homebrew-cask-fonts`](https://github.com/Homebrew/homebrew-cask-fonts) and [`Homebrew/homebrew-cask-versions`](https://github.com/Homebrew/homebrew-cask-versions) repositories. See the [Homebrew/homebrew-cask Maintainer Guide](Homebrew-homebrew-cask-Maintainer-Guide.md) for more details about being a cask maintainer.
<add>
<add>These documents are meant to serve as guiding principles. As a maintainer, you can make a call to either request changes from a contributor or help them out based on their comfort and previous contributions. Remember, as a team we [Prioritise Maintainers Over Users](Maintainers-Avoiding-Burnout.md) to avoid burnout. If you wish to change or discuss any of the guidelines: open a PR to suggest a change.
<ide>
<ide> ## Mission
<ide>
<ide> Homebrew aims to be the missing package manager for macOS (and Linux). Its prima
<ide>
<ide> ### Add comments
<ide>
<del>It may be enough to refer to an issue ticket, but make sure changes are clear so that
<del>if you came to them unaware of the surrounding issues they would make sense
<del>to you. Many times on other projects I’ve seen code removed because the
<del>new guy didn’t know why it was there. Regressions suck.
<add>It may be enough to refer to an issue ticket, but make sure changes are clear so that if you came to them unaware of the surrounding issues they would make sense to you. Many times on other projects I’ve seen code removed because the new guy didn’t know why it was there. Regressions suck.
<ide>
<ide> ### Don’t allow bloated diffs
<ide>
<del>Amend a cherry-pick to remove commits that are only changes in
<del>whitespace. They are not acceptable because our history is important and
<del>`git blame` should be useful.
<add>Amend a cherry-pick to remove commits that are only changes in whitespace. They are not acceptable because our history is important and `git blame` should be useful.
<ide>
<del>Whitespace corrections (to Ruby standard etc.) are allowed (in fact this
<del>is a good opportunity to do it) provided the line itself has some kind
<del>of modification that is not whitespace in it. But be careful about
<del>making changes to inline patches—make sure they still apply.
<add>Whitespace corrections (to Ruby standard etc.) are allowed (in fact this is a good opportunity to do it) provided the line itself has some kind of modification that is not whitespace in it. But be careful about making changes to inline patches—make sure they still apply.
<ide>
<ide> ### Closing issues/PRs
<ide>
<ide><path>docs/Releases.md
<ide> # Releases
<ide>
<del>Since Homebrew 1.0.0 most Homebrew users (those who haven't run a `dev-cmd` or
<del>set `HOMEBREW_DEVELOPER=1` which is ~99.9% based on analytics data) require tags
<del>on the [Homebrew/brew repository](https://github.com/homebrew/brew)
<del>in order to get new versions of Homebrew. There are a few steps in making a new
<del>Homebrew release:
<add>Since Homebrew 1.0.0 most Homebrew users (those who haven't run a `dev-cmd` or set `HOMEBREW_DEVELOPER=1` which is ~99.9% based on analytics data) require tags on the [Homebrew/brew repository](https://github.com/homebrew/brew) in order to get new versions of Homebrew. There are a few steps in making a new Homebrew release:
<ide>
<del>1. Check the [Homebrew/brew pull requests](https://github.com/homebrew/brew/pulls),
<del> [issues](https://github.com/homebrew/brew/issues),
<del> [Homebrew/homebrew-core issues](https://github.com/homebrew/homebrew-core/issues) and
<del> [Homebrew/discussions (forum)](https://github.com/homebrew/discussions/discussions) to see if there is
<del> anything pressing that needs to be fixed or merged before the next release.
<del> If so, fix and merge these changes.
<del>2. Ensure that no code changes have happened for at least a couple of hours (ideally 4 hours),
<del> at least one Homebrew/homebrew-core pull request CI job has completed successfully,
<del> checked the state of the Homebrew/brew `master` CI job (i.e. main jobs green or green after rerunning),
<del> and that you are confident there are no major regressions on the current `master`,
<del> branch.
<del>3. Run `brew release` to create a new draft release. For major or minor version bumps,
<del> pass `--major` or `--minor`, respectively.
<add>1. Check the [Homebrew/brew pull requests](https://github.com/homebrew/brew/pulls), [issues](https://github.com/homebrew/brew/issues), [Homebrew/homebrew-core issues](https://github.com/homebrew/homebrew-core/issues) and [Homebrew/discussions (forum)](https://github.com/homebrew/discussions/discussions) to see if there is anything pressing that needs to be fixed or merged before the next release. If so, fix and merge these changes.
<add>2. Ensure that no code changes have happened for at least a couple of hours (ideally 4 hours), at least one Homebrew/homebrew-core pull request CI job has completed successfully, checked the state of the Homebrew/brew `master` CI job (i.e. main jobs green or green after rerunning), and that you are confident there are no major regressions on the current `master`, branch.
<add>3. Run `brew release` to create a new draft release. For major or minor version bumps, pass `--major` or `--minor`, respectively.
<ide> 4. Publish the draft release on [GitHub](https://github.com/Homebrew/brew/releases).
<ide>
<ide> If this is a major or minor release (e.g. X.0.0 or X.Y.0) then there are a few more steps:
<ide>
<del>1. Before creating the tag you should delete any `odisabled` code, make any
<del> `odeprecated` code `odisabled`, uncomment any `# odeprecated` code and add
<del> any new `odeprecations` that are desired. Also delete any command argument
<del> definitions that pass `replacement: ...`.
<del>2. Write up a release notes blog post to <https://brew.sh>
<del> e.g. [brew.sh#319](https://github.com/Homebrew/brew.sh/pull/319).
<del> This should use the output from `brew release [--major|--minor]` as input but
<del> have the wording adjusted to be more human readable and explain not just what has changed but why.
<del>3. When the release has shipped and the blog post has been merged, tweet the
<del> blog post as the [@MacHomebrew Twitter account](https://twitter.com/MacHomebrew)
<del> or tweet it yourself and retweet it with the @MacHomebrew Twitter account
<del> (credentials are in 1Password).
<add>1. Before creating the tag you should delete any `odisabled` code, make any `odeprecated` code `odisabled`, uncomment any `# odeprecated` code and add any new `odeprecations` that are desired. Also delete any command argument definitions that pass `replacement: ...`.
<add>2. Write up a release notes blog post to <https://brew.sh> e.g. [brew.sh#319](https://github.com/Homebrew/brew.sh/pull/319). This should use the output from `brew release [--major|--minor]` as input but have the wording adjusted to be more human readable and explain not just what has changed but why.
<add>3. When the release has shipped and the blog post has been merged, tweet the blog post as the [@MacHomebrew Twitter account](https://twitter.com/MacHomebrew) or tweet it yourself and retweet it with the @MacHomebrew Twitter account (credentials are in 1Password).
<ide> 4. Consider whether to submit it to other sources e.g. Hacker News, Reddit.
<del> - Pros: gets a wider reach and user feedback
<del> - Cons: negative comments are common and people take this as a chance to
<del> complain about Homebrew (regardless of their usage)
<add> - Pros: gets a wider reach and user feedback
<add> - Cons: negative comments are common and people take this as a chance to complain about Homebrew (regardless of their usage)
<ide>
<del>Please do not manually create a release based on older commits on the `master` branch.
<del>It's very hard to judge whether these have been sufficiently tested by users or if they will
<del>cause negative side-effects with the current state of Homebrew/homebrew-core.
<del>If a new branch is needed ASAP but there are things on `master` that cannot be released yet
<del>(e.g. new deprecations and you want to make a patch release) then revert the relevant PRs,
<del>follow the process above and then revert the reverted PRs to reapply them on `master`.
<add>Please do not manually create a release based on older commits on the `master` branch. It's very hard to judge whether these have been sufficiently tested by users or if they will cause negative side-effects with the current state of Homebrew/homebrew-core. If a new branch is needed ASAP but there are things on `master` that cannot be released yet (e.g. new deprecations and you want to make a patch release) then revert the relevant PRs, follow the process above and then revert the reverted PRs to reapply them on `master`. | 5 |
Javascript | Javascript | change var to let | de20c530d4e5ff2106c0da1a7a1359c5a689da7a | <ide><path>lib/internal/fs/streams.js
<ide> WriteStream.prototype._writev = function(data, cb) {
<ide> const chunks = new Array(len);
<ide> let size = 0;
<ide>
<del> for (var i = 0; i < len; i++) {
<add> for (let i = 0; i < len; i++) {
<ide> const chunk = data[i].chunk;
<ide>
<ide> chunks[i] = chunk; | 1 |
Javascript | Javascript | use template literals | 2b0af7c4e721bde1a26718e21ab1724bc2438a1f | <ide><path>lib/DelegatedModuleFactoryPlugin.js
<ide> class DelegatedModuleFactoryPlugin {
<ide> (data, callback) => {
<ide> const [dependency] = data.dependencies;
<ide> const { request } = dependency;
<del> if (request && request.startsWith(scope + "/")) {
<add> if (request && request.startsWith(`${scope}/`)) {
<ide> const innerRequest = "." + request.substr(scope.length);
<ide> let resolved;
<ide> if (innerRequest in this.options.content) { | 1 |
PHP | PHP | fix code style | 3965f76737c16f4b9890aa40b9ff348f154f9eff | <ide><path>src/I18n/TranslatorRegistry.php
<ide> public function __construct(
<ide> ) {
<ide> parent::__construct($packages, $formatters, $factory, $locale);
<ide>
<del> $this->setFallbackLoader(function($name, $locale) {
<add> $this->setFallbackLoader(function ($name, $locale) {
<ide> $chain = new ChainMessagesLoader([
<ide> new MessagesFileLoader($name, $locale, 'mo'),
<ide> new MessagesFileLoader($name, $locale, 'po') | 1 |
Java | Java | fix the compose covariance | 855252b72070ed448d8b8fdcde63fbc4d48075d2 | <ide><path>rxjava-core/src/main/java/rx/Observable.java
<ide> public void call(Subscriber<? super R> o) {
<ide> * @see <a href="https://github.com/Netflix/RxJava/wiki/Implementing-Your-Own-Operators">RxJava wiki: Implementing Your Own Operators</a>
<ide> * @since 0.20
<ide> */
<del> public <R> Observable<? extends R> compose(Transformer<? super T, ? extends R> transformer) {
<add> public <R> Observable<R> compose(Transformer<? super T, R> transformer) {
<ide> return transformer.call(this);
<ide> }
<ide>
<ide> public <R> Observable<? extends R> compose(Transformer<? super T, ? extends R> t
<ide> * @warn more complete description needed
<ide> * @since 0.20
<ide> */
<del> public static interface Transformer<T, R> extends Func1<Observable<? extends T>, Observable<? extends R>> {
<add> public static interface Transformer<T, R> extends Func1<Observable<? extends T>, Observable<R>> {
<ide> // cover for generics insanity
<ide> }
<ide>
<ide><path>rxjava-core/src/test/java/rx/CovarianceTest.java
<ide> public Integer call(Media t1, Media t2) {
<ide> @Test
<ide> public void testCovarianceOfCompose() {
<ide> Observable<HorrorMovie> movie = Observable.<HorrorMovie> from(new HorrorMovie());
<del> movie.compose(new Transformer<Movie, Movie>() {
<add> Observable<Movie> movie2 = movie.compose(new Transformer<Movie, Movie>() {
<ide>
<ide> @Override
<del> public Observable<? extends Movie> call(Observable<? extends Movie> t1) {
<add> public Observable<Movie> call(Observable<? extends Movie> t1) {
<ide> return Observable.from(new Movie());
<ide> }
<ide>
<ide> public Observable<? extends Movie> call(Observable<? extends Movie> t1) {
<ide> @Test
<ide> public void testCovarianceOfCompose2() {
<ide> Observable<Movie> movie = Observable.<Movie> from(new HorrorMovie());
<del> movie.compose(new Transformer<Movie, Movie>() {
<add> Observable<HorrorMovie> movie2 = movie.compose(new Transformer<Movie, HorrorMovie>() {
<ide> @Override
<del> public Observable<? extends Movie> call(Observable<? extends Movie> t1) {
<add> public Observable<HorrorMovie> call(Observable<? extends Movie> t1) {
<ide> return Observable.from(new HorrorMovie());
<ide> }
<ide> }); | 2 |
Javascript | Javascript | return error objects on error | e3d4a7d9994ff7d1bd144779f8b91d8302112bc0 | <ide><path>lib/inspector.js
<ide> const {
<ide> ERR_INSPECTOR_ALREADY_CONNECTED,
<ide> ERR_INSPECTOR_CLOSED,
<add> ERR_INSPECTOR_COMMAND,
<ide> ERR_INSPECTOR_NOT_AVAILABLE,
<ide> ERR_INSPECTOR_NOT_CONNECTED,
<ide> ERR_INVALID_ARG_TYPE,
<ide> class Session extends EventEmitter {
<ide> if (parsed.id) {
<ide> const callback = this[messageCallbacksSymbol].get(parsed.id);
<ide> this[messageCallbacksSymbol].delete(parsed.id);
<del> if (callback)
<del> callback(parsed.error || null, parsed.result || null);
<add> if (callback) {
<add> if (parsed.error) {
<add> return callback(new ERR_INSPECTOR_COMMAND(parsed.error.code,
<add> parsed.error.message));
<add> }
<add>
<add> callback(null, parsed.result);
<add> }
<ide> } else {
<ide> this.emit(parsed.method, parsed);
<ide> this.emit('inspectorNotification', parsed);
<ide><path>test/sequential/test-inspector-runtime-evaluate-with-timeout.js
<ide> const common = require('../common');
<ide> common.skipIfInspectorDisabled();
<ide>
<ide> (async function test() {
<del> const { strictEqual } = require('assert');
<add> const assert = require('assert');
<ide> const { Session } = require('inspector');
<ide> const { promisify } = require('util');
<ide>
<ide> const session = new Session();
<ide> session.connect();
<ide> session.post = promisify(session.post);
<del> const result = await session.post('Runtime.evaluate', {
<del> expression: 'for(;;);',
<del> timeout: 0
<del> }).catch((e) => e);
<del> strictEqual(result.message, 'Execution was terminated');
<add> await assert.rejects(
<add> session.post('Runtime.evaluate', {
<add> expression: 'for(;;);',
<add> timeout: 0
<add> }),
<add> {
<add> code: 'ERR_INSPECTOR_COMMAND',
<add> message: 'Inspector error -32000: Execution was terminated'
<add> }
<add> );
<ide> session.disconnect();
<ide> })(); | 2 |
Text | Text | remove statement about (ec)dhe performance | 986cf3b986c6c3e1495327eb3aa1ffc3972f2442 | <ide><path>doc/api/tls.md
<ide> the character "E" appended to the traditional abbreviations):
<ide> * [ECDHE][]: An ephemeral version of the Elliptic Curve Diffie-Hellman
<ide> key-agreement protocol.
<ide>
<del>Ephemeral methods may have some performance drawbacks, because key generation
<del>is expensive.
<del>
<ide> To use perfect forward secrecy using `DHE` with the `tls` module, it is required
<ide> to generate Diffie-Hellman parameters and specify them with the `dhparam`
<ide> option to [`tls.createSecureContext()`][]. The following illustrates the use of | 1 |
Ruby | Ruby | add more test coverage for option descriptions | 20452f3edc22ede05814a7355d65c9ef4547b3b0 | <ide><path>Library/Homebrew/test/test_software_spec.rb
<ide> def test_cxx11_option_special_case
<ide> refute @spec.option_defined?("cxx11")
<ide> end
<ide>
<add> def test_option_description
<add> @spec.option("bar", "description")
<add> assert_equal "description", @spec.options.first.description
<add> end
<add>
<add> def test_option_description_defaults_to_empty_string
<add> @spec.option("foo")
<add> assert_equal "", @spec.options.first.description
<add> end
<add>
<ide> def test_depends_on
<ide> @spec.depends_on('foo')
<ide> assert_equal 'foo', @spec.deps.first.name | 1 |
Go | Go | extract ioctl from wrapper | 1214b8897bba2a33a7ded8779bcdc966fe1cb176 | <ide><path>graphdriver/devmapper/attachLoopback.go
<ide> package devmapper
<ide> import (
<ide> "fmt"
<ide> "github.com/dotcloud/docker/utils"
<del> "unsafe"
<ide> )
<ide>
<del>func ioctlLoopCtlGetFree(fd uintptr) (int, error) {
<del> index, _, err := sysSyscall(sysSysIoctl, fd, LoopCtlGetFree, 0)
<del> if err != 0 {
<del> return 0, err
<del> }
<del> return int(index), nil
<del>}
<del>
<del>func ioctlLoopSetFd(loopFd, sparseFd uintptr) error {
<del> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopSetFd, sparseFd); err != 0 {
<del> return err
<del> }
<del> return nil
<del>}
<del>
<del>func ioctlLoopSetStatus64(loopFd uintptr, loopInfo *LoopInfo64) error {
<del> _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopSetStatus64, uintptr(unsafe.Pointer(loopInfo)))
<del> if err != 0 {
<del> return err
<del> }
<del> return nil
<del>}
<del>
<del>func ioctlLoopClrFd(loopFd uintptr) error {
<del> _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopClrFd, 0)
<del> if err != 0 {
<del> return err
<del> }
<del> return nil
<add>func stringToLoopName(src string) [LoNameSize]uint8 {
<add> var dst [LoNameSize]uint8
<add> copy(dst[:], src[:])
<add> return dst
<ide> }
<ide>
<del>// //func dmGetLoopbackBackingFileFct(fd uintptr) (uint64, uint64, sysErrno) {
<del>// func ioctlLoopGetStatus64(loopFd uintptr) (*LoopInfo64, error) {
<del>// var lo64 C.struct_loop_info64
<del>// _, _, err := sysSyscall(sysSysIoctl, fd, C.LOOP_GET_STATUS64, uintptr(unsafe.Pointer(&lo64)))
<del>// return uint64(lo64.lo_device), uint64(lo64.lo_inode), sysErrno(err)
<del>// }
<del>
<del>// func dmLoopbackSetCapacityFct(fd uintptr) sysErrno {
<del>// _, _, err := sysSyscall(sysSysIoctl, fd, C.LOOP_SET_CAPACITY, 0)
<del>// return sysErrno(err)
<del>// }
<del>
<del>// func dmGetBlockSizeFct(fd uintptr) (int64, sysErrno) {
<del>// var size int64
<del>// _, _, err := sysSyscall(sysSysIoctl, fd, C.BLKGETSIZE64, uintptr(unsafe.Pointer(&size)))
<del>// return size, sysErrno(err)
<del>// }
<del>
<del>// func getBlockSizeFct(fd uintptr, size *uint64) sysErrno {
<del>// _, _, err := sysSyscall(sysSysIoctl, fd, C.BLKGETSIZE64, uintptr(unsafe.Pointer(&size)))
<del>// return sysErrno(err)
<del>// }
<del>
<ide> func getNextFreeLoopbackIndex() (int, error) {
<ide> f, err := osOpenFile("/dev/loop-control", osORdOnly, 0644)
<ide> if err != nil {
<ide> func openNextAvailableLoopback(index int, sparseFile *osFile) (loopFile *osFile,
<ide> return loopFile, nil
<ide> }
<ide>
<del>func stringToLoopName(src string) [LoNameSize]uint8 {
<del> var dst [LoNameSize]uint8
<del> copy(dst[:], src[:])
<del> return dst
<del>}
<del>
<ide> // attachLoopDevice attaches the given sparse file to the next
<ide> // available loopback device. It returns an opened *osFile.
<ide> func attachLoopDevice(sparseName string) (loop *osFile, err error) {
<ide><path>graphdriver/devmapper/devmapper.go
<ide> func (t *Task) GetNextTarget(next uintptr) (nextPtr uintptr, start uint64,
<ide> }
<ide>
<ide> func getLoopbackBackingFile(file *osFile) (uint64, uint64, error) {
<del> dev, inode, err := DmGetLoopbackBackingFile(file.Fd())
<del> if err != 0 {
<add> loopInfo, err := ioctlLoopGetStatus64(file.Fd())
<add> if err != nil {
<add> utils.Errorf("Error get loopback backing file: %s\n", err)
<ide> return 0, 0, ErrGetLoopbackBackingFile
<ide> }
<del> return dev, inode, nil
<add> return loopInfo.loDevice, loopInfo.loInode, nil
<ide> }
<ide>
<ide> func LoopbackSetCapacity(file *osFile) error {
<del> if err := DmLoopbackSetCapacity(file.Fd()); err != 0 {
<add> if err := ioctlLoopSetCapacity(file.Fd(), 0); err != nil {
<add> utils.Errorf("Error loopbackSetCapacity: %s", err)
<ide> return ErrLoopbackSetCapacity
<ide> }
<ide> return nil
<ide> func RemoveDevice(name string) error {
<ide> }
<ide>
<ide> func GetBlockDeviceSize(file *osFile) (uint64, error) {
<del> size, errno := DmGetBlockSize(file.Fd())
<del> if size == -1 || errno != 0 {
<add> size, err := ioctlBlkGetSize64(file.Fd())
<add> if err != nil {
<add> utils.Errorf("Error getblockdevicesize: %s", err)
<ide> return 0, ErrGetBlockSize
<ide> }
<ide> return uint64(size), nil
<ide><path>graphdriver/devmapper/devmapper_wrapper.go
<ide> import (
<ide> )
<ide>
<ide> type (
<del> CDmTask C.struct_dm_task
<add> CDmTask C.struct_dm_task
<add>
<ide> CLoopInfo64 C.struct_loop_info64
<ide> LoopInfo64 struct {
<ide> loDevice uint64 /* ioctl r/o */
<ide> type (
<ide>
<ide> // FIXME: Make sure the values are defined in C
<ide> const (
<del> LoopSetFd = C.LOOP_SET_FD
<del> LoopCtlGetFree = C.LOOP_CTL_GET_FREE
<del> LoopSetStatus64 = C.LOOP_SET_STATUS64
<del> LoopClrFd = C.LOOP_CLR_FD
<add> LoopSetFd = C.LOOP_SET_FD
<add> LoopCtlGetFree = C.LOOP_CTL_GET_FREE
<add> LoopGetStatus64 = C.LOOP_GET_STATUS64
<add> LoopSetStatus64 = C.LOOP_SET_STATUS64
<add> LoopClrFd = C.LOOP_CLR_FD
<add> LoopSetCapacity = C.LOOP_SET_CAPACITY
<add>
<ide> LoFlagsAutoClear = C.LO_FLAGS_AUTOCLEAR
<ide> LoFlagsReadOnly = C.LO_FLAGS_READ_ONLY
<ide> LoFlagsPartScan = C.LO_FLAGS_PARTSCAN
<ide> LoKeySize = C.LO_KEY_SIZE
<ide> LoNameSize = C.LO_NAME_SIZE
<add>
<add> BlkGetSize64 = C.BLKGETSIZE64
<ide> )
<ide>
<ide> var (
<del> DmGetBlockSize = dmGetBlockSizeFct
<del> DmGetLibraryVersion = dmGetLibraryVersionFct
<del> DmGetNextTarget = dmGetNextTargetFct
<del> DmLogInitVerbose = dmLogInitVerboseFct
<del> DmSetDevDir = dmSetDevDirFct
<del> DmTaskAddTarget = dmTaskAddTargetFct
<del> DmTaskCreate = dmTaskCreateFct
<del> DmTaskDestroy = dmTaskDestroyFct
<del> DmTaskGetInfo = dmTaskGetInfoFct
<del> DmTaskRun = dmTaskRunFct
<del> DmTaskSetAddNode = dmTaskSetAddNodeFct
<del> DmTaskSetCookie = dmTaskSetCookieFct
<del> DmTaskSetMessage = dmTaskSetMessageFct
<del> DmTaskSetName = dmTaskSetNameFct
<del> DmTaskSetRo = dmTaskSetRoFct
<del> DmTaskSetSector = dmTaskSetSectorFct
<del> DmUdevWait = dmUdevWaitFct
<del> GetBlockSize = getBlockSizeFct
<del> LogWithErrnoInit = logWithErrnoInitFct
<del> DmGetLoopbackBackingFile = dmGetLoopbackBackingFileFct
<del> DmLoopbackSetCapacity = dmLoopbackSetCapacityFct
<add> DmGetLibraryVersion = dmGetLibraryVersionFct
<add> DmGetNextTarget = dmGetNextTargetFct
<add> DmLogInitVerbose = dmLogInitVerboseFct
<add> DmSetDevDir = dmSetDevDirFct
<add> DmTaskAddTarget = dmTaskAddTargetFct
<add> DmTaskCreate = dmTaskCreateFct
<add> DmTaskDestroy = dmTaskDestroyFct
<add> DmTaskGetInfo = dmTaskGetInfoFct
<add> DmTaskRun = dmTaskRunFct
<add> DmTaskSetAddNode = dmTaskSetAddNodeFct
<add> DmTaskSetCookie = dmTaskSetCookieFct
<add> DmTaskSetMessage = dmTaskSetMessageFct
<add> DmTaskSetName = dmTaskSetNameFct
<add> DmTaskSetRo = dmTaskSetRoFct
<add> DmTaskSetSector = dmTaskSetSectorFct
<add> DmUdevWait = dmUdevWaitFct
<add> LogWithErrnoInit = logWithErrnoInitFct
<ide> )
<ide>
<ide> func free(p *C.char) {
<ide> func dmGetNextTargetFct(task *CDmTask, next uintptr, start, length *uint64, targ
<ide> return uintptr(nextp)
<ide> }
<ide>
<del>func dmGetLoopbackBackingFileFct(fd uintptr) (uint64, uint64, sysErrno) {
<del> var lo64 C.struct_loop_info64
<del> _, _, err := sysSyscall(sysSysIoctl, fd, C.LOOP_GET_STATUS64, uintptr(unsafe.Pointer(&lo64)))
<del> return uint64(lo64.lo_device), uint64(lo64.lo_inode), sysErrno(err)
<del>}
<del>
<del>func dmLoopbackSetCapacityFct(fd uintptr) sysErrno {
<del> _, _, err := sysSyscall(sysSysIoctl, fd, C.LOOP_SET_CAPACITY, 0)
<del> return sysErrno(err)
<del>}
<del>
<del>func dmGetBlockSizeFct(fd uintptr) (int64, sysErrno) {
<del> var size int64
<del> _, _, err := sysSyscall(sysSysIoctl, fd, C.BLKGETSIZE64, uintptr(unsafe.Pointer(&size)))
<del> return size, sysErrno(err)
<del>}
<del>
<del>func getBlockSizeFct(fd uintptr, size *uint64) sysErrno {
<del> _, _, err := sysSyscall(sysSysIoctl, fd, C.BLKGETSIZE64, uintptr(unsafe.Pointer(&size)))
<del> return sysErrno(err)
<del>}
<del>
<ide> func dmUdevWaitFct(cookie uint) int {
<ide> return int(C.dm_udev_wait(C.uint32_t(cookie)))
<ide> }
<ide><path>graphdriver/devmapper/ioctl.go
<add>package devmapper
<add>
<add>import (
<add> "unsafe"
<add>)
<add>
<add>func ioctlLoopCtlGetFree(fd uintptr) (int, error) {
<add> index, _, err := sysSyscall(sysSysIoctl, fd, LoopCtlGetFree, 0)
<add> if err != 0 {
<add> return 0, err
<add> }
<add> return int(index), nil
<add>}
<add>
<add>func ioctlLoopSetFd(loopFd, sparseFd uintptr) error {
<add> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopSetFd, sparseFd); err != 0 {
<add> return err
<add> }
<add> return nil
<add>}
<add>
<add>func ioctlLoopSetStatus64(loopFd uintptr, loopInfo *LoopInfo64) error {
<add> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopSetStatus64, uintptr(unsafe.Pointer(loopInfo))); err != 0 {
<add> return err
<add> }
<add> return nil
<add>}
<add>
<add>func ioctlLoopClrFd(loopFd uintptr) error {
<add> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopClrFd, 0); err != 0 {
<add> return err
<add> }
<add> return nil
<add>}
<add>
<add>func ioctlLoopGetStatus64(loopFd uintptr) (*LoopInfo64, error) {
<add> loopInfo := &LoopInfo64{}
<add>
<add> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopGetStatus64, uintptr(unsafe.Pointer(loopInfo))); err != 0 {
<add> return nil, err
<add> }
<add> return loopInfo, nil
<add>}
<add>
<add>func ioctlLoopSetCapacity(loopFd uintptr, value int) error {
<add> if _, _, err := sysSyscall(sysSysIoctl, loopFd, LoopSetCapacity, uintptr(value)); err != 0 {
<add> return err
<add> }
<add> return nil
<add>}
<add>
<add>func ioctlBlkGetSize64(fd uintptr) (int64, error) {
<add> var size int64
<add> if _, _, err := sysSyscall(sysSysIoctl, fd, BlkGetSize64, uintptr(unsafe.Pointer(&size))); err != 0 {
<add> return 0, err
<add> }
<add> return size, nil
<add>} | 4 |
Text | Text | update local env setup | ce092d657701a800a93971558d10fe920c51b912 | <ide><path>CONTRIBUTING.md
<ide> To setup your local dev environment, you will need the following tools.
<ide> 2. [git](https://github.com/) for code repository management.
<ide> 3. [python](https://www.python.org/) to build and code in Keras.
<ide>
<del>Using Apple Mac as an example (and linux will be very similar), the following
<del>commands set up and check the configuration of a local workspace.
<add>The following commands checks the tools above are successfully installed.
<add>Note that Keras requires at least Python 3.7 to run.
<ide>
<ide> ```shell
<del>scottzhu-macbookpro2:~ scottzhu$ which bazel
<del>/Users/scottzhu/bin/bazel
<add>bazel --version
<add>git --version
<add>python --version
<add>```
<add>
<add>A [Python virtual environment](https://docs.python.org/3/tutorial/venv.html)
<add>(venv) is a powerful tool to create a self-contained environment that isolates
<add>any change from the system level config. It is highly recommended to avoid any
<add>unexpected dependency or version issue.
<add>
<add>With the following commands, you create a new venv, named `venv_dir`.
<add>
<add>```shell
<add>mkdir venv_dir
<add>python3 -m venv venv_dir
<add>```
<ide>
<del>scottzhu-macbookpro2:~ scottzhu$ which git
<del>/usr/local/git/current/bin/git
<add>You can activate the venv with the following command.
<add>You should always run the tests with the venv activated.
<add>You need to activate the venv everytime you open a new shell.
<ide>
<del>scottzhu-macbookpro2:~ scottzhu$ which python
<del>/usr/bin/python
<add>```shell
<add>source venv_dir/bin/activate # for linux or MacOS
<add>venv_dir\Scripts\activate.bat # for Windows
<add>```
<ide>
<del># Keras requires at least python 3.7
<del>scottzhu-macbookpro2:~ scottzhu$ python --version
<del>Python 3.9.6
<add>Clone your forked repo to your local machine. Go to the cloned directory to
<add>install the dependencies into the venv.
<add>
<add>```shell
<add>git clone https://github.com/YOUR_GITHUB_USERNAME/keras.git
<add>cd keras
<add>pip install -r requirements.txt
<ide> ```
<ide>
<del>A [Python virtual environment](https://docs.python.org/3/tutorial/venv.html) is a
<del>powerful tool to create a self-contained environment that isolates any change
<del>from the system level config. It is highly recommended to avoid any unexpected
<del>dependency or version issue.
<add>The environment setup is completed. You may need to update the `tf-nightly`
<add>version regularly to keep your environment up-to-date with the following
<add>command.
<ide>
<ide> ```shell
<del>scottzhu-macbookpro2:workspace scottzhu$ git clone https://github.com/keras-team/keras.git
<del>Cloning into 'keras'...
<del>remote: Enumerating objects: 492, done.
<del>remote: Counting objects: 100% (492/492), done.
<del>remote: Compressing objects: 100% (126/126), done.
<del>remote: Total 35951 (delta 381), reused 443 (delta 366), pack-reused 35459
<del>Receiving objects: 100% (35951/35951), 15.70 MiB | 16.09 MiB/s, done.
<del>Resolving deltas: 100% (26243/26243), done.
<del>
<del>scottzhu-macbookpro2:workspace scottzhu$ mkdir venv_dir
<del>scottzhu-macbookpro2:workspace scottzhu$ python3 -m venv venv_dir
<del>scottzhu-macbookpro2:workspace scottzhu$ source venv_dir/bin/activate
<del>(venv_dir) scottzhu-macbookpro2:workspace scottzhu$ ls
<del>keras venv_dir
<del>
<del>(venv_dir) scottzhu-macbookpro2:workspace scottzhu$ cd keras
<del>
<del>(venv_dir) scottzhu-macbookpro2:workspace scottzhu$ pip install -r requirements.txt
<del>Collecting pandas
<del> Using cached pandas-1.2.3-cp38-cp38-manylinux1_x86_64.whl (9.7 MB)
<del>Collecting pydot
<del>...
<del>...
<del>...
<del>
<del># Since tf-nightly uses keras-nightly as a dependency, we need to uninstall
<del># keras-nightly so that tests will run against keras code in local workspace.
<del>(venv_dir) scottzhu-macbookpro2:workspace scottzhu$ pip uninstall keras-nightly
<del>Found existing installation: keras-nightly 2.5.0.dev2021032500
<del>Uninstalling keras-nightly-2.5.0.dev2021032500:
<del> Successfully uninstalled keras-nightly-2.5.0.dev2021032500
<add>pip install --upgrade tf-nightly
<ide> ```
<ide>
<ide> ## Run tests | 1 |
PHP | PHP | remove unneeded variable | 88c745afb24aef1feeb4aaa76c4f87d0cf49ec47 | <ide><path>src/Illuminate/Database/Connection.php
<ide> public function selectFromWriteConnection($query, $bindings = [])
<ide> */
<ide> public function select($query, $bindings = [], $useReadPdo = true)
<ide> {
<del> return $this->run($query, $bindings, function ($me, $query, $bindings) use ($useReadPdo) {
<add> return $this->run($query, $bindings, function ($query, $bindings) use ($useReadPdo) {
<ide> if ($this->pretending()) {
<ide> return [];
<ide> }
<ide> public function select($query, $bindings = [], $useReadPdo = true)
<ide> */
<ide> public function cursor($query, $bindings = [], $useReadPdo = true)
<ide> {
<del> $statement = $this->run($query, $bindings, function ($me, $query, $bindings) use ($useReadPdo) {
<del> if ($me->pretending()) {
<add> $statement = $this->run($query, $bindings, function ($query, $bindings) use ($useReadPdo) {
<add> if ($this->pretending()) {
<ide> return [];
<ide> }
<ide>
<ide> public function cursor($query, $bindings = [], $useReadPdo = true)
<ide>
<ide> $statement->setFetchMode($this->fetchMode);
<ide>
<del> $me->bindValues(
<del> $statement, $me->prepareBindings($bindings)
<add> $this->bindValues(
<add> $statement, $this->prepareBindings($bindings)
<ide> );
<ide>
<ide> // Next, we'll execute the query against the database and return the statement
<ide> public function delete($query, $bindings = [])
<ide> */
<ide> public function statement($query, $bindings = [])
<ide> {
<del> return $this->run($query, $bindings, function ($me, $query, $bindings) {
<del> if ($me->pretending()) {
<add> return $this->run($query, $bindings, function ($query, $bindings) {
<add> if ($this->pretending()) {
<ide> return true;
<ide> }
<ide>
<ide> $statement = $this->getPdo()->prepare($query);
<ide>
<del> $this->bindValues($statement, $me->prepareBindings($bindings));
<add> $this->bindValues($statement, $this->prepareBindings($bindings));
<ide>
<ide> return $statement->execute();
<ide> });
<ide> public function statement($query, $bindings = [])
<ide> */
<ide> public function affectingStatement($query, $bindings = [])
<ide> {
<del> return $this->run($query, $bindings, function ($me, $query, $bindings) {
<del> if ($me->pretending()) {
<add> return $this->run($query, $bindings, function ($query, $bindings) {
<add> if ($this->pretending()) {
<ide> return 0;
<ide> }
<ide>
<ide> // For update or delete statements, we want to get the number of rows affected
<ide> // by the statement and return that back to the developer. We'll first need
<ide> // to execute the statement and then we'll use PDO to fetch the affected.
<del> $statement = $me->getPdo()->prepare($query);
<add> $statement = $this->getPdo()->prepare($query);
<ide>
<del> $this->bindValues($statement, $me->prepareBindings($bindings));
<add> $this->bindValues($statement, $this->prepareBindings($bindings));
<ide>
<ide> $statement->execute();
<ide>
<ide> public function affectingStatement($query, $bindings = [])
<ide> */
<ide> public function unprepared($query)
<ide> {
<del> return $this->run($query, [], function ($me, $query) {
<del> if ($me->pretending()) {
<add> return $this->run($query, [], function ($query) {
<add> if ($this->pretending()) {
<ide> return true;
<ide> }
<ide>
<del> return (bool) $me->getPdo()->exec($query);
<add> return (bool) $this->getPdo()->exec($query);
<ide> });
<ide> }
<ide>
<ide> protected function runQueryCallback($query, $bindings, Closure $callback)
<ide> // run the SQL against the PDO connection. Then we can calculate the time it
<ide> // took to execute and log the query SQL, bindings and time in our memory.
<ide> try {
<del> $result = $callback($this, $query, $bindings);
<add> $result = $callback($query, $bindings);
<ide> }
<ide>
<ide> // If an exception occurs when attempting to run a query, we'll format the error | 1 |
Text | Text | add link to meteor talk | 3119d66e26bbda43f657128859c092b24e53c829 | <ide><path>docs/docs/videos.md
<ide> by [Stoyan Stefanov](http://www.phpied.com/)
<ide> <iframe width="650" height="315" src="//www.youtube.com/embed/R2CGKiNnPFo" frameborder="0" allowfullscreen></iframe>
<ide>
<ide> **In Russian** by [Alexander Solovyov](http://solovyov.net/)
<add>
<add>### -Functional DOM programming" - Meteor DevShop 11
<add>
<add><iframe width="650" height="315" src="//www.youtube.com/embed/Lqcs6hPOcFw?t=49m23s" frameborder="0" allowfullscreen></iframe> | 1 |
Text | Text | add explicit directions to link | 50cafde97d348e9fb5a07c2ecaf266eedb2fc64b | <ide><path>test/README.md
<ide> But don't give up hope!!! Although our tests may appear complex and overwhelming
<ide> ## tl;dr
<ide> * Clone repo
<ide> * install and link deps
<del> * `yarn install.`
<add> * `yarn install && yarn link && yarn link webpack`
<ide> * `npm run test` or `npm t`
<ide> * To run an individual suite: (recommended during development for easier isolated diffs)
<ide> | 1 |
PHP | PHP | continue refactor of view files | 4320acdb96b4fed9e0b31f708d3599b7d488bfca | <ide><path>cake/libs/view/view.php
<ide> class View extends Object {
<ide> var $loaded = array();
<ide>
<ide> /**
<del> * File extension. Defaults to Cake's conventional ".thtml".
<add> * File extension. Defaults to Cake's template ".ctp".
<ide> *
<ide> * @var array
<ide> */
<del> var $ext = '.thtml';
<add> var $ext = '.ctp';
<ide>
<ide> /**
<ide> * Sub-directory for this view file.
<ide> function __construct(&$controller) {
<ide>
<ide> /**
<ide> * Renders view for given action and layout. If $file is given, that is used
<del> * for a view filename (e.g. customFunkyView.thtml).
<add> * for a view filename (e.g. customFunkyView.ctp).
<ide> *
<ide> * @param string $action Name of action to render for
<ide> * @param string $layout Layout to use
<ide> function render($action = null, $layout = null, $file = null) {
<ide>
<ide> if (file_exists(VIEWS . $viewDir . DS . $errorAction . $this->ext)) {
<ide> $missingViewFileName = VIEWS . $viewDir . DS . $errorAction . $this->ext;
<del> } elseif($missingViewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . $viewDir . DS . $errorAction . '.thtml')) {
<add> } elseif($missingViewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . $viewDir . DS . $errorAction . '.ctp')) {
<ide> } else {
<ide> $missingViewFileName = false;
<ide> }
<ide> function render($action = null, $layout = null, $file = null) {
<ide> }
<ide>
<ide> if ($viewFileName && !$this->hasRendered) {
<del> if (substr($viewFileName, -5) === 'thtml') {
<add> if (substr($viewFileName, -4) === 'ctp' || substr($viewFileName, -5) === 'thtml') {
<ide> $out = View::_render($viewFileName, $this->_viewVars);
<ide> } else {
<ide> $out = $this->_render($viewFileName, $this->_viewVars);
<ide> function renderElement($name, $params = array()) {
<ide> if (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'elements' . DS . $name . $this->ext)) {
<ide> $elementFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'elements' . DS . $name . $this->ext;
<ide> return $this->_render($elementFileName, array_merge($this->_viewVars, $params), false);
<add> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'elements' . DS . $name . '.thtml')) {
<add> $elementFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'elements' . DS . $name . '.thtml';
<add> return $this->_render($elementFileName, array_merge($this->_viewVars, $params), false);
<ide> }
<ide> }
<ide>
<ide> function renderElement($name, $params = array()) {
<ide> if (file_exists($path . 'elements' . DS . $name . $this->ext)) {
<ide> $elementFileName = $path . 'elements' . DS . $name . $this->ext;
<ide> return $this->_render($elementFileName, array_merge($this->_viewVars, $params), false);
<add> } elseif (file_exists($path . 'elements' . DS . $name . '.thtml')) {
<add> $elementFileName = $path . 'elements' . DS . $name . '.thtml';
<add> return $this->_render($elementFileName, array_merge($this->_viewVars, $params), false);
<ide> }
<ide> }
<ide> return "(Error rendering Element: {$name})";
<ide> function renderLayout($content_for_layout) {
<ide> $layout_fn = $this->_getLayoutFileName();
<ide>
<ide> if (DEBUG > 2 && $this->controller != null) {
<del> $debug = View::_render(LIBS . 'view' . DS . 'templates' . DS . 'elements' . DS . 'dump.thtml', array('controller' => $this->controller), false);
<add> $debug = View::_render(LIBS . 'view' . DS . 'templates' . DS . 'elements' . DS . 'dump.ctp', array('controller' => $this->controller), false);
<ide> } else {
<ide> $debug = '';
<ide> }
<ide> function renderLayout($content_for_layout) {
<ide> $data_for_layout = array_merge($data_for_layout, $this->loaded);
<ide> }
<ide>
<del> if (substr($layout_fn, -5) === 'thtml') {
<add> if (substr($layout_fn, -4) === 'ctp' || substr($layout_fn, -5) === 'thtml') {
<ide> $out = View::_render($layout_fn, $data_for_layout, $loadHelpers, true);
<ide> } else {
<ide> $out = $this->_render($layout_fn, $data_for_layout, $loadHelpers);
<ide> function set($one, $two = null) {
<ide> function error($code, $name, $message) {
<ide> header ("HTTP/1.0 {$code} {$name}");
<ide> print ($this->_render(VIEWS . 'layouts/error.thtml', array('code' => $code,
<del> 'name' => $name,
<del> 'message' => $message)));
<add> 'name' => $name,
<add> 'message' => $message)));
<ide> }
<ide>
<ide> /**************************************************************************
<ide> * Private methods.
<ide> *************************************************************************/
<ide>
<ide> /**
<del> * Returns filename of given action's template file (.thtml) as a string. CamelCased action names will be under_scored! This means that you can have LongActionNames that refer to long_action_names.thtml views.
<add> * Returns filename of given action's template file (.ctp) as a string. CamelCased action names will be under_scored! This means that you can have LongActionNames that refer to long_action_names.ctp views.
<ide> *
<ide> * @param string $action Controller action to find template filename for
<ide> * @return string Template filename
<ide> function _getViewFileName($action) {
<ide> $viewFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . $this->ext;
<ide>
<ide> if (file_exists(APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . $this->subDir . $type . $action . $this->ext)) {
<del> $viewFileName = APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . $this->subDir . $type . $action . $this->ext;
<del> return $viewFileName;
<del>
<add> return APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . $this->subDir . $type . $action . $this->ext;
<ide> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . $this->ext)) {
<del> $viewFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . $this->ext;
<del> return $viewFileName;
<del>
<add> return APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . $this->ext;
<add> } elseif (file_exists(APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . $this->subDir . $type . $action . '.thtml')) {
<add> return APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . $this->subDir . $type . $action . '.thtml';
<add> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . '.thtml')) {
<add> return APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . '.thtml';
<ide> } else {
<add> $viewFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . $this->viewPath . DS . $action . $this->ext;
<ide> return $this->cakeError('missingView', array(array(
<ide> 'className' => $this->name,
<ide> 'action' => $action,
<ide> function _getViewFileName($action) {
<ide>
<ide> foreach($paths->viewPaths as $path) {
<ide> if (file_exists($path . $this->viewPath . DS . $this->subDir . $type . $action . $this->ext)) {
<del> $viewFileName = $path . $this->viewPath . DS . $this->subDir . $type . $action . $this->ext;
<del> return $viewFileName;
<add> return $path . $this->viewPath . DS . $this->subDir . $type . $action . $this->ext;
<add> } elseif (file_exists($path . $this->viewPath . DS . $this->subDir . $type . $action . '.thtml')) {
<add> return $path . $this->viewPath . DS . $this->subDir . $type . $action . '.thtml';
<ide> }
<ide> }
<ide>
<del> if ($viewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . 'errors' . DS . $type . $action . '.thtml')) {
<del> } elseif($viewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . $this->viewPath . DS . $type . $action . '.thtml')) {
<add> if ($viewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . 'errors' . DS . $type . $action . '.ctp')) {
<add> } elseif($viewFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . $this->viewPath . DS . $type . $action . '.ctp')) {
<ide> } else {
<ide> $viewFileName = VIEWS . $this->viewPath . DS . $this->subDir . $type . $action . $this->ext;
<ide> }
<ide> function _getViewFileName($action) {
<ide> /**
<ide> * Returns layout filename for this template as a string.
<ide> *
<del> * @return string Filename for layout file (.thtml).
<add> * @return string Filename for layout file (.ctp).
<ide> * @access private
<ide> */
<ide> function _getLayoutFileName() {
<ide> function _getLayoutFileName() {
<ide> }
<ide>
<ide> if(!is_null($this->layoutPath)){
<del> $this->layoutPath = $this->layoutPath . DS;
<add> $type = $this->layoutPath . DS;
<ide> }
<ide>
<ide> if (!is_null($this->plugin)) {
<del> if (file_exists(APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $action . $this->ext)) {
<del> $layoutFileName = APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $action . $this->ext;
<del> return $layoutFileName;
<del>
<del> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->layout . $this->ext)) {
<del> $layoutFileName = APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->layout . $this->ext;
<del> return $layoutFileName;
<del>
<add> if (file_exists(APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . $this->ext)) {
<add> return APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . $this->ext;
<add> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . $this->ext)) {
<add> return APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . $this->ext;
<add> } elseif (file_exists(APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . '.thtml')) {
<add> return APP . 'views' . DS . 'plugins' . DS . $this->plugin . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . '.thtml';
<add> } elseif (file_exists(APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . '.thtml')) {
<add> return APP . 'plugins' . DS . $this->plugin . DS . 'views' . DS . 'layouts' . DS . $this->subDir . $type . $this->layout . '.thtml';
<ide> }
<ide> }
<ide>
<ide> $paths = Configure::getInstance();
<ide>
<ide> foreach($paths->viewPaths as $path) {
<ide> if (file_exists($path . 'layouts' . DS . $this->subDir . $this->layoutPath . $type . $this->layout . $this->ext)) {
<del> $layoutFileName = $path . 'layouts' . DS . $this->subDir . $this->layoutPath . $type . $this->layout . $this->ext;
<del> return $layoutFileName;
<add> return $path . 'layouts' . DS . $this->subDir . $this->layoutPath . $type . $this->layout . $this->ext;
<add> } elseif (file_exists($path . 'layouts' . DS . $this->subDir . $this->layoutPath . $type . $this->layout . '.thtml')) {
<add> return $path . 'layouts' . DS . $this->subDir . $this->layoutPath . $type . $this->layout . '.thtml';
<ide> }
<ide> }
<ide>
<del>
<del> if($layoutFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . 'layouts' . DS . $type . $this->layout . '.thtml')) {
<add> if($layoutFileName = fileExistsInPath(LIBS . 'view' . DS . 'templates' . DS . 'layouts' . DS . $type . $this->layout . '.ctp')) {
<ide> } else {
<ide> $layoutFileName = LAYOUTS . $type . $this->layout . $this->ext;
<ide> } | 1 |
Javascript | Javascript | fix flaky test | ad997344893403948a70edb47b0d056658458e39 | <ide><path>test/integration/react-streaming-and-server-components/test/rsc.js
<ide> export default function (context, { runtime, env }) {
<ide> expect(content).toContain('component:index.server')
<ide>
<ide> await browser.waitForElementByCss('#goto-streaming-rsc').click()
<del> await new Promise((res) => setTimeout(res, 1500))
<add>
<add> // Wait for navigation and streaming to finish.
<add> await check(
<add> () => browser.elementByCss('#content').text(),
<add> 'next_streaming_data'
<add> )
<ide> expect(await browser.url()).toBe(
<ide> `http://localhost:${context.appPort}/streaming-rsc`
<ide> )
<del>
<del> content = await browser.elementByCss('#content').text()
<del> expect(content).toContain('next_streaming_data')
<ide> })
<ide>
<ide> it('should handle streaming server components correctly', async () => { | 1 |
Python | Python | add exhaustive test for einsum specialized loops | d9dae76fce6fe3e87ee01670d3905f6fbdd04569 | <ide><path>numpy/core/tests/test_einsum.py
<ide> import itertools
<ide>
<add>import pytest
<add>
<ide> import numpy as np
<ide> from numpy.testing import (
<ide> assert_, assert_equal, assert_array_equal, assert_almost_equal,
<ide> def test_einsum_all_contig_non_contig_output(self):
<ide> np.einsum('ij,jk->ik', x, x, out=out)
<ide> assert_array_equal(out.base, correct_base)
<ide>
<add> @pytest.mark.parametrize("dtype",
<add> np.typecodes["AllFloat"] + np.typecodes["AllInteger"])
<add> def test_different_paths(self, dtype):
<add> # Test originally added to cover broken float16 path: gh-20305
<add> # Likely most are covered elsewhere, at least partially.
<add> dtype = np.dtype(dtype)
<add> # Simple test, designed to excersize most specialized code paths,
<add> # note the +0.5 for floats. This makes sure we use a float value
<add> # where the results must be exact.
<add> arr = (np.arange(7) + 0.5).astype(dtype)
<add> scalar = np.array(2, dtype=dtype)
<add>
<add> # contig -> scalar:
<add> res = np.einsum('i->', arr)
<add> assert res == arr.sum()
<add> # contig, contig -> contig:
<add> res = np.einsum('i,i->i', arr, arr)
<add> assert_array_equal(res, arr * arr)
<add> # noncontig, noncontig -> contig:
<add> res = np.einsum('i,i->i', arr.repeat(2)[::2], arr.repeat(2)[::2])
<add> assert_array_equal(res, arr * arr)
<add> # contig + contig -> scalar
<add> assert np.einsum('i,i->', arr, arr) == (arr * arr).sum()
<add> # contig + scalar -> contig (with out)
<add> out = np.ones(7, dtype=dtype)
<add> res = np.einsum('i,->i', arr, dtype.type(2), out=out)
<add> assert_array_equal(res, arr * dtype.type(2))
<add> # scalar + contig -> contig (with out)
<add> res = np.einsum(',i->i', scalar, arr)
<add> assert_array_equal(res, arr * dtype.type(2))
<add> # scalar + contig -> scalar
<add> res = np.einsum(',i->', scalar, arr)
<add> # Use einsum to compare to not have difference due to sum round-offs:
<add> assert res == np.einsum('i->', scalar * arr)
<add> # contig + scalar -> scalar
<add> res = np.einsum('i,->', arr, scalar)
<add> # Use einsum to compare to not have difference due to sum round-offs:
<add> assert res == np.einsum('i->', scalar * arr)
<add> # contig + contig + contig -> scalar
<add> arr = np.array([0.5, 0.5, 0.25, 4.5, 3.], dtype=dtype)
<add> res = np.einsum('i,i,i->', arr, arr, arr)
<add> assert_array_equal(res, (arr * arr * arr).sum())
<add> # four arrays:
<add> res = np.einsum('i,i,i,i->', arr, arr, arr, arr)
<add> assert_array_equal(res, (arr * arr * arr * arr).sum())
<add>
<ide> def test_small_boolean_arrays(self):
<ide> # See gh-5946.
<ide> # Use array of True embedded in False. | 1 |
Python | Python | add detection generator back | 394cefcc13bf2a07121b94f1426d038700b1b61a | <ide><path>official/vision/beta/projects/yolo/modeling/layers/detection_generator.py
<add>"""Contains common building blocks for yolo neural networks."""
<add>import tensorflow as tf
<add>import tensorflow.keras as ks
<add>import tensorflow.keras.backend as K
<add>
<add># from official.vision.beta.projects.yolo.ops import loss_utils
<add>from official.vision.beta.projects.yolo.ops import box_ops
<add># from official.vision.beta.projects.yolo.losses import yolo_loss
<add># from official.vision.beta.projects.yolo.ops import nms_ops
<add>
<add>
<add>@ks.utils.register_keras_serializable(package='yolo')
<add>class YoloLayer(ks.Model):
<add>
<add> def __init__(self,
<add> masks,
<add> anchors,
<add> classes,
<add> iou_thresh=0.0,
<add> ignore_thresh=0.7,
<add> truth_thresh=1.0,
<add> nms_thresh=0.6,
<add> max_delta=10.0,
<add> loss_type='ciou',
<add> use_tie_breaker=True,
<add> iou_normalizer=1.0,
<add> cls_normalizer=1.0,
<add> obj_normalizer=1.0,
<add> use_scaled_loss=False,
<add> darknet = None,
<add> pre_nms_points=5000,
<add> label_smoothing=0.0,
<add> max_boxes=200,
<add> new_cords=False,
<add> path_scale=None,
<add> scale_xy=None,
<add> nms_type='greedy',
<add> objectness_smooth=False,
<add> **kwargs):
<add> """
<add> parameters for the loss functions used at each detection head output
<add>
<add>scale_anchors: `int` for how much to scale this level to get the orginal
<add> input shape
<add>
<add> Args:
<add> masks: `List[int]` for the output level that this specific model output
<add> level.
<add> anchors: `List[List[int]]` for the anchor boxes that are used in the
<add> model.
<add> classes: `int` for the number of classes.
<add> iou_thresh: `float` to use many anchors per object if IoU(Obj, Anchor) >
<add> iou_thresh.
<add> ignore_thresh: `float` for the IOU value over which the loss is not
<add> propagated, and a detection is assumed to have been made.
<add> truth_thresh: `float` for the IOU value over which the loss is propagated
<add> despite a detection being made'.
<add> nms_thresh: `float` for the minimum IOU value for an overlap.
<add> max_delta: gradient clipping to apply to the box loss.
<add> loss_type: `str` for the typeof iou loss to use with in {ciou, diou,
<add> giou, iou}.
<add>
<add> iou_normalizer: `float` for how much to scale the loss on the IOU or the
<add> boxes.
<add> cls_normalizer: `float` for how much to scale the loss on the classes.
<add> obj_normalizer: `float` for how much to scale loss on the detection map.
<add> objectness_smooth: `float` for how much to smooth the loss on the
<add> detection map.
<add> use_scaled_loss: `bool` for whether to use the scaled loss
<add> or the traditional loss.
<add>
<add> label_smoothing: `float` for how much to smooth the loss on the classes.
<add> new_cords: `bool` for which scaling type to use.
<add> scale_xy: dictionary `float` values inidcating how far each pixel can see
<add> outside of its containment of 1.0. a value of 1.2 indicates there is a
<add> 20% extended radius around each pixel that this specific pixel can
<add> predict values for a center at. the center can range from 0 - value/2
<add> to 1 + value/2, this value is set in the yolo filter, and resused here.
<add> there should be one value for scale_xy for each level from min_level to
<add> max_level.
<add> nms_type: "greedy",
<add> nms_thresh: 0.6,
<add> iou_thresh: 0.213,
<add> name=None,
<add>
<add>
<add> Return:
<add> loss: `float` for the actual loss.
<add> box_loss: `float` loss on the boxes used for metrics.
<add> conf_loss: `float` loss on the confidence used for metrics.
<add> class_loss: `float` loss on the classes used for metrics.
<add> avg_iou: `float` metric for the average iou between predictions
<add> and ground truth.
<add> avg_obj: `float` metric for the average confidence of the model
<add> for predictions.
<add> recall50: `float` metric for how accurate the model is.
<add> precision50: `float` metric for how precise the model is.
<add> """
<add> super().__init__(**kwargs)
<add> self._masks = masks
<add> self._anchors = anchors
<add> self._thresh = iou_thresh
<add> self._ignore_thresh = ignore_thresh
<add> self._truth_thresh = truth_thresh
<add> self._iou_normalizer = iou_normalizer
<add> self._cls_normalizer = cls_normalizer
<add> self._obj_normalizer = obj_normalizer
<add> self._objectness_smooth = objectness_smooth
<add> self._nms_thresh = nms_thresh
<add> self._max_boxes = max_boxes
<add> self._max_delta = max_delta
<add> self._classes = classes
<add> self._loss_type = loss_type
<add> self._use_tie_breaker = use_tie_breaker
<add>
<add> self._use_scaled_loss = use_scaled_loss
<add> self._darknet = darknet
<add>
<add> self._pre_nms_points = pre_nms_points
<add> self._label_smoothing = label_smoothing
<add> self._keys = list(masks.keys())
<add> self._len_keys = len(self._keys)
<add> self._new_cords = new_cords
<add> self._path_scale = path_scale or {
<add> key: 2**int(key) for key, _ in masks.items()
<add> }
<add>
<add> self._nms_types = {
<add> 'greedy': 1,
<add> 'iou': 2,
<add> 'giou': 3,
<add> 'ciou': 4,
<add> 'diou': 5,
<add> 'class_independent': 6,
<add> 'weighted_diou': 7
<add> }
<add>
<add> self._nms_type = self._nms_types[nms_type]
<add>
<add> if self._nms_type >= 2 and self._nms_type <= 5:
<add> self._nms = nms_ops.TiledNMS(iou_type=nms_type)
<add>
<add> self._scale_xy = scale_xy or {key: 1.0 for key, _ in masks.items()}
<add>
<add> self._generator = {}
<add> self._len_mask = {}
<add> for key in self._keys:
<add> anchors = [self._anchors[mask] for mask in self._masks[key]]
<add> self._generator[key] = self.get_generators(anchors, self._path_scale[key],
<add> key)
<add> self._len_mask[key] = len(self._masks[key])
<add> return
<add>
<add> def get_generators(self, anchors, path_scale, path_key):
<add> # anchor_generator = loss_utils.GridGenerator(
<add> # anchors, scale_anchors=path_scale)
<add> # return anchor_generator
<add> return None
<add>
<add> def rm_nan_inf(self, x, val=0.0):
<add> x = tf.where(tf.math.is_nan(x), tf.cast(val, dtype=x.dtype), x)
<add> x = tf.where(tf.math.is_inf(x), tf.cast(val, dtype=x.dtype), x)
<add> return x
<add>
<add> def parse_prediction_path(self, key, inputs):
<add> shape_ = tf.shape(inputs)
<add> shape = inputs.get_shape().as_list()
<add> batchsize, height, width = shape_[0], shape[1], shape[2]
<add>
<add> generator = self._generator[key]
<add> len_mask = self._len_mask[key]
<add> scale_xy = self._scale_xy[key]
<add>
<add> # reshape the yolo output to (batchsize,
<add> # width,
<add> # height,
<add> # number_anchors,
<add> # remaining_points)
<add>
<add> data = tf.reshape(inputs, [-1, height, width, len_mask, self._classes + 5])
<add>
<add> # use the grid generator to get the formatted anchor boxes and grid points
<add> # in shape [1, height, width, 2]
<add> centers, anchors = generator(height, width, batchsize, dtype=data.dtype)
<add>
<add> # # tempcode
<add> # centers /= tf.cast([width, height], centers.dtype)
<add> # anchors /= tf.cast([width, height], anchors.dtype)
<add>
<add> # split the yolo detections into boxes, object score map, classes
<add> boxes, obns_scores, class_scores = tf.split(
<add> data, [4, 1, self._classes], axis=-1)
<add>
<add> # determine the number of classes
<add> classes = class_scores.get_shape().as_list()[
<add> -1] #tf.shape(class_scores)[-1]
<add>
<add> # # configurable to use the new coordinates in scaled Yolo v4 or not
<add> # if not self._new_cords[key]:
<add> # # coordinates from scaled yolov4
<add> # _, _, boxes = yolo_loss.get_predicted_box(
<add> # tf.cast(height, data.dtype), tf.cast(width, data.dtype), boxes,
<add> # anchors, centers, scale_xy)
<add> # else:
<add> # # coordinates from regular yolov3 - v4
<add> # _, _, boxes = yolo_loss.get_predicted_box_newcords(
<add> # tf.cast(height, data.dtype), tf.cast(width, data.dtype), boxes,
<add> # anchors, centers, scale_xy)
<add> boxes = None
<add>
<add> # convert boxes from yolo(x, y, w. h) to tensorflow(ymin, xmin, ymax, xmax)
<add> boxes = box_ops.xcycwh_to_yxyx(boxes)
<add>
<add> # activate and detection map
<add> obns_scores = tf.math.sigmoid(obns_scores)
<add>
<add> # threshold the detection map
<add> obns_mask = tf.cast(obns_scores > self._thresh, obns_scores.dtype)
<add>
<add> # convert detection map to class detection probabailities
<add> class_scores = tf.math.sigmoid(class_scores) * obns_mask * obns_scores
<add> class_scores *= tf.cast(class_scores > self._thresh, class_scores.dtype)
<add>
<add> fill = height * width * len_mask
<add> # platten predictions to [batchsize, N, -1] for non max supression
<add> boxes = tf.reshape(boxes, [-1, fill, 4])
<add> class_scores = tf.reshape(class_scores, [-1, fill, classes])
<add> obns_scores = tf.reshape(obns_scores, [-1, fill])
<add>
<add> return obns_scores, boxes, class_scores
<add>
<add> def call(self, inputs):
<add> boxes = []
<add> class_scores = []
<add> object_scores = []
<add> levels = list(inputs.keys())
<add> min_level = int(min(levels))
<add> max_level = int(max(levels))
<add>
<add> # aggregare boxes over each scale
<add> for i in range(min_level, max_level + 1):
<add> key = str(i)
<add> object_scores_, boxes_, class_scores_ = self.parse_prediction_path(
<add> key, inputs[key])
<add> boxes.append(boxes_)
<add> class_scores.append(class_scores_)
<add> object_scores.append(object_scores_)
<add>
<add> # colate all predicitons
<add> boxes = tf.concat(boxes, axis=1)
<add> object_scores = K.concatenate(object_scores, axis=1)
<add> class_scores = K.concatenate(class_scores, axis=1)
<add>
<add> # # apply nms
<add> # if self._nms_type == 7:
<add> # boxes, class_scores, object_scores = nms_ops.non_max_suppression2(
<add> # boxes,
<add> # class_scores,
<add> # object_scores,
<add> # self._max_boxes,
<add> # pre_nms_thresh = self._thresh,
<add> # nms_thresh = self._nms_thresh,
<add> # prenms_top_k=self._pre_nms_points)
<add> # elif self._nms_type == 6:
<add> # boxes, class_scores, object_scores = nms_ops.nms(
<add> # boxes,
<add> # class_scores,
<add> # object_scores,
<add> # self._max_boxes,
<add> # self._thresh,
<add> # self._nms_thresh,
<add> # prenms_top_k=self._pre_nms_points)
<add> # elif self._nms_type == 1:
<add> # # greedy NMS
<add> # boxes = tf.cast(boxes, dtype=tf.float32)
<add> # class_scores = tf.cast(class_scores, dtype=tf.float32)
<add> # nms_items = tf.image.combined_non_max_suppression(
<add> # tf.expand_dims(boxes, axis=-2),
<add> # class_scores,
<add> # self._pre_nms_points,
<add> # self._max_boxes,
<add> # iou_threshold=self._nms_thresh,
<add> # score_threshold=self._thresh)
<add> # # cast the boxes and predicitons abck to original datatype
<add> # boxes = tf.cast(nms_items.nmsed_boxes, object_scores.dtype)
<add> # class_scores = tf.cast(nms_items.nmsed_classes, object_scores.dtype)
<add> # object_scores = tf.cast(nms_items.nmsed_scores, object_scores.dtype)
<add> #
<add> # else:
<add> # boxes = tf.cast(boxes, dtype=tf.float32)
<add> # class_scores = tf.cast(class_scores, dtype=tf.float32)
<add> # boxes, confidence, classes, valid = self._nms.complete_nms(
<add> # tf.expand_dims(boxes, axis=-2),
<add> # class_scores,
<add> # pre_nms_top_k=self._pre_nms_points,
<add> # max_num_detections=self._max_boxes,
<add> # nms_iou_threshold=self._nms_thresh,
<add> # pre_nms_score_threshold=self._thresh)
<add> # boxes = tf.cast(boxes, object_scores.dtype)
<add> # class_scores = tf.cast(classes, object_scores.dtype)
<add> # object_scores = tf.cast(confidence, object_scores.dtype)
<add>
<add> # compute the number of valid detections
<add> num_detections = tf.math.reduce_sum(tf.math.ceil(object_scores), axis=-1)
<add>
<add> # format and return
<add> return {
<add> 'bbox': boxes,
<add> 'classes': class_scores,
<add> 'confidence': object_scores,
<add> 'num_detections': num_detections,
<add> }
<add>
<add> @property
<add> def losses(self):
<add> """ Generates a dictionary of losses to apply to each path
<add>
<add> Done in the detection generator because all parameters are the same
<add> across both loss and detection generator
<add> """
<add> # loss_dict = {}
<add> # for key in self._keys:
<add> # loss_dict[key] = yolo_loss.Yolo_Loss(
<add> # classes=self._classes,
<add> # anchors=self._anchors,
<add> # darknet=self._darknet,
<add> # truth_thresh=self._truth_thresh[key],
<add> # ignore_thresh=self._ignore_thresh[key],
<add> # loss_type=self._loss_type[key],
<add> # iou_normalizer=self._iou_normalizer[key],
<add> # cls_normalizer=self._cls_normalizer[key],
<add> # obj_normalizer=self._obj_normalizer[key],
<add> # new_cords=self._new_cords[key],
<add> # objectness_smooth=self._objectness_smooth[key],
<add> # use_scaled_loss=self._use_scaled_loss,
<add> # label_smoothing=self._label_smoothing,
<add> # mask=self._masks[key],
<add> # max_delta=self._max_delta[key],
<add> # scale_anchors=self._path_scale[key],
<add> # scale_x_y=self._scale_xy[key])
<add> # return loss_dict
<add> return None
<add>
<add> def get_config(self):
<add> return {
<add> 'masks': dict(self._masks),
<add> 'anchors': [list(a) for a in self._anchors],
<add> 'thresh': self._thresh,
<add> 'max_boxes': self._max_boxes,
<add> }
<ide><path>official/vision/beta/projects/yolo/modeling/yolo_model.py
<ide> def __init__(self,
<ide> backbone: `tf.keras.Model`, a backbone network.
<ide> decoder: `tf.keras.Model`, a decoder network.
<ide> head: `YoloHead`, the YOLO head.
<del> filter: `tf.keras.Model`, the detection generator.
<add> detection_generator: `tf.keras.Model`, the detection generator.
<ide> **kwargs: keyword arguments to be passed.
<ide> """
<ide> super().__init__(**kwargs)
<ide> def __init__(self,
<ide> 'backbone': backbone,
<ide> 'decoder': decoder,
<ide> 'head': head,
<del> 'filter': filter
<add> 'detection_generator': detection_generator
<ide> }
<ide>
<ide> # model components
<ide> self._backbone = backbone
<ide> self._decoder = decoder
<ide> self._head = head
<del> self._filter = filter
<add> self._detection_generator = detection_generator
<ide>
<ide> def call(self, inputs, training=False):
<ide> maps = self._backbone(inputs)
<ide> def call(self, inputs, training=False):
<ide> return {"raw_output": raw_predictions}
<ide> else:
<ide> # Post-processing.
<del> predictions = self._filter(raw_predictions)
<add> predictions = self._detection_generator(raw_predictions)
<ide> predictions.update({"raw_output": raw_predictions})
<ide> return predictions
<ide>
<ide> def head(self):
<ide> return self._head
<ide>
<ide> @property
<del> def filter(self):
<del> return self._filter
<add> def detection_generator(self):
<add> return self._detection_generator
<ide>
<ide> def get_config(self):
<ide> return self._config_dict | 2 |
Javascript | Javascript | add tests for querystring.unescapebuffer | 2a750bffcc6b8c33df9f575b83979d245cd7b384 | <ide><path>test/simple/test-querystring.js
<ide> assert.equal(f, "a:b;q:x%3Ay%3By%3Az");
<ide>
<ide>
<ide> assert.deepEqual({}, qs.parse());
<add>
<add>
<add>
<add>var b = qs.unescapeBuffer('%d3%f2Ug%1f6v%24%5e%98%cb%0d%ac%a2%2f%9d%eb%d8%a2%e6')
<add>// <Buffer d3 f2 55 67 1f 36 76 24 5e 98 cb 0d ac a2 2f 9d eb d8 a2 e6>
<add>assert.equal(0xd3, b[0]);
<add>assert.equal(0xf2, b[1]);
<add>assert.equal(0x55, b[2]);
<add>assert.equal(0x67, b[3]);
<add>assert.equal(0x1f, b[4]);
<add>assert.equal(0x36, b[5]);
<add>assert.equal(0x76, b[6]);
<add>assert.equal(0x24, b[7]);
<add>assert.equal(0x5e, b[8]);
<add>assert.equal(0x98, b[9]);
<add>assert.equal(0xcb, b[10]);
<add>assert.equal(0x0d, b[11]);
<add>assert.equal(0xac, b[12]);
<add>assert.equal(0xa2, b[13]);
<add>assert.equal(0x2f, b[14]);
<add>assert.equal(0x9d, b[15]);
<add>assert.equal(0xeb, b[16]);
<add>assert.equal(0xd8, b[17]);
<add>assert.equal(0xa2, b[18]);
<add>assert.equal(0xe6, b[19]);
<add> | 1 |
Javascript | Javascript | drop testing on jsdom with node 0.10 & 0.12 | c7431c7793f7605250807f91bee7c9ddcbaeb91b | <ide><path>Gruntfile.js
<ide> module.exports = function( grunt ) {
<ide> var fs = require( "fs" ),
<ide> gzip = require( "gzip-js" ),
<ide> srcHintOptions = readOptionalJSON( "src/.jshintrc" ),
<del> newNode = !/^v0/.test( process.version ),
<del>
<del> // Allow to skip jsdom-related tests in Node.js < 1.0.0
<del> runJsdomTests = newNode || ( function() {
<del> try {
<del> require( "jsdom" );
<del> return true;
<del> } catch ( e ) {
<del> return false;
<del> }
<del> } )();
<add>
<add> // Skip jsdom-related tests in Node.js 0.10 & 0.12
<add> runJsdomTests = !/^v0/.test( process.version );
<ide>
<ide> if ( !grunt.option( "filename" ) ) {
<ide> grunt.option( "filename", "jquery.js" );
<ide><path>build/tasks/install_old_jsdom.js
<del>module.exports = function( grunt ) {
<del>
<del> "use strict";
<del>
<del> // Run this task to run jsdom-related tests on Node.js < 1.0.0.
<del> grunt.registerTask( "old_jsdom", function() {
<del> if ( !/^v0/.test( process.version ) ) {
<del> console.warn( "The old_jsdom task doesn\'t need to be run in io.js or new Node.js" );
<del> return;
<del> }
<del>
<del> // Use npm on the command-line
<del> // There is no local npm
<del> grunt.util.spawn( {
<del> cmd: "npm",
<del> args: [ "install", "jsdom@3" ],
<del> opts: { stdio: "inherit" }
<del> }, this.async() );
<del> } );
<del>}; | 2 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.