| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
Text | Text | add v4.2.0-beta.1 to changelog | 5ea53d595877c49aad18c8a6ca68d23b1ea226f5 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### v4.2.0-beta.1 (December 28, 2021)
<add>
<add>- [#19878](https://github.com/emberjs/ember.js/pull/19878) [BUGFIX] Allow class-based helpers to work in strict-mode.
<add>
<ide> ### v4.1.0 (December 28, 2021)
<ide>
<ide> - [#19772](https://github.com/emberjs/ember.js/pull/19772) / [#19826](https://github.com/emberjs/ember.js/pull/19826) [FEATURE] Add a `@cached` decorator per [RFC #0566](https://github.com/emberjs/rfcs/blob/af64915b5ecde010fce09309a47ee6d2447588d0/text/0566-memo-decorator.md). | 1 |
Python | Python | handle conversion of pipeline components correctly | fd1a9225d80a667bfd88c5d21bdaeaf156003cf2 | <ide><path>spacy/cli/package.py
<ide> def generate_pipeline():
<ide> "parser, ner. For more information, see the docs on processing pipelines.",
<ide> title="Enter your model's pipeline components")
<ide> pipeline = util.get_raw_input("Pipeline components", True)
<del> replace = {'True': True, 'False': False}
<del> return replace[pipeline] if pipeline in replace else pipeline.split(', ')
<add> subs = {'True': True, 'False': False}
<add> if pipeline in subs:
<add> return subs[pipeline]
<add> else:
<add> return [p.strip() for p in pipeline.split(',')]
<ide>
<ide>
<ide> def validate_meta(meta, keys): | 1 |
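The spaCy patch above replaces a bare `pipeline.split(', ')` with a per-component `strip()`, so `"tagger,parser"` and `"tagger, parser"` parse identically. A minimal Python sketch of the fixed logic (the function name is a stand-in, not spaCy's API):

```python
def parse_pipeline_input(pipeline):
    """Convert raw user input into a pipeline spec.

    'True'/'False' select or reject the default pipeline wholesale;
    anything else is treated as a comma-separated component list.
    """
    subs = {"True": True, "False": False}
    if pipeline in subs:
        return subs[pipeline]
    # Strip whitespace so spacing around commas does not change the result.
    return [p.strip() for p in pipeline.split(",")]
```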
Javascript | Javascript | forgive unregistration of a non-registered handler | ab59cc6c44705b1244a77eba999d736f9eb3c6ae | <ide><path>src/jqLite.js
<ide> function JQLiteOff(element, type, fn) {
<ide> removeEventListenerFn(element, type, events[type]);
<ide> delete events[type];
<ide> } else {
<del> arrayRemove(events[type], fn);
<add> arrayRemove(events[type] || [], fn);
<ide> }
<ide> });
<ide> }
<ide><path>test/jqLiteSpec.js
<ide> describe('jqLite', function() {
<ide> aElem.off('click', function() {});
<ide> });
<ide>
<add> it('should do nothing when a specific listener was not registered', function () {
<add> var aElem = jqLite(a);
<add> aElem.on('click', function() {});
<add>
<add> aElem.off('mouseenter', function() {});
<add> });
<ide>
<ide> it('should deregister all listeners', function() {
<ide> var aElem = jqLite(a), | 2 |
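The jqLite fix defaults to an empty list (`events[type] || []`) when no handlers were registered for the event type, so unregistering a non-registered handler becomes a no-op instead of an error. The same defensive pattern, sketched in Python with illustrative names:

```python
def remove_handler(events, event_type, fn):
    """Remove fn from events[event_type]; silently do nothing if absent."""
    handlers = events.get(event_type) or []  # mirrors `events[type] || []`
    if fn in handlers:
        handlers.remove(fn)

events = {"click": []}
def on_click(): pass
events["click"].append(on_click)

remove_handler(events, "mouseenter", on_click)  # unregistered type: no error
remove_handler(events, "click", on_click)       # registered: removed
```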
Text | Text | add thealphanerd to collaborators | 7b89a3de69b22face8a7077c0b695eba6d9fb353 | <ide><path>README.md
<ide> information about the governance of the Node.js project, see
<ide> * [srl295](https://github.com/srl295) - **Steven R Loomis** <srloomis@us.ibm.com>
<ide> * [targos](https://github.com/targos) - **Michaël Zasso** <mic.besace@gmail.com>
<ide> * [tellnes](https://github.com/tellnes) - **Christian Tellnes** <christian@tellnes.no>
<add>* [thealphanerd](http://github.com/thealphanerd) - **Myles Borins** <myles.borins@gmail.com>
<ide> * [thefourtheye](https://github.com/thefourtheye) - **Sakthipriyan Vairamani** <thechargingvolcano@gmail.com>
<ide> * [thlorenz](https://github.com/thlorenz) - **Thorsten Lorenz** <thlorenz@gmx.de>
<ide> * [Trott](https://github.com/Trott) - **Rich Trott** <rtrott@gmail.com> | 1 |
Mixed | Ruby | introduce custom metadata | e106a4a1d2964fee475aa58dab7ab229293af692 | <ide><path>activestorage/CHANGELOG.md
<add>* Setting custom metadata on blobs are now persisted to remote storage.
<add>
<add> *joshuamsager*
<add>
<ide> * Support direct uploads to multiple services.
<ide>
<ide> *Dmitry Tsepelev*
<ide><path>activestorage/app/models/active_storage/blob.rb
<ide> class ActiveStorage::Blob < ActiveStorage::Record
<ide> self.service_name ||= self.class.service&.name
<ide> end
<ide>
<del> after_update_commit :update_service_metadata, if: :content_type_previously_changed?
<add> after_update_commit :update_service_metadata, if: -> { content_type_previously_changed? || metadata_previously_changed? }
<ide>
<ide> before_destroy(prepend: true) do
<ide> raise ActiveRecord::InvalidForeignKey if attachments.exists?
<ide> def filename
<ide> ActiveStorage::Filename.new(self[:filename])
<ide> end
<ide>
<add> def custom_metadata
<add> self[:metadata][:custom] || {}
<add> end
<add>
<add> def custom_metadata=(metadata)
<add> self[:metadata] = self[:metadata].merge(custom: metadata)
<add> end
<add>
<ide> # Returns true if the content_type of this blob is in the image range, like image/png.
<ide> def image?
<ide> content_type.start_with?("image")
<ide> def url(expires_in: ActiveStorage.service_urls_expire_in, disposition: :inline,
<ide> # Returns a URL that can be used to directly upload a file for this blob on the service. This URL is intended to be
<ide> # short-lived for security and only generated on-demand by the client-side JavaScript responsible for doing the uploading.
<ide> def service_url_for_direct_upload(expires_in: ActiveStorage.service_urls_expire_in)
<del> service.url_for_direct_upload key, expires_in: expires_in, content_type: content_type, content_length: byte_size, checksum: checksum
<add> service.url_for_direct_upload key, expires_in: expires_in, content_type: content_type, content_length: byte_size, checksum: checksum, custom_metadata: custom_metadata
<ide> end
<ide>
<ide> # Returns a Hash of headers for +service_url_for_direct_upload+ requests.
<ide> def service_headers_for_direct_upload
<del> service.headers_for_direct_upload key, filename: filename, content_type: content_type, content_length: byte_size, checksum: checksum
<add> service.headers_for_direct_upload key, filename: filename, content_type: content_type, content_length: byte_size, checksum: checksum, custom_metadata: custom_metadata
<ide> end
<ide>
<ide> def content_type_for_serving # :nodoc:
<ide> def web_image?
<ide>
<ide> def service_metadata
<ide> if forcibly_serve_as_binary?
<del> { content_type: ActiveStorage.binary_content_type, disposition: :attachment, filename: filename }
<add> { content_type: ActiveStorage.binary_content_type, disposition: :attachment, filename: filename, custom_metadata: custom_metadata }
<ide> elsif !allowed_inline?
<del> { content_type: content_type, disposition: :attachment, filename: filename }
<add> { content_type: content_type, disposition: :attachment, filename: filename, custom_metadatata: custom_metadata }
<ide> else
<del> { content_type: content_type }
<add> { content_type: content_type, custom_metadata: custom_metadata }
<ide> end
<ide> end
<ide>
<ide><path>activestorage/lib/active_storage/service.rb
<ide> def url(key, **options)
<ide> # The URL will be valid for the amount of seconds specified in +expires_in+.
<ide> # You must also provide the +content_type+, +content_length+, and +checksum+ of the file
<ide> # that will be uploaded. All these attributes will be validated by the service upon upload.
<del> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:)
<add> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:, custom_metadata: {})
<ide> raise NotImplementedError
<ide> end
<ide>
<ide> # Returns a Hash of headers for +url_for_direct_upload+ requests.
<del> def headers_for_direct_upload(key, filename:, content_type:, content_length:, checksum:)
<add> def headers_for_direct_upload(key, filename:, content_type:, content_length:, checksum:, custom_metadata: {})
<ide> {}
<ide> end
<ide>
<ide> def public_url(key, **)
<ide> raise NotImplementedError
<ide> end
<ide>
<add> def custom_metadata_headers(metadata)
<add> raise NotImplementedError
<add> end
<ide>
<ide> def instrument(operation, payload = {}, &block)
<ide> ActiveSupport::Notifications.instrument(
<ide><path>activestorage/lib/active_storage/service/azure_storage_service.rb
<ide> def initialize(storage_account_name:, storage_access_key:, container:, public: f
<ide> @public = public
<ide> end
<ide>
<del> def upload(key, io, checksum: nil, filename: nil, content_type: nil, disposition: nil, **)
<add> def upload(key, io, checksum: nil, filename: nil, content_type: nil, disposition: nil, custom_metadata: {}, **)
<ide> instrument :upload, key: key, checksum: checksum do
<ide> handle_errors do
<ide> content_disposition = content_disposition_with(filename: filename, type: disposition) if disposition && filename
<ide>
<del> client.create_block_blob(container, key, IO.try_convert(io) || io, content_md5: checksum, content_type: content_type, content_disposition: content_disposition)
<add> client.create_block_blob(container, key, IO.try_convert(io) || io, content_md5: checksum, content_type: content_type, content_disposition: content_disposition, metadata: custom_metadata)
<ide> end
<ide> end
<ide> end
<ide> def exist?(key)
<ide> end
<ide> end
<ide>
<del> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:)
<add> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:, custom_metadata: {})
<ide> instrument :url, key: key do |payload|
<ide> generated_url = signer.signed_uri(
<ide> uri_for(key), false,
<ide> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, chec
<ide> end
<ide> end
<ide>
<del> def headers_for_direct_upload(key, content_type:, checksum:, filename: nil, disposition: nil, **)
<add> def headers_for_direct_upload(key, content_type:, checksum:, filename: nil, disposition: nil, custom_metadata:, **)
<ide> content_disposition = content_disposition_with(type: disposition, filename: filename) if filename
<ide>
<del> { "Content-Type" => content_type, "Content-MD5" => checksum, "x-ms-blob-content-disposition" => content_disposition, "x-ms-blob-type" => "BlockBlob" }
<add> { "Content-Type" => content_type, "Content-MD5" => checksum, "x-ms-blob-content-disposition" => content_disposition, "x-ms-blob-type" => "BlockBlob", **custom_metadata_headers(custom_metadata) }
<ide> end
<ide>
<ide> private
<ide> def handle_errors
<ide> raise
<ide> end
<ide> end
<add>
<add> def custom_metadata_headers(metadata)
<add> metadata.transform_keys { |key| "x-ms-meta-#{key}" }
<add> end
<ide> end
<ide> end
<ide><path>activestorage/lib/active_storage/service/disk_service.rb
<ide> def exist?(key)
<ide> end
<ide> end
<ide>
<del> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:)
<add> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:, custom_metadata: {})
<ide> instrument :url, key: key do |payload|
<ide> verified_token_with_expiration = ActiveStorage.verifier.generate(
<ide> {
<ide><path>activestorage/lib/active_storage/service/gcs_service.rb
<ide> def initialize(public: false, **config)
<ide> @public = public
<ide> end
<ide>
<del> def upload(key, io, checksum: nil, content_type: nil, disposition: nil, filename: nil)
<add> def upload(key, io, checksum: nil, content_type: nil, disposition: nil, filename: nil, custom_metadata: {})
<ide> instrument :upload, key: key, checksum: checksum do
<ide> # GCS's signed URLs don't include params such as response-content-type response-content_disposition
<ide> # in the signature, which means an attacker can modify them and bypass our effort to force these to
<ide> # binary and attachment when the file's content type requires it. The only way to force them is to
<ide> # store them as object's metadata.
<ide> content_disposition = content_disposition_with(type: disposition, filename: filename) if disposition && filename
<del> bucket.create_file(io, key, md5: checksum, cache_control: @config[:cache_control], content_type: content_type, content_disposition: content_disposition)
<add> bucket.create_file(io, key, md5: checksum, cache_control: @config[:cache_control], content_type: content_type, content_disposition: content_disposition, metadata: custom_metadata)
<ide> rescue Google::Cloud::InvalidArgumentError
<ide> raise ActiveStorage::IntegrityError
<ide> end
<ide> def download(key, &block)
<ide> end
<ide> end
<ide>
<del> def update_metadata(key, content_type:, disposition: nil, filename: nil)
<add> def update_metadata(key, content_type:, disposition: nil, filename: nil, custom_metadata: {})
<ide> instrument :update_metadata, key: key, content_type: content_type, disposition: disposition do
<ide> file_for(key).update do |file|
<ide> file.content_type = content_type
<ide> file.content_disposition = content_disposition_with(type: disposition, filename: filename) if disposition && filename
<add> file.metadata = custom_metadata
<ide> end
<ide> end
<ide> end
<ide> def exist?(key)
<ide> end
<ide> end
<ide>
<del> def url_for_direct_upload(key, expires_in:, checksum:, **)
<add> def url_for_direct_upload(key, expires_in:, checksum:, custom_metadata: {}, **)
<ide> instrument :url, key: key do |payload|
<ide> headers = {}
<ide> version = :v2
<ide> def url_for_direct_upload(key, expires_in:, checksum:, **)
<ide> version = :v4
<ide> end
<ide>
<add> headers.merge!(custom_metadata_headers(custom_metadata))
<add>
<ide> args = {
<ide> content_md5: checksum,
<ide> expires: expires_in,
<ide> def url_for_direct_upload(key, expires_in:, checksum:, **)
<ide> end
<ide> end
<ide>
<del> def headers_for_direct_upload(key, checksum:, filename: nil, disposition: nil, **)
<add> def headers_for_direct_upload(key, checksum:, filename: nil, disposition: nil, custom_metadata: {}, **)
<ide> content_disposition = content_disposition_with(type: disposition, filename: filename) if filename
<ide>
<del> headers = { "Content-MD5" => checksum, "Content-Disposition" => content_disposition }
<del>
<add> headers = { "Content-MD5" => checksum, "Content-Disposition" => content_disposition, **custom_metadata_headers(custom_metadata) }
<ide> if @config[:cache_control].present?
<ide> headers["Cache-Control"] = @config[:cache_control]
<ide> end
<ide> def signer
<ide> response.signed_blob
<ide> end
<ide> end
<add>
<add> def custom_metadata_headers(metadata)
<add> metadata.transform_keys { |key| "x-goog-meta-#{key}" }
<add> end
<ide> end
<ide> end
<ide><path>activestorage/lib/active_storage/service/s3_service.rb
<ide> def initialize(bucket:, upload: {}, public: false, **options)
<ide> @upload_options[:acl] = "public-read" if public?
<ide> end
<ide>
<del> def upload(key, io, checksum: nil, filename: nil, content_type: nil, disposition: nil, **)
<add> def upload(key, io, checksum: nil, filename: nil, content_type: nil, disposition: nil, custom_metadata: {}, **)
<ide> instrument :upload, key: key, checksum: checksum do
<ide> content_disposition = content_disposition_with(filename: filename, type: disposition) if disposition && filename
<ide>
<ide> if io.size < multipart_upload_threshold
<del> upload_with_single_part key, io, checksum: checksum, content_type: content_type, content_disposition: content_disposition
<add> upload_with_single_part key, io, checksum: checksum, content_type: content_type, content_disposition: content_disposition, custom_metadata: custom_metadata
<ide> else
<del> upload_with_multipart key, io, content_type: content_type, content_disposition: content_disposition
<add> upload_with_multipart key, io, content_type: content_type, content_disposition: content_disposition, custom_metadata: custom_metadata
<ide> end
<ide> end
<ide> end
<ide> def exist?(key)
<ide> end
<ide> end
<ide>
<del> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:)
<add> def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:, custom_metadata: {})
<ide> instrument :url, key: key do |payload|
<ide> generated_url = object_for(key).presigned_url :put, expires_in: expires_in.to_i,
<ide> content_type: content_type, content_length: content_length, content_md5: checksum,
<del> whitelist_headers: ["content-length"], **upload_options
<add> metadata: custom_metadata, whitelist_headers: ["content-length"], **upload_options
<ide>
<ide> payload[:url] = generated_url
<ide>
<ide> generated_url
<ide> end
<ide> end
<ide>
<del> def headers_for_direct_upload(key, content_type:, checksum:, filename: nil, disposition: nil, **)
<add> def headers_for_direct_upload(key, content_type:, checksum:, filename: nil, disposition: nil, custom_metadata: {}, **)
<ide> content_disposition = content_disposition_with(type: disposition, filename: filename) if filename
<ide>
<del> { "Content-Type" => content_type, "Content-MD5" => checksum, "Content-Disposition" => content_disposition }
<add> { "Content-Type" => content_type, "Content-MD5" => checksum, "Content-Disposition" => content_disposition, **custom_metadata_headers(custom_metadata) }
<ide> end
<ide>
<ide> private
<ide> def public_url(key, **client_opts)
<ide> MAXIMUM_UPLOAD_PARTS_COUNT = 10000
<ide> MINIMUM_UPLOAD_PART_SIZE = 5.megabytes
<ide>
<del> def upload_with_single_part(key, io, checksum: nil, content_type: nil, content_disposition: nil)
<del> object_for(key).put(body: io, content_md5: checksum, content_type: content_type, content_disposition: content_disposition, **upload_options)
<add> def upload_with_single_part(key, io, checksum: nil, content_type: nil, content_disposition: nil, custom_metadata: {})
<add> object_for(key).put(body: io, content_md5: checksum, content_type: content_type, content_disposition: content_disposition, metadata: custom_metadata, **upload_options)
<ide> rescue Aws::S3::Errors::BadDigest
<ide> raise ActiveStorage::IntegrityError
<ide> end
<ide>
<del> def upload_with_multipart(key, io, content_type: nil, content_disposition: nil)
<add> def upload_with_multipart(key, io, content_type: nil, content_disposition: nil, custom_metadata: {})
<ide> part_size = [ io.size.fdiv(MAXIMUM_UPLOAD_PARTS_COUNT).ceil, MINIMUM_UPLOAD_PART_SIZE ].max
<ide>
<del> object_for(key).upload_stream(content_type: content_type, content_disposition: content_disposition, part_size: part_size, **upload_options) do |out|
<add> object_for(key).upload_stream(content_type: content_type, content_disposition: content_disposition, part_size: part_size, metadata: custom_metadata, **upload_options) do |out|
<ide> IO.copy_stream(io, out)
<ide> end
<ide> end
<ide> def stream(key)
<ide> offset += chunk_size
<ide> end
<ide> end
<add>
<add> def custom_metadata_headers(metadata)
<add> metadata.transform_keys { |key| "x-amz-meta-#{key}" }
<add> end
<ide> end
<ide> end
<ide><path>activestorage/test/controllers/direct_uploads_controller_test.rb
<ide> class ActiveStorage::S3DirectUploadsControllerTest < ActionDispatch::Integration
<ide> "my_key_1": "my_value_1",
<ide> "my_key_2": "my_value_2",
<ide> "platform": "my_platform",
<del> "library_ID": "12345"
<add> "library_ID": "12345",
<add> custom: {
<add> "my_key_3": "my_value_3"
<add> }
<ide> }
<ide>
<ide> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "s3") do
<ide> class ActiveStorage::S3DirectUploadsControllerTest < ActionDispatch::Integration
<ide> assert_equal "text/plain", details["content_type"]
<ide> assert_match SERVICE_CONFIGURATIONS[:s3][:bucket], details["direct_upload"]["url"]
<ide> assert_match(/s3(-[-a-z0-9]+)?\.(\S+)?amazonaws\.com/, details["direct_upload"]["url"])
<del> assert_equal({ "Content-Type" => "text/plain", "Content-MD5" => checksum, "Content-Disposition" => "inline; filename=\"hello.txt\"; filename*=UTF-8''hello.txt" }, details["direct_upload"]["headers"])
<add> assert_equal({ "Content-Type" => "text/plain", "Content-MD5" => checksum, "Content-Disposition" => "inline; filename=\"hello.txt\"; filename*=UTF-8''hello.txt", "x-amz-meta-my_key_3" => "my_value_3" }, details["direct_upload"]["headers"])
<ide> end
<ide> end
<ide> end
<ide> class ActiveStorage::GCSDirectUploadsControllerTest < ActionDispatch::Integratio
<ide> "my_key_1": "my_value_1",
<ide> "my_key_2": "my_value_2",
<ide> "platform": "my_platform",
<del> "library_ID": "12345"
<add> "library_ID": "12345",
<add> custom: {
<add> "my_key_3": "my_value_3"
<add> }
<ide> }
<ide>
<ide> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "gcs") do
<ide> class ActiveStorage::GCSDirectUploadsControllerTest < ActionDispatch::Integratio
<ide> assert_equal metadata, details["metadata"].transform_keys(&:to_sym)
<ide> assert_equal "text/plain", details["content_type"]
<ide> assert_match %r{storage\.googleapis\.com/#{@config[:bucket]}}, details["direct_upload"]["url"]
<del> assert_equal({ "Content-MD5" => checksum, "Content-Disposition" => "inline; filename=\"hello.txt\"; filename*=UTF-8''hello.txt" }, details["direct_upload"]["headers"])
<add> assert_equal({ "Content-MD5" => checksum, "Content-Disposition" => "inline; filename=\"hello.txt\"; filename*=UTF-8''hello.txt", "x-goog-meta-my_key_3" => "my_value_3" }, details["direct_upload"]["headers"])
<ide> end
<ide> end
<ide> end
<ide><path>activestorage/test/models/blob_test.rb
<ide> class ActiveStorage::BlobTest < ActiveSupport::TestCase
<ide> test "updating the content_type updates service metadata" do
<ide> blob = directly_upload_file_blob(filename: "racecar.jpg", content_type: "application/octet-stream")
<ide>
<del> expected_arguments = [blob.key, content_type: "image/jpeg"]
<add> expected_arguments = [blob.key, content_type: "image/jpeg", custom_metadata: {}]
<ide>
<ide> assert_called_with(blob.service, :update_metadata, expected_arguments) do
<ide> blob.update!(content_type: "image/jpeg")
<ide> end
<ide> end
<ide>
<add> test "updating the metadata updates service metadata" do
<add> blob = directly_upload_file_blob(filename: "racecar.jpg", content_type: "application/octet-stream")
<add>
<add> expected_arguments = [
<add> blob.key,
<add> {
<add> content_type: "application/octet-stream",
<add> disposition: :attachment,
<add> filename: blob.filename,
<add> custom_metadatata: { "test" => true }
<add> }
<add> ]
<add>
<add> assert_called_with(blob.service, :update_metadata, expected_arguments) do
<add> blob.update!(metadata: { custom: { "test" => true } })
<add> end
<add> end
<add>
<ide> test "scope_for_strict_loading adds includes only when track_variants and strict_loading_by_default" do
<ide> assert_empty(
<ide> ActiveStorage::Blob.scope_for_strict_loading.includes_values,
<ide><path>activestorage/test/service/azure_storage_service_test.rb
<ide> class ActiveStorage::Service::AzureStorageServiceTest < ActiveSupport::TestCase
<ide> @service.delete key
<ide> end
<ide>
<add> test "upload with custom_metadata" do
<add> key = SecureRandom.base58(24)
<add> data = "Foobar"
<add>
<add> @service.upload(key, StringIO.new(data), checksum: OpenSSL::Digest::MD5.base64digest(data), filename: ActiveStorage::Filename.new("test.txt"), custom_metadata: { "foo" => "baz" })
<add> url = @service.url(key, expires_in: 2.minutes, disposition: :inline, content_type: "text/html", filename: ActiveStorage::Filename.new("test.html"))
<add>
<add> response = Net::HTTP.get_response(URI(url))
<add> assert_equal("baz", response["x-ms-meta-foo"])
<add> ensure
<add> @service.delete key
<add> end
<add>
<ide> test "signed URL generation" do
<ide> url = @service.url(@key, expires_in: 5.minutes,
<ide> disposition: :inline, filename: ActiveStorage::Filename.new("avatar.png"), content_type: "image/png")
<ide><path>activestorage/test/service/gcs_service_test.rb
<ide> class ActiveStorage::Service::GCSServiceTest < ActiveSupport::TestCase
<ide> service.delete key
<ide> end
<ide>
<del> test "update metadata" do
<add> test "upload with custom_metadata" do
<ide> key = SecureRandom.base58(24)
<ide> data = "Something else entirely!"
<del> @service.upload(key, StringIO.new(data), checksum: OpenSSL::Digest::MD5.base64digest(data), disposition: :attachment, filename: ActiveStorage::Filename.new("test.html"), content_type: "text/html")
<add> @service.upload(key, StringIO.new(data), checksum: Digest::MD5.base64digest(data), content_type: "text/plain", custom_metadata: { "foo" => "baz" })
<ide>
<del> @service.update_metadata(key, disposition: :inline, filename: ActiveStorage::Filename.new("test.txt"), content_type: "text/plain")
<add> url = @service.url(key, expires_in: 2.minutes, disposition: :inline, content_type: "text/html", filename: ActiveStorage::Filename.new("test.html"))
<add>
<add> response = Net::HTTP.get_response(URI(url))
<add> assert_equal("baz", response["x-goog-meta-foo"])
<add> ensure
<add> @service.delete key
<add> end
<add>
<add> test "update custom_metadata" do
<add> key = SecureRandom.base58(24)
<add> data = "Something else entirely!"
<add> @service.upload(key, StringIO.new(data), checksum: OpenSSL::Digest::MD5.base64digest(data), disposition: :attachment, filename: ActiveStorage::Filename.new("test.html"), content_type: "text/html", custom_metadata: { "foo" => "baz" })
<add>
<add> @service.update_metadata(key, disposition: :inline, filename: ActiveStorage::Filename.new("test.txt"), content_type: "text/plain", custom_metadata: { "foo" => "bar" })
<ide> url = @service.url(key, expires_in: 2.minutes, disposition: :attachment, content_type: "text/html", filename: ActiveStorage::Filename.new("test.html"))
<ide>
<ide> response = Net::HTTP.get_response(URI(url))
<ide> assert_equal "text/plain", response.content_type
<ide> assert_match(/inline;.*test.txt/, response["Content-Disposition"])
<add> assert_equal("bar", response["x-goog-meta-foo"])
<ide> ensure
<ide> @service.delete key
<ide> end
<ide><path>activestorage/test/service/s3_service_test.rb
<ide> class ActiveStorage::Service::S3ServiceTest < ActiveSupport::TestCase
<ide> @service.delete key
<ide> end
<ide>
<add> test "upload with custom_metadata" do
<add> key = SecureRandom.base58(24)
<add> data = "Something else entirely!"
<add> @service.upload(
<add> key,
<add> StringIO.new(data),
<add> checksum: Digest::MD5.base64digest(data),
<add> content_type: "text/plain",
<add> custom_metadata: { "foo" => "baz" },
<add> filename: "custom_metadata.txt"
<add> )
<add>
<add> url = @service.url(key, expires_in: 2.minutes, disposition: :inline, content_type: "text/html", filename: ActiveStorage::Filename.new("test.html"))
<add>
<add> response = Net::HTTP.get_response(URI(url))
<add> assert_equal("baz", response["x-amz-meta-foo"])
<add> ensure
<add> @service.delete key
<add> end
<add>
<ide> test "upload with content disposition" do
<ide> key = SecureRandom.base58(24)
<ide> data = "Something else entirely!" | 12 |
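Each service in the patch above implements `custom_metadata_headers` by prefixing user-supplied keys with the provider's metadata namespace (`x-amz-meta-` for S3, `x-goog-meta-` for GCS, `x-ms-meta-` for Azure). A Python sketch of that shared pattern (the lookup table and function are illustrative, not Rails API):

```python
PREFIXES = {
    "s3": "x-amz-meta-",
    "gcs": "x-goog-meta-",
    "azure": "x-ms-meta-",
}

def custom_metadata_headers(service, metadata):
    """Namespace user-supplied metadata keys for the given storage service."""
    prefix = PREFIXES[service]
    return {prefix + key: value for key, value in metadata.items()}
```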
Text | Text | add example of event close for child_process | 89344f5bee0dc566e5990c9618efbb73d6a9284d | <ide><path>doc/api/child_process.md
<ide> The `'close'` event is emitted when the stdio streams of a child process have
<ide> been closed. This is distinct from the [`'exit'`][] event, since multiple
<ide> processes might share the same stdio streams.
<ide>
<add>```js
<add>const { spawn } = require('child_process');
<add>const ls = spawn('ls', ['-lh', '/usr']);
<add>
<add>ls.stdout.on('data', (data) => {
<add> console.log(`stdout: ${data}`);
<add>});
<add>
<add>ls.on('close', (code) => {
<add> console.log(`child process close all stdio with code ${code}`);
<add>});
<add>
<add>ls.on('exit', (code) => {
<add> console.log(`child process exited with code ${code}`);
<add>});
<add>```
<add>
<ide> ### Event: 'disconnect'
<ide> <!-- YAML
<ide> added: v0.7.2 | 1 |
Ruby | Ruby | remove pointless use of 'private' | 99db97a32237876f65468aa116441a621b80cc01 | <ide><path>activerecord/lib/active_record/associations/has_one_through_association.rb
<ide> def create_through_record(new_value)
<ide> end
<ide> end
<ide>
<del> private
<ide> def find_target
<ide> update_stale_state
<ide> scoped.first | 1 |
Go | Go | relax second cache key requirements for schema1 | 85167fc63409524fa870bf9bafd7193f08a6d8ed | <ide><path>builder/builder-next/adapters/containerimage/pull.go
<ide> func (p *puller) CacheKey(ctx context.Context, g session.Group, index int) (stri
<ide> return dgst.String(), nil, false, nil
<ide> }
<ide>
<del> if len(p.config) == 0 {
<add> if len(p.config) == 0 && p.desc.MediaType != images.MediaTypeDockerSchema1Manifest {
<ide> return "", nil, false, errors.Errorf("invalid empty config file resolved for %s", p.src.Reference.String())
<ide> }
<ide>
<ide> k := cacheKeyFromConfig(p.config).String()
<del> if k == "" {
<add> if k == "" || p.desc.MediaType == images.MediaTypeDockerSchema1Manifest {
<ide> dgst, err := p.mainManifestKey(p.platform)
<ide> if err != nil {
<ide> return "", nil, false, err | 1 |
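The Go change relaxes two checks for schema1 manifests: an empty config is no longer an error, and the cache key falls back to the manifest digest. That decision logic can be sketched in Python (constants and names are stand-ins, not BuildKit's API):

```python
SCHEMA1 = "application/vnd.docker.distribution.manifest.v1+json"

def cache_key(config, media_type, config_key, manifest_key):
    """Fall back to the manifest digest when the config digest is unusable."""
    if not config and media_type != SCHEMA1:
        raise ValueError("invalid empty config")
    if not config_key or media_type == SCHEMA1:
        return manifest_key
    return config_key
```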
Javascript | Javascript | provide correct serialization context | dfbce79f3cbe7b85f1577b27df6cca534f3f1922 | <ide><path>lib/serialization/Serializer.js
<ide> class Serializer {
<ide> let current = obj;
<ide> for (const middleware of this.serializeMiddlewares) {
<ide> if (current && typeof current.then === "function") {
<del> current = current.then(
<del> data => data && middleware.serialize(data, context)
<del> );
<add> current = current.then(data => data && middleware.serialize(data, ctx));
<ide> } else if (current) {
<ide> try {
<ide> current = middleware.serialize(current, ctx);
<ide> class Serializer {
<ide> let current = value;
<ide> for (const middleware of this.deserializeMiddlewares) {
<ide> if (current && typeof current.then === "function") {
<del> current = current.then(data => middleware.deserialize(data, context));
<add> current = current.then(data => middleware.deserialize(data, ctx));
<ide> } else {
<ide> current = middleware.deserialize(current, ctx);
<ide> } | 1 |
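The webpack fix replaces a stale `context` reference with the `ctx` actually in scope, so every middleware — including ones resolved from a promise — receives the same context object. The essential invariant can be sketched in Python with a synchronous chain (names are illustrative):

```python
def run_middlewares(value, middlewares, ctx):
    """Thread the SAME context object through every middleware in order."""
    current = value
    for middleware in middlewares:
        current = middleware(current, ctx)
    return current

def add_prefix(data, ctx):
    return ctx["prefix"] + data

def shout(data, ctx):
    return data.upper()

result = run_middlewares("hello", [add_prefix, shout], {"prefix": ">> "})
```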
Text | Text | fix the data-height of step 5 codepen embed | 83f56370e6040ee58ed67f996dce3f9d1ae56c86 | <ide><path>docs/docs/thinking-in-react.md
<ide> You can start seeing how your application will behave: set `filterText` to `"bal
<ide>
<ide> ## Step 5: Add Inverse Data Flow
<ide>
<del><p data-height="265" data-theme-id="0" data-slug-hash="qRqmjd" data-default-tab="js,result" data-user="rohan10" data-embed-version="2" data-pen-title="Thinking In React: Step 5" class="codepen">See the Pen <a href="http://codepen.io/rohan10/pen/qRqmjd">Thinking In React: Step 5</a> on <a href="http://codepen.io">CodePen</a>.</p>
<add><p data-height="600" data-theme-id="0" data-slug-hash="qRqmjd" data-default-tab="js,result" data-user="rohan10" data-embed-version="2" data-pen-title="Thinking In React: Step 5" class="codepen">See the Pen <a href="http://codepen.io/rohan10/pen/qRqmjd">Thinking In React: Step 5</a> on <a href="http://codepen.io">CodePen</a>.</p>
<ide> <script async src="https://production-assets.codepen.io/assets/embed/ei.js"></script>
<ide>
<ide> So far, we've built an app that renders correctly as a function of props and state flowing down the hierarchy. Now it's time to support data flowing the other way: the form components deep in the hierarchy need to update the state in `FilterableProductTable`. | 1 |
PHP | PHP | use random iv values in rijndael | 974ac44fb43a7ecba9f14d16a0c7e39530f14e88 | <ide><path>lib/Cake/Test/Case/Utility/SecurityTest.php
<ide> public function testRijndael() {
<ide> $result = Security::rijndael('', $key, 'encrypt');
<ide> $this->assertEquals('', Security::rijndael($result, $key, 'decrypt'));
<ide>
<del> $result = Security::rijndael($txt, $key = 'this is my key of over 32 chars, yes it is', 'encrypt');
<add> $key = 'this is my key of over 32 chars, yes it is';
<add> $result = Security::rijndael($txt, $key, 'encrypt');
<ide> $this->assertEquals($txt, Security::rijndael($result, $key, 'decrypt'));
<ide> }
<ide>
<add>/**
<add> * Test that rijndael() can still decrypt values with a fixed iv.
<add> *
<add> * @return
<add> */
<add> public function testRijndaelBackwardCompatibility() {
<add> $this->skipIf(!function_exists('mcrypt_encrypt'));
<add>
<add> $txt = 'The quick brown fox jumped over the lazy dog.';
<add> $key = 'DYhG93b0qyJfIxfs2guVoUubWwvniR2G0FgaC9mi';
<add>
<add> // Encrypted before random iv
<add> $value = base64_decode('1WPjnq96LMzLGwNgmudHF+cAIqVUN5DaUZEpf5tm1EzSgt5iYY9o3d66iRI/fKJLTlTVGsa8HzW0jDNitmVXoQ==');
<add> $this->assertEquals($txt, Security::rijndael($value, $key, 'decrypt'));
<add> }
<add>
<ide> /**
<ide> * testRijndaelInvalidOperation method
<ide> *
<ide><path>lib/Cake/Utility/Security.php
<ide> public static function cipher($text, $key) {
<ide> /**
<ide> * Encrypts/Decrypts a text using the given key using rijndael method.
<ide> *
<add> * Prior to 2.3.1, a fixed initialization vector was used. This was not
<add> * secure. This method now uses a random iv, and will silently upgrade values when
<add> * they are re-encrypted.
<add> *
<ide> * @param string $text Encrypted string to decrypt, normal string to encrypt
<ide> * @param string $key Key to use as the encryption key for encrypted data.
<ide> * @param string $operation Operation to perform, encrypt or decrypt
<ide> public static function rijndael($text, $key, $operation) {
<ide> }
<ide> $algorithm = MCRYPT_RIJNDAEL_256;
<ide> $mode = MCRYPT_MODE_CBC;
<add> $ivSize = mcrypt_get_iv_size($algorithm, $mode);
<add>
<ide> $cryptKey = substr($key, 0, 32);
<del> $iv = substr($key, strlen($key) - 32, 32);
<ide>
<ide> if ($operation === 'encrypt') {
<del> return mcrypt_encrypt($algorithm, $cryptKey, $text, $mode, $iv);
<add> $iv = mcrypt_create_iv($ivSize, MCRYPT_RAND);
<add> return $iv . '$$' . mcrypt_encrypt($algorithm, $cryptKey, $text, $mode, $iv);
<add> }
<add> // Backwards compatible decrypt with fixed iv
<add> if (substr($text, $ivSize, 2) !== '$$') {
<add> $iv = substr($key, strlen($key) - 32, 32);
<add> return rtrim(mcrypt_decrypt($algorithm, $cryptKey, $text, $mode, $iv), "\0");
<ide> }
<add> $iv = substr($text, 0, $ivSize);
<add> $text = substr($text, $ivSize + 2);
<ide> return rtrim(mcrypt_decrypt($algorithm, $cryptKey, $text, $mode, $iv), "\0");
<ide> }
<ide> | 2 |
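The decrypt fallback in this patch hinges on a small envelope format: a fresh random IV, a literal `$$` separator, then the ciphertext, with legacy payloads detected by the missing separator. Below is a minimal Python sketch of that framing logic only — a toy XOR stream stands in for mcrypt's Rijndael-256/CBC, so it is not real encryption:

```python
import os

IV_SIZE = 32  # MCRYPT_RIJNDAEL_256 in CBC mode uses a 32-byte IV


def toy_cipher(data: bytes, key: bytes, iv: bytes) -> bytes:
    """Stand-in for mcrypt_encrypt/mcrypt_decrypt; XOR is its own inverse."""
    stream = (key + iv) * (len(data) // len(key + iv) + 1)
    return bytes(a ^ b for a, b in zip(data, stream))


def encrypt(text: bytes, key: bytes) -> bytes:
    iv = os.urandom(IV_SIZE)  # fresh random IV per message
    return iv + b"$$" + toy_cipher(text, key[:32], iv)


def decrypt(payload: bytes, key: bytes) -> bytes:
    if payload[IV_SIZE:IV_SIZE + 2] != b"$$":
        # Backwards-compatible path: old values used a fixed IV taken from the key
        iv = key[-32:]
        return toy_cipher(payload, key[:32], iv)
    iv, text = payload[:IV_SIZE], payload[IV_SIZE + 2:]
    return toy_cipher(text, key[:32], iv)
```

The same caveat applies as in the PHP version: a legacy ciphertext that happens to contain `$$` at offset 32 would be misdetected as the new format.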
Ruby | Ruby | redirect stderr to /dev/null rather than close | dab04e33217403ab44553308999de8e6e1c4a8c0 | <ide><path>Library/Homebrew/utils/popen.rb
<ide> def self.popen(args, mode, options = {})
<ide>
<ide> yield pipe
<ide> else
<del> options[:err] ||= :close unless ENV["HOMEBREW_STDERR"]
<add> options[:err] ||= "/dev/null" unless ENV["HOMEBREW_STDERR"]
<ide> begin
<ide> exec(*args, options)
<ide> rescue Errno::ENOENT | 1 |
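The same idea, illustrated in Python: suppress a child process's stderr by pointing it at the null device rather than closing the descriptor, so writes to fd 2 inside the child still succeed (`subprocess.DEVNULL` plays the role of `options[:err] = "/dev/null"`):

```python
import subprocess
import sys

# The child writes to stderr; with fd 2 closed that write could fail,
# with /dev/null it is silently discarded.
result = subprocess.run(
    [sys.executable, "-c", "import sys; sys.stderr.write('noise'); print('ok')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
    text=True,
)
print(result.stdout.strip())
```

Running this prints `ok` and exits cleanly; the stderr noise never reaches the terminal.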
Text | Text | fix openapi links | a49d744d5ee84ae2e89abde30ceddd2463e1f676 | <ide><path>docs/api-guide/schemas.md
<ide> has to be rendered into the actual bytes that are used in the response.
<ide> REST framework includes a few different renderers that you can use for
<ide> encoding the API schema.
<ide>
<del>* `renderers.OpenAPIRenderer` - Renders into YAML-based [OpenAPI][openapi], the most widely used API schema format.
<del>* `renderers.JSONOpenAPIRenderer` - Renders into JSON-based [OpenAPI][openapi].
<add>* `renderers.OpenAPIRenderer` - Renders into YAML-based [OpenAPI][open-api], the most widely used API schema format.
<add>* `renderers.JSONOpenAPIRenderer` - Renders into JSON-based [OpenAPI][open-api].
<ide> * `renderers.CoreJSONRenderer` - Renders into [Core JSON][corejson], a format designed for
<ide> use with the `coreapi` client library.
<ide> | 1 |
Go | Go | remove unused useshimv2() | bf7fd015f785d7e827f78dddcd274c1f05879059 | <ide><path>daemon/daemon_windows.go
<ide> func (daemon *Daemon) initRuntimes(_ map[string]types.Runtime) error {
<ide> func setupResolvConf(config *config.Config) {
<ide> }
<ide>
<del>func (daemon *Daemon) useShimV2() bool {
<del> return true
<del>}
<del>
<ide> // RawSysInfo returns *sysinfo.SysInfo .
<ide> func (daemon *Daemon) RawSysInfo(quiet bool) *sysinfo.SysInfo {
<ide> return sysinfo.New(quiet)
<ide><path>libcontainerd/libcontainerd_windows.go
<ide> import (
<ide> // NewClient creates a new libcontainerd client from a containerd client
<ide> func NewClient(ctx context.Context, cli *containerd.Client, stateDir, ns string, b libcontainerdtypes.Backend) (libcontainerdtypes.Client, error) {
<ide> if !system.ContainerdRuntimeSupported() {
<del> // useShimV2 is ignored for windows
<ide> return local.NewClient(ctx, cli, stateDir, ns, b)
<ide> }
<ide> return remote.NewClient(ctx, cli, stateDir, ns, b) | 2 |
PHP | PHP | apply fixes from styleci | 0612709ac926414342a4d982fdfd8dca024d4abb | <ide><path>src/Illuminate/View/Compilers/Concerns/CompilesLayouts.php
<ide> protected function compileExtends($expression)
<ide>
<ide> return '';
<ide> }
<del>
<add>
<ide> /**
<ide> * Compile the extends-first statements into valid PHP.
<ide> * | 1 |
Python | Python | use pandas.io.gbq for bigquery integration | 8bae4e1ccabe6426a68f53ff322e1e667c36f730 | <ide><path>airflow/hooks/bigquery_hook.py
<ide> from airflow.hooks.dbapi_hook import DbApiHook
<ide> from apiclient.discovery import build
<ide> from oauth2client.client import SignedJwtAssertionCredentials
<add>from pandas.io.gbq import GbqConnector, _parse_data as gbq_parse_data
<add>from pandas.tools.merge import concat
<ide>
<ide> logging.getLogger("bigquery").setLevel(logging.INFO)
<ide>
<ide> def get_pandas_df(self, bql, parameters=None):
<ide> connection_info = self.get_connection(self.bigquery_conn_id)
<ide> connection_extras = connection_info.extra_dejson
<ide> project = connection_extras['project']
<del> response = service.jobs().query(projectId=project, body={
<del> "query": bql,
<del> }).execute()
<add> connector = BigQueryPandasConnector(project, service)
<add> schema, pages = connector.run_query(bql, verbose=False)
<add> dataframe_list = []
<ide>
<del> if len(response['rows']) > 0:
<del> # Extract column names from response.
<del> columns = [c['name'] for c in response['schema']['fields']]
<add> while len(pages) > 0:
<add> page = pages.pop()
<add> dataframe_list.append(gbq_parse_data(schema, page))
<ide>
<del> # Extract data into a two-dimensional list of values.
<del> data = []
<del> for row in response['rows']:
<del> row = map(lambda kv: kv['v'], row['f'])
<del> data.append(row)
<del>
<del> return pandas.DataFrame(data, columns = columns)
<add> if len(dataframe_list) > 0:
<add> return concat(dataframe_list, ignore_index=True)
<ide> else:
<del> return pandas.DataFrame()
<add> return gbq_parse_data(schema, [])
<add>
<add>class BigQueryPandasConnector(GbqConnector):
<add> """
<add> This connector behaves identically to GbqConnector (from Pandas), except
<add> that it allows the service to be injected, and disables a call to
<add> self.get_credentials(). This allows Airflow to use BigQuery with Pandas
<add> without forcing a three legged OAuth connection. Instead, we can inject
<add> service account credentials into the binding.
<add> """
<add> def __init__(self, project_id, service, reauth=False):
<add> self.test_google_api_imports()
<add> self.project_id = project_id
<add> self.reauth = reauth
<add> self.service = service | 1 |
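The connector subclass in this patch is a dependency-injection move: keep the parent's query machinery but replace the interactive `get_credentials()` call with a service handed in by the caller. A stripped-down sketch of the pattern — the class names here are illustrative stand-ins, not the real pandas or google-api classes:

```python
class BaseConnector:
    """Plays the role of pandas' GbqConnector in this sketch."""

    def __init__(self, project_id):
        self.project_id = project_id
        self.service = self.get_credentials()  # would start a 3-legged OAuth flow

    def get_credentials(self):
        raise RuntimeError("interactive OAuth is not possible on a headless worker")

    def run_query(self, bql):
        return self.service.query(self.project_id, bql)


class InjectedServiceConnector(BaseConnector):
    """Mirrors BigQueryPandasConnector: skip the parent's credential dance
    and bind an already-authorized service instead."""

    def __init__(self, project_id, service):
        self.project_id = project_id
        self.service = service  # service-account credentials injected by the caller


class FakeService:
    """Trivial stand-in for the googleapiclient service object."""

    def query(self, project, bql):
        return f"{project}:{bql}"


conn = InjectedServiceConnector("my-project", FakeService())
print(conn.run_query("SELECT 1"))  # -> my-project:SELECT 1
```

The inherited `run_query` works unchanged because it only depends on `self.service`, which is exactly what the subclass swaps out.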
Text | Text | add cards for all geotrend models | 0a80959bddd5da08742d22dca07e0facf0b4cd11 | <ide><path>model_cards/Geotrend/bert-base-15lang-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-15lang-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>The measurements below have been computed on a [Google Cloud n1-standard-1 machine (1 vCPU, 3.75 GB)](https://cloud.google.com/compute/docs/machine-types#n1_machine_type):
<add>
<add>| Model | Num parameters | Size | Memory | Loading time |
<add>| ------------------------------- | -------------- | -------- | -------- | ------------ |
<add>| bert-base-multilingual-cased | 178 million | 714 MB | 1400 MB | 4.2 sec |
<add>| Geotrend/bert-base-15lang-cased | 141 million | 564 MB | 1098 MB | 3.1 sec |
<add>
<add>Handled languages: en, fr, es, de, zh, ar, ru, vi, el, bg, th, tr, hi, ur and sw.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-15lang-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-15lang-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-ar-cased/README.md
<add>---
<add>language: ar
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-ar-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ar-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-ar-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-bg-cased/README.md
<add>---
<add>language: bg
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-bg-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-bg-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-bg-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-de-cased/README.md
<add>---
<add>language: de
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-de-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-de-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-de-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-el-cased/README.md
<add>---
<add>language: el
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-el-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-el-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-el-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-ar-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-ar-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ar-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-ar-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-bg-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-bg-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-bg-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-bg-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-cased/README.md
<add>---
<add>language: en
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-de-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-de-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-de-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-de-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-el-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-el-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-el-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-el-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-es-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-es-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-es-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-es-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-fr-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-fr-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-hi-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-hi-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-hi-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-hi-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-ru-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-ru-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ru-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-ru-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-sw-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-sw-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-sw-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-sw-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-th-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-th-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-th-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-th-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-tr-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-tr-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-tr-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-tr-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-ur-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-ur-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ur-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-ur-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-vi-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-vi-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-vi-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-vi-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-en-zh-cased/README.md
<add>---
<add>language: multilingual
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-en-zh-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-zh-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-en-zh-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-es-cased/README.md
<add>---
<add>language: es
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-es-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-es-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-es-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-fr-cased/README.md
<add>---
<add>language: fr
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-fr-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-fr-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-fr-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-hi-cased/README.md
<add>---
<add>language: hi
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-hi-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-hi-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-hi-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-ru-cased/README.md
<add>---
<add>language: ru
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-ru-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ru-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-ru-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-sw-cased/README.md
<add>---
<add>language: sw
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-sw-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-sw-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-sw-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-th-cased/README.md
<add>---
<add>language: th
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-th-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-th-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-th-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-tr-cased/README.md
<add>---
<add>language: tr
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-tr-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-tr-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-tr-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-ur-cased/README.md
<add>---
<add>language: ur
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-ur-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ur-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-ur-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-vi-cased/README.md
<add>---
<add>language: vi
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-vi-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-vi-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-vi-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request.
<ide><path>model_cards/Geotrend/bert-base-zh-cased/README.md
<add>---
<add>language: zh
<add>
<add>datasets: wikipedia
<add>
<add>license: apache-2.0
<add>---
<add>
<add># bert-base-zh-cased
<add>
<add>We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
<add>
<add>Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
<add>
<add>For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
<add>
<add>## How to use
<add>
<add>```python
<add>from transformers import AutoTokenizer, AutoModel
<add>
<add>tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-zh-cased")
<add>model = AutoModel.from_pretrained("Geotrend/bert-base-zh-cased")
<add>
<add>```
<add>
<add>### How to cite
<add>
<add>```bibtex
<add>@inproceedings{smallermbert,
<add> title={Load What You Need: Smaller Versions of Multilingual BERT},
<add> author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
<add> booktitle={SustaiNLP / EMNLP},
<add> year={2020}
<add>}
<add>```
<add>
<add>## Contact
<add>
<add>Please contact amine@geotrend.fr for any question, feedback or request. | 30 |
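These cards all describe the same construction from the cited paper: prune mBERT's token vocabulary down to the target language(s) and carry over the matching embedding rows unchanged, while leaving the transformer layers untouched — which is why the reduced models produce exactly the same representations for retained tokens. A minimal sketch of that idea (function and variable names are illustrative, not Geotrend's actual code):

```python
def trim_vocabulary(vocab, embedding_rows, keep_tokens):
    """Build a reduced vocabulary and embedding table.

    vocab:          dict mapping token -> row index in embedding_rows
    embedding_rows: one embedding row (list of floats) per token
    keep_tokens:    tokens to retain for the target language(s)
    """
    new_vocab = {}
    new_rows = []
    for token in keep_tokens:
        if token in vocab:
            new_vocab[token] = len(new_rows)
            # Rows are copied unchanged, so every kept token keeps the
            # exact embedding it had in the full multilingual model.
            new_rows.append(embedding_rows[vocab[token]])
    return new_vocab, new_rows
```

Only the embedding matrix shrinks in this setup; since no parameters are re-trained, accuracy on the retained languages is preserved.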
Text | Text | fix broken data fetching links in docs | 60488e68e323ce7ca1b2eee74ea560f486d34e68 | <ide><path>docs/basic-features/eslint.md
<ide> This will take precedence over the configuration from `next.config.js`.
<ide>
<ide> Next.js provides an ESLint plugin, [`eslint-plugin-next`](https://www.npmjs.com/package/@next/eslint-plugin-next), already bundled within the base configuration that makes it possible to catch common issues and problems in a Next.js application. The full set of rules is as follows:
<ide>
<del>| | Rule | Description |
<del>| :-: | ---------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------- |
<del>| ✔️ | [next/google-font-display](https://nextjs.org/docs/messages/google-font-display) | Enforce optional or swap font-display behavior with Google Fonts |
<del>| ✔️ | [next/google-font-preconnect](https://nextjs.org/docs/messages/google-font-preconnect) | Enforce preconnect usage with Google Fonts |
<del>| ✔️ | [next/link-passhref](https://nextjs.org/docs/messages/link-passhref) | Enforce passHref prop usage with custom Link components |
<del>| ✔️ | [next/no-css-tags](https://nextjs.org/docs/messages/no-css-tags) | Prevent manual stylesheet tags |
<del>| ✔️ | [next/no-document-import-in-page](https://nextjs.org/docs/messages/no-document-import-in-page) | Disallow importing next/document outside of pages/document.js |
<del>| ✔️ | [next/no-head-import-in-document](https://nextjs.org/docs/messages/no-head-import-in-document) | Disallow importing next/head in pages/document.js |
<del>| ✔️ | [next/no-html-link-for-pages](https://nextjs.org/docs/messages/no-html-link-for-pages) | Prohibit HTML anchor links to pages without a Link component |
<del>| ✔️ | [next/no-img-element](https://nextjs.org/docs/messages/no-img-element) | Prohibit usage of HTML <img> element |
<del>| ✔️ | [next/no-head-element](https://nextjs.org/docs/messages/no-head-element) | Prohibit usage of HTML <head> element |
<del>| ✔️ | [next/no-page-custom-font](https://nextjs.org/docs/messages/no-page-custom-font) | Prevent page-only custom fonts |
<del>| ✔️ | [next/no-sync-scripts](https://nextjs.org/docs/messages/no-sync-scripts) | Forbid synchronous scripts |
<del>| ✔️ | [next/no-title-in-document-head](https://nextjs.org/docs/messages/no-title-in-document-head) | Disallow using <title> with Head from next/document |
<del>| ✔️ | [next/no-unwanted-polyfillio](https://nextjs.org/docs/messages/no-unwanted-polyfillio) | Prevent duplicate polyfills from Polyfill.io |
<del>| ✔️ | [next/inline-script-id](https://nextjs.org/docs/messages/inline-script-id) | Enforce id attribute on next/script components with inline content |
<del>| ✔️ | next/no-typos | Ensure no typos were made declaring [Next.js's data fetching function](https://nextjs.org/docs/basic-features/data-fetching) |
<del>| ✔️ | [next/next-script-for-ga](https://nextjs.org/docs/messages/next-script-for-ga) | Use the Script component to defer loading of the script until necessary. |
<add>| | Rule | Description |
<add>| :-: | ------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------- |
<add>| ✔️ | [next/google-font-display](/docs/messages/google-font-display.md) | Enforce optional or swap font-display behavior with Google Fonts |
<add>| ✔️ | [next/google-font-preconnect](/docs/messages/google-font-preconnect.md) | Enforce preconnect usage with Google Fonts |
<add>| ✔️ | [next/link-passhref](/docs/messages/link-passhref.md) | Enforce passHref prop usage with custom Link components |
<add>| ✔️ | [next/no-css-tags](/docs/messages/no-css-tags.md) | Prevent manual stylesheet tags |
<add>| ✔️ | [next/no-document-import-in-page](/docs/messages/no-document-import-in-page.md) | Disallow importing next/document outside of pages/document.js |
<add>| ✔️ | [next/no-head-import-in-document](/docs/messages/no-head-import-in-document.md) | Disallow importing next/head in pages/document.js |
<add>| ✔️ | [next/no-html-link-for-pages](/docs/messages/no-html-link-for-pages.md) | Prohibit HTML anchor links to pages without a Link component |
<add>| ✔️ | [next/no-img-element](/docs/messages/no-img-element.md) | Prohibit usage of HTML <img> element |
<add>| ✔️ | [next/no-head-element](/docs/messages/no-head-element.md) | Prohibit usage of HTML <head> element |
<add>| ✔️ | [next/no-page-custom-font](/docs/messages/no-page-custom-font.md) | Prevent page-only custom fonts |
<add>| ✔️ | [next/no-sync-scripts](/docs/messages/no-sync-scripts.md) | Forbid synchronous scripts |
<add>| ✔️ | [next/no-title-in-document-head](/docs/messages/no-title-in-document-head.md) | Disallow using <title> with Head from next/document |
<add>| ✔️ | [next/no-unwanted-polyfillio](/docs/messages/no-unwanted-polyfillio.md) | Prevent duplicate polyfills from Polyfill.io |
<add>| ✔️ | [next/inline-script-id](/docs/messages/inline-script-id.md) | Enforce id attribute on next/script components with inline content |
<add>| ✔️ | next/no-typos | Ensure no typos were made declaring [Next.js's data fetching function](/docs/basic-features/data-fetching/overview.md) |
<add>| ✔️ | [next/next-script-for-ga](/docs/messages/next-script-for-ga.md) | Use the Script component to defer loading of the script until necessary. |
<ide>
<ide> - ✔: Enabled in the recommended configuration
<ide>
<ide><path>errors/conflicting-ssg-paths.md
<ide> export default function CatchAll() {
<ide>
<ide> ### Useful Links
<ide>
<del>- [`getStaticPaths` Documentation](https://nextjs.org/docs/api-reference/data-fetching/get-static-paths)
<add>- [`getStaticPaths` Documentation](https://nextjs.org/docs/basic-features/data-fetching/get-static-paths)
<ide><path>errors/gsp-redirect-during-prerender.md
<ide> Remove any paths that result in a redirect from being prerendered in `getStaticP
<ide>
<ide> ### Useful Links
<ide>
<del>- [Data Fetching Documentation](/docs/basic-features/data-fetching/get-static-props.md)
<add>- [Data Fetching Documentation](https://nextjs.org/docs/basic-features/data-fetching/get-static-props)
<ide><path>errors/gssp-export.md
<ide> The `getServerSideProps` lifecycle is not compatible with `next export`, so you'
<ide>
<ide> ### Useful Links
<ide>
<del>- [Automatic Static Optimization](/docs/advanced-features/automatic-static-optimization.md)
<del>- [`getStaticProps` documentation](/docs/basic-features/data-fetching/get-static-props.md)
<del>- [`exportPathMap` documentation](/docs/api-reference/next.config.js/exportPathMap.md)
<add>- [Automatic Static Optimization](https://nextjs.org/docs/advanced-features/automatic-static-optimization)
<add>- [`getStaticProps` documentation](https://nextjs.org/docs/basic-features/data-fetching/get-static-props)
<add>- [`exportPathMap` documentation](https://nextjs.org/docs/api-reference/next.config.js/exportPathMap)
<ide><path>errors/gssp-mixed-not-found-redirect.md
<ide> Make sure only `notFound` **or** `redirect` is being returned on your page's `ge
<ide>
<ide> ### Useful Links
<ide>
<del>- [`getStaticProps` Documentation](/docs/basic-features/data-fetching/get-static-props.md)
<del>- [`getServerSideProps` Documentation](/docs/basic-features/data-fetching/get-server-side-props.md)
<add>- [`getStaticProps` Documentation](https://nextjs.org/docs/basic-features/data-fetching/get-static-props)
<add>- [`getServerSideProps` Documentation](https://nextjs.org/docs/basic-features/data-fetching/get-server-side-props)
<ide><path>errors/gssp-no-mutating-res.md
<ide> If you’re using a custom server and running into this problem due to session m
<ide>
<ide> ### Useful Links
<ide>
<del>- [Data Fetching Docs](https://nextjs.org/docs/basic-features/data-fetching)
<add>- [Data Fetching Docs](https://nextjs.org/docs/basic-features/data-fetching/index)
<ide><path>errors/invalid-redirect-gssp.md
<ide> export const getStaticProps = ({ params }) => {
<ide>
<ide> ### Useful Links
<ide>
<del>- [Data Fetching Documentation](/docs/basic-features/data-fetching/get-static-props.md)
<add>- [Data Fetching Documentation](https://nextjs.org/docs/basic-features/data-fetching/get-static-props)
<ide><path>errors/large-page-data.md
<ide> Reduce the amount of data returned from `getStaticProps`, `getServerSideProps`,
<ide>
<ide> ### Useful Links
<ide>
<del>- [Data Fetching Documentation](https://nextjs.org/docs/basic-features/data-fetching)
<add>- [Data Fetching Documentation](https://nextjs.org/docs/basic-features/data-fetching/overview)
<ide><path>errors/page-data-collection-timeout.md
<ide> When restarted it will retry all uncompleted jobs, but if a job was unsuccessful
<ide>
<ide> ### Useful Links
<ide>
<del>- [`getStaticPaths`](/docs/basic-features/data-fetching/get-static-paths.md)
<add>- [`getStaticPaths`](https://nextjs.org/docs/basic-features/data-fetching/get-static-paths)
<add>- [`getStaticProps`](https://nextjs.org/docs/basic-features/data-fetching/get-static-props)
<ide><path>errors/prerender-error.md
<ide> While prerendering a page an error occurred. This can occur for many reasons fro
<ide> - Set default values for all dynamic pages' props (avoid `undefined`, use `null` instead so it can be serialized)
<ide> - Check for any out of date modules that you might be relying on
<ide> - Make sure your component handles `fallback` if it is enabled in `getStaticPaths`. [Fallback docs](https://nextjs.org/docs/api-reference/data-fetching/get-static-paths#fallback-false)
<del>- Make sure you are not trying to export (`next export`) pages that have server-side rendering enabled [(getServerSideProps)](/docs/basic-features/data-fetching/get-server-side-props.md)
<add>- Make sure you are not trying to export (`next export`) pages that have server-side rendering enabled [(getServerSideProps)](https://nextjs.org/docs/basic-features/data-fetching/get-server-side-props)
<ide><path>errors/ssg-fallback-true-export.md
<ide> If you would like the `fallback: true` behavior, `next export` should not be use
<ide> ### Useful Links
<ide>
<ide> - [deployment documentation](https://nextjs.org/docs/deployment#vercel-recommended)
<del>- [`fallback: true` documentation](https://nextjs.org/docs/basic-features/data-fetching#fallback-true)
<add>- [`fallback: true` documentation](https://nextjs.org/docs/api-reference/data-fetching/get-static-paths#fallback-true) | 11 |
PHP | PHP | fix presenceverifier contract | b4df9393bf0185929aa864529798caae740da99a | <ide><path>src/Illuminate/Validation/DatabasePresenceVerifier.php
<ide> namespace Illuminate\Validation;
<ide>
<ide> use Closure;
<del>use Illuminate\Support\Str;
<ide> use Illuminate\Database\ConnectionResolverInterface;
<add>use Illuminate\Support\Str;
<ide>
<ide> class DatabasePresenceVerifier implements PresenceVerifierInterface
<ide> {
<ide> protected function addWhere($query, $key, $extraValue)
<ide> * @param string $table
<ide> * @return \Illuminate\Database\Query\Builder
<ide> */
<del> public function table($table)
<add> protected function table($table)
<ide> {
<ide> return $this->db->connection($this->connection)->table($table)->useWritePdo();
<ide> }
<ide><path>src/Illuminate/Validation/PresenceVerifierInterface.php
<ide> public function getCount($collection, $column, $value, $excludeId = null, $idCol
<ide> * @return int
<ide> */
<ide> public function getMultiCount($collection, $column, array $values, array $extra = []);
<add>
<add> /**
<add> * Set the connection to be used.
<add> *
<add> * @param string $connection
<add> * @return void
<add> */
<add> public function setConnection($connection);
<ide> } | 2 |
Go | Go | enable auto-creation of host-path on windows | 4e080347af657ca3a0c103c6bc6cd6a8157d20d8 | <ide><path>volume/volume.go
<ide> package volume
<ide> import (
<ide> "fmt"
<ide> "os"
<del> "runtime"
<ide> "strings"
<ide>
<ide> "github.com/docker/docker/pkg/stringid"
<add> "github.com/docker/docker/pkg/system"
<ide> )
<ide>
<ide> // DefaultDriverName is the driver name used for the driver
<ide> func (m *MountPoint) Setup() (string, error) {
<ide> }
<ide> return m.Volume.Mount(m.ID)
<ide> }
<del> if len(m.Source) > 0 {
<del> if _, err := os.Stat(m.Source); err != nil {
<del> if !os.IsNotExist(err) {
<del> return "", err
<del> }
<del> if runtime.GOOS != "windows" { // Windows does not have deprecation issues here
<del> if err := os.MkdirAll(m.Source, 0755); err != nil {
<del> return "", err
<del> }
<del> }
<add> if len(m.Source) == 0 {
<add> return "", fmt.Errorf("Unable to setup mount point, neither source nor volume defined")
<add> }
<add> // system.MkdirAll() produces an error if m.Source exists and is a file (not a directory),
<add> // so first check if the path does not exist
<add> if _, err := os.Stat(m.Source); err != nil {
<add> if !os.IsNotExist(err) {
<add> return "", err
<add> }
<add> if err := system.MkdirAll(m.Source, 0755); err != nil {
<add> return "", err
<ide> }
<del> return m.Source, nil
<ide> }
<del> return "", fmt.Errorf("Unable to setup mount point, neither source nor volume defined")
<add> return m.Source, nil
<ide> }
<ide>
<ide> // Path returns the path of a volume in a mount point. | 1 |
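The stat-then-create flow introduced by this patch can be mirrored in Python for illustration — a hedged sketch of the pattern, not Docker code, with `os.makedirs` standing in for `system.MkdirAll`:

```python
import os

def ensure_mount_source(source):
    """Create `source` as a directory only if nothing exists there yet.

    Mirrors the patched Go logic: an empty source is an error, a stat
    failure other than "not found" propagates, and an existing path --
    directory *or* plain file (e.g. a bind-mounted file) -- is left
    untouched rather than rejected.
    """
    if not source:
        raise ValueError("neither source nor volume defined")
    try:
        os.stat(source)
    except FileNotFoundError:
        # Only a genuinely missing path is created, parents included.
        os.makedirs(source, mode=0o755)
    return source
```

Checking existence first is what keeps file-typed sources working: calling the recursive mkdir unconditionally would fail on a path that already exists as a file.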
Javascript | Javascript | fix comment on grammarregistry class | 00bea438401d01d5383872bff9bc6ec4c803ebad | <ide><path>src/grammar-registry.js
<ide> const {Point, Range} = require('text-buffer')
<ide> const GRAMMAR_SELECTION_RANGE = Range(Point.ZERO, Point(10, 0)).freeze()
<ide> const PATH_SPLIT_REGEX = new RegExp('[/.]')
<ide>
<del>// Extended: Syntax class holding the grammars used for tokenizing.
<add>// Extended: This class holds the grammars used for tokenizing.
<ide> //
<ide> // An instance of this class is always available as the `atom.grammars` global.
<del>//
<del>// The Syntax class also contains properties for things such as the
<del>// language-specific comment regexes. See {::getProperty} for more details.
<ide> module.exports =
<ide> class GrammarRegistry extends FirstMate.GrammarRegistry {
<ide> constructor ({config} = {}) { | 1 |
PHP | PHP | add assert that habtm save does not return false | 842b1802d99344f6ce0b0182b1a5b39dd2a5f551 | <ide><path>lib/Cake/Test/Case/Model/ModelWriteTest.php
<ide> public function testSaveHabtmNoPrimaryData() {
<ide>
<ide> $TestModel->id = 2;
<ide> $data = array('Tag' => array('Tag' => array(2)));
<del> $TestModel->save($data);
<add> $result = $TestModel->save($data);
<add>
<add> $this->assertEquals($data['Tag'], $result['Tag']);
<ide>
<ide> $result = $TestModel->findById(2);
<ide> $expected = array( | 1 |
Python | Python | add test for lazy loading error | 2305b056c3a3161c92ea61e6fceb67262766bc40 | <ide><path>tests/test_cli.py
<ide> def create_app():
<ide> assert app.name == "testapp"
<ide>
<ide>
<add>def test_lazy_load_error(monkeypatch):
<add> """When using lazy loading, the correct exception should be
<add> re-raised.
<add> """
<add>
<add> class BadExc(Exception):
<add> pass
<add>
<add> def bad_load():
<add> raise BadExc
<add>
<add> lazy = DispatchingApp(bad_load, use_eager_loading=False)
<add>
<add> with pytest.raises(BadExc):
<add> lazy._flush_bg_loading_exception()
<add>
<add>
<ide> def test_with_appcontext(runner):
<ide> @click.command()
<ide> @with_appcontext
<ide> def test_click_7_deprecated():
<ide> pytest.deprecated_call(cli_main, match=".* Click 7 is deprecated")
<ide> else:
<ide> cli_main()
<del>
<del>
<del>def test_load_in_background():
<del> pytest.raises(Exception, DispatchingApp, "appname123") | 1 |
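The `DispatchingApp` behavior being tested rests on a small pattern: run the loader in another thread, stash any exception it raises, and re-raise that exception from a foreground "flush" call. A rough Python sketch of the idea (class and method names here are invented, not Flask's real implementation; the thread is joined immediately to keep the sketch deterministic):

```python
import threading

class LazyLoader:
    """Load an object in a worker thread; surface any failure on first use."""

    def __init__(self, loader):
        self._bg_error = None
        self._app = None

        def _load():
            try:
                self._app = loader()
            except BaseException as exc:  # capture now, re-raise in the caller's thread
                self._bg_error = exc

        worker = threading.Thread(target=_load)
        worker.start()
        worker.join()

    def flush_bg_loading_exception(self):
        """Re-raise (once) whatever the background load failed with."""
        if self._bg_error is not None:
            exc, self._bg_error = self._bg_error, None
            raise exc
```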
Javascript | Javascript | run tests against firefox 85 on macos catalina | 58cd89742029291a30678bfe440fad958c9a0275 | <ide><path>protractor-circleci-conf.js
<ide> config.multiCapabilities = [
<ide> }),
<ide> capabilitiesForSauceLabs({
<ide> browserName: 'firefox',
<del> platform: 'OS X 10.14',
<del> version: '76'
<add> platform: 'OS X 10.15',
<add> version: '85'
<ide> })
<ide> ];
<ide> | 1 |
Go | Go | remove unused code | a34cb321865c16ac42a12fe32fa3c8c4a5ffd263 | <ide><path>integration-cli/utils.go
<ide> import (
<ide> "errors"
<ide> "fmt"
<ide> "io"
<del> "net/http"
<ide> "net/http/httptest"
<ide> "os"
<ide> "os/exec"
<ide> type FileServer struct {
<ide> *httptest.Server
<ide> }
<ide>
<del>func fileServer(files map[string]string) (*FileServer, error) {
<del> var handler http.HandlerFunc = func(w http.ResponseWriter, r *http.Request) {
<del> if filePath, found := files[r.URL.Path]; found {
<del> http.ServeFile(w, r, filePath)
<del> } else {
<del> http.Error(w, http.StatusText(404), 404)
<del> }
<del> }
<del>
<del> for _, file := range files {
<del> if _, err := os.Stat(file); err != nil {
<del> return nil, err
<del> }
<del> }
<del> server := httptest.NewServer(handler)
<del> return &FileServer{
<del> Server: server,
<del> }, nil
<del>}
<del>
<ide> // randomUnixTmpDirPath provides a temporary unix path with rand string appended.
<ide> // does not create or checks if it exists.
<ide> func randomUnixTmpDirPath(s string) string { | 1 |
Python | Python | add state attribute to kubernetespod | a2b69f3f8d4d6b81299d69a8c45ecd7341d8fb77 | <ide><path>libcloud/container/drivers/kubernetes.py
<ide>
<ide>
<ide> class KubernetesPod(object):
<del> def __init__(self, id, name, containers, namespace):
<add> def __init__(self, id, name, containers, namespace, state):
<ide> """
<ide> A Kubernetes pod
<ide> """
<ide> self.id = id
<ide> self.name = name
<ide> self.containers = containers
<ide> self.namespace = namespace
<add> self.state = state
<ide>
<ide>
<ide> class KubernetesContainerDriver(KubernetesDriverMixin, ContainerDriver):
<ide> def _to_pod(self, data):
<ide> id=data["metadata"]["uid"],
<ide> name=data["metadata"]["name"],
<ide> namespace=data["metadata"]["namespace"],
<add> state=data["status"]["phase"].lower(),
<ide> containers=containers,
<ide> )
<ide> | 1 |
Java | Java | use synchonous api for synchonous okhttp requests | 37b32d38bcb8a5e28ef17104ae8113c33b59a5df | <ide><path>spring-web/src/main/java/org/springframework/http/client/OkHttpAsyncClientHttpRequest.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.http.client;
<add>
<add>import java.io.IOException;
<add>import java.net.URI;
<add>
<add>import com.squareup.okhttp.Call;
<add>import com.squareup.okhttp.Callback;
<add>import com.squareup.okhttp.OkHttpClient;
<add>import com.squareup.okhttp.Request;
<add>import com.squareup.okhttp.Response;
<add>
<add>import org.springframework.http.HttpHeaders;
<add>import org.springframework.http.HttpMethod;
<add>import org.springframework.util.concurrent.ListenableFuture;
<add>import org.springframework.util.concurrent.SettableListenableFuture;
<add>
<add>/**
<add> * {@link AsyncClientHttpRequest} implementation that uses OkHttp to execute requests.
<add> *
<add> * <p>Created via the {@link OkHttpClientHttpRequestFactory}.
<add> *
<add> * @author Luciano Leggieri
<add> * @author Arjen Poutsma
<add> * @since 4.3
<add> */
<add>class OkHttpAsyncClientHttpRequest extends AbstractBufferingAsyncClientHttpRequest {
<add>
<add> private final OkHttpClient client;
<add>
<add> private final URI uri;
<add>
<add> private final HttpMethod method;
<add>
<add>
<add> public OkHttpAsyncClientHttpRequest(OkHttpClient client, URI uri, HttpMethod method) {
<add> this.client = client;
<add> this.uri = uri;
<add> this.method = method;
<add> }
<add>
<add>
<add> @Override
<add> public HttpMethod getMethod() {
<add> return this.method;
<add> }
<add>
<add> @Override
<add> public URI getURI() {
<add> return this.uri;
<add> }
<add>
<add> @Override
<add> protected ListenableFuture<ClientHttpResponse> executeInternal(HttpHeaders headers, byte[] content)
<add> throws IOException {
<add>
<add> Request request = OkHttpClientHttpRequestFactory
<add> .buildRequest(headers, content, this.uri, this.method);
<add>
<add> return new OkHttpListenableFuture(this.client.newCall(request));
<add> }
<add>
<add> private static class OkHttpListenableFuture
<add> extends SettableListenableFuture<ClientHttpResponse> {
<add>
<add> private final Call call;
<add>
<add> public OkHttpListenableFuture(Call call) {
<add> this.call = call;
<add> this.call.enqueue(new Callback() {
<add> @Override
<add> public void onResponse(Response response) {
<add> set(new OkHttpClientHttpResponse(response));
<add> }
<add>
<add> @Override
<add> public void onFailure(Request request, IOException ex) {
<add> setException(ex);
<add> }
<add> });
<add> }
<add>
<add> @Override
<add> protected void interruptTask() {
<add> this.call.cancel();
<add> }
<add> }
<add>
<add>}
<ide><path>spring-web/src/main/java/org/springframework/http/client/OkHttpClientHttpRequest.java
<ide> /*
<del> * Copyright 2002-2015 the original author or authors.
<add> * Copyright 2002-2016 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> import java.io.IOException;
<ide> import java.net.URI;
<del>import java.net.URL;
<del>import java.util.List;
<del>import java.util.Map;
<del>import java.util.concurrent.ExecutionException;
<del>
<del>import com.squareup.okhttp.Call;
<del>import com.squareup.okhttp.Callback;
<del>import com.squareup.okhttp.MediaType;
<add>
<ide> import com.squareup.okhttp.OkHttpClient;
<ide> import com.squareup.okhttp.Request;
<del>import com.squareup.okhttp.RequestBody;
<del>import com.squareup.okhttp.Response;
<ide>
<ide> import org.springframework.http.HttpHeaders;
<ide> import org.springframework.http.HttpMethod;
<del>import org.springframework.util.StringUtils;
<del>import org.springframework.util.concurrent.ListenableFuture;
<del>import org.springframework.util.concurrent.SettableListenableFuture;
<ide>
<ide> /**
<ide> * {@link ClientHttpRequest} implementation that uses OkHttp to execute requests.
<ide> * @author Arjen Poutsma
<ide> * @since 4.2
<ide> */
<del>class OkHttpClientHttpRequest extends AbstractBufferingAsyncClientHttpRequest implements ClientHttpRequest {
<add>class OkHttpClientHttpRequest extends AbstractBufferingClientHttpRequest {
<ide>
<ide> private final OkHttpClient client;
<ide>
<ide> public URI getURI() {
<ide> return this.uri;
<ide> }
<ide>
<del> @Override
<del> protected ListenableFuture<ClientHttpResponse> executeInternal(HttpHeaders headers, byte[] content)
<del> throws IOException {
<del>
<del> MediaType contentType = getContentType(headers);
<del> RequestBody body = (content.length > 0 ? RequestBody.create(contentType, content) : null);
<del>
<del> URL url = this.uri.toURL();
<del> String methodName = this.method.name();
<del> Request.Builder builder = new Request.Builder().url(url).method(methodName, body);
<del>
<del> for (Map.Entry<String, List<String>> entry : headers.entrySet()) {
<del> String headerName = entry.getKey();
<del> for (String headerValue : entry.getValue()) {
<del> builder.addHeader(headerName, headerValue);
<del> }
<del> }
<del> Request request = builder.build();
<del>
<del> return new OkHttpListenableFuture(this.client.newCall(request));
<del> }
<del>
<del> private MediaType getContentType(HttpHeaders headers) {
<del> String rawContentType = headers.getFirst("Content-Type");
<del> return (StringUtils.hasText(rawContentType) ? MediaType.parse(rawContentType) : null);
<del> }
<ide>
<ide> @Override
<del> public ClientHttpResponse execute() throws IOException {
<del> try {
<del> return executeAsync().get();
<del> }
<del> catch (InterruptedException ex) {
<del> throw new IOException(ex.getMessage(), ex);
<del> }
<del> catch (ExecutionException ex) {
<del> Throwable cause = ex.getCause();
<del> if (cause instanceof IOException) {
<del> throw (IOException) cause;
<del> }
<del> throw new IOException(cause.getMessage(), cause);
<del> }
<del> }
<add> protected ClientHttpResponse executeInternal(HttpHeaders headers, byte[] content)
<add> throws IOException {
<ide>
<add> Request request = OkHttpClientHttpRequestFactory
<add> .buildRequest(headers, content, this.uri, this.method);
<ide>
<del> private static class OkHttpListenableFuture extends SettableListenableFuture<ClientHttpResponse> {
<del>
<del> private final Call call;
<del>
<del> public OkHttpListenableFuture(Call call) {
<del> this.call = call;
<del> this.call.enqueue(new Callback() {
<del> @Override
<del> public void onResponse(Response response) {
<del> set(new OkHttpClientHttpResponse(response));
<del> }
<del> @Override
<del> public void onFailure(Request request, IOException ex) {
<del> setException(ex);
<del> }
<del> });
<del> }
<del>
<del> @Override
<del> protected void interruptTask() {
<del> this.call.cancel();
<del> }
<add> return new OkHttpClientHttpResponse(this.client.newCall(request).execute());
<ide> }
<ide>
<ide> }
<ide><path>spring-web/src/main/java/org/springframework/http/client/OkHttpClientHttpRequestFactory.java
<ide> /*
<del> * Copyright 2002-2015 the original author or authors.
<add> * Copyright 2002-2016 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> package org.springframework.http.client;
<ide>
<add>import java.net.MalformedURLException;
<ide> import java.net.URI;
<add>import java.net.URL;
<add>import java.util.List;
<add>import java.util.Map;
<ide> import java.util.concurrent.TimeUnit;
<ide>
<ide> import com.squareup.okhttp.OkHttpClient;
<add>import com.squareup.okhttp.Request;
<add>import com.squareup.okhttp.RequestBody;
<ide>
<ide> import org.springframework.beans.factory.DisposableBean;
<add>import org.springframework.http.HttpHeaders;
<ide> import org.springframework.http.HttpMethod;
<ide> import org.springframework.util.Assert;
<add>import org.springframework.util.StringUtils;
<ide>
<ide> /**
<ide> * {@link ClientHttpRequestFactory} implementation that uses
<ide> public void setConnectTimeout(int connectTimeout) {
<ide>
<ide> @Override
<ide> public ClientHttpRequest createRequest(URI uri, HttpMethod httpMethod) {
<del> return createRequestInternal(uri, httpMethod);
<add> return new OkHttpClientHttpRequest(this.client, uri, httpMethod);
<ide> }
<ide>
<ide> @Override
<ide> public AsyncClientHttpRequest createAsyncRequest(URI uri, HttpMethod httpMethod) {
<del> return createRequestInternal(uri, httpMethod);
<add> return new OkHttpAsyncClientHttpRequest(this.client, uri, httpMethod);
<ide> }
<ide>
<del> private OkHttpClientHttpRequest createRequestInternal(URI uri, HttpMethod httpMethod) {
<del> return new OkHttpClientHttpRequest(this.client, uri, httpMethod);
<add> static Request buildRequest(HttpHeaders headers, byte[] content, URI uri,
<add> HttpMethod method) throws MalformedURLException {
<add> com.squareup.okhttp.MediaType contentType = getContentType(headers);
<add> RequestBody body =
<add> (content.length > 0 ? RequestBody.create(contentType, content) : null);
<add>
<add> URL url = uri.toURL();
<add> String methodName = method.name();
<add> Request.Builder builder = new Request.Builder().url(url).method(methodName, body);
<add>
<add> for (Map.Entry<String, List<String>> entry : headers.entrySet()) {
<add> String headerName = entry.getKey();
<add> for (String headerValue : entry.getValue()) {
<add> builder.addHeader(headerName, headerValue);
<add> }
<add> }
<add> return builder.build();
<add> }
<add>
<add> private static com.squareup.okhttp.MediaType getContentType(HttpHeaders headers) {
<add> String rawContentType = headers.getFirst(HttpHeaders.CONTENT_TYPE);
<add> return (StringUtils.hasText(rawContentType) ?
<add> com.squareup.okhttp.MediaType.parse(rawContentType) : null);
<ide> }
<ide>
<add>
<ide> @Override
<ide> public void destroy() throws Exception {
<ide> if (this.defaultClient) { | 3 |
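`OkHttpListenableFuture` above bridges OkHttp's callback-based `enqueue` into a future that the synchronous path used to block on. The same adapter shape in Python, assuming a hypothetical `call` object exposing `enqueue(on_response, on_failure)` (this is a sketch of the pattern, not OkHttp's API):

```python
from concurrent.futures import Future

def call_to_future(call):
    """Adapt a callback-style async call into a concurrent.futures.Future."""
    fut = Future()

    def on_response(response):
        fut.set_result(response)

    def on_failure(exc):
        fut.set_exception(exc)

    call.enqueue(on_response, on_failure)
    return fut
```

The commit's point stands either way: wrapping a future back into a blocking `execute()` (as the old `OkHttpClientHttpRequest` did) is redundant when the client already offers a synchronous call.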
Javascript | Javascript | use getbytes() instead of looping over getbyte() | fc73e2e17354d5e5682f956da35ad73ccfa3d710 | <ide><path>src/core/core.js
<ide> * See the License for the specific language governing permissions and
<ide> * limitations under the License.
<ide> */
<del>/* globals assert, calculateMD5, Catalog, Dict, error, info, isArray,
<del> isArrayBuffer, isName, isStream, isString, createPromiseCapability,
<add>/* globals assert, bytesToString, calculateMD5, Catalog, Dict, error, info,
<add> isArray, isArrayBuffer, isName, isStream, isString,
<ide> Linearization, NullStream, PartialEvaluator, shadow, Stream, Lexer,
<ide> StreamsSequenceStream, stringToPDFString, stringToBytes, Util, XRef,
<ide> MissingDataException, Promise, Annotation, ObjectLoader, OperatorList
<ide> var PDFDocument = (function PDFDocumentClosure() {
<ide> function find(stream, needle, limit, backwards) {
<ide> var pos = stream.pos;
<ide> var end = stream.end;
<del> var strBuf = [];
<ide> if (pos + limit > end) {
<ide> limit = end - pos;
<ide> }
<del> for (var n = 0; n < limit; ++n) {
<del> strBuf.push(String.fromCharCode(stream.getByte()));
<del> }
<del> var str = strBuf.join('');
<add> var str = bytesToString(stream.getBytes(limit));
<ide> stream.pos = pos;
<ide> var index = backwards ? str.lastIndexOf(needle) : str.indexOf(needle);
<ide> if (index == -1) {
<ide><path>src/core/stream.js
<ide> var FlateStream = (function FlateStreamClosure() {
<ide> this.eof = true;
<ide> }
<ide> } else {
<del> for (var n = bufferLength; n < end; ++n) {
<del> if ((b = str.getByte()) === -1) {
<del> this.eof = true;
<del> break;
<del> }
<del> buffer[n] = b;
<add> var block = str.getBytes(blockLen);
<add> buffer.set(block, bufferLength);
<add> if (block.length < blockLen) {
<add> this.eof = true;
<ide> }
<ide> }
<ide> return; | 2 |
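The pdf.js change replaces a byte-by-byte read loop with one `getBytes()` call decoded through `bytesToString`, then searches the decoded window. The same windowed-search idea in Python (function name and the latin-1 decoding assumption are mine, chosen to mimic one-byte-per-char semantics):

```python
def find_in_stream(data, pos, needle, limit, backwards=False):
    """Search for `needle` inside data[pos:pos+limit] without consuming the stream.

    Grabs the whole window in a single slice (like getBytes) and uses
    str.find / str.rfind instead of scanning one byte at a time.
    """
    end = len(data)
    limit = min(limit, end - pos)
    window = data[pos:pos + limit].decode("latin-1")
    index = window.rfind(needle) if backwards else window.find(needle)
    return None if index == -1 else pos + index
```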
Java | Java | update javadoc for testcontexttransactionutils | 7bcda3a29bce4568aad89939987402320946bc5a | <ide><path>spring-test/src/main/java/org/springframework/test/context/transaction/TestContextTransactionUtils.java
<ide> public static DataSource retrieveDataSource(TestContext testContext, @Nullable S
<ide> * <li>Look up the transaction manager by type and explicit name, if the supplied
<ide> * {@code name} is non-empty, throwing a {@link BeansException} if the named
<ide> * transaction manager does not exist.
<del> * <li>Attempt to look up the single transaction manager by type.
<ide> * <li>Attempt to look up the transaction manager via a
<ide> * {@link TransactionManagementConfigurer}, if present.
<add> * <li>Attempt to look up the single transaction manager by type.
<ide> * <li>Attempt to look up the <em>primary</em> transaction manager by type.
<ide> * <li>Attempt to look up the transaction manager by type and the
<ide> * {@linkplain #DEFAULT_TRANSACTION_MANAGER_NAME default transaction manager | 1 |
Ruby | Ruby | fix exceptions messages in ac layouts | c16f4f1f38c8d209379cfec45b259538aba61c7c | <ide><path>actionpack/lib/abstract_controller/layouts.rb
<ide> def _write_layout_method
<ide> when false
<ide> nil
<ide> when true
<del> raise ArgumentError, "Layouts must be specified as a String, Symbol, false, or nil"
<add> raise ArgumentError, "Layouts must be specified as a String, Symbol, Proc, false, or nil"
<ide> when nil
<ide> name_clause
<ide> end
<ide> def _layout_for_option(name)
<ide> when false, nil then nil
<ide> else
<ide> raise ArgumentError,
<del> "String, true, or false, expected for `layout'; you passed #{name.inspect}"
<add> "String, Proc, :default, true, or false, expected for `layout'; you passed #{name.inspect}"
<ide> end
<ide> end
<ide> | 1 |
PHP | PHP | fix mariadb test failure | f3cc3c4f2a267e585f84c9c18dd9868f557c139e | <ide><path>tests/TestCase/Database/Schema/MysqlSchemaTest.php
<ide> public function testDescribeTable()
<ide> if (ConnectionManager::get('test')->getDriver()->isMariadb()) {
<ide> $expected['created_with_precision']['default'] = 'current_timestamp(3)';
<ide> $expected['created_with_precision']['comment'] = '';
<add> if (!empty($expected['collate'])) {
<add> $expected['collate'] = 'utf8mb3_general_ci';
<add> }
<ide> }
<ide>
<ide> $this->assertEquals(['id'], $result->getPrimaryKey()); | 1 |
Ruby | Ruby | add return for 1.8.7 | 665ef116ac9000d514c03fc61b216513f5cb7b25 | <ide><path>activerecord/test/cases/relation_scoping_test.rb
<ide> def test_default_scope_include_with_count
<ide>
<ide> def test_default_scope_is_threadsafe
<ide> if in_memory_db?
<del> skip "in memory db can't share a db between threads"
<add> return skip "in memory db can't share a db between threads"
<ide> end
<ide>
<ide> threads = [] | 1 |
Javascript | Javascript | make .deprecate() warn only once | 52bd0f93bb786de028463ef56ddb1e22568584b9 | <ide><path>src/node.js
<ide> NativeModule._cache[this.id] = this;
<ide> };
<ide>
<add> // Wrap a core module's method in a wrapper that will warn on first use
<add> // and then return the result of invoking the original function. After
<add> // first being called the original method is restored.
<ide> NativeModule.prototype.deprecate = function(method, message) {
<ide> var original = this.exports[method];
<ide> var self = this;
<add> var warned = false;
<add> message = message || '';
<ide>
<ide> Object.defineProperty(this.exports, method, {
<ide> enumerable: false,
<ide> value: function() {
<del> message = self.id + '.' + method + ' is deprecated. ' + (message || '');
<add> if (!warned) {
<add> warned = true;
<add> message = self.id + '.' + method + ' is deprecated. ' + message;
<ide>
<del> if ((new RegExp('\\b' + self.id + '\\b')).test(process.env.NODE_DEBUG))
<del> console.trace(message);
<del> else
<del> console.error(message);
<add> var moduleIdCheck = new RegExp('\\b' + self.id + '\\b');
<add> if (moduleIdCheck.test(process.env.NODE_DEBUG))
<add> console.trace(message);
<add> else
<add> console.error(message);
<ide>
<del> self.exports[method] = original;
<add> self.exports[method] = original;
<add> }
<ide> return original.apply(this, arguments);
<ide> }
<ide> }); | 1 |
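The patch guards the warning with a `warned` flag so the message fires once while every call, including the first, still reaches the original function. An approximate Python equivalent using a closure flag (a sketch of the pattern, not Node's actual mechanism; Python code would normally lean on the `warnings` module's own deduplication):

```python
import functools
import warnings

def deprecate(func, message=""):
    """Warn the first time `func` is called, then call straight through."""
    warned = False

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal warned
        if not warned:
            warned = True
            warnings.warn(func.__name__ + " is deprecated. " + message,
                          DeprecationWarning, stacklevel=2)
        return func(*args, **kwargs)

    return wrapper
```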
Python | Python | fix some docstring in encoder-decoder models | afb07a79ab269f5b81abbbb5fc446b20454a51a7 | <ide><path>src/transformers/models/encoder_decoder/modeling_encoder_decoder.py
<ide> general usage and behavior.
<ide>
<ide> Parameters:
<del> config (:class:`~transformers.PretrainedConfig`): Model configuration class with all the parameters of the model.
<add> config (:class:`~transformers.EncoderDecoderConfig`): Model configuration class with all the parameters of the model.
<ide> Initializing with a config file does not load the weights associated with the model, only the
<ide> configuration. Check out the :meth:`~transformers.PreTrainedModel.from_pretrained` method to load the model
<ide> weights.
<ide> @add_start_docstrings(ENCODER_DECODER_START_DOCSTRING)
<ide> class EncoderDecoderModel(PreTrainedModel):
<ide> r"""
<del> :class:`~transformers.EncoderDecoder` is a generic model class that will be instantiated as a transformer
<add> :class:`~transformers.EncoderDecoderModel` is a generic model class that will be instantiated as a transformer
<ide> architecture with one of the base model classes of the library as encoder and another one as decoder when created
<ide> with the :meth`~transformers.AutoModel.from_pretrained` class method for the encoder and
<ide> :meth`~transformers.AutoModelForCausalLM.from_pretrained` class method for the decoder.
<ide><path>src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py
<ide> general usage and behavior.
<ide>
<ide> Parameters:
<del> config (:class:`~transformers.PretrainedConfig`): Model configuration class with all the parameters of the model.
<add> config (:class:`~transformers.SpeechEncoderDecoderConfig`): Model configuration class with all the parameters of the model.
<ide> Initializing with a config file does not load the weights associated with the model, only the
<ide> configuration. Check out the :meth:`~transformers.PreTrainedModel.from_pretrained` method to load the model
<ide> weights.
<ide> @add_start_docstrings(SPEECH_ENCODER_DECODER_START_DOCSTRING)
<ide> class SpeechEncoderDecoderModel(PreTrainedModel):
<ide> r"""
<del> :class:`~transformers.EncoderDecoder` is a generic model class that will be instantiated as a transformer
<del> architecture with one of the base model classes of the library as encoder and another one as decoder when created
<del> with the :meth`~transformers.AutoModel.from_pretrained` class method for the encoder and
<add> :class:`~transformers.SpeechEncoderDecoderModel` is a generic model class that will be instantiated as a
<add> transformer architecture with one of the base model classes of the library as encoder and another one as decoder
<add> when created with the :meth`~transformers.AutoModel.from_pretrained` class method for the encoder and
<ide> :meth`~transformers.AutoModelForCausalLM.from_pretrained` class method for the decoder.
<ide> """
<ide> config_class = SpeechEncoderDecoderConfig | 2 |
Ruby | Ruby | pass arguments instead of reopening pathname | ccd31a2dd26ce5db8f247f8261e808702d061454 | <ide><path>Library/Homebrew/cmd/tap.rb
<ide> def link_tap_formula formulae
<ide> to.make_relative_symlink(from)
<ide> rescue SystemCallError
<ide> to = to.realpath if to.exist?
<del> opoo "Could not tap #{Tty.white}#{from.tap_ref}#{Tty.reset} over #{Tty.white}#{to.tap_ref}#{Tty.reset}"
<add> opoo "Could not tap #{Tty.white}#{tap_ref(from)}#{Tty.reset} over #{Tty.white}#{tap_ref(to)}#{Tty.reset}"
<ide> else
<ide> ignores << formula.basename.to_s
<ide> tapped += 1
<ide> def private_tap?(user, repo)
<ide> rescue GitHub::Error
<ide> false
<ide> end
<del>end
<del>
<ide>
<del>class Pathname
<del> def tap_ref
<del> case self.to_s
<add> def tap_ref(path)
<add> case path.to_s
<ide> when %r{^#{HOMEBREW_LIBRARY}/Taps/([\w_-]+)/([\w_-]+)/(.+)}
<ide> "#$1/#$2/#{File.basename($3, '.rb')}"
<ide> when %r{^#{HOMEBREW_LIBRARY}/Formula/(.+)} | 1 |
Javascript | Javascript | render links below nodes and labels | dfd0ad354dbdae201a1c5fa775c73581774e7e94 | <ide><path>examples/tree/tree.js
<ide> var vis = d3.select("body").append("svg:svg")
<ide> .attr("transform", "translate(40, 100)");
<ide>
<ide> d3.json("flare.json", function(json) {
<del> var node = vis.data(d3.entries(json)).selectAll("g.node")
<del> .data(tree);
<del> var nodeEnter = node.enter().append("svg:g")
<add> var nodes = tree(d3.entries(json)[0]);
<add>
<add> var link = vis.selectAll("g.link")
<add> .data(nodes)
<add> .enter().append("svg:g")
<add> .attr("class", "link");
<add>
<add> link.selectAll("line")
<add> .data(children)
<add> .enter().append("svg:line")
<add> .attr("x1", function(d) { return d.parent.x; })
<add> .attr("y1", function(d) { return d.parent.y; })
<add> .attr("x2", function(d) { return d.child.x; })
<add> .attr("y2", function(d) { return d.child.y; });
<add>
<add> var node = vis.selectAll("g.node")
<add> .data(nodes)
<add> .enter().append("svg:g")
<ide> .attr("class", "node")
<ide> .attr("transform", function(d) { return "translate(" + d.x + "," + d.y + ")"; })
<ide>
<del> nodeEnter.selectAll("line.link")
<del> .data(function(d) {
<del> return (d.children || []).map(function(c) {
<del> return {x: c.x-d.x, y: c.y-d.y};
<del> });
<del> })
<del> .enter().append("svg:line")
<del> .attr("class", "link")
<del> .attr("x1", 0).attr("y1", 0)
<del> .attr("x2", function(d) { return d.x; })
<del> .attr("y2", function(d) { return d.y; });
<del> nodeEnter.append("svg:circle")
<add> node.append("svg:circle")
<ide> .attr("class", "node")
<ide> .attr("r", 5);
<del> nodeEnter.append("svg:text")
<add>
<add> node.append("svg:text")
<ide> .attr("dx", function(d) { return d.children ? -8 : 8; })
<ide> .attr("dy", 3)
<ide> .attr("text-anchor", function(d) { return d.children ? "end" : "start"; })
<ide> .text(function(d) { return d.data.key; });
<ide>
<add> // Returns parent+child objects for any children of `d`.
<add> function children(d) {
<add> return (d.children || []).map(function(v) {
<add> return {
<add> parent: d,
<add> child: v
<add> };
<add> });
<add> }
<ide> }); | 1 |
Javascript | Javascript | adjust clipboard mock | 2aba3522ab7dfe113ec26131b747f7f6f0486194 | <ide><path>jest/setup.js
<ide> jest
<ide> getRecommendedTimeoutMillis: jest.fn(),
<ide> },
<ide> }))
<add> .mock('../Libraries/Components/Clipboard/Clipboard', () => ({
<add> getString: jest.fn(() => ''),
<add> setString: jest.fn(),
<add> }))
<ide> .mock('../Libraries/Components/RefreshControl/RefreshControl', () =>
<ide> jest.requireActual(
<ide> '../Libraries/Components/RefreshControl/__mocks__/RefreshControlMock',
<ide> jest
<ide> process.nextTick(() => callback(null, [])),
<ide> ),
<ide> },
<del> Clipboard: {
<del> getString: jest.fn(() => ''),
<del> setString: jest.fn(),
<del> },
<ide> DeviceInfo: {
<ide> getConstants() {
<ide> return { | 1 |
Go | Go | add test-case for legacy format | 346a438da82dec69874e226b49d6419150c456a9 | <ide><path>opts/opts_test.go
<ide> func TestNamedMapOpts(t *testing.T) {
<ide> }
<ide>
<ide> func TestParseLink(t *testing.T) {
<del> name, alias, err := ParseLink("name:alias")
<del> if err != nil {
<del> t.Fatalf("Expected not to error out on a valid name:alias format but got: %v", err)
<del> }
<del> if name != "name" {
<del> t.Fatalf("Link name should have been name, got %s instead", name)
<del> }
<del> if alias != "alias" {
<del> t.Fatalf("Link alias should have been alias, got %s instead", alias)
<del> }
<del> // short format definition
<del> name, alias, err = ParseLink("name")
<del> if err != nil {
<del> t.Fatalf("Expected not to error out on a valid name only format but got: %v", err)
<del> }
<del> if name != "name" {
<del> t.Fatalf("Link name should have been name, got %s instead", name)
<del> }
<del> if alias != "name" {
<del> t.Fatalf("Link alias should have been name, got %s instead", alias)
<del> }
<del> // empty string link definition is not allowed
<del> if _, _, err := ParseLink(""); err == nil || !strings.Contains(err.Error(), "empty string specified for links") {
<del> t.Fatalf("Expected error 'empty string specified for links' but got: %v", err)
<del> }
<del> // more than two colons are not allowed
<del> if _, _, err := ParseLink("link:alias:wrong"); err == nil || !strings.Contains(err.Error(), "bad format for links: link:alias:wrong") {
<del> t.Fatalf("Expected error 'bad format for links: link:alias:wrong' but got: %v", err)
<del> }
<add> t.Run("name and alias", func(t *testing.T) {
<add> name, alias, err := ParseLink("name:alias")
<add> assert.Check(t, err)
<add> assert.Check(t, is.Equal(name, "name"))
<add> assert.Check(t, is.Equal(alias, "alias"))
<add> })
<add> t.Run("short format", func(t *testing.T) {
<add> name, alias, err := ParseLink("name")
<add> assert.Check(t, err)
<add> assert.Check(t, is.Equal(name, "name"))
<add> assert.Check(t, is.Equal(alias, "name"))
<add> })
<add> t.Run("empty string", func(t *testing.T) {
<add> _, _, err := ParseLink("")
<add> assert.Check(t, is.Error(err, "empty string specified for links"))
<add> })
<add> t.Run("more than two colons", func(t *testing.T) {
<add> _, _, err := ParseLink("link:alias:wrong")
<add> assert.Check(t, is.Error(err, "bad format for links: link:alias:wrong"))
<add> })
<add> t.Run("legacy format", func(t *testing.T) {
<add> name, alias, err := ParseLink("/foo:/c1/bar")
<add> assert.Check(t, err)
<add> assert.Check(t, is.Equal(name, "foo"))
<add> assert.Check(t, is.Equal(alias, "bar"))
<add> })
<ide> } | 1 |
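The Go table test pins down five parsing cases for link specs: `name:alias`, bare `name`, the empty string, too many colons, and the legacy `/foo:/c1/bar` form. A hedged Python sketch of a parser satisfying the same cases (an approximation of the behavior the test describes, not a port of docker's actual `ParseLink`):

```python
def parse_link(val):
    """Parse a container link spec into (name, alias)."""
    if not val:
        raise ValueError("empty string specified for links")
    parts = val.split(":")
    if len(parts) > 2:
        raise ValueError("bad format for links: " + val)
    if len(parts) == 1:
        parts = [parts[0], parts[0]]  # short format: alias defaults to the name
    name, alias = parts
    # legacy format: drop the leading "/" on the name and any parent path
    # on the alias, so "/foo:/c1/bar" becomes ("foo", "bar")
    name = name.lstrip("/")
    alias = alias.rsplit("/", 1)[-1]
    return name, alias
```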
Python | Python | add explanation for comments=none in loadtxt | 406ccc9c574555ebdbda3de6abfc44a833f523e7 | <ide><path>numpy/lib/npyio.py
<ide> def loadtxt(fname, dtype=float, comments='#', delimiter=None,
<ide> the data-type.
<ide> comments : str or sequence of str, optional
<ide> The characters or list of characters used to indicate the start of a
<del> comment. For backwards compatibility, byte strings will be decoded as
<del> 'latin1'. The default is '#'.
<add> comment. None implies no comments. For backwards compatibility, byte
<add> strings will be decoded as 'latin1'. The default is '#'.
<ide> delimiter : str, optional
<ide> The string used to separate values. For backwards compatibility, byte
<ide> strings will be decoded as 'latin1'. The default is whitespace. | 1 |
Python | Python | remove unused import | 48589f523d603e332909ac34e1d12bdfcddb0acb | <ide><path>libcloud/__init__.py
<ide> def enable_debug(fo):
<ide> @param fo: Where to append debugging information
<ide> @type fo: File like object, only write operations are used.
<ide> """
<del> import httplib
<ide> from libcloud.base import ConnectionKey, LoggingHTTPConnection, LoggingHTTPSConnection
<ide> LoggingHTTPSConnection.log = fo
<ide> LoggingHTTPConnection.log = fo | 1 |
Java | Java | fix regressions from 8f558e7 | 71683c5f8db19e3795f27d0c7ad7b4dbaa3083fe | <ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/handler/AbstractHandlerMethodMapping.java
<ide> import java.util.Collection;
<ide> import java.util.Collections;
<ide> import java.util.Comparator;
<add>import java.util.HashMap;
<ide> import java.util.IdentityHashMap;
<ide> import java.util.LinkedHashMap;
<ide> import java.util.List;
<ide> public boolean matches(Method method) {
<ide> protected abstract T getMappingForMethod(Method method, Class<?> handlerType);
<ide>
<ide> /**
<del> * Register a handler method and its unique mapping.
<del> * <p>Invoked at startup for each detected handler method. May also be
<del> * invoked at runtime after initialization is complete.
<add> * Register a handler method and its unique mapping. Invoked at startup for
<add> * each detected handler method.
<ide> * @param handler the bean name of the handler or the handler instance
<ide> * @param method the method to register
<ide> * @param mapping the mapping conditions associated with the handler method
<ide> * @throws IllegalStateException if another method was already registered
<ide> * under the same mapping
<add> * @deprecated as of 4.2 you can invoke the public methods
<add> * {@link #registerMapping(Object, Object, Method)} and
<add> * {@link #unregisterMapping(Object)} during initialization or at runtime,
<add> * i.e. after initialization is complete.
<ide> */
<del> public void registerHandlerMethod(Object handler, Method method, T mapping) {
<del> this.mappingRegistry.register(handler, method, mapping);
<del> }
<del>
<del> /**
<del> * Un-register a handler method.
<del> * <p>This method may be invoked at runtime after initialization has completed.
<del> * @param handlerMethod the handler method to be unregistered
<del> */
<del> public void unregisterHandlerMethod(HandlerMethod handlerMethod) {
<del> this.mappingRegistry.unregister(handlerMethod);
<add> @Deprecated
<add> protected void registerHandlerMethod(Object handler, Method method, T mapping) {
<add> this.mappingRegistry.register(mapping, handler, method);
<ide> }
<ide>
<ide> /**
<ide> protected CorsConfiguration initCorsConfiguration(Object handler, Method method,
<ide> protected void handlerMethodsInitialized(Map<T, HandlerMethod> handlerMethods) {
<ide> }
<ide>
<add> /**
<add> * Register the given mapping.
<add> * <p>This method may be invoked at runtime after initialization has completed.
<add> * @param mapping the mapping for the handler method
<add> * @param handler the handler
<add> * @param method the method
<add> */
<add> public void registerMapping(T mapping, Object handler, Method method) {
<add> this.mappingRegistry.register(mapping, handler, method);
<add> }
<add>
<add> /**
<add> * Un-register the given mapping.
<add> * <p>This method may be invoked at runtime after initialization has completed.
<add> * @param mapping the mapping to unregister
<add> */
<add> public void unregisterMapping(T mapping) {
<add> this.mappingRegistry.unregister(mapping);
<add> }
<ide>
<ide> /**
<ide> * Look up a handler method for the given request.
<ide> private class MappingRegistry {
<ide> private final Map<String, List<HandlerMethod>> mappingNameLookup =
<ide> new ConcurrentHashMap<String, List<HandlerMethod>>();
<ide>
<del> private final Map<Method, MappingDefinition<T>> methodLookup =
<del> new ConcurrentHashMap<Method, MappingDefinition<T>>();
<add> private final Map<T, MappingDefinition<T>> definitionMap = new HashMap<T, MappingDefinition<T>>();
<add>
<add> private final Map<Method, CorsConfiguration> corsLookup =
<add> new ConcurrentHashMap<Method, CorsConfiguration>();
<ide>
<ide>
<ide> private final ReentrantReadWriteLock readWriteLock = new ReentrantReadWriteLock();
<ide> public List<HandlerMethod> getHandlerMethodsByMappingName(String mappingName) {
<ide> */
<ide> public CorsConfiguration getCorsConfiguration(HandlerMethod handlerMethod) {
<ide> Method method = handlerMethod.getMethod();
<del> MappingDefinition<T> definition = this.methodLookup.get(method);
<del> return (definition != null ? definition.getCorsConfiguration() : null);
<add> return this.corsLookup.get(method);
<ide> }
<ide>
<ide> /**
<ide> public void releaseReadLock() {
<ide> }
<ide>
<ide>
<del> public void register(Object handler, Method method, T mapping) {
<add> public void register(T mapping, Object handler, Method method) {
<ide>
<ide> this.readWriteLock.writeLock().lock();
<ide> try {
<ide> public void register(Object handler, Method method, T mapping) {
<ide> }
<ide>
<ide> CorsConfiguration corsConfig = initCorsConfiguration(handler, method, mapping);
<add> if (corsConfig != null) {
<add> this.corsLookup.put(method, corsConfig);
<add> }
<ide>
<del> this.methodLookup.put(method,
<add> this.definitionMap.put(mapping,
<ide> new MappingDefinition<T>(mapping, handlerMethod, directUrls, name, corsConfig));
<ide> }
<ide> finally {
<ide> private void assertUniqueMethodMapping(HandlerMethod newHandlerMethod, T mapping
<ide> newHandlerMethod + "\nto " + mapping + ": There is already '" +
<ide> handlerMethod.getBean() + "' bean method\n" + handlerMethod + " mapped.");
<ide> }
<del> MappingDefinition<T> definition = this.methodLookup.get(newHandlerMethod.getMethod());
<del> if (definition != null) {
<del> throw new IllegalStateException("Cannot map " + newHandlerMethod.getMethod() +
<del> "\nto " + mapping + ".\n It is already mapped to " + definition.getMapping());
<del> }
<ide> }
<ide>
<ide> private List<String> getDirectUrls(T mapping) {
<ide> private void addMappingName(String name, HandlerMethod handlerMethod) {
<ide> }
<ide> }
<ide>
<del> public void unregister(HandlerMethod handlerMethod) {
<add> public void unregister(T mapping) {
<ide> this.readWriteLock.writeLock().lock();
<ide> try {
<del> MappingDefinition<T> definition = this.methodLookup.remove(handlerMethod.getMethod());
<add> MappingDefinition<T> definition = this.definitionMap.remove(mapping);
<ide> if (definition == null) {
<ide> return;
<ide> }
<ide> public void unregister(HandlerMethod handlerMethod) {
<ide> }
<ide>
<ide> removeMappingName(definition);
<add>
<add> this.corsLookup.remove(definition.getHandlerMethod().getMethod());
<ide> }
<ide> finally {
<ide> this.readWriteLock.writeLock().unlock();
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/handler/HandlerMethodMappingTests.java
<ide> public void setUp() throws Exception {
<ide>
<ide> @Test(expected = IllegalStateException.class)
<ide> public void registerDuplicates() {
<del> mapping.registerHandlerMethod(handler, method1, "foo");
<del> mapping.registerHandlerMethod(handler, method2, "foo");
<add> mapping.registerMapping("foo", handler, method1);
<add> mapping.registerMapping("foo", handler, method2);
<ide> }
<ide>
<ide> @Test
<ide> public void directMatch() throws Exception {
<ide> String key = "foo";
<del> mapping.registerHandlerMethod(handler, method1, key);
<add> mapping.registerMapping(key, handler, method1);
<ide>
<ide> HandlerMethod result = mapping.getHandlerInternal(new MockHttpServletRequest("GET", key));
<ide> assertEquals(method1, result.getMethod());
<ide> }
<ide>
<ide> @Test
<ide> public void patternMatch() throws Exception {
<del> mapping.registerHandlerMethod(handler, method1, "/fo*");
<del> mapping.registerHandlerMethod(handler, method2, "/f*");
<add> mapping.registerMapping("/fo*", handler, method1);
<add> mapping.registerMapping("/f*", handler, method2);
<ide>
<ide> HandlerMethod result = mapping.getHandlerInternal(new MockHttpServletRequest("GET", "/foo"));
<ide> assertEquals(method1, result.getMethod());
<ide> }
<ide>
<ide> @Test(expected = IllegalStateException.class)
<ide> public void ambiguousMatch() throws Exception {
<del> mapping.registerHandlerMethod(handler, method1, "/f?o");
<del> mapping.registerHandlerMethod(handler, method2, "/fo?");
<add> mapping.registerMapping("/f?o", handler, method1);
<add> mapping.registerMapping("/fo?", handler, method2);
<ide>
<ide> mapping.getHandlerInternal(new MockHttpServletRequest("GET", "/foo"));
<ide> }
<ide> public void detectHandlerMethodsInAncestorContexts() {
<ide> @Test
<ide> public void unregister() throws Exception {
<ide> String key = "foo";
<del> mapping.registerHandlerMethod(handler, method1, key);
<add> mapping.registerMapping(key, handler, method1);
<ide>
<ide> HandlerMethod handlerMethod = mapping.getHandlerInternal(new MockHttpServletRequest("GET", key));
<ide> assertEquals(method1, handlerMethod.getMethod());
<ide>
<del> mapping.unregisterHandlerMethod(handlerMethod);
<add> mapping.unregisterMapping(key);
<ide> assertNull(mapping.getHandlerInternal(new MockHttpServletRequest("GET", key)));
<ide> }
<ide>
<ide> protected Set<String> getMappingPathPatterns(String key) {
<ide> }
<ide> }
<ide>
<add> @SuppressWarnings("unused")
<ide> @Controller
<ide> static class MyHandler {
<ide> | 2 |
Javascript | Javascript | fix multichild tests so they work with fiber | c3310d3e326d241ec42e3db34586ce3d01cf554c | <ide><path>src/renderers/shared/stack/reconciler/__tests__/ReactMultiChildReconcile-test.js
<ide>
<ide> var React = require('React');
<ide> var ReactDOM = require('ReactDOM');
<del>var ReactDOMComponentTree = require('ReactDOMComponentTree');
<del>var ReactInstanceMap = require('ReactInstanceMap');
<ide>
<ide> var stripEmptyValues = function(obj) {
<ide> var ret = {};
<del> var name;
<del> for (name in obj) {
<add> for (var name in obj) {
<ide> if (!obj.hasOwnProperty(name)) {
<ide> continue;
<ide> }
<ide> var stripEmptyValues = function(obj) {
<ide> return ret;
<ide> };
<ide>
<del>/**
<del> * Child key names are wrapped like '.$key:0'. We strip the extra chars out
<del> * here. This relies on an implementation detail of the rendering system.
<del> */
<del>var getOriginalKey = function(childName) {
<del> var match = childName.match(/^\.\$([^.]+)$/);
<del> expect(match).not.toBeNull();
<del> return match[1];
<del>};
<add>var idCounter = 123;
<ide>
<ide> /**
<ide> * Contains internal static internal state in order to test that updates to
<ide> * existing children won't reinitialize components, when moving children -
<ide> * reusing existing DOM/memory resources.
<ide> */
<ide> class StatusDisplay extends React.Component {
<del> state = {internalState: Math.random()};
<add> state = {internalState: idCounter++};
<ide>
<del> getStatus = () => {
<add> getStatus() {
<ide> return this.props.status;
<del> };
<add> }
<ide>
<del> getInternalState = () => {
<add> getInternalState() {
<ide> return this.state.internalState;
<del> };
<add> }
<ide>
<ide> componentDidMount() {
<ide> this.props.onFlush();
<ide> class StatusDisplay extends React.Component {
<ide> render() {
<ide> return (
<ide> <div>
<del> {this.state.internalState}
<add> {this.props.contentKey}
<ide> </div>
<ide> );
<ide> }
<ide> class FriendsStatusDisplay extends React.Component {
<ide> * Refs are not maintained in the rendered order, and neither is
<ide> * `this._renderedChildren` (surprisingly).
<ide> */
<del> getOriginalKeys = () => {
<add> getOriginalKeys() {
<ide> var originalKeys = [];
<del> // TODO: Update this to a better test that doesn't rely so much on internal
<del> // implementation details.
<del> var statusDisplays =
<del> ReactInstanceMap.get(this)
<del> ._renderedComponent
<del> ._renderedChildren;
<del> var name;
<del> for (name in statusDisplays) {
<del> var child = statusDisplays[name];
<del> var isPresent = !!child;
<del> if (isPresent) {
<del> originalKeys[child._mountIndex] = getOriginalKey(name);
<add> for (var key in this.props.usernameToStatus) {
<add> if (this.props.usernameToStatus[key]) {
<add> originalKeys.push(key);
<ide> }
<ide> }
<ide> return originalKeys;
<del> };
<add> }
<ide>
<ide> /**
<ide> * Retrieves the rendered children in a nice format for comparing to the input
<ide> * `this.props.usernameToStatus`.
<ide> */
<del> getStatusDisplays = () => {
<add> getStatusDisplays() {
<ide> var res = {};
<del> var i;
<ide> var originalKeys = this.getOriginalKeys();
<del> for (i = 0; i < originalKeys.length; i++) {
<add> for (var i = 0; i < originalKeys.length; i++) {
<ide> var key = originalKeys[i];
<ide> res[key] = this.refs[key];
<ide> }
<ide> return res;
<del> };
<add> }
<ide>
<ide> /**
<ide> * Verifies that by the time a child is flushed, the refs that appeared
<ide> class FriendsStatusDisplay extends React.Component {
<ide> * but our internal layer API depends on this assumption. We need to change
<ide> * it to be more declarative before making ref resolution indeterministic.
<ide> */
<del> verifyPreviousRefsResolved = (flushedKey) => {
<del> var i;
<add> verifyPreviousRefsResolved(flushedKey) {
<ide> var originalKeys = this.getOriginalKeys();
<del> for (i = 0; i < originalKeys.length; i++) {
<add> for (var i = 0; i < originalKeys.length; i++) {
<ide> var key = originalKeys[i];
<ide> if (key === flushedKey) {
<ide> // We are only interested in children up to the current key.
<ide> return;
<ide> }
<ide> expect(this.refs[key]).toBeTruthy();
<ide> }
<del> };
<add> }
<ide>
<ide> render() {
<ide> var children = [];
<del> var key;
<del> for (key in this.props.usernameToStatus) {
<add> for (var key in this.props.usernameToStatus) {
<ide> var status = this.props.usernameToStatus[key];
<ide> children.push(
<ide> !status ? null :
<ide> <StatusDisplay
<ide> key={key}
<ide> ref={key}
<add> contentKey={key}
<ide> onFlush={this.verifyPreviousRefsResolved.bind(this, key)}
<ide> status={status}
<ide> />
<ide> function verifyStatesPreserved(lastInternalStates, statusDisplays) {
<ide> * Verifies that the internal representation of a set of `renderedChildren`
<ide> * accurately reflects what is in the DOM.
<ide> */
<del>function verifyDomOrderingAccurate(parentInstance, statusDisplays) {
<del> var containerNode = ReactDOM.findDOMNode(parentInstance);
<add>function verifyDomOrderingAccurate(outerContainer, statusDisplays) {
<add> var containerNode = outerContainer.firstChild;
<ide> var statusDisplayNodes = containerNode.childNodes;
<del> var i;
<del> var orderedDomIDs = [];
<del> for (i = 0; i < statusDisplayNodes.length; i++) {
<del> var inst = ReactDOMComponentTree.getInstanceFromNode(statusDisplayNodes[i]);
<del> orderedDomIDs.push(inst._rootNodeID);
<add> var orderedDomKeys = [];
<add> for (var i = 0; i < statusDisplayNodes.length; i++) {
<add> var contentKey = statusDisplayNodes[i].textContent;
<add> orderedDomKeys.push(contentKey);
<ide> }
<ide>
<del> var orderedLogicalIDs = [];
<add> var orderedLogicalKeys = [];
<ide> var username;
<ide> for (username in statusDisplays) {
<ide> if (!statusDisplays.hasOwnProperty(username)) {
<ide> continue;
<ide> }
<ide> var statusDisplay = statusDisplays[username];
<del> orderedLogicalIDs.push(
<del> ReactInstanceMap.get(statusDisplay)._renderedComponent._rootNodeID
<add> orderedLogicalKeys.push(
<add> statusDisplay.props.contentKey
<ide> );
<ide> }
<del> expect(orderedDomIDs).toEqual(orderedLogicalIDs);
<add> expect(orderedDomKeys).toEqual(orderedLogicalKeys);
<ide> }
<ide>
<ide> /**
<ide> * Todo: Check that internal state is preserved across transitions
<ide> */
<ide> function testPropsSequence(sequence) {
<del> var i;
<ide> var container = document.createElement('div');
<ide> var parentInstance = ReactDOM.render(
<ide> <FriendsStatusDisplay {...sequence[0]} />,
<ide> function testPropsSequence(sequence) {
<ide> var lastInternalStates = getInternalStateByUserName(statusDisplays);
<ide> verifyStatuses(statusDisplays, sequence[0]);
<ide>
<del> for (i = 1; i < sequence.length; i++) {
<add> for (var i = 1; i < sequence.length; i++) {
<ide> ReactDOM.render(
<ide> <FriendsStatusDisplay {...sequence[i]} />,
<ide> container
<ide> );
<ide> statusDisplays = parentInstance.getStatusDisplays();
<ide> verifyStatuses(statusDisplays, sequence[i]);
<ide> verifyStatesPreserved(lastInternalStates, statusDisplays);
<del> verifyDomOrderingAccurate(parentInstance, statusDisplays);
<add> verifyDomOrderingAccurate(container, statusDisplays);
<ide>
<ide> lastInternalStates = getInternalStateByUserName(statusDisplays);
<ide> }
<ide><path>src/renderers/shared/stack/reconciler/__tests__/ReactMultiChildText-test.js
<ide> var testAllPermutations = function(testCases) {
<ide> var expectedResultAfterUpdate = testCases[j + 1];
<ide>
<ide> var container = document.createElement('div');
<del> var d = ReactDOM.render(<div>{renderWithChildren}</div>, container);
<del> expectChildren(d, expectedResultAfterRender);
<add> ReactDOM.render(<div>{renderWithChildren}</div>, container);
<add> expectChildren(container, expectedResultAfterRender);
<ide>
<del> d = ReactDOM.render(<div>{updateWithChildren}</div>, container);
<del> expectChildren(d, expectedResultAfterUpdate);
<add> ReactDOM.render(<div>{updateWithChildren}</div>, container);
<add> expectChildren(container, expectedResultAfterUpdate);
<ide> }
<ide> }
<ide> };
<ide>
<del>var expectChildren = function(d, children) {
<del> var outerNode = ReactDOM.findDOMNode(d);
<add>var expectChildren = function(container, children) {
<add> var outerNode = container.firstChild;
<ide> var textNode;
<ide> if (typeof children === 'string') {
<ide> textNode = outerNode.firstChild; | 2 |
PHP | PHP | wrap long lines | ad3dc5a3414cbbef533230a4539683be27bb2de4 | <ide><path>tests/TestCase/Routing/RouteCollectionTest.php
<ide> public function testParseRequestMissingRoute()
<ide> public function testParseRequestCheckHostCondition()
<ide> {
<ide> $routes = new RouteBuilder($this->collection, '/');
<del> $routes->connect('/fallback', ['controller' => 'Articles', 'action' => 'index'], ['_host' => '*.example.com']);
<add> $routes->connect(
<add> '/fallback',
<add> ['controller' => 'Articles', 'action' => 'index'],
<add> ['_host' => '*.example.com']
<add> );
<ide>
<ide> $request = new ServerRequest([
<ide> 'environment' => [
<ide> public static function hostProvider()
<ide> public function testParseRequestCheckHostConditionFail($host)
<ide> {
<ide> $routes = new RouteBuilder($this->collection, '/');
<del> $routes->connect('/fallback', ['controller' => 'Articles', 'action' => 'index'], ['_host' => '*.example.com']);
<add> $routes->connect(
<add> '/fallback',
<add> ['controller' => 'Articles', 'action' => 'index'],
<add> ['_host' => '*.example.com']
<add> );
<ide>
<ide> $request = new ServerRequest([
<ide> 'environment' => [ | 1 |
PHP | PHP | fix cs errors | 78284090bca2e27e5cb1e8fc0050b91c00302da2 | <ide><path>src/Http/Client/Adapter/Curl.php
<ide> namespace Cake\Http\Client\Adapter;
<ide>
<ide> use Cake\Http\Client\AdapterInterface;
<del>use Cake\Http\Client\Exception\RequestException;
<ide> use Cake\Http\Client\Exception\NetworkException;
<add>use Cake\Http\Client\Exception\RequestException;
<ide> use Cake\Http\Client\Request;
<ide> use Cake\Http\Client\Response;
<ide> use Composer\CaBundle\CaBundle;
<ide><path>src/Http/Client/Adapter/Stream.php
<ide> */
<ide> namespace Cake\Http\Client\Adapter;
<ide>
<del>use Cake\Core\Exception\Exception;
<del>use Cake\Http\Client\Exception\RequestException;
<del>use Cake\Http\Client\Exception\NetworkException;
<ide> use Cake\Http\Client\AdapterInterface;
<add>use Cake\Http\Client\Exception\NetworkException;
<add>use Cake\Http\Client\Exception\RequestException;
<ide> use Cake\Http\Client\Response;
<ide> use Composer\CaBundle\CaBundle;
<ide> use Psr\Http\Message\RequestInterface;
<ide><path>src/Http/Client/Exception/NetworkException.php
<ide> * @since 4.0.0
<ide> * @license https://opensource.org/licenses/mit-license.php MIT License
<ide> */
<del>namespace Cake\Http\Client\Exception;
<add>namespace Cake\Http\Client\Exception;
<ide>
<ide> use Psr\Http\Client\NetworkExceptionInterface;
<ide>
<ide><path>src/Http/Client/Exception/RequestException.php
<ide> * @since 4.0.0
<ide> * @license https://opensource.org/licenses/mit-license.php MIT License
<ide> */
<del>namespace Cake\Http\Client\Exception;
<add>namespace Cake\Http\Client\Exception;
<ide>
<ide> use Exception;
<ide> use Psr\Http\Client\RequestExceptionInterface;
<ide> class RequestException extends RuntimeException implements RequestExceptionInter
<ide> protected $request;
<ide>
<ide> /**
<del> * @param string $message
<del> * @param \Psr\Http\Message\RequestInterface $request
<add> * Constructor.
<ide> *
<del> * @param \Exception|null $previous
<add> * @param string $message Exception message.
<add> * @param \Psr\Http\Message\RequestInterface $request Request instance.
<add> * @param \Exception|null $previous Previous Exception
<ide> */
<del> public function __construct($message, RequestInterface $request, Exception $previous = null)
<add> public function __construct($message, RequestInterface $request, ?Exception $previous = null)
<ide> {
<ide> $this->request = $request;
<ide> parent::__construct($message, 0, $previous); | 4 |
Python | Python | simplify the test | 8560d2077873049a06e5fe62899d2197584c9d43 | <ide><path>libcloud/test/compute/test_openstack.py
<ide> def test_auth_host_passed(self):
<ide> ex_force_auth_url='http://x.y.z.y:5000',
<ide> ex_tenant_name='admin')
<ide> self.assertEqual(d._ex_force_auth_url, forced_auth)
<add>
<ide> with requests_mock.Mocker() as mock:
<del> body1 = ComputeFileFixtures('openstack').load('v1_slug_servers_ips.xml')
<ide> body2 = ComputeFileFixtures('openstack').load('_v2_0__auth.json')
<ide>
<del> mock.register_uri('GET', 'https://test_endpoint.com/v2/1337/servers/test/ips', text=body1,
<del> headers={'content-type': 'application/xml; charset=UTF-8'})
<ide> mock.register_uri('POST', 'http://x.y.z.y:5000/v2.0/tokens', text=body2,
<ide> headers={'content-type': 'application/json; charset=UTF-8'})
<del> d.ex_list_ip_addresses('test')
<add> d.connection._populate_hosts_and_request_paths()
<ide> self.assertEqual(d.connection.host, 'test_endpoint.com')
<ide>
<ide> | 1 |
Javascript | Javascript | reduce user saves, provide callbacks | 17829b4bd33ec44d8ffe53c24a31e18b4884010a | <ide><path>controllers/challenge.js
<ide> exports.returnCurrentChallenge = function(req, res, next) {
<ide> return elem;
<ide> }
<ide> });
<del> req.user.save();
<ide> if (!req.user.currentChallenge) {
<ide> req.user.currentChallenge = {};
<ide> req.user.currentChallenge.challengeId = challengeMapWithIds['0'][0];
<ide> req.user.currentChallenge.challengeName = challengeMapWithNames['0'][0];
<ide> req.user.currentChallenge.challengeBlock = '0';
<del> req.user.save();
<del> return res.redirect('../challenges/learn-how-free-code-camp-works');
<add> req.user.save(function(err) {
<add> if (err) {
<add> return next(err);
<add> }
<add> return res.redirect('../challenges/learn-how-free-code-camp-works');
<add> });
<ide> }
<ide> var nameString = req.user.currentChallenge.challengeName.trim()
<ide> .toLowerCase()
<ide> .replace(/\s/g, '-');
<del> return res.redirect('../challenges/' + nameString);
<add> req.user.save(function(err) {
<add> if (err) {
<add> return next(err);
<add> }
<add> return res.redirect('../challenges/' + nameString);
<add> });
<ide> };
<ide>
<ide> exports.returnIndividualChallenge = function(req, res, next) {
<ide> exports.returnIndividualChallenge = function(req, res, next) {
<ide> ))
<ide> };
<ide> }
<del> req.user.save();
<ide>
<ide> var challengeType = {
<ide> 0: function() {
<ide> exports.returnIndividualChallenge = function(req, res, next) {
<ide> });
<ide> }
<ide> };
<del>
<del> return challengeType[challenge.challengeType]();
<del>
<add> req.user.save(function(err) {
<add> if (err) {
<add> return next(err);
<add> }
<add> return challengeType[challenge.challengeType]();
<add> });
<ide> });
<ide> };
<ide> | 1 |
PHP | PHP | improve test cases | ba61d35ded603bed60d1516895e06ee30eeb72fc | <ide><path>tests/TestCase/Validation/ValidatorTest.php
<ide> public function testAllowEmptyArray()
<ide> $data = [
<ide> 'items' => [1, 2, 3, 4, 5],
<ide> ];
<add> $expected = [
<add> 'items' => [
<add> 'hasAtMost' => 'The provided value is invalid',
<add> ],
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $validator = new Validator();
<ide> $validator->allowEmptyArray('items', 'update', 'message');
<ide> public function testAllowEmptyFile()
<ide> 'error' => UPLOAD_ERR_FORM_SIZE,
<ide> ]
<ide> ];
<add> $expected = [
<add> 'photo' => [
<add> 'uploadedFile' => 'The provided value is invalid'
<add> ]
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $data = [
<ide> 'photo' => '',
<ide> ];
<add> $expected = [
<add> 'photo' => [
<add> 'uploadedFile' => 'The provided value is invalid'
<add> ]
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $data = [
<ide> 'photo' => [],
<ide> ];
<ide> $result = $validator->errors($data);
<add> $expected = [
<add> 'photo' => [
<add> 'uploadedFile' => 'The provided value is invalid'
<add> ]
<add> ];
<add> $this->assertSame($expected, $result);
<ide> $this->assertNotEmpty($result);
<ide>
<ide> $validator = new Validator();
<ide> public function testAllowEmptyDate()
<ide> $data = [
<ide> 'date' => [],
<ide> ];
<add> $expected = [
<add> 'date' => [
<add> 'date' => 'The provided value is invalid'
<add> ]
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $validator = new Validator();
<ide> $validator->allowEmptyArray('date', 'update', 'message');
<ide> public function testAllowEmptyTime()
<ide> $data = [
<ide> 'time' => [],
<ide> ];
<add> $expected = [
<add> 'time' => [
<add> 'time' => 'The provided value is invalid'
<add> ]
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $validator = new Validator();
<ide> $validator->allowEmptyArray('time', 'update', 'message');
<ide> public function testAllowEmptyDateTime()
<ide> {
<ide> $validator = new Validator();
<ide> $validator->allowEmptyDate('published')
<del> ->time('published');
<add> ->dateTime('published');
<ide>
<ide> $this->assertTrue($validator->field('published')->isEmptyAllowed());
<ide>
<ide> public function testAllowEmptyDateTime()
<ide> $data = [
<ide> 'published' => [],
<ide> ];
<add> $expected = [
<add> 'published' => [
<add> 'dateTime' => 'The provided value is invalid'
<add> ]
<add> ];
<ide> $result = $validator->errors($data);
<del> $this->assertNotEmpty($result);
<add> $this->assertSame($expected, $result);
<ide>
<ide> $validator = new Validator();
<ide> $validator->allowEmptyArray('published', 'update', 'message'); | 1 |
Text | Text | add 2.10.0-beta.1 to changelog.md | 63c9478c1477ec57f1e51c552ef9c7f529b1d267 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<del>### 2.9.0-beta.5 (October 6, 2016)
<del>
<del>- [#14403](https://github.com/emberjs/ember.js/pull/14403) [BUGFIX] Ensure `willInsertElement` is fired for tagless components.
<del>- [#14384](https://github.com/emberjs/ember.js/pull/14384) [BUGFIX] Fix an issue when `attrs` is used in class name and attribute bindings.
<del>- [#14417](https://github.com/emberjs/ember.js/pull/14417) [BUGFIX] Fix an issue when passing unused positional params to a component.
<del>- [#14419](https://github.com/emberjs/ember.js/pull/14419) [BUGFIX] Fix rendering `{{string.length}}` in a template.
<del>- [#14425](https://github.com/emberjs/ember.js/pull/14425) [BUGFIX] Fix rendering `{{this.attrs.*}}` in a template.
<del>- [#14427](https://github.com/emberjs/ember.js/pull/14427) [BUGFIX] Ensure `Ember.set` in `didUpdate` hooks causes a re-render.
<del>- [#13996](https://github.com/emberjs/ember.js/pull/13996) [BUGFIX] Fix an issue when passing closure actions via `attrs.*` to child components.
<del>
<del>### 2.9.0-beta.4 (September 28, 2016)
<del>
<del>- [#14361](https://github.com/emberjs/ember.js/pull/14361) [BUGFIX] Prevent usage of `this.element` when running in a non-interactive environment (i.e. FastBoot).
<del>- [#14361](https://github.com/emberjs/ember.js/pull/14361) [BUGFIX] Prevent `willRender` and `willUpdate` from running in FastBoot.
<del>- [#14344](https://github.com/emberjs/ember.js/pull/14344) [BUGFIX] Ensure `element` is present in `willInsertElement` hook.
<del>- [#14345](https://github.com/emberjs/ember.js/pull/14345) [BUGFIX] Fix an issue causing unneeded rerenders with closure actions.
<del>- [#14363](https://github.com/emberjs/ember.js/pull/14363) [BUGFIX] Always use `guidFor` for tagless component's with an `id`.
<del>- [#14365](https://github.com/emberjs/ember.js/pull/14365) [BUGFIX] Bump route-recognizer to fix an issue with encoding `/` in a dynamic segment.
<del>- [#14366](https://github.com/emberjs/ember.js/pull/14366) [BUGFIX] Fix `Ember.assign` export.
<del>- [#14367](https://github.com/emberjs/ember.js/pull/14367) [BUGFIX] Ensure feature flags are properly stripped.
<del>- [#14371](https://github.com/emberjs/ember.js/pull/14371) [BUGFIX] Lazily add `alias` dependent keys (correct a slight performance regression from [#14319](https://github.com/emberjs/ember.js/pull/14319)).
<del>
<del>### 2.9.0-beta.3 (September 20, 2016)
<del>
<del>- [#14313](https://github.com/emberjs/ember.js/pull/14313) [BUGFIX] Ensure `id` attribute bindings of `undefined` are handled properly.
<del>- [#14291](https://github.com/emberjs/ember.js/pull/14291) [BUGFIX] Fix component action bubbling semantics.
<del>- [#14296](https://github.com/emberjs/ember.js/pull/14296) [BUGFIX] Prevent invalid markup when used with XHTML `doctype`.
<del>- [#14300](https://github.com/emberjs/ember.js/pull/14300) [BUGFIX] Ensure component is `inDOM` during `didInsertElement`.
<del>- [#14312](https://github.com/emberjs/ember.js/pull/14312) [BUGFIX] Allow a components `layout` to be injected.
<del>- [#14315](https://github.com/emberjs/ember.js/pull/14315) [BUGFIX] Fix DOM output for properties and attributes.
<del>- [#14319](https://github.com/emberjs/ember.js/pull/14319) [BUGFIX] Fixes rerendering issues when rendering aliased paths.
<del>
<del>
<del>### 2.9.0-beta.2 (September 12, 2016)
<del>
<del>- [#14237](https://github.com/emberjs/ember.js/pull/14237) [BUGFIX] Ensure Engine Routes are deactivated before destruction
<del>- [#14176](https://github.com/emberjs/ember.js/pull/14176) [BUGFIX] Ensure Controller#transitionToRoute and Route#intermediateTransitionTo work in Engines
<del>- [#14244](https://github.com/emberjs/ember.js/pull/14244) [BUGFIX] Ensure params and hash are frozen in debug builds.
<del>- [#14245](https://github.com/emberjs/ember.js/pull/14245) [BUGFIX] Lookup partials from the current owner when rendering an Engines templates.
<del>- [#14247](https://github.com/emberjs/ember.js/pull/14247) [BUGFIX] Make `this.getAttr` an alias for `this.get`.
<del>- [#14252](https://github.com/emberjs/ember.js/pull/14252) [BUGFIX] Don't delete moduleName from the generated template compilation options.
<del>- [#14253](https://github.com/emberjs/ember.js/pull/14253) [BUGFIX] Prevent duplicated `Ember.meta` invocations.
<del>- [#14271](https://github.com/emberjs/ember.js/pull/14271) [BUGFIX] Ensure `undefined` and `null` values are not rendered as attributes or properties on initial render.
<del>- [#14272](https://github.com/emberjs/ember.js/pull/14272) [BUGFIX] Fix issues with `Transition#isActive` being falsey incorrectly.
<del>
<del>
<del>### 2.9.0-beta.1 (September 8, 2016)
<add>### 2.10.0-beta.1 (October 17, 2016)
<ide>
<ide> - [#14156](https://github.com/emberjs/ember.js/pull/14156) [FEATURE ember-glimmer] Enable by default.
<ide> | 1 |
Javascript | Javascript | create name/dashname at seed | 150d531079c474d3c0f930976c03286e364e1f31 | <ide><path>index.js
<ide> require('babel/register');
<ide> require('dotenv').load();
<ide> var fs = require('fs'),
<add> _ = require('lodash'),
<ide> path = require('path'),
<ide> app = require('../server/server'),
<ide> nonprofits = require('./nonprofits.json'),
<ide> jobs = require('./jobs.json');
<ide>
<del>var challangesRegex = /^(bonfire:|waypoint:|zipline:|basejump:|hike:)/i;
<del>
<ide> function getFilesFor(dir) {
<ide> return fs.readdirSync(path.join(__dirname, '/' + dir));
<ide> }
<ide> Challenge.destroyAll(function(err, info) {
<ide> console.log('Deleted ', info);
<ide> }
<ide> challenges.forEach(function(file) {
<del> var challenges = require('./challenges/' + file).challenges
<add> var challengeSpec = require('./challenges/' + file);
<add> var challenges = challengeSpec.challenges
<ide> .map(function(challenge) {
<ide> // NOTE(berks): add title for displaying in views
<del> //challenge.title = challenge.name.replace(challangesRegex, '').trim();
<del> challenge.name = challenge.title.replace(/[^a-zA-Z0-9 ]/g, ''); // Remove non-alphanumwhitespace chars
<del> challenge.dashedName = challenge.name.replace(/\s/g, '-'); // Replace with dasherize();
<add> challenge.name =
<add> _.capitalize(challenge.type) +
<add> ': ' +
<add> challenge.title.replace(/[^a-zA-Z0-9\s]/g, '');
<add> challenge.dashedName = challenge.name.toLowerCase().replace(/\s/g, '-');
<ide> return challenge;
<ide> });
<ide> | 1 |
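The seed patch above builds `name` from the capitalized challenge type plus a title stripped of non-alphanumeric characters, then lowercases and dasherizes it. A rough Python analogue of that derivation (illustrative only — the project's code is the JavaScript above; note that lodash's `_.capitalize` lowercases the rest of the word, just like Python's `str.capitalize`):

```python
import re

def seed_names(challenge_type, title):
    # Strip non-alphanumeric, non-whitespace characters from the title,
    # mirroring challenge.title.replace(/[^a-zA-Z0-9\s]/g, '').
    clean_title = re.sub(r"[^a-zA-Z0-9\s]", "", title)
    # _.capitalize(challenge.type) + ': ' + cleaned title.
    name = challenge_type.capitalize() + ": " + clean_title
    # Lowercase and replace whitespace with dashes; the colon survives,
    # exactly as it does in the JavaScript version above.
    dashed_name = re.sub(r"\s", "-", name.lower())
    return name, dashed_name
```

For example, `seed_names("waypoint", "Say Hello to HTML Elements!")` yields `("Waypoint: Say Hello to HTML Elements", "waypoint:-say-hello-to-html-elements")`.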
Python | Python | fix regression introduced which broke config | 257a5a3d7b3f61b8d3a84697cde2f7bf8d85bc34 | <ide><path>glances/config.py
<ide> def user_config_dir():
<ide> path = os.path.expanduser('~/Library/Application Support')
<ide> else:
<ide> path = os.environ.get('XDG_CONFIG_HOME') or os.path.expanduser('~/.config')
<add> path = os.path.join(path, 'glances')
<ide>
<ide> return path
<ide>
<ide> def user_cache_dir():
<ide>
<ide> - Linux, *BSD, SunOS: ~/.cache/glances
<ide> - macOS: ~/Library/Caches/glances
<del> - Windows: %LOCALAPPDATA%\glances\cache
<add> - Windows: {%LOCALAPPDATA%,%APPDATA%}\glances\cache
<ide> """
<del> if WINDOWS and os.environ.get('LOCALAPPDATA') is not None:
<del> path = os.path.join(os.environ.get('LOCALAPPDATA'), 'glances', 'cache')
<add> if WINDOWS:
<add> path = os.path.join(os.environ.get('LOCALAPPDATA') or os.environ.get('APPDATA'),
<add> 'glances', 'cache')
<ide> elif MACOS:
<ide> path = os.path.expanduser('~/Library/Caches/glances')
<ide> else:
<ide> def system_config_dir():
<ide> path = '/usr/local/etc'
<ide> else:
<ide> path = os.environ.get('APPDATA')
<add> path = os.path.join(path, 'glances')
<ide>
<ide> return path
<ide>
<ide> def config_file_paths(self):
<ide> if self.config_dir:
<ide> paths.append(self.config_dir)
<ide>
<del> if user_config_dir() is not None:
<del> paths.append(os.path.join(user_config_dir(), self.config_filename))
<del> if system_config_dir() is not None:
<del> paths.append(os.path.join(system_config_dir(), self.config_filename))
<add> paths.append(os.path.join(user_config_dir(), self.config_filename))
<add> paths.append(os.path.join(system_config_dir(), self.config_filename))
<ide>
<ide> return paths
<ide> | 1 |
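The fix above appends the `glances` folder inside the directory helpers instead of returning the bare platform base path. A condensed, platform-parameterized sketch of that lookup order (an illustration only — the real module branches on `WINDOWS`/`MACOS` globals derived from `sys.platform` rather than taking arguments):

```python
import os

def user_config_dir(platform, environ):
    # Pick the per-platform base directory first...
    if platform == "windows":
        base = environ.get("APPDATA", "")
    elif platform == "macos":
        base = os.path.join(environ.get("HOME", ""), "Library/Application Support")
    else:  # Linux, *BSD, SunOS
        base = environ.get("XDG_CONFIG_HOME") or os.path.join(
            environ.get("HOME", ""), ".config"
        )
    # ...and always append the app folder; the regression was returning
    # the bare base path without this final join.
    return os.path.join(base, "glances")
```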
PHP | PHP | add index and rename foreign key | 4c1189f016848b86368808dbeaf3a21dceaf9279 | <ide><path>tests/TestCase/TestSuite/Schema/test_schema.php
<ide> 'table' => 'schema_generator',
<ide> 'columns' => [
<ide> 'id' => ['type' => 'integer'],
<del> 'relation_id' => ['type' => 'integer', 'null' => true],
<add> 'relation_id' => ['type' => 'integer'],
<ide> 'title' => ['type' => 'string', 'null' => true],
<ide> 'body' => 'text',
<ide> ],
<ide> 'constraints' => [
<ide> 'primary' => ['type' => 'primary', 'columns' => ['id']],
<del> 'relation_idx' => [
<add> 'relation_fk' => [
<ide> 'type' => 'foreign',
<ide> 'columns' => ['relation_id'],
<ide> 'references' => ['schema_generator_comment', 'id'],
<ide> ],
<ide> ],
<ide> 'indexes' => [
<add> 'relation_idx' => [
<add> 'type' => 'index',
<add> 'columns' => ['relation_id'],
<add> ],
<ide> 'title_idx' => [
<ide> 'type' => 'index',
<ide> 'columns' => ['title'], | 1 |
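The fixture change above splits the column's two roles: `relation_fk` is a foreign-key *constraint*, while `relation_idx` is a plain *index* on the same column. A toy Python generator (not CakePHP's schema reflection — purely an illustration of why the two entries emit different DDL):

```python
def ddl_fragments(table, schema):
    frags = []
    # Entries under "constraints" of type "foreign" become
    # ALTER TABLE ... ADD CONSTRAINT ... FOREIGN KEY statements.
    for name, c in schema.get("constraints", {}).items():
        if c["type"] == "foreign":
            ref_table, ref_col = c["references"]
            frags.append(
                "ALTER TABLE {} ADD CONSTRAINT {} FOREIGN KEY ({}) "
                "REFERENCES {} ({})".format(
                    table, name, ", ".join(c["columns"]), ref_table, ref_col
                )
            )
    # Entries under "indexes" become plain CREATE INDEX statements.
    for name, i in schema.get("indexes", {}).items():
        if i["type"] == "index":
            frags.append(
                "CREATE INDEX {} ON {} ({})".format(
                    name, table, ", ".join(i["columns"])
                )
            )
    return frags
```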
Java | Java | fix spinner mode | 7d1c827cb2231e4091698a3d33addd067f40af7d | <ide><path>ReactAndroid/src/main/java/com/facebook/react/modules/datepicker/DatePickerDialogFragment.java
<ide> import android.content.Context;
<ide> import android.content.DialogInterface;
<ide> import android.content.DialogInterface.OnDismissListener;
<add>import android.graphics.drawable.ColorDrawable;
<ide> import android.os.Build;
<ide> import android.os.Bundle;
<ide> import androidx.fragment.app.DialogFragment;
<ide> public Dialog onCreateDialog(Bundle savedInstanceState) {
<ide> break;
<ide> case SPINNER:
<ide> dialog = new DismissableDatePickerDialog(activityContext,
<del> activityContext.getResources().getIdentifier("SpinnerDatePickerDialog", "style", activityContext.getPackageName()),
<add> android.R.style.Theme_Holo_Light_Dialog,
<ide> onDateSetListener, year, month, day);
<add> dialog.getWindow().setBackgroundDrawable(new ColorDrawable(android.graphics.Color.TRANSPARENT));
<ide> break;
<ide> case DEFAULT:
<ide> dialog = new DismissableDatePickerDialog(activityContext, onDateSetListener, year, month, day); | 1 |
Ruby | Ruby | remove mass encryption | c41b354bf065c9e59d65abbaeb22897f95ab47c0 | <ide><path>activerecord/lib/active_record/encryption.rb
<ide> module Encryption
<ide> autoload :Key
<ide> autoload :KeyGenerator
<ide> autoload :KeyProvider
<del> autoload :MassEncryption
<ide> autoload :Message
<ide> autoload :MessageSerializer
<ide> autoload :NullEncryptor
<ide><path>activerecord/lib/active_record/encryption/mass_encryption.rb
<del># frozen_string_literal: true
<del>
<del>module ActiveRecord
<del> module Encryption
<del> # Encrypts all the models belonging to the provided list of classes
<del> class MassEncryption
<del> attr_reader :classes, :last_class, :last_id, :progress_monitor, :skip_rich_texts
<del>
<del> def initialize(progress_monitor: NullProgressMonitor.new, last_class: nil, last_id: nil, skip_rich_texts: false, quiet: true)
<del> @progress_monitor = progress_monitor
<del> @last_class = last_class
<del> @last_id = last_id
<del> @classes = []
<del> @skip_rich_texts = skip_rich_texts
<del> @quiet = true
<del>
<del> raise ArgumentError, "When passing a :last_id you must pass a :last_class too" if last_id.present? && last_class.blank?
<del> end
<del>
<del> def add(*classes)
<del> @classes.push(*classes)
<del> progress_monitor.total = calculate_total
<del> self
<del> end
<del>
<del> def encrypt
<del> included_classes.each.with_index do |klass, index|
<del> ClassMassEncryption.new(klass, progress_monitor: progress_monitor, last_id: last_id, skip_rich_texts: skip_rich_texts).encrypt
<del> end
<del> end
<del>
<del> private
<del> def calculate_total
<del> total = sum_all(classes) - sum_all(excluded_classes)
<del> total -= last_class.where("id < ?", last_id) if last_id.present?
<del> total
<del> end
<del>
<del> def sum_all(classes)
<del> classes.sum { |klass| klass.count }
<del> end
<del>
<del> def included_classes
<del> classes - excluded_classes
<del> end
<del>
<del> def excluded_classes
<del> if last_class
<del> last_class_index = classes.find_index(last_class)
<del> classes.find_all.with_index do |_, index|
<del> index >= last_class_index
<del> end
<del> else
<del> []
<del> end
<del> end
<del> end
<del>
<del> class ClassMassEncryption
<del> attr_reader :klass, :progress_monitor, :last_id, :skip_rich_texts
<del>
<del> def initialize(klass, progress_monitor: NullEncryptor.new, last_id: nil, skip_rich_texts: false, quiet: true)
<del> @klass = klass
<del> @progress_monitor = progress_monitor
<del> @last_id = last_id
<del> @skip_rich_texts = skip_rich_texts
<del> @quiet = quiet
<del> end
<del>
<del> def encrypt
<del> klass.where("id >= ?", last_id.to_i).find_each.with_index do |record, index|
<del> encrypt_record(record)
<del> progress_monitor.increment
<del>
<del> unless @quiet
<del> progress_monitor.log("Encrypting #{klass.name.tableize} (last id = #{record.id})...") if index % 500 == 0
<del> end
<del> end
<del> end
<del>
<del> private
<del> def encrypt_record(record)
<del> record.encrypt(skip_rich_texts: skip_rich_texts)
<del> rescue
<del> logger.error("Error when encrypting #{record.class} record with id #{record.id}")
<del> raise
<del> end
<del>
<del> def logger
<del> Rails.logger
<del> end
<del> end
<del>
<del> class NullProgressMonitor
<del> def increment
<del> end
<del>
<del> def total=(new_value) end
<del>
<del> def log(text)
<del> puts text
<del> end
<del> end
<del> end
<del>end
<ide><path>activerecord/test/cases/encryption/mass_encryption_test.rb
<del># frozen_string_literal: true
<del>
<del>require "cases/encryption/helper"
<del>require "models/author_encrypted"
<del>require "models/post_encrypted"
<del>
<del>class ActiveRecord::Encryption::MassEncryptionTest < ActiveRecord::TestCase
<del> setup do
<del> ActiveRecord::Encryption.config.support_unencrypted_data = true
<del> end
<del>
<del> test "It encrypts everything" do
<del> posts = ActiveRecord::Encryption.without_encryption do
<del> 3.times.collect { |index| EncryptedPost.create!(title: "Article #{index}", body: "Body #{index}") }
<del> end
<del>
<del> authors = ActiveRecord::Encryption.without_encryption do
<del> 3.times.collect { |index| EncryptedAuthor.create!(name: "Author #{index}") }
<del> end
<del>
<del> ActiveRecord::Encryption::MassEncryption.new\
<del> .add(EncryptedPost, EncryptedAuthor)
<del> .encrypt
<del>
<del> (posts + authors).each { |model| assert_encrypted_record(model.reload) }
<del> end
<del>end | 3 |
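The deleted `ClassMassEncryption` walked each class from a resume point (`id >= last_id`), encrypted record by record, and reported progress, naming the failing record on error. That resumable-batch shape, sketched in Python (an assumed structure for illustration, not a Rails API):

```python
def mass_encrypt(records, encrypt_one, last_id=0, on_progress=None):
    # Resume support: skip rows below the last processed id, matching
    # the Ruby scope klass.where("id >= ?", last_id.to_i).
    done = 0
    for record in records:
        if record["id"] < last_id:
            continue
        try:
            encrypt_one(record)
        except Exception as exc:
            # Surface which record failed before re-raising, like the
            # rescue block in encrypt_record above.
            raise RuntimeError(
                "error encrypting record %s" % record["id"]
            ) from exc
        done += 1
        if on_progress is not None:
            on_progress(done)
    return done
```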
Go | Go | remove custom testingt interfaces | d79cc1b67dbd3e21daad25467e21c7692191d664 | <ide><path>integration-cli/cli/build/build.go
<ide> package build // import "github.com/docker/docker/integration-cli/cli/build"
<ide> import (
<ide> "io"
<ide> "strings"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/testutil/fakecontext"
<ide> "gotest.tools/icmd"
<ide> )
<ide>
<del>type testingT interface {
<del> Fatal(args ...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<ide> // WithStdinContext sets the build context from the standard input with the specified reader
<ide> func WithStdinContext(closer io.ReadCloser) func(*icmd.Cmd) func() {
<ide> return func(cmd *icmd.Cmd) func() {
<ide> func WithExternalBuildContext(ctx *fakecontext.Fake) func(*icmd.Cmd) func() {
<ide> }
<ide>
<ide> // WithBuildContext sets up the build context
<del>func WithBuildContext(t testingT, contextOperators ...func(*fakecontext.Fake) error) func(*icmd.Cmd) func() {
<add>func WithBuildContext(t testing.TB, contextOperators ...func(*fakecontext.Fake) error) func(*icmd.Cmd) func() {
<ide> // FIXME(vdemeester) de-duplicate that
<ide> ctx := fakecontext.New(t, "", contextOperators...)
<ide> return func(cmd *icmd.Cmd) func() {
<ide> func WithFile(name, content string) func(*fakecontext.Fake) error {
<ide> return fakecontext.WithFile(name, content)
<ide> }
<ide>
<del>func closeBuildContext(t testingT, ctx *fakecontext.Fake) func() {
<add>func closeBuildContext(t testing.TB, ctx *fakecontext.Fake) func() {
<ide> return func() {
<ide> if err := ctx.Close(); err != nil {
<ide> t.Fatal(err)
<ide><path>integration-cli/cli/cli.go
<ide> import (
<ide> "fmt"
<ide> "io"
<ide> "strings"
<add> "testing"
<ide> "time"
<ide>
<ide> "github.com/docker/docker/integration-cli/daemon"
<ide> "github.com/docker/docker/integration-cli/environment"
<ide> "github.com/pkg/errors"
<del> "gotest.tools/assert"
<ide> "gotest.tools/icmd"
<ide> )
<ide>
<ide> func SetTestEnvironment(env *environment.Execution) {
<ide> // CmdOperator defines functions that can modify a command
<ide> type CmdOperator func(*icmd.Cmd) func()
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> Fatal(args ...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<ide> // DockerCmd executes the specified docker command and expect a success
<del>func DockerCmd(t testingT, args ...string) *icmd.Result {
<add>func DockerCmd(t testing.TB, args ...string) *icmd.Result {
<ide> return Docker(Args(args...)).Assert(t, icmd.Success)
<ide> }
<ide>
<ide> // BuildCmd executes the specified docker build command and expect a success
<del>func BuildCmd(t testingT, name string, cmdOperators ...CmdOperator) *icmd.Result {
<add>func BuildCmd(t testing.TB, name string, cmdOperators ...CmdOperator) *icmd.Result {
<ide> return Docker(Build(name), cmdOperators...).Assert(t, icmd.Success)
<ide> }
<ide>
<ide> // InspectCmd executes the specified docker inspect command and expect a success
<del>func InspectCmd(t testingT, name string, cmdOperators ...CmdOperator) *icmd.Result {
<add>func InspectCmd(t testing.TB, name string, cmdOperators ...CmdOperator) *icmd.Result {
<ide> return Docker(Inspect(name), cmdOperators...).Assert(t, icmd.Success)
<ide> }
<ide>
<ide> // WaitRun will wait for the specified container to be running, maximum 5 seconds.
<del>func WaitRun(t testingT, name string, cmdOperators ...CmdOperator) {
<add>func WaitRun(t testing.TB, name string, cmdOperators ...CmdOperator) {
<ide> WaitForInspectResult(t, name, "{{.State.Running}}", "true", 5*time.Second, cmdOperators...)
<ide> }
<ide>
<ide> // WaitExited will wait for the specified container to state exit, subject
<ide> // to a maximum time limit in seconds supplied by the caller
<del>func WaitExited(t testingT, name string, timeout time.Duration, cmdOperators ...CmdOperator) {
<add>func WaitExited(t testing.TB, name string, timeout time.Duration, cmdOperators ...CmdOperator) {
<ide> WaitForInspectResult(t, name, "{{.State.Status}}", "exited", timeout, cmdOperators...)
<ide> }
<ide>
<ide> // WaitRestart will wait for the specified container to restart once
<del>func WaitRestart(t testingT, name string, timeout time.Duration, cmdOperators ...CmdOperator) {
<add>func WaitRestart(t testing.TB, name string, timeout time.Duration, cmdOperators ...CmdOperator) {
<ide> WaitForInspectResult(t, name, "{{.RestartCount}}", "1", timeout, cmdOperators...)
<ide> }
<ide>
<ide> // WaitForInspectResult waits for the specified expression to be equals to the specified expected string in the given time.
<del>func WaitForInspectResult(t testingT, name, expr, expected string, timeout time.Duration, cmdOperators ...CmdOperator) {
<add>func WaitForInspectResult(t testing.TB, name, expr, expected string, timeout time.Duration, cmdOperators ...CmdOperator) {
<ide> after := time.After(timeout)
<ide>
<ide> args := []string{"inspect", "-f", expr, name}
<ide><path>integration-cli/daemon/daemon.go
<ide> import (
<ide> "gotest.tools/icmd"
<ide> )
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> logT
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<del>type logT interface {
<del> Logf(string, ...interface{})
<del>}
<del>
<ide> // Daemon represents a Docker daemon for the testing framework.
<ide> type Daemon struct {
<ide> *daemon.Daemon
<ide> type Daemon struct {
<ide> // New returns a Daemon instance to be used for testing.
<ide> // This will create a directory such as d123456789 in the folder specified by $DOCKER_INTEGRATION_DAEMON_DEST or $DEST.
<ide> // The daemon will not automatically start.
<del>func New(t testingT, dockerBinary string, dockerdBinary string, ops ...daemon.Option) *Daemon {
<add>func New(t testing.TB, dockerBinary string, dockerdBinary string, ops ...daemon.Option) *Daemon {
<ide> ops = append(ops, daemon.WithDockerdBinary(dockerdBinary))
<ide> d := daemon.New(t, ops...)
<ide> return &Daemon{
<ide><path>testutil/daemon/daemon.go
<ide> import (
<ide> "path/filepath"
<ide> "strconv"
<ide> "strings"
<add> "testing"
<ide> "time"
<ide>
<ide> "github.com/docker/docker/api/types"
<ide> import (
<ide> "gotest.tools/assert"
<ide> )
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> logT
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<ide> type logT interface {
<ide> Logf(string, ...interface{})
<ide> }
<ide> func NewDaemon(workingDir string, ops ...Option) (*Daemon, error) {
<ide> // This will create a directory such as d123456789 in the folder specified by
<ide> // $DOCKER_INTEGRATION_DAEMON_DEST or $DEST.
<ide> // The daemon will not automatically start.
<del>func New(t testingT, ops ...Option) *Daemon {
<add>func New(t testing.TB, ops ...Option) *Daemon {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) NewClient(extraOpts ...client.Opt) (*client.Client, error) {
<ide> }
<ide>
<ide> // Cleanup cleans the daemon files : exec root (network namespaces, ...), swarmkit files
<del>func (d *Daemon) Cleanup(t testingT) {
<add>func (d *Daemon) Cleanup(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) Cleanup(t testingT) {
<ide> }
<ide>
<ide> // Start starts the daemon and return once it is ready to receive requests.
<del>func (d *Daemon) Start(t testingT, args ...string) {
<add>func (d *Daemon) Start(t testing.TB, args ...string) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) StartWithLogFile(out *os.File, providedArgs ...string) error {
<ide>
<ide> // StartWithBusybox will first start the daemon with Daemon.Start()
<ide> // then save the busybox image from the main daemon and load it into this Daemon instance.
<del>func (d *Daemon) StartWithBusybox(t testingT, arg ...string) {
<add>func (d *Daemon) StartWithBusybox(t testing.TB, arg ...string) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) DumpStackAndQuit() {
<ide> // Stop will not delete the daemon directory. If a purged daemon is needed,
<ide> // instantiate a new one with NewDaemon.
<ide> // If an error occurs while starting the daemon, the test will fail.
<del>func (d *Daemon) Stop(t testingT) {
<add>func (d *Daemon) Stop(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> out2:
<ide>
<ide> // Restart will restart the daemon by first stopping it and the starting it.
<ide> // If an error occurs while starting the daemon, the test will fail.
<del>func (d *Daemon) Restart(t testingT, args ...string) {
<add>func (d *Daemon) Restart(t testing.TB, args ...string) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) Info(t assert.TestingT) types.Info {
<ide> return info
<ide> }
<ide>
<del>func cleanupRaftDir(t testingT, rootPath string) {
<add>func cleanupRaftDir(t testing.TB, rootPath string) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide><path>testutil/daemon/daemon_unix.go
<ide> import (
<ide> "os"
<ide> "path/filepath"
<ide> "strings"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/testutil"
<ide> "golang.org/x/sys/unix"
<ide> "gotest.tools/assert"
<ide> )
<ide>
<del>func cleanupNetworkNamespace(t testingT, execRoot string) {
<add>func cleanupNetworkNamespace(t testing.TB, execRoot string) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func cleanupNetworkNamespace(t testingT, execRoot string) {
<ide> }
<ide>
<ide> // CgroupNamespace returns the cgroup namespace the daemon is running in
<del>func (d *Daemon) CgroupNamespace(t assert.TestingT) string {
<add>func (d *Daemon) CgroupNamespace(t testing.TB) string {
<ide> link, err := os.Readlink(fmt.Sprintf("/proc/%d/ns/cgroup", d.Pid()))
<ide> assert.NilError(t, err)
<ide>
<ide><path>testutil/daemon/daemon_windows.go
<ide> package daemon
<ide> import (
<ide> "fmt"
<ide> "strconv"
<add> "testing"
<ide>
<ide> "golang.org/x/sys/windows"
<ide> "gotest.tools/assert"
<ide> func signalDaemonReload(pid int) error {
<ide> return fmt.Errorf("daemon reload not supported")
<ide> }
<ide>
<del>func cleanupNetworkNamespace(t testingT, execRoot string) {
<add>func cleanupNetworkNamespace(t testing.TB, execRoot string) {
<ide> }
<ide>
<ide> // CgroupNamespace returns the cgroup namespace the daemon is running in
<del>func (d *Daemon) CgroupNamespace(t assert.TestingT) string {
<add>func (d *Daemon) CgroupNamespace(t testing.TB) string {
<ide> assert.Assert(t, false)
<ide> return "cgroup namespaces are not supported on Windows"
<ide> }
<ide><path>testutil/daemon/swarm.go
<ide> package daemon
<ide> import (
<ide> "context"
<ide> "fmt"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/api/types/swarm"
<ide> "github.com/docker/docker/testutil"
<ide> var (
<ide> )
<ide>
<ide> // StartNode (re)starts the daemon
<del>func (d *Daemon) StartNode(t testingT) {
<add>func (d *Daemon) StartNode(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> d.Start(t, startArgs...)
<ide> }
<ide>
<ide> // StartNodeWithBusybox starts daemon to be used as a swarm node, and loads the busybox image
<del>func (d *Daemon) StartNodeWithBusybox(t testingT) {
<add>func (d *Daemon) StartNodeWithBusybox(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> d.StartWithBusybox(t, startArgs...)
<ide> }
<ide>
<ide> // RestartNode restarts a daemon to be used as a swarm node
<del>func (d *Daemon) RestartNode(t testingT) {
<add>func (d *Daemon) RestartNode(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (d *Daemon) RestartNode(t testingT) {
<ide> }
<ide>
<ide> // StartAndSwarmInit starts the daemon (with busybox) and init the swarm
<del>func (d *Daemon) StartAndSwarmInit(t testingT) {
<add>func (d *Daemon) StartAndSwarmInit(t testing.TB) {
<ide> d.StartNodeWithBusybox(t)
<ide> d.SwarmInit(t, swarm.InitRequest{})
<ide> }
<ide>
<ide> // StartAndSwarmJoin starts the daemon (with busybox) and join the specified swarm as worker or manager
<del>func (d *Daemon) StartAndSwarmJoin(t testingT, leader *Daemon, manager bool) {
<add>func (d *Daemon) StartAndSwarmJoin(t testing.TB, leader *Daemon, manager bool) {
<ide> if th, ok := t.(testutil.HelperT); ok {
<ide> th.Helper()
<ide> }
<ide><path>testutil/fakecontext/context.go
<ide> import (
<ide> "io/ioutil"
<ide> "os"
<ide> "path/filepath"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/pkg/archive"
<ide> "github.com/docker/docker/testutil"
<ide> )
<ide>
<del>type testingT interface {
<del> Fatal(args ...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<ide> // New creates a fake build context
<del>func New(t testingT, dir string, modifiers ...func(*Fake) error) *Fake {
<add>func New(t testing.TB, dir string, modifiers ...func(*Fake) error) *Fake {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (f *Fake) Close() error {
<ide> }
<ide>
<ide> // AsTarReader returns a ReadCloser with the contents of Dir as a tar archive.
<del>func (f *Fake) AsTarReader(t testingT) io.ReadCloser {
<add>func (f *Fake) AsTarReader(t testing.TB) io.ReadCloser {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide><path>testutil/fakegit/fakegit.go
<ide> import (
<ide> "os"
<ide> "os/exec"
<ide> "path/filepath"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/testutil"
<ide> "github.com/docker/docker/testutil/fakecontext"
<ide> "github.com/docker/docker/testutil/fakestorage"
<del> "gotest.tools/assert"
<ide> )
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> logT
<del> skipT
<del> Fatal(args ...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<del>type logT interface {
<del> Logf(string, ...interface{})
<del>}
<del>
<del>type skipT interface {
<del> Skip(...interface{})
<del>}
<del>
<ide> type gitServer interface {
<ide> URL() string
<ide> Close() error
<ide> func (g *FakeGit) Close() {
<ide> }
<ide>
<ide> // New create a fake git server that can be used for git related tests
<del>func New(c testingT, name string, files map[string]string, enforceLocalServer bool) *FakeGit {
<add>func New(c testing.TB, name string, files map[string]string, enforceLocalServer bool) *FakeGit {
<ide> if ht, ok := c.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide><path>testutil/fakestorage/fixtures.go
<ide> import (
<ide> "os/exec"
<ide> "path/filepath"
<ide> "sync"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/api/types"
<ide> "github.com/docker/docker/pkg/archive"
<ide> import (
<ide>
<ide> var ensureHTTPServerOnce sync.Once
<ide>
<del>func ensureHTTPServerImage(t testingT) {
<add>func ensureHTTPServerImage(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide><path>testutil/fakestorage/storage.go
<ide> import (
<ide> "net/url"
<ide> "os"
<ide> "strings"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/api/types"
<ide> containertypes "github.com/docker/docker/api/types/container"
<ide> import (
<ide>
<ide> var testEnv *environment.Execution
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> logT
<del> skipT
<del> Fatal(args ...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<del>type logT interface {
<del> Logf(string, ...interface{})
<del>}
<del>
<del>type skipT interface {
<del> Skip(...interface{})
<del>}
<del>
<ide> // Fake is a static file server. It might be running locally or remotely
<ide> // on test host.
<ide> type Fake interface {
<ide> func SetTestEnvironment(env *environment.Execution) {
<ide> }
<ide>
<ide> // New returns a static file server that will be use as build context.
<del>func New(t testingT, dir string, modifiers ...func(*fakecontext.Fake) error) Fake {
<add>func New(t testing.TB, dir string, modifiers ...func(*fakecontext.Fake) error) Fake {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (f *remoteFileServer) Close() error {
<ide> })
<ide> }
<ide>
<del>func newRemoteFileServer(t testingT, ctx *fakecontext.Fake, c client.APIClient) *remoteFileServer {
<add>func newRemoteFileServer(t testing.TB, ctx *fakecontext.Fake, c client.APIClient) *remoteFileServer {
<ide> var (
<ide> image = fmt.Sprintf("fileserver-img-%s", strings.ToLower(testutil.GenerateRandomAlphaOnlyString(10)))
<ide> container = fmt.Sprintf("fileserver-cnt-%s", strings.ToLower(testutil.GenerateRandomAlphaOnlyString(10)))
<ide><path>testutil/registry/registry.go
<ide> import (
<ide> "os"
<ide> "os/exec"
<ide> "path/filepath"
<add> "testing"
<ide> "time"
<ide>
<ide> "github.com/docker/docker/testutil"
<ide> const (
<ide> DefaultURL = "127.0.0.1:5000"
<ide> )
<ide>
<del>type testingT interface {
<del> assert.TestingT
<del> logT
<del> Fatal(...interface{})
<del> Fatalf(string, ...interface{})
<del> Name() string
<del>}
<del>
<del>type logT interface {
<del> Logf(string, ...interface{})
<del>}
<del>
<ide> // V2 represent a registry version 2
<ide> type V2 struct {
<ide> cmd *exec.Cmd
<ide> type Config struct {
<ide> }
<ide>
<ide> // NewV2 creates a v2 registry server
<del>func NewV2(t testingT, ops ...func(*Config)) *V2 {
<add>func NewV2(t testing.TB, ops ...func(*Config)) *V2 {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> http:
<ide> }
<ide>
<ide> // WaitReady waits for the registry to be ready to serve requests (or fail after a while)
<del>func (r *V2) WaitReady(t testingT) {
<add>func (r *V2) WaitReady(t testing.TB) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide> func (r *V2) WriteBlobContents(t assert.TestingT, blobDigest digest.Digest, data
<ide>
<ide> // TempMoveBlobData moves the existing data file aside, so that we can replace it with a
<ide> // malicious blob of data for example.
<del>func (r *V2) TempMoveBlobData(t testingT, blobDigest digest.Digest) (undo func()) {
<add>func (r *V2) TempMoveBlobData(t testing.TB, blobDigest digest.Digest) (undo func()) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> }
<ide><path>testutil/registry/registry_mock.go
<ide> import (
<ide> "regexp"
<ide> "strings"
<ide> "sync"
<add> "testing"
<ide>
<ide> "github.com/docker/docker/testutil"
<ide> )
<ide> func (tr *Mock) RegisterHandler(path string, h handlerFunc) {
<ide> }
<ide>
<ide> // NewMock creates a registry mock
<del>func NewMock(t testingT) (*Mock, error) {
<add>func NewMock(t testing.TB) (*Mock, error) {
<ide> if ht, ok := t.(testutil.HelperT); ok {
<ide> ht.Helper()
<ide> } | 13 |
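The Go patch above deletes the per-package `testingT` interfaces and widens signatures to the standard `testing.TB`. The underlying idea — any value satisfying the standard superset automatically satisfies every local structural subset, so the local copies are redundant — shown with a Python `Protocol` sketch (an analogy, not Go code):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class TestingT(Protocol):
    # The kind of ad-hoc interface each package used to re-declare:
    # just the subset of *testing.T methods it happened to call.
    def Fatal(self, *args) -> None: ...
    def Fatalf(self, fmt: str, *args) -> None: ...
    def Name(self) -> str: ...

class TB:
    # Stand-in for Go's testing.TB, the standard superset of all of them.
    def Fatal(self, *args):
        raise AssertionError(args)

    def Fatalf(self, fmt, *args):
        raise AssertionError(fmt % args)

    def Name(self):
        return "TestExample"

    def Logf(self, fmt, *args):  # extra methods beyond the subset are harmless
        pass
```

Because `isinstance(TB(), TestingT)` holds, every call site typed against a local interface keeps working when retyped to the standard one — which is why the duplicated declarations could simply be deleted.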
PHP | PHP | send json in debug mode if the request wants json | 5225389dfdf03d656b862bba59cebf1820e0e8f4 | <ide><path>src/Illuminate/Foundation/Exceptions/Handler.php
<ide> public function render($request, Exception $e)
<ide> return $this->convertValidationExceptionToResponse($e, $request);
<ide> }
<ide>
<add> if ($request->expectsJson() && config('app.debug')) {
<add> return $this->prepareJsonResponse($request, $e);
<add> }
<add>
<ide> return $this->prepareResponse($request, $e);
<ide> }
<ide>
<ide> protected function convertValidationExceptionToResponse(ValidationException $e,
<ide> }
<ide>
<ide> /**
<del> * Prepare response containing exception render.
<add> * Prepare a response for the given exception.
<ide> *
<ide> * @param \Illuminate\Http\Request $request
<ide> * @param \Exception $e
<ide> protected function prepareResponse($request, Exception $e)
<ide> return $this->toIlluminateResponse($this->convertExceptionToResponse($e), $e);
<ide> }
<ide>
<del> $e = $this->isHttpException($e)
<del> ? $e : new HttpException(500, $e->getMessage());
<add> if (! $this->isHttpException($e)) {
<add> $e = new HttpException(500, $e->getMessage());
<add> }
<ide>
<ide> return $this->toIlluminateResponse(
<ide> $this->renderHttpException($e), $e
<ide> );
<ide> }
<ide>
<add> /**
<add> * Prepare a JSON response for the given exception.
<add> *
<add> * @param \Illuminate\Http\Request $request
<add> * @param \Exception $e
<add> * @return \Symfony\Component\HttpFoundation\Response
<add> */
<add> protected function prepareJsonResponse($request, Exception $e)
<add> {
<add> $status = $this->isHttpException($e) ? $e->getStatusCode() : 500;
<add>
<add> $headers = $this->isHttpException($e) ? $e->getHeaders() : [];
<add>
<add> return response(json_encode([
<add> 'message' => $e->getMessage(),
<add> 'file' => $e->getFile(),
<add> 'line' => $e->getLine(),
<add> 'trace' => $e->getTrace(),
<add> ], JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES), $status, array_merge($headers, [
<add> 'Content-Type' => 'application/json'
<add> ]));
<add> }
<add>
<ide> /**
<ide> * Render the given HttpException.
<ide> * | 1 |
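The new `prepareJsonResponse()` above serializes the exception details, keeping the exception's own status code and headers when it is an HTTP exception and falling back to 500 otherwise. The same branching, sketched framework-free in Python (field names and signature are placeholders, not Laravel's API):

```python
import json

def prepare_json_response(message, *, is_http=False, status=500, headers=None,
                          debug_detail=None):
    # HTTP exceptions keep their own status code and headers;
    # anything else is reported as a plain 500 with no extra headers.
    if not is_http:
        status, headers = 500, None
    payload = {"message": message}
    if debug_detail:
        payload.update(debug_detail)  # e.g. file/line/trace in debug mode
    out_headers = dict(headers or {})
    out_headers["Content-Type"] = "application/json"
    return status, out_headers, json.dumps(payload, indent=2)
```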
Javascript | Javascript | add removed message for toolbarandroid | 6249d14a61723b22deb1336457b4295978471885 | <ide><path>index.js
<ide> if (__DEV__) {
<ide> },
<ide> });
<ide>
<add> // $FlowFixMe This is intentional: Flow will error when attempting to access ToolbarAndroid.
<add> Object.defineProperty(module.exports, 'ToolbarAndroid', {
<add> configurable: true,
<add> get() {
<add> invariant(
<add> false,
<add> 'ToolbarAndroid has been removed from React Native. ' +
<add> "It can now be installed and imported from '@react-native-community/toolbar-android' instead of 'react-native'. " +
<add> 'See https://github.com/react-native-community/toolbar-android',
<add> );
<add> },
<add> });
<add>
<ide> // $FlowFixMe This is intentional: Flow will error when attempting to access ViewPagerAndroid.
<ide> Object.defineProperty(module.exports, 'ViewPagerAndroid', {
<ide> configurable: true, | 1 |
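The JS patch above registers a getter so that merely *touching* `ToolbarAndroid` fails with a migration hint instead of silently yielding `undefined`. The same removal-guard pattern via a Python descriptor (an analogy; class and message names are illustrative):

```python
class RemovedAPI:
    # Descriptor version of Object.defineProperty(..., { get() { throw } }):
    # the attribute name still exists, but any access raises with guidance.
    def __init__(self, name, hint):
        self.name = name
        self.hint = hint

    def __get__(self, obj, objtype=None):
        raise AttributeError(
            "%s has been removed from React Native. %s" % (self.name, self.hint)
        )

class ReactNative:
    ToolbarAndroid = RemovedAPI(
        "ToolbarAndroid",
        "Install @react-native-community/toolbar-android instead.",
    )
```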
Mixed | Javascript | fix modifying ticks in afterbuildticks | 20c748f90b80c7c1969a9e805a7c0876a19b270b | <ide><path>docs/axes/README.md
<ide> There are a number of config callbacks that can be used to change parameters in
<ide> | `beforeDataLimits` | `axis` | Callback that runs before data limits are determined.
<ide> | `afterDataLimits` | `axis` | Callback that runs after data limits are determined.
<ide> | `beforeBuildTicks` | `axis` | Callback that runs before ticks are created.
<del>| `afterBuildTicks` | `axis` | Callback that runs after ticks are created. Useful for filtering ticks.
<add>| `afterBuildTicks` | `axis`, `ticks` | Callback that runs after ticks are created. Useful for filtering ticks. Should return the filtered ticks.
<ide> | `beforeTickToLabelConversion` | `axis` | Callback that runs before ticks are converted into strings.
<ide> | `afterTickToLabelConversion` | `axis` | Callback that runs after ticks are converted into strings.
<ide> | `beforeCalculateTickRotation` | `axis` | Callback that runs before tick rotation is determined.
<ide><path>src/core/core.scale.js
<ide> module.exports = Element.extend({
<ide> // we still support no return (`this.ticks` internally set by calling this method).
<ide> ticks = me.buildTicks() || [];
<ide>
<del> me.afterBuildTicks();
<add> // Allow modification of ticks in callback.
<add> ticks = me.afterBuildTicks(ticks) || ticks;
<ide>
<ide> me.beforeTickToLabelConversion();
<ide>
<ide> module.exports = Element.extend({
<ide> helpers.callback(this.options.beforeBuildTicks, [this]);
<ide> },
<ide> buildTicks: helpers.noop,
<del> afterBuildTicks: function() {
<del> helpers.callback(this.options.afterBuildTicks, [this]);
<add> afterBuildTicks: function(ticks) {
<add> var me = this;
<add> // ticks is empty for old axis implementations here
<add> if (helpers.isArray(ticks) && ticks.length) {
<add> return helpers.callback(me.options.afterBuildTicks, [me, ticks]);
<add> }
<add> // Support old implementations (that modified `this.ticks` directly in buildTicks)
<add> me.ticks = helpers.callback(me.options.afterBuildTicks, [me, me.ticks]) || me.ticks;
<add> return ticks;
<ide> },
<ide>
<ide> beforeTickToLabelConversion: function() {
<ide><path>test/specs/core.scale.tests.js
<ide> describe('Core.scale', function() {
<ide> });
<ide> });
<ide> });
<add>
<add> describe('afterBuildTicks', function() {
<add> it('should allow filtering of ticks', function() {
<add> var labels = ['tick1', 'tick2', 'tick3', 'tick4', 'tick5'];
<add> var chart = window.acquireChart({
<add> type: 'line',
<add> options: {
<add> scales: {
<add> xAxes: [{
<add> id: 'x',
<add> type: 'category',
<add> labels: labels,
<add> afterBuildTicks: function(axis, ticks) {
<add> return ticks.slice(1);
<add> }
<add> }]
<add> }
<add> }
<add> });
<add>
<add> var scale = chart.scales.x;
<add> expect(scale.ticks).toEqual(labels.slice(1));
<add> });
<add>
<add> it('should allow filtering of ticks (for new implementation of buildTicks)', function() {
<add> var chart = window.acquireChart({
<add> type: 'line',
<add> data: {
<add> labels: ['2016', '2017', '2018']
<add> },
<add> options: {
<add> scales: {
<add> xAxes: [{
<add> id: 'x',
<add> type: 'time',
<add> time: {
<add> parser: 'YYYY'
<add> },
<add> ticks: {
<add> source: 'labels'
<add> },
<add> afterBuildTicks: function(axis, ticks) {
<add> return ticks.slice(1);
<add> }
<add> }]
<add> }
<add> }
<add> });
<add>
<add> var scale = chart.scales.x;
<add> expect(scale.ticks.length).toEqual(2);
<add> });
<add>
<add> it('should allow no return value from callback', function() {
<add> var labels = ['tick1', 'tick2', 'tick3', 'tick4', 'tick5'];
<add> var chart = window.acquireChart({
<add> type: 'line',
<add> options: {
<add> scales: {
<add> xAxes: [{
<add> id: 'x',
<add> type: 'category',
<add> labels: labels,
<add> afterBuildTicks: function() { }
<add> }]
<add> }
<add> }
<add> });
<add>
<add> var scale = chart.scales.x;
<add> expect(scale.ticks).toEqual(labels);
<add> });
<add>
<add> it('should allow empty ticks', function() {
<add> var labels = ['tick1', 'tick2', 'tick3', 'tick4', 'tick5'];
<add> var chart = window.acquireChart({
<add> type: 'line',
<add> options: {
<add> scales: {
<add> xAxes: [{
<add> id: 'x',
<add> type: 'category',
<add> labels: labels,
<add> afterBuildTicks: function() {
<add> return [];
<add> }
<add> }]
<add> }
<add> }
<add> });
<add>
<add> var scale = chart.scales.x;
<add> expect(scale.ticks.length).toBe(0);
<add> });
<add> });
<ide> }); | 3 |
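The Chart.js commit above hinges on one small pattern: run an optional user callback over the built ticks and fall back to the originals when it returns nothing (`helpers.callback(...) || ticks`). A minimal Python sketch of that pattern — names are illustrative, not Chart.js API:

```python
def apply_after_build_ticks(ticks, callback=None):
    """Run an optional user callback over built ticks.

    The callback may filter and return a new list (including an empty
    one); returning None keeps the original ticks.
    """
    if callback is None:
        return ticks
    result = callback(ticks)
    # Note: the JS `result || ticks` keeps an empty array because [] is
    # truthy in JS; Python's `or` would not, hence the explicit None check.
    return result if result is not None else ticks
```

This reproduces all four behaviors the new specs assert: filtering, no-op callbacks, no return value, and empty tick lists.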
Text | Text | add 2.9.0-beta.2 to changelog.md | 9453cfd562072e32ee28e0851d8b3b4ee54ad365 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### 2.9.0-beta.2 (September 12, 2016)
<add>
<add>- [#14237](https://github.com/emberjs/ember.js/pull/14237) [BUGFIX] Ensure Engine Routes are deactivated before destruction
<add>- [#14176](https://github.com/emberjs/ember.js/pull/14176) [BUGFIX] Ensure Controller#transitionToRoute and Route#intermediateTransitionTo work in Engines
<add>- [#14244](https://github.com/emberjs/ember.js/pull/14244) [BUGFIX] Ensure params and hash are frozen in debug builds.
<add>- [#14245](https://github.com/emberjs/ember.js/pull/14245) [BUGFIX] Lookup partials from the current owner when rendering an Engines templates.
<add>- [#14247](https://github.com/emberjs/ember.js/pull/14247) [BUGFIX] Make `this.getAttr` an alias for `this.get`.
<add>- [#14252](https://github.com/emberjs/ember.js/pull/14252) [BUGFIX] Don't delete moduleName from the generated template compilation options.
<add>- [#14253](https://github.com/emberjs/ember.js/pull/14253) [BUGFIX] Prevent duplicated `Ember.meta` invocations.
<add>- [#14271](https://github.com/emberjs/ember.js/pull/14271) [BUGFIX] Ensure `undefined` and `null` values are not rendered as attributes or properties on initial render.
<add>- [#14272](https://github.com/emberjs/ember.js/pull/14272) [BUGFIX] Fix issues with `Transition#isActive` being falsey incorrectly.
<add>
<add>
<ide> ### 2.9.0-beta.1 (September 8, 2016)
<ide>
<ide> - [#14156](https://github.com/emberjs/ember.js/pull/14156) [FEATURE ember-glimmer] Enable by default. | 1 |
Text | Text | clarify experimental api elements in vm.md | ce03a182cf4c5fb433edce2ef7dd9f21c26e76bb | <ide><path>doc/api/vm.md
<ide> changes:
<ide> * `importModuleDynamically` {Function} Called during evaluation of this module
<ide> when `import()` is called. If this option is not specified, calls to
<ide> `import()` will reject with [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`][].
<del> This option is part of the experimental modules API, and should not be
<del> considered stable.
<add> This option is part of the experimental modules API. We do not recommend
<add> using it in a production environment.
<ide> * `specifier` {string} specifier passed to `import()`
<ide> * `script` {vm.Script}
<ide> * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is
<ide> changes:
<ide> * `importModuleDynamically` {Function} Called during evaluation of this module
<ide> when `import()` is called. If this option is not specified, calls to
<ide> `import()` will reject with [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`][].
<del> This option is part of the experimental modules API, and should not be
<del> considered stable.
<add> This option is part of the experimental modules API. We do not recommend
<add> using it in a production environment.
<ide> * `specifier` {string} specifier passed to `import()`
<ide> * `script` {vm.Script}
<ide> * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is
<ide> changes:
<ide> * `importModuleDynamically` {Function} Called during evaluation of this module
<ide> when `import()` is called. If this option is not specified, calls to
<ide> `import()` will reject with [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`][].
<del> This option is part of the experimental modules API, and should not be
<del> considered stable.
<add> This option is part of the experimental modules API. We do not recommend
<add> using it in a production environment.
<ide> * `specifier` {string} specifier passed to `import()`
<ide> * `script` {vm.Script}
<ide> * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is
<ide> changes:
<ide> * `importModuleDynamically` {Function} Called during evaluation of this module
<ide> when `import()` is called. If this option is not specified, calls to
<ide> `import()` will reject with [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`][].
<del> This option is part of the experimental modules API, and should not be
<del> considered stable.
<add> This option is part of the experimental modules API. We do not recommend
<add> using it in a production environment.
<ide> * `specifier` {string} specifier passed to `import()`
<ide> * `script` {vm.Script}
<ide> * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is | 1 |
Text | Text | remove old media images | d486152b98b481afd2d6e6e013e8a0cefda93350 | <ide><path>packages/create-next-app/README.md
<ide> If you run into any issues or have feedback, please <a href="https://github.com/
<ide> Open [http://localhost:3000](http://localhost:3000) to view your running app.
<ide> When you're ready for production, run the `build` then `start` scripts.
<ide>
<del><p align='center'>
<del> <img width="600" alt="Create Next App running in terminal" src="media/init-app.png" />
<del></p>
<del>
<del><p align='center'>
<del> <img width="600" alt="Create Next App running in terminal" src="media/dev-tree.png" />
<del></p>
<del>
<ide> ### Start Coding Now
<ide>
<ide> You **don't** need to install or setup Webpack or Babel. | 1 |
Python | Python | change param order for consistency | bdfe21ab242c2175972f53ca6b4af29ada02a2bb | <ide><path>transformers/modeling_albert.py
<ide> def tie_weights(self):
<ide> def get_output_embeddings(self):
<ide> return self.predictions.decoder
<ide>
<del> def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None,
<del> masked_lm_labels=None, inputs_embeds=None):
<add> def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None,
<add> masked_lm_labels=None):
<ide> outputs = self.albert(
<ide> input_ids=input_ids,
<ide> attention_mask=attention_mask, | 1 |
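The reorder in the commit above is invisible to keyword callers but observable to positional ones, which is why consistent parameter order across model signatures matters. A toy sketch (signatures heavily simplified, not the real model API):

```python
# Hypothetical "before" and "after" signatures, trimmed to three params.
def forward_old(input_ids=None, masked_lm_labels=None, inputs_embeds=None):
    return {"labels": masked_lm_labels, "embeds": inputs_embeds}

def forward_new(input_ids=None, inputs_embeds=None, masked_lm_labels=None):
    return {"labels": masked_lm_labels, "embeds": inputs_embeds}

# A positional caller written against one order silently binds its
# second argument to a different parameter under the other order.
old_result = forward_old("ids", "x")
new_result = forward_new("ids", "x")
```

Keyword-only arguments (`*,` in the signature) would remove this hazard entirely, at the cost of breaking existing positional callers.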
Text | Text | use next tag for canary releases in the doc | faabe22b4f5b5812dc756b437465a3c519f6d872 | <ide><path>scripts/release/README.md
<ide> scripts/release/prepare-canary.js --build=13471
<ide>
<ide> Once the canary has been checked out and tested locally, you're ready to publish it:
<ide> ```sh
<del>scripts/release/publish.js --tags canary
<add>scripts/release/publish.js --tags next
<ide> ```
<ide>
<ide> <sup>1: You can omit the `build` param if you just want to release the latest commit as a canary.</sup> | 1 |
Javascript | Javascript | pick another cname record to test dns queries | b5db5fc9dc85ae0c6fa1ee9b51c634a161c07fdc | <ide><path>test/internet/test-dns.js
<ide> TEST(function test_resolveSrv(done) {
<ide>
<ide>
<ide> TEST(function test_resolveCname(done) {
<del> var req = dns.resolveCname('www.google.com', function(err, names) {
<add> var req = dns.resolveCname('www.microsoft.com', function(err, names) {
<ide> if (err) throw err;
<ide>
<ide> assert.ok(names.length > 0); | 1 |
Python | Python | raise warning in hp search when hp is not in args | 8c2384d8e2d8d0551e53eb77d73204dc5fdc7f80 | <ide><path>src/transformers/trainer.py
<ide> def _hp_search_setup(self, trial: Union["optuna.Trial", Dict[str, Any]]):
<ide>
<ide> for key, value in params.items():
<ide> if not hasattr(self.args, key):
<del> raise AttributeError(
<add> logger.warn(
<ide> f"Trying to set {key} in the hyperparameter search but there is no corresponding field in `TrainingArguments`."
<ide> )
<add> continue
<ide> old_attr = getattr(self.args, key, None)
<ide> # Casting value to the proper type
<ide> if old_attr is not None: | 1 |
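The change above swaps a hard failure for a warning plus `continue` when a trial parameter has no matching field. The same guard pattern, as a standalone Python sketch (names illustrative, not the Trainer API):

```python
import warnings


def apply_trial_params(args, params):
    """Copy hyperparameter-search trial values onto an args object.

    Unknown keys produce a warning and are skipped, rather than
    raising, mirroring the logger.warn + continue in the commit above.
    """
    for key, value in params.items():
        if not hasattr(args, key):
            warnings.warn(f"ignoring {key!r}: no corresponding field on args")
            continue
        setattr(args, key, value)
    return args
```

The trade-off is that a typo in a search space now degrades silently to a warning instead of stopping the run.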
Python | Python | move logic into pixelshuffle layer | f71895a633becd6eec8bdb69070c88029ae390e6 | <ide><path>src/transformers/models/swin/modeling_tf_swin.py
<ide> def call(
<ide> return swin_outputs
<ide>
<ide>
<del>class PixelShuffle(tf.keras.layers.Layer):
<add>class TFSwinPixelShuffle(tf.keras.layers.Layer):
<ide> """TF layer implementation of torch.nn.PixelShuffle"""
<ide>
<del> def __init__(
<del> self,
<del> upscale_factor: int,
<del> data_format: str = "NHWC",
<del> trainable: bool = True,
<del> name: str = None,
<del> dtype=None,
<del> dynamic: bool = False,
<del> **kwargs
<del> ) -> None:
<del> super().__init__(trainable, name, dtype, dynamic, **kwargs)
<del> if upscale_factor < 2:
<del> raise ValueError("upscale_factor must be an integer value >= 2")
<add> def __init__(self, upscale_factor: int, **kwargs) -> None:
<add> super().__init__(**kwargs)
<add> if not isinstance(upscale_factor, int) or upscale_factor < 2:
<add> raise ValueError(f"upscale_factor must be an integer value >= 2 got {upscale_factor}")
<ide> self.upscale_factor = upscale_factor
<del> self.data_format = data_format
<ide>
<ide> def call(self, x: tf.Tensor) -> tf.Tensor:
<del> return tf.nn.depth_to_space(x, block_size=self.upscale_factor, data_format=self.data_format)
<add> hidden_states = x
<add> batch_size, _, _, num_input_channels = shape_list(hidden_states)
<add> block_size_squared = self.upscale_factor**2
<add> output_depth = int(num_input_channels / block_size_squared)
<add> # When the number of output channels >= 2, PyTorch's PixelShuffle and
<add> # TF's depth_to_space differ in their output as the order of channels selected for combining
<add> # is a permutation of the other c.f.
<add> # https://stackoverflow.com/questions/68272502/tf-depth-to-space-not-same-as-torchs-pixelshuffle-when-output-channels-1
<add> permutation = tf.constant(
<add> [[i + j * block_size_squared for i in range(block_size_squared) for j in range(output_depth)]]
<add> )
<add> hidden_states = tf.gather(params=hidden_states, indices=tf.tile(permutation, [batch_size, 1]), batch_dims=-1)
<add> hidden_states = tf.nn.depth_to_space(hidden_states, block_size=self.upscale_factor, data_format="NHWC")
<add> return hidden_states
<ide>
<ide>
<ide> class TFSwinDecoder(tf.keras.layers.Layer):
<ide> def __init__(self, config: SwinConfig, **kwargs):
<ide> self.conv2d = tf.keras.layers.Conv2D(
<ide> filters=config.encoder_stride**2 * config.num_channels, kernel_size=1, strides=1, name="0"
<ide> )
<del> self._block_size = config.encoder_stride
<del> self.pixel_shuffle = PixelShuffle(self._block_size, name="1")
<add> self.pixel_shuffle = TFSwinPixelShuffle(config.encoder_stride, name="1")
<ide>
<ide> def call(self, x: tf.Tensor) -> tf.Tensor:
<ide> hidden_states = x
<ide> # B,C,H,W -> B,H,W,C
<ide> hidden_states = tf.transpose(hidden_states, (0, 2, 3, 1))
<ide> hidden_states = self.conv2d(hidden_states)
<del> batch_size, _, _, num_input_channels = shape_list(hidden_states)
<del> block_size_squared = self._block_size**2
<del> output_depth = int(num_input_channels / block_size_squared)
<del> # When the number of output channels >= 2, PyTorch's PixelShuffle and
<del> # TF's depth_to_space differ in their output as the order of channels selected for combining
<del> # is a permutation of the other c.f.
<del> # https://stackoverflow.com/questions/68272502/tf-depth-to-space-not-same-as-torchs-pixelshuffle-when-output-channels-1
<del> permutation = tf.constant(
<del> [[i + j * block_size_squared for i in range(block_size_squared) for j in range(output_depth)]]
<del> )
<del> hidden_states = tf.gather(params=hidden_states, indices=tf.tile(permutation, [batch_size, 1]), batch_dims=-1)
<ide> hidden_states = self.pixel_shuffle(hidden_states)
<ide> # B,H,W,C -> B,C,H,W
<ide> hidden_states = tf.transpose(hidden_states, (0, 3, 1, 2)) | 1 |
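The core of the layer above is the channel permutation that makes TF's `depth_to_space` match PyTorch's `PixelShuffle` when there is more than one output channel. The index expression can be checked in pure Python, with no TF or PyTorch required:

```python
def pixel_shuffle_permutation(num_input_channels, upscale_factor):
    """Channel permutation from the commit above:
    [i + j * r**2 for i in range(r**2) for j in range(output_depth)].
    """
    block_size_squared = upscale_factor ** 2
    output_depth = num_input_channels // block_size_squared
    return [i + j * block_size_squared
            for i in range(block_size_squared)
            for j in range(output_depth)]
```

For a single output channel the permutation is the identity, which matches the observation that the two ops only disagree when the number of output channels is >= 2.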
PHP | PHP | add interface class for form contexts | 9a54547ae19ef0544e39d8317d506eddcf73baf3 | <ide><path>src/View/Form/ArrayContext.php
<ide> * ]
<ide> * ];
<ide> */
<del>class ArrayContext {
<add>class ArrayContext implements ContextInterface {
<ide>
<ide> /**
<ide> * The request object.
<ide><path>src/View/Form/ContextInterface.php
<add><?php
<add>/**
<add> * CakePHP(tm) : Rapid Development Framework (http://cakephp.org)
<add> * Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> *
<add> * Licensed under The MIT License
<add> * For full copyright and license information, please see the LICENSE.txt
<add> * Redistributions of files must retain the above copyright notice.
<add> *
<add> * @copyright Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> * @link http://cakephp.org CakePHP(tm) Project
<add> * @since CakePHP(tm) v 3.0
<add> * @license http://www.opensource.org/licenses/mit-license.php MIT License
<add> */
<add>namespace Cake\View\Form;
<add>
<add>/**
<add> * Interface for FormHelper context implementations.
<add> */
<add>interface ContextInterface {
<add>
<add>/**
<add> * Get the fields used in the context as a primary key.
<add> *
<add> * @return array
<add> */
<add> public function primaryKey();
<add>
<add>/**
<add> * Returns whether or not this form is for a create operation.
<add> *
<add> * @return boolean
<add> */
<add> public function isCreate();
<add>
<add>/**
<add> * Get the current value for a given field.
<add> *
<add> * @param string $field A dot separated path to the field a value
<add> * is needed for.
<add> * @return mixed
<add> */
<add> public function val($field);
<add>
<add>/**
<add> * Check if a given field is 'required'.
<add> *
<add> * In this context class, this is simply defined by the 'required' array.
<add> *
<add> * @param string $field A dot separated path to check required-ness for.
<add> * @return boolean
<add> */
<add> public function isRequired($field);
<add>
<add>/**
<add> * Get the abstract field type for a given field name.
<add> *
<add> * @param string $field A dot separated path to get a schema type for.
<add> * @return null|string An abstract data type or null.
<add> * @see Cake\Database\Type
<add> */
<add> public function type($field);
<add>
<add>/**
<add> * Get an associative array of other attributes for a field name.
<add> *
<add> * @param string $field A dot separated path to get additional data on.
<add> * @return array An array of data describing the additional attributes on a field.
<add> */
<add> public function attributes($field);
<add>
<add>/**
<add> * Check whether or not a field has an error attached to it
<add> *
<add> * @param string $field A dot separated path to check errors on.
<add> * @return boolean Returns true if the errors for the field are not empty.
<add> */
<add> public function hasError($field);
<add>
<add>/**
<add> * Get the errors for a given field
<add> *
<add> * @param string $field A dot separated path to check errors on.
<add> * @return array An array of errors, an empty array will be returned when the
<add> * context has no errors.
<add> */
<add> public function error($field);
<add>
<add>}
<ide><path>src/View/Form/EntityContext.php
<ide> use Cake\ORM\TableRegistry;
<ide> use Cake\Utility\Inflector;
<ide> use Cake\Validation\Validator;
<add>use Cake\View\Form\ContextInterface;
<ide> use Traversable;
<ide>
<ide> /**
<ide> * Defaults to 'default'. Can be an array of table alias=>validators when
<ide> * dealing with associated forms.
<ide> */
<del>class EntityContext {
<add>class EntityContext implements ContextInterface {
<ide>
<ide> /**
<ide> * The request object.
<ide><path>src/View/Form/NullContext.php
<ide> namespace Cake\View\Form;
<ide>
<ide> use Cake\Network\Request;
<add>use Cake\View\Form\ContextInterface;
<ide>
<ide> /**
<ide> * Provides a context provider that does nothing.
<ide> *
<ide> * This context provider simply fulfils the interface requirements
<ide> * that FormHelper has and allows access to the request data.
<ide> */
<del>class NullContext {
<add>class NullContext implements ContextInterface {
<ide>
<ide> /**
<ide> * The request object.
<ide> public function __construct(Request $request, array $context) {
<ide> }
<ide>
<ide> /**
<del> * Get the fields used in the context as a primary key.
<del> *
<del> * @return array
<add> * {@inheritDoc}
<ide> */
<ide> public function primaryKey() {
<ide> return [];
<ide> }
<ide>
<ide> /**
<del> * Returns whether or not this form is for a create operation.
<del> *
<del> * @return boolean
<add> * {@inheritDoc}
<ide> */
<ide> public function isCreate() {
<ide> return true;
<ide> }
<ide>
<ide> /**
<del> * Get the current value for a given field.
<del> *
<del> * This method will coalesce the current request data and the 'defaults'
<del> * array.
<del> *
<del> * @param string $field A dot separated path to the field a value
<del> * is needed for.
<del> * @return mixed
<add> * {@inheritDoc}
<ide> */
<ide> public function val($field) {
<ide> return $this->_request->data($field);
<ide> }
<ide>
<ide> /**
<del> * Check if a given field is 'required'.
<del> *
<del> * In this context class, this is simply defined by the 'required' array.
<del> *
<del> * @param string $field A dot separated path to check required-ness for.
<del> * @return boolean
<add> * {@inheritDoc}
<ide> */
<ide> public function isRequired($field) {
<ide> return false;
<ide> }
<ide>
<ide> /**
<del> * Get the abstract field type for a given field name.
<del> *
<del> * @param string $field A dot separated path to get a schema type for.
<del> * @return null|string An abstract data type or null.
<del> * @see Cake\Database\Type
<add> * {@inheritDoc}
<ide> */
<ide> public function type($field) {
<ide> return null;
<ide> }
<ide>
<ide> /**
<del> * Get an associative array of other attributes for a field name.
<del> *
<del> * @param string $field A dot separated path to get additional data on.
<del> * @return array An array of data describing the additional attributes on a field.
<add> * {@inheritDoc}
<ide> */
<ide> public function attributes($field) {
<ide> return [];
<ide> }
<ide>
<ide> /**
<del> * Check whether or not a field has an error attached to it
<del> *
<del> * @param string $field A dot separated path to check errors on.
<del> * @return boolean Returns true if the errors for the field are not empty.
<add> * {@inheritDoc}
<ide> */
<ide> public function hasError($field) {
<ide> return false;
<ide> }
<ide>
<ide> /**
<del> * Get the errors for a given field
<del> *
<del> * @param string $field A dot separated path to check errors on.
<del> * @return array An array of errors, an empty array will be returned when the
<del> * context has no errors.
<add> * {@inheritDoc}
<ide> */
<ide> public function error($field) {
<ide> return []; | 4 |
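The PHP change above extracts an interface and has each context (including the do-nothing `NullContext`) implement it. The same shape in Python uses an abstract base class — an illustrative analogue, not CakePHP's API:

```python
import abc


class FormContext(abc.ABC):
    """Sketch of an interface like the PR's ContextInterface."""

    @abc.abstractmethod
    def val(self, field):
        """Return the current value for a dot-separated field path."""

    @abc.abstractmethod
    def is_required(self, field):
        """Return whether the field is required."""


class NullContext(FormContext):
    """Fulfils the interface while doing nothing, like NullContext."""

    def val(self, field):
        return None

    def is_required(self, field):
        return False
```

As in the PHP version, the interface documents the contract once, and concrete contexts only restate behavior that differs.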
Text | Text | use aix instead of aix in fs.md | 431d3750544fe10006e10d89e4c68576ffe5466c | <ide><path>doc/api/fs.md
<ide> to be notified of filesystem changes.
<ide> directories.
<ide> * On SunOS systems (including Solaris and SmartOS), this uses [`event ports`][].
<ide> * On Windows systems, this feature depends on [`ReadDirectoryChangesW`][].
<del>* On Aix systems, this feature depends on [`AHAFS`][], which must be enabled.
<add>* On AIX systems, this feature depends on [`AHAFS`][], which must be enabled.
<ide> * On IBM i systems, this feature is not supported.
<ide>
<ide> If the underlying functionality is not available for some reason, then | 1 |
Javascript | Javascript | add ripgrep pcre2 support | ef7b910ed0b30edd83d69396e7398124e99118f1 | <ide><path>spec/workspace-spec.js
<ide> describe('Workspace', () => {
<ide> });
<ide> });
<ide> });
<add> describe('pcre2 enabled', async () => {
<add> it('supports lookbehind searches', async () => {
<add> const results = [];
<add>
<add> await scan(/(?<!a)aa\b/, { PCRE2: true }, result =>
<add> results.push(result)
<add> );
<add>
<add> expect(results.length).toBe(1);
<add> const { filePath, matches } = results[0];
<add> expect(filePath).toBe(
<add> atom.project.getDirectories()[0].resolve('a')
<add> );
<add> expect(matches).toHaveLength(1);
<add> expect(matches[0]).toEqual({
<add> matchText: 'aa',
<add> lineText: 'cc aa cc',
<add> lineTextOffset: 0,
<add> range: [[1, 3], [1, 5]],
<add> leadingContextLines: [],
<add> trailingContextLines: []
<add> });
<add> });
<add> });
<ide> }
<ide>
<ide> it('returns results on lines with unicode strings', async () => {
<ide><path>src/ripgrep-directory-searcher.js
<ide> module.exports = class RipgrepDirectorySearcher {
<ide> args.push('--no-ignore-vcs');
<ide> }
<ide>
<add> if (options.PCRE2) {
<add> args.push('--pcre2');
<add> }
<add>
<ide> args.push('.');
<ide>
<ide> const child = spawn(this.rgPath, args, {
<ide><path>src/workspace.js
<ide> module.exports = class Workspace extends Model {
<ide> follow: this.config.get('core.followSymlinks'),
<ide> leadingContextLineCount: options.leadingContextLineCount || 0,
<ide> trailingContextLineCount: options.trailingContextLineCount || 0,
<add> PCRE2: options.PCRE2,
<ide> didMatch: result => {
<ide> if (!this.project.isPathModified(result.filePath)) {
<ide> return iterator(result); | 3 |
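The new spec above exercises a lookbehind (`(?<!a)aa\b`), which is exactly the kind of pattern that needs ripgrep's `--pcre2` flag. Python's `re` supports fixed-width lookbehind, so the same pattern can be sanity-checked directly:

```python
import re

# The pattern from the spec above: "aa" as a whole word,
# not preceded by another "a".
pattern = re.compile(r"(?<!a)aa\b")


def find_matches(text):
    """Return (match_text, start, end) tuples for the pattern."""
    return [(m.group(), m.start(), m.end()) for m in pattern.finditer(text)]
```

The spec's fixture line `cc aa cc` yields a single match spanning columns 3-5, matching the asserted range `[[1, 3], [1, 5]]`.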
Python | Python | fix the 108th issue | 6fca662c841797a2006999c699a6738b20af1d4e | <ide><path>flask/module.py
<ide> def _register(state):
<ide> path = state.url_prefix + path
<ide> state.app.add_url_rule(path + '/<path:filename>',
<ide> endpoint='%s.static' % module.name,
<del> view_func=module.send_static_file)
<add> view_func=module.send_static_file,
<add> subdomain=module.subdomain)
<ide> return _register
<ide>
<ide> | 1 |
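The fix above defends against a mass-assignment (binding) attack by overwriting privileged flags server-side regardless of what the client sent. The pattern is language-agnostic; a Python sketch with illustrative field names:

```python
# Privileged fields a client must never set; the server decides these.
PRIVILEGED_DEFAULTS = {"is_paid": False, "is_approved": False}


def sanitize_job(payload):
    """Overwrite privileged flags with safe defaults before persisting,
    as the Object.assign call in the commit above does."""
    job = dict(payload)
    job.update(PRIVILEGED_DEFAULTS)
    return job
```

An allow-list of writable fields is the stricter alternative; the overwrite approach shown here only neutralizes the known-privileged keys.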
Python | Python | guard the unsafe tf.log to prevent nan | bb6f092ae5702d50a4f55a8448a6e40144e91029 | <ide><path>research/video_prediction/prediction_train.py
<ide> # How often to save a model checkpoint
<ide> SAVE_INTERVAL = 2000
<ide>
<add># EPSILON to avoid NAN
<add>EPSILON = 1e-9
<add>
<ide> # tf record data location:
<ide> DATA_DIR = 'push/push_train'
<ide>
<ide> def peak_signal_to_noise_ratio(true, pred):
<ide> Returns:
<ide> peak signal to noise ratio (PSNR)
<ide> """
<del> return 10.0 * tf.log(1.0 / mean_squared_error(true, pred)) / tf.log(10.0)
<add> return 10.0 * (- tf.log(tf.maximum(mean_squared_error(true, pred), EPSILON))) / tf.log(10.0)
<ide>
<ide>
<ide> def mean_squared_error(true, pred): | 1 |
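The guard above clamps the MSE away from zero before the log, so PSNR stays finite when the prediction exactly matches the target. The same arithmetic in pure Python (the TF version computes log base 10 as `tf.log(x) / tf.log(10.0)`):

```python
import math

EPSILON = 1e-9


def psnr_from_mse(mse):
    """PSNR for signals in [0, 1].

    Clamping mse at EPSILON avoids log(0) -> -inf / NaN when the
    prediction equals the ground truth, as in the commit above.
    """
    return 10.0 * -math.log10(max(mse, EPSILON))
```

With this clamp, a perfect reconstruction reports a large but finite PSNR (about 90 dB for EPSILON = 1e-9) instead of propagating NaN into the training loss.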
Ruby | Ruby | fix rubocop warnings | e24a890e593d665e8f4bad2588879602e3918a87 | <ide><path>Library/Homebrew/test/test_tab.rb
<ide> def setup
<ide> "stable" => "0.10",
<ide> "devel" => "0.14",
<ide> "head" => "HEAD-1111111",
<del> }
<add> },
<ide> })
<ide> end
<ide> | 1 |
Text | Text | tidy active record changelog. [ci skip] | db56c0fcf5c94fd746a41e2ae278b0bdb6fb1d9d | <ide><path>activerecord/CHANGELOG.md
<ide> *Yves Senn*
<ide>
<ide> * Fix uninitialized constant `TransactionState` error when `Marshall.load` is used on an Active Record result.
<del> Fixes #12790
<add>
<add> Fixes #12790.
<ide>
<ide> *Jason Ayre*
<ide>
<ide> *Severin Schoepke*
<ide>
<ide> * `ActiveRecord::Store` works together with PG `hstore` columns.
<add>
<ide> Fixes #12452.
<ide>
<ide> *Yves Senn* | 1 |
Go | Go | return error on client redirect | eb36d6021618f788012c166a533f1b321cda9695 | <ide><path>client/client.go
<ide> For example, to list running containers (the equivalent of "docker ps"):
<ide> package client
<ide>
<ide> import (
<add> "errors"
<ide> "fmt"
<ide> "net/http"
<ide> "net/url"
<ide> import (
<ide> "github.com/docker/go-connections/tlsconfig"
<ide> )
<ide>
<add>// ErrRedirect is the error returned by CheckRedirect when the request is non-GET.
<add>var ErrRedirect = errors.New("unexpected redirect in response")
<add>
<ide> // Client is the API client that performs all operations
<ide> // against a docker server.
<ide> type Client struct {
<ide> type Client struct {
<ide> manualOverride bool
<ide> }
<ide>
<add>// CheckRedirect specifies the policy for dealing with redirect responses:
<add>// If the request is non-GET return `ErrRedirect`. Otherwise use the last response.
<add>//
<add>// Go 1.8 changes behavior for HTTP redirects (specifically 301, 307, and 308) in the client.
<add>// The Docker client (and by extension the Docker API client) can be made to send a request
<add>// like POST /containers//start where what would normally be in the name section of the URL is empty.
<add>// This triggers an HTTP 301 from the daemon.
<add>// In Go 1.8 this 301 will be converted to a GET request, which ends up getting a 404 from the daemon.
<add>// This behavior change manifests in the client: previously the 301 was not followed and
<add>// the client did not generate an error, but it now results in a message like "Error response from daemon: page not found".
<add>func CheckRedirect(req *http.Request, via []*http.Request) error {
<add> if via[0].Method == http.MethodGet {
<add> return http.ErrUseLastResponse
<add> }
<add> return ErrRedirect
<add>}
<add>
<ide> // NewEnvClient initializes a new API client based on environment variables.
<ide> // Use DOCKER_HOST to set the url to the docker server.
<ide> // Use DOCKER_API_VERSION to set the version of the API to reach, leave empty for latest.
<ide> func NewEnvClient() (*Client, error) {
<ide> Transport: &http.Transport{
<ide> TLSClientConfig: tlsc,
<ide> },
<add> CheckRedirect: CheckRedirect,
<ide> }
<ide> }
<ide>
<ide> func NewClient(host string, version string, client *http.Client, httpHeaders map
<ide> transport := new(http.Transport)
<ide> sockets.ConfigureTransport(transport, proto, addr)
<ide> client = &http.Client{
<del> Transport: transport,
<add> Transport: transport,
<add> CheckRedirect: CheckRedirect,
<ide> }
<ide> }
<ide>
<ide><path>client/client_test.go
<ide> import (
<ide>
<ide> "github.com/docker/docker/api"
<ide> "github.com/docker/docker/api/types"
<add> "github.com/stretchr/testify/assert"
<ide> "golang.org/x/net/context"
<ide> )
<ide>
<ide> func TestNewEnvClientSetsDefaultVersion(t *testing.T) {
<ide> os.Setenv(key, envVarValues[key])
<ide> }
<ide> }
<add>
<add>type roundTripFunc func(*http.Request) (*http.Response, error)
<add>
<add>func (rtf roundTripFunc) RoundTrip(req *http.Request) (*http.Response, error) {
<add> return rtf(req)
<add>}
<add>
<add>type bytesBufferClose struct {
<add> *bytes.Buffer
<add>}
<add>
<add>func (bbc bytesBufferClose) Close() error {
<add> return nil
<add>}
<add>
<add>func TestClientRedirect(t *testing.T) {
<add> client := &http.Client{
<add> CheckRedirect: CheckRedirect,
<add> Transport: roundTripFunc(func(req *http.Request) (*http.Response, error) {
<add> if req.URL.String() == "/bla" {
<add> return &http.Response{StatusCode: 404}, nil
<add> }
<add> return &http.Response{
<add> StatusCode: 301,
<add> Header: map[string][]string{"Location": {"/bla"}},
<add> Body: bytesBufferClose{bytes.NewBuffer(nil)},
<add> }, nil
<add> }),
<add> }
<add>
<add> cases := []struct {
<add> httpMethod string
<add> expectedErr error
<add> statusCode int
<add> }{
<add> {http.MethodGet, nil, 301},
<add> {http.MethodPost, &url.Error{Op: "Post", URL: "/bla", Err: ErrRedirect}, 301},
<add> {http.MethodPut, &url.Error{Op: "Put", URL: "/bla", Err: ErrRedirect}, 301},
<add> {http.MethodDelete, &url.Error{Op: "Delete", URL: "/bla", Err: ErrRedirect}, 301},
<add> }
<add>
<add> for _, tc := range cases {
<add> req, err := http.NewRequest(tc.httpMethod, "/redirectme", nil)
<add> assert.NoError(t, err)
<add> resp, err := client.Do(req)
<add> assert.Equal(t, tc.expectedErr, err)
<add> assert.Equal(t, tc.statusCode, resp.StatusCode)
<add> }
<add>} | 2 |
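The policy in the Go commit above — never follow a redirect, but only treat it as an error for non-GET requests — boils down to a small decision function. A dependency-free Python sketch of just that decision (the real version hooks `http.Client.CheckRedirect`; names here are illustrative):

```python
USE_LAST_RESPONSE = "use-last-response"


class RedirectError(Exception):
    """Raised for unexpected redirects on non-GET requests."""


def check_redirect(original_method):
    """Mirror the Go CheckRedirect policy: a GET keeps the redirect
    response as-is; any other method is an unexpected redirect."""
    if original_method == "GET":
        return USE_LAST_RESPONSE
    raise RedirectError("unexpected redirect in response")
```

This matches the table in the Go test: GET surfaces the 301 response, while POST/PUT/DELETE surface a redirect error instead of being silently rewritten to GET.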
Javascript | Javascript | prevent binding attacks | e1a398597ddf6910e6a6f01fe90ba688d941cd61 | <ide><path>server/services/job.js
<ide> export default function getJobServices(app) {
<ide> if (!job) {
<ide> return cb(new Error('job creation should get a job object'));
<ide> }
<add>
<add> Object.assign(job, {
<add> isPaid: false,
<add> isApproved: false
<add> });
<add>
<ide> Job.create(job, (err, savedJob) => {
<ide> cb(err, savedJob);
<ide> }); | 1 |
Ruby | Ruby | fix formula paths for manually specified names | dafe97b047cece6d12c5fa4302ae5966bbe12939 | <ide><path>Library/Homebrew/cmd/create.rb
<ide> def create
<ide> path = Pathname.new url
<ide> print "Formula name [#{path.stem}]: "
<ide> fc.name = __gets || path.stem
<add> fc.path = Formula.path fc.name
<ide> end
<ide>
<ide> unless ARGV.force?
<ide> class FormulaCreator
<ide> attr :url
<ide> attr :md5
<ide> attr :name, true
<del> attr :path
<add> attr :path, true
<ide> attr :mode, true
<ide>
<ide> def url= url | 1 |
Go | Go | add test for invalid mount mode for volumes in | b10b458b6ef8327268676744a7c3230e33c9baf6 | <ide><path>integration-cli/docker_cli_run_test.go
<ide> func TestRunVolumesFromInReadWriteMode(t *testing.T) {
<ide> }
<ide>
<ide> cmd = exec.Command(dockerBinary, "run", "--volumes-from", "parent:rw", "busybox", "touch", "/test/file")
<del> if _, err := runCommand(cmd); err != nil {
<del> t.Fatal(err)
<add> if out, _, err := runCommandWithOutput(cmd); err != nil {
<add> t.Fatalf("running --volumes-from parent:rw failed with output: %q\nerror: %v", out, err)
<add> }
<add>
<add> cmd = exec.Command(dockerBinary, "run", "--volumes-from", "parent:bar", "busybox", "touch", "/test/file")
<add> if out, _, err := runCommandWithOutput(cmd); err == nil || !strings.Contains(out, "Invalid mode for volumes-from: bar") {
<add> t.Fatalf("running --volumes-from foo:bar should have failed with invalid mount mode: %q", out)
<add> }
<add>
<add> cmd = exec.Command(dockerBinary, "run", "--volumes-from", "parent", "busybox", "touch", "/test/file")
<add> if out, _, err := runCommandWithOutput(cmd); err != nil {
<add> t.Fatalf("running --volumes-from parent failed with output: %q\nerror: %v", out, err)
<ide> }
<ide>
<ide> deleteAllContainers() | 1 |
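The test above exercises parsing of `--volumes-from container[:mode]`, including the rejection of unknown modes. An illustrative sketch of that validation — not Docker's actual implementation:

```python
VALID_MODES = {"rw", "ro"}


def parse_volumes_from(spec):
    """Split 'container[:mode]', defaulting to rw and rejecting
    unknown modes, as the daemon error in the test above suggests."""
    name, sep, mode = spec.partition(":")
    if not sep:
        return name, "rw"
    if mode not in VALID_MODES:
        raise ValueError(f"Invalid mode for volumes-from: {mode}")
    return name, mode
```

This covers the three cases the test asserts: `parent:rw` succeeds, bare `parent` succeeds with the default mode, and `parent:bar` fails with the invalid-mode message.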
Javascript | Javascript | handle challenge creation | 1db9a123ae18415f7003dbf308ed6cf4dc3bab6d | <ide><path>client/plugins/fcc-source-challenges/gatsby-node.js
<add>const path = require('path');
<ide> const chokidar = require('chokidar');
<add>const readdirp = require('readdirp');
<ide>
<ide> const { createChallengeNode } = require('./create-challenge-nodes');
<ide>
<ide> exports.sourceNodes = function sourceChallengesSourceNodes(
<ide> const { createNode } = actions;
<ide> const watcher = chokidar.watch(curriculumPath, {
<ide> ignored: /(^|[\/\\])\../,
<add> ignoreInitial: true,
<ide> persistent: true,
<ide> usePolling: true,
<ide> cwd: curriculumPath
<ide> File changed at ${filePath}, replacing challengeNode id ${challenge.id}
<ide> : null
<ide> );
<ide>
<add> // if a file is added, that might change the order of the challenges in the
<add> // containing block, so we recreate them all
<add> watcher.on('add', filePath => {
<add> if (/\.md?$/.test(filePath)) {
<add> const blockPath = path.dirname(filePath);
<add> const fullBlockPath = path.join(
<add> __dirname,
<add> '../../../curriculum/challenges/english/',
<add> blockPath
<add> );
<add> readdirp(fullBlockPath, { fileFilter: '*.md' })
<add> .on('data', entry => {
<add> const { path: siblingPath } = entry;
<add> const relativePath = path.join(blockPath, siblingPath);
<add> onSourceChange(relativePath)
<add> .then(challenge => {
<add> reporter.info(
<add> `
<add>File changed at ${relativePath}, replacing challengeNode id ${challenge.id}
<add> `
<add> );
<add> createVisibleChallenge(challenge);
<add> })
<add> .catch(e =>
<add> reporter.error(`fcc-replace-challenge
<add>attempting to replace ${relativePath}
<add>
<add>${e.message}
<add>
<add>`)
<add> );
<add> })
<add> .on('warn', error => console.error('non-fatal error', error))
<add> .on('error', error => console.error('fatal error', error));
<add> }
<add> });
<add>
<ide> function sourceAndCreateNodes() {
<ide> return source()
<ide> .then(challenges => Promise.all(challenges))
<ide><path>client/utils/buildChallenges.js
<ide> const _ = require('lodash');
<add>const path = require('path');
<ide>
<ide> const {
<ide> getChallengesForLang,
<ide> exports.localeChallengesRootDir = getChallengesDirForLang(curriculumLocale);
<ide>
<ide> exports.replaceChallengeNode = () => {
<ide> return async function replaceChallengeNode(filePath) {
<del> return await createChallenge(challengesDir, filePath, curriculumLocale);
<add> // get the meta so that challengeOrder is accurate
<add> const blockNameRe = /\d\d-[-\w]+\/([^/]+)\//;
<add> const blockName = filePath.match(blockNameRe)[1];
<add> const metaPath = path.resolve(
<add> __dirname,
<add> `../../curriculum/challenges/_meta/${blockName}/meta.json`
<add> );
<add> delete require.cache[require.resolve(metaPath)];
<add> const meta = require(metaPath);
<add> return await createChallenge(
<add> challengesDir,
<add> filePath,
<add> curriculumLocale,
<add> meta
<add> );
<ide> };
<ide> };
<ide> | 2 |
Text | Text | improve http.setheader and getheader typeinfo | 50e9f8df62e2d8a7f05ec6c2fbd01528a96f3468 | <ide><path>doc/api/http.md
<ide> added: v1.6.0
<ide> -->
<ide>
<ide> * `name` {string}
<del>* Returns: {string}
<add>* Returns: {any}
<ide>
<ide> Reads out a header on the request. Note that the name is case insensitive.
<add>The type of the return value depends on the arguments provided to
<add>[`request.setHeader()`][].
<ide>
<ide> Example:
<ide> ```js
<add>request.setHeader('content-type', 'text/html');
<add>request.setHeader('Content-Length', Buffer.byteLength(body));
<add>request.setHeader('Set-Cookie', ['type=ninja', 'language=javascript']);
<ide> const contentType = request.getHeader('Content-Type');
<add>// contentType is 'text/html'
<add>const contentLength = request.getHeader('Content-Length');
<add>// contentLength is of type number
<add>const setCookie = request.getHeader('set-cookie');
<add>// setCookie is of type string[]
<add>
<ide> ```
<ide>
<ide> ### request.removeHeader(name)
<ide> added: v1.6.0
<ide> -->
<ide>
<ide> * `name` {string}
<del>* `value` {string}
<add>* `value` {any}
<ide>
<ide> Sets a single header value for headers object. If this header already exists in
<ide> the to-be-sent headers, its value will be replaced. Use an array of strings
<del>here to send multiple headers with the same name.
<add>here to send multiple headers with the same name. Non-string values will be
<add>stored without modification. Therefore, [`request.getHeader()`][] may return
<add>non-string values. However, the non-string values will be converted to strings
<add>for network transmission.
<ide>
<ide> Example:
<ide> ```js
<ide> added: v0.4.0
<ide> -->
<ide>
<ide> * `name` {string}
<del>* Returns: {string}
<add>* Returns: {any}
<ide>
<ide> Reads out a header that's already been queued but not sent to the client.
<del>Note that the name is case insensitive.
<add>Note that the name is case insensitive. The type of the return value depends
<add>on the arguments provided to [`response.setHeader()`][].
<ide>
<ide> Example:
<ide>
<ide> ```js
<add>response.setHeader('Content-Type', 'text/html');
<add>response.setHeader('Content-Length', Buffer.byteLength(body));
<add>response.setHeader('Set-Cookie', ['type=ninja', 'language=javascript']);
<ide> const contentType = response.getHeader('content-type');
<add>// contentType is 'text/html'
<add>const contentLength = response.getHeader('Content-Length');
<add>// contentLength is of type number
<add>const setCookie = response.getHeader('set-cookie');
<add>// setCookie is of type string[]
<ide> ```
<ide>
<ide> ### response.getHeaderNames()
<ide> added: v0.4.0
<ide> -->
<ide>
<ide> * `name` {string}
<del>* `value` {string | string[]}
<add>* `value` {any}
<ide>
<ide> Sets a single header value for implicit headers. If this header already exists
<ide> in the to-be-sent headers, its value will be replaced. Use an array of strings
<del>here to send multiple headers with the same name.
<add>here to send multiple headers with the same name. Non-string values will be
<add>stored without modification. Therefore, [`response.getHeader()`][] may return
<add>non-string values. However, the non-string values will be converted to strings
<add>for network transmission.
<ide>
<ide> Example:
<ide>
<ide> not abort the request or do anything besides add a `'timeout'` event.
<ide> [`net.createConnection()`]: net.html#net_net_createconnection_options_connectlistener
<ide> [`removeHeader(name)`]: #http_request_removeheader_name
<ide> [`request.end()`]: #http_request_end_data_encoding_callback
<add>[`request.getHeader()`]: #http_request_getheader_name
<add>[`request.setHeader()`]: #http_request_setheader_name_value
<ide> [`request.setTimeout()`]: #http_request_settimeout_timeout_callback
<ide> [`request.socket`]: #http_request_socket
<ide> [`request.socket.getPeerCertificate()`]: tls.html#tls_tlssocket_getpeercertificate_detailed
<ide> [`request.write(data, encoding)`]: #http_request_write_chunk_encoding_callback
<ide> [`response.end()`]: #http_response_end_data_encoding_callback
<add>[`response.getHeader()`]: #http_response_getheader_name
<ide> [`response.setHeader()`]: #http_response_setheader_name_value
<ide> [`response.socket`]: #http_response_socket
<ide> [`response.write()`]: #http_response_write_chunk_encoding_callback | 1 |
Text | Text | update minimum xcode version for macos | 9f830f37daf55a2c903153e1681873a8cd61baee | <ide><path>BUILDING.md
<ide> Depending on the host platform, the selection of toolchains may vary.
<ide> | ---------------- | -------------------------------------------------------------- |
<ide> | Linux | GCC >= 6.3 |
<ide> | Windows | Visual Studio >= 2017 with the Windows 10 SDK on a 64-bit host |
<del>| macOS | Xcode >= 8 (Apple LLVM >= 8) |
<add>| macOS | Xcode >= 10 (Apple LLVM >= 10) |
<ide>
<ide> ### Official binary platforms and toolchains
<ide>
<ide> Binaries at <https://nodejs.org/download/release/> are produced on:
<ide> | Binary package | Platform and Toolchain |
<ide> | --------------------- | ------------------------------------------------------------------------ |
<ide> | aix-ppc64 | AIX 7.1 TL05 on PPC64BE with GCC 6 |
<del>| darwin-x64 (and .pkg) | macOS 10.11, Xcode Command Line Tools 8 with -mmacosx-version-min=10.10 |
<add>| darwin-x64 (and .pkg) | macOS 10.11, Xcode Command Line Tools 10 with -mmacosx-version-min=10.10 |
<ide> | linux-arm64 | CentOS 7 with devtoolset-6 / GCC 6 |
<ide> | linux-armv7l | Cross-compiled on Ubuntu 16.04 x64 with [custom GCC toolchain](https://github.com/rvagg/rpi-newer-crosstools) |
<ide> | linux-ppc64le | CentOS 7 with devtoolset-6 / GCC 6 <sup>[7](#fn7)</sup> |
<ide> FreeBSD and OpenBSD users may also need to install `libexecinfo`.
<ide>
<ide> #### macOS prerequisites
<ide>
<del>* Xcode Command Line Tools >= 8 for macOS
<add>* Xcode Command Line Tools >= 10 for macOS
<ide> * Python (see note above)
<ide> * Python 2.7
<ide> * Python 3.5, 3.6, and 3.7 are experimental. | 1 |
Text | Text | harmonize shell commands in dev guide [ci skip] | 852b6886690e5ff68b78fae23d678ee2ba75d92b | <ide><path>guides/source/development_dependencies_install.md
<ide> prerequisite for installing this package manager is that
<ide> On macOS, you can run:
<ide>
<ide> ```bash
<del>brew install yarn
<add>$ brew install yarn
<ide> ```
<ide>
<ide> On Ubuntu, you can run:
<ide>
<ide> ```bash
<del>curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
<del>echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list
<add>$ curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
<add>$ echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list
<ide>
<del>sudo apt-get update && sudo apt-get install yarn
<add>$ sudo apt-get update && sudo apt-get install yarn
<ide> ```
<ide>
<ide> On Fedora or CentOS, just run:
<ide>
<ide> ```bash
<del>sudo wget https://dl.yarnpkg.com/rpm/yarn.repo -O /etc/yum.repos.d/yarn.repo
<add>$ sudo wget https://dl.yarnpkg.com/rpm/yarn.repo -O /etc/yum.repos.d/yarn.repo
<ide>
<del>sudo yum install yarn
<add>$ sudo yum install yarn
<ide> ```
<ide>
<ide> Finally, after installing Yarn, you will need to run the following
<ide> command inside of the `activestorage` directory to install the dependencies:
<ide>
<ide> ```bash
<del>yarn install
<add>$ yarn install
<ide> ```
<ide>
<ide> Extracting previews, tested in Active Storage's test suite requires third-party | 1 |
Ruby | Ruby | add permit_weak_imports directive | 9c7f24b84aea4ae1b465afa41445385ada4f2ffb | <ide><path>Library/Homebrew/extend/ENV/shared.rb
<ide> def warn_about_non_apple_gcc(name)
<ide>
<ide> def permit_arch_flags; end
<ide>
<add> def permit_weak_imports; end
<add>
<ide> private
<ide>
<ide> def cc=(val)
<ide><path>Library/Homebrew/extend/os/mac/extend/ENV/std.rb
<ide> def x11
<ide>
<ide> append "CFLAGS", "-I#{MacOS::X11.include}" unless MacOS::CLT.installed?
<ide> end
<add>
<add> def permit_weak_imports
<add> remove "LDFLAGS", "-Wl,-no_weak_imports"
<add> end
<ide> end
<ide><path>Library/Homebrew/extend/os/mac/extend/ENV/super.rb
<ide> def set_x11_env_if_installed
<ide> ENV.x11 = MacOS::X11.installed?
<ide> end
<ide>
<add> def permit_weak_imports
<add> remove "HOMEBREW_CCCFG", "w"
<add> end
<add>
<ide> # These methods are no longer necessary under superenv, but are needed to
<ide> # maintain an interface compatible with stdenv.
<ide> alias_method :macosxsdk, :noop | 3 |
Python | Python | add check_svml_submodule function | e9b9299c677b4e644b8adccd6d4695451969ee01 | <ide><path>numpy/core/setup.py
<ide> def can_link_svml():
<ide> system = platform.system()
<ide> return "x86_64" in machine and system == "Linux"
<ide>
<add>def check_svml_submodule(svmlpath):
<add> if not os.path.exists(svmlpath + "/README.md"):
<add> raise RuntimeError("Missing `SVML` submodule! Run `git submodule "
<add> "update --init` to fix this.")
<add>
<ide> def pythonlib_dir():
<ide> """return path where libpython* is."""
<ide> if sys.platform == 'win32':
<ide> def generate_umath_c(ext, build_dir):
<ide> join(codegen_dir, 'generate_ufunc_api.py'),
<ide> ]
<ide>
<add> svml_path = "numpy/core/src/umath/svml"
<ide> svml_objs = []
<del> if can_link_svml():
<del> svml_objs = files = glob.glob("numpy/core/src/umath/svml" +
<del> '/**/*.s', recursive=True)
<add> if can_link_svml() and check_svml_submodule(svml_path):
<add> svml_objs = files = glob.glob(svml_path + '/**/*.s', recursive=True)
<ide>
<ide> config.add_extension('_multiarray_umath',
<ide> sources=multiarray_src + umath_src + | 1 |
PHP | PHP | fix incorrect comment | 13f00c8e26a792da3817f47c07e1e5f48cd84b0f | <ide><path>src/Illuminate/Http/FrameGuard.php
<ide> class FrameGuard implements HttpKernelInterface {
<ide> protected $app;
<ide>
<ide> /**
<del> * Create a new CookieQueue instance.
<add> * Create a new FrameGuard instance.
<ide> *
<ide> * @param \Symfony\Component\HttpKernel\HttpKernelInterface $app
<ide> * @return void | 1 |
Javascript | Javascript | apply unit conversion to position as well | b9afd9865de494fce9638ef612aeab0306544a07 | <ide><path>examples/js/loaders/ColladaLoader.js
<ide> THREE.ColladaLoader = function () {
<ide> obj.scale = props[ 2 ];
<ide>
<ide> // unit conversion
<add> obj.position.multiplyScalar(colladaUnit);
<ide> obj.scale.multiplyScalar(colladaUnit);
<ide>
<ide> if ( options.centerGeometry && obj.geometry ) { | 1 |
Javascript | Javascript | remove unused option in `fs.fstatsync()` | ac8ab0ca0b4c7a9bc65c761273afe12d1e6f873e | <ide><path>lib/fs.js
<ide> function hasNoEntryError(ctx) {
<ide> * @param {number} fd
<ide> * @param {{
<ide> * bigint?: boolean;
<del> * throwIfNoEntry?: boolean;
<ide> * }} [options]
<ide> * @returns {Stats}
<ide> */
<del>function fstatSync(fd, options = { bigint: false, throwIfNoEntry: true }) {
<add>function fstatSync(fd, options = { bigint: false }) {
<ide> fd = getValidatedFd(fd);
<ide> const ctx = { fd };
<ide> const stats = binding.fstat(fd, options.bigint, undefined, ctx); | 1 |
Text | Text | fix argument order for `brew create` | 6f908880d88febe5e3a0afbb23564c74f0800f4e | <ide><path>Library/Contributions/manpages/brew.1.md
<ide> Note that these flags should only appear after a command.
<ide> versions of formula. Note downloads for any installed formula will still not be
<ide> deleted. If you want to delete those too: `rm -rf $(brew --cache)`
<ide>
<del> * `create [--autotools|--cmake] [--no-fetch] [--set-name <name>] [--set-version <version>]` <URL>:
<add> * `create <URL> [--autotools|--cmake] [--no-fetch] [--set-name <name>] [--set-version <version>]`:
<ide> Generate a formula for the downloadable file at <URL> and open it in the editor.
<ide> Homebrew will attempt to automatically derive the formula name
<ide> and version, but if it fails, you'll have to make your own template. The wget | 1 |
Ruby | Ruby | add method to find xquartz version | 65567eb55fc97925b023737456b1e989d48d68f2 | <ide><path>Library/Homebrew/macos.rb
<ide> module MacOS extend self
<ide> XCODE_3_BUNDLE_ID = "com.apple.Xcode"
<ide> CLT_STANDALONE_PKG_ID = "com.apple.pkg.DeveloperToolsCLILeo"
<ide> CLT_FROM_XCODE_PKG_ID = "com.apple.pkg.DeveloperToolsCLI"
<add> APPLE_X11_BUNDLE_ID = "org.x.X11"
<add> XQUARTZ_BUNDLE_ID = "org.macosforge.xquartz.X11"
<ide>
<ide> def version
<ide> MACOS_VERSION
<ide> def clang_build_version
<ide> end
<ide> end
<ide>
<add> def xquartz_version
<add> # This returns the version number of XQuartz, not of the upstream X.org
<add> # (which is why it is not called x11_version). Note that the X11.app
<add> # distributed by Apple is also XQuartz, and therefore covered by this method.
<add> path = app_with_bundle_id(XQUARTZ_BUNDLE_ID) or app_with_bundle_id(APPLE_X11_BUNDLE_ID)
<add> version = if not path.nil? and path.exist?
<add> `mdls -raw -name kMDItemVersion #{path}`.strip
<add> end
<add> end
<add>
<ide> def x11_prefix
<ide> @x11_prefix ||= if Pathname.new('/opt/X11/lib/libpng.dylib').exist?
<ide> Pathname.new('/opt/X11') | 1 |
Python | Python | add test for loading languages | b012ae3044670016efb2c35b3e8a745a19d80ff0 | <ide><path>spacy/tests/integration/test_load_languages.py
<add># encoding: utf8
<add>from __future__ import unicode_literals
<add>from ...fr import French
<add>
<add>def test_load_french():
<add> nlp = French()
<add> doc = nlp(u'Parlez-vous français?') | 1 |
Javascript | Javascript | fix initial animations | c439d816fac9625cde848300f745d29acdfaf469 | <ide><path>src/core/core.controller.js
<ide> class Chart {
<ide> this.$plugins = undefined;
<ide> this.$proxies = {};
<ide> this._hiddenIndices = {};
<del> this.attached = true;
<add> this.attached = false;
<ide>
<ide> // Add the chart instance to the global namespace
<ide> Chart.instances[me.id] = me;
<ide> class Chart {
<ide> retinaScale(me, newRatio);
<ide>
<ide> if (!silent) {
<del> // Notify any plugins about the resize
<ide> plugins.notify(me, 'resize', [newSize]);
<ide>
<del> // Notify of resize
<del> if (options.onResize) {
<del> options.onResize(me, newSize);
<del> }
<add> callCallback(options.onResize, [newSize], me);
<ide>
<del> // Only apply 'resize' mode if we are attached, else do a regular update.
<del> me.update(me.attached && 'resize');
<add> if (me.attached) {
<add> me.update('resize');
<add> }
<ide> }
<ide> }
<ide> | 1 |
Javascript | Javascript | add progress indicator to compare.js | 60d77bd514d3dc65cfbb64ebb8ae1f364e8bf8eb | <ide><path>benchmark/_benchmark_progress.js
<add>'use strict';
<add>
<add>const readline = require('readline');
<add>
<add>function pad(input, minLength, fill) {
<add> var result = input + '';
<add> return fill.repeat(Math.max(0, minLength - result.length)) + result;
<add>}
<add>
<add>function fraction(numerator, denominator) {
<add> const fdenominator = denominator + '';
<add> const fnumerator = pad(numerator, fdenominator.length, ' ');
<add> return `${fnumerator}/${fdenominator}`;
<add>}
<add>
<add>function getTime(diff) {
<add> const time = Math.ceil(diff[0] + diff[1] / 1e9);
<add> const seconds = pad(time % 60, 2, '0');
<add> const minutes = pad(Math.floor(time / 60) % (60 * 60), 2, '0');
<add> const hours = pad(Math.floor(time / (60 * 60)), 2, '0');
<add> return `${hours}:${minutes}:${seconds}`;
<add>}
<add>
<add>// A run is an item in the job queue: { binary, filename, iter }
<add>// A config is an item in the subqueue: { binary, filename, iter, configs }
<add>class BenchmarkProgress {
<add> constructor(queue, benchmarks) {
<add> this.queue = queue; // Scheduled runs.
<add> this.benchmarks = benchmarks; // Filenames of scheduled benchmarks.
<add> this.completedRuns = 0; // Number of completed runs.
<add> this.scheduledRuns = queue.length; // Number of scheduled runs.
<add> // Time when starting to run benchmarks.
<add> this.startTime = process.hrtime();
<add> // Number of times each file will be run (roughly).
<add> this.runsPerFile = queue.length / benchmarks.length;
<add> this.currentFile = ''; // Filename of current benchmark.
<add> this.currentFileConfig; // Configurations for current file
<add> // Number of configurations already run for the current file.
<add> this.completedConfig = 0;
<add> // Total number of configurations for the current file
<add> this.scheduledConfig = 0;
<add> this.interval = 0; // result of setInterval for updating the elapsed time
<add> }
<add>
<add> startQueue(index) {
<add> this.kStartOfQueue = index;
<add> this.currentFile = this.queue[index].filename;
<add> this.interval = setInterval(() => {
<add> if (this.completedRuns === this.scheduledRuns) {
<add> clearInterval(this.interval);
<add> } else {
<add> this.updateProgress();
<add> }
<add> }, 1000);
<add> }
<add>
<add> startSubqueue(data, index) {
<add> // This subqueue is generated by a new benchmark
<add> if (data.name !== this.currentFile || index === this.kStartOfQueue) {
<add> this.currentFile = data.name;
<add> this.scheduledConfig = data.queueLength;
<add> }
<add> this.completedConfig = 0;
<add> this.updateProgress();
<add> }
<add>
<add> completeConfig(data) {
<add> this.completedConfig++;
<add> this.updateProgress();
<add> }
<add>
<add> completeRun(job) {
<add> this.completedRuns++;
<add> this.updateProgress();
<add> }
<add>
<add> getProgress() {
<add> // Get time as soon as possible.
<add> const diff = process.hrtime(this.startTime);
<add>
<add> const completedRuns = this.completedRuns;
<add> const scheduledRuns = this.scheduledRuns;
<add> const finished = completedRuns === scheduledRuns;
<add>
<add> // Calculate numbers for fractions.
<add> const runsPerFile = this.runsPerFile;
<add> const completedFiles = Math.floor(completedRuns / runsPerFile);
<add> const scheduledFiles = this.benchmarks.length;
<add> const completedRunsForFile = finished ? runsPerFile :
<add> completedRuns % runsPerFile;
<add> const completedConfig = this.completedConfig;
<add> const scheduledConfig = this.scheduledConfig;
<add>
<add> // Calculate the percentage.
<add> let runRate = 0; // Rate of current incomplete run.
<add> if (completedConfig !== scheduledConfig) {
<add> runRate = completedConfig / scheduledConfig;
<add> }
<add> const completedRate = ((completedRuns + runRate) / scheduledRuns);
<add> const percent = pad(Math.floor(completedRate * 100), 3, ' ');
<add>
<add> const caption = finished ? 'Done\n' : this.currentFile;
<add> return `[${getTime(diff)}|% ${percent}` +
<add> `| ${fraction(completedFiles, scheduledFiles)} files ` +
<add> `| ${fraction(completedRunsForFile, runsPerFile)} runs ` +
<add> `| ${fraction(completedConfig, scheduledConfig)} configs]` +
<add> `: ${caption}`;
<add> }
<add>
<add> updateProgress(finished) {
<add> if (!process.stderr.isTTY || process.stdout.isTTY) {
<add> return;
<add> }
<add> readline.clearLine(process.stderr);
<add> readline.cursorTo(process.stderr, 0);
<add> process.stderr.write(this.getProgress());
<add> }
<add>}
<add>
<add>module.exports = BenchmarkProgress;
<ide><path>benchmark/_cli.js
<ide> function CLI(usage, settings) {
<ide> currentOptional = arg.slice(1);
<ide> }
<ide>
<del> // Default the value to true
<del> if (!settings.arrayArgs.includes(currentOptional)) {
<add> if (settings.boolArgs && settings.boolArgs.includes(currentOptional)) {
<ide> this.optional[currentOptional] = true;
<add> mode = 'both';
<add> } else {
<add> // expect the next value to be option related (either -- or the value)
<add> mode = 'option';
<ide> }
<del>
<del> // expect the next value to be option related (either -- or the value)
<del> mode = 'option';
<ide> } else if (mode === 'option') {
<ide> // Optional arguments value
<ide>
<ide><path>benchmark/common.js
<ide> Benchmark.prototype.http = function(options, cb) {
<ide>
<ide> Benchmark.prototype._run = function() {
<ide> const self = this;
<add> // If forked, report to the parent.
<add> if (process.send) {
<add> process.send({
<add> type: 'config',
<add> name: this.name,
<add> queueLength: this.queue.length
<add> });
<add> }
<ide>
<ide> (function recursive(queueIndex) {
<ide> const config = self.queue[queueIndex];
<ide> Benchmark.prototype.report = function(rate, elapsed) {
<ide> name: this.name,
<ide> conf: this.config,
<ide> rate: rate,
<del> time: elapsed[0] + elapsed[1] / 1e9
<add> time: elapsed[0] + elapsed[1] / 1e9,
<add> type: 'report'
<ide> });
<ide> };
<ide>
<ide><path>benchmark/compare.js
<ide> const fork = require('child_process').fork;
<ide> const path = require('path');
<ide> const CLI = require('./_cli.js');
<add>const BenchmarkProgress = require('./_benchmark_progress.js');
<ide>
<ide> //
<ide> // Parse arguments
<ide> const cli = CLI(`usage: ./node compare.js [options] [--] <category> ...
<ide> The output is formatted as csv, which can be processed using for
<ide> example 'compare.R'.
<ide>
<del> --new ./new-node-binary new node binary (required)
<del> --old ./old-node-binary old node binary (required)
<del> --runs 30 number of samples
<del> --filter pattern string to filter benchmark scripts
<del> --set variable=value set benchmark variable (can be repeated)
<add> --new ./new-node-binary new node binary (required)
<add> --old ./old-node-binary old node binary (required)
<add> --runs 30 number of samples
<add> --filter pattern string to filter benchmark scripts
<add> --set variable=value set benchmark variable (can be repeated)
<add> --no-progress don't show benchmark progress indicator
<ide> `, {
<del> arrayArgs: ['set']
<add> arrayArgs: ['set'],
<add> boolArgs: ['no-progress']
<ide> });
<ide>
<ide> if (!cli.optional.new || !cli.optional.old) {
<ide> if (benchmarks.length === 0) {
<ide>
<ide> // Create queue from the benchmarks list such both node versions are tested
<ide> // `runs` amount of times each.
<add>// Note: BenchmarkProgress relies on this order to estimate
<add>// how much runs remaining for a file. All benchmarks generated from
<add>// the same file must be run consecutively.
<ide> const queue = [];
<ide> for (const filename of benchmarks) {
<ide> for (let iter = 0; iter < runs; iter++) {
<ide> for (const filename of benchmarks) {
<ide> }
<ide> }
<ide> }
<add>// queue.length = binary.length * runs * benchmarks.length
<ide>
<ide> // Print csv header
<ide> console.log('"binary", "filename", "configuration", "rate", "time"');
<ide>
<add>const kStartOfQueue = 0;
<add>
<add>const showProgress = !cli.optional['no-progress'];
<add>let progress;
<add>if (showProgress) {
<add> progress = new BenchmarkProgress(queue, benchmarks);
<add> progress.startQueue(kStartOfQueue);
<add>}
<add>
<ide> (function recursive(i) {
<ide> const job = queue[i];
<ide>
<ide> console.log('"binary", "filename", "configuration", "rate", "time"');
<ide> });
<ide>
<ide> child.on('message', function(data) {
<del> // Construct configuration string, " A=a, B=b, ..."
<del> let conf = '';
<del> for (const key of Object.keys(data.conf)) {
<del> conf += ' ' + key + '=' + JSON.stringify(data.conf[key]);
<del> }
<del> conf = conf.slice(1);
<add> if (data.type === 'report') {
<add> // Construct configuration string, " A=a, B=b, ..."
<add> let conf = '';
<add> for (const key of Object.keys(data.conf)) {
<add> conf += ' ' + key + '=' + JSON.stringify(data.conf[key]);
<add> }
<add> conf = conf.slice(1);
<add> // Escape quotes (") for correct csv formatting
<add> conf = conf.replace(/"/g, '""');
<ide>
<del> // Escape quotes (") for correct csv formatting
<del> conf = conf.replace(/"/g, '""');
<del>
<del> console.log(`"${job.binary}", "${job.filename}", "${conf}", ` +
<del> `${data.rate}, ${data.time}`);
<add> console.log(`"${job.binary}", "${job.filename}", "${conf}", ` +
<add> `${data.rate}, ${data.time}`);
<add> if (showProgress) {
<add> // One item in the subqueue has been completed.
<add> progress.completeConfig(data);
<add> }
<add> } else if (showProgress && data.type === 'config') {
<add> // The child has computed the configurations, ready to run subqueue.
<add> progress.startSubqueue(data, i);
<add> }
<ide> });
<ide>
<ide> child.once('close', function(code) {
<ide> if (code) {
<ide> process.exit(code);
<ide> return;
<ide> }
<add> if (showProgress) {
<add> progress.completeRun(job);
<add> }
<ide>
<ide> // If there are more benchmarks execute the next
<ide> if (i + 1 < queue.length) {
<ide> recursive(i + 1);
<ide> }
<ide> });
<del>})(0);
<add>})(kStartOfQueue);
<ide><path>benchmark/run.js
<ide> if (format === 'csv') {
<ide> }
<ide>
<ide> child.on('message', function(data) {
<add> if (data.type !== 'report') {
<add> return;
<add> }
<ide> // Construct configuration string, " A=a, B=b, ..."
<ide> let conf = '';
<ide> for (const key of Object.keys(data.conf)) {
<ide><path>benchmark/scatter.js
<ide> function csvEncodeValue(value) {
<ide> const child = fork(path.resolve(__dirname, filepath), cli.optional.set);
<ide>
<ide> child.on('message', function(data) {
<add> if (data.type !== 'report') {
<add> return;
<add> }
<add>
<ide> // print csv header
<ide> if (printHeader) {
<ide> const confHeader = Object.keys(data.conf) | 6 |
Ruby | Ruby | check token scopes even if authorized | 4886b3b13845c8c76ebefd1a586a367961eab427 | <ide><path>Library/Homebrew/utils/github.rb
<ide> def api_credentials_type
<ide> def api_credentials_error_message(response_headers, needed_scopes)
<ide> return if response_headers.empty?
<ide>
<del> unauthorized = (response_headers["http/1.1"] == "401 Unauthorized")
<ide> scopes = response_headers["x-accepted-oauth-scopes"].to_s.split(", ")
<del> return unless unauthorized && scopes.blank?
<add> return if scopes.present?
<ide>
<ide> needed_human_scopes = needed_scopes.join(", ")
<ide> credentials_scopes = response_headers["x-oauth-scopes"] | 1 |
Go | Go | normalize comment formatting | ecb898dcb9065c8e9bcf7bb79fd160dea1c859b8 | <ide><path>pkg/archive/archive.go
<ide> func newTarAppender(idMapping *idtools.IdentityMapping, writer io.Writer, chownO
<ide> }
<ide>
<ide> // canonicalTarName provides a platform-independent and consistent posix-style
<del>//path for files and directories to be archived regardless of the platform.
<add>// path for files and directories to be archived regardless of the platform.
<ide> func canonicalTarName(name string, isDir bool) string {
<ide> name = CanonicalTarNameForPath(name)
<ide>
<ide> func (ta *tarAppender) addTarFile(path, name string) error {
<ide> }
<ide> }
<ide>
<del> //check whether the file is overlayfs whiteout
<del> //if yes, skip re-mapping container ID mappings.
<add> // check whether the file is overlayfs whiteout
<add> // if yes, skip re-mapping container ID mappings.
<ide> isOverlayWhiteout := fi.Mode()&os.ModeCharDevice != 0 && hdr.Devmajor == 0 && hdr.Devminor == 0
<ide>
<del> //handle re-mapping container ID mappings back to host ID mappings before
<del> //writing tar headers/files. We skip whiteout files because they were written
<del> //by the kernel and already have proper ownership relative to the host
<add> // handle re-mapping container ID mappings back to host ID mappings before
<add> // writing tar headers/files. We skip whiteout files because they were written
<add> // by the kernel and already have proper ownership relative to the host
<ide> if !isOverlayWhiteout && !strings.HasPrefix(filepath.Base(hdr.Name), WhiteoutPrefix) && !ta.IdentityMapping.Empty() {
<ide> fileIDPair, err := getFileUIDGID(fi.Sys())
<ide> if err != nil {
<ide><path>pkg/archive/archive_unix_test.go
<ide> func TestCopyInfoDestinationPathSymlink(t *testing.T) {
<ide> }
<ide>
<ide> testData := []FileTestData{
<del> //Create a directory: /tmp/archive-copy-test*/dir1
<del> //Test will "copy" file1 to dir1
<add> // Create a directory: /tmp/archive-copy-test*/dir1
<add> // Test will "copy" file1 to dir1
<ide> {resource: FileData{filetype: Dir, path: "dir1", permissions: 0740}, file: "file1", expected: CopyInfo{Path: root + "dir1/file1", Exists: false, IsDir: false}},
<ide>
<del> //Create a symlink directory to dir1: /tmp/archive-copy-test*/dirSymlink -> dir1
<del> //Test will "copy" file2 to dirSymlink
<add> // Create a symlink directory to dir1: /tmp/archive-copy-test*/dirSymlink -> dir1
<add> // Test will "copy" file2 to dirSymlink
<ide> {resource: FileData{filetype: Symlink, path: "dirSymlink", contents: root + "dir1", permissions: 0600}, file: "file2", expected: CopyInfo{Path: root + "dirSymlink/file2", Exists: false, IsDir: false}},
<ide>
<del> //Create a file in tmp directory: /tmp/archive-copy-test*/file1
<del> //Test to cover when the full file path already exists.
<add> // Create a file in tmp directory: /tmp/archive-copy-test*/file1
<add> // Test to cover when the full file path already exists.
<ide> {resource: FileData{filetype: Regular, path: "file1", permissions: 0600}, file: "", expected: CopyInfo{Path: root + "file1", Exists: true}},
<ide>
<del> //Create a directory: /tmp/archive-copy*/dir2
<del> //Test to cover when the full directory path already exists
<add> // Create a directory: /tmp/archive-copy*/dir2
<add> // Test to cover when the full directory path already exists
<ide> {resource: FileData{filetype: Dir, path: "dir2", permissions: 0740}, file: "", expected: CopyInfo{Path: root + "dir2", Exists: true, IsDir: true}},
<ide>
<del> //Create a symlink to a non-existent target: /tmp/archive-copy*/symlink1 -> noSuchTarget
<del> //Negative test to cover symlinking to a target that does not exit
<add> // Create a symlink to a non-existent target: /tmp/archive-copy*/symlink1 -> noSuchTarget
<add> // Negative test to cover symlinking to a target that does not exit
<ide> {resource: FileData{filetype: Symlink, path: "symlink1", contents: "noSuchTarget", permissions: 0600}, file: "", expected: CopyInfo{Path: root + "noSuchTarget", Exists: false}},
<ide>
<del> //Create a file in tmp directory for next test: /tmp/existingfile
<add> // Create a file in tmp directory for next test: /tmp/existingfile
<ide> {resource: FileData{filetype: Regular, path: "existingfile", permissions: 0600}, file: "", expected: CopyInfo{Path: root + "existingfile", Exists: true}},
<ide>
<del> //Create a symlink to an existing file: /tmp/archive-copy*/symlink2 -> /tmp/existingfile
<del> //Test to cover when the parent directory of a new file is a symlink
<add> // Create a symlink to an existing file: /tmp/archive-copy*/symlink2 -> /tmp/existingfile
<add> // Test to cover when the parent directory of a new file is a symlink
<ide> {resource: FileData{filetype: Symlink, path: "symlink2", contents: "existingfile", permissions: 0600}, file: "", expected: CopyInfo{Path: root + "existingfile", Exists: true}},
<ide> }
<ide>
<ide><path>pkg/archive/archive_windows.go
<ide> func CanonicalTarNameForPath(p string) string {
<ide> // chmodTarEntry is used to adjust the file permissions used in tar header based
<ide> // on the platform the archival is done.
<ide> func chmodTarEntry(perm os.FileMode) os.FileMode {
<del> //perm &= 0755 // this 0-ed out tar flags (like link, regular file, directory marker etc.)
<add> // perm &= 0755 // this 0-ed out tar flags (like link, regular file, directory marker etc.)
<ide> permPart := perm & os.ModePerm
<ide> noPermPart := perm &^ os.ModePerm
<ide> // Add the x bit: make everything +x from windows | 3 |
Text | Text | remove the duplication | 9f8f28684f196ff3790ff1c738e81743821fc860 | <ide><path>docs/userguide/networking/configure-dns.md
<ide> Various container options that affect container domain name services.
<ide> of the container identified by <code>CONTAINER_NAME</code>. When using <code>--link</code>
<ide> the embedded DNS will guarantee that localized lookup result only on that
<ide> container where the <code>--link</code> is used. This lets processes inside the new container
<del> connect to container without without having to know its name or IP.
<add> connect to container without having to know its name or IP.
<ide> </p>
<ide> </td>
<ide> </tr> | 1 |
Ruby | Ruby | pass the session and env in to the test request | 3806eb70a63f0252c88ebb25f0aa79b4c0ca5686 | <ide><path>actionpack/lib/action_controller/test_case.rb
<ide> class TestRequest < ActionDispatch::TestRequest #:nodoc:
<ide> DEFAULT_ENV = ActionDispatch::TestRequest::DEFAULT_ENV.dup
<ide> DEFAULT_ENV.delete 'PATH_INFO'
<ide>
<del> def initialize(env = {})
<del> super
<add> def self.new_session
<add> TestSession.new
<add> end
<add>
<add> def initialize(env, session)
<add> super(env)
<ide>
<del> self.session = TestSession.new
<add> self.session = session
<ide> self.session_options = TestSession::DEFAULT_OPTIONS
<ide> end
<ide>
<ide> def setup_controller_request_and_response
<ide> end
<ide>
<ide> def build_request
<del> TestRequest.new
<add> TestRequest.new({}, TestRequest.new_session)
<ide> end
<ide>
<ide> def build_response(klass) | 1 |
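The refactor above moves session construction out of `TestRequest#initialize` so callers build the session and inject it. A minimal Python sketch of the same dependency-injection shape (the classes mirror the Ruby names but are hypothetical):

```python
class TestSession(dict):
    """Stand-in session object, analogous to Rails' TestSession."""

class TestRequest:
    @staticmethod
    def new_session():
        # factory mirrors TestRequest.new_session in the diff
        return TestSession()

    def __init__(self, env, session):
        self.env = dict(env)
        self.session = session  # injected, no longer created in here

# callers can now share (and inspect) one session across requests
shared = TestRequest.new_session()
r1 = TestRequest({}, shared)
r2 = TestRequest({}, shared)
assert r1.session is r2.session
```

Passing the session in rather than hard-coding `TestSession.new` inside the constructor is what lets test helpers reuse a session between requests or substitute a fake.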
Python | Python | fix gpu training and evaluation | ca70b086612b652a1f3c797b9810f2d9ba60252e | <ide><path>spacy/cli/train.py
<ide> def train_config(config):
<ide> def train_model(Language, train_data, dev_data, output_path, n_iter, **cfg):
<ide> print("Itn.\tDep. Loss\tUAS\tNER F.\tTag %\tToken %")
<ide>
<del> nlp = Language(pipeline=['token_vectors', 'tags']) #, 'dependencies'])
<add> nlp = Language(pipeline=['token_vectors', 'tags', 'dependencies'])
<ide> dropout = util.env_opt('dropout', 0.0)
<ide> # TODO: Get spaCy using Thinc's trainer and optimizer
<ide> with nlp.begin_training(train_data, **cfg) as (trainer, optimizer):
<ide> def train_model(Language, train_data, dev_data, output_path, n_iter, **cfg):
<ide> for i, (docs, golds) in enumerate(epoch):
<ide> state = nlp.update(docs, golds, drop=dropout, sgd=optimizer)
<ide> losses['dep_loss'] += state.get('parser_loss', 0.0)
<del> losses['tag_loss'] += state.get('tagger_loss', 0.0)
<add> losses['tag_loss'] += state.get('tag_loss', 0.0)
<ide> to_render.insert(0, nlp(docs[-1].text))
<ide> to_render[0].user_data['title'] = "Batch %d" % i
<ide> with Path('/tmp/entities.html').open('w') as file_:
<ide> def train_model(Language, train_data, dev_data, output_path, n_iter, **cfg):
<ide> def evaluate(Language, gold_tuples, path):
<ide> with (path / 'model.bin').open('rb') as file_:
<ide> nlp = dill.load(file_)
<add> # TODO:
<add> # 1. This code is duplicate with spacy.train.Trainer.evaluate
<add> # 2. There's currently a semantic difference between pipe and
<add> # not pipe! It matters whether we batch the inputs. Must fix!
<add> all_docs = []
<add> all_golds = []
<add> for raw_text, paragraph_tuples in dev_sents:
<add> if gold_preproc:
<add> raw_text = None
<add> else:
<add> paragraph_tuples = merge_sents(paragraph_tuples)
<add> docs = self.make_docs(raw_text, paragraph_tuples)
<add> golds = self.make_golds(docs, paragraph_tuples)
<add> all_docs.extend(docs)
<add> all_golds.extend(golds)
<ide> scorer = Scorer()
<del> for raw_text, sents in gold_tuples:
<del> sents = merge_sents(sents)
<del> for annot_tuples, brackets in sents:
<del> if raw_text is None:
<del> tokens = Doc(nlp.vocab, words=annot_tuples[1])
<del> state = None
<del> for proc in nlp.pipeline:
<del> state = proc(tokens, state=state)
<del> else:
<del> tokens = nlp(raw_text)
<del> gold = GoldParse.from_annot_tuples(tokens, annot_tuples)
<del> scorer.score(tokens, gold)
<add> for doc, gold in zip(self.nlp.pipe(all_docs), all_golds):
<add> scorer.score(doc, gold)
<ide> return scorer
<ide>
<ide>
<ide> def print_progress(itn, losses, dev_scores):
<ide> # TODO: Fix!
<ide> scores = {}
<del> for col in ['dep_loss', 'uas', 'tags_acc', 'token_acc', 'ents_f']:
<add> for col in ['dep_loss', 'tag_loss', 'uas', 'tags_acc', 'token_acc', 'ents_f']:
<ide> scores[col] = 0.0
<ide> scores.update(losses)
<ide> scores.update(dev_scores)
<del> tpl = '{:d}\t{dep_loss:.3f}\t{uas:.3f}\t{ents_f:.3f}\t{tags_acc:.3f}\t{token_acc:.3f}'
<add> tpl = '{:d}\t{dep_loss:.3f}\t{tag_loss:.3f}\t{uas:.3f}\t{ents_f:.3f}\t{tags_acc:.3f}\t{token_acc:.3f}'
<ide> print(tpl.format(itn, **scores))
<ide>
<ide> | 1 |
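The zero-default-then-`update` trick in `print_progress` is what keeps a missing metric from raising `KeyError` mid-training. A self-contained sketch of the same pattern:

```python
def format_progress(itn, losses, dev_scores):
    # seed every column with 0.0 so .format never sees a missing key
    scores = {c: 0.0 for c in
              ('dep_loss', 'tag_loss', 'uas', 'tags_acc',
               'token_acc', 'ents_f')}
    scores.update(losses)
    scores.update(dev_scores)
    tpl = ('{:d}\t{dep_loss:.3f}\t{tag_loss:.3f}\t{uas:.3f}'
           '\t{ents_f:.3f}\t{tags_acc:.3f}\t{token_acc:.3f}')
    return tpl.format(itn, **scores)

# only two metrics supplied; the rest render as 0.000
line = format_progress(3, {'dep_loss': 1.5}, {'uas': 0.91})
assert line == '3\t1.500\t0.000\t0.910\t0.000\t0.000\t0.000'
```

Note the column order in the template is positional (`itn` first) while every named field comes from the merged dict, so adding `tag_loss` to the header, the template, and the defaults all had to happen together, as the diff does.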
Text | Text | change webflow link per request | 188b758631a8bc34cff0248e7d7ecef3d0f2fabc | <ide><path>README.md
<ide> Every release, along with the migration instructions, is documented on the Githu
<ide> The work on Redux was [funded by the community](https://www.patreon.com/reactdx).
<ide> Meet some of the outstanding companies that made it possible:
<ide>
<del>* [Webflow](https://webflow.com/)
<add>* [Webflow](https://github.com/webflow)
<ide> * [Chess iX](http://www.chess-ix.com/)
<ide>
<ide> [See the full list of Redux patrons.](PATRONS.md) | 1 |
Python | Python | add test for stable sort | 14955ccf2a6d838705abb21b237179ecfdbae19f | <ide><path>numpy/ma/tests/test_core.py
<ide> def test_sort(self):
<ide> assert_equal(sortedx._data, [1, 2, -2, -1, 0])
<ide> assert_equal(sortedx._mask, [1, 1, 0, 0, 0])
<ide>
<add> def test_stable_sort(self):
<add> x = array([1, 2, 3, 1, 2, 3], dtype=np.uint8)
<add> expected = array([0, 3, 1, 4, 2, 5])
<add> computed = argsort(x, kind='stable')
<add> assert_equal(computed, expected)
<add>
<ide> def test_argsort_matches_sort(self):
<ide> x = array([1, 4, 2, 3], mask=[0, 1, 0, 0], dtype=np.uint8)
<ide> | 1 |
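The new `test_stable_sort` relies on stable sorts preserving the input order of equal keys. The same property on a plain `ndarray` (masked-array machinery omitted):

```python
import numpy as np

# 'stable' guarantees that equal keys keep their original relative order,
# so the indices of each duplicate value come out ascending: exactly the
# [0, 3, 1, 4, 2, 5] the new test asserts.
x = np.array([1, 2, 3, 1, 2, 3], dtype=np.uint8)
order = np.argsort(x, kind='stable')
assert list(order) == [0, 3, 1, 4, 2, 5]

# Any kind still sorts the *values* correctly; only tie order may differ.
assert list(x[np.argsort(x, kind='quicksort')]) == [1, 1, 2, 2, 3, 3]
```

Without the stability guarantee, `[0, 3, 1, 4, 2, 5]` and `[3, 0, 4, 1, 5, 2]` would both be valid argsort results, which is why the test pins `kind='stable'`.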
Javascript | Javascript | replace concat with template literals | 9f98989484d56b837411e8af75a9b7f5980b9db4 | <ide><path>tools/doc/json.js
<ide> const marked = require('marked');
<ide> // customized heading without id attribute
<ide> const renderer = new marked.Renderer();
<ide> renderer.heading = function(text, level) {
<del> return '<h' + level + '>' + text + '</h' + level + '>\n';
<add> return `<h${level}>${text}</h${level}>\n`;
<ide> };
<ide> marked.setOptions({
<ide> renderer: renderer
<ide> function processList(section) {
<ide> } else if (type === 'list_item_end') {
<ide> if (!current) {
<ide> throw new Error('invalid list - end without current item\n' +
<del> JSON.stringify(tok) + '\n' +
<add> `${JSON.stringify(tok)}\n` +
<ide> JSON.stringify(list));
<ide> }
<ide> current = stack.pop();
<ide> } else if (type === 'text') {
<ide> if (!current) {
<ide> throw new Error('invalid list - text without current item\n' +
<del> JSON.stringify(tok) + '\n' +
<add> `${JSON.stringify(tok)}\n` +
<ide> JSON.stringify(list));
<ide> }
<ide> current.textRaw = current.textRaw || '';
<del> current.textRaw += tok.text + ' ';
<add> current.textRaw += `${tok.text} `;
<ide> }
<ide> });
<ide>
<ide> // shove the name in there for properties, since they are always
<ide> // just going to be the value etc.
<ide> if (section.type === 'property' && values[0]) {
<del> values[0].textRaw = '`' + section.name + '` ' + values[0].textRaw;
<add> values[0].textRaw = `\`${section.name}\` ${values[0].textRaw}`;
<ide> }
<ide>
<ide> // now pull the actual values out of the text bits.
<ide> function parseSignature(text, sig) {
<ide> // at this point, the name should match.
<ide> if (p !== param.name) {
<ide> console.error('Warning: invalid param "%s"', p);
<del> console.error(' > ' + JSON.stringify(param));
<del> console.error(' > ' + text);
<add> console.error(` > ${JSON.stringify(param)}`);
<add> console.error(` > ${text}`);
<ide> }
<ide> if (optional) param.optional = true;
<ide> if (def !== undefined) param.default = def;
<ide> function parseListItem(item) {
<ide> function finishSection(section, parent) {
<ide> if (!section || !parent) {
<ide> throw new Error('Invalid finishSection call\n' +
<del> JSON.stringify(section) + '\n' +
<add> `${JSON.stringify(section)}\n` +
<ide> JSON.stringify(parent));
<ide> }
<ide>
<ide> function finishSection(section, parent) {
<ide>
<ide> var plur;
<ide> if (section.type.slice(-1) === 's') {
<del> plur = section.type + 'es';
<add> plur = `${section.type}es`;
<ide> } else if (section.type.slice(-1) === 'y') {
<ide> plur = section.type.replace(/y$/, 'ies');
<ide> } else {
<del> plur = section.type + 's';
<add> plur = `${section.type}s`;
<ide> }
<ide>
<ide> // if the parent's type is 'misc', then it's just a random | 1 |
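The concatenation-to-interpolation cleanup reads the same way in Python, where f-strings play the role of JS template literals (a cross-language illustration, not code from the PR):

```python
level, text = 2, 'Usage'

# old style: string concatenation, as in the removed renderer.heading
heading_concat = '<h' + str(level) + '>' + text + '</h' + str(level) + '>\n'

# new style: interpolation, the f-string counterpart of `<h${level}>...`
heading_fstring = f'<h{level}>{text}</h{level}>\n'

assert heading_concat == heading_fstring == '<h2>Usage</h2>\n'
```

Both forms produce identical output; the interpolated version just keeps the markup readable as one piece instead of six fragments.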
Python | Python | use longer timeout to make failure less likely | 26d27829f5e9d9a62ce1f27aa506375747d921d9 | <ide><path>libcloud/test/storage/test_local.py
<ide> def remove_tmp_file(self, tmppath):
<ide>
<ide> @unittest.skipIf(platform.system().lower() == 'windows', 'Unsupported on Windows')
<ide> def test_lock_local_storage(self):
<add> print("aaaa")
<ide> # 1. Acquire succeeds
<ide> lock = LockLocalStorage("/tmp/a")
<ide> with lock:
<ide> def test_lock_local_storage(self):
<ide> # 3. Multiprocessing scenario where IPC lock is involved
<ide> def acquire_lock_in_subprocess(pid, success):
<ide> # For first process acquire should succeed and for the second it should fail
<del>
<del> lock = LockLocalStorage("/tmp/c", timeout=0.5)
<add> lock = LockLocalStorage("/tmp/c", timeout=1.5)
<ide>
<ide> if pid == 1:
<ide> with lock:
<add> # We use longer sleep when running tests in parallel to avoid
<add> # failures related to slower process spawn
<ide> time.sleep(2)
<ide>
<ide> success.value = 1 | 1 |
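The fix above simply gives the losing process longer (1.5 s instead of 0.5 s) before its acquire attempt is declared failed, so a slow process spawn under parallel test runs no longer produces spurious results. A threading-based sketch of the acquire-with-timeout contract (the real test locks across processes via the filesystem; `LockWithTimeout` here is hypothetical):

```python
import threading
import time

class LockWithTimeout:
    """Context manager that fails fast if the lock is not free in time."""
    def __init__(self, lock, timeout):
        self.lock = lock
        self.timeout = timeout

    def __enter__(self):
        if not self.lock.acquire(timeout=self.timeout):
            raise RuntimeError('lock not acquired within %.1fs' % self.timeout)
        return self

    def __exit__(self, *exc):
        self.lock.release()

shared = threading.Lock()
results = []

def worker(hold, timeout):
    try:
        with LockWithTimeout(shared, timeout):
            time.sleep(hold)
        results.append('ok')
    except RuntimeError:
        results.append('timeout')

first = threading.Thread(target=worker, args=(0.4, 1.5))
second = threading.Thread(target=worker, args=(0.0, 0.1))
first.start()
time.sleep(0.05)   # let the first thread grab the lock
second.start()
first.join()
second.join()
# the 0.1s waiter gives up while the holder sleeps; 1.5s would have won
assert results == ['timeout', 'ok']
```

The same arithmetic explains the commit: if spawning the second process can itself eat a noticeable fraction of the timeout, the margin must be comfortably larger than the expected contention window.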
Ruby | Ruby | remove methods to avoid warnings | d5bb640eb02fa9cae6e99f2c5b7d91c986bc65f1 | <ide><path>actionpack/test/controller/rescue_test.rb
<ide> module ActionDispatch
<ide> class ShowExceptions
<ide> private
<add> remove_method :public_path
<ide> def public_path
<ide> "#{FIXTURE_LOAD_PATH}/public"
<ide> end
<ide>
<add> remove_method :logger
<ide> # Silence logger
<ide> def logger
<ide> nil | 1 |