content_type stringclasses 8 values | main_lang stringclasses 7 values | message stringlengths 1 50 | sha stringlengths 40 40 | patch stringlengths 52 962k | file_count int64 1 300 |
|---|---|---|---|---|---|
Text | Text | add async flow | dacc5a473212687ad55f3df16fe09c27f74bdb13 | <ide><path>docs/advanced/AsyncFlow.md
<add># Async Flow
<add>
<add>Without [middleware](Middleware.md), Redux store only supports [synchronous data flow](../basics/DataFlow.md). This is what you get by default with [`createStore()`](../api/createStore.md).
<add>
<add>You may enhance [`createStore()`](../api/createStore.md) with [`applyMiddleware()`](../api/applyMiddleware.md). It is not required, but it lets you [express asynchronous actions in a convenient way](AsyncActions.md).
<add>
<add>Asynchronous middleware like [redux-thunk](https://github.com/gaearon/redux-thunk) or [redux-promise](https://github.com/acdlite/redux-promise) wraps the store’s [`dispatch()`](../api/Store.md#dispatch) method and allows you to dispatch something other than actions, for example, functions or Promises. Any middleware you use can then interpret anything you dispatch, and in turn, can pass actions to the next middleware in chain. For example, a Promise middleware can intercept Promises and dispatch a pair of begin/end actions asynchronously in response to each Promise.
<add>
<add>When the last middleware in chain dispatches an action, it has to be a plain object. This is when the [synchronous Redux data flow](../basics/DataFlow.md) takes place.
<add>
<add>## Next Steps
<add>
<add>Now you know everything about data flow in a Redux app! Check out [the source code for the async example](ExampleRedditAPI.md), or read on about [React Router integration](UsageWithReactRouter.md).
<ide>\ No newline at end of file | 1 |
Javascript | Javascript | fix rntester webbrowser example | 4d99daaa914398b24e9783b47858434e25bb5fcb | <ide><path>RNTester/js/WebViewExample.js
<ide> var DISABLED_WASH = 'rgba(255,255,255,0.25)';
<ide> var TEXT_INPUT_REF = 'urlInput';
<ide> var WEBVIEW_REF = 'webview';
<ide> var DEFAULT_URL = 'https://m.facebook.com';
<add>const FILE_SYSTEM_ORIGIN_WHITE_LIST = ['file://*', 'http://*', 'https://*'];
<ide>
<ide> class WebViewExample extends React.Component<{}, $FlowFixMeState> {
<ide> state = {
<ide> class MessagingTest extends React.Component<{}, $FlowFixMeState> {
<ide> backgroundColor: BGWASH,
<ide> height: 100,
<ide> }}
<add> originWhitelist={FILE_SYSTEM_ORIGIN_WHITE_LIST}
<ide> source={require('./messagingtest.html')}
<ide> onMessage={this.onMessage}
<ide> />
<ide> exports.examples = [
<ide> backgroundColor: BGWASH,
<ide> height: 100,
<ide> }}
<add> originWhitelist={FILE_SYSTEM_ORIGIN_WHITE_LIST}
<ide> source={require('./helloworld.html')}
<ide> scalesPageToFit={true}
<ide> /> | 1 |
Go | Go | export the logt type and use it in the options | 95ea6e70696582cec3643b160a4eb4b98aa3d9e7 | <ide><path>testutil/daemon/daemon.go
<ide> import (
<ide> "gotest.tools/assert"
<ide> )
<ide>
<del>type logT interface {
<add>// LogT is the subset of the testing.TB interface used by the daemon.
<add>type LogT interface {
<ide> Logf(string, ...interface{})
<ide> }
<ide>
<del>// nopLog is a no-op implementation of logT that is used in daemons created by
<add>// nopLog is a no-op implementation of LogT that is used in daemons created by
<ide> // NewDaemon (where no testing.TB is available).
<ide> type nopLog struct{}
<ide>
<ide> type Daemon struct {
<ide> experimental bool
<ide> init bool
<ide> dockerdBinary string
<del> log logT
<add> log LogT
<ide> pidFile string
<ide> args []string
<ide>
<ide><path>testutil/daemon/ops.go
<ide> package daemon
<ide>
<ide> import (
<del> "testing"
<del>
<ide> "github.com/docker/docker/testutil/environment"
<ide> )
<ide>
<ide> func WithDefaultCgroupNamespaceMode(mode string) Option {
<ide> }
<ide>
<ide> // WithTestLogger causes the daemon to log certain actions to the provided test.
<del>func WithTestLogger(t testing.TB) Option {
<add>func WithTestLogger(t LogT) Option {
<ide> return func(d *Daemon) {
<ide> d.log = t
<ide> } | 2 |
Text | Text | add notes to contributing docs | 23369650e3502305ebf5d682e141c7d47db89111 | <ide><path>docs/topics/contributing.md
<ide> There are many ways you can contribute to Django REST framework. We'd like it t
<ide>
<ide> The most important thing you can do to help push the REST framework project forward is to be actively involved wherever possible. Code contributions are often overvalued as being the primary way to get involved in a project, we don't believe that needs to be the case.
<ide>
<del>If you use REST framework, we'd love you to be vocal about your experiances with it - you might consider writing a blog post on your experience with using REST framework, or publishing a tutorial about using the project with a particular javascript framework. Experiances from beginners can be particularly helpful because you'll be in the best position to assess which bits of REST framework are and aren't easy to understand and work with.
<add>If you use REST framework, we'd love you to be vocal about your experiences with it - you might consider writing a blog post about using REST framework, or publishing a tutorial about building a project with a particularJjavascript framework. Experiences from beginners can be particularly helpful because you'll be in the best position to assess which bits of REST framework are more difficult to understand and work with.
<ide>
<ide> Other really great ways you can help move the community forward include helping answer questions on the [discussion group][google-group], or setting up an [email alert on StackOverflow][so-filter] so that you get notified of any new questions with the `django-rest-framework` tag.
<ide> | 1 |
Mixed | Ruby | permit attachments in mailbox conductor params | c6c53a02a5dfb99fba7c0886893fde46060fdcc7 | <ide><path>actionmailbox/CHANGELOG.md
<add>* Add `attachments` to the list of permitted parameters for inbound emails conductor.
<add>
<add> When using the conductor to test inbound emails with attachments, this prevents an
<add> unpermitted parameter warning in default configurations, and prevents errors for
<add> applications that set:
<add>
<add> ```ruby
<add> config.action_controller.action_on_unpermitted_parameters = :raise
<add> ```
<add>
<add> *David Jones*, *Dana Henke*
<add>
<ide> * Add ability to configure ActiveStorage service
<ide> for storing email raw source.
<ide>
<ide><path>actionmailbox/app/controllers/rails/conductor/action_mailbox/inbound_emails_controller.rb
<ide> def create
<ide>
<ide> private
<ide> def new_mail
<del> Mail.new(params.require(:mail).permit(:from, :to, :cc, :bcc, :x_original_to, :in_reply_to, :subject, :body).to_h).tap do |mail|
<add> Mail.new(mail_params.except(:attachments).to_h).tap do |mail|
<ide> mail[:bcc]&.include_in_headers = true
<del> params[:mail][:attachments].to_a.each do |attachment|
<add> mail_params[:attachments].to_a.each do |attachment|
<ide> mail.add_file(filename: attachment.original_filename, content: attachment.read)
<ide> end
<ide> end
<ide> end
<ide>
<add> def mail_params
<add> params.require(:mail).permit(:from, :to, :cc, :bcc, :x_original_to, :in_reply_to, :subject, :body, attachments: [])
<add> end
<add>
<ide> def create_inbound_email(mail)
<ide> ActionMailbox::InboundEmail.create_and_extract_message_id!(mail.to_s)
<ide> end
<ide><path>actionmailbox/test/dummy/config/environments/test.rb
<ide>
<ide> # Annotate rendered view with file names
<ide> # config.action_view.annotate_rendered_view_with_filenames = true
<add>
<add> # Raise error if unpermitted parameters are sent
<add> config.action_controller.action_on_unpermitted_parameters = :raise
<ide> end | 3 |
Javascript | Javascript | fix broken test | aa36194eb06daf5a2e62108993ed90fb836de4b3 | <ide><path>packages/ember-runtime/tests/suites/array/lastIndexOf.js
<add>import { get } from 'ember-metal';
<ide> import { SuiteModuleBuilder } from '../suite';
<ide>
<ide> const suite = SuiteModuleBuilder.create();
<ide> suite.test('should return -1 when no match is found even startAt search location
<ide> let obj = this.newObject(this.newFixture(3));
<ide> let foo = {};
<ide>
<del> equal(obj.lastIndexOf(foo, obj.length), -1, 'obj.lastIndexOf(foo) should be -1');
<add> equal(obj.lastIndexOf(foo, get(obj, 'length')), -1, 'obj.lastIndexOf(foo) should be -1');
<ide> });
<ide>
<ide> suite.test('should return -1 when no match is found even startAt search location is greater than length', function() {
<ide> let obj = this.newObject(this.newFixture(3));
<ide> let foo = {};
<ide>
<del> equal(obj.lastIndexOf(foo, obj.length + 1), -1, 'obj.lastIndexOf(foo) should be -1');
<add> equal(obj.lastIndexOf(foo, get(obj, 'length') + 1), -1, 'obj.lastIndexOf(foo) should be -1');
<ide> });
<ide>
<ide> export default suite; | 1 |
PHP | PHP | add test for session.close | 64e46c6c722e31c10b064a68064ca904a793c712 | <ide><path>src/Http/Session.php
<ide> public function start()
<ide> }
<ide>
<ide> /**
<del> * Write datas and close the session
<add> * Write data and close the session
<ide> *
<ide> * @return bool True if session was started
<ide> */
<ide><path>tests/TestCase/Http/SessionTest.php
<ide> use Cake\Http\Session\CacheSession;
<ide> use Cake\Http\Session\DatabaseSession;
<ide> use Cake\TestSuite\TestCase;
<add>use RuntimeException;
<ide>
<ide> /**
<ide> * TestCacheSession
<ide> public function testStarted()
<ide> $this->assertTrue($session->started());
<ide> }
<ide>
<add> /**
<add> * test close method
<add> *
<add> * @return void
<add> */
<add> public function testCloseFailure()
<add> {
<add> $session = new Session();
<add> $session->started();
<add> $this->assertTrue($session->start());
<add> try {
<add> $session->close();
<add> } catch (RuntimeException $e) {
<add> // closing the session in CLI should raise an error
<add> // and won't close the session.
<add> $this->assertTrue($session->started());
<add> }
<add> }
<add>
<ide> /**
<ide> * testClear method
<ide> * | 2 |
Javascript | Javascript | allow service names with a single underscore | f0ee3353112fd84a48cf3214174536ab19e5823d | <ide><path>test/auto/injectorSpec.js
<ide> describe('injector', function() {
<ide> expect(annotate(beforeEachFn)).toEqual(['foo']);
<ide> });
<ide>
<add> it('should not strip service names with a single underscore', function() {
<add> function beforeEachFn(_) { /* _ = _ */ }
<add> expect(annotate(beforeEachFn)).toEqual(['_']);
<add> });
<ide>
<ide> it('should handle no arg functions', function() {
<ide> function $f_n0() {} | 1 |
Ruby | Ruby | move neversudosystemcommand to separate file | a9e538efbdb0532e9236400434c9970e0cdfd8fc | <ide><path>Library/Homebrew/cask/spec/spec_helper.rb
<ide> end
<ide> end
<ide> end
<del>
<del>module Hbc
<del> class NeverSudoSystemCommand < SystemCommand
<del> def self.run(command, options = {})
<del> super(command, options.merge(sudo: false))
<del> end
<del> end
<del>end
<add><path>Library/Homebrew/cask/spec/support/never_sudo_system_command.rb
<del><path>Library/Homebrew/cask/test/support/never_sudo_system_command.rb
<add>require "hbc/system_command"
<add>
<ide> module Hbc
<ide> class NeverSudoSystemCommand < SystemCommand
<ide> def self.run(command, options = {}) | 2 |
Javascript | Javascript | clarify the arguments of $setvalidity | ca5fcc6f7a842360bbe6286f900fb2e17f280eef | <ide><path>src/ng/directive/form.js
<ide> function FormController(element, attrs, $scope, $animate, $interpolate) {
<ide> addSetValidityMethod({
<ide> ctrl: this,
<ide> $element: element,
<del> set: function(object, property, control) {
<add> set: function(object, property, controller) {
<ide> var list = object[property];
<ide> if (!list) {
<del> object[property] = [control];
<add> object[property] = [controller];
<ide> } else {
<del> var index = list.indexOf(control);
<add> var index = list.indexOf(controller);
<ide> if (index === -1) {
<del> list.push(control);
<add> list.push(controller);
<ide> }
<ide> }
<ide> },
<del> unset: function(object, property, control) {
<add> unset: function(object, property, controller) {
<ide> var list = object[property];
<ide> if (!list) {
<ide> return;
<ide> }
<del> arrayRemove(list, control);
<add> arrayRemove(list, controller);
<ide> if (list.length === 0) {
<ide> delete object[property];
<ide> }
<ide><path>src/ng/directive/ngModel.js
<ide> function addSetValidityMethod(context) {
<ide>
<ide> ctrl.$setValidity = setValidity;
<ide>
<del> function setValidity(validationErrorKey, state, options) {
<add> function setValidity(validationErrorKey, state, controller) {
<ide> if (state === undefined) {
<del> createAndSet('$pending', validationErrorKey, options);
<add> createAndSet('$pending', validationErrorKey, controller);
<ide> } else {
<del> unsetAndCleanup('$pending', validationErrorKey, options);
<add> unsetAndCleanup('$pending', validationErrorKey, controller);
<ide> }
<ide> if (!isBoolean(state)) {
<del> unset(ctrl.$error, validationErrorKey, options);
<del> unset(ctrl.$$success, validationErrorKey, options);
<add> unset(ctrl.$error, validationErrorKey, controller);
<add> unset(ctrl.$$success, validationErrorKey, controller);
<ide> } else {
<ide> if (state) {
<del> unset(ctrl.$error, validationErrorKey, options);
<del> set(ctrl.$$success, validationErrorKey, options);
<add> unset(ctrl.$error, validationErrorKey, controller);
<add> set(ctrl.$$success, validationErrorKey, controller);
<ide> } else {
<del> set(ctrl.$error, validationErrorKey, options);
<del> unset(ctrl.$$success, validationErrorKey, options);
<add> set(ctrl.$error, validationErrorKey, controller);
<add> unset(ctrl.$$success, validationErrorKey, controller);
<ide> }
<ide> }
<ide> if (ctrl.$pending) {
<ide> function addSetValidityMethod(context) {
<ide> } else {
<ide> combinedState = null;
<ide> }
<add>
<ide> toggleValidationCss(validationErrorKey, combinedState);
<ide> parentForm.$setValidity(validationErrorKey, combinedState, ctrl);
<ide> }
<ide>
<del> function createAndSet(name, value, options) {
<add> function createAndSet(name, value, controller) {
<ide> if (!ctrl[name]) {
<ide> ctrl[name] = {};
<ide> }
<del> set(ctrl[name], value, options);
<add> set(ctrl[name], value, controller);
<ide> }
<ide>
<del> function unsetAndCleanup(name, value, options) {
<add> function unsetAndCleanup(name, value, controller) {
<ide> if (ctrl[name]) {
<del> unset(ctrl[name], value, options);
<add> unset(ctrl[name], value, controller);
<ide> }
<ide> if (isObjectEmpty(ctrl[name])) {
<ide> ctrl[name] = undefined; | 2 |
Text | Text | remove container.json from readme | d0bee7939482b982462c5848f24b2e5e9ad897ea | <ide><path>pkg/libcontainer/README.md
<ide> a `container.json` file is placed with the runtime configuration for how the pro
<ide> should be contained and ran. Environment, networking, and different capabilities for the
<ide> process are specified in this file. The configuration is used for each process executed inside the container.
<ide>
<del>Sample `container.json` file:
<del>```json
<del>{
<del> "mounts" : [
<del> {
<del> "type" : "devtmpfs"
<del> }
<del> ],
<del> "tty" : true,
<del> "environment" : [
<del> "HOME=/",
<del> "PATH=PATH=$PATH:/bin:/usr/bin:/sbin:/usr/sbin",
<del> "container=docker",
<del> "TERM=xterm-256color"
<del> ],
<del> "hostname" : "koye",
<del> "cgroups" : {
<del> "parent" : "docker",
<del> "name" : "docker-koye"
<del> },
<del> "capabilities_mask" : [
<del> {
<del> "value" : 8,
<del> "key" : "SETPCAP",
<del> "enabled" : false
<del> },
<del> {
<del> "enabled" : false,
<del> "value" : 16,
<del> "key" : "SYS_MODULE"
<del> },
<del> {
<del> "value" : 17,
<del> "key" : "SYS_RAWIO",
<del> "enabled" : false
<del> },
<del> {
<del> "key" : "SYS_PACCT",
<del> "value" : 20,
<del> "enabled" : false
<del> },
<del> {
<del> "value" : 21,
<del> "key" : "SYS_ADMIN",
<del> "enabled" : false
<del> },
<del> {
<del> "value" : 23,
<del> "key" : "SYS_NICE",
<del> "enabled" : false
<del> },
<del> {
<del> "value" : 24,
<del> "key" : "SYS_RESOURCE",
<del> "enabled" : false
<del> },
<del> {
<del> "key" : "SYS_TIME",
<del> "value" : 25,
<del> "enabled" : false
<del> },
<del> {
<del> "enabled" : false,
<del> "value" : 26,
<del> "key" : "SYS_TTY_CONFIG"
<del> },
<del> {
<del> "key" : "AUDIT_WRITE",
<del> "value" : 29,
<del> "enabled" : false
<del> },
<del> {
<del> "value" : 30,
<del> "key" : "AUDIT_CONTROL",
<del> "enabled" : false
<del> },
<del> {
<del> "enabled" : false,
<del> "key" : "MAC_OVERRIDE",
<del> "value" : 32
<del> },
<del> {
<del> "enabled" : false,
<del> "key" : "MAC_ADMIN",
<del> "value" : 33
<del> },
<del> {
<del> "key" : "NET_ADMIN",
<del> "value" : 12,
<del> "enabled" : false
<del> },
<del> {
<del> "value" : 27,
<del> "key" : "MKNOD",
<del> "enabled" : true
<del> }
<del> ],
<del> "networks" : [
<del> {
<del> "mtu" : 1500,
<del> "address" : "127.0.0.1/0",
<del> "type" : "loopback",
<del> "gateway" : "localhost"
<del> },
<del> {
<del> "mtu" : 1500,
<del> "address" : "172.17.42.2/16",
<del> "type" : "veth",
<del> "context" : {
<del> "bridge" : "docker0",
<del> "prefix" : "veth"
<del> },
<del> "gateway" : "172.17.42.1"
<del> }
<del> ],
<del> "namespaces" : [
<del> {
<del> "key" : "NEWNS",
<del> "value" : 131072,
<del> "enabled" : true,
<del> "file" : "mnt"
<del> },
<del> {
<del> "key" : "NEWUTS",
<del> "value" : 67108864,
<del> "enabled" : true,
<del> "file" : "uts"
<del> },
<del> {
<del> "enabled" : true,
<del> "file" : "ipc",
<del> "key" : "NEWIPC",
<del> "value" : 134217728
<del> },
<del> {
<del> "file" : "pid",
<del> "enabled" : true,
<del> "value" : 536870912,
<del> "key" : "NEWPID"
<del> },
<del> {
<del> "enabled" : true,
<del> "file" : "net",
<del> "key" : "NEWNET",
<del> "value" : 1073741824
<del> }
<del> ]
<del>}
<del>```
<add>See the `container.json` file for what the configuration should look like.
<ide>
<ide> Using this configuration and the current directory holding the rootfs for a process, one can use libcontainer to exec the container. Running the life of the namespace, a `pid` file
<ide> is written to the current directory with the pid of the namespaced process to the external world. A client can use this pid to wait, kill, or perform other operation with the container. If a user tries to run a new process inside an existing container with a live namespace, the namespace will be joined by the new process.
<ide>
<del>
<ide> You may also specify an alternate root place where the `container.json` file is read and where the `pid` file will be saved.
<ide>
<ide> #### nsinit | 1 |
Javascript | Javascript | fix undefined variables from specs | cd302135f0a19c6f3d03dd555d228556f523c741 | <ide><path>spec/atom-environment-spec.js
<ide> describe('AtomEnvironment', () => {
<ide>
<ide> it('will open the dev tools when an error is triggered', async () => {
<ide> try {
<del> a + 1 // eslint-ignore-line no-unused-vars
<add> a + 1 // eslint-disable-line no-undef
<ide> } catch (e) {
<del> window.onerror.call(window, e.toString(), 'abc', 2, 3, e)
<add> window.onerror(e.toString(), 'abc', 2, 3, e)
<ide> }
<ide>
<ide> await devToolsPromise
<ide> describe('AtomEnvironment', () => {
<ide> let error = null
<ide> atom.onWillThrowError(willThrowSpy)
<ide> try {
<del> a + 1
<add> a + 1 // eslint-disable-line no-undef
<ide> } catch (e) {
<ide> error = e
<ide> window.onerror.call(window, e.toString(), 'abc', 2, 3, e)
<ide> describe('AtomEnvironment', () => {
<ide> atom.onWillThrowError(willThrowSpy)
<ide>
<ide> try {
<del> a + 1
<add> a + 1 // eslint-disable-line no-undef
<ide> } catch (e) {
<ide> window.onerror.call(window, e.toString(), 'abc', 2, 3, e)
<ide> }
<ide> describe('AtomEnvironment', () => {
<ide> let error = null
<ide> atom.onDidThrowError(didThrowSpy)
<ide> try {
<del> a + 1
<add> a + 1 // eslint-disable-line no-undef
<ide> } catch (e) {
<ide> error = e
<ide> window.onerror.call(window, e.toString(), 'abc', 2, 3, e)
<ide><path>spec/atom-paths-spec.js
<ide> describe('AtomPaths', () => {
<ide> })
<ide>
<ide> describe('setUserData', () => {
<add> let tempAtomConfigPath = null
<ide> let tempAtomHomePath = null
<ide> let electronUserDataPath = null
<ide> let defaultElectronUserDataPath = null
<ide><path>spec/main-process/file-recovery-service.test.js
<ide> const FileRecoveryService = require('../../src/main-process/file-recovery-servic
<ide> const fs = require('fs-plus')
<ide> const fsreal = require('fs')
<ide> const EventEmitter = require('events').EventEmitter
<add>const { assert } = require('chai')
<ide> const sinon = require('sinon')
<ide> const { escapeRegExp } = require('underscore-plus')
<ide> const temp = require('temp').track()
<ide><path>spec/main-process/parse-command-line.test.js
<add>const { assert } = require('chai')
<ide> const parseCommandLine = require('../../src/main-process/parse-command-line')
<ide>
<ide> describe('parseCommandLine', () => {
<ide><path>spec/reopen-project-menu-manager-spec.js
<ide> import { Disposable } from 'event-kit'
<ide>
<ide> const ReopenProjectMenuManager = require('../src/reopen-project-menu-manager')
<ide>
<del>numberRange = (low, high) => {
<add>function numberRange (low, high) {
<ide> const size = high - low
<ide> const result = new Array(size)
<ide> for (var i = 0; i < size; i++) result[i] = low + i
<ide> numberRange = (low, high) => {
<ide> describe('ReopenProjectMenuManager', () => {
<ide> let menuManager, commandRegistry, config, historyManager, reopenProjects
<ide> let commandDisposable, configDisposable, historyDisposable
<add> let openFunction
<ide>
<ide> beforeEach(() => {
<ide> menuManager = jasmine.createSpyObj('MenuManager', ['add'])
<ide> describe('ReopenProjectMenuManager', () => {
<ide> ])
<ide> reopenProjects.update()
<ide>
<del> reopenProjectCommand =
<add> const reopenProjectCommand =
<ide> commandRegistry.add.calls[0].args[1]['application:reopen-project']
<ide> reopenProjectCommand({ detail: { index: 1 } })
<ide>
<ide> describe('ReopenProjectMenuManager', () => {
<ide> })
<ide>
<ide> it('does not call open when no command detail is supplied', () => {
<del> reopenProjectCommand =
<add> const reopenProjectCommand =
<ide> commandRegistry.add.calls[0].args[1]['application:reopen-project']
<ide> reopenProjectCommand({})
<ide>
<ide> expect(openFunction).not.toHaveBeenCalled()
<ide> })
<ide>
<ide> it('does not call open when no command detail index is supplied', () => {
<del> reopenProjectCommand =
<add> const reopenProjectCommand =
<ide> commandRegistry.add.calls[0].args[1]['application:reopen-project']
<ide> reopenProjectCommand({ detail: { anything: 'here' } })
<ide>
<ide><path>spec/text-editor-element-spec.js
<ide> describe('TextEditorElement', () => {
<ide>
<ide> describe('::setScrollTop and ::setScrollLeft', () => {
<ide> it('changes the scroll position', async () => {
<del> element = buildTextEditorElement()
<add> const element = buildTextEditorElement()
<ide> element.getModel().update({ autoHeight: false })
<ide> element.getModel().setText('lorem\nipsum\ndolor\nsit\namet')
<ide> element.setHeight(20)
<ide><path>spec/text-mate-language-mode-spec.js
<ide> describe('TextMateLanguageMode', () => {
<ide> })
<ide>
<ide> describe('.isFoldableAtRow(row)', () => {
<add> let editor
<add>
<ide> beforeEach(() => {
<ide> buffer = atom.project.bufferForPathSync('sample.js')
<ide> buffer.insert([10, 0], ' // multi-line\n // comment\n // block\n')
<ide> describe('TextMateLanguageMode', () => {
<ide> })
<ide>
<ide> describe('.getFoldableRangesAtIndentLevel', () => {
<add> let editor
<add>
<ide> it('returns the ranges that can be folded at the given indent level', () => {
<ide> buffer = new TextBuffer(dedent`
<ide> if (a) {
<ide> describe('TextMateLanguageMode', () => {
<ide>
<ide> it('works with multi-line comments', async () => {
<ide> await atom.packages.activatePackage('language-javascript')
<del> editor = await atom.workspace.open('sample-with-comments.js', {
<add> const editor = await atom.workspace.open('sample-with-comments.js', {
<ide> autoIndent: false
<ide> })
<ide> fullyTokenize(editor.getBuffer().getLanguageMode())
<ide> describe('TextMateLanguageMode', () => {
<ide>
<ide> it('searches upward and downward for surrounding comment lines and folds them as a single fold', async () => {
<ide> await atom.packages.activatePackage('language-javascript')
<del> editor = await atom.workspace.open('sample-with-comments.js')
<add> const editor = await atom.workspace.open('sample-with-comments.js')
<ide> editor.buffer.insert(
<ide> [1, 0],
<ide> ' //this is a comment\n // and\n //more docs\n\n//second comment'
<ide><path>spec/workspace-element-spec.js
<ide> describe('WorkspaceElement', () => {
<ide> })
<ide>
<ide> describe('finding the nearest visible pane in a specific direction', () => {
<del> let pane1,
<add> let nearestPaneElement,
<add> pane1,
<ide> pane2,
<ide> pane3,
<ide> pane4,
<ide><path>spec/workspace-spec.js
<ide> describe('Workspace', () => {
<ide> .getActivePane()
<ide> .addItem(document.createElement('div'))
<ide>
<del> emittedItems = []
<add> const emittedItems = []
<ide> atom.workspace.onDidStopChangingActivePaneItem(item =>
<ide> emittedItems.push(item)
<ide> )
<ide> describe('Workspace', () => {
<ide> describe('::observeActiveTextEditor()', () => {
<ide> it('invokes the observer with current active text editor and each time a different text editor becomes active', () => {
<ide> const pane = workspace.getCenter().getActivePane()
<del> observed = []
<add> const observed = []
<ide>
<ide> const inactiveEditorBeforeRegisteringObserver = new TextEditor()
<ide> const activeEditorBeforeRegisteringObserver = new TextEditor() | 9 |
Javascript | Javascript | add datetime to the list of known attributes | 9f94244994354cb6efd6cb526a39b648e2ae5257 | <ide><path>src/dom/DefaultDOMPropertyConfig.js
<ide> var DefaultDOMPropertyConfig = {
<ide> contentEditable: null,
<ide> controls: MUST_USE_PROPERTY | HAS_BOOLEAN_VALUE,
<ide> data: null, // For `<object />` acts as `src`.
<add> datetime: null,
<ide> dir: null,
<ide> disabled: MUST_USE_PROPERTY | HAS_BOOLEAN_VALUE,
<ide> draggable: null, | 1 |
Python | Python | move import into load to avoid circular imports | e49cd7aeaf81ed12490d82b8a65ca93088ec916e | <ide><path>spacy/__init__.py
<ide>
<ide> from .cli.info import info as cli_info
<ide> from .glossary import explain
<del>from .deprecated import resolve_load_name
<ide> from .about import __version__
<ide> from . import util
<ide>
<ide>
<ide> def load(name, **overrides):
<add> from .deprecated import resolve_load_name
<ide> name = resolve_load_name(name, **overrides)
<ide> return util.load_model(name, **overrides)
<ide> | 1 |
Go | Go | remove compose types.dict alias | d3dc27d145a36b3c81fd5999f2c525af95d9596a | <ide><path>cli/compose/interpolation/interpolation.go
<ide> import (
<ide> "fmt"
<ide>
<ide> "github.com/docker/docker/cli/compose/template"
<del> "github.com/docker/docker/cli/compose/types"
<ide> )
<ide>
<ide> // Interpolate replaces variables in a string with the values from a mapping
<del>func Interpolate(config types.Dict, section string, mapping template.Mapping) (types.Dict, error) {
<del> out := types.Dict{}
<add>func Interpolate(config map[string]interface{}, section string, mapping template.Mapping) (map[string]interface{}, error) {
<add> out := map[string]interface{}{}
<ide>
<ide> for name, item := range config {
<ide> if item == nil {
<ide> out[name] = nil
<ide> continue
<ide> }
<del> interpolatedItem, err := interpolateSectionItem(name, item.(types.Dict), section, mapping)
<add> interpolatedItem, err := interpolateSectionItem(name, item.(map[string]interface{}), section, mapping)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func Interpolate(config types.Dict, section string, mapping template.Mapping) (t
<ide>
<ide> func interpolateSectionItem(
<ide> name string,
<del> item types.Dict,
<add> item map[string]interface{},
<ide> section string,
<ide> mapping template.Mapping,
<del>) (types.Dict, error) {
<add>) (map[string]interface{}, error) {
<ide>
<del> out := types.Dict{}
<add> out := map[string]interface{}{}
<ide>
<ide> for key, value := range item {
<ide> interpolatedValue, err := recursiveInterpolate(value, mapping)
<ide> func recursiveInterpolate(
<ide> case string:
<ide> return template.Substitute(value, mapping)
<ide>
<del> case types.Dict:
<del> out := types.Dict{}
<add> case map[string]interface{}:
<add> out := map[string]interface{}{}
<ide> for key, elem := range value {
<ide> interpolatedElem, err := recursiveInterpolate(elem, mapping)
<ide> if err != nil {
<ide><path>cli/compose/interpolation/interpolation_test.go
<ide> import (
<ide> "testing"
<ide>
<ide> "github.com/stretchr/testify/assert"
<del>
<del> "github.com/docker/docker/cli/compose/types"
<ide> )
<ide>
<ide> var defaults = map[string]string{
<ide> func defaultMapping(name string) (string, bool) {
<ide> }
<ide>
<ide> func TestInterpolate(t *testing.T) {
<del> services := types.Dict{
<del> "servicea": types.Dict{
<add> services := map[string]interface{}{
<add> "servicea": map[string]interface{}{
<ide> "image": "example:${USER}",
<ide> "volumes": []interface{}{"$FOO:/target"},
<del> "logging": types.Dict{
<add> "logging": map[string]interface{}{
<ide> "driver": "${FOO}",
<del> "options": types.Dict{
<add> "options": map[string]interface{}{
<ide> "user": "$USER",
<ide> },
<ide> },
<ide> },
<ide> }
<del> expected := types.Dict{
<del> "servicea": types.Dict{
<add> expected := map[string]interface{}{
<add> "servicea": map[string]interface{}{
<ide> "image": "example:jenny",
<ide> "volumes": []interface{}{"bar:/target"},
<del> "logging": types.Dict{
<add> "logging": map[string]interface{}{
<ide> "driver": "bar",
<del> "options": types.Dict{
<add> "options": map[string]interface{}{
<ide> "user": "jenny",
<ide> },
<ide> },
<ide> func TestInterpolate(t *testing.T) {
<ide> }
<ide>
<ide> func TestInvalidInterpolation(t *testing.T) {
<del> services := types.Dict{
<del> "servicea": types.Dict{
<add> services := map[string]interface{}{
<add> "servicea": map[string]interface{}{
<ide> "image": "${",
<ide> },
<ide> }
<ide><path>cli/compose/loader/loader.go
<ide> var (
<ide>
<ide> // ParseYAML reads the bytes from a file, parses the bytes into a mapping
<ide> // structure, and returns it.
<del>func ParseYAML(source []byte) (types.Dict, error) {
<add>func ParseYAML(source []byte) (map[string]interface{}, error) {
<ide> var cfg interface{}
<ide> if err := yaml.Unmarshal(source, &cfg); err != nil {
<ide> return nil, err
<ide> func ParseYAML(source []byte) (types.Dict, error) {
<ide> if err != nil {
<ide> return nil, err
<ide> }
<del> return converted.(types.Dict), nil
<add> return converted.(map[string]interface{}), nil
<ide> }
<ide>
<ide> // Load reads a ConfigDetails and returns a fully loaded configuration
<ide> func Load(configDetails types.ConfigDetails) (*types.Config, error) {
<ide> configDict := getConfigDict(configDetails)
<ide>
<ide> if services, ok := configDict["services"]; ok {
<del> if servicesDict, ok := services.(types.Dict); ok {
<add> if servicesDict, ok := services.(map[string]interface{}); ok {
<ide> forbidden := getProperties(servicesDict, types.ForbiddenProperties)
<ide>
<ide> if len(forbidden) > 0 {
<ide> func Load(configDetails types.ConfigDetails) (*types.Config, error) {
<ide> return v, ok
<ide> }
<ide> if services, ok := configDict["services"]; ok {
<del> servicesConfig, err := interpolation.Interpolate(services.(types.Dict), "service", lookupEnv)
<add> servicesConfig, err := interpolation.Interpolate(services.(map[string]interface{}), "service", lookupEnv)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func Load(configDetails types.ConfigDetails) (*types.Config, error) {
<ide> }
<ide>
<ide> if networks, ok := configDict["networks"]; ok {
<del> networksConfig, err := interpolation.Interpolate(networks.(types.Dict), "network", lookupEnv)
<add> networksConfig, err := interpolation.Interpolate(networks.(map[string]interface{}), "network", lookupEnv)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func Load(configDetails types.ConfigDetails) (*types.Config, error) {
<ide> }
<ide>
<ide> if volumes, ok := configDict["volumes"]; ok {
<del> volumesConfig, err := interpolation.Interpolate(volumes.(types.Dict), "volume", lookupEnv)
<add> volumesConfig, err := interpolation.Interpolate(volumes.(map[string]interface{}), "volume", lookupEnv)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func Load(configDetails types.ConfigDetails) (*types.Config, error) {
<ide> }
<ide>
<ide> if secrets, ok := configDict["secrets"]; ok {
<del> secretsConfig, err := interpolation.Interpolate(secrets.(types.Dict), "secret", lookupEnv)
<add> secretsConfig, err := interpolation.Interpolate(secrets.(map[string]interface{}), "secret", lookupEnv)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func GetUnsupportedProperties(configDetails types.ConfigDetails) []string {
<ide> unsupported := map[string]bool{}
<ide>
<ide> for _, service := range getServices(getConfigDict(configDetails)) {
<del> serviceDict := service.(types.Dict)
<add> serviceDict := service.(map[string]interface{})
<ide> for _, property := range types.UnsupportedProperties {
<ide> if _, isSet := serviceDict[property]; isSet {
<ide> unsupported[property] = true
<ide> func GetDeprecatedProperties(configDetails types.ConfigDetails) map[string]strin
<ide> return getProperties(getServices(getConfigDict(configDetails)), types.DeprecatedProperties)
<ide> }
<ide>
<del>func getProperties(services types.Dict, propertyMap map[string]string) map[string]string {
<add>func getProperties(services map[string]interface{}, propertyMap map[string]string) map[string]string {
<ide> output := map[string]string{}
<ide>
<ide> for _, service := range services {
<del> if serviceDict, ok := service.(types.Dict); ok {
<add> if serviceDict, ok := service.(map[string]interface{}); ok {
<ide> for property, description := range propertyMap {
<ide> if _, isSet := serviceDict[property]; isSet {
<ide> output[property] = description
<ide> func (e *ForbiddenPropertiesError) Error() string {
<ide> }
<ide>
<ide> // TODO: resolve multiple files into a single config
<del>func getConfigDict(configDetails types.ConfigDetails) types.Dict {
<add>func getConfigDict(configDetails types.ConfigDetails) map[string]interface{} {
<ide> return configDetails.ConfigFiles[0].Config
<ide> }
<ide>
<del>func getServices(configDict types.Dict) types.Dict {
<add>func getServices(configDict map[string]interface{}) map[string]interface{} {
<ide> if services, ok := configDict["services"]; ok {
<del> if servicesDict, ok := services.(types.Dict); ok {
<add> if servicesDict, ok := services.(map[string]interface{}); ok {
<ide> return servicesDict
<ide> }
<ide> }
<ide>
<del> return types.Dict{}
<add> return map[string]interface{}{}
<ide> }
<ide>
<ide> func transform(source map[string]interface{}, target interface{}) error {
<ide> func transformHook(
<ide> }
<ide>
<ide> // keys needs to be converted to strings for jsonschema
<del>// TODO: don't use types.Dict
<ide> func convertToStringKeysRecursive(value interface{}, keyPrefix string) (interface{}, error) {
<ide> if mapping, ok := value.(map[interface{}]interface{}); ok {
<del> dict := make(types.Dict)
<add> dict := make(map[string]interface{})
<ide> for key, entry := range mapping {
<ide> str, ok := key.(string)
<ide> if !ok {
<ide> func formatInvalidKeyError(keyPrefix string, key interface{}) error {
<ide>
<ide> // LoadServices produces a ServiceConfig map from a compose file Dict
<ide> // the servicesDict is not validated if directly used. Use Load() to enable validation
<del>func LoadServices(servicesDict types.Dict, workingDir string, lookupEnv template.Mapping) ([]types.ServiceConfig, error) {
<add>func LoadServices(servicesDict map[string]interface{}, workingDir string, lookupEnv template.Mapping) ([]types.ServiceConfig, error) {
<ide> var services []types.ServiceConfig
<ide>
<ide> for name, serviceDef := range servicesDict {
<del> serviceConfig, err := LoadService(name, serviceDef.(types.Dict), workingDir, lookupEnv)
<add> serviceConfig, err := LoadService(name, serviceDef.(map[string]interface{}), workingDir, lookupEnv)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func LoadServices(servicesDict types.Dict, workingDir string, lookupEnv template
<ide>
<ide> // LoadService produces a single ServiceConfig from a compose file Dict
<ide> // the serviceDict is not validated if directly used. Use Load() to enable validation
<del>func LoadService(name string, serviceDict types.Dict, workingDir string, lookupEnv template.Mapping) (*types.ServiceConfig, error) {
<add>func LoadService(name string, serviceDict map[string]interface{}, workingDir string, lookupEnv template.Mapping) (*types.ServiceConfig, error) {
<ide> serviceConfig := &types.ServiceConfig{}
<ide> if err := transform(serviceDict, serviceConfig); err != nil {
<ide> return nil, err
<ide> func transformUlimits(data interface{}) (interface{}, error) {
<ide> switch value := data.(type) {
<ide> case int:
<ide> return types.UlimitsConfig{Single: value}, nil
<del> case types.Dict:
<add> case map[string]interface{}:
<ide> ulimit := types.UlimitsConfig{}
<ide> ulimit.Soft = value["soft"].(int)
<ide> ulimit.Hard = value["hard"].(int)
<ide> func transformUlimits(data interface{}) (interface{}, error) {
<ide>
<ide> // LoadNetworks produces a NetworkConfig map from a compose file Dict
<ide> // the source Dict is not validated if directly used. Use Load() to enable validation
<del>func LoadNetworks(source types.Dict) (map[string]types.NetworkConfig, error) {
<add>func LoadNetworks(source map[string]interface{}) (map[string]types.NetworkConfig, error) {
<ide> networks := make(map[string]types.NetworkConfig)
<ide> err := transform(source, &networks)
<ide> if err != nil {
<ide> func LoadNetworks(source types.Dict) (map[string]types.NetworkConfig, error) {
<ide>
<ide> // LoadVolumes produces a VolumeConfig map from a compose file Dict
<ide> // the source Dict is not validated if directly used. Use Load() to enable validation
<del>func LoadVolumes(source types.Dict) (map[string]types.VolumeConfig, error) {
<add>func LoadVolumes(source map[string]interface{}) (map[string]types.VolumeConfig, error) {
<ide> volumes := make(map[string]types.VolumeConfig)
<ide> err := transform(source, &volumes)
<ide> if err != nil {
<ide> func LoadVolumes(source types.Dict) (map[string]types.VolumeConfig, error) {
<ide>
<ide> // LoadSecrets produces a SecretConfig map from a compose file Dict
<ide> // the source Dict is not validated if directly used. Use Load() to enable validation
<del>func LoadSecrets(source types.Dict, workingDir string) (map[string]types.SecretConfig, error) {
<add>func LoadSecrets(source map[string]interface{}, workingDir string) (map[string]types.SecretConfig, error) {
<ide> secrets := make(map[string]types.SecretConfig)
<ide> if err := transform(source, &secrets); err != nil {
<ide> return secrets, err
<ide> func transformMapStringString(data interface{}) (interface{}, error) {
<ide> switch value := data.(type) {
<ide> case map[string]interface{}:
<ide> return toMapStringString(value, false), nil
<del> case types.Dict:
<del> return toMapStringString(value, false), nil
<ide> case map[string]string:
<ide> return value, nil
<ide> default:
<ide> func transformExternal(data interface{}) (interface{}, error) {
<ide> switch value := data.(type) {
<ide> case bool:
<ide> return map[string]interface{}{"external": value}, nil
<del> case types.Dict:
<del> return map[string]interface{}{"external": true, "name": value["name"]}, nil
<ide> case map[string]interface{}:
<ide> return map[string]interface{}{"external": true, "name": value["name"]}, nil
<ide> default:
<ide> func transformServicePort(data interface{}) (interface{}, error) {
<ide> return data, err
<ide> }
<ide> ports = append(ports, v...)
<del> case types.Dict:
<del> ports = append(ports, value)
<ide> case map[string]interface{}:
<ide> ports = append(ports, value)
<ide> default:
<ide> func transformServiceSecret(data interface{}) (interface{}, error) {
<ide> switch value := data.(type) {
<ide> case string:
<ide> return map[string]interface{}{"source": value}, nil
<del> case types.Dict:
<del> return data, nil
<ide> case map[string]interface{}:
<ide> return data, nil
<ide> default:
<ide> func transformServiceVolumeConfig(data interface{}) (interface{}, error) {
<ide> switch value := data.(type) {
<ide> case string:
<ide> return parseVolume(value)
<del> case types.Dict:
<del> return data, nil
<ide> case map[string]interface{}:
<ide> return data, nil
<ide> default:
<ide> func transformStringList(data interface{}) (interface{}, error) {
<ide>
<ide> func transformMappingOrList(mappingOrList interface{}, sep string, allowNil bool) interface{} {
<ide> switch value := mappingOrList.(type) {
<del> case types.Dict:
<add> case map[string]interface{}:
<ide> return toMapStringString(value, allowNil)
<ide> case ([]interface{}):
<ide> result := make(map[string]interface{})
<ide><path>cli/compose/loader/loader_test.go
<ide> import (
<ide> "github.com/stretchr/testify/assert"
<ide> )
<ide>
<del>func buildConfigDetails(source types.Dict, env map[string]string) types.ConfigDetails {
<add>func buildConfigDetails(source map[string]interface{}, env map[string]string) types.ConfigDetails {
<ide> workingDir, err := os.Getwd()
<ide> if err != nil {
<ide> panic(err)
<ide> networks:
<ide> - subnet: 172.28.0.0/16
<ide> `
<ide>
<del>var sampleDict = types.Dict{
<add>var sampleDict = map[string]interface{}{
<ide> "version": "3",
<del> "services": types.Dict{
<del> "foo": types.Dict{
<add> "services": map[string]interface{}{
<add> "foo": map[string]interface{}{
<ide> "image": "busybox",
<del> "networks": types.Dict{"with_me": nil},
<add> "networks": map[string]interface{}{"with_me": nil},
<ide> },
<del> "bar": types.Dict{
<add> "bar": map[string]interface{}{
<ide> "image": "busybox",
<ide> "environment": []interface{}{"FOO=1"},
<ide> "networks": []interface{}{"with_ipam"},
<ide> },
<ide> },
<del> "volumes": types.Dict{
<del> "hello": types.Dict{
<add> "volumes": map[string]interface{}{
<add> "hello": map[string]interface{}{
<ide> "driver": "default",
<del> "driver_opts": types.Dict{
<add> "driver_opts": map[string]interface{}{
<ide> "beep": "boop",
<ide> },
<ide> },
<ide> },
<del> "networks": types.Dict{
<del> "default": types.Dict{
<add> "networks": map[string]interface{}{
<add> "default": map[string]interface{}{
<ide> "driver": "bridge",
<del> "driver_opts": types.Dict{
<add> "driver_opts": map[string]interface{}{
<ide> "beep": "boop",
<ide> },
<ide> },
<del> "with_ipam": types.Dict{
<del> "ipam": types.Dict{
<add> "with_ipam": map[string]interface{}{
<add> "ipam": map[string]interface{}{
<ide> "driver": "default",
<ide> "config": []interface{}{
<del> types.Dict{
<add> map[string]interface{}{
<ide> "subnet": "172.28.0.0/16",
<ide> },
<ide> },
<ide><path>cli/compose/types/types.go
<ide> var ForbiddenProperties = map[string]string{
<ide> "memswap_limit": "Set resource limits using deploy.resources",
<ide> }
<ide>
<del>// Dict is a mapping of strings to interface{}
<del>type Dict map[string]interface{}
<del>
<ide> // ConfigFile is a filename and the contents of the file as a Dict
<ide> type ConfigFile struct {
<ide> Filename string
<del> Config Dict
<add> Config map[string]interface{}
<ide> }
<ide>
<ide> // ConfigDetails are the details about a group of ConfigFiles | 5 |
Javascript | Javascript | add method on yellowbox to ignore warnings | a974c140db605ecbdf8d3faa7a079b7e2dcebb09 | <ide><path>Libraries/ReactNative/YellowBox.js
<ide> type WarningInfo = {
<ide>
<ide> const _warningEmitter = new EventEmitter();
<ide> const _warningMap: Map<string, WarningInfo> = new Map();
<add>const IGNORED_WARNINGS: Array<string> = [];
<ide>
<ide> /**
<ide> * YellowBox renders warnings at the bottom of the app being developed.
<ide> const _warningMap: Map<string, WarningInfo> = new Map();
<ide> * console.disableYellowBox = true;
<ide> * console.warn('YellowBox is disabled.');
<ide> *
<del> * Warnings can be ignored programmatically by setting the array:
<add> * Ignore specific warnings by calling:
<add> *
<add> * YellowBox.ignoreWarnings(['Warning: ...']);
<add> *
<add> * (DEPRECATED) Warnings can be ignored programmatically by setting the array:
<ide> *
<ide> * console.ignoredYellowBox = ['Warning: ...'];
<ide> *
<ide> function ensureSymbolicatedWarning(warning: string): void {
<ide> }
<ide>
<ide> function isWarningIgnored(warning: string): boolean {
<add> const isIgnored =
<add> IGNORED_WARNINGS.some(
<add> (ignoredWarning: string) => warning.startsWith(ignoredWarning)
<add> );
<add>
<add> if (isIgnored) {
<add> return true;
<add> }
<add>
<add> // DEPRECATED
<ide> return (
<ide> Array.isArray(console.ignoredYellowBox) &&
<ide> console.ignoredYellowBox.some(
<ide> class YellowBox extends React.Component {
<ide> };
<ide> }
<ide>
<add> static ignoreWarnings(warnings: Array<string>): void {
<add> warnings.forEach((warning: string) => {
<add> if (IGNORED_WARNINGS.indexOf(warning) === -1) {
<add> IGNORED_WARNINGS.push(warning);
<add> }
<add> });
<add> }
<add>
<ide> componentDidMount() {
<ide> let scheduled = null;
<ide> this._listener = _warningEmitter.addListener('warning', warningMap => { | 1 |
Text | Text | add michaël zasso to the ctc | 9e4a899d31181a34b20b99785fae7a6a97210e60 | <ide><path>README.md
<ide> more information about the governance of the Node.js project, see
<ide> **Rod Vagg** <rod@vagg.org>
<ide> * [shigeki](https://github.com/shigeki) -
<ide> **Shigeki Ohtsu** <ohtsu@iij.ad.jp>
<add>* [targos](https://github.com/targos) -
<add>**Michaël Zasso** <targos@protonmail.com>
<ide> * [TheAlphaNerd](https://github.com/TheAlphaNerd) -
<ide> **Myles Borins** <myles.borins@gmail.com>
<ide> * [thefourtheye](https://github.com/thefourtheye) -
<ide> more information about the governance of the Node.js project, see
<ide> **Steven R Loomis** <srloomis@us.ibm.com>
<ide> * [stefanmb](https://github.com/stefanmb) -
<ide> **Stefan Budeanu** <stefan@budeanu.com>
<del>* [targos](https://github.com/targos) -
<del>**Michaël Zasso** <targos@protonmail.com>
<ide> * [tellnes](https://github.com/tellnes) -
<ide> **Christian Tellnes** <christian@tellnes.no>
<ide> * [thekemkid](https://github.com/thekemkid) - | 1 |
Ruby | Ruby | add some emoji tests 💯 | 65dde58057029d473be9175444b333525fe82850 | <ide><path>Library/Homebrew/test/emoji_test.rb
<add>require "testing_env"
<add>require "emoji"
<add>
<add>class EmojiTest < Homebrew::TestCase
<add> def test_install_badge
<add> assert_equal "🍺", Emoji.install_badge
<add>
<add> ENV["HOMEBREW_INSTALL_BADGE"] = "foo"
<add> assert_equal "foo", Emoji.install_badge
<add> end
<add>end
<ide><path>Library/Homebrew/test/utils_test.rb
<ide> def setup
<ide> @dir = Pathname.new(mktmpdir)
<ide> end
<ide>
<add> # Helper for matching escape sequences.
<add> def e(code)
<add> /(\e\[\d+m)*\e\[#{code}m/
<add> end
<add>
<add> # Helper for matching that style is reset at the end of a string.
<add> Z = /(\e\[\d+m)*\e\[0m\Z/
<add>
<ide> def test_ofail
<ide> shutup { ofail "foo" }
<ide> assert Homebrew.failed?
<ide> def test_odie
<ide> end
<ide>
<ide> def test_pretty_installed
<add> $stdout.stubs(:tty?).returns true
<add> ENV.delete("HOMEBREW_NO_EMOJI")
<add> assert_match(/\A#{e 1}foo #{e 32}✔#{Z}/, pretty_installed("foo"))
<add>
<add> ENV["HOMEBREW_NO_EMOJI"] = "1"
<add> assert_match(/\A#{e 1}foo \(installed\)#{Z}/, pretty_installed("foo"))
<add>
<ide> $stdout.stubs(:tty?).returns false
<ide> assert_equal "foo", pretty_installed("foo")
<ide> end
<ide>
<ide> def test_pretty_uninstalled
<add> $stdout.stubs(:tty?).returns true
<add> ENV.delete("HOMEBREW_NO_EMOJI")
<add> assert_match(/\A#{e 1}foo #{e 31}✘#{Z}/, pretty_uninstalled("foo"))
<add>
<add> ENV["HOMEBREW_NO_EMOJI"] = "1"
<add> assert_match(/\A#{e 1}foo \(uninstalled\)#{Z}/, pretty_uninstalled("foo"))
<add>
<ide> $stdout.stubs(:tty?).returns false
<ide> assert_equal "foo", pretty_uninstalled("foo")
<ide> end | 2 |
Java | Java | remove unnecessary static modifier | f3ce7238ced24349247c25ed2a76357f03f4eaf5 | <ide><path>src/main/java/rx/Notification.java
<ide> public void accept(Observer<? super T> observer) {
<ide> }
<ide> }
<ide>
<del> public static enum Kind {
<add> public enum Kind {
<ide> OnNext, OnError, OnCompleted
<ide> }
<ide>
<ide><path>src/main/java/rx/Single.java
<ide> public final static <T> Single<T> create(OnSubscribe<T> f) {
<ide> /**
<ide> * Invoked when Single.execute is called.
<ide> */
<del> public static interface OnSubscribe<T> extends Action1<SingleSubscriber<? super T>> {
<add> public interface OnSubscribe<T> extends Action1<SingleSubscriber<? super T>> {
<ide> // cover for generics insanity
<ide> }
<ide>
<ide> public <R> Single<R> compose(Transformer<? super T, ? extends R> transformer) {
<ide> *
<ide> * @warn more complete description needed
<ide> */
<del> public static interface Transformer<T, R> extends Func1<Single<T>, Single<R>> {
<add> public interface Transformer<T, R> extends Func1<Single<T>, Single<R>> {
<ide> // cover for generics insanity
<ide> }
<ide> | 2 |
Javascript | Javascript | add tests for c1 and support for six-digit years | 273b9ef58e32d21a1a151fec2ef7902f42553c73 | <ide><path>moment.js
<ide> aspNetJsonRegex = /^\/?Date\((\-?\d+)/i,
<ide>
<ide> // format tokens
<del> formattingTokens = /(\[[^\[]*\])|(\\)?(Mo|MM?M?M?|Do|DDDo|DD?D?D?|ddd?d?|do?|w[o|w]?|YYYY|YY|a|A|hh?|HH?|mm?|ss?|SS?S?|zz?|ZZ?)/g,
<add> formattingTokens = /(\[[^\[]*\])|(\\)?(Mo|MM?M?M?|Do|DDDo|DD?D?D?|ddd?d?|do?|w[o|w]?|YYYYY|YYYY|YY|a|A|hh?|HH?|mm?|ss?|SS?S?|zz?|ZZ?|-)/g,
<ide> localFormattingTokens = /(LT|LL?L?L?)/g,
<ide> formattingRemoveEscapes = /(^\[)|(\\)|\]$/g,
<ide>
<ide> parseTokenOneToThreeDigits = /\d{1,3}/, // 0 - 999
<ide> parseTokenThreeDigits = /\d{3}/, // 000 - 999
<ide> parseTokenFourDigits = /\d{4}/, // 0000 - 9999
<add> parseTokenFiveToSixDigits = /[+\-]?\d{5,6}/, // -999,999 - 999,999
<ide> parseTokenWord = /[0-9a-z\u00A0-\uD7FF\uF900-\uFDCF\uFDF0-\uFFEF]+/i, // any word characters or numbers
<ide> parseTokenTimezone = /Z|[\+\-]\d\d:?\d\d/i, // +00:00 -00:00 +0000 -0000 or Z
<ide> parseTokenT = /T/i, // T (ISO seperator)
<ide> w : '(a=new Date(t.year(),t.month(),t.date()-t.day()+5),b=new Date(a.getFullYear(),0,4),a=~~((a-b)/864e5/7+1.5))',
<ide> YY : 'p(t.year()%100,2)',
<ide> YYYY : 'p(t.year(),4)',
<add> YYYYY: 'p(t.year(),5)',
<ide> a : 'm(t.hours(),t.minutes(),!0)',
<ide> A : 'm(t.hours(),t.minutes(),!1)',
<ide> H : 't.hours()',
<ide> return parseTokenThreeDigits;
<ide> case 'YYYY':
<ide> return parseTokenFourDigits;
<add> case 'YYYYY':
<add> return parseTokenFiveToSixDigits;
<ide> case 'S':
<ide> case 'SS':
<ide> case 'SSS':
<ide> datePartArray[0] = input + (input > 70 ? 1900 : 2000);
<ide> break;
<ide> case 'YYYY' :
<del> datePartArray[0] = ~~Math.abs(input);
<add> case 'YYYYY' :
<add> datePartArray[0] = ~~input;
<ide> break;
<ide> // AM / PM
<ide> case 'a' : // fall through to A
<ide><path>test/moment/create.js
<ide> exports.create = {
<ide> test.equal(moment(null), null, "Calling moment(null)");
<ide> test.equal(moment('', 'YYYY-MM-DD'), null, "Calling moment('', 'YYYY-MM-DD')");
<ide> test.done();
<add> },
<add>
<add> "first century" : function(test) {
<add> test.expect(2);
<add> test.equal(moment([0, 0, 1]).format("YYYY-MM-DD"), "0000-01-01", "Year AD 0");
<add> test.equal(moment([99, 0, 1]).format("YYYY-MM-DD"), "0099-01-01", "Year AD 99");
<add> test.done();
<add> },
<add>
<add> "six digit years" : function(test) {
<add> test.expect(5);
<add> test.equal(moment([-270000, 0, 1]).format("YYYYY-MM-DD"), "-270000-01-01", "format BC 270,000");
<add> test.equal(moment([ 270000, 0, 1]).format("YYYYY-MM-DD"), "270000-01-01", "format AD 270,000");
<add> test.equal(moment("-270000-01-01", "YYYYY-MM-DD").toDate().getUTCFullYear(), -270000, "parse BC 270,000");
<add> test.equal(moment("270000-01-01", "YYYYY-MM-DD").toDate().getUTCFullYear(), 270000, "parse AD 270,000");
<add> test.equal(moment("+270000-01-01", "YYYYY-MM-DD").toDate().getUTCFullYear(), 270000, "parse AD +270,000");
<add> test.done();
<ide> }
<ide> }; | 2 |
Ruby | Ruby | remove useless ternary in postgresql adapter | 6a534455f66a24e807d75dccc847981fdf43fa18 | <ide><path>activerecord/lib/active_record/connection_adapters/postgresql_adapter.rb
<ide> def quoted_date(value) #:nodoc:
<ide>
<ide> def supports_disable_referential_integrity?() #:nodoc:
<ide> version = query("SHOW server_version")[0][0].split('.')
<del> (version[0].to_i >= 8 && version[1].to_i >= 1) ? true : false
<add> version[0].to_i >= 8 && version[1].to_i >= 1
<ide> rescue
<ide> return false
<ide> end | 1 |
Ruby | Ruby | add linux and macos only function blocks | c7927f5af594c0d2c20cea6957adfb78bdf9b98c | <ide><path>Library/Homebrew/extend/os/formula.rb
<add># frozen_string_literal: true
<add>
<add>if OS.mac?
<add> require "extend/os/mac/formula"
<add>elsif OS.linux?
<add> require "extend/os/linux/formula"
<add>end
<ide><path>Library/Homebrew/extend/os/linux/formula.rb
<add># frozen_string_literal: true
<add>
<add>class Formula
<add> class << self
<add> undef on_linux
<add>
<add> def on_linux(&_block)
<add> yield
<add> end
<add> end
<add>end
<ide><path>Library/Homebrew/extend/os/mac/formula.rb
<add># frozen_string_literal: true
<add>
<add>class Formula
<add> class << self
<add> undef on_macos
<add>
<add> def on_macos(&_block)
<add> yield
<add> end
<add> end
<add>end
<ide><path>Library/Homebrew/formula.rb
<ide> def depends_on(dep)
<ide> specs.each { |spec| spec.depends_on(dep) }
<ide> end
<ide>
<add> # Indicates use of dependencies provided by macOS.
<add> # On macOS this is a no-op (as we use the system libraries there).
<add> # On Linux this will act as `depends_on`.
<ide> def uses_from_macos(dep)
<ide> specs.each { |spec| spec.uses_from_macos(dep) }
<ide> end
<ide>
<add> # Block executed only executed on macOS. No-op on Linux.
<add> # <pre>on_macos do
<add> # depends_on "mac_only_dep"
<add> # end</pre>
<add> def on_macos(&_block); end
<add>
<add> # Block executed only executed on Linux. No-op on macOS.
<add> # <pre>on_linux do
<add> # depends_on "linux_only_dep"
<add> # end</pre>
<add> def on_linux(&_block); end
<add>
<ide> # @!attribute [w] option
<ide> # Options can be used as arguments to `brew install`.
<ide> # To switch features on/off: `"with-something"` or `"with-otherthing"`.
<ide> def link_overwrite_paths
<ide> end
<ide> end
<ide> end
<add>
<add>require "extend/os/formula"
<ide><path>Library/Homebrew/test/os/linux/formula_spec.rb
<ide> expect(f.class.head.deps.first.name).to eq("foo")
<ide> end
<ide> end
<add>
<add> describe "#on_linux" do
<add> it "defines an url on Linux only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> on_macos do
<add> url "https://brew.sh/test-macos-0.1.tbz"
<add> sha256 TEST_SHA256
<add> end
<add>
<add> on_linux do
<add> url "https://brew.sh/test-linux-0.1.tbz"
<add> sha256 TEST_SHA256
<add> end
<add> end
<add>
<add> expect(f.stable.url).to eq("https://brew.sh/test-linux-0.1.tbz")
<add> end
<add> end
<add>
<add> describe "#on_linux" do
<add> it "adds a dependency on Linux only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> url "https://brew.sh/test-0.1.tbz"
<add> sha256 TEST_SHA256
<add>
<add> depends_on "hello_both"
<add>
<add> on_macos do
<add> depends_on "hello_macos"
<add> end
<add>
<add> on_linux do
<add> depends_on "hello_linux"
<add> end
<add> end
<add>
<add> expect(f.class.stable.deps[0].name).to eq("hello_both")
<add> expect(f.class.stable.deps[1].name).to eq("hello_linux")
<add> expect(f.class.stable.deps[2]).to eq(nil)
<add> end
<add> end
<add>
<add> describe "#on_linux" do
<add> it "adds a patch on Linux only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> url "https://brew.sh/test-0.1.tbz"
<add> sha256 TEST_SHA256
<add>
<add> patch do
<add> url "patch_both"
<add> end
<add>
<add> on_macos do
<add> patch do
<add> url "patch_macos"
<add> end
<add> end
<add>
<add> on_linux do
<add> patch do
<add> url "patch_linux"
<add> end
<add> end
<add> end
<add>
<add> expect(f.patchlist.length).to eq(2)
<add> expect(f.patchlist.first.strip).to eq(:p1)
<add> expect(f.patchlist.first.url).to eq("patch_both")
<add> expect(f.patchlist.second.strip).to eq(:p1)
<add> expect(f.patchlist.second.url).to eq("patch_linux")
<add> end
<add> end
<ide> end
<ide><path>Library/Homebrew/test/os/mac/formula_spec.rb
<add># frozen_string_literal: true
<add>
<add>require "formula"
<add>
<add>describe Formula do
<add> describe "#on_macos" do
<add> it "defines an url on macos only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> on_macos do
<add> url "https://brew.sh/test-macos-0.1.tbz"
<add> sha256 TEST_SHA256
<add> end
<add>
<add> on_linux do
<add> url "https://brew.sh/test-linux-0.1.tbz"
<add> sha256 TEST_SHA256
<add> end
<add> end
<add>
<add> expect(f.stable.url).to eq("https://brew.sh/test-macos-0.1.tbz")
<add> end
<add> end
<add>
<add> describe "#on_macos" do
<add> it "adds a dependency on macos only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> url "https://brew.sh/test-0.1.tbz"
<add> sha256 TEST_SHA256
<add>
<add> depends_on "hello_both"
<add>
<add> on_macos do
<add> depends_on "hello_macos"
<add> end
<add>
<add> on_linux do
<add> depends_on "hello_linux"
<add> end
<add> end
<add>
<add> expect(f.class.stable.deps[0].name).to eq("hello_both")
<add> expect(f.class.stable.deps[1].name).to eq("hello_macos")
<add> expect(f.class.stable.deps[2]).to eq(nil)
<add> end
<add> end
<add>
<add> describe "#on_macos" do
<add> it "adds a patch on macos only" do
<add> f = formula do
<add> homepage "https://brew.sh"
<add>
<add> url "https://brew.sh/test-0.1.tbz"
<add> sha256 TEST_SHA256
<add>
<add> patch do
<add> url "patch_both"
<add> end
<add>
<add> on_macos do
<add> patch do
<add> url "patch_macos"
<add> end
<add> end
<add>
<add> on_linux do
<add> patch do
<add> url "patch_linux"
<add> end
<add> end
<add> end
<add>
<add> expect(f.patchlist.length).to eq(2)
<add> expect(f.patchlist.first.strip).to eq(:p1)
<add> expect(f.patchlist.first.url).to eq("patch_both")
<add> expect(f.patchlist.second.strip).to eq(:p1)
<add> expect(f.patchlist.second.url).to eq("patch_macos")
<add> end
<add> end
<add>end | 6 |
Javascript | Javascript | fix the unicode symbols on the text layer | 9234c315d5869580ae14bb5e721a89a4f8225529 | <ide><path>src/fonts.js
<ide> var Font = (function Font() {
<ide> break;
<ide> }
<ide>
<del> var unicodeChars = this.toUnicode ? this.toUnicode[charcode] : charcode;
<add> var unicodeChars = !('toUnicode' in this) ? charcode :
<add> this.toUnicode[charcode] || charcode;
<ide> if (typeof unicodeChars === 'number')
<ide> unicodeChars = String.fromCharCode(unicodeChars);
<ide> | 1 |
Javascript | Javascript | remove undefined function | 8331f571edb27491161589f30e88962dfe436c2c | <ide><path>test/parallel/test-timers-max-duration-warning.js
<ide> const timers = require('timers');
<ide> const OVERFLOW = Math.pow(2, 31); // TIMEOUT_MAX is 2^31-1
<ide>
<ide> function timerNotCanceled() {
<del> common.fail('Timer should be canceled');
<add> assert.fail('Timer should be canceled');
<ide> }
<ide>
<ide> process.on('warning', common.mustCall((warning) => {
<ide><path>test/sequential/test-http-server-consumed-timeout.js
<ide> 'use strict';
<ide>
<ide> const common = require('../common');
<add>
<add>const assert = require('assert');
<ide> const http = require('http');
<ide>
<ide> let time = Date.now();
<ide> const server = http.createServer((req, res) => {
<ide> req.setTimeout(TIMEOUT, () => {
<ide> if (!intervalWasInvoked)
<ide> return common.skip('interval was not invoked quickly enough for test');
<del> common.fail('Request timeout should not fire');
<add> assert.fail('Request timeout should not fire');
<ide> });
<ide>
<ide> req.resume(); | 2 |
Python | Python | add danish language data | bb8be3d194689a78c29cf04b989e25a30405e1e2 | <ide><path>spacy/lang/da/__init__.py
<add># coding: utf8
<add>from __future__ import unicode_literals
<add>
<add>from .tokenizer_exceptions import TOKENIZER_EXCEPTIONS
<add>from .stop_words import STOP_WORDS
<add>
<add>from ..tokenizer_exceptions import BASE_EXCEPTIONS
<add>from ...language import Language
<add>from ...attrs import LANG
<add>from ...util import update_exc
<add>
<add>
<add>class Danish(Language):
<add> lang = 'da'
<add>
<add> class Defaults(Language.Defaults):
<add> lex_attr_getters = dict(Language.Defaults.lex_attr_getters)
<add> lex_attr_getters[LANG] = lambda text: 'da'
<add>
<add> tokenizer_exceptions = update_exc(BASE_EXCEPTIONS)
<add> stop_words = set(STOP_WORDS)
<add>
<add>
<add>__all__ = ['Danish']
<ide><path>spacy/lang/da/stop_words.py
<add># encoding: utf8
<add>from __future__ import unicode_literals
<add>
<add>
<add># Source: https://github.com/stopwords-iso/stopwords-da
<add>
<add>STOP_WORDS = set("""
<add>ad af aldrig alle alt anden andet andre at
<add>
<add>bare begge blev blive bliver
<add>
<add>da de dem den denne der deres det dette dig din dine disse dit dog du
<add>
<add>efter ej eller en end ene eneste enhver er et
<add>
<add>far fem fik fire flere fleste for fordi forrige fra få får før
<add>
<add>god godt
<add>
<add>ham han hans har havde have hej helt hende hendes her hos hun hvad hvem hver
<add>hvilken hvis hvor hvordan hvorfor hvornår
<add>
<add>i ikke ind ingen intet
<add>
<add>ja jeg jer jeres jo
<add>
<add>kan kom komme kommer kun kunne
<add>
<add>lad lav lidt lige lille
<add>
<add>man mand mange med meget men mens mere mig min mine mit mod må
<add>
<add>ned nej ni nogen noget nogle nu ny nyt når nær næste næsten
<add>
<add>og også okay om op os otte over
<add>
<add>på
<add>
<add>se seks selv ser ses sig sige sin sine sit skal skulle som stor store syv så
<add>sådan
<add>
<add>tag tage thi ti til to tre
<add>
<add>ud under
<add>
<add>var ved vi vil ville vor vores være været
<add>""".split())
<ide><path>spacy/lang/da/tokenizer_exceptions.py
<add># encoding: utf8
<add>from __future__ import unicode_literals
<add>
<add>from ...symbols import ORTH, LEMMA
<add>
<add>
<add>_exc = {}
<add>
<add>
<add>for orth in [
<add> "A/S", "beg.", "bl.a.", "ca.", "d.s.s.", "dvs.", "f.eks.", "fr.", "hhv.",
<add> "if.", "iflg.", "m.a.o.", "mht.", "min.", "osv.", "pga.", "resp.", "self.",
<add> "t.o.m.", "vha.", ""]:
<add> _exc[orth] = [{ORTH: orth}]
<add>
<add>
<add>TOKENIZER_EXCEPTIONS = dict(_exc) | 3 |
PHP | PHP | fix casing of method name | 595cced4214326a4d62b172a0519e37cf460486b | <ide><path>src/Database/Schema/MysqlSchema.php
<ide> public function describeOptionsSql($tableName, $config)
<ide> */
<ide> public function convertOptionsDescription(TableSchema $schema, $row)
<ide> {
<del> $schema->SetOptions([
<add> $schema->setOptions([
<ide> 'engine' => $row['Engine'],
<ide> 'collation' => $row['Collation'],
<ide> ]); | 1 |
Javascript | Javascript | add missing hooks to progress | 3d745045a1d53d1fb3c78bdbb2c85ebe8099a19f | <ide><path>lib/ProgressPlugin.js
<ide> class ProgressPlugin {
<ide> const hooks = {
<ide> finishModules: "finish module graph",
<ide> seal: "sealing",
<add> beforeChunks: "chunk graph",
<add> afterChunks: "after chunk graph",
<ide> optimizeDependenciesBasic: "basic dependencies optimization",
<ide> optimizeDependencies: "dependencies optimization",
<ide> optimizeDependenciesAdvanced: "advanced dependencies optimization",
<ide> class ProgressPlugin {
<ide> recordModules: "record modules",
<ide> recordChunks: "record chunks",
<ide> beforeHash: "hashing",
<add> contentHash: "content hashing",
<ide> afterHash: "after hashing",
<ide> recordHash: "record hash",
<ide> beforeModuleAssets: "module assets processing", | 1 |
Javascript | Javascript | adjust logic as suggested by westlangley | 779784c74c7423e79e4e339d98af42cc3715a746 | <ide><path>src/geometries/ExtrudeGeometry.js
<ide> function ExtrudeBufferGeometry( shapes, options ) {
<ide>
<ide> t = b / bevelSegments;
<ide> z = bevelThickness * Math.cos( t * Math.PI / 2 );
<del> bs = Math.abs( bevelSize ) * Math.sin( t * Math.PI / 2 );
<del>
<del> if ( bevelSize < 0 ) {
<del>
<del> bs += bevelSize;
<del>
<del> }
<add> var s = Math.sin( t * Math.PI / 2 );
<add> bs = ( bevelSize > 0 ) ? bevelSize * s : bevelSize * ( 1 - s );
<ide>
<ide> // contract shape
<ide>
<ide> function ExtrudeBufferGeometry( shapes, options ) {
<ide>
<ide> }
<ide>
<del> bs = ( bevelSize < 0 ? 0 : bevelSize );
<add> // Special case of
<add> // bs = ( bevelSize > 0 ) ? bevelSize * s : bevelSize * ( 1 - s );
<add> // with s = 1
<add> bs = ( bevelSize > 0 ? bevelSize : 0 );
<ide>
<ide> // Back facing vertices
<ide>
<ide> function ExtrudeBufferGeometry( shapes, options ) {
<ide>
<ide> t = b / bevelSegments;
<ide> z = bevelThickness * Math.cos( t * Math.PI / 2 );
<del> bs = Math.abs( bevelSize ) * Math.sin( t * Math.PI / 2 );
<del>
<del> if ( bevelSize < 0 ) {
<del>
<del> bs += bevelSize;
<del>
<del> }
<add> var s = Math.sin( t * Math.PI / 2 );
<add> bs = ( bevelSize > 0 ) ? bevelSize * s : bevelSize * ( 1 - s );
<ide>
<ide> // contract shape
<ide> | 1 |
Javascript | Javascript | update hash with empty string for missing files | fe62f9eabb933ac1c2d343a91c6d26f8378dbdb6 | <ide><path>lib/FileSystemInfo.js
<ide> class FileSystemInfo {
<ide> }
<ide>
<ide> if (stat.isFile()) {
<del> return this.getFileHash(child, callback);
<add> return this.getFileHash(child, (err, hash) => {
<add> callback(err, hash || "");
<add> });
<ide> }
<ide> if (stat.isDirectory()) {
<ide> this.contextHashQueue.increaseParallelism(); | 1 |
PHP | PHP | change visibility on engine | f135399a5a2c013897aa0df54540d9d9954c7e45 | <ide><path>src/Illuminate/Database/Schema/Blueprint.php
<ide> class Blueprint {
<ide> *
<ide> * @var string
<ide> */
<del> protected $engine;
<add> public $engine;
<ide>
<ide> /**
<ide> * Create a new schema blueprint. | 1 |
Java | Java | add a resolvabletype field to handlerresult | c6713c23e327961d021706e5b1e4d5ebe8e2f458 | <ide><path>spring-web-reactive/src/main/java/org/springframework/reactive/web/dispatch/HandlerResult.java
<ide>
<ide> package org.springframework.reactive.web.dispatch;
<ide>
<add>import org.springframework.core.ResolvableType;
<ide>
<ide> /**
<ide> * Represent the result of the invocation of an handler.
<ide> public class HandlerResult {
<ide>
<ide> private final Object value;
<ide>
<add> private final ResolvableType type;
<ide>
<del> public HandlerResult(Object handler, Object value) {
<add>
<add> public HandlerResult(Object handler, Object value, ResolvableType type) {
<ide> this.handler = handler;
<ide> this.value = value;
<add> this.type = type;
<ide> }
<ide>
<ide>
<ide> public Object getValue() {
<ide> return this.value;
<ide> }
<ide>
<add> public ResolvableType getType() {
<add> return type;
<add> }
<ide> }
<ide><path>spring-web-reactive/src/main/java/org/springframework/reactive/web/dispatch/SimpleHandlerResultHandler.java
<ide> import reactor.Publishers;
<ide>
<ide> import org.springframework.core.Ordered;
<add>import org.springframework.core.ResolvableType;
<ide> import org.springframework.http.server.ReactiveServerHttpRequest;
<ide> import org.springframework.http.server.ReactiveServerHttpResponse;
<ide>
<ide> */
<ide> public class SimpleHandlerResultHandler implements Ordered, HandlerResultHandler {
<ide>
<add> private static final ResolvableType PUBLISHER_VOID = ResolvableType.forClassWithGenerics(Publisher.class, Void.class);
<add>
<ide> private int order = Ordered.LOWEST_PRECEDENCE;
<ide>
<ide>
<ide> public int getOrder() {
<ide>
<ide> @Override
<ide> public boolean supports(HandlerResult result) {
<del> Object value = result.getValue();
<del> return value != null && Publisher.class.isAssignableFrom(value.getClass());
<add> ResolvableType type = result.getType();
<add> return type != null && PUBLISHER_VOID.isAssignableFrom(type);
<ide> }
<ide>
<ide> @Override
<ide><path>spring-web-reactive/src/main/java/org/springframework/reactive/web/dispatch/handler/HttpHandlerAdapter.java
<ide> import org.reactivestreams.Publisher;
<ide> import reactor.Publishers;
<ide>
<add>import org.springframework.core.ResolvableType;
<ide> import org.springframework.http.server.ReactiveServerHttpRequest;
<ide> import org.springframework.http.server.ReactiveServerHttpResponse;
<ide> import org.springframework.reactive.web.dispatch.HandlerAdapter;
<ide> */
<ide> public class HttpHandlerAdapter implements HandlerAdapter {
<ide>
<add> private static final ResolvableType PUBLISHER_VOID = ResolvableType.forClassWithGenerics(Publisher.class, Void.class);
<add>
<ide>
<ide> @Override
<ide> public boolean supports(Object handler) {
<ide> public Publisher<HandlerResult> handle(ReactiveServerHttpRequest request,
<ide>
<ide> HttpHandler httpHandler = (HttpHandler)handler;
<ide> Publisher<Void> completion = httpHandler.handle(request, response);
<del> return Publishers.just(new HandlerResult(httpHandler, completion));
<add> return Publishers.just(new HandlerResult(httpHandler, completion, PUBLISHER_VOID));
<ide> }
<ide>
<ide> }
<ide><path>spring-web-reactive/src/main/java/org/springframework/reactive/web/dispatch/method/annotation/RequestMappingHandlerAdapter.java
<ide> import reactor.Publishers;
<ide>
<ide> import org.springframework.beans.factory.InitializingBean;
<add>import org.springframework.core.ResolvableType;
<ide> import org.springframework.core.convert.ConversionService;
<ide> import org.springframework.http.server.ReactiveServerHttpRequest;
<ide> import org.springframework.http.server.ReactiveServerHttpResponse;
<ide> public Publisher<HandlerResult> handle(ReactiveServerHttpRequest request,
<ide>
<ide> InvocableHandlerMethod handlerMethod = new InvocableHandlerMethod((HandlerMethod) handler);
<ide> handlerMethod.setHandlerMethodArgumentResolvers(this.argumentResolvers);
<add> ResolvableType type = ResolvableType.forMethodParameter(handlerMethod.getReturnType());
<ide>
<ide> Publisher<Object> resultPublisher = handlerMethod.invokeForRequest(request);
<del> return Publishers.map(resultPublisher, result -> new HandlerResult(handlerMethod, result));
<add> return Publishers.map(resultPublisher, result -> new HandlerResult(handlerMethod, result, type));
<ide> }
<ide>
<ide> }
<ide>\ No newline at end of file
<ide><path>spring-web-reactive/src/main/java/org/springframework/reactive/web/dispatch/method/annotation/ResponseBodyResultHandler.java
<ide> public Publisher<Void> handleResult(ReactiveServerHttpRequest request,
<ide> return Publishers.empty();
<ide> }
<ide>
<del> HandlerMethod hm = (HandlerMethod) result.getHandler();
<del> ResolvableType returnType = ResolvableType.forMethodParameter(hm.getReturnValueType(value));
<add> ResolvableType returnType = result.getType();
<ide>
<ide> List<MediaType> requestedMediaTypes = getAcceptableMediaTypes(request);
<ide> List<MediaType> producibleMediaTypes = getProducibleMediaTypes(returnType);
<ide><path>spring-web-reactive/src/test/java/org/springframework/reactive/web/dispatch/SimpleHandlerResultHandlerTests.java
<add>/*
<add> * Copyright 2002-2015 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.reactive.web.dispatch;
<add>
<add>import static org.junit.Assert.assertFalse;
<add>import static org.junit.Assert.assertTrue;
<add>import org.junit.Test;
<add>import org.reactivestreams.Publisher;
<add>
<add>import org.springframework.core.ResolvableType;
<add>import org.springframework.web.method.HandlerMethod;
<add>
<add>/**
<add> * @author Sebastien Deleuze
<add> */
<add>public class SimpleHandlerResultHandlerTests {
<add>
<add> @Test
<add> public void supports() throws NoSuchMethodException {
<add>
<add> SimpleHandlerResultHandler resultHandler = new SimpleHandlerResultHandler();
<add> TestController controller = new TestController();
<add>
<add> HandlerMethod hm = new HandlerMethod(controller, TestController.class.getMethod("voidReturnValue"));
<add> ResolvableType type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertFalse(resultHandler.supports(new HandlerResult(hm, null, type)));
<add>
<add> hm = new HandlerMethod(controller, TestController.class.getMethod("publisherString"));
<add> type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertFalse(resultHandler.supports(new HandlerResult(hm, null, type)));
<add>
<add> hm = new HandlerMethod(controller, TestController.class.getMethod("publisherVoid"));
<add> type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertTrue(resultHandler.supports(new HandlerResult(hm, null, type)));
<add> }
<add>
<add>
<add> @SuppressWarnings("unused")
<add> private static class TestController {
<add>
<add> public Publisher<String> voidReturnValue() {
<add> return null;
<add> }
<add>
<add> public Publisher<String> publisherString() {
<add> return null;
<add> }
<add>
<add> public Publisher<Void> publisherVoid() {
<add> return null;
<add> }
<add> }
<add>
<add>}
<ide><path>spring-web-reactive/src/test/java/org/springframework/reactive/web/dispatch/method/annotation/ResponseBodyResultHandlerTests.java
<ide> import org.junit.Test;
<ide> import org.reactivestreams.Publisher;
<ide>
<add>import org.springframework.core.ResolvableType;
<ide> import org.springframework.core.convert.support.DefaultConversionService;
<ide> import org.springframework.reactive.codec.encoder.StringEncoder;
<ide> import org.springframework.reactive.web.dispatch.HandlerResult;
<ide> public void supports() throws NoSuchMethodException {
<ide> TestController controller = new TestController();
<ide>
<ide> HandlerMethod hm = new HandlerMethod(controller,TestController.class.getMethod("notAnnotated"));
<del> assertFalse(handler.supports(new HandlerResult(hm, null)));
<add> ResolvableType type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertFalse(handler.supports(new HandlerResult(hm, null, type)));
<ide>
<ide> hm = new HandlerMethod(controller, TestController.class.getMethod("publisherString"));
<del> assertTrue(handler.supports(new HandlerResult(hm, null)));
<add> type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertTrue(handler.supports(new HandlerResult(hm, null, type)));
<ide>
<ide> hm = new HandlerMethod(controller, TestController.class.getMethod("publisherVoid"));
<del> assertTrue(handler.supports(new HandlerResult(hm, null)));
<add> type = ResolvableType.forMethodParameter(hm.getReturnType());
<add> assertTrue(handler.supports(new HandlerResult(hm, null, type)));
<ide> }
<ide>
<ide> | 7 |
Javascript | Javascript | fix whitespace in headers | fda7a673c1a78633bc55b0ae762d7cae51ca230f | <ide><path>src/addons/shallowCompare.js
<ide> * LICENSE file in the root directory of this source tree. An additional grant
<ide> * of patent rights can be found in the PATENTS file in the same directory.
<ide> *
<del>* @providesModule shallowCompare
<del>*/
<add> * @providesModule shallowCompare
<add> */
<ide>
<ide> 'use strict';
<ide>
<ide><path>src/renderers/dom/client/renderSubtreeIntoContainer.js
<ide> * LICENSE file in the root directory of this source tree. An additional grant
<ide> * of patent rights can be found in the PATENTS file in the same directory.
<ide> *
<del>* @providesModule renderSubtreeIntoContainer
<del>*/
<add> * @providesModule renderSubtreeIntoContainer
<add> */
<ide>
<ide> 'use strict';
<ide>
<ide><path>src/renderers/shared/stack/reconciler/ReactInstanceType.js
<del> /**
<add>/**
<ide> * Copyright 2016-present, Facebook, Inc.
<ide> * All rights reserved.
<ide> *
<ide> * This source code is licensed under the BSD-style license found in the
<ide> * LICENSE file in the root directory of this source tree. An additional grant
<ide> * of patent rights can be found in the PATENTS file in the same directory.
<ide> *
<del> * @flow
<ide> * @providesModule ReactInstanceType
<add> * @flow
<ide> */
<ide>
<ide> 'use strict'; | 3 |
PHP | PHP | convert i18n shell tests to be integration style | ed13c7cfb82ed441e39aa6cd713dcb0f8576f0b9 | <ide><path>tests/TestCase/Shell/I18nShellTest.php
<ide> namespace Cake\Test\TestCase\Shell;
<ide>
<ide> use Cake\Shell\I18nShell;
<del>use Cake\TestSuite\TestCase;
<add>use Cake\TestSuite\ConsoleIntegrationTestCase;
<ide>
<ide> /**
<ide> * I18nShell test.
<ide> */
<del>class I18nShellTest extends TestCase
<add>class I18nShellTest extends ConsoleIntegrationTestCase
<ide> {
<ide> /**
<ide> * setup method
<ide> public function testInit()
<ide> unlink($deDir . 'cake.po');
<ide> }
<ide>
<del> $this->shell->getIo()->expects($this->at(0))
<del> ->method('ask')
<del> ->will($this->returnValue('de_DE'));
<del> $this->shell->getIo()->expects($this->at(1))
<del> ->method('ask')
<del> ->will($this->returnValue($this->localeDir));
<del>
<del> $this->shell->params['verbose'] = true;
<del> $this->shell->init();
<add> $this->exec('i18n init --verbose', [
<add> 'de_DE',
<add> $this->localeDir,
<add> ]);
<ide>
<add> $this->assertExitSuccess();
<add> $this->assertOutputContains('Generated 2 PO files');
<ide> $this->assertFileExists($deDir . 'default.po');
<ide> $this->assertFileExists($deDir . 'cake.po');
<ide> }
<ide> public function testInit()
<ide> */
<ide> public function testGetOptionParser()
<ide> {
<del> $this->shell->loadTasks();
<del> $parser = $this->shell->getOptionParser();
<del> $this->assertArrayHasKey('init', $parser->subcommands());
<del> $this->assertArrayHasKey('extract', $parser->subcommands());
<add> $this->exec('i18n -h');
<add>
<add> $this->assertExitSuccess();
<add> $this->assertOutputContains('init');
<add> $this->assertOutputContains('extract');
<add> }
<add>
<add> /**
<add> * Tests main interactive mode
<add> *
<add> * @return void
<add> */
<add> public function testInteractiveQuit()
<add> {
<add> $this->exec('i18n', ['q']);
<add> $this->assertExitSuccess();
<add> }
<add>
<add> /**
<add> * Tests main interactive mode
<add> *
<add> * @return void
<add> */
<add> public function testInteractiveHelp()
<add> {
<add> $this->exec('i18n', ['h', 'q']);
<add> $this->assertExitSuccess();
<add> $this->assertOutputContains('cake i18n');
<add> $this->assertOutputContains('init');
<add> }
<add>
<add> /**
<add> * Tests main interactive mode
<add> *
<add> * @return void
<add> */
<add> public function testInteractiveInit()
<add> {
<add> $this->exec('i18n', [
<add> 'i',
<add> 'q',
<add> ]);
<add> $this->assertExitError();
<add> $this->assertErrorContains('Invalid language code');
<ide> }
<ide> } | 1 |
Text | Text | update writable.write return value | f347dad0b7b1787092cca88789b77eb3def2d319 | <ide><path>doc/api/stream.md
<ide> occurs, the `callback` *may or may not* be called with the error as its
<ide> first argument. To reliably detect write errors, add a listener for the
<ide> `'error'` event.
<ide>
<del>The return value indicates whether the written `chunk` was buffered internally
<del>and the buffer has exceeded the `highWaterMark` configured when the stream was
<del>created. If `false` is returned, further attempts to write data to the stream
<del>should be paused until the [`'drain'`][] event is emitted.
<add>The return value is `true` if the internal buffer does not exceed
<add>`highWaterMark` configured when the stream was created after admitting `chunk`.
<add>If `false` is returned, further attempts to write data to the stream should
<add>stop until the [`'drain'`][] event is emitted. However, the `false` return
<add>value is only advisory and the writable stream will unconditionally accept and
<add>buffer `chunk` even if it has not not been allowed to drain.
<ide>
<ide> A Writable stream in object mode will always ignore the `encoding` argument.
<ide> | 1 |
Javascript | Javascript | remove unused function argument | 751d215eb8334bff0e50b7a379b55180870b7f5c | <ide><path>lib/internal/inspector/inspect_repl.js
<ide> function createRepl(inspector) {
<ide> const lines = watchedExpressions
<ide> .map((expr, idx) => {
<ide> const prefix = `${leftPad(idx, ' ', lastIndex)}: ${expr} =`;
<del> const value = inspect(values[idx], { colors: true });
<add> const value = inspect(values[idx]);
<ide> if (value.indexOf('\n') === -1) {
<ide> return `${prefix} ${value}`;
<ide> } | 1 |
Javascript | Javascript | remove root.unmount() callback from devtools code | d2ae77d0e41fc6cee22bf3f0c9bd9c3d8bcec58c | <ide><path>packages/react-devtools-extensions/src/main.js
<ide> function createPanelIfReactLoaded() {
<ide>
<ide> // It's easiest to recreate the DevTools panel (to clean up potential stale state).
<ide> // We can revisit this in the future as a small optimization.
<del> flushSync(() => {
<del> root.unmount(() => {
<del> initBridgeAndStore();
<del> });
<del> });
<add> flushSync(() => root.unmount());
<add>
<add> initBridgeAndStore();
<ide> });
<ide> },
<ide> ); | 1 |
Text | Text | trim github template comments | 42ede93d14838b062dfad870bb10f1606f55e2c2 | <ide><path>.github/ISSUE_TEMPLATE.md
<ide> <!--
<del>Thanks for wanting to report an issue you've found in Node.js. Please fill in
<del>the template below by replacing the html comments with an appropriate answer.
<del>If unsure about something, just do as best as you're able.
<add>Thank you for reporting an issue. Please fill in the template below. If unsure
<add>about something, just do as best as you're able.
<ide>
<del>version: usually output of `node -v`
<del>platform: either `uname -a` output, or if Windows, version and 32 or 64-bit.
<del>subsystem: optional -- if known please specify affected core module name.
<add>Version: usually output of `node -v`
<add>Platform: either `uname -a` output, or if Windows, version and 32 or 64-bit
<add>Subsystem: if known, please specify affected core module name
<ide>
<ide> It will be much easier for us to fix the issue if a test case that reproduces
<ide> the problem is provided. Ideally this test case should not have any external
<ide> dependencies. We understand that it is not always possible to reduce your code
<ide> to a small test case, but we would appreciate to have as
<ide> much data as possible.
<del>
<del>Thank you!
<ide> -->
<ide>
<ide> * **Version**:
<ide><path>.github/PULL_REQUEST_TEMPLATE.md
<ide> <!--
<del>Thank you for submitting a pull request to Node.js. Before you submit, please
<del>review below requirements and walk through the checklist. You can 'tick'
<del>a box by using the letter "x": [x].
<add>Thank you for your pull request. Please review below requirements and walk
<add>through the checklist. You can 'tick' a box by using the letter "x": [x].
<ide>
<del>Run the test suite by invoking: `make -j4 lint test` on linux or
<del>`vcbuild test nosign` on Windows.
<add>Run the test suite with: `make -j4 test` on UNIX or `vcbuild test nosign` on
<add>Windows.
<ide>
<ide> If this aims to fix a regression or you’re adding a feature, make sure you also
<del>write a test. Finally – if possible – a benchmark that quantifies your changes.
<add>write a test. If possible, include a benchmark that quantifies your changes.
<ide>
<ide> Finally, read through our contributors guide and make adjustments as necessary:
<ide> https://github.com/nodejs/node/blob/master/CONTRIBUTING.md
<ide> -->
<ide>
<ide> ##### Checklist
<del>
<ide> <!-- remove lines that do not apply to you -->
<ide>
<ide> - [ ] tests and code linting passes
<ide> https://github.com/nodejs/node/blob/master/CONTRIBUTING.md
<ide>
<ide>
<ide> ##### Affected core subsystem(s)
<del>
<ide> <!-- provide affected core subsystem(s) (like doc, cluster, crypto, etc) -->
<ide>
<ide>
<ide> ##### Description of change
<del>
<ide> <!-- provide a description of the change below this comment --> | 2 |
Javascript | Javascript | remove optimistic update | 8ce2cde619b4b4d66276933ce48f103e0c13e123 | <ide><path>common/app/routes/Hikes/flux/Actions.js
<ide> export default Actions({
<ide> }
<ide>
<ide> // challenge completed
<del> const optimisticSave = isSignedIn ?
<del> this.post$('/completed-challenge', { id, name, challengeType }) :
<del> Observable.just(true);
<add> let update$;
<add> if (isSignedIn) {
<add> const body = { id, name, challengeType };
<add> update$ = this.postJSON$('/completed-challenge', body)
<add> // if post fails, will retry once
<add> .retry(3)
<add> .map(({ alreadyCompleted, points }) => ({
<add> transform(state) {
<add> return {
<add> ...state,
<add> points,
<add> toast: {
<add> message:
<add> 'Challenge saved.' +
<add> (alreadyCompleted ? '' : ' First time Completed!'),
<add> title: 'Saved',
<add> type: 'info',
<add> id: state.toast && state.toast.id ? state.toast.id + 1 : 1
<add> }
<add> };
<add> }
<add> }))
<add> .catch((errObj => {
<add> const err = new Error(errObj.message);
<add> err.stack = errObj.stack;
<add> return {
<add> transform(state) { return { ...state, err }; }
<add> };
<add> }));
<add> } else {
<add> update$ = Observable.just({ transform: (() => {}) });
<add> }
<add>
<add> const challengeCompleted$ = Observable.just({
<add> transform(state) {
<add> const { hikes, currentHike: { id } } = state.hikesApp;
<add> const currentHike = findNextHike(hikes, id);
<add>
<add> return {
<add> ...state,
<add> points: isSignedIn ? state.points + 1 : state.points,
<add> hikesApp: {
<add> ...state.hikesApp,
<add> currentHike,
<add> showQuestions: false,
<add> currentQuestion: 1,
<add> mouse: [0, 0]
<add> },
<add> toast: {
<add> title: 'Congratulations!',
<add> message: 'Hike completed.' + (isSignedIn ? ' Saving...' : ''),
<add> id: state.toast && state.toast.id ?
<add> state.toast.id + 1 :
<add> 1,
<add> type: 'success'
<add> },
<add> location: {
<add> action: 'PUSH',
<add> pathname: currentHike && currentHike.dashedName ?
<add> `/hikes/${ currentHike.dashedName }` :
<add> '/hikes'
<add> }
<add> };
<add> }
<add> });
<ide>
<ide> const correctAnswer = {
<ide> transform(state) {
<ide> export default Actions({
<ide> }
<ide> };
<ide>
<del> return Observable.just({
<del> transform(state) {
<del> const { hikes, currentHike: { id } } = state.hikesApp;
<del> const currentHike = findNextHike(hikes, id);
<del>
<del> return {
<del> ...state,
<del> points: isSignedIn ? state.points + 1 : state.points,
<del> hikesApp: {
<del> ...state.hikesApp,
<del> currentHike,
<del> showQuestions: false,
<del> currentQuestion: 1,
<del> mouse: [0, 0]
<del> },
<del> toast: {
<del> title: 'Congratulations!',
<del> message: 'Hike completed',
<del> id: state.toast && typeof state.toast.id === 'number' ?
<del> state.toast.id + 1 :
<del> 0,
<del> type: 'success'
<del> },
<del> location: {
<del> action: 'PUSH',
<del> pathname: currentHike && currentHike.dashedName ?
<del> `/hikes/${ currentHike.dashedName }` :
<del> '/hikes'
<del> }
<del> };
<del> },
<del> optimistic: optimisticSave
<del> })
<add> return Observable.merge(challengeCompleted$, update$)
<ide> .delay(300)
<ide> .startWith(correctAnswer)
<ide> .catch(err => Observable.just({ | 1 |
PHP | PHP | implement failing tests for prefixing aliases | 554713afb487666fb8d7400d2be6a7b7427ba5de | <ide><path>tests/Database/DatabaseQueryBuilderTest.php
<ide> public function testBasicAlias()
<ide> }
<ide>
<ide>
<add> public function testAliasWithPrefix()
<add> {
<add> $builder = $this->getBuilder();
<add> $builder->getGrammar()->setTablePrefix('prefix_');
<add> $builder->select('*')->from('users as people');
<add> $this->assertEquals('select * from "prefix_users" as "prefix_people"', $builder->toSql());
<add> }
<add>
<add>
<add> public function testJoinAliasesWithPrefix()
<add> {
<add> $builder = $this->getBuilder();
<add> $builder->getGrammar()->setTablePrefix('prefix_');
<add> $builder->select('*')->from('services')->join('translations AS t', 't.item_id', '=', 'services.id');
<add> $this->assertEquals('select * from "prefix_services" inner join "prefix_translations" as "prefix_t" on "prefix_t"."item_id" = "prefix_services"."id"', $builder->toSql());
<add> }
<add>
<add>
<ide> public function testBasicTableWrapping()
<ide> {
<ide> $builder = $this->getBuilder(); | 1 |
PHP | PHP | fix flash directory in docblock | 8156c16dd7f91f5f295157422b3b98c09a943896 | <ide><path>src/Controller/Component/FlashComponent.php
<ide> public function set($message, array $options = []) {
<ide> * Magic method for verbose flash methods based on element names.
<ide> *
<ide> * For example: $this->Flash->success('My message') would use the
<del> * success.ctp element under `Template/Element/Flash` for rendering the
<add> * success.ctp element under `App/Template/Element/Flash` for rendering the
<ide> * flash message.
<ide> *
<ide> * @param string $name Element name to use. | 1 |
PHP | PHP | add assertions for boolean values | ea51c30de77d33f6e436e4f3d75888c0e0a35ec9 | <ide><path>tests/TestCase/Validation/ValidationTest.php
<ide> public function testIsInteger()
<ide> $this->assertFalse(Validation::isInteger([]));
<ide> $this->assertFalse(Validation::isInteger(new \StdClass));
<ide> $this->assertFalse(Validation::isInteger('2 bears'));
<add> $this->assertFalse(Validation::isInteger(true));
<add> $this->assertFalse(Validation::isInteger(false));
<ide> }
<ide>
<ide> /** | 1 |
Python | Python | accept single tensor | bb9b800ebba8914b69db362cfac4e7c8a9b17a9e | <ide><path>keras/engine/training.py
<ide> def compile(self, optimizer,
<ide> for name in self.output_names:
<ide> tmp_target_tensors.append(target_tensors.get(name, None))
<ide> target_tensors = tmp_target_tensors
<add> elif K.is_tensor(target_tensors):
<add> if len(self.outputs) != 1:
<add> raise ValueError('The model has ' + str(len(self.outputs)) +
<add> ' outputs, but you passed a single tensor as '
<add> '`target_tensors`. Expected a list or a dict '
<add> 'of tensors.')
<add> target_tensors = [target_tensors]
<ide> else:
<del> raise TypeError('Expected `target_tensors` to be '
<del> 'a list or dict, but got:', target_tensors)
<add> raise TypeError('Expected `target_tensors` to be a tensor, '
<add> 'a list of tensors, or dict of tensors, but got:', target_tensors)
<add>
<ide> for i in range(len(self.outputs)):
<ide> if i in skip_target_indices:
<ide> self.targets.append(None)
<ide><path>tests/keras/engine/test_training.py
<ide> def mse(y_true, y_pred):
<ide> def gen_data():
<ide> while True:
<ide> yield (np.asarray([]), np.asarray([]))
<add>
<ide> out = model.evaluate_generator(gen_data(), steps=1)
<ide>
<ide> # x is not a list of numpy arrays.
<ide> def gen_data():
<ide> def gen_data():
<ide> while True:
<ide> yield (np.asarray([]), np.asarray([]))
<add>
<ide> out = model.fit_generator(generator=gen_data(), epochs=5,
<ide> initial_epoch=0, validation_data=gen_data(),
<ide> callbacks=[tracker_cb])
<ide> def gen_data(i):
<ide> gen_counters[i] += 1
<ide> yield ([np.random.random((1, 3)), np.random.random((1, 3))],
<ide> [np.random.random((1, 4)), np.random.random((1, 3))])
<add>
<ide> out = model.fit_generator(generator=gen_data(0), epochs=3,
<ide> steps_per_epoch=2,
<ide> validation_data=gen_data(1),
<ide> def test_target_tensors():
<ide> target_tensors={'dense': target})
<ide> model.train_on_batch(input_val, None)
<ide>
<add> # single-output, as tensor
<add> model.compile(optimizer='rmsprop', loss='mse',
<add> target_tensors=target)
<add> model.train_on_batch(input_val, None)
<add>
<ide> # test invalid arguments
<ide> with pytest.raises(TypeError):
<ide> model.compile(optimizer='rmsprop', loss='mse',
<ide> def test_target_tensors():
<ide> 'dense_b': target_b})
<ide> model.train_on_batch(input_val, None)
<ide>
<add> # multi-output, not enough target tensors when `target_tensors` is not a dict
<add> with pytest.raises(ValueError, match='When passing a list as `target_tensors`, it should have one entry per model '
<add> 'output. The model has \d outputs, but you passed target_tensors='):
<add> model.compile(optimizer='rmsprop', loss='mse',
<add> target_tensors=[target_a])
<add> with pytest.raises(ValueError, match='The model has \d outputs, but you passed a single tensor as '
<add> '`target_tensors`. Expected a list or a dict of tensors.'):
<add> model.compile(optimizer='rmsprop', loss='mse',
<add> target_tensors=target_a)
<add>
<ide> # test with sample weights
<ide> model.compile(optimizer='rmsprop', loss='mse',
<ide> target_tensors=[target_a, target_b])
<ide> def test_model_with_crossentropy_losses_channels_first():
<ide> `channels_first` or `channels_last` image_data_format.
<ide> Tests PR #9715.
<ide> """
<add>
<ide> def prepare_simple_model(input_tensor, loss_name, target):
<ide> axis = 1 if K.image_data_format() == 'channels_first' else -1
<ide> if loss_name == 'sparse_categorical_crossentropy': | 2 |
Text | Text | add season to build status | 5920143bb9c725da22c185904600643bc2ac9877 | <ide><path>docs/build-instructions/build-status.md
<ide> | [Oniguruma](https://github.com/atom/node-oniguruma) | [](https://travis-ci.org/atom/node-oniguruma) | [](https://ci.appveyor.com/project/Atom/node-oniguruma/branch/master) | [](https://david-dm.org/atom/node-oniguruma) |
<ide> | [PathWatcher](https://github.com/atom/node-pathwatcher) | [](https://travis-ci.org/atom/node-pathwatcher) | [](https://ci.appveyor.com/project/Atom/node-pathwatcher) | [](https://david-dm.org/atom/node-pathwatcher) |
<ide> | [Property Accessors](https://github.com/atom/property-accessors) | [](https://travis-ci.org/atom/property-accessors) | [](https://ci.appveyor.com/project/Atom/property-accessors/branch/master) | [](https://david-dm.org/atom/property-accessors) |
<add>| [Season](https://github.com/atom/season) | [](https://travis-ci.org/atom/season) | [](https://ci.appveyor.com/project/Atom/season) | [](https://david-dm.org/atom/season) |
<ide> | [TextBuffer](https://github.com/atom/text-buffer) | [](https://travis-ci.org/atom/text-buffer) | [](https://ci.appveyor.com/project/Atom/text-buffer/branch/master) | [](https://david-dm.org/atom/text-buffer) |
<ide> | [Underscore-Plus](https://github.com/atom/underscore-plus) | [](https://travis-ci.org/atom/underscore-plus) | [](https://ci.appveyor.com/project/Atom/underscore-plus/branch/master) | [](https://david-dm.org/atom/underscore-plus) |
<ide> | 1 |
Javascript | Javascript | add bailout to hostroot | 0907a49abd709308e95a93e11dde16d67c06dd43 | <ide><path>src/renderers/shared/fiber/ReactFiberBeginWork.js
<ide> module.exports = function<T, P, I, TI, C, CX>(
<ide> if (updateQueue) {
<ide> const prevState = workInProgress.memoizedState;
<ide> const state = beginUpdateQueue(workInProgress, updateQueue, null, prevState, null, priorityLevel);
<add> if (prevState === state) {
<add> // If the state is the same as before, that's a bailout because we had
<add> // no work matching this priority.
<add> return bailoutOnAlreadyFinishedWork(current, workInProgress);
<add> }
<ide> const element = state.element;
<ide> reconcileChildren(current, workInProgress, element);
<ide> workInProgress.memoizedState = state;
<add> return workInProgress.child;
<ide> }
<del> return workInProgress.child;
<add> // If there is no update queue, that's a bailout because the root has no props.
<add> return bailoutOnAlreadyFinishedWork(current, workInProgress);
<ide> }
<ide>
<ide> function updateHostComponent(current, workInProgress) { | 1 |
Javascript | Javascript | fix typo in viewpagerandroid.android.js | 2849bafbc37429377cc5c5a57774dce22ffaf9c9 | <ide><path>Libraries/Components/ViewPager/ViewPagerAndroid.android.js
<ide> var ViewPagerAndroid = React.createClass({
<ide>
<ide> /**
<ide> * A helper function to scroll to a specific page in the ViewPager.
<del> * The transition between pages will be *not* be animated.
<add> * The transition between pages will *not* be animated.
<ide> */
<ide> setPageWithoutAnimation: function(selectedPage: number) {
<ide> UIManager.dispatchViewManagerCommand( | 1 |
Python | Python | improve validation of group id | 833e1094a72b5a09f6b2249001b977538f139a19 | <ide><path>airflow/utils/helpers.py
<ide> from airflow.models import TaskInstance
<ide>
<ide> KEY_REGEX = re.compile(r'^[\w.-]+$')
<add>GROUP_KEY_REGEX = re.compile(r'^[\w-]+$')
<ide> CAMELCASE_TO_SNAKE_CASE_REGEX = re.compile(r'(?!^)([A-Z]+)')
<ide>
<ide> T = TypeVar('T')
<ide> S = TypeVar('S')
<ide>
<ide>
<del>def validate_key(k: str, max_length: int = 250) -> bool:
<add>def validate_key(k: str, max_length: int = 250):
<ide> """Validates value used as a key."""
<ide> if not isinstance(k, str):
<del> raise TypeError("The key has to be a string")
<del> elif len(k) > max_length:
<add> raise TypeError(f"The key has to be a string and is {type(k)}:{k}")
<add> if len(k) > max_length:
<ide> raise AirflowException(f"The key has to be less than {max_length} characters")
<del> elif not KEY_REGEX.match(k):
<add> if not KEY_REGEX.match(k):
<ide> raise AirflowException(
<ide> "The key ({k}) has to be made of alphanumeric characters, dashes, "
<ide> "dots and underscores exclusively".format(k=k)
<ide> )
<del> else:
<del> return True
<add>
<add>
<add>def validate_group_key(k: str, max_length: int = 200):
<add> """Validates value used as a group key."""
<add> if not isinstance(k, str):
<add> raise TypeError(f"The key has to be a string and is {type(k)}:{k}")
<add> if len(k) > max_length:
<add> raise AirflowException(f"The key has to be less than {max_length} characters")
<add> if not GROUP_KEY_REGEX.match(k):
<add> raise AirflowException(
<add> f"The key ({k}) has to be made of alphanumeric characters, dashes " "and underscores exclusively"
<add> )
<ide>
<ide>
<ide> def alchemy_to_dict(obj: Any) -> Optional[Dict]:
<ide><path>airflow/utils/task_group.py
<ide>
<ide> from airflow.exceptions import AirflowException, DuplicateTaskIdFound
<ide> from airflow.models.taskmixin import TaskMixin
<add>from airflow.utils.helpers import validate_group_key
<ide>
<ide> if TYPE_CHECKING:
<ide> from airflow.models.baseoperator import BaseOperator
<ide> def __init__(
<ide> self.used_group_ids: Set[Optional[str]] = set()
<ide> self._parent_group = None
<ide> else:
<del> if not isinstance(group_id, str):
<del> raise ValueError("group_id must be str")
<del> if not group_id:
<del> raise ValueError("group_id must not be empty")
<add> if prefix_group_id:
<add> # If group id is used as prefix, it should not contain spaces nor dots
<add> # because it is used as prefix in the task_id
<add> validate_group_key(group_id)
<add> else:
<add> if not isinstance(group_id, str):
<add> raise ValueError("group_id must be str")
<add> if not group_id:
<add> raise ValueError("group_id must not be empty")
<ide>
<ide> dag = dag or DagContext.get_current_dag()
<ide>
<ide><path>tests/utils/test_helpers.py
<ide> # KIND, either express or implied. See the License for the
<ide> # specific language governing permissions and limitations
<ide> # under the License.
<del>
<add>import re
<ide> import unittest
<ide> from datetime import datetime
<ide>
<ide> import pytest
<add>from parameterized import parameterized
<ide>
<add>from airflow import AirflowException
<ide> from airflow.models import TaskInstance
<ide> from airflow.models.dag import DAG
<ide> from airflow.operators.dummy import DummyOperator
<ide> from airflow.utils import helpers
<del>from airflow.utils.helpers import build_airflow_url_with_query, merge_dicts
<add>from airflow.utils.helpers import build_airflow_url_with_query, merge_dicts, validate_group_key, validate_key
<ide> from tests.test_utils.config import conf_vars
<ide>
<ide>
<ide> def test_build_airflow_url_with_query(self):
<ide>
<ide> with cached_app(testing=True).test_request_context():
<ide> assert build_airflow_url_with_query(query) == expected_url
<add>
<add> @parameterized.expand(
<add> [
<add> (3, "The key has to be a string and is <class 'int'>:3", TypeError),
<add> (None, "The key has to be a string and is <class 'NoneType'>:None", TypeError),
<add> ("simple_key", None, None),
<add> ("simple-key", None, None),
<add> ("group.simple_key", None, None),
<add> ("root.group.simple-key", None, None),
<add> (
<add> "key with space",
<add> "The key (key with space) has to be made of alphanumeric "
<add> "characters, dashes, dots and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (
<add> "key_with_!",
<add> "The key (key_with_!) has to be made of alphanumeric "
<add> "characters, dashes, dots and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (' ' * 251, "The key has to be less than 250 characters", AirflowException),
<add> ]
<add> )
<add> def test_validate_key(self, key_id, message, exception):
<add> if message:
<add> with pytest.raises(exception, match=re.escape(message)):
<add> validate_key(key_id)
<add> else:
<add> validate_key(key_id)
<add>
<add> @parameterized.expand(
<add> [
<add> (3, "The key has to be a string and is <class 'int'>:3", TypeError),
<add> (None, "The key has to be a string and is <class 'NoneType'>:None", TypeError),
<add> ("simple_key", None, None),
<add> ("simple-key", None, None),
<add> (
<add> "group.simple_key",
<add> "The key (group.simple_key) has to be made of alphanumeric "
<add> "characters, dashes and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (
<add> "root.group-name.simple_key",
<add> "The key (root.group-name.simple_key) has to be made of alphanumeric "
<add> "characters, dashes and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (
<add> "key with space",
<add> "The key (key with space) has to be made of alphanumeric "
<add> "characters, dashes and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (
<add> "key_with_!",
<add> "The key (key_with_!) has to be made of alphanumeric "
<add> "characters, dashes and underscores exclusively",
<add> AirflowException,
<add> ),
<add> (' ' * 201, "The key has to be less than 200 characters", AirflowException),
<add> ]
<add> )
<add> def test_validate_group_key(self, key_id, message, exception):
<add> if message:
<add> with pytest.raises(exception, match=re.escape(message)):
<add> validate_group_key(key_id)
<add> else:
<add> validate_group_key(key_id) | 3 |
Java | Java | introduce json streaming support | 6b9b0230c4de394348c42b6f5964fc66eb659dd3 | <ide><path>spring-web/src/main/java/org/springframework/http/MediaType.java
<ide> public class MediaType extends MimeType implements Serializable {
<ide> */
<ide> public final static String APPLICATION_RSS_XML_VALUE = "application/rss+xml";
<ide>
<add> /**
<add> * Public constant media type for {@code application/stream+json}.
<add> * @since 5.0
<add> */
<add> public final static MediaType APPLICATION_STREAM_JSON;
<add>
<add> /**
<add> * A String equivalent of {@link MediaType#APPLICATION_STREAM_JSON}.
<add> * @since 5.0
<add> */
<add> public final static String APPLICATION_STREAM_JSON_VALUE = "application/stream+json";
<add>
<ide> /**
<ide> * Public constant media type for {@code application/xhtml+xml}.
<ide> */
<ide> public class MediaType extends MimeType implements Serializable {
<ide> APPLICATION_PROBLEM_JSON = valueOf(APPLICATION_PROBLEM_JSON_VALUE);
<ide> APPLICATION_PROBLEM_XML = valueOf(APPLICATION_PROBLEM_XML_VALUE);
<ide> APPLICATION_RSS_XML = valueOf(APPLICATION_RSS_XML_VALUE);
<add> APPLICATION_STREAM_JSON = valueOf(APPLICATION_STREAM_JSON_VALUE);
<ide> APPLICATION_XHTML_XML = valueOf(APPLICATION_XHTML_XML_VALUE);
<ide> APPLICATION_XML = valueOf(APPLICATION_XML_VALUE);
<ide> IMAGE_GIF = valueOf(IMAGE_GIF_VALUE);
<ide><path>spring-web/src/main/java/org/springframework/http/codec/json/Jackson2JsonEncoder.java
<ide> /*
<del> * Copyright 2002-2016 the original author or authors.
<add> * Copyright 2002-2017 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> import java.io.IOException;
<ide> import java.io.OutputStream;
<del>import java.nio.ByteBuffer;
<ide> import java.util.List;
<ide> import java.util.Map;
<ide>
<ide> import com.fasterxml.jackson.databind.SerializationFeature;
<ide> import com.fasterxml.jackson.databind.type.TypeFactory;
<ide> import org.reactivestreams.Publisher;
<add>import static org.springframework.http.MediaType.APPLICATION_STREAM_JSON;
<ide> import reactor.core.publisher.Flux;
<ide> import reactor.core.publisher.Mono;
<ide>
<ide> */
<ide> public class Jackson2JsonEncoder extends AbstractJackson2Codec implements Encoder<Object> {
<ide>
<del> private static final ByteBuffer START_ARRAY_BUFFER = ByteBuffer.wrap(new byte[]{'['});
<del>
<del> private static final ByteBuffer SEPARATOR_BUFFER = ByteBuffer.wrap(new byte[]{','});
<del>
<del> private static final ByteBuffer END_ARRAY_BUFFER = ByteBuffer.wrap(new byte[]{']'});
<del>
<del>
<ide> private final PrettyPrinter ssePrettyPrinter;
<ide>
<ide>
<ide> public Flux<DataBuffer> encode(Publisher<?> inputStream, DataBufferFactory buffe
<ide> if (inputStream instanceof Mono) {
<ide> return Flux.from(inputStream).map(value -> encodeValue(value, bufferFactory, elementType, hints));
<ide> }
<del>
<del> Mono<DataBuffer> startArray = Mono.just(bufferFactory.wrap(START_ARRAY_BUFFER));
<del> Mono<DataBuffer> endArray = Mono.just(bufferFactory.wrap(END_ARRAY_BUFFER));
<del>
<del> Flux<DataBuffer> array = Flux.from(inputStream)
<del> .concatMap(value -> {
<del> DataBuffer arraySeparator = bufferFactory.wrap(SEPARATOR_BUFFER);
<del> return Flux.just(encodeValue(value, bufferFactory, elementType, hints), arraySeparator);
<del> });
<del>
<del> return Flux.concat(startArray, array.skipLast(1), endArray);
<add> else if (APPLICATION_STREAM_JSON.isCompatibleWith(mimeType)) {
<add> return Flux.from(inputStream).map(value -> {
<add> DataBuffer buffer = encodeValue(value, bufferFactory, elementType, hints);
<add> buffer.write(new byte[]{'\n'});
<add> return buffer;
<add> });
<add> }
<add> ResolvableType listType = ResolvableType.forClassWithGenerics(List.class, elementType);
<add> return Flux.from(inputStream).collectList().map(list -> encodeValue(list, bufferFactory, listType, hints)).flux();
<ide> }
<ide>
<ide> private DataBuffer encodeValue(Object value, DataBufferFactory bufferFactory,
<ide><path>spring-web/src/test/java/org/springframework/http/codec/json/Jackson2JsonEncoderTests.java
<ide> /*
<del> * Copyright 2002-2016 the original author or authors.
<add> * Copyright 2002-2017 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import com.fasterxml.jackson.annotation.JsonTypeName;
<ide> import com.fasterxml.jackson.annotation.JsonView;
<ide> import org.junit.Test;
<add>import static org.springframework.http.MediaType.APPLICATION_STREAM_JSON;
<ide> import reactor.core.publisher.Flux;
<ide> import reactor.core.publisher.Mono;
<ide> import reactor.test.StepVerifier;
<ide> public void encode() throws Exception {
<ide> Flux<DataBuffer> output = this.encoder.encode(source, this.bufferFactory, type, null, Collections.emptyMap());
<ide>
<ide> StepVerifier.create(output)
<del> .consumeNextWith(stringConsumer("["))
<del> .consumeNextWith(stringConsumer("{\"foo\":\"foo\",\"bar\":\"bar\"}"))
<del> .consumeNextWith(stringConsumer(","))
<del> .consumeNextWith(stringConsumer("{\"foo\":\"foofoo\",\"bar\":\"barbar\"}"))
<del> .consumeNextWith(stringConsumer(","))
<del> .consumeNextWith(stringConsumer("{\"foo\":\"foofoofoo\",\"bar\":\"barbarbar\"}"))
<del> .consumeNextWith(stringConsumer("]"))
<add> .consumeNextWith(stringConsumer("[{\"foo\":\"foo\",\"bar\":\"bar\"},{\"foo\":\"foofoo\",\"bar\":\"barbar\"},{\"foo\":\"foofoofoo\",\"bar\":\"barbarbar\"}]"))
<ide> .expectComplete()
<ide> .verify();
<ide> }
<ide> public void encodeWithType() throws Exception {
<ide> Flux<DataBuffer> output = this.encoder.encode(source, this.bufferFactory, type, null, Collections.emptyMap());
<ide>
<ide> StepVerifier.create(output)
<del> .consumeNextWith(stringConsumer("["))
<del> .consumeNextWith(stringConsumer("{\"type\":\"foo\"}"))
<del> .consumeNextWith(stringConsumer(","))
<del> .consumeNextWith(stringConsumer("{\"type\":\"bar\"}"))
<del> .consumeNextWith(stringConsumer("]"))
<add> .consumeNextWith(stringConsumer("[{\"type\":\"foo\"},{\"type\":\"bar\"}]"))
<add> .expectComplete()
<add> .verify();
<add> }
<add>
<add> @Test
<add> public void encodeAsStream() throws Exception {
<add> Flux<Pojo> source = Flux.just(
<add> new Pojo("foo", "bar"),
<add> new Pojo("foofoo", "barbar"),
<add> new Pojo("foofoofoo", "barbarbar")
<add> );
<add> ResolvableType type = ResolvableType.forClass(Pojo.class);
<add> Flux<DataBuffer> output = this.encoder.encode(source, this.bufferFactory, type, APPLICATION_STREAM_JSON, Collections.emptyMap());
<add>
<add> StepVerifier.create(output)
<add> .consumeNextWith(stringConsumer("{\"foo\":\"foo\",\"bar\":\"bar\"}\n"))
<add> .consumeNextWith(stringConsumer("{\"foo\":\"foofoo\",\"bar\":\"barbar\"}\n"))
<add> .consumeNextWith(stringConsumer("{\"foo\":\"foofoofoo\",\"bar\":\"barbarbar\"}\n"))
<ide> .expectComplete()
<ide> .verify();
<ide> }
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/method/annotation/JsonStreamingIntegrationTests.java
<add>/*
<add> * Copyright 2002-2017 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.reactive.result.method.annotation;
<add>
<add>import java.time.Duration;
<add>
<add>import org.junit.Before;
<add>import org.junit.Test;
<add>import static org.springframework.http.MediaType.APPLICATION_STREAM_JSON;
<add>import static org.springframework.http.MediaType.APPLICATION_STREAM_JSON_VALUE;
<add>import reactor.core.publisher.Flux;
<add>import reactor.test.StepVerifier;
<add>
<add>import org.springframework.context.annotation.AnnotationConfigApplicationContext;
<add>import org.springframework.context.annotation.Bean;
<add>import org.springframework.context.annotation.Configuration;
<add>import org.springframework.http.server.reactive.AbstractHttpHandlerIntegrationTests;
<add>import org.springframework.http.server.reactive.HttpHandler;
<add>import org.springframework.web.bind.annotation.RequestMapping;
<add>import org.springframework.web.bind.annotation.RestController;
<add>import org.springframework.web.reactive.DispatcherHandler;
<add>import org.springframework.web.reactive.config.EnableWebFlux;
<add>import org.springframework.web.reactive.function.client.WebClient;
<add>import org.springframework.web.server.adapter.WebHttpHandlerBuilder;
<add>
<add>/**
<add> * @author Sebastien Deleuze
<add> */
<add>public class JsonStreamingIntegrationTests extends AbstractHttpHandlerIntegrationTests {
<add>
<add> private AnnotationConfigApplicationContext wac;
<add>
<add> private WebClient webClient;
<add>
<add>
<add> @Override
<add> @Before
<add> public void setup() throws Exception {
<add> super.setup();
<add> this.webClient = WebClient.create("http://localhost:" + this.port);
<add> }
<add>
<add>
<add> @Override
<add> protected HttpHandler createHttpHandler() {
<add> this.wac = new AnnotationConfigApplicationContext();
<add> this.wac.register(TestConfiguration.class);
<add> this.wac.refresh();
<add>
<add> return WebHttpHandlerBuilder.webHandler(new DispatcherHandler(this.wac)).build();
<add> }
<add>
<add> @Test
<add> public void jsonStreaming() throws Exception {
<add> Flux<Person> result = this.webClient.get()
<add> .uri("/stream")
<add> .accept(APPLICATION_STREAM_JSON)
<add> .exchange()
<add> .flatMap(response -> response.bodyToFlux(Person.class));
<add>
<add> StepVerifier.create(result)
<add> .expectNext(new Person("foo 0"))
<add> .expectNext(new Person("foo 1"))
<add> .verifyComplete();
<add> }
<add>
<add> @RestController
<add> @SuppressWarnings("unused")
<add> static class JsonStreamingController {
<add>
<add> @RequestMapping(value = "/stream", produces = APPLICATION_STREAM_JSON_VALUE)
<add> Flux<Person> person() {
<add> return Flux.interval(Duration.ofMillis(100)).map(l -> new Person("foo " + l)).take(2);
<add> }
<add>
<add> }
<add>
<add> @Configuration
<add> @EnableWebFlux
<add> @SuppressWarnings("unused")
<add> static class TestConfiguration {
<add>
<add> @Bean
<add> public JsonStreamingController jsonStreamingController() {
<add> return new JsonStreamingController();
<add> }
<add> }
<add>
<add> private static class Person {
<add>
<add> private String name;
<add>
<add> @SuppressWarnings("unused")
<add> public Person() {
<add> }
<add>
<add> public Person(String name) {
<add> this.name = name;
<add> }
<add>
<add> public String getName() {
<add> return name;
<add> }
<add>
<add> public void setName(String name) {
<add> this.name = name;
<add> }
<add>
<add> @Override
<add> public boolean equals(Object o) {
<add> if (this == o) {
<add> return true;
<add> }
<add> if (o == null || getClass() != o.getClass()) {
<add> return false;
<add> }
<add> Person person = (Person) o;
<add> return !(this.name != null ? !this.name.equals(person.name) : person.name != null);
<add> }
<add>
<add> @Override
<add> public int hashCode() {
<add> return this.name != null ? this.name.hashCode() : 0;
<add> }
<add>
<add> @Override
<add> public String toString() {
<add> return "Person{" +
<add> "name='" + name + '\'' +
<add> '}';
<add> }
<add> }
<add>
<add>} | 4 |
Python | Python | add regression test for ticket #572 | c189d6d29970587814e65b6f29ef5dbbe7db9564 | <ide><path>numpy/core/tests/test_regression.py
<ide> def check_poly_eq(self, level=rlevel):
<ide> assert x != y
<ide> assert x == x
<ide>
<add> def check_mem_insert(self, level=rlevel):
<add> """Ticket #572"""
<add> N.lib.place(1,1,1)
<ide>
<ide> if __name__ == "__main__":
<ide> NumpyTest().run() | 1 |
Python | Python | fix benchmark for longformer | 89a78be51f1c2afd263da66ec76d3297432f0c2a | <ide><path>src/transformers/benchmark/benchmark.py
<ide> def _prepare_inference_func(self, model_name: str, batch_size: int, sequence_len
<ide> if self.args.torchscript:
<ide> config.torchscript = True
<ide>
<del> has_model_class_in_config = hasattr(config, "architectures") and len(config.architectures) > 0
<add> has_model_class_in_config = (
<add> hasattr(config, "architectures")
<add> and isinstance(config.architectures, list)
<add> and len(config.architectures) > 0
<add> )
<ide> if not self.args.only_pretrain_model and has_model_class_in_config:
<ide> try:
<ide> model_class = config.architectures[0]
<ide> def encoder_forward():
<ide> def _prepare_train_func(self, model_name: str, batch_size: int, sequence_length: int) -> Callable[[], None]:
<ide> config = self.config_dict[model_name]
<ide>
<del> has_model_class_in_config = hasattr(config, "architectures") and len(config.architectures) > 0
<add> has_model_class_in_config = (
<add> hasattr(config, "architectures")
<add> and isinstance(config.architectures, list)
<add> and len(config.architectures) > 0
<add> )
<ide> if not self.args.only_pretrain_model and has_model_class_in_config:
<ide> try:
<ide> model_class = config.architectures[0]
<ide><path>src/transformers/benchmark/benchmark_tf.py
<ide> def _prepare_inference_func(self, model_name: str, batch_size: int, sequence_len
<ide> if self.args.fp16:
<ide> raise NotImplementedError("Mixed precision is currently not supported.")
<ide>
<del> has_model_class_in_config = hasattr(config, "architectures") and len(config.architectures) > 0
<add> has_model_class_in_config = (
<add> hasattr(config, "architectures")
<add> and isinstance(config.architectures, list)
<add> and len(config.architectures) > 0
<add> )
<ide> if not self.args.only_pretrain_model and has_model_class_in_config:
<ide> try:
<ide> model_class = "TF" + config.architectures[0] # prepend 'TF' for tensorflow model
<ide> def _prepare_train_func(self, model_name: str, batch_size: int, sequence_length:
<ide> if self.args.fp16:
<ide> raise NotImplementedError("Mixed precision is currently not supported.")
<ide>
<del> has_model_class_in_config = hasattr(config, "architectures") and len(config.architectures) > 0
<add> has_model_class_in_config = (
<add> hasattr(config, "architectures")
<add> and isinstance(config.architectures, list)
<add> and len(config.architectures) > 0
<add> )
<ide> if not self.args.only_pretrain_model and has_model_class_in_config:
<ide> try:
<ide> model_class = "TF" + config.architectures[0] # prepend 'TF' for tensorflow model
<ide><path>tests/test_benchmark.py
<ide> def test_inference_fp16(self):
<ide> self.check_results_dict_not_empty(results.time_inference_result)
<ide> self.check_results_dict_not_empty(results.memory_inference_result)
<ide>
<add> def test_inference_no_model_no_architecuters(self):
<add> MODEL_ID = "sshleifer/tiny-gpt2"
<add> config = AutoConfig.from_pretrained(MODEL_ID)
<add> # set architectures equal to `None`
<add> config.architectures = None
<add> benchmark_args = PyTorchBenchmarkArguments(
<add> models=[MODEL_ID],
<add> training=True,
<add> no_inference=False,
<add> sequence_lengths=[8],
<add> batch_sizes=[1],
<add> no_multi_process=True,
<add> )
<add> benchmark = PyTorchBenchmark(benchmark_args, configs=[config])
<add> results = benchmark.run()
<add> self.check_results_dict_not_empty(results.time_inference_result)
<add> self.check_results_dict_not_empty(results.memory_inference_result)
<add>
<ide> def test_train_no_configs(self):
<ide> MODEL_ID = "sshleifer/tiny-gpt2"
<ide> benchmark_args = PyTorchBenchmarkArguments( | 3 |
Mixed | Python | make markdown 2.6 the minimum compatible version | 564faddb0fe6f640f38a1abb7387a86d6fe13243 | <ide><path>docs/index.md
<ide> each Python and Django series.
<ide> The following packages are optional:
<ide>
<ide> * [coreapi][coreapi] (1.32.0+) - Schema generation support.
<del>* [Markdown][markdown] (2.1.0+) - Markdown support for the browsable API.
<add>* [Markdown][markdown] (2.6.0+) - Markdown support for the browsable API.
<ide> * [django-filter][django-filter] (1.0.1+) - Filtering support.
<ide> * [django-crispy-forms][django-crispy-forms] - Improved HTML display for filtering.
<ide> * [django-guardian][django-guardian] (1.1.1+) - Object level permissions support.
<ide><path>rest_framework/compat.py
<ide> def is_guardian_installed():
<ide> View.http_method_names = View.http_method_names + ['patch']
<ide>
<ide>
<del># Markdown is optional
<add># Markdown is optional (version 2.6+ required)
<ide> try:
<ide> import markdown
<ide>
<del> if markdown.version <= '2.2':
<del> HEADERID_EXT_PATH = 'headerid'
<del> LEVEL_PARAM = 'level'
<del> elif markdown.version < '2.6':
<del> HEADERID_EXT_PATH = 'markdown.extensions.headerid'
<del> LEVEL_PARAM = 'level'
<del> else:
<del> HEADERID_EXT_PATH = 'markdown.extensions.toc'
<del> LEVEL_PARAM = 'baselevel'
<add> HEADERID_EXT_PATH = 'markdown.extensions.toc'
<add> LEVEL_PARAM = 'baselevel'
<ide>
<ide> def apply_markdown(text):
<ide> """ | 2 |
Java | Java | add missing word to javadoc | 2be2aa7b965e217c3e8a27f5a971323d19457374 | <ide><path>spring-core/src/main/java/org/springframework/core/ReactiveTypeDescriptor.java
<ide> import org.springframework.util.Assert;
<ide>
<ide> /**
<del> * Descriptor for a reactive type with information its stream semantics, i.e.
<add> * Descriptor for a reactive type with information about its stream semantics, i.e.
<ide> * how many values it can produce.
<ide> *
<ide> * @author Rossen Stoyanchev | 1 |
Go | Go | fix typo in api/types/container/host_config.go | a003419fca2c4ce7ed53edc8da67a22748aa4a23 | <ide><path>api/types/container/host_config.go
<ide> type HostConfig struct {
<ide> Mounts []mount.Mount `json:",omitempty"`
<ide>
<ide> // Run a custom init inside the container, if null, use the daemon's configured settings
<del> Init *bool `json:",om itempty"`
<add> Init *bool `json:",omitempty"`
<ide> }
<ide>
<ide> // Box specifies height and width dimensions. Used for sizing of a console. | 1 |
PHP | PHP | fix tests on windows | 179bb471a96c098efb1ad2a08ff6b09106fa22d3 | <ide><path>tests/TestCase/Console/ConsoleOptionParserTest.php
<ide> public function testAddOptionWithPrompt(): void
<ide> $messages = $out->messages();
<ide>
<ide> $this->assertCount(1, $messages);
<del> $expected = '<question>What is your favorite?</question>' . PHP_EOL . '> ';
<add> $expected = "<question>What is your favorite?</question>\n> ";
<ide> $this->assertEquals($expected, $messages[0]);
<ide> }
<ide>
<ide> public function testAddOptionWithPromptAndRequired(): void
<ide> $messages = $out->messages();
<ide>
<ide> $this->assertCount(1, $messages);
<del> $expected = '<question>What is your favorite?</question>' . PHP_EOL . '> ';
<add> $expected = "<question>What is your favorite?</question>\n> ";
<ide> $this->assertEquals($expected, $messages[0]);
<ide> }
<ide>
<ide> public function testAddOptionWithPromptAndOptions(): void
<ide> $messages = $out->messages();
<ide>
<ide> $this->assertCount(2, $messages);
<del> $expected = '<question>What is your favorite?</question> (red/green/blue) ' . PHP_EOL . '> ';
<add> $expected = "<question>What is your favorite?</question> (red/green/blue) \n> ";
<ide> $this->assertEquals($expected, $messages[0]);
<ide> $this->assertEquals($expected, $messages[1]);
<ide> } | 1 |
Javascript | Javascript | remove flash messages view helper | 4e21e761ad2609c101ff94444e0a03d7668e1539 | <ide><path>app.js
<ide> app.use(function(req, res, next) {
<ide> next();
<ide> });
<ide> app.use(flash());
<del>app.use(function(req, res, next) {
<del> res.locals.flash = req.flash.bind(req);
<del> next();
<del>});
<ide> app.use(less({ src: __dirname + '/public', compress: true }));
<ide> app.use(app.router);
<ide> app.use(express.static( path.join(__dirname, 'public'), { maxAge: 864000000 } )); | 1 |
Ruby | Ruby | add messages to plain assertions | 7bd70dcd6c62fb94e3298e926f69a4e1946739cc | <ide><path>activemodel/lib/active_model/lint.rb
<ide> module Tests
<ide> def test_to_key
<ide> assert model.respond_to?(:to_key), "The model should respond to to_key"
<ide> def model.persisted?() false end
<del> assert model.to_key.nil?
<add> assert model.to_key.nil?, "to_key should return nil when `persisted?` returns false"
<ide> end
<ide>
<ide> # == Responds to <tt>to_param</tt>
<ide> def test_to_param
<ide> assert model.respond_to?(:to_param), "The model should respond to to_param"
<ide> def model.to_key() [1] end
<ide> def model.persisted?() false end
<del> assert model.to_param.nil?
<add> assert model.to_param.nil?, "to_param should return nil when `persited?` returns false"
<ide> end
<ide>
<ide> # == Responds to <tt>valid?</tt> | 1 |
PHP | PHP | use statement ordering fix | 99cfc941298a4c2cfba029c8a574c083c3d214f9 | <ide><path>src/Validation/Validator.php
<ide> namespace Cake\Validation;
<ide>
<ide> use ArrayAccess;
<del>use Countable;
<del>use IteratorAggregate;
<del>
<ide> use Cake\Validation\RulesProvider;
<ide> use Cake\Validation\ValidationSet;
<add>use Countable;
<add>use IteratorAggregate;
<ide>
<ide> /**
<ide> * Validator object encapsulates all methods related to data validations for a model | 1 |
Python | Python | use time.monotonic when available | 9fe795e1375e0094c0a102628c5282bfea956420 | <ide><path>celery/backends/amqp.py
<ide> from __future__ import absolute_import
<ide>
<ide> import socket
<del>import time
<ide>
<ide> from collections import deque
<ide> from operator import itemgetter
<ide>
<ide> from celery import states
<ide> from celery.exceptions import TimeoutError
<del>from celery.five import range
<add>from celery.five import range, monotonic
<ide> from celery.utils.functional import dictfilter
<ide> from celery.utils.log import get_logger
<ide> from celery.utils.timeutils import maybe_s_to_ms
<ide> def get_task_meta(self, task_id, backlog_limit=1000):
<ide> poll = get_task_meta # XXX compat
<ide>
<ide> def drain_events(self, connection, consumer,
<del> timeout=None, now=time.time, wait=None):
<add> timeout=None, now=monotonic, wait=None):
<ide> wait = wait or connection.drain_events
<ide> results = {}
<ide>
<ide> def _many_bindings(self, ids):
<ide> return [self._create_binding(task_id) for task_id in ids]
<ide>
<ide> def get_many(self, task_ids, timeout=None,
<del> now=time.time, getfields=itemgetter('status', 'task_id'),
<add> now=monotonic, getfields=itemgetter('status', 'task_id'),
<ide> READY_STATES=states.READY_STATES, **kwargs):
<ide> with self.app.pool.acquire_channel(block=True) as (conn, channel):
<ide> ids = set(task_ids)
<ide><path>celery/backends/cassandra.py
<ide>
<ide> from celery import states
<ide> from celery.exceptions import ImproperlyConfigured
<add>from celery.five import monotonic
<ide> from celery.utils.log import get_logger
<ide> from celery.utils.timeutils import maybe_timedelta, timedelta_seconds
<ide>
<ide> def __init__(self, servers=None, keyspace=None, column_family=None,
<ide> self._column_family = None
<ide>
<ide> def _retry_on_error(self, fun, *args, **kwargs):
<del> ts = time.time() + self._retry_timeout
<add> ts = monotonic() + self._retry_timeout
<ide> while 1:
<ide> try:
<ide> return fun(*args, **kwargs)
<ide> def _retry_on_error(self, fun, *args, **kwargs):
<ide> socket.error,
<ide> socket.timeout,
<ide> Thrift.TException) as exc:
<del> if time.time() > ts:
<add> if monotonic() > ts:
<ide> raise
<ide> logger.warning('Cassandra error: %r. Retrying...', exc)
<ide> time.sleep(self._retry_wait)
<ide><path>celery/beat.py
<ide> from . import __version__
<ide> from . import platforms
<ide> from . import signals
<del>from .five import items, reraise, values
<add>from .five import items, reraise, values, monotonic
<ide> from .schedules import maybe_schedule, crontab
<ide> from .utils.imports import instantiate
<ide> from .utils.timeutils import humanize_seconds
<ide> def tick(self):
<ide>
<ide> def should_sync(self):
<ide> return (not self._last_sync or
<del> (time.time() - self._last_sync) > self.sync_every)
<add> (monotonic() - self._last_sync) > self.sync_every)
<ide>
<ide> def reserve(self, entry):
<ide> new_entry = self.schedule[entry.name] = next(entry)
<ide> def _do_sync(self):
<ide> debug('beat: Synchronizing schedule...')
<ide> self.sync()
<ide> finally:
<del> self._last_sync = time.time()
<add> self._last_sync = monotonic()
<ide>
<ide> def sync(self):
<ide> pass
<ide><path>celery/concurrency/base.py
<ide>
<ide> import logging
<ide> import os
<del>import time
<ide>
<ide> from kombu.utils.encoding import safe_repr
<ide>
<add>from celery.five import monotonic
<ide> from celery.utils import timer2
<ide> from celery.utils.log import get_logger
<ide>
<ide> def apply_target(target, args=(), kwargs={}, callback=None,
<ide> accept_callback=None, pid=None, **_):
<ide> if accept_callback:
<del> accept_callback(pid or os.getpid(), time.time())
<add> accept_callback(pid or os.getpid(), monotonic())
<ide> callback(target(*args, **kwargs))
<ide>
<ide>
<ide><path>celery/concurrency/eventlet.py
<ide> warnings.warn(RuntimeWarning(W_RACE % side))
<ide>
<ide>
<del>from time import time
<del>
<ide> from celery import signals
<add>from celery.five import monotonic
<ide> from celery.utils import timer2
<ide>
<ide> from . import base
<ide> def __init__(self, *args, **kwargs):
<ide> self._queue = set()
<ide>
<ide> def _enter(self, eta, priority, entry):
<del> secs = max(eta - time(), 0)
<add> secs = max(eta - monotonic(), 0)
<ide> g = self._spawn_after(secs, entry)
<ide> self._queue.add(g)
<ide> g.link(self._entry_exit, entry)
<ide><path>celery/concurrency/gevent.py
<ide> except ImportError: # pragma: no cover
<ide> Timeout = None # noqa
<ide>
<del>from time import time
<del>
<add>from celery.five import monotonic
<ide> from celery.utils import timer2
<ide>
<ide> from .base import apply_target, BasePool
<ide> class _Greenlet(Greenlet):
<ide> self._queue = set()
<ide>
<ide> def _enter(self, eta, priority, entry):
<del> secs = max(eta - time(), 0)
<add> secs = max(eta - monotonic(), 0)
<ide> g = self._Greenlet.spawn_later(secs, entry)
<ide> self._queue.add(g)
<ide> g.link(self._entry_exit)
<ide><path>celery/concurrency/processes.py
<ide>
<ide> from collections import deque, namedtuple
<ide> from pickle import HIGHEST_PROTOCOL
<del>from time import sleep, time
<add>from time import sleep
<ide> from weakref import WeakValueDictionary, ref
<ide>
<ide> from amqp.utils import promise
<ide> from celery._state import set_default_app
<ide> from celery.app import trace
<ide> from celery.concurrency.base import BasePool
<del>from celery.five import Counter, items, values
<add>from celery.five import Counter, items, values, monotonic
<ide> from celery.utils.log import get_logger
<ide>
<ide> __all__ = ['TaskPool']
<ide> def register_with_event_loop(self, hub):
<ide>
<ide> hub.on_tick.add(self.on_poll_start)
<ide>
<del> def _create_timelimit_handlers(self, hub, now=time):
<add> def _create_timelimit_handlers(self, hub, now=monotonic):
<ide> """For async pool this sets up the handlers used
<ide> to implement time limits."""
<ide> call_later = hub.call_later
<ide> def on_timeout_cancel(R):
<ide> _discard_tref(R._job)
<ide> self.on_timeout_cancel = on_timeout_cancel
<ide>
<del> def _on_soft_timeout(self, job, soft, hard, hub, now=time):
<add> def _on_soft_timeout(self, job, soft, hard, hub, now=monotonic):
<ide> # only used by async pool.
<ide> if hard:
<ide> self._tref_for_id[job] = hub.call_at(
<ide><path>celery/datastructures.py
<ide> from __future__ import absolute_import, print_function, unicode_literals
<ide>
<ide> import sys
<del>import time
<ide>
<ide> from collections import defaultdict, Mapping, MutableMapping, MutableSet
<ide> from heapq import heapify, heappush, heappop
<ide> from kombu.utils.encoding import safe_str
<ide> from kombu.utils.limits import TokenBucket # noqa
<ide>
<del>from celery.five import items
<add>from celery.five import items, monotonic
<ide> from celery.utils.functional import LRUCache, first, uniq # noqa
<ide>
<ide> DOT_HEAD = """
<ide> def __init__(self, maxlen=None, expires=None, data=None, heap=None):
<ide> self.__len__ = self._data.__len__
<ide> self.__contains__ = self._data.__contains__
<ide>
<del> def add(self, value, now=time.time):
<add> def add(self, value, now=monotonic):
<ide> """Add a new member."""
<ide> # offset is there to modify the length of the list,
<ide> # this way we can expire an item before inserting the value,
<ide> def discard(self, value):
<ide> self._data.pop(value, None)
<ide> pop_value = discard # XXX compat
<ide>
<del> def purge(self, limit=None, offset=0, now=time.time):
<add> def purge(self, limit=None, offset=0, now=monotonic):
<ide> """Purge expired items."""
<ide> H, maxlen = self._heap, self.maxlen
<ide> if not maxlen:
<ide><path>celery/events/__init__.py
<ide> from kombu.utils import cached_property
<ide>
<ide> from celery.app import app_or_default
<add>from celery.five import monotonic
<ide> from celery.utils import uuid
<ide> from celery.utils.functional import dictfilter
<ide> from celery.utils.timeutils import adjust_timestamp, utcoffset, maybe_s_to_ms
<ide> def get_exchange(conn):
<ide> return ex
<ide>
<ide>
<del>def Event(type, _fields=None, __dict__=dict, __now__=time.time, **fields):
<add>def Event(type, _fields=None, __dict__=dict, __now__=monotonic, **fields):
<ide> """Create an event.
<ide>
<ide> An event is a dictionary, the only required field is ``type``.
<ide> def wakeup_workers(self, channel=None):
<ide> channel=channel)
<ide>
<ide> def event_from_message(self, body, localize=True,
<del> now=time.time, tzfields=_TZGETTER,
<add> now=monotonic, tzfields=_TZGETTER,
<ide> adjust_timestamp=adjust_timestamp):
<ide> type = body.get('type', '').lower()
<ide> clock = body.get('clock')
<ide><path>celery/events/cursesmon.py
<ide> import curses
<ide> import sys
<ide> import threading
<del>import time
<ide>
<ide> from datetime import datetime
<ide> from itertools import count
<ide> from celery import VERSION_BANNER
<ide> from celery import states
<ide> from celery.app import app_or_default
<del>from celery.five import items, values
<add>from celery.five import items, values, monotonic
<ide> from celery.utils.text import abbr, abbrtask
<ide>
<ide> __all__ = ['CursesMonitor', 'evtop']
<ide> def display_task_row(self, lineno, task):
<ide> if task.uuid == self.selected_task:
<ide> attr = curses.A_STANDOUT
<ide> timestamp = datetime.utcfromtimestamp(
<del> task.timestamp or time.time(),
<add> task.timestamp or monotonic(),
<ide> )
<ide> timef = timestamp.strftime('%H:%M:%S')
<ide> hostname = task.worker.hostname if task.worker else '*NONE*'
<ide><path>celery/events/state.py
<ide> from datetime import datetime
<ide> from heapq import heappush, heappop
<ide> from itertools import islice
<del>from time import time
<ide>
<ide> from kombu.clocks import timetuple
<ide> from kombu.utils import kwdict
<ide>
<ide> from celery import states
<ide> from celery.datastructures import AttributeDict
<del>from celery.five import items, values
<add>from celery.five import items, values, monotonic
<ide> from celery.utils.functional import LRUCache
<ide> from celery.utils.log import get_logger
<ide>
<ide> def heartbeat_expires(self):
<ide>
<ide> @property
<ide> def alive(self):
<del> return bool(self.heartbeats and time() < self.heartbeat_expires)
<add> return bool(self.heartbeats and monotonic() < self.heartbeat_expires)
<ide>
<ide> @property
<ide> def id(self):
<ide><path>celery/five.py
<ide> 'nextfun', 'reraise', 'WhateverIO', 'with_metaclass',
<ide> 'OrderedDict', 'THREAD_TIMEOUT_MAX', 'format_d',
<ide> 'class_property', 'reclassmethod', 'create_module',
<del> 'recreate_module']
<add> 'recreate_module', 'monotonic']
<ide>
<ide> try:
<ide> from collections import Counter
<ide> def Counter(): # noqa
<ide> except ImportError: # pragma: no cover
<ide> from collections import UserDict # noqa
<ide>
<add>try:
<add> from time import monotonic
<add>except ImportError:
<add> from time import time as monotonic # noqa
<ide>
<ide> if PY3: # pragma: no cover
<ide> import builtins
<ide><path>celery/result.py
<ide> from .app import app_or_default
<ide> from .datastructures import DependencyGraph, GraphFormatter
<ide> from .exceptions import IncompleteStream, TimeoutError
<del>from .five import items, range, string_t
<add>from .five import items, range, string_t, monotonic
<ide>
<ide> __all__ = ['ResultBase', 'AsyncResult', 'ResultSet', 'GroupResult',
<ide> 'EagerResult', 'from_serializable']
<ide> def join(self, timeout=None, propagate=True, interval=0.5):
<ide> seconds.
<ide>
<ide> """
<del> time_start = time.time()
<add> time_start = monotonic()
<ide> remaining = None
<ide>
<ide> results = []
<ide> for result in self.results:
<ide> remaining = None
<ide> if timeout:
<del> remaining = timeout - (time.time() - time_start)
<add> remaining = timeout - (monotonic() - time_start)
<ide> if remaining <= 0.0:
<ide> raise TimeoutError('join operation timed out')
<ide> results.append(result.get(timeout=remaining,
<ide><path>celery/worker/autoscale.py
<ide> import os
<ide> import threading
<ide>
<del>from time import sleep, time
<add>from time import sleep
<ide>
<ide> from kombu.async.semaphore import DummyLock
<ide>
<ide> from celery import bootsteps
<add>from celery.five import monotonic
<ide> from celery.utils.log import get_logger
<ide> from celery.utils.threads import bgThread
<ide>
<ide> def force_scale_down(self, n):
<ide> self._shrink(min(n, self.processes))
<ide>
<ide> def scale_up(self, n):
<del> self._last_action = time()
<add> self._last_action = monotonic()
<ide> return self._grow(n)
<ide>
<ide> def scale_down(self, n):
<ide> if n and self._last_action and (
<del> time() - self._last_action > self.keepalive):
<del> self._last_action = time()
<add> monotonic() - self._last_action > self.keepalive):
<add> self._last_action = monotonic()
<ide> return self._shrink(n)
<ide>
<ide> def _grow(self, n):
<ide><path>celery/worker/job.py
<ide> from __future__ import absolute_import
<ide>
<ide> import logging
<del>import time
<ide> import socket
<ide> import sys
<ide>
<ide> SoftTimeLimitExceeded, TimeLimitExceeded,
<ide> WorkerLostError, Terminated, RetryTaskError,
<ide> )
<del>from celery.five import items
<add>from celery.five import items, monotonic
<ide> from celery.platforms import signals as _signals
<ide> from celery.utils import fun_takes_kwargs
<ide> from celery.utils.functional import noop
<ide> def on_timeout(self, soft, timeout):
<ide> if self.store_errors:
<ide> self.task.backend.mark_as_failure(self.id, exc)
<ide>
<del> def on_success(self, ret_value, now=None, nowfun=time.time):
<add> def on_success(self, ret_value, now=None, nowfun=monotonic):
<ide> """Handler called if the task was successfully processed."""
<ide> if isinstance(ret_value, ExceptionInfo):
<ide> if isinstance(ret_value.exception, (
<ide><path>celery/worker/state.py
<ide> def task_ready(request):
<ide> if C_BENCH: # pragma: no cover
<ide> import atexit
<ide>
<del> from time import time
<ide> from billiard import current_process
<add> from celery.five import monotonic
<ide> from celery.utils.debug import memdump, sample_mem
<ide>
<ide> all_count = 0
<ide> def task_reserved(request): # noqa
<ide> global bench_first
<ide> now = None
<ide> if bench_start is None:
<del> bench_start = now = time()
<add> bench_start = now = monotonic()
<ide> if bench_first is None:
<ide> bench_first = now
<ide>
<ide> def task_ready(request): # noqa
<ide> global bench_last
<ide> all_count += 1
<ide> if not all_count % bench_every:
<del> now = time()
<add> now = monotonic()
<ide> diff = now - bench_start
<ide> print('- Time spent processing {0} tasks (since first '
<ide> 'task received): ~{1:.4f}s\n'.format(bench_every, diff)) | 16 |
Text | Text | remove comma in readme.md | aa64ff6a37c4f1018713f1b020da810d2ed7909f | <ide><path>README.md
<ide> # Node.js
<ide>
<del>Node.js is an open-source, cross-platform, JavaScript runtime environment.
<add>Node.js is an open-source, cross-platform JavaScript runtime environment.
<ide>
<ide> For information on using Node.js, see the [Node.js website][].
<ide> | 1 |
Ruby | Ruby | remove flaky test | 924af100b73d24dffb2199fa46ae118e8b3bbb29 | <ide><path>Library/Homebrew/test/cask/pkg_spec.rb
<ide> let(:empty_response) { double(stdout: "", plist: { "volume" => "/", "install-location" => "", "paths" => {} }) }
<ide> let(:pkg) { described_class.new("my.fake.pkg", fake_system_command) }
<ide>
<del> it "removes files and dirs referenced by the pkg" do
<del> some_files = Array.new(3) { Pathname.new(Tempfile.new("plain_file").path) }
<del> allow(pkg).to receive(:pkgutil_bom_files).and_return(some_files)
<del>
<del> some_specials = Array.new(3) { Pathname.new(Tempfile.new("special_file").path) }
<del> allow(pkg).to receive(:pkgutil_bom_specials).and_return(some_specials)
<del>
<del> some_dirs = Array.new(3) { mktmpdir }
<del> allow(pkg).to receive(:pkgutil_bom_dirs).and_return(some_dirs)
<del>
<del> root_dir = Pathname.new(mktmpdir)
<del> allow(pkg).to receive(:root).and_return(root_dir)
<del>
<del> allow(pkg).to receive(:forget)
<del>
<del> pkg.uninstall
<del>
<del> some_files.each do |file|
<del> expect(file).not_to exist
<del> end
<del>
<del> some_dirs.each do |dir|
<del> expect(dir).not_to exist
<del> end
<del>
<del> expect(root_dir).not_to exist
<del> end
<del>
<ide> context "pkgutil" do
<ide> it "forgets the pkg" do
<ide> allow(fake_system_command).to receive(:run!).with( | 1 |
Text | Text | improve section on disabling pipes [ci skip] | a5e3d2f3180d45b6e35a0a5aadf42e2f2c6acced | <ide><path>website/docs/usage/processing-pipelines.md
<ide> require them in the pipeline settings in your model's `meta.json`.
<ide> ### Disabling and modifying pipeline components {#disabling}
<ide>
<ide> If you don't need a particular component of the pipeline – for example, the
<del>tagger or the parser, you can disable loading it. This can sometimes make a big
<del>difference and improve loading speed. Disabled component names can be provided
<del>to [`spacy.load`](/api/top-level#spacy.load),
<add>tagger or the parser, you can **disable loading** it. This can sometimes make a
<add>big difference and improve loading speed. Disabled component names can be
<add>provided to [`spacy.load`](/api/top-level#spacy.load),
<ide> [`Language.from_disk`](/api/language#from_disk) or the `nlp` object itself as a
<ide> list:
<ide>
<ide> ```python
<del>nlp = spacy.load("en", disable=["parser", "tagger"])
<add>### Disable loading
<add>nlp = spacy.load("en_core_web_sm", disable=["tagger", "parser"])
<ide> nlp = English().from_disk("/model", disable=["ner"])
<ide> ```
<ide>
<del>You can also use the [`remove_pipe`](/api/language#remove_pipe) method to remove
<del>pipeline components from an existing pipeline, the
<add>In some cases, you do want to load all pipeline components and their weights,
<add>because you need them at different points in your application. However, if you
<add>only need a `Doc` object with named entities, there's no need to run all
<add>pipeline components on it – that can potentially make processing much slower.
<add>Instead, you can use the `disable` keyword argument on
<add>[`nlp.pipe`](/api/language#pipe) to temporarily disable the components **during
<add>processing**:
<add>
<add>```python
<add>### Disable for processing
<add>for doc in nlp.pipe(texts, disable=["tagger", "parser"]):
<add> # Do something with the doc here
<add>```
<add>
<add>If you need to **execute more code** with components disabled – e.g. to reset
<add>the weights or update only some components during training – you can use the
<add>[`nlp.disable_pipes`](/api/language#disable_pipes) contextmanager. At the end of
<add>the `with` block, the disabled pipeline components will be restored
<add>automatically. Alternatively, `disable_pipes` returns an object that lets you
<add>call its `restore()` method to restore the disabled components when needed. This
<add>can be useful if you want to prevent unnecessary code indentation of large
<add>blocks.
<add>
<add>```python
<add>### Disable for block
<add># 1. Use as a contextmanager
<add>with nlp.disable_pipes("tagger", "parser"):
<add> doc = nlp(u"I won't be tagged and parsed")
<add>doc = nlp(u"I will be tagged and parsed")
<add>
<add># 2. Restore manually
<add>disabled = nlp.disable_pipes("ner")
<add>doc = nlp(u"I won't have named entities")
<add>disabled.restore()
<add>```
<add>
<add>Finally, you can also use the [`remove_pipe`](/api/language#remove_pipe) method
<add>to remove pipeline components from an existing pipeline, the
<ide> [`rename_pipe`](/api/language#rename_pipe) method to rename them, or the
<ide> [`replace_pipe`](/api/language#replace_pipe) method to replace them with a
<ide> custom component entirely (more details on this in the section on | 1 |
Javascript | Javascript | cache the lineheight regex | 78e78256a5e2630b2bba644a8d48ec1674220179 | <ide><path>src/helpers/helpers.options.js
<ide> import defaults from '../core/core.defaults';
<ide> import {isArray, isObject, valueOrDefault} from './helpers.core';
<ide> import {toFontString} from './helpers.canvas';
<ide>
<add>const LINE_HEIGHT = new RegExp(/^(normal|(\d+(?:\.\d+)?)(px|em|%)?)$/);
<add>
<ide> /**
<ide> * @alias Chart.helpers.options
<ide> * @namespace
<ide> import {toFontString} from './helpers.canvas';
<ide> * @since 2.7.0
<ide> */
<ide> export function toLineHeight(value, size) {
<del> const matches = ('' + value).match(/^(normal|(\d+(?:\.\d+)?)(px|em|%)?)$/);
<add> const matches = ('' + value).match(LINE_HEIGHT);
<ide> if (!matches || matches[1] === 'normal') {
<ide> return size * 1.2;
<ide> } | 1 |
Go | Go | pass command arguments without extra quoting | 19645521a92869be7210fb663ed5e0e82e329ca5 | <ide><path>daemon/exec_windows.go
<ide> import (
<ide>
<ide> func execSetPlatformOpt(c *container.Container, ec *exec.Config, p *libcontainerd.Process) error {
<ide> // Process arguments need to be escaped before sending to OCI.
<del> p.Args = escapeArgs(p.Args)
<del> p.User.Username = ec.User
<add> if c.Platform == "windows" {
<add> p.Args = escapeArgs(p.Args)
<add> p.User.Username = ec.User
<add> }
<ide> return nil
<ide> }
<ide><path>daemon/oci_windows.go
<ide> func (daemon *Daemon) createSpec(c *container.Container) (*specs.Spec, error) {
<ide>
<ide> // In s.Process
<ide> s.Process.Args = append([]string{c.Path}, c.Args...)
<del> if !c.Config.ArgsEscaped {
<add> if !c.Config.ArgsEscaped && img.OS == "windows" {
<ide> s.Process.Args = escapeArgs(s.Process.Args)
<ide> }
<ide>
<ide><path>libcontainerd/client_windows.go
<ide> func (clnt *client) AddProcess(ctx context.Context, containerID, processFriendly
<ide>
<ide> // Configure the environment for the process
<ide> createProcessParms.Environment = setupEnvironmentVariables(procToAdd.Env)
<del> createProcessParms.CommandLine = strings.Join(procToAdd.Args, " ")
<add> if container.ociSpec.Platform.OS == "windows" {
<add> createProcessParms.CommandLine = strings.Join(procToAdd.Args, " ")
<add> } else {
<add> createProcessParms.CommandArgs = procToAdd.Args
<add> }
<ide> createProcessParms.User = procToAdd.User.Username
<ide>
<ide> logrus.Debugf("libcontainerd: commandLine: %s", createProcessParms.CommandLine)
<ide><path>libcontainerd/container_windows.go
<ide> func (ctr *container) start(attachStdio StdioCallback) error {
<ide>
<ide> // Configure the environment for the process
<ide> createProcessParms.Environment = setupEnvironmentVariables(ctr.ociSpec.Process.Env)
<del> createProcessParms.CommandLine = strings.Join(ctr.ociSpec.Process.Args, " ")
<add> if ctr.ociSpec.Platform.OS == "windows" {
<add> createProcessParms.CommandLine = strings.Join(ctr.ociSpec.Process.Args, " ")
<add> } else {
<add> createProcessParms.CommandArgs = ctr.ociSpec.Process.Args
<add> }
<ide> createProcessParms.User = ctr.ociSpec.Process.User.Username
<ide>
<ide> // LCOW requires the raw OCI spec passed through HCS and onwards to GCS for the utility VM. | 4 |
Go | Go | show the history of an image | 1ad69ad415728cc462dd3e296295247449538b0b | <ide><path>commands.go
<ide> func (srv *Server) Help() string {
<ide> {"import", "Create a new filesystem image from the contents of a tarball"},
<ide> {"attach", "Attach to a running container"},
<ide> {"commit", "Create a new image from a container's changes"},
<add> {"history", "Show the history of an image"},
<ide> {"diff", "Inspect changes on a container's filesystem"},
<ide> {"images", "List images"},
<ide> {"info", "Display system-wide information"},
<ide> func (srv *Server) CmdRmi(stdin io.ReadCloser, stdout io.Writer, args ...string)
<ide> return nil
<ide> }
<ide>
<add>func (srv *Server) CmdHistory(stdin io.ReadCloser, stdout io.Writer, args ...string) error {
<add> cmd := rcli.Subcmd(stdout, "history", "[OPTIONS] IMAGE", "Show the history of an image")
<add> if cmd.Parse(args) != nil || cmd.NArg() != 1 {
<add> cmd.Usage()
<add> return nil
<add> }
<add> image, err := srv.runtime.LookupImage(cmd.Arg(0))
<add> if err != nil {
<add> return err
<add> }
<add> var child *Image
<add> return image.WalkHistory(func(img *Image) {
<add> if child == nil {
<add> fmt.Fprintf(stdout, " %s\n", img.Id)
<add> } else {
<add> fmt.Fprintf(stdout, " = %s + %s\n", img.Id, strings.Join(child.ParentCommand, " "))
<add> }
<add> child = img
<add> })
<add>}
<add>
<ide> func (srv *Server) CmdRm(stdin io.ReadCloser, stdout io.Writer, args ...string) error {
<ide> cmd := rcli.Subcmd(stdout, "rm", "[OPTIONS] CONTAINER", "Remove a container")
<ide> if err := cmd.Parse(args); err != nil { | 1 |
Javascript | Javascript | remove ie9 test fallback | c1eb8fe1ab717ac49ddd4d15d7abddc3e8d6ada6 | <ide><path>packages/ember-glimmer/tests/integration/helpers/input-test.js
<ide> moduleFor(`Helpers test: {{input type='text'}}`, class extends InputRenderingTes
<ide> this.render(`{{input ${ attrs.replace("%x", value) }}}`);
<ide> }
<ide>
<del> assertValue(expected) {
<del> let type = this.$input().attr('type');
<del>
<del> if (type !== 'range') {
<del> this.assert.ok(true, 'IE9 does not support range items');
<del> return;
<del> }
<del>
<del> super.assertValue(expected);
<del> }
<del>
<ide> ['@test value over default max but below set max is kept']() {
<ide> this.renderInput("25");
<ide> this.assertValue("25"); | 1 |
Javascript | Javascript | avoid a couple of redundant property loads | 7272ee613e1bfb4f39384f7baa72dead94ab2e9e | <ide><path>packages/ember-metal/lib/meta.js
<ide> export class Meta {
<ide> _getInherited(key) {
<ide> let pointer = this;
<ide> while (pointer !== undefined) {
<del> if (pointer[key]) {
<del> return pointer[key];
<add> let map = pointer[key];
<add> if (map) {
<add> return map;
<ide> }
<ide> pointer = pointer.parent;
<ide> }
<ide> export class Meta {
<ide> if (map) {
<ide> let value = map[subkey];
<ide> if (value) {
<del> if (value[itemkey] !== undefined) {
<del> return value[itemkey];
<add> let itemvalue = value[itemkey];
<add> if (itemvalue !== undefined) {
<add> return itemvalue;
<ide> }
<ide> }
<ide> }
<ide> export class Meta {
<ide> if (map) {
<ide> let value = map[subkey];
<ide> if (value !== undefined || subkey in map) {
<del> return map[subkey];
<add> return value;
<ide> }
<ide> }
<ide> pointer = pointer.parent;
<ide> if (isEnabled('mandatory-setter')) {
<ide> if (map) {
<ide> let value = map[subkey];
<ide> if (value !== undefined || subkey in map) {
<del> return map[subkey];
<add> return value;
<ide> }
<ide> }
<ide> pointer = pointer.parent; | 1 |
Ruby | Ruby | remove special behavior of autotools symbol deps | d22ad92a8479501b080312dbc1570476cda8172f | <ide><path>Library/Homebrew/dependency_collector.rb
<ide> def parse_class_spec(spec, tags)
<ide> end
<ide>
<ide> def autotools_dep(spec, tags)
<del> return if MacOS::Xcode.provides_autotools?
<del>
<ide> if spec == :libltdl
<ide> spec = :libtool
<ide> tags << :run
<ide><path>Library/Homebrew/test/test_dependency_collector.rb
<ide> def test_x11_min_version_and_tag
<ide> assert dep.optional?
<ide> end
<ide>
<del> def test_libltdl_not_build_dep
<del> MacOS::Xcode.stubs(:provides_autotools?).returns(false)
<del> dep = @d.build(:libltdl)
<del> assert_equal Dependency.new("libtool"), dep
<del> assert !dep.build?
<del> end
<del>
<del> def test_autotools_dep_no_system_autotools
<del> MacOS::Xcode.stubs(:provides_autotools?).returns(false)
<del> dep = @d.build(:libtool)
<del> assert_equal Dependency.new("libtool"), dep
<del> assert dep.build?
<del> end
<del>
<del> def test_autotools_dep_system_autotools
<del> MacOS::Xcode.stubs(:provides_autotools?).returns(true)
<del> assert_nil @d.build(:libtool)
<del> end
<del>
<ide> def test_ld64_dep_pre_leopard
<ide> MacOS.stubs(:version).returns(MacOS::Version.new("10.4"))
<ide> assert_equal LD64Dependency.new, @d.build(:ld64) | 2 |
PHP | PHP | remove throws annotation | 53ae6e90430896d5d5093ca5e16ac6283148cc23 | <ide><path>src/Illuminate/Database/Concerns/ManagesTransactions.php
<ide> protected function handleTransactionException($e, $currentAttempt, $maxAttempts)
<ide> * @param int $currentAttempt
<ide> * @param int $maxAttempts
<ide> * @return void
<del> *
<del> * @throws \Exception
<ide> */
<ide> protected function handleCommitTransactionException($e, $currentAttempt, $maxAttempts)
<ide> { | 1 |
Python | Python | update keras initializer to be fully stateless | 69b41cf4282682f9443dc65528da5fe487cd104f | <ide><path>keras/callbacks_test.py
<ide> from keras.testing_infra import test_utils
<ide> from keras.utils import io_utils
<ide> from keras.utils import np_utils
<add>from keras.utils import tf_utils
<ide>
<ide> # isort: off
<ide> from tensorflow.python.platform import tf_logging as logging
<ide> def test_LearningRateScheduler(self):
<ide>
<ide> def test_ReduceLROnPlateau(self):
<ide> with self.cached_session():
<del> np.random.seed(1337)
<add> tf_utils.set_random_seed(1337)
<ide> (x_train, y_train), (x_test, y_test) = test_utils.get_test_data(
<ide> train_samples=TRAIN_SAMPLES,
<ide> test_samples=TEST_SAMPLES,
<ide> def test_ReduceLROnPlateau(self):
<ide> y_train = np_utils.to_categorical(y_train)
<ide>
<ide> def make_model():
<del> tf.compat.v1.set_random_seed(1234)
<del> np.random.seed(1337)
<add> tf_utils.set_random_seed(1337)
<ide> model = test_utils.get_small_sequential_mlp(
<ide> num_hidden=NUM_HIDDEN,
<ide> num_classes=NUM_CLASSES,
<ide><path>keras/initializers/initializers_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras initializers."""
<ide>
<del>import numpy as np
<add>import warnings
<add>
<ide> import tensorflow.compat.v2 as tf
<ide> from absl.testing import parameterized
<ide>
<ide> from keras.testing_infra import test_combinations
<ide> from keras.testing_infra import test_utils
<ide>
<add>RANDOM_INITIALIZERS = [
<add> initializers.RandomUniformV2,
<add> initializers.RandomNormalV2,
<add> initializers.OrthogonalV2,
<add> # TODO(scottzhu): Enable this after the forward compat period expires for
<add> # TruncatedNormalV2
<add> # initializers.TruncatedNormalV2,
<add> initializers.VarianceScalingV2,
<add> initializers.LecunUniformV2,
<add> initializers.LecunNormalV2,
<add> initializers.GlorotUniformV2,
<add> initializers.GlorotNormalV2,
<add> initializers.HeNormalV2,
<add> initializers.HeUniformV2,
<add>]
<add>
<ide>
<ide> def _compute_fans(shape):
<ide> """Computes the number of input and output units for a weight shape.
<ide> def _runner(
<ide> self,
<ide> init,
<ide> shape,
<del> target_mean=None,
<del> target_std=None,
<del> target_max=None,
<del> target_min=None,
<ide> ):
<ide> # The global seed is set so that we can get the same random streams
<ide> # between eager and graph mode when stateful op is used.
<ide> def test_uniform(self):
<ide> self._runner(
<ide> initializers.RandomUniformV2(minval=-1, maxval=1, seed=124),
<ide> tensor_shape,
<del> target_mean=0.0,
<del> target_max=1,
<del> target_min=-1,
<ide> )
<ide>
<ide> def test_normal(self):
<ide> def test_normal(self):
<ide> self._runner(
<ide> initializers.RandomNormalV2(mean=0, stddev=1, seed=153),
<ide> tensor_shape,
<del> target_mean=0.0,
<del> target_std=1,
<ide> )
<ide>
<ide> def test_truncated_normal(self):
<ide> def test_truncated_normal(self):
<ide> self._runner(
<ide> initializers.TruncatedNormalV2(mean=0, stddev=1, seed=126),
<ide> tensor_shape,
<del> target_mean=0.0,
<del> target_max=2,
<del> target_min=-2,
<ide> )
<ide>
<ide> def test_constant(self):
<ide> tensor_shape = (5, 6, 4)
<ide> with self.cached_session():
<del> self._runner(
<del> initializers.ConstantV2(2.0),
<del> tensor_shape,
<del> target_mean=2,
<del> target_max=2,
<del> target_min=2,
<del> )
<add> self._runner(initializers.ConstantV2(2.0), tensor_shape)
<ide>
<ide> def test_lecun_uniform(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, _ = _compute_fans(tensor_shape)
<del> std = np.sqrt(1.0 / fan_in)
<del> self._runner(
<del> initializers.LecunUniformV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.LecunUniformV2(seed=123), tensor_shape)
<ide>
<ide> def test_glorot_uniform(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, fan_out = _compute_fans(tensor_shape)
<del> std = np.sqrt(2.0 / (fan_in + fan_out))
<del> self._runner(
<del> initializers.GlorotUniformV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.GlorotUniformV2(seed=123), tensor_shape)
<ide>
<ide> def test_he_uniform(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, _ = _compute_fans(tensor_shape)
<del> std = np.sqrt(2.0 / fan_in)
<del> self._runner(
<del> initializers.HeUniformV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.HeUniformV2(seed=123), tensor_shape)
<ide>
<ide> def test_lecun_normal(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, _ = _compute_fans(tensor_shape)
<del> std = np.sqrt(1.0 / fan_in)
<del> self._runner(
<del> initializers.LecunNormalV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.LecunNormalV2(seed=123), tensor_shape)
<ide>
<ide> def test_glorot_normal(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, fan_out = _compute_fans(tensor_shape)
<del> std = np.sqrt(2.0 / (fan_in + fan_out))
<del> self._runner(
<del> initializers.GlorotNormalV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.GlorotNormalV2(seed=123), tensor_shape)
<ide>
<ide> def test_he_normal(self):
<ide> tensor_shape = (5, 6, 4, 2)
<ide> with self.cached_session():
<del> fan_in, _ = _compute_fans(tensor_shape)
<del> std = np.sqrt(2.0 / fan_in)
<del> self._runner(
<del> initializers.HeNormalV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_std=std,
<del> )
<add> self._runner(initializers.HeNormalV2(seed=123), tensor_shape)
<ide>
<ide> def test_orthogonal(self):
<ide> tensor_shape = (20, 20)
<ide> with self.cached_session():
<del> self._runner(
<del> initializers.OrthogonalV2(seed=123),
<del> tensor_shape,
<del> target_mean=0.0,
<del> )
<add> self._runner(initializers.OrthogonalV2(seed=123), tensor_shape)
<ide>
<ide> def test_identity(self):
<ide> with self.cached_session():
<ide> tensor_shape = (3, 4, 5)
<ide> with self.assertRaises(ValueError):
<del> self._runner(
<del> initializers.IdentityV2(),
<del> tensor_shape,
<del> target_mean=1.0 / tensor_shape[0],
<del> target_max=1.0,
<del> )
<add> self._runner(initializers.IdentityV2(), tensor_shape)
<ide>
<ide> tensor_shape = (3, 3)
<del> self._runner(
<del> initializers.IdentityV2(),
<del> tensor_shape,
<del> target_mean=1.0 / tensor_shape[0],
<del> target_max=1.0,
<del> )
<add> self._runner(initializers.IdentityV2(), tensor_shape)
<ide>
<ide> def test_zero(self):
<ide> tensor_shape = (4, 5)
<ide> with self.cached_session():
<del> self._runner(
<del> initializers.ZerosV2(),
<del> tensor_shape,
<del> target_mean=0.0,
<del> target_max=0.0,
<del> )
<add> self._runner(initializers.ZerosV2(), tensor_shape)
<ide>
<ide> def test_one(self):
<ide> tensor_shape = (4, 5)
<ide> with self.cached_session():
<del> self._runner(
<del> initializers.OnesV2(),
<del> tensor_shape,
<del> target_mean=1.0,
<del> target_max=1.0,
<del> )
<add> self._runner(initializers.OnesV2(), tensor_shape)
<ide>
<ide> def test_default_random_uniform(self):
<ide> ru = initializers.get("uniform")
<ide> def test_load_external_variance_scaling_v2(self):
<ide> ("RandomUniform_seeded", initializers.RandomUniformV2, {"seed": 123}),
<ide> ("RandomNormal", initializers.RandomNormalV2, {}),
<ide> ("RandomNormal_seeded", initializers.RandomNormalV2, {"seed": 123}),
<del> ("TruncatedNormal", initializers.TruncatedNormalV2, {}),
<del> (
<del> "TruncatedNormal_seeded",
<del> initializers.TruncatedNormalV2,
<del> {"seed": 123},
<del> ),
<add> # TODO(scottzhu): Enable these tests after the forward compat period
<add> # expires for TruncatedNormalV2.
<add> # ("TruncatedNormal", initializers.TruncatedNormalV2, {}),
<add> # (
<add> # "TruncatedNormal_seeded",
<add> # initializers.TruncatedNormalV2,
<add> # {"seed": 123},
<add> # ),
<ide> ("LecunUniform", initializers.LecunUniformV2, {}),
<ide> ("LecunUniform_seeded", initializers.LecunUniformV2, {"seed": 123}),
<ide> ("GlorotUniform", initializers.GlorotUniformV2, {}),
<ide> def test_partition(self, initializer_cls, kwargs):
<ide>
<ide> # Make sure initializer produce same result when provide same
<ide> # partition offset.
<del> # TODO(scottzhu): Enable this assert when initializer is fully
<del> # stateless
<del> # result_3 = initializer(
<del> # shape=(4, 2), partition_shape=(2, 2), partition_offset=(1,
<del> # 0))
<del> # self.assertAllClose(result_2, result_3)
<add> result_3 = initializer(
<add> shape=(4, 2),
<add> partition_shape=(2, 2),
<add> partition_offset=(1, 0),
<add> )
<add> self.assertAllClose(result_2, result_3)
<ide>
<ide> @parameterized.named_parameters(
<ide> ("Orthogonal", initializers.OrthogonalV2),
<ide> def test_partition_unsupported(self, initializer_cls):
<ide> shape=(4, 2), partition_shape=(2, 2), partition_offset=(0, 0)
<ide> )
<ide>
<add> @parameterized.parameters(RANDOM_INITIALIZERS)
<add> def test_stateless(self, initializer_cl):
<add> with self.cached_session():
<add> initializer = initializer_cl()
<add> output1 = initializer(shape=[2, 3])
<add> output2 = initializer(shape=[2, 3])
<add> initializer2 = initializer_cl()
<add> output3 = initializer2(shape=[2, 3])
<add> output4 = initializer2(shape=[2, 3])
<add>
<add> self.assertAllClose(output1, output2)
<add> self.assertAllClose(output3, output4)
<add> self.assertNotAllClose(output1, output3)
<add>
<add> with warnings.catch_warnings(record=True) as w:
<add> initializer(shape=[2, 3])
<add> self.assertLen(w, 1)
<add> self.assertIn("being called multiple times", str(w[0].message))
<add>
<add> @parameterized.parameters(RANDOM_INITIALIZERS)
<add> def test_seed_stateless(self, initializer_cl):
<add> with self.cached_session():
<add> seed = 1337
<add> initializer = initializer_cl(seed=seed)
<add> output1 = initializer(shape=[2, 3])
<add> output2 = initializer(shape=[2, 3])
<add> initializer2 = initializer_cl(seed=seed)
<add> output3 = initializer2(shape=[2, 3])
<add> output4 = initializer2(shape=[2, 3])
<add>
<add> self.assertAllClose(output1, output2)
<add> self.assertAllClose(output3, output4)
<add> self.assertAllClose(output1, output3)
<add>
<add> # We don't raise warning for seeded initializer.
<add> with warnings.catch_warnings(record=True) as w:
<add> initializer(shape=[2, 3])
<add> self.assertEmpty(w)
<add>
<ide>
<ide> if __name__ == "__main__":
<ide> tf.test.main()
<ide><path>keras/initializers/initializers_v1.py
<ide> class RandomNormal(tf.compat.v1.random_normal_initializer):
<ide>
<ide> `compat.v1` Fixed seed behavior:
<ide>
<del> >>> initializer = tf.compat.v1.keras.initializers.TruncatedNormal(seed=10)
<add> >>> initializer = tf.compat.v1.keras.initializers.RandomNormal(seed=10)
<ide> >>> a = initializer(shape=(2, 2))
<ide> >>> b = initializer(shape=(2, 2))
<ide> >>> tf.reduce_sum(a - b) == 0
<ide> <tf.Tensor: shape=(), dtype=bool, numpy=False>
<ide>
<ide> After:
<ide>
<del> >>> initializer = tf.keras.initializers.TruncatedNormal(seed=10)
<add> >>> initializer = tf.keras.initializers.RandomNormal(seed=10)
<ide> >>> a = initializer(shape=(2, 2))
<ide> >>> b = initializer(shape=(2, 2))
<ide> >>> tf.reduce_sum(a - b) == 0
<del> <tf.Tensor: shape=(), dtype=bool, numpy=False>
<add> <tf.Tensor: shape=(), dtype=bool, numpy=True>
<ide>
<ide> @end_compatibility
<ide> """
<ide> class RandomUniform(tf.compat.v1.random_uniform_initializer):
<ide> >>> a = initializer(shape=(2, 2))
<ide> >>> b = initializer(shape=(2, 2))
<ide> >>> tf.reduce_sum(a - b) == 0
<del> <tf.Tensor: shape=(), dtype=bool, numpy=False>
<add> <tf.Tensor: shape=(), dtype=bool, numpy=True>
<ide>
<ide> @end_compatibility
<ide> """
<ide><path>keras/initializers/initializers_v2.py
<ide> """Keras initializers for TF 2."""
<ide>
<ide> import math
<add>import warnings
<ide>
<ide> import tensorflow.compat.v2 as tf
<ide>
<ide> def from_config(cls, config):
<ide> config.pop("dtype", None)
<ide> return cls(**config)
<ide>
<add> def _warn_reuse(self):
<add> if getattr(self, "_used", False):
<add> if getattr(self, "seed", None) is None:
<add> warnings.warn(
<add> f"The initializer {self.__class__.__name__} is unseeded "
<add> "and being called multiple times, which will return "
<add> "identical values each time (even if the initializer is "
<add> "unseeded). Please update your code to provide a seed to "
<add> "the initializer, or avoid using the same initalizer "
<add> "instance more than once."
<add> )
<add> else:
<add> self._used = True
<add>
<ide>
<ide> @keras_export("keras.initializers.Zeros", "keras.initializers.zeros", v1=[])
<ide> class Zeros(Initializer):
<ide> class RandomUniform(Initializer):
<ide> maxval: A python scalar or a scalar tensor. Upper bound of the range of
<ide> random values to generate (exclusive).
<ide> seed: A Python integer. Used to make the behavior of the initializer
<del> deterministic. Note that a seeded initializer will not produce the same
<del> random values across multiple calls, but multiple initializers will
<del> produce the same sequence when constructed with the same seed value.
<add> deterministic. Note that a seeded initializer will produce the same
<add> random values across multiple calls.
<ide> """
<ide>
<ide> def __init__(self, minval=-0.05, maxval=0.05, seed=None):
<ide> self.minval = minval
<ide> self.maxval = maxval
<ide> self.seed = seed
<del> self._random_generator = backend.RandomGenerator(seed)
<add> self._random_generator = backend.RandomGenerator(
<add> seed, rng_type="stateless"
<add> )
<ide>
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> """Returns a tensor object initialized as specified by the initializer.
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> if _PARTITION_SHAPE in kwargs:
<ide> shape = kwargs[_PARTITION_SHAPE]
<ide> partition_offset = kwargs.get(_PARTITION_OFFSET, None)
<add> if partition_offset is None:
<add> # We skip the reuse warning for partitioned variable, since the same
<add> # initializer will be called multiple times for each partition.
<add> self._warn_reuse()
<ide> nonce = hash(partition_offset) if partition_offset else None
<ide> layout = kwargs.pop("layout", None)
<ide> if layout:
<del> self._random_generator._rng_type = (
<del> self._random_generator.RNG_STATEFUL
<del> )
<ide> _ensure_keras_seeded()
<ide> return utils.call_with_layout(
<ide> self._random_generator.random_uniform,
<ide> class RandomNormal(Initializer):
<ide> stddev: a python scalar or a scalar tensor. Standard deviation of the
<ide> random values to generate.
<ide> seed: A Python integer. Used to make the behavior of the initializer
<del> deterministic. Note that a seeded initializer will not produce the same
<del> random values across multiple calls, but multiple initializers will
<del> produce the same sequence when constructed with the same seed value.
<add> deterministic. Note that a seeded initializer will produce the same
<add> random values across multiple calls.
<ide> """
<ide>
<ide> def __init__(self, mean=0.0, stddev=0.05, seed=None):
<ide> self.mean = mean
<ide> self.stddev = stddev
<ide> self.seed = seed
<del> self._random_generator = backend.RandomGenerator(seed)
<add> self._random_generator = backend.RandomGenerator(
<add> seed, rng_type="stateless"
<add> )
<ide>
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> """Returns a tensor object initialized to random normal values.
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> if _PARTITION_SHAPE in kwargs:
<ide> shape = kwargs[_PARTITION_SHAPE]
<ide> partition_offset = kwargs.get(_PARTITION_OFFSET, None)
<add> if partition_offset is None:
<add> # We skip the reuse warning for partitioned variable, since the same
<add> # initializer will be called multiple times for each partition.
<add> self._warn_reuse()
<ide> nonce = hash(partition_offset) if partition_offset else None
<ide> layout = kwargs.pop("layout", None)
<ide> if layout:
<del> self._random_generator._rng_type = (
<del> self._random_generator.RNG_STATEFUL
<del> )
<ide> _ensure_keras_seeded()
<ide> return utils.call_with_layout(
<ide> self._random_generator.random_normal,
<ide> class TruncatedNormal(Initializer):
<ide> stddev: a python scalar or a scalar tensor. Standard deviation of the
<ide> random values to generate before truncation.
<ide> seed: A Python integer. Used to make the behavior of the initializer
<del> deterministic. Note that a seeded initializer will not produce the same
<del> random values across multiple calls, but multiple initializers will
<del> produce the same sequence when constructed with the same seed value.
<add> deterministic. Note that a seeded initializer will produce the same
<add> random values across multiple calls.
<ide> """
<ide>
<ide> def __init__(self, mean=0.0, stddev=0.05, seed=None):
<ide> self.mean = mean
<ide> self.stddev = stddev
<ide> self.seed = seed
<del> self._random_generator = backend.RandomGenerator(seed)
<add> if tf.compat.forward_compatible(2022, 6, 24):
<add> # Use the new stateless implementation after the forward compat date
<add> # is reached.
<add> self._random_generator = backend.RandomGenerator(
<add> seed, rng_type="stateless"
<add> )
<add> else:
<add> # TODO(scottzhu): Remove this after the forward compat date expires.
<add> self._random_generator = backend.RandomGenerator(seed)
<ide>
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> """Returns a tensor object initialized to random normal values (truncated).
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> if _PARTITION_SHAPE in kwargs:
<ide> shape = kwargs[_PARTITION_SHAPE]
<ide> partition_offset = kwargs.get(_PARTITION_OFFSET, None)
<add> if partition_offset is None:
<add> # We skip the reuse warning for partitioned variable, since the same
<add> # initializer will be called multiple times for each partition.
<add> self._warn_reuse()
<ide> nonce = hash(partition_offset) if partition_offset else None
<ide> layout = kwargs.pop("layout", None)
<ide> if layout:
<add> # TODO(scottzhu): Remove this once the forward compat period above
<add> # is expired.
<ide> self._random_generator._rng_type = (
<ide> self._random_generator.RNG_STATEFUL
<ide> )
<ide> class VarianceScaling(Initializer):
<ide> distribution: Random distribution to use. One of "truncated_normal",
<ide> "untruncated_normal" and "uniform".
<ide> seed: A Python integer. Used to make the behavior of the initializer
<del> deterministic. Note that a seeded initializer will not produce the same
<del> random values across multiple calls, but multiple initializers will
<del> produce the same sequence when constructed with the same seed value.
<add> deterministic. Note that a seeded initializer will produce the same
<add> random values across multiple calls.
<ide> """
<ide>
<ide> def __init__(
<ide> def __init__(
<ide> self.mode = mode
<ide> self.distribution = distribution
<ide> self.seed = seed
<del> self._random_generator = backend.RandomGenerator(seed)
<add> self._random_generator = backend.RandomGenerator(
<add> seed, rng_type="stateless"
<add> )
<ide>
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> """Returns a tensor object initialized as specified by the initializer.
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> if _PARTITION_SHAPE in kwargs:
<ide> shape = kwargs[_PARTITION_SHAPE]
<ide> partition_offset = kwargs.get(_PARTITION_OFFSET, None)
<add> if partition_offset is None:
<add> # We skip the reuse warning for partitioned variable, since the same
<add> # initializer will be called multiple times for each partition.
<add> self._warn_reuse()
<ide> nonce = hash(partition_offset) if partition_offset else None
<ide> layout = kwargs.pop("layout", None)
<ide> if layout:
<del> self._random_generator._rng_type = (
<del> self._random_generator.RNG_STATEFUL
<del> )
<ide> _ensure_keras_seeded()
<ide> return utils.call_with_layout(
<ide> self._generate_init_val,
<ide> class Orthogonal(Initializer):
<ide> Args:
<ide> gain: multiplicative factor to apply to the orthogonal matrix
<ide> seed: A Python integer. Used to make the behavior of the initializer
<del> deterministic. Note that a seeded initializer will not produce the same
<del> random values across multiple calls, but multiple initializers will
<del> produce the same sequence when constructed with the same seed value.
<add> deterministic. Note that a seeded initializer will produce the same
<add> random values across multiple calls.
<ide>
<ide> References:
<ide> - [Saxe et al., 2014](https://openreview.net/forum?id=_wzZwKpTDF_9C)
<ide> class Orthogonal(Initializer):
<ide> def __init__(self, gain=1.0, seed=None):
<ide> self.gain = gain
<ide> self.seed = seed
<del> self._random_generator = backend.RandomGenerator(seed)
<add> self._random_generator = backend.RandomGenerator(
<add> seed, rng_type="stateless"
<add> )
<ide>
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> """Returns a tensor object initialized to an orthogonal matrix.
<ide> def __call__(self, shape, dtype=None, **kwargs):
<ide> "at least two-dimensional. Received: "
<ide> f"shape={shape} of rank {len(shape)}."
<ide> )
<add> self._warn_reuse()
<ide> layout = kwargs.pop("layout", None)
<ide> if layout:
<del> self._random_generator._rng_type = (
<del> self._random_generator.RNG_STATEFUL
<del> )
<ide> _ensure_keras_seeded()
<ide> return utils.call_with_layout(
<ide> self._generate_init_val, layout, shape=shape, dtype=dtype
<ide><path>keras/legacy_tf_layers/migration_utils_test.py
<ide>
<ide> import tensorflow as tf
<ide>
<del>from keras.initializers import GlorotUniform as V2GlorotUniform
<ide> from keras.legacy_tf_layers import migration_utils
<ide>
<ide>
<ide> def test_constant_mode_no_seed(self):
<ide> """
<ide>
<ide> # Generate three random tensors to show how the stateful random number
<del> # generation and glorot_uniform_initializer match between sessions and
<del> # eager execution.
<add> # generation match between sessions and eager execution.
<ide> random_tool = migration_utils.DeterministicRandomTestTool()
<ide> with random_tool.scope():
<ide> graph = tf.Graph()
<ide> def test_constant_mode_no_seed(self):
<ide> b = b * 3
<ide> c = tf.compat.v1.random.uniform(shape=(3, 3))
<ide> c = c * 3
<del> d = tf.compat.v1.glorot_uniform_initializer()(
<del> shape=(6, 6), dtype=tf.float32
<del> )
<del> graph_a, graph_b, graph_c, graph_d = sess.run([a, b, c, d])
<add> graph_a, graph_b, graph_c = sess.run([a, b, c])
<ide>
<ide> a = tf.compat.v2.random.uniform(shape=(3, 1))
<ide> a = a * 3
<ide> b = tf.compat.v2.random.uniform(shape=(3, 3))
<ide> b = b * 3
<ide> c = tf.compat.v2.random.uniform(shape=(3, 3))
<ide> c = c * 3
<del> d = V2GlorotUniform()(shape=(6, 6), dtype=tf.float32)
<ide> # validate that the generated random tensors match
<ide> self.assertAllClose(graph_a, a)
<ide> self.assertAllClose(graph_b, b)
<ide> self.assertAllClose(graph_c, c)
<del> self.assertAllClose(graph_d, d)
<ide> # In constant mode, because b and c were generated with the same seed
<ide> # within the same scope and have the same shape, they will have exactly
<ide> # the same values.
<ide> def test_constant_mode_seed_argument(self):
<ide> a = a * 3
<ide> b = tf.compat.v1.random.uniform(shape=(3, 3), seed=1234)
<ide> b = b * 3
<del> c = tf.compat.v1.glorot_uniform_initializer(seed=1234)(
<del> shape=(6, 6), dtype=tf.float32
<del> )
<del> graph_a, graph_b, graph_c = sess.run([a, b, c])
<add> graph_a, graph_b = sess.run([a, b])
<ide> a = tf.compat.v2.random.uniform(shape=(3, 1), seed=1234)
<ide> a = a * 3
<ide> b = tf.compat.v2.random.uniform(shape=(3, 3), seed=1234)
<ide> b = b * 3
<del> c = V2GlorotUniform(seed=1234)(shape=(6, 6), dtype=tf.float32)
<ide>
<ide> # validate that the generated random tensors match
<ide> self.assertAllClose(graph_a, a)
<ide> self.assertAllClose(graph_b, b)
<del> self.assertAllClose(graph_c, c)
<ide>
<ide> def test_num_rand_ops(self):
<ide> """Test random tensor generation consistancy in num_random_ops mode.
<ide> def test_num_rand_ops(self):
<ide> b = b * 3
<ide> c = tf.compat.v1.random.uniform(shape=(3, 3))
<ide> c = c * 3
<del> d = tf.compat.v1.glorot_uniform_initializer()(
<del> shape=(6, 6), dtype=tf.float32
<del> )
<del> graph_a, graph_b, graph_c, graph_d = sess.run([a, b, c, d])
<add> graph_a, graph_b, graph_c = sess.run([a, b, c])
<ide>
<ide> random_tool = migration_utils.DeterministicRandomTestTool(
<ide> mode="num_random_ops"
<ide> def test_num_rand_ops(self):
<ide> b = b * 3
<ide> c = tf.compat.v2.random.uniform(shape=(3, 3))
<ide> c = c * 3
<del> d = V2GlorotUniform()(shape=(6, 6), dtype=tf.float32)
<ide> # validate that the generated random tensors match
<ide> self.assertAllClose(graph_a, a)
<ide> self.assertAllClose(graph_b, b)
<ide> self.assertAllClose(graph_c, c)
<del> self.assertAllClose(graph_d, d)
<ide> # validate that the tensors differ based on ops ordering
<ide> self.assertNotAllClose(b, c)
<ide> self.assertNotAllClose(graph_b, graph_c) | 5 |
Javascript | Javascript | remove confusing typo | 74c32e2609d7d76cd5adcbd55cc57c96b100ef95 | <ide><path>Libraries/PermissionsAndroid/PermissionsAndroid.js
<ide> type Rationale = {
<ide> * ```
<ide> * async function requestCameraPermission() {
<ide> * try {
<del> * const granted = await AndroidPermissions.requestPermission(
<del> * AndroidPermissions.PERMISSIONS.CAMERA,
<add> * const granted = await PermissionsAndroid.requestPermission(
<add> * PermissionsAndroid.PERMISSIONS.CAMERA,
<ide> * {
<ide> * 'title': 'Cool Photo App Camera Permission',
<ide> * 'message': 'Cool Photo App needs access to your camera ' + | 1 |
Mixed | Go | add insecure registries to docker info | 44a50abe7b16368bdc8b70e01cb095dc46cbbbaf | <ide><path>api/client/info.go
<ide> func (cli *DockerCli) CmdInfo(args ...string) error {
<ide> if info.ClusterAdvertise != "" {
<ide> fmt.Fprintf(cli.out, "Cluster Advertise: %s\n", info.ClusterAdvertise)
<ide> }
<add>
<add> if info.RegistryConfig != nil && (len(info.RegistryConfig.InsecureRegistryCIDRs) > 0 || len(info.RegistryConfig.IndexConfigs) > 0) {
<add> fmt.Fprintln(cli.out, "Insecure registries:")
<add> for _, registry := range info.RegistryConfig.IndexConfigs {
<add> if registry.Secure == false {
<add> fmt.Fprintf(cli.out, " %s\n", registry.Name)
<add> }
<add> }
<add>
<add> for _, registry := range info.RegistryConfig.InsecureRegistryCIDRs {
<add> mask, _ := registry.Mask.Size()
<add> fmt.Fprintf(cli.out, " %s/%d\n", registry.IP.String(), mask)
<add> }
<add> }
<ide> return nil
<ide> }
<ide><path>docs/reference/commandline/info.md
<ide> For example:
<ide> Registry: [https://index.docker.io/v1/]
<ide> Labels:
<ide> storage=ssd
<add> Insecure registries:
<add> myinsecurehost:5000
<add> 127.0.0.0/8
<ide>
<ide> The global `-D` option tells all `docker` commands to output debug information.
<ide>
<ide><path>integration-cli/docker_cli_info_test.go
<ide> func (s *DockerSuite) TestInfoDebug(c *check.C) {
<ide> c.Assert(out, checker.Contains, "EventsListeners")
<ide> c.Assert(out, checker.Contains, "Docker Root Dir")
<ide> }
<add>
<add>func (s *DockerSuite) TestInsecureRegistries(c *check.C) {
<add> testRequires(c, SameHostDaemon, DaemonIsLinux)
<add>
<add> registryCIDR := "192.168.1.0/24"
<add> registryHost := "insecurehost.com:5000"
<add>
<add> d := NewDaemon(c)
<add> err := d.Start("--insecure-registry="+registryCIDR, "--insecure-registry="+registryHost)
<add> c.Assert(err, checker.IsNil)
<add> defer d.Stop()
<add>
<add> out, err := d.Cmd("info")
<add> c.Assert(err, checker.IsNil)
<add> c.Assert(out, checker.Contains, "Insecure registries:\n")
<add> c.Assert(out, checker.Contains, fmt.Sprintf(" %s\n", registryHost))
<add> c.Assert(out, checker.Contains, fmt.Sprintf(" %s\n", registryCIDR))
<add>}
<ide><path>man/docker-info.1.md
<ide> Here is a sample output:
<ide> Debug mode (server): false
<ide> Username: xyz
<ide> Registry: https://index.docker.io/v1/
<add> Insecure registries:
<add> myinsecurehost:5000
<add> 127.0.0.0/8
<ide>
<ide> # HISTORY
<ide> April 2014, Originally compiled by William Henry (whenry at redhat dot com) | 4 |
Javascript | Javascript | modernize js and tighten equality checking | d59074c1cbf7c03f90bb5d850036f6153fe9464d | <ide><path>test/parallel/test-child-process-detached.js
<ide> 'use strict';
<del>var common = require('../common');
<del>var assert = require('assert');
<del>var path = require('path');
<add>const common = require('../common');
<add>const assert = require('assert');
<add>const path = require('path');
<ide>
<del>var spawn = require('child_process').spawn;
<del>var childPath = path.join(common.fixturesDir,
<del> 'parent-process-nonpersistent.js');
<del>var persistentPid = -1;
<add>const spawn = require('child_process').spawn;
<add>const childPath = path.join(common.fixturesDir,
<add> 'parent-process-nonpersistent.js');
<add>let persistentPid = -1;
<ide>
<del>var child = spawn(process.execPath, [ childPath ]);
<add>const child = spawn(process.execPath, [ childPath ]);
<ide>
<ide> child.stdout.on('data', function(data) {
<ide> persistentPid = parseInt(data, 10);
<ide> });
<ide>
<ide> process.on('exit', function() {
<del> assert(persistentPid !== -1);
<add> assert.notStrictEqual(persistentPid, -1);
<ide> assert.throws(function() {
<ide> process.kill(child.pid);
<ide> }); | 1 |
Javascript | Javascript | log hmr events | 20588a6bf869ae06e690c76fe8978f198d009505 | <ide><path>local-cli/server/util/attachHMRServer.js
<ide> function attachHMRServer({httpServer, path, packagerServer}) {
<ide> if (!client) {
<ide> return;
<ide> }
<add> console.log(
<add> `[Hot Module Replacement] File change detected (${time()})`
<add> );
<ide>
<ide> client.ws.send(JSON.stringify({type: 'update-start'}));
<ide> stat.then(() => {
<ide> function attachHMRServer({httpServer, path, packagerServer}) {
<ide> return;
<ide> }
<ide>
<add> console.log(
<add> '[Hot Module Replacement] Sending HMR update to client (' +
<add> time() + ')'
<add> );
<ide> client.ws.send(update);
<ide> });
<ide> },
<ide> function arrayEquals(arrayA, arrayB) {
<ide> );
<ide> }
<ide>
<add>function time() {
<add> const date = new Date();
<add> return `${date.getHours()}:${date.getMinutes()}:${date.getSeconds()}:${date.getMilliseconds()}`;
<add>}
<add>
<ide> module.exports = attachHMRServer; | 1 |
Ruby | Ruby | add a bind collector, remove the bind visitor | ee54e9bb3a6d7ffcf756ec1e385399b92354cb6b | <ide><path>activerecord/lib/active_record/connection_adapters/abstract/database_statements.rb
<ide> def initialize
<ide> def to_sql(arel, binds = [])
<ide> if arel.respond_to?(:ast)
<ide> binds = binds.dup
<del> visitor.compile(arel.ast) do
<del> quote(*binds.shift.reverse)
<del> end
<add> visitor.accept(arel.ast, collector).compile binds.dup
<ide> else
<ide> arel
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/abstract_adapter.rb
<ide> def expire
<ide> @owner = nil
<ide> end
<ide>
<del> def unprepared_visitor
<del> self.class::BindSubstitution.new self
<del> end
<del>
<ide> def unprepared_statement
<ide> old_prepared_statements, @prepared_statements = @prepared_statements, false
<del> old_visitor, @visitor = @visitor, unprepared_visitor
<ide> yield
<ide> ensure
<del> @visitor, @prepared_statements = old_visitor, old_prepared_statements
<add> @prepared_statements = old_prepared_statements
<ide> end
<ide>
<ide> # Returns the human-readable name of the adapter. Use mixed case - one
<ide><path>activerecord/lib/active_record/connection_adapters/sqlite3_adapter.rb
<ide> def dealloc(stmt)
<ide> end
<ide> end
<ide>
<del> class BindSubstitution < Arel::Visitors::SQLite # :nodoc:
<del> include Arel::Visitors::BindVisitor
<del> end
<del>
<ide> def initialize(connection, logger, config)
<ide> super(connection, logger)
<ide>
<ide> def initialize(connection, logger, config)
<ide> self.class.type_cast_config_to_integer(config.fetch(:statement_limit) { 1000 }))
<ide> @config = config
<ide>
<add> @visitor = Arel::Visitors::SQLite.new self
<add>
<ide> if self.class.type_cast_config_to_boolean(config.fetch(:prepared_statements) { true })
<ide> @prepared_statements = true
<del> @visitor = Arel::Visitors::SQLite.new self
<ide> else
<del> @visitor = unprepared_visitor
<add> @prepared_statements = false
<add> end
<add> end
<add>
<add> class BindCollector < Arel::Collectors::Bind
<add> def initialize(conn)
<add> @conn = conn
<add> super()
<add> end
<add>
<add> def compile(bvs)
<add> super(bvs.map { |bv| @conn.quote(*bv.reverse) })
<add> end
<add> end
<add>
<add> def collector
<add> if @prepared_statements
<add> Arel::Collectors::SQLString.new
<add> else
<add> BindCollector.new self
<ide> end
<ide> end
<ide> | 3 |
Java | Java | add config for avoiding jni refs | d53bfed9bb0904db6ec3409e819ee5c994fb1a23 | <ide><path>ReactAndroid/src/main/java/com/facebook/yoga/YogaConfig.java
<ide> public class YogaConfig {
<ide> long mNativePointer;
<ide> private YogaLogger mLogger;
<ide> private YogaNodeCloneFunction mYogaNodeCloneFunction;
<add> public boolean avoidGlobalJNIRefs = false;
<ide>
<ide> private native long jni_YGConfigNew();
<ide> public YogaConfig() { | 1 |
Mixed | Javascript | fix shell spawn with autorun | b90f3da9deaeda2b04a77af2e11a4b6d33a0c510 | <ide><path>doc/api/child_process.md
<ide> added: v0.1.90
<ide> * `encoding` {String} (Default: `'utf8'`)
<ide> * `shell` {String} Shell to execute the command with
<ide> (Default: `'/bin/sh'` on UNIX, `'cmd.exe'` on Windows, The shell should
<del> understand the `-c` switch on UNIX or `/s /c` on Windows. On Windows,
<add> understand the `-c` switch on UNIX or `/d /s /c` on Windows. On Windows,
<ide> command line parsing should be compatible with `cmd.exe`.)
<ide> * `timeout` {Number} (Default: `0`)
<ide> * [`maxBuffer`][] {Number} largest amount of data (in bytes) allowed on
<ide> added: v0.1.90
<ide> * `shell` {Boolean|String} If `true`, runs `command` inside of a shell. Uses
<ide> `'/bin/sh'` on UNIX, and `'cmd.exe'` on Windows. A different shell can be
<ide> specified as a string. The shell should understand the `-c` switch on UNIX,
<del> or `/s /c` on Windows. Defaults to `false` (no shell).
<add> or `/d /s /c` on Windows. Defaults to `false` (no shell).
<ide> * return: {ChildProcess}
<ide>
<ide> The `child_process.spawn()` method spawns a new process using the given
<ide> added: v0.11.12
<ide> * `env` {Object} Environment key-value pairs
<ide> * `shell` {String} Shell to execute the command with
<ide> (Default: `'/bin/sh'` on UNIX, `'cmd.exe'` on Windows, The shell should
<del> understand the `-c` switch on UNIX or `/s /c` on Windows. On Windows,
<add> understand the `-c` switch on UNIX or `/d /s /c` on Windows. On Windows,
<ide> command line parsing should be compatible with `cmd.exe`.)
<ide> * `uid` {Number} Sets the user identity of the process. (See setuid(2).)
<ide> * `gid` {Number} Sets the group identity of the process. (See setgid(2).)
<ide> added: v0.11.12
<ide> * `shell` {Boolean|String} If `true`, runs `command` inside of a shell. Uses
<ide> `'/bin/sh'` on UNIX, and `'cmd.exe'` on Windows. A different shell can be
<ide> specified as a string. The shell should understand the `-c` switch on UNIX,
<del> or `/s /c` on Windows. Defaults to `false` (no shell).
<add> or `/d /s /c` on Windows. Defaults to `false` (no shell).
<ide> * return: {Object}
<ide> * `pid` {Number} Pid of the child process
<ide> * `output` {Array} Array of results from stdio output
<ide><path>lib/child_process.js
<ide> function normalizeSpawnArguments(file /*, args, options*/) {
<ide> if (process.platform === 'win32') {
<ide> file = typeof options.shell === 'string' ? options.shell :
<ide> process.env.comspec || 'cmd.exe';
<del> args = ['/s', '/c', '"' + command + '"'];
<add> args = ['/d', '/s', '/c', '"' + command + '"'];
<ide> options.windowsVerbatimArguments = true;
<ide> } else {
<ide> if (typeof options.shell === 'string')
<ide><path>test/common.js
<ide> exports.spawnPwd = function(options) {
<ide> var spawn = require('child_process').spawn;
<ide>
<ide> if (exports.isWindows) {
<del> return spawn('cmd.exe', ['/c', 'cd'], options);
<add> return spawn('cmd.exe', ['/d', '/c', 'cd'], options);
<ide> } else {
<ide> return spawn('pwd', [], options);
<ide> }
<ide> exports.spawnSyncPwd = function(options) {
<ide> const spawnSync = require('child_process').spawnSync;
<ide>
<ide> if (exports.isWindows) {
<del> return spawnSync('cmd.exe', ['/c', 'cd'], options);
<add> return spawnSync('cmd.exe', ['/d', '/c', 'cd'], options);
<ide> } else {
<ide> return spawnSync('pwd', [], options);
<ide> } | 3 |
PHP | PHP | fix cs/psalm errors | 9c9663bb1d2591e479ea09550137395cea2ce3b4 | <ide><path>src/TestSuite/Fixture/FixtureInjector.php
<ide> */
<ide> namespace Cake\TestSuite\Fixture;
<ide>
<del>use Cake\TestSuite\Fix;
<ide> use Cake\TestSuite\TestCase;
<ide> use Cake\TestSuite\TestListenerTrait;
<ide> use PHPUnit\Framework\Test;
<ide> class FixtureInjector implements TestListener
<ide> {
<ide> use TestListenerTrait;
<ide>
<add> /**
<add> * The instance of the fixture manager to use
<add> *
<add> * @var \Cake\TestSuite\Fixture\FixtureManager
<add> */
<add> protected $_fixtureManager;
<add>
<ide> /**
<ide> * Holds a reference to the container test suite
<ide> *
<ide><path>src/TestSuite/Fixture/FixtureManager.php
<ide> public function unload(TestCase $test): void
<ide> }
<ide>
<ide> /**
<del> * @inheritDoc
<add> * Loads the data for a single fixture.
<add> *
<add> * @param string $name of the fixture
<add> * @param \Cake\Datasource\ConnectionInterface|null $connection Connection instance or null
<add> * to get a Connection from the fixture.
<add> * @param bool $dropTables Whether or not to drop tables.
<add> * @return void
<add> * @throws \UnexpectedValueException if $name is not a previously fixtures class
<ide> */
<ide> public function loadSingle(string $name, ?ConnectionInterface $connection = null, bool $dropTables = true): void
<ide> {
<ide><path>tests/TestCase/TestSuite/TestCaseTest.php
<ide> use Cake\Routing\Exception\MissingRouteException;
<ide> use Cake\Routing\Router;
<ide> use Cake\Test\Fixture\FixturizedTestCase;
<del>use Cake\TestSuite\Fixture\FixtureManager;
<ide> use Cake\TestSuite\TestCase;
<ide> use PHPUnit\Framework\AssertionFailedError;
<ide> use TestApp\Model\Table\SecondaryPostsTable; | 3 |
Text | Text | add note about buffer octets integer coercion | 0e803d1d81375e0fa9318281d67b08f84cddf04a | <ide><path>doc/api/buffer.md
<ide> streams in TCP streams, file system operations, and other contexts.
<ide> With [`TypedArray`] now available, the `Buffer` class implements the
<ide> [`Uint8Array`] API in a manner that is more optimized and suitable for Node.js.
<ide>
<del>Instances of the `Buffer` class are similar to arrays of integers but
<add>Instances of the `Buffer` class are similar to arrays of integers from `0` to
<add>`255` (other integers are coerced to this range by `& 255` operation) but
<ide> correspond to fixed-sized, raw memory allocations outside the V8 heap.
<ide> The size of the `Buffer` is established when it is created and cannot be
<ide> changed. | 1 |
PHP | PHP | add requesttarget implementation | 1b7bb0a370ef47284b10ab85189bde52cafb8c35 | <ide><path>src/Network/Request.php
<ide> class Request implements ArrayAccess
<ide> */
<ide> protected $protocol;
<ide>
<add> /**
<add> * The request target if overridden
<add> *
<add> * @var string|null
<add> */
<add> protected $requestTarget;
<add>
<ide> /**
<ide> * Wrapper method to create a new request from PHP superglobals.
<ide> *
<ide> public function addPaths(array $paths)
<ide> *
<ide> * @param bool $base Include the base path, set to false to trim the base path off.
<ide> * @return string The current request URL including query string args.
<add> * @deprecated 3.4.0 This method will be removed in 4.0.0. You should use getRequestTarget() instead.
<ide> */
<ide> public function here($base = true)
<ide> {
<ide> public function withUri(UriInterface $uri)
<ide> return $new;
<ide> }
<ide>
<add> /**
<add> * Create a new instance with a specific request-target.
<add> *
<add> * You can use this method to overwrite the request target that is
<add> * inferred from the request's Uri. This also lets you change the request
<add> * target's form to an absolute-form, authority-form or asterisk-form
<add> *
<add> * @link http://tools.ietf.org/html/rfc7230#section-2.7 (for the various
<add> * request-target forms allowed in request messages)
<add> * @param string $target The request target.
<add> * @return static
<add> */
<add> public function withRequestTarget($target)
<add> {
<add> $new = clone $this;
<add> $new->requestTarget = $target;
<add>
<add> return $new;
<add> }
<add>
<add> /**
<add> * Retrieves the request's target.
<add> *
<add> * Retrieves the message's request-target either as it was requested,
<add> * or as set with `withRequestTarget()`. By default this will return the
<add> * application relative path without base directory, and the query string
<add> * defined in the SERVER environment.
<add> *
<add> * @return string
<add> */
<add> public function getRequestTarget()
<add> {
<add> if ($this->requestTarget !== null) {
<add> return $this->requestTarget;
<add> }
<add>
<add> $target = $this->uri->getPath();
<add> if ($this->uri->getQuery()) {
<add> $target .= '?' . $this->uri->getQuery();
<add> }
<add>
<add> if (empty($target)) {
<add> $target = '/';
<add> }
<add>
<add> return $target;
<add> }
<add>
<ide> /**
<ide> * Array access read implementation
<ide> *
<ide><path>tests/TestCase/Network/RequestTest.php
<ide> public function testWithoutAttributesDenyEmulatedProperties($prop)
<ide> $request->withoutAttribute($prop);
<ide> }
<ide>
<add> /**
<add> * Test the requestTarget methods.
<add> *
<add> * @return void
<add> */
<add> public function testWithRequestTarget()
<add> {
<add> $request = new Request([
<add> 'environment' => [
<add> 'REQUEST_URI' => '/articles/view/1',
<add> 'QUERY_STRING' => 'comments=1&open=0'
<add> ],
<add> 'base' => '/basedir'
<add> ]);
<add> $this->assertEquals(
<add> '/articles/view/1?comments=1&open=0',
<add> $request->getRequestTarget(),
<add> 'Should not include basedir.'
<add> );
<add>
<add> $new = $request->withRequestTarget('/articles/view/3');
<add> $this->assertNotSame($new, $request);
<add> $this->assertEquals(
<add> '/articles/view/1?comments=1&open=0',
<add> $request->getRequestTarget(),
<add> 'should be unchanged.'
<add> );
<add> $this->assertEquals('/articles/view/3', $new->getRequestTarget(), 'reflects method call');
<add> }
<add>
<ide> /**
<ide> * Data provider for emulated property tests.
<ide> * | 2 |
PHP | PHP | add test case for empty $name | 254c0b16b59f1fe50a35cac0c7b69c67ea654016 | <ide><path>tests/Filesystem/FilesystemAdapterTest.php
<ide> public function testDownloadNonAsciiFilename()
<ide> $files = new FilesystemAdapter($this->filesystem);
<ide> $response = $files->download('file.txt', 'пиздюк.txt');
<ide> $this->assertInstanceOf(StreamedResponse::class, $response);
<add> $this->assertEquals("attachment; filename=pizdyuk.txt; filename*=utf-8''%D0%BF%D0%B8%D0%B7%D0%B4%D1%8E%D0%BA.txt", $response->headers->get('content-disposition'));
<add> }
<add>
<add> public function testDownloadNonAsciiEmptyFilename()
<add> {
<add> setlocale(LC_ALL,'en_US.UTF-8');
<add>
<add> $this->filesystem->write('пиздюк.txt', 'Hello World');
<add> $files = new FilesystemAdapter($this->filesystem);
<add> $response = $files->download('пиздюк.txt');
<add> $this->assertInstanceOf(StreamedResponse::class, $response);
<ide> $this->assertEquals('attachment; filename=pizdyuk.txt; filename*=utf-8\'\'%D0%BF%D0%B8%D0%B7%D0%B4%D1%8E%D0%BA.txt', $response->headers->get('content-disposition'));
<ide> }
<ide> | 1 |
Mixed | Javascript | move dep0023 to end of life | 853bee0acfb79ce64202c4e23dc4343b491fa864 | <ide><path>doc/api/deprecations.md
<ide> The `os.tmpDir()` API is deprecated. Please use [`os.tmpdir()`][] instead.
<ide> ### DEP0023: os.getNetworkInterfaces()
<ide> <!-- YAML
<ide> changes:
<add> - version: REPLACEME
<add> pr-url: https://github.com/nodejs/node/pull/25280
<add> description: End-of-Life.
<ide> - version:
<ide> - v4.8.6
<ide> - v6.12.0
<ide> changes:
<ide> description: Runtime deprecation.
<ide> -->
<ide>
<del>Type: Runtime
<add>Type: End-of-Life
<ide>
<ide> The `os.getNetworkInterfaces()` method is deprecated. Please use the
<del>[`os.networkInterfaces`][] property instead.
<add>[`os.networkInterfaces()`][] method instead.
<ide>
<ide> <a id="DEP0024"></a>
<ide> ### DEP0024: REPLServer.prototype.convertToContext()
<ide> Setting the TLS ServerName to an IP address is not permitted by
<ide> [`http.request()`]: http.html#http_http_request_options_callback
<ide> [`https.get()`]: https.html#https_https_get_options_callback
<ide> [`https.request()`]: https.html#https_https_request_options_callback
<del>[`os.networkInterfaces`]: os.html#os_os_networkinterfaces
<add>[`os.networkInterfaces()`]: os.html#os_os_networkinterfaces
<ide> [`os.tmpdir()`]: os.html#os_os_tmpdir
<ide> [`process.env`]: process.html#process_process_env
<ide> [`punycode`]: punycode.html
<ide><path>lib/os.js
<ide> const kEndianness = isBigEndian ? 'BE' : 'LE';
<ide> const tmpDirDeprecationMsg =
<ide> 'os.tmpDir() is deprecated. Use os.tmpdir() instead.';
<ide>
<del>const getNetworkInterfacesDepMsg =
<del> 'os.getNetworkInterfaces is deprecated. Use os.networkInterfaces instead.';
<del>
<ide> const avgValues = new Float64Array(3);
<ide>
<ide> function loadavg() {
<ide> module.exports = {
<ide> uptime: getUptime,
<ide>
<ide> // Deprecated APIs
<del> getNetworkInterfaces: deprecate(getInterfaceAddresses,
<del> getNetworkInterfacesDepMsg,
<del> 'DEP0023'),
<ide> tmpDir: deprecate(tmpdir, tmpDirDeprecationMsg, 'DEP0022')
<ide> };
<ide> | 2 |
Ruby | Ruby | move more schema modification tests | 06b8dc0a9bb556798c7cf26daf82ad4522c73f70 | <ide><path>activerecord/test/cases/migration/change_schema_test.rb
<ide> def test_column_exists_on_table_with_no_options_parameter_supplied
<ide> end
<ide> end
<ide>
<add> def test_add_index
<add> # Limit size of last_name and key columns to support Firebird index limitations
<add> connection.create_table :testings do |t|
<add> t.string :first_name
<add> t.string :last_name, :limit => 100
<add> t.string :key, :limit => 100
<add> t.boolean :administrator
<add> end
<add>
<add> connection.add_index("testings", "last_name")
<add> connection.remove_index("testings", "last_name")
<add>
<add> # Orcl nds shrt indx nms. Sybs 2.
<add> # OpenBase does not have named indexes. You must specify a single column name
<add> unless current_adapter?(:SybaseAdapter, :OpenBaseAdapter)
<add> connection.add_index("testings", ["last_name", "first_name"])
<add> connection.remove_index("testings", :column => ["last_name", "first_name"])
<add>
<add> # Oracle adapter cannot have specified index name larger than 30 characters
<add> # Oracle adapter is shortening index name when just column list is given
<add> unless current_adapter?(:OracleAdapter)
<add> connection.add_index("testings", ["last_name", "first_name"])
<add> connection.remove_index("testings", :name => :index_testings_on_last_name_and_first_name)
<add> connection.add_index("testings", ["last_name", "first_name"])
<add> connection.remove_index("testings", "last_name_and_first_name")
<add> end
<add> connection.add_index("testings", ["last_name", "first_name"])
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add>
<add> connection.add_index("testings", ["last_name"], :length => 10)
<add> connection.remove_index("testings", "last_name")
<add>
<add> connection.add_index("testings", ["last_name"], :length => {:last_name => 10})
<add> connection.remove_index("testings", ["last_name"])
<add>
<add> connection.add_index("testings", ["last_name", "first_name"], :length => 10)
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add>
<add> connection.add_index("testings", ["last_name", "first_name"], :length => {:last_name => 10, :first_name => 20})
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add> end
<add>
<add> # quoting
<add> # Note: changed index name from "key" to "key_idx" since "key" is a Firebird reserved word
<add> # OpenBase does not have named indexes. You must specify a single column name
<add> unless current_adapter?(:OpenBaseAdapter)
<add> connection.add_index("testings", ["key"], :name => "key_idx", :unique => true)
<add> connection.remove_index("testings", :name => "key_idx", :unique => true)
<add> end
<add>
<add> # Sybase adapter does not support indexes on :boolean columns
<add> # OpenBase does not have named indexes. You must specify a single column
<add> unless current_adapter?(:SybaseAdapter, :OpenBaseAdapter)
<add> connection.add_index("testings", %w(last_name first_name administrator), :name => "named_admin")
<add> connection.remove_index("testings", :name => "named_admin")
<add> end
<add>
<add> # Selected adapters support index sort order
<add> if current_adapter?(:SQLite3Adapter, :MysqlAdapter, :Mysql2Adapter, :PostgreSQLAdapter)
<add> connection.add_index("testings", ["last_name"], :order => {:last_name => :desc})
<add> connection.remove_index("testings", ["last_name"])
<add> connection.add_index("testings", ["last_name", "first_name"], :order => {:last_name => :desc})
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add> connection.add_index("testings", ["last_name", "first_name"], :order => {:last_name => :desc, :first_name => :asc})
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add> connection.add_index("testings", ["last_name", "first_name"], :order => :desc)
<add> connection.remove_index("testings", ["last_name", "first_name"])
<add> end
<add> end
<add>
<ide> private
<ide> def testing_table_with_only_foo_attribute
<ide> connection.create_table :testings, :id => false do |t|
<ide><path>activerecord/test/cases/migration_test.rb
<ide> def teardown
<ide> Person.reset_column_information
<ide> end
<ide>
<del> def test_add_index
<del> # Limit size of last_name and key columns to support Firebird index limitations
<del> Person.connection.add_column "people", "last_name", :string, :limit => 100
<del> Person.connection.add_column "people", "key", :string, :limit => 100
<del> Person.connection.add_column "people", "administrator", :boolean
<del>
<del> assert_nothing_raised { Person.connection.add_index("people", "last_name") }
<del> assert_nothing_raised { Person.connection.remove_index("people", "last_name") }
<del>
<del> # Orcl nds shrt indx nms. Sybs 2.
<del> # OpenBase does not have named indexes. You must specify a single column name
<del> unless current_adapter?(:SybaseAdapter, :OpenBaseAdapter)
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.remove_index("people", :column => ["last_name", "first_name"]) }
<del> # Oracle adapter cannot have specified index name larger than 30 characters
<del> # Oracle adapter is shortening index name when just column list is given
<del> unless current_adapter?(:OracleAdapter)
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.remove_index("people", :name => :index_people_on_last_name_and_first_name) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.remove_index("people", "last_name_and_first_name") }
<del> end
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name"], :length => 10) }
<del> assert_nothing_raised { Person.connection.remove_index("people", "last_name") }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name"], :length => {:last_name => 10}) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"], :length => 10) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"], :length => {:last_name => 10, :first_name => 20}) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> end
<del>
<del> # quoting
<del> # Note: changed index name from "key" to "key_idx" since "key" is a Firebird reserved word
<del> # OpenBase does not have named indexes. You must specify a single column name
<del> unless current_adapter?(:OpenBaseAdapter)
<del> Person.update_all "#{Person.connection.quote_column_name 'key'}=#{Person.connection.quote_column_name 'id'}" #some databases (including sqlite2 won't add a unique index if existing data non unique)
<del> assert_nothing_raised { Person.connection.add_index("people", ["key"], :name => "key_idx", :unique => true) }
<del> assert_nothing_raised { Person.connection.remove_index("people", :name => "key_idx", :unique => true) }
<del> end
<del>
<del> # Sybase adapter does not support indexes on :boolean columns
<del> # OpenBase does not have named indexes. You must specify a single column
<del> unless current_adapter?(:SybaseAdapter, :OpenBaseAdapter)
<del> assert_nothing_raised { Person.connection.add_index("people", %w(last_name first_name administrator), :name => "named_admin") }
<del> assert_nothing_raised { Person.connection.remove_index("people", :name => "named_admin") }
<del> end
<del>
<del> # Selected adapters support index sort order
<del> if current_adapter?(:SQLite3Adapter, :MysqlAdapter, :Mysql2Adapter, :PostgreSQLAdapter)
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name"], :order => {:last_name => :desc}) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"], :order => {:last_name => :desc}) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"], :order => {:last_name => :desc, :first_name => :asc}) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> assert_nothing_raised { Person.connection.add_index("people", ["last_name", "first_name"], :order => :desc) }
<del> assert_nothing_raised { Person.connection.remove_index("people", ["last_name", "first_name"]) }
<del> end
<del> end
<del>
<ide> def test_index_symbol_names
<ide> assert_nothing_raised { Person.connection.add_index :people, :primary_contact_id, :name => :symbol_index_name }
<ide> assert Person.connection.index_exists?(:people, :primary_contact_id, :name => :symbol_index_name) | 2 |
Go | Go | fix untag without force while container running | 1f7a9b1ab3d261de5be7d490e7e4f978f317242f | <ide><path>daemon/image_delete.go
<ide> func (daemon *Daemon) ImageDelete(imageRef string, force, prune bool) ([]types.I
<ide> // first. We can only remove this reference if either force is
<ide> // true, there are multiple repository references to this
<ide> // image, or there are no containers using the given reference.
<del> if !(force || len(repoRefs) > 1) {
<add> if !force && isSingleReference(repoRefs) {
<ide> if container := daemon.getContainerUsingImage(imgID); container != nil {
<ide> // If we removed the repository reference then
<ide> // this image would remain "dangling" and since | 1 |
Java | Java | add support for imageloadevents to rctimageview | 59fe71caff6cc13b086f9299e6fc85084671d3a6 | <ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/BitmapUpdateListener.java
<ide> /* package */ interface BitmapUpdateListener {
<ide> public void onSecondaryAttach(Bitmap bitmap);
<ide> public void onBitmapReady(Bitmap bitmap);
<add> public void onImageLoadEvent(int imageLoadEvent);
<ide> }
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/DrawImage.java
<ide> */
<ide> public ScaleType getScaleType();
<ide>
<add> /**
<add> * React tag used for dispatching ImageLoadEvents, or 0 to ignore events.
<add> */
<add> public void setReactTag(int reactTag);
<add>
<ide> public void setBorderWidth(float borderWidth);
<ide>
<ide> public float getBorderWidth();
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/DrawImageWithDrawee.java
<ide> import android.graphics.Canvas;
<ide> import android.graphics.PorterDuff;
<ide> import android.graphics.PorterDuffColorFilter;
<add>import android.graphics.drawable.Animatable;
<ide>
<add>import com.facebook.drawee.controller.ControllerListener;
<ide> import com.facebook.drawee.drawable.ScalingUtils.ScaleType;
<ide> import com.facebook.drawee.generic.GenericDraweeHierarchy;
<ide> import com.facebook.drawee.generic.RoundingParams;
<ide> import com.facebook.imagepipeline.request.ImageRequest;
<ide> import com.facebook.infer.annotation.Assertions;
<add>import com.facebook.react.views.image.ImageLoadEvent;
<ide> import com.facebook.react.views.image.ImageResizeMode;
<ide>
<ide> /**
<ide> * DrawImageWithDrawee is DrawCommand that can draw a local or remote image.
<ide> * It uses DraweeRequestHelper internally to fetch and cache the images.
<ide> */
<del>/* package */ final class DrawImageWithDrawee extends AbstractDrawCommand implements DrawImage {
<add>/* package */ final class DrawImageWithDrawee extends AbstractDrawCommand
<add> implements DrawImage, ControllerListener {
<ide>
<ide> private @Nullable DraweeRequestHelper mRequestHelper;
<ide> private @Nullable PorterDuffColorFilter mColorFilter;
<ide> private ScaleType mScaleType = ImageResizeMode.defaultValue();
<ide> private float mBorderWidth;
<ide> private float mBorderRadius;
<ide> private int mBorderColor;
<add> private int mReactTag;
<add> private @Nullable FlatViewGroup.InvalidateCallback mCallback;
<ide>
<ide> @Override
<ide> public boolean hasImageRequest() {
<ide> public void setImageRequest(@Nullable ImageRequest imageRequest) {
<ide> if (imageRequest == null) {
<ide> mRequestHelper = null;
<ide> } else {
<del> mRequestHelper = new DraweeRequestHelper(imageRequest);
<add> mRequestHelper = new DraweeRequestHelper(imageRequest, this);
<ide> }
<ide> }
<ide>
<ide> public int getBorderColor() {
<ide> return mBorderColor;
<ide> }
<ide>
<add> @Override
<add> public void setReactTag(int reactTag) {
<add> mReactTag = reactTag;
<add> }
<add>
<ide> @Override
<ide> public void onDraw(Canvas canvas) {
<ide> Assertions.assumeNotNull(mRequestHelper).getDrawable().draw(canvas);
<ide> }
<ide>
<ide> @Override
<ide> public void onAttached(FlatViewGroup.InvalidateCallback callback) {
<add> mCallback = callback;
<add>
<ide> GenericDraweeHierarchy hierarchy = Assertions.assumeNotNull(mRequestHelper).getHierarchy();
<ide>
<ide> RoundingParams roundingParams = hierarchy.getRoundingParams();
<ide> public void onDetached() {
<ide> Assertions.assumeNotNull(mRequestHelper).detach();
<ide> }
<ide>
<add> @Override
<add> public void onSubmit(String id, Object callerContext) {
<add> if (mCallback != null && mReactTag != 0) {
<add> mCallback.dispatchImageLoadEvent(mReactTag, ImageLoadEvent.ON_LOAD_START);
<add> }
<add> }
<add>
<add> @Override
<add> public void onFinalImageSet(
<add> String id,
<add> @Nullable Object imageInfo,
<add> @Nullable Animatable animatable) {
<add> if (mCallback != null && mReactTag != 0) {
<add> mCallback.dispatchImageLoadEvent(mReactTag, ImageLoadEvent.ON_LOAD_END);
<add> mCallback.dispatchImageLoadEvent(mReactTag, ImageLoadEvent.ON_LOAD);
<add> }
<add> }
<add>
<add> @Override
<add> public void onIntermediateImageSet(String id, @Nullable Object imageInfo) {
<add> }
<add>
<add> @Override
<add> public void onIntermediateImageFailed(String id, Throwable throwable) {
<add> }
<add>
<add> @Override
<add> public void onFailure(String id, Throwable throwable) {
<add> if (mCallback != null && mReactTag != 0) {
<add> mCallback.dispatchImageLoadEvent(mReactTag, ImageLoadEvent.ON_LOAD_END);
<add> }
<add> }
<add>
<add> @Override
<add> public void onRelease(String id) {
<add> }
<add>
<ide> private boolean shouldDisplayBorder() {
<ide> return mBorderColor != 0 || mBorderRadius >= 0.5f;
<ide> }
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/DrawImageWithPipeline.java
<ide> private @Nullable Path mPathForRoundedBitmap;
<ide> private @Nullable BitmapShader mBitmapShader;
<ide> private boolean mForceClip;
<add> private int mReactTag;
<ide>
<ide> @Override
<ide> public boolean hasImageRequest() {
<ide> public ScaleType getScaleType() {
<ide> return mScaleType;
<ide> }
<ide>
<add> @Override
<add> public void setReactTag(int reactTag) {
<add> mReactTag = reactTag;
<add> }
<add>
<ide> @Override
<ide> protected void onDraw(Canvas canvas) {
<ide> Bitmap bitmap = Assertions.assumeNotNull(mRequestHelper).getBitmap();
<ide> public void onBitmapReady(Bitmap bitmap) {
<ide> updateBounds(bitmap);
<ide> }
<ide>
<add> @Override
<add> public void onImageLoadEvent(int imageLoadEvent) {
<add> if (mReactTag != 0 && mCallback != null) {
<add> mCallback.dispatchImageLoadEvent(mReactTag, imageLoadEvent);
<add> }
<add> }
<add>
<ide> /* package */ void updateBounds(Bitmap bitmap) {
<ide> Assertions.assumeNotNull(mCallback).invalidate();
<ide>
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/DraweeRequestHelper.java
<ide> import android.graphics.drawable.Drawable;
<ide>
<ide> import com.facebook.drawee.controller.AbstractDraweeControllerBuilder;
<add>import com.facebook.drawee.controller.ControllerListener;
<ide> import com.facebook.drawee.generic.GenericDraweeHierarchy;
<ide> import com.facebook.drawee.generic.GenericDraweeHierarchyBuilder;
<ide> import com.facebook.drawee.interfaces.DraweeController;
<ide> private final DraweeController mDraweeController;
<ide> private int mAttachCounter;
<ide>
<del> /* package */ DraweeRequestHelper(ImageRequest imageRequest) {
<add> /* package */ DraweeRequestHelper(ImageRequest imageRequest, ControllerListener listener) {
<ide> DraweeController controller = sControllerBuilder
<ide> .setImageRequest(imageRequest)
<ide> .setCallerContext(RCTImageView.getCallerContext())
<add> .setControllerListener(listener)
<ide> .build();
<ide>
<ide> controller.setHierarchy(sHierarchyBuilder.build());
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/FlatViewGroup.java
<ide> import android.graphics.Canvas;
<ide> import android.graphics.Rect;
<ide> import android.graphics.drawable.Drawable;
<add>import android.os.SystemClock;
<ide> import android.view.MotionEvent;
<ide> import android.view.View;
<ide> import android.view.ViewGroup;
<ide> import android.view.ViewParent;
<ide>
<add>import com.facebook.react.bridge.ReactContext;
<ide> import com.facebook.react.bridge.SoftAssertions;
<ide> import com.facebook.react.touch.CatalystInterceptingViewGroup;
<ide> import com.facebook.react.touch.OnInterceptTouchEventListener;
<ide> import com.facebook.react.uimanager.PointerEvents;
<ide> import com.facebook.react.uimanager.ReactCompoundView;
<ide> import com.facebook.react.uimanager.ReactPointerEventsView;
<add>import com.facebook.react.uimanager.UIManagerModule;
<add>import com.facebook.react.views.image.ImageLoadEvent;
<ide>
<ide> /**
<ide> * A view that FlatShadowNode hierarchy maps to. Performs drawing by iterating over
<ide> public void invalidate() {
<ide> view.invalidate();
<ide> }
<ide> }
<add>
<add> public void dispatchImageLoadEvent(int reactTag, int imageLoadEvent) {
<add> FlatViewGroup view = get();
<add> if (view == null) {
<add> return;
<add> }
<add>
<add> ReactContext reactContext = ((ReactContext) view.getContext());
<add> UIManagerModule uiManagerModule = reactContext.getNativeModule(UIManagerModule.class);
<add> uiManagerModule.getEventDispatcher().dispatchEvent(
<add> new ImageLoadEvent(reactTag, SystemClock.uptimeMillis(), imageLoadEvent));
<add> }
<ide> }
<ide>
<ide> private static final ArrayList<FlatViewGroup> LAYOUT_REQUESTS = new ArrayList<>();
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/InlineImageSpanWithPipeline.java
<ide> public void onBitmapReady(Bitmap bitmap) {
<ide> Assertions.assumeNotNull(mCallback).invalidate();
<ide> }
<ide>
<add> @Override
<add> public void onImageLoadEvent(int imageLoadEvent) {
<add> // ignore
<add> }
<add>
<ide> @Override
<ide> public void onAttached(FlatViewGroup.InvalidateCallback callback) {
<ide> mCallback = callback;
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/PipelineRequestHelper.java
<ide> import com.facebook.imagepipeline.image.CloseableImage;
<ide> import com.facebook.imagepipeline.request.ImageRequest;
<ide> import com.facebook.infer.annotation.Assertions;
<add>import com.facebook.react.views.image.ImageLoadEvent;
<ide>
<ide> /**
<ide> * Helper class for DrawImage that helps manage fetch requests through ImagePipeline.
<ide> // this is a secondary attach, ignore it, only updating Bitmap boundaries if needed.
<ide> Bitmap bitmap = getBitmap();
<ide> if (bitmap != null) {
<del> mBitmapUpdateListener.onSecondaryAttach(bitmap);
<add> listener.onSecondaryAttach(bitmap);
<ide> }
<ide> return;
<ide> }
<ide>
<add> listener.onImageLoadEvent(ImageLoadEvent.ON_LOAD_START);
<add>
<ide> Assertions.assertCondition(mDataSource == null);
<ide> Assertions.assertCondition(mImageRef == null);
<ide>
<ide> public void onNewResult(DataSource<CloseableReference<CloseableImage>> dataSourc
<ide> return;
<ide> }
<ide>
<del> Assertions.assumeNotNull(mBitmapUpdateListener).onBitmapReady(bitmap);
<add> BitmapUpdateListener listener = Assertions.assumeNotNull(mBitmapUpdateListener);
<add> listener.onBitmapReady(bitmap);
<add> listener.onImageLoadEvent(ImageLoadEvent.ON_LOAD_END);
<add> listener.onImageLoadEvent(ImageLoadEvent.ON_LOAD);
<ide> } finally {
<ide> dataSource.close();
<ide> }
<ide> }
<ide>
<ide> @Override
<ide> public void onFailure(DataSource<CloseableReference<CloseableImage>> dataSource) {
<del> if (mDataSource != dataSource) {
<del> // Should always be the case, but let's be safe.
<add> if (mDataSource == dataSource) {
<add> Assertions.assumeNotNull(mBitmapUpdateListener).onImageLoadEvent(ImageLoadEvent.ON_LOAD_END);
<ide> mDataSource = null;
<ide> }
<ide>
<ide> public void onFailure(DataSource<CloseableReference<CloseableImage>> dataSource)
<ide>
<ide> @Override
<ide> public void onCancellation(DataSource<CloseableReference<CloseableImage>> dataSource) {
<add> if (mDataSource == dataSource) {
<add> mDataSource = null;
<add> }
<add>
<add> dataSource.close();
<ide> }
<ide>
<ide> @Override
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/RCTImageView.java
<ide> protected void collectState(
<ide> }
<ide> }
<ide>
<add> @ReactProp(name = "shouldNotifyLoadEvents")
<add> public void setShouldNotifyLoadEvents(boolean shouldNotifyLoadEvents) {
<add> getMutableDrawImage().setReactTag(shouldNotifyLoadEvents ? getReactTag() : 0);
<add> }
<add>
<ide> @ReactProp(name = "src")
<ide> public void setSource(@Nullable String source) {
<ide> getMutableDrawImage().setImageRequest( | 9 |
Python | Python | fix gevent connection leak | 5031240115aee31156fe07c6d1c32d86ecf15f00 | <ide><path>celery/backends/mongodb.py
<ide> def process_cleanup(self):
<ide> if self._connection is not None:
<ide> # MongoDB connection will be closed automatically when object
<ide> # goes out of scope
<add> del(self.collection)
<add> del(self.database)
<ide> self._connection = None
<ide>
<ide> def _store_result(self, task_id, result, status, traceback=None): | 1 |
Javascript | Javascript | fix race condition in addon test | 84579b1d7d92274b41a737e4d119d87807d486b1 | <ide><path>test/addons-napi/test_promise/test.js
<ide> const common = require('../../common');
<ide> const test_promise = require(`./build/${common.buildType}/test_promise`);
<ide> const assert = require('assert');
<ide>
<del>let expected_result, promise;
<del>
<ide> // A resolution
<del>expected_result = 42;
<del>promise = test_promise.createPromise();
<del>promise.then(
<del> common.mustCall(function(result) {
<del> assert.strictEqual(result, expected_result,
<del> 'promise resolved as expected');
<del> }),
<del> common.mustNotCall());
<del>test_promise.concludeCurrentPromise(expected_result, true);
<add>{
<add> const expected_result = 42;
<add> const promise = test_promise.createPromise();
<add> promise.then(
<add> common.mustCall(function(result) {
<add> assert.strictEqual(result, expected_result,
<add> `promise resolved as expected, received ${result}`);
<add> }),
<add> common.mustNotCall());
<add> test_promise.concludeCurrentPromise(expected_result, true);
<add>}
<ide>
<ide> // A rejection
<del>expected_result = 'It\'s not you, it\'s me.';
<del>promise = test_promise.createPromise();
<del>promise.then(
<del> common.mustNotCall(),
<del> common.mustCall(function(result) {
<del> assert.strictEqual(result, expected_result,
<del> 'promise rejected as expected');
<del> }));
<del>test_promise.concludeCurrentPromise(expected_result, false);
<add>{
<add> const expected_result = 'It\'s not you, it\'s me.';
<add> const promise = test_promise.createPromise();
<add> promise.then(
<add> common.mustNotCall(),
<add> common.mustCall(function(result) {
<add> assert.strictEqual(result, expected_result,
<add> `promise rejected as expected, received ${result}`);
<add> }));
<add> test_promise.concludeCurrentPromise(expected_result, false);
<add>}
<ide>
<ide> // Chaining
<del>promise = test_promise.createPromise();
<add>const promise = test_promise.createPromise();
<ide> promise.then(
<ide> common.mustCall(function(result) {
<ide> assert.strictEqual(result, 'chained answer',
<ide> promise.then(
<ide> common.mustNotCall());
<ide> test_promise.concludeCurrentPromise(Promise.resolve('chained answer'), true);
<ide>
<del>assert.strictEqual(test_promise.isPromise(promise), true,
<del> 'natively created promise is recognized as a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise(Promise.reject(-1)), true,
<del> 'Promise created with JS is recognized as a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise(2.4), false,
<del> 'Number is recognized as not a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise('I promise!'), false,
<del> 'String is recognized as not a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise(undefined), false,
<del> 'undefined is recognized as not a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise(null), false,
<del> 'null is recognized as not a promise');
<del>
<del>assert.strictEqual(test_promise.isPromise({}), false,
<del> 'an object is recognized as not a promise');
<add>assert.strictEqual(test_promise.isPromise(promise), true);
<add>assert.strictEqual(test_promise.isPromise(Promise.reject(-1)), true);
<add>assert.strictEqual(test_promise.isPromise(2.4), false);
<add>assert.strictEqual(test_promise.isPromise('I promise!'), false);
<add>assert.strictEqual(test_promise.isPromise(undefined), false);
<add>assert.strictEqual(test_promise.isPromise(null), false);
<add>assert.strictEqual(test_promise.isPromise({}), false); | 1 |
Ruby | Ruby | populate empty mirror_list for bottles | 62a98753ec9cce23653b18e95b8debafd877856d | <ide><path>Library/Homebrew/formula.rb
<ide> def system cmd, *args
<ide> def fetch
<ide> if install_bottle? self
<ide> downloader = CurlBottleDownloadStrategy.new bottle_url, name, version, nil
<add> mirror_list = []
<ide> else
<ide> downloader = @downloader
<ide> # Don't attempt mirrors if this install is not pointed at a "stable" URL. | 1 |
Text | Text | add @bnb as a collaborator | 2f5e4b435344d8d784f2f4334769bb4b78834ad1 | <ide><path>README.md
<ide> For information about the governance of the Node.js project, see
<ide> **Beth Griggs** <<bgriggs@redhat.com>> (she/her)
<ide> * [bmeck](https://github.com/bmeck) -
<ide> **Bradley Farias** <<bradley.meck@gmail.com>>
<add>* [bnb](https://github.com/bnb) -
<add> **Tierney Cyren** <<hello@bnb.im>> (they/he)
<ide> * [boneskull](https://github.com/boneskull) -
<ide> **Christopher Hiller** <<boneskull@boneskull.com>> (he/him)
<ide> * [BridgeAR](https://github.com/BridgeAR) - | 1 |
Python | Python | update examples to use the new custom `reverse()' | c7e7279d979a346b5d1c950cc960183013799c41 | <ide><path>examples/blogpost/resources.py
<del>from django.core.urlresolvers import reverse
<ide> from djangorestframework.resources import ModelResource
<add>from djangorestframework.utils import reverse
<ide> from blogpost.models import BlogPost, Comment
<ide>
<ide>
<ide> class BlogPostResource(ModelResource):
<ide> ordering = ('-created',)
<ide>
<ide> def comments(self, instance):
<del> return reverse('comments', kwargs={'blogpost': instance.key})
<add> return reverse('comments', request, kwargs={'blogpost': instance.key})
<ide>
<ide>
<ide> class CommentResource(ModelResource):
<ide> class CommentResource(ModelResource):
<ide> ordering = ('-created',)
<ide>
<ide> def blogpost(self, instance):
<del> return reverse('blog-post', kwargs={'key': instance.blogpost.key})
<add> return reverse('blog-post', request, kwargs={'key': instance.blogpost.key})
<ide><path>examples/blogpost/tests.py
<ide> """Test a range of REST API usage of the example application.
<ide> """
<ide>
<del>from django.core.urlresolvers import reverse
<ide> from django.test import TestCase
<del>from django.core.urlresolvers import reverse
<ide> from django.utils import simplejson as json
<ide>
<ide> from djangorestframework.compat import RequestFactory
<add>from djangorestframework.utils import reverse
<ide> from djangorestframework.views import InstanceModelView, ListOrCreateModelView
<ide>
<ide> from blogpost import models, urls
<ide><path>examples/mixin/urls.py
<ide> from djangorestframework.mixins import ResponseMixin
<ide> from djangorestframework.renderers import DEFAULT_RENDERERS
<ide> from djangorestframework.response import Response
<add>from djangorestframework.utils import reverse
<ide>
<ide> from django.conf.urls.defaults import patterns, url
<del>from django.core.urlresolvers import reverse
<ide>
<ide>
<ide> class ExampleView(ResponseMixin, View):
<ide> class ExampleView(ResponseMixin, View):
<ide>
<ide> def get(self, request):
<ide> response = Response(200, {'description': 'Some example content',
<del> 'url': reverse('mixin-view')})
<add> 'url': reverse('mixin-view', request)})
<ide> return self.render(response)
<ide>
<ide>
<ide><path>examples/objectstore/views.py
<ide> from django.conf import settings
<del>from django.core.urlresolvers import reverse
<ide>
<add>from djangorestframework.utils import reverse
<ide> from djangorestframework.views import View
<ide> from djangorestframework.response import Response
<ide> from djangorestframework import status
<ide> def get(self, request):
<ide> filepaths = [os.path.join(OBJECT_STORE_DIR, file) for file in os.listdir(OBJECT_STORE_DIR) if not file.startswith('.')]
<ide> ctime_sorted_basenames = [item[0] for item in sorted([(os.path.basename(path), os.path.getctime(path)) for path in filepaths],
<ide> key=operator.itemgetter(1), reverse=True)]
<del> return [reverse('stored-object', kwargs={'key':key}) for key in ctime_sorted_basenames]
<add> return [reverse('stored-object', request, kwargs={'key':key}) for key in ctime_sorted_basenames]
<ide>
<ide> def post(self, request):
<ide> """
<ide> def post(self, request):
<ide> pathname = os.path.join(OBJECT_STORE_DIR, key)
<ide> pickle.dump(self.CONTENT, open(pathname, 'wb'))
<ide> remove_oldest_files(OBJECT_STORE_DIR, MAX_FILES)
<del> return Response(status.HTTP_201_CREATED, self.CONTENT, {'Location': reverse('stored-object', kwargs={'key':key})})
<add> return Response(status.HTTP_201_CREATED, self.CONTENT, {'Location': reverse('stored-object', request, kwargs={'key':key})})
<ide>
<ide>
<ide> class StoredObject(View):
<ide><path>examples/permissionsexample/views.py
<ide> from djangorestframework.views import View
<ide> from djangorestframework.permissions import PerUserThrottling, IsAuthenticated
<del>from django.core.urlresolvers import reverse
<add>from djangorestframework.utils import reverse
<ide>
<ide>
<ide> class PermissionsExampleView(View):
<ide> def get(self, request):
<ide> return [
<ide> {
<ide> 'name': 'Throttling Example',
<del> 'url': reverse('throttled-resource')
<add> 'url': reverse('throttled-resource', request)
<ide> },
<ide> {
<ide> 'name': 'Logged in example',
<del> 'url': reverse('loggedin-resource')
<add> 'url': reverse('loggedin-resource', request)
<ide> },
<ide> ]
<ide>
<ide><path>examples/pygments_api/views.py
<ide> from __future__ import with_statement # for python 2.5
<ide> from django.conf import settings
<del>from django.core.urlresolvers import reverse
<ide>
<ide> from djangorestframework.resources import FormResource
<ide> from djangorestframework.response import Response
<ide> from djangorestframework.renderers import BaseRenderer
<add>from djangorestframework.utils import reverse
<ide> from djangorestframework.views import View
<ide> from djangorestframework import status
<ide>
<ide> def get(self, request):
<ide> Return a list of all currently existing snippets.
<ide> """
<ide> unique_ids = [os.path.split(f)[1] for f in list_dir_sorted_by_ctime(HIGHLIGHTED_CODE_DIR)]
<del> return [reverse('pygments-instance', args=[unique_id]) for unique_id in unique_ids]
<add> return [reverse('pygments-instance', request, args=[unique_id]) for unique_id in unique_ids]
<ide>
<ide> def post(self, request):
<ide> """
<ide> def post(self, request):
<ide>
<ide> remove_oldest_files(HIGHLIGHTED_CODE_DIR, MAX_FILES)
<ide>
<del> return Response(status.HTTP_201_CREATED, headers={'Location': reverse('pygments-instance', args=[unique_id])})
<add> return Response(status.HTTP_201_CREATED, headers={'Location': reverse('pygments-instance', request, args=[unique_id])})
<ide>
<ide>
<ide> class PygmentsInstance(View):
<ide><path>examples/resourceexample/views.py
<del>from django.core.urlresolvers import reverse
<del>
<add>from djangorestframework.utils import reverse
<ide> from djangorestframework.views import View
<ide> from djangorestframework.response import Response
<ide> from djangorestframework import status
<ide> def get(self, request):
<ide> """
<ide> Handle GET requests, returning a list of URLs pointing to 3 other views.
<ide> """
<del> return {"Some other resources": [reverse('another-example', kwargs={'num':num}) for num in range(3)]}
<add> return {"Some other resources": [reverse('another-example', request, kwargs={'num':num}) for num in range(3)]}
<ide>
<ide>
<ide> class AnotherExampleView(View):
<ide><path>examples/sandbox/views.py
<ide> """The root view for the examples provided with Django REST framework"""
<ide>
<del>from django.core.urlresolvers import reverse
<add>from djangorestframework.utils import reverse
<ide> from djangorestframework.views import View
<ide>
<ide>
<ide> class Sandbox(View):
<ide> Please feel free to browse, create, edit and delete the resources in these examples."""
<ide>
<ide> def get(self, request):
<del> return [{'name': 'Simple Resource example', 'url': reverse('example-resource')},
<del> {'name': 'Simple ModelResource example', 'url': reverse('model-resource-root')},
<del> {'name': 'Simple Mixin-only example', 'url': reverse('mixin-view')},
<del> {'name': 'Object store API', 'url': reverse('object-store-root')},
<del> {'name': 'Code highlighting API', 'url': reverse('pygments-root')},
<del> {'name': 'Blog posts API', 'url': reverse('blog-posts-root')},
<del> {'name': 'Permissions example', 'url': reverse('permissions-example')}
<add> return [{'name': 'Simple Resource example', 'url': reverse('example-resource', request)},
<add> {'name': 'Simple ModelResource example', 'url': reverse('model-resource-root', request)},
<add> {'name': 'Simple Mixin-only example', 'url': reverse('mixin-view', request)},
<add> {'name': 'Object store API', 'url': reverse('object-store-root', request)},
<add> {'name': 'Code highlighting API', 'url': reverse('pygments-root', request)},
<add> {'name': 'Blog posts API', 'url': reverse('blog-posts-root', request)},
<add> {'name': 'Permissions example', 'url': reverse('permissions-example', request)}
<ide> ] | 8 |
Go | Go | remove execcmd() utility | 1e88fe578e40a22bf349e0acc465a3f5da9dd32d | <ide><path>pkg/idtools/idtools_unix.go
<ide> func callGetent(database, key string) (io.Reader, error) {
<ide> if getentCmd == "" {
<ide> return nil, fmt.Errorf("unable to find getent command")
<ide> }
<del> out, err := execCmd(getentCmd, database, key)
<add> out, err := exec.Command(getentCmd, database, key).CombinedOutput()
<ide> if err != nil {
<ide> exitCode, errC := getExitCode(err)
<ide> if errC != nil {
<ide><path>pkg/idtools/idtools_unix_test.go
<ide> func compareTrees(left, right map[string]node) error {
<ide> }
<ide>
<ide> func delUser(t *testing.T, name string) {
<del> _, err := execCmd("userdel", name)
<del> assert.Check(t, err)
<add> out, err := exec.Command("userdel", name).CombinedOutput()
<add> assert.Check(t, err, out)
<ide> }
<ide>
<ide> func TestParseSubidFileWithNewlinesAndComments(t *testing.T) {
<ide><path>pkg/idtools/usergroupadd_linux.go
<ide> package idtools // import "github.com/docker/docker/pkg/idtools"
<ide>
<ide> import (
<ide> "fmt"
<add> "os/exec"
<ide> "regexp"
<ide> "sort"
<ide> "strconv"
<ide> func AddNamespaceRangesUser(name string) (int, int, error) {
<ide> }
<ide>
<ide> // Query the system for the created uid and gid pair
<del> out, err := execCmd("id", name)
<add> out, err := exec.Command("id", name).CombinedOutput()
<ide> if err != nil {
<ide> return -1, -1, fmt.Errorf("error trying to find uid/gid for new user %q: %v", name, err)
<ide> }
<ide> func addUser(name string) error {
<ide> return fmt.Errorf("cannot add user; no useradd/adduser binary found")
<ide> }
<ide>
<del> if out, err := execCmd(userCommand, args...); err != nil {
<add> if out, err := exec.Command(userCommand, args...).CombinedOutput(); err != nil {
<ide> return fmt.Errorf("failed to add user with error: %v; output: %q", err, string(out))
<ide> }
<ide> return nil
<ide> func createSubordinateRanges(name string) error {
<ide> if err != nil {
<ide> return fmt.Errorf("can't find available subuid range: %v", err)
<ide> }
<del> out, err := execCmd("usermod", "-v", fmt.Sprintf("%d-%d", startID, startID+defaultRangeLen-1), name)
<add> idRange := fmt.Sprintf("%d-%d", startID, startID+defaultRangeLen-1)
<add> out, err := exec.Command("usermod", "-v", idRange, name).CombinedOutput()
<ide> if err != nil {
<ide> return fmt.Errorf("unable to add subuid range to user: %q; output: %s, err: %v", name, out, err)
<ide> }
<ide> func createSubordinateRanges(name string) error {
<ide> if err != nil {
<ide> return fmt.Errorf("can't find available subgid range: %v", err)
<ide> }
<del> out, err := execCmd("usermod", "-w", fmt.Sprintf("%d-%d", startID, startID+defaultRangeLen-1), name)
<add> idRange := fmt.Sprintf("%d-%d", startID, startID+defaultRangeLen-1)
<add> out, err := exec.Command("usermod", "-w", idRange, name).CombinedOutput()
<ide> if err != nil {
<ide> return fmt.Errorf("unable to add subgid range to user: %q; output: %s, err: %v", name, out, err)
<ide> }
<ide><path>pkg/idtools/utils_unix.go
<ide> func resolveBinary(binname string) (string, error) {
<ide> }
<ide> return "", fmt.Errorf("Binary %q does not resolve to a binary of that name in $PATH (%q)", binname, resolvedPath)
<ide> }
<del>
<del>func execCmd(cmd string, arg ...string) ([]byte, error) {
<del> execCmd := exec.Command(cmd, arg...)
<del> return execCmd.CombinedOutput()
<del>} | 4 |
Javascript | Javascript | use async and await in git-repository-spec | 6e0b629389610120ae817cd79e6bd4f511889a1a | <ide><path>spec/git-repository-spec.js
<add>const {it, fit, ffit, fffit, beforeEach, afterEach} = require('./async-spec-helpers')
<ide> const path = require('path')
<ide> const fs = require('fs-plus')
<ide> const temp = require('temp').track()
<ide> describe('GitRepository', () => {
<ide> describe('.checkoutHeadForEditor(editor)', () => {
<ide> let filePath, editor
<ide>
<del> beforeEach(() => {
<add> beforeEach(async () => {
<ide> spyOn(atom, 'confirm')
<ide>
<ide> const workingDirPath = copyRepository()
<ide> repo = new GitRepository(workingDirPath, {project: atom.project, config: atom.config, confirm: atom.confirm})
<ide> filePath = path.join(workingDirPath, 'a.txt')
<ide> fs.writeFileSync(filePath, 'ch ch changes')
<ide>
<del> waitsForPromise(() => atom.workspace.open(filePath))
<del>
<del> runs(() => editor = atom.workspace.getActiveTextEditor())
<add> editor = await atom.workspace.open(filePath)
<ide> })
<ide>
<ide> it('displays a confirmation dialog by default', () => {
<ide> describe('GitRepository', () => {
<ide> fs.writeFileSync(cleanPath, 'Full of text')
<ide> fs.writeFileSync(newPath, '')
<ide> newPath = fs.absolute(newPath)
<del> }) // specs could be running under symbol path.
<add> })
<ide>
<del> it('returns status information for all new and modified files', () => {
<del> fs.writeFileSync(modifiedPath, 'making this path modified')
<add> it('returns status information for all new and modified files', async () => {
<ide> const statusHandler = jasmine.createSpy('statusHandler')
<ide> repo.onDidChangeStatuses(statusHandler)
<del> repo.refreshStatus()
<del>
<del> waitsFor(() => statusHandler.callCount > 0)
<add> fs.writeFileSync(modifiedPath, 'making this path modified')
<ide>
<del> runs(() => {
<del> expect(repo.getCachedPathStatus(cleanPath)).toBeUndefined()
<del> expect(repo.isStatusNew(repo.getCachedPathStatus(newPath))).toBeTruthy()
<del> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeTruthy()
<del> })
<add> await repo.refreshStatus()
<add> expect(statusHandler.callCount).toBe(1)
<add> expect(repo.getCachedPathStatus(cleanPath)).toBeUndefined()
<add> expect(repo.isStatusNew(repo.getCachedPathStatus(newPath) )).toBeTruthy()
<add> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeTruthy()
<ide> })
<ide>
<del> it('caches the proper statuses when a subdir is open', () => {
<add> it('caches the proper statuses when a subdir is open', async () => {
<ide> const subDir = path.join(workingDirectory, 'dir')
<ide> fs.mkdirSync(subDir)
<del>
<ide> const filePath = path.join(subDir, 'b.txt')
<ide> fs.writeFileSync(filePath, '')
<del>
<ide> atom.project.setPaths([subDir])
<add> await atom.workspace.open('b.txt')
<add> repo = atom.project.getRepositories()[0]
<ide>
<del> waitsForPromise(() => atom.workspace.open('b.txt'))
<del>
<del> let statusHandler = null
<del> runs(() => {
<del> repo = atom.project.getRepositories()[0]
<del>
<del> statusHandler = jasmine.createSpy('statusHandler')
<del> repo.onDidChangeStatuses(statusHandler)
<del> repo.refreshStatus()
<del> })
<del>
<del> waitsFor(() => statusHandler.callCount > 0)
<del>
<del> runs(() => {
<del> const status = repo.getCachedPathStatus(filePath)
<del> expect(repo.isStatusModified(status)).toBe(false)
<del> expect(repo.isStatusNew(status)).toBe(false)
<del> })
<add> await repo.refreshStatus()
<add> const status = repo.getCachedPathStatus(filePath)
<add> expect(repo.isStatusModified(status)).toBe(false)
<add> expect(repo.isStatusNew(status)).toBe(false)
<ide> })
<ide>
<del> it('works correctly when the project has multiple folders (regression)', () => {
<add> it('works correctly when the project has multiple folders (regression)', async () => {
<ide> atom.project.addPath(workingDirectory)
<ide> atom.project.addPath(path.join(__dirname, 'fixtures', 'dir'))
<del> const statusHandler = jasmine.createSpy('statusHandler')
<del> repo.onDidChangeStatuses(statusHandler)
<del>
<del> repo.refreshStatus()
<del>
<del> waitsFor(() => statusHandler.callCount > 0)
<ide>
<del> runs(() => {
<del> expect(repo.getCachedPathStatus(cleanPath)).toBeUndefined()
<del> expect(repo.isStatusNew(repo.getCachedPathStatus(newPath))).toBeTruthy()
<del> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeTruthy()
<del> })
<add> await repo.refreshStatus()
<add> expect(repo.getCachedPathStatus(cleanPath)).toBeUndefined()
<add> expect(repo.isStatusNew(repo.getCachedPathStatus(newPath))).toBeTruthy()
<add> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeTruthy()
<ide> })
<ide>
<del> it('caches statuses that were looked up synchronously', () => {
<add> it('caches statuses that were looked up synchronously', async () => {
<ide> const originalContent = 'undefined'
<ide> fs.writeFileSync(modifiedPath, 'making this path modified')
<ide> repo.getPathStatus('file.txt')
<ide>
<ide> fs.writeFileSync(modifiedPath, originalContent)
<del> waitsForPromise(() => repo.refreshStatus())
<del> runs(() => {
<del> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeFalsy()
<del> })
<add> await repo.refreshStatus()
<add> expect(repo.isStatusModified(repo.getCachedPathStatus(modifiedPath))).toBeFalsy()
<ide> })
<ide> })
<ide>
<ide> describe('buffer events', () => {
<ide> let editor
<ide>
<del> beforeEach(() => {
<del> let statusRefreshed = false
<add> beforeEach(async () => {
<ide> atom.project.setPaths([copyRepository()])
<del> atom.project.getRepositories()[0].onDidChangeStatuses(() => statusRefreshed = true)
<del>
<del> waitsForPromise(() => atom.workspace.open('other.txt').then(o => editor = o))
<del>
<del> waitsFor('repo to refresh', () => statusRefreshed)
<add> const refreshPromise = new Promise(resolve => atom.project.getRepositories()[0].onDidChangeStatuses(resolve))
<add> editor = await atom.workspace.open('other.txt')
<add> await refreshPromise
<ide> })
<ide>
<del> it('emits a status-changed event when a buffer is saved', () => {
<add> it('emits a status-changed event when a buffer is saved', async () => {
<ide> editor.insertNewline()
<ide>
<ide> const statusHandler = jasmine.createSpy('statusHandler')
<ide> atom.project.getRepositories()[0].onDidChangeStatus(statusHandler)
<ide>
<del> waitsForPromise(() => editor.save())
<del>
<del> runs(() => {
<del> expect(statusHandler.callCount).toBe(1)
<del> expect(statusHandler).toHaveBeenCalledWith({path: editor.getPath(), pathStatus: 256})
<del> })
<add> await editor.save()
<add> expect(statusHandler.callCount).toBe(1)
<add> expect(statusHandler).toHaveBeenCalledWith({path: editor.getPath(), pathStatus: 256})
<ide> })
<ide>
<del> it('emits a status-changed event when a buffer is reloaded', () => {
<add> it('emits a status-changed event when a buffer is reloaded', async () => {
<ide> fs.writeFileSync(editor.getPath(), 'changed')
<ide>
<ide> const statusHandler = jasmine.createSpy('statusHandler')
<ide> atom.project.getRepositories()[0].onDidChangeStatus(statusHandler)
<ide>
<del> waitsForPromise(() => editor.getBuffer().reload())
<del>
<del> runs(() => {
<del> expect(statusHandler.callCount).toBe(1)
<del> expect(statusHandler).toHaveBeenCalledWith({path: editor.getPath(), pathStatus: 256})
<del> })
<del>
<del> waitsForPromise(() => editor.getBuffer().reload())
<add> await editor.getBuffer().reload()
<add> expect(statusHandler.callCount).toBe(1)
<add> expect(statusHandler).toHaveBeenCalledWith({path: editor.getPath(), pathStatus: 256})
<ide>
<del> runs(() => {
<del> expect(statusHandler.callCount).toBe(1)
<del> })
<add> await editor.getBuffer().reload()
<add> expect(statusHandler.callCount).toBe(1)
<ide> })
<ide>
<ide> it("emits a status-changed event when a buffer's path changes", () => {
<ide> describe('GitRepository', () => {
<ide> if (project2) project2.destroy()
<ide> })
<ide>
<del> it('subscribes to all the serialized buffers in the project', () => {
<add> it('subscribes to all the serialized buffers in the project', async () => {
<ide> atom.project.setPaths([copyRepository()])
<ide>
<del> waitsForPromise(() => atom.workspace.open('file.txt'))
<add> await atom.workspace.open('file.txt')
<ide>
<del> waitsForPromise(() => {
<del> project2 = new Project({notificationManager: atom.notifications, packageManager: atom.packages, confirm: atom.confirm, applicationDelegate: atom.applicationDelegate})
<del> return project2.deserialize(atom.project.serialize({isUnloading: false}))
<add> project2 = new Project({
<add> notificationManager: atom.notifications,
<add> packageManager: atom.packages,
<add> confirm: atom.confirm,
<add> applicationDelegate: atom.applicationDelegate
<ide> })
<add> await project2.deserialize(atom.project.serialize({isUnloading: false}))
<ide>
<del> waitsFor(() => buffer = project2.getBuffers()[0])
<add> buffer = project2.getBuffers()[0]
<ide>
<del> waitsForPromise(() => {
<del> const originalContent = buffer.getText()
<del> buffer.append('changes')
<add> const originalContent = buffer.getText()
<add> buffer.append('changes')
<ide>
<del> statusHandler = jasmine.createSpy('statusHandler')
<del> project2.getRepositories()[0].onDidChangeStatus(statusHandler)
<del> return buffer.save()
<del> })
<add> statusHandler = jasmine.createSpy('statusHandler')
<add> project2.getRepositories()[0].onDidChangeStatus(statusHandler)
<add> await buffer.save()
<ide>
<del> runs(() => {
<del> expect(statusHandler.callCount).toBe(1)
<del> expect(statusHandler).toHaveBeenCalledWith({path: buffer.getPath(), pathStatus: 256})
<del> })
<add> expect(statusHandler.callCount).toBe(1)
<add> expect(statusHandler).toHaveBeenCalledWith({path: buffer.getPath(), pathStatus: 256})
<ide> })
<ide> })
<ide> }) | 1 |
Ruby | Ruby | apply rafaels review fixes | 47e4795060ee8550e28f86cedd4654fad19f72a4 | <ide><path>railties/lib/rails/command/behavior.rb
<ide> def subclasses
<ide> # This code is based directly on the Text gem implementation.
<ide> # Copyright (c) 2006-2013 Paul Battley, Michael Neumann, Tim Fletcher.
<ide> #
<del> # Returns a value representing the "cost" of transforming str1 into str2
<add> # Returns a value representing the "cost" of transforming str1 into str2.
<ide> def levenshtein_distance(str1, str2)
<ide> s = str1
<ide> t = str2
<ide><path>railties/lib/rails/commands/server/server_command.rb
<ide> def middleware
<ide> end
<ide>
<ide> def default_options
<del> super.merge( Port: ENV.fetch("PORT", 3000).to_i,
<add> super.merge(
<add> Port: ENV.fetch("PORT", 3000).to_i,
<add> Host: ENV.fetch("HOST", "localhost").dup,
<ide> DoNotReverseLookup: true,
<ide> environment: (ENV["RAILS_ENV"] || ENV["RACK_ENV"] || "development").dup,
<ide> daemonize: false,
<ide><path>railties/lib/rails/test_unit/minitest_plugin.rb
<ide> require "active_support/core_ext/module/attribute_accessors"
<del>require "active_support/core_ext/hash/keys"
<ide> require "rails/test_unit/reporter"
<ide> require "rails/test_unit/test_requirer"
<ide> require "shellwords" | 3 |
Javascript | Javascript | use const where applicable in template | e66a039061724181abfd739a251191ac036d4922 | <ide><path>lib/Template.js
<ide> module.exports = class Template extends Tapable {
<ide> }
<ide> str = str.trim();
<ide> if(!str) return "";
<del> let ind = (str[0] === "\n" ? "" : prefix);
<add> const ind = (str[0] === "\n" ? "" : prefix);
<ide> return ind + str.replace(/\n([^\n])/g, "\n" + prefix + "$1");
<ide> }
<ide> | 1 |
Python | Python | fix the callback argument | c08d53c7fc086cca727584c04a78fd1779415510 | <ide><path>libcloud/storage/drivers/cloudfiles.py
<ide> def download_object_as_stream(self, obj, chunk_size=None):
<ide> object_name),
<ide> method='GET', raw=True)
<ide>
<del> return self._get_object(obj=obj, callback=self._save_object,
<add> return self._get_object(obj=obj, callback=self._get_object_as_stream,
<ide> response=response,
<ide> callback_kwargs={'chunk_size': chunk_size},
<ide> success_status_code=httplib.OK) | 1 |
Go | Go | remove fallback for legacy containers | 239d9c5eda8f65678b26de26a6763a67772f3151 | <ide><path>daemon/daemon.go
<ide> func (daemon *Daemon) restore() error {
<ide> log.WithError(err).Error("failed to load container")
<ide> return
<ide> }
<del> // Ignore the container if it does not support the current driver being used by the graph
<del> if (c.Driver == "" && daemon.graphDriver == "aufs") || c.Driver == daemon.graphDriver {
<del> rwlayer, err := daemon.imageService.GetLayerByID(c.ID)
<del> if err != nil {
<del> log.WithError(err).Error("failed to load container mount")
<del> return
<del> }
<del> c.RWLayer = rwlayer
<del> log.WithFields(logrus.Fields{
<del> "running": c.IsRunning(),
<del> "paused": c.IsPaused(),
<del> }).Debug("loaded container")
<del>
<del> mapLock.Lock()
<del> containers[c.ID] = c
<del> mapLock.Unlock()
<del> } else {
<del> log.Debugf("cannot load container because it was created with another storage driver")
<add> if c.Driver != daemon.graphDriver {
<add> // Ignore the container if it wasn't created with the current storage-driver
<add> log.Debugf("not restoring container because it was created with another storage driver (%s)", c.Driver)
<add> return
<add> }
<add> rwlayer, err := daemon.imageService.GetLayerByID(c.ID)
<add> if err != nil {
<add> log.WithError(err).Error("failed to load container mount")
<add> return
<ide> }
<add> c.RWLayer = rwlayer
<add> log.WithFields(logrus.Fields{
<add> "running": c.IsRunning(),
<add> "paused": c.IsPaused(),
<add> }).Debug("loaded container")
<add>
<add> mapLock.Lock()
<add> containers[c.ID] = c
<add> mapLock.Unlock()
<ide> }(v.Name())
<ide> }
<ide> group.Wait() | 1 |
Java | Java | ignore head requests in shallowetagheaderfilter | b732251b093552812d7457608d422e899241ca04 | <ide><path>spring-web/src/main/java/org/springframework/web/filter/ShallowEtagHeaderFilter.java
<ide> /*
<del> * Copyright 2002-2016 the original author or authors.
<add> * Copyright 2002-2017 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import java.io.IOException;
<ide> import java.io.InputStream;
<ide> import java.io.PrintWriter;
<add>
<ide> import javax.servlet.FilterChain;
<ide> import javax.servlet.ServletException;
<ide> import javax.servlet.ServletOutputStream;
<ide> protected boolean isEligibleForEtag(HttpServletRequest request, HttpServletRespo
<ide> int responseStatusCode, InputStream inputStream) {
<ide>
<ide> String method = request.getMethod();
<del> if (responseStatusCode >= 200 && responseStatusCode < 300 &&
<del> (HttpMethod.GET.matches(method) || HttpMethod.HEAD.matches(method))) {
<add> if (responseStatusCode >= 200 && responseStatusCode < 300
<add> && HttpMethod.GET.matches(method)) {
<ide>
<ide> String cacheControl = response.getHeader(HEADER_CACHE_CONTROL);
<ide> if (cacheControl == null || !cacheControl.contains(DIRECTIVE_NO_STORE)) {
<ide><path>spring-web/src/test/java/org/springframework/web/filter/ShallowEtagHeaderFilterTests.java
<ide> public void isEligibleForEtag() {
<ide> assertTrue(filter.isEligibleForEtag(request, response, 200, StreamUtils.emptyInput()));
<ide> assertFalse(filter.isEligibleForEtag(request, response, 300, StreamUtils.emptyInput()));
<ide>
<add> request = new MockHttpServletRequest("HEAD", "/hotels");
<add> assertFalse(filter.isEligibleForEtag(request, response, 200, StreamUtils.emptyInput()));
<add>
<ide> request = new MockHttpServletRequest("POST", "/hotels");
<ide> assertFalse(filter.isEligibleForEtag(request, response, 200, StreamUtils.emptyInput()));
<ide> | 2 |
Python | Python | add support for gradient checkpointing in bert | 90f4b2452077ac3bac9453bdc63e0359aa4fe4d2 | <ide><path>src/transformers/configuration_bert.py
<ide> class BertConfig(PretrainedConfig):
<ide> The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
<ide> layer_norm_eps (:obj:`float`, optional, defaults to 1e-12):
<ide> The epsilon used by the layer normalization layers.
<add> gradient_checkpointing (:obj:`bool`, optional, defaults to False):
<add> If True, use gradient checkpointing to save memory at the expense of slower backward pass.
<ide>
<ide> Example::
<ide>
<ide> def __init__(
<ide> initializer_range=0.02,
<ide> layer_norm_eps=1e-12,
<ide> pad_token_id=0,
<add> gradient_checkpointing=False,
<ide> **kwargs
<ide> ):
<ide> super().__init__(pad_token_id=pad_token_id, **kwargs)
<ide> def __init__(
<ide> self.type_vocab_size = type_vocab_size
<ide> self.initializer_range = initializer_range
<ide> self.layer_norm_eps = layer_norm_eps
<add> self.gradient_checkpointing = gradient_checkpointing
<ide><path>src/transformers/modeling_bert.py
<ide> import warnings
<ide>
<ide> import torch
<add>import torch.utils.checkpoint
<ide> from torch import nn
<ide> from torch.nn import CrossEntropyLoss, MSELoss
<ide>
<ide> def forward(
<ide> class BertEncoder(nn.Module):
<ide> def __init__(self, config):
<ide> super().__init__()
<add> self.config = config
<ide> self.layer = nn.ModuleList([BertLayer(config) for _ in range(config.num_hidden_layers)])
<ide>
<ide> def forward(
<ide> def forward(
<ide> if output_hidden_states:
<ide> all_hidden_states = all_hidden_states + (hidden_states,)
<ide>
<del> layer_outputs = layer_module(
<del> hidden_states,
<del> attention_mask,
<del> head_mask[i],
<del> encoder_hidden_states,
<del> encoder_attention_mask,
<del> output_attentions,
<del> )
<add> if getattr(self.config, "gradient_checkpointing", False):
<add>
<add> def create_custom_forward(module):
<add> def custom_forward(*inputs):
<add> return module(*inputs, output_attentions)
<add>
<add> return custom_forward
<add>
<add> layer_outputs = torch.utils.checkpoint.checkpoint(
<add> create_custom_forward(layer_module),
<add> hidden_states,
<add> attention_mask,
<add> head_mask[i],
<add> encoder_hidden_states,
<add> encoder_attention_mask,
<add> )
<add> else:
<add> layer_outputs = layer_module(
<add> hidden_states,
<add> attention_mask,
<add> head_mask[i],
<add> encoder_hidden_states,
<add> encoder_attention_mask,
<add> output_attentions,
<add> )
<ide> hidden_states = layer_outputs[0]
<ide>
<ide> if output_attentions: | 2 |
Text | Text | add easy taxi to list of companies using airflow | 2025bd8e6e0f1357a1a0902771be9d7a586bd7b1 | <ide><path>README.md
<ide> if you may.
<ide>
<ide> Currently **officially** using Airflow:
<ide>
<del>* Airbnb [@mistercrunch]
<add>
<ide> * Agari [@r39132](https://github.com/r39132)
<add>* Airbnb [@mistercrunch]
<add>* BlueApron [[@jasonjho](https://github.com/jasonjho) & [@matthewdavidhauser](https://github.com/matthewdavidhauser)]
<add>* Chartboost [[@cgelman](https://github.com/cgelman) & [@dclubb](https://github.com/dclubb)]
<ide> * [Cotap](https://github.com/cotap/) [[@maraca](https://github.com/maraca) & [@richardchew](https://github.com/richardchew)]
<add>* Easy Taxi [@caique-lima](https://github.com/caique-lima)
<add>* [Jampp](https://github.com/jampp)
<add>* [LingoChamp](http://www.liulishuo.com/) [[@haitaoyao](https://github.com/haitaoyao)]
<ide> * Lyft
<ide> * Stripe [@jbalogh]
<add>* We[WeTransfer](https://github.com/WeTransfer) [[@jochem](https://github.com/jochem)]
<ide> * Wooga
<ide> * Xoom [[@gepser](https://github.com/gepser) & [@omarvides](https://github.com/omarvides)]
<ide> * Yahoo!
<del>* [Jampp](https://github.com/jampp)
<del>* Chartboost [[@cgelman](https://github.com/cgelman) & [@dclubb](https://github.com/dclubb)]
<del>* BlueApron [[@jasonjho](https://github.com/jasonjho) & [@matthewdavidhauser](https://github.com/matthewdavidhauser)]
<del>* [LingoChamp](http://www.liulishuo.com/) [[@haitaoyao](https://github.com/haitaoyao)]
<del>* [WeTransfer](https://github.com/WeTransfer) [[@jochem](https://github.com/jochem)]
<ide>
<ide> ## Links
<ide> | 1 |
Ruby | Ruby | extract float numbers when using size option | 16e6d77200472a80adff14c01ea6955cbe0b047e | <ide><path>actionview/lib/action_view/helpers/asset_tag_helper.rb
<ide> def resolve_image_source(source, skip_pipeline)
<ide>
<ide> def extract_dimensions(size)
<ide> size = size.to_s
<del> if /\A\d+x\d+\z/.match?(size)
<add> if /\A(\d+|\d+.\d+)x(\d+|\d+.\d+)\z/.match?(size)
<ide> size.split("x")
<del> elsif /\A\d+\z/.match?(size)
<add> elsif /\A(\d+|\d+.\d+)\z/.match?(size)
<ide> [size, size]
<ide> end
<ide> end
<ide><path>actionview/test/template/asset_tag_helper_test.rb
<ide> def content_security_policy_nonce
<ide> %(image_tag("rss.gif", :alt => "rss syndication")) => %(<img alt="rss syndication" src="/images/rss.gif" />),
<ide> %(image_tag("gold.png", :size => "20")) => %(<img height="20" src="/images/gold.png" width="20" />),
<ide> %(image_tag("gold.png", :size => 20)) => %(<img height="20" src="/images/gold.png" width="20" />),
<add> %(image_tag("silver.png", :size => "90.9")) => %(<img height="90.9" src="/images/silver.png" width="90.9" />),
<add> %(image_tag("silver.png", :size => 90.9)) => %(<img height="90.9" src="/images/silver.png" width="90.9" />),
<ide> %(image_tag("gold.png", :size => "45x70")) => %(<img height="70" src="/images/gold.png" width="45" />),
<ide> %(image_tag("gold.png", "size" => "45x70")) => %(<img height="70" src="/images/gold.png" width="45" />),
<add> %(image_tag("silver.png", :size => "67.12x74.09")) => %(<img height="74.09" src="/images/silver.png" width="67.12" />),
<add> %(image_tag("silver.png", "size" => "67.12x74.09")) => %(<img height="74.09" src="/images/silver.png" width="67.12" />),
<add> %(image_tag("bronze.png", :size => "10x15.7")) => %(<img height="15.7" src="/images/bronze.png" width="10" />),
<add> %(image_tag("bronze.png", "size" => "10x15.7")) => %(<img height="15.7" src="/images/bronze.png" width="10" />),
<add> %(image_tag("platinum.png", :size => "4.9x20")) => %(<img height="20" src="/images/platinum.png" width="4.9" />),
<add> %(image_tag("platinum.png", "size" => "4.9x20")) => %(<img height="20" src="/images/platinum.png" width="4.9" />),
<ide> %(image_tag("error.png", "size" => "45 x 70")) => %(<img src="/images/error.png" />),
<ide> %(image_tag("error.png", "size" => "x")) => %(<img src="/images/error.png" />),
<ide> %(image_tag("google.com.png")) => %(<img src="/images/google.com.png" />),
<ide> def content_security_policy_nonce
<ide> %(video_tag("rss.m4v", :preload => 'none')) => %(<video preload="none" src="/videos/rss.m4v"></video>),
<ide> %(video_tag("gold.m4v", :size => "160x120")) => %(<video height="120" src="/videos/gold.m4v" width="160"></video>),
<ide> %(video_tag("gold.m4v", "size" => "320x240")) => %(<video height="240" src="/videos/gold.m4v" width="320"></video>),
<add> %(video_tag("silver.m4v", :size => "100.3x200.6")) => %(<video height="200.6" src="/videos/silver.m4v" width="100.3"></video>),
<add> %(video_tag("silver.m4v", "size" => "100.3x200.6")) => %(<video height="200.6" src="/videos/silver.m4v" width="100.3"></video>),
<add> %(video_tag("bronze.m4v", :size => "50x12.7")) => %(<video height="12.7" src="/videos/bronze.m4v" width="50"></video>),
<add> %(video_tag("bronze.m4v", "size" => "50x12.7")) => %(<video height="12.7" src="/videos/bronze.m4v" width="50"></video>),
<add> %(video_tag("platinum.m4v", :size => "10.1x24")) => %(<video height="24" src="/videos/platinum.m4v" width="10.1"></video>),
<add> %(video_tag("platinum.m4v", "size" => "10.1x24")) => %(<video height="24" src="/videos/platinum.m4v" width="10.1"></video>),
<ide> %(video_tag("trailer.ogg", :poster => "screenshot.png")) => %(<video poster="/images/screenshot.png" src="/videos/trailer.ogg"></video>),
<ide> %(video_tag("error.avi", "size" => "100")) => %(<video height="100" src="/videos/error.avi" width="100"></video>),
<ide> %(video_tag("error.avi", "size" => 100)) => %(<video height="100" src="/videos/error.avi" width="100"></video>), | 2 |
Ruby | Ruby | reuse variable to avoid symbol usage | 69c5e010dcff0545fb2f38672e28894f230b1338 | <ide><path>actionpack/lib/action_dispatch/http/response.rb
<ide> class Response
<ide>
<ide> # Get and set headers for this response.
<ide> attr_accessor :header
<del>
<add>
<ide> alias_method :headers=, :header=
<ide> alias_method :headers, :header
<ide>
<ide> def message
<ide>
<ide> def respond_to?(method, include_private = false)
<ide> if method.to_s == 'to_path'
<del> stream.respond_to?(:to_path)
<add> stream.respond_to?(method)
<ide> else
<ide> super
<ide> end | 1 |
PHP | PHP | fix issue with missing base on redirect route | 32b48ecc7cc8fcf17ccf0d3ceaf9285338336208 | <ide><path>lib/Cake/Network/CakeRequest.php
<ide> class CakeRequest implements ArrayAccess {
<ide> *
<ide> * @var array
<ide> */
<del> public $params = array();
<add> public $params = array(
<add> 'plugin' => null,
<add> 'controller' => null,
<add> 'action' => null,
<add> );
<ide>
<ide> /**
<ide> * Array of POST data. Will contain form data as well as uploaded files.
<ide><path>lib/Cake/Routing/Dispatcher.php
<ide> public function dispatch(CakeRequest $request, CakeResponse $response, $addition
<ide> return;
<ide> }
<ide>
<del> $request = $this->parseParams($request, $additionalParams);
<ide> Router::setRequestInfo($request);
<add> $request = $this->parseParams($request, $additionalParams);
<ide> $controller = $this->_getController($request, $response);
<ide>
<ide> if (!($controller instanceof Controller)) { | 2 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.