content_type stringclasses 8 values | main_lang stringclasses 7 values | message stringlengths 1 50 | sha stringlengths 40 40 | patch stringlengths 52 962k | file_count int64 1 300 |
|---|---|---|---|---|---|
Text | Text | create issue and pull request templates | 9c2899f6c82d2a3415f240f70c77d4bd4d2e8377 | <ide><path>CONTRIBUTING.md
<ide> Some tips on good issue reporting:
<ide> * When describing issues try to phrase your ticket in terms of the *behavior* you think needs changing rather than the *code* you think need changing.
<ide> * Search the issue list first for related items, and make sure you're running the latest version of REST framework before reporting an issue.
<ide> * If reporting a bug, then try to include a pull request with a failing test case. This will help us quickly identify if there is a valid issue, and make sure that it gets fixed more quickly if there is one.
<del>* Feature requests will often be closed with a recommendation that they be implemented outside of the core REST framework library. Keeping new feature requests implemented as third party libraries allows us to keep down the maintenance overhead of REST framework, so that the focus can be on continued stability, bugfixes, and great documentation.
<add>* Feature requests will often be closed with a recommendation that they be implemented outside of the core REST framework library. Keeping new feature requests implemented as third party libraries allows us to keep down the maintenance overhead of REST framework, so that the focus can be on continued stability, bug fixes, and great documentation.
<ide> * Closing an issue doesn't necessarily mean the end of a discussion. If you believe your issue has been closed incorrectly, explain why and we'll consider if it needs to be reopened.
<ide>
<ide> ## Triaging issues
<ide> Always run the tests before submitting pull requests, and ideally run `tox` in o
<ide>
<ide> Once you've made a pull request take a look at the Travis build status in the GitHub interface and make sure the tests are running as you'd expect.
<ide>
<del>![Travis status][travis-status]
<del>
<del>*Above: Travis build notifications*
<del>
<ide> ## Managing compatibility issues
<ide>
<ide> Sometimes, in order to ensure your code works on various different versions of Django, Python or third party libraries, you'll need to run slightly different code depending on the environment. Any code that branches in this way should be isolated into the `compat.py` module, and should provide a single common interface that the rest of the codebase can use.
<ide> If you want to draw attention to a note or warning, use a pair of enclosing line
<ide> [so-filter]: http://stackexchange.com/filters/66475/rest-framework
<ide> [issues]: https://github.com/tomchristie/django-rest-framework/issues?state=open
<ide> [pep-8]: http://www.python.org/dev/peps/pep-0008/
<del>[travis-status]: ../img/travis-status.png
<ide> [pull-requests]: https://help.github.com/articles/using-pull-requests
<ide> [tox]: http://tox.readthedocs.org/en/latest/
<ide> [markdown]: http://daringfireball.net/projects/markdown/basics
<ide><path>ISSUE_TEMPLATE.md
<add>## Checklist
<add>
<add>- [ ] I have verified that that issue exists against the `master` branch of Django REST framework.
<add>- [ ] I have searched for similar issues in both open and closed tickets and cannot find a duplicate.
<add>- [ ] This is not a usage question. (Those should be directed to the [discussion group](https://groups.google.com/forum/#!forum/django-rest-framework) instead.)
<add>- [ ] This cannot be dealt with as a third party library. (We prefer new functionality to be [in the form of third party libraries](http://www.django-rest-framework.org/topics/third-party-resources/#about-third-party-packages) where possible.)
<add>- [ ] I have reduced the issue to the simplest possible case.
<add>- [ ] I have included a failing test as a pull request. (If you are unable to do so we can still accept the issue.)
<add>
<add>## Steps to reproduce
<add>
<add>## Expected behavior
<add>
<add>## Actual behavior
<ide><path>PULL_REQUEST_TEMPLATE.md
<add>*Note*: Before submitting this pull request, please review our [contributing guidelines](https://github.com/tomchristie/django-rest-framework/blob/master/CONTRIBUTING.md#pull-requests).
<add>
<add>## Description
<add>
<add>Please describe your pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue. When linking to an issue, please use `refs #...` in the description of the pull request. | 3 |
Text | Text | add development guide for ui-components | a4b6c7eee0a579648947207a6db116bd3854a71c | <ide><path>docs/_sidebar.md
<ide> - [Work on Cypress tests](how-to-add-cypress-tests.md)
<ide> - [Work on video challenges](how-to-help-with-video-challenges.md)
<ide> - [Work on the docs theme](how-to-work-on-the-docs-theme.md)
<add> - [Work on the component library](how-to-work-on-the-component-library.md)
<ide> - **Additional Guides**
<ide> - [Test translations locally](how-to-test-translations-locally.md)
<ide> - [Understand the curriculum file structure](curriculum-file-structure.md)
<ide><path>docs/how-to-work-on-the-component-library.md
<add>Welcome to freeCodeCamp's `ui-components` library. The components are built mostly from scratch with basic HTML elements and [Tailwind CSS](https://tailwindcss.com/).
<add>
<add># How to Work on the Component Library
<add>
<add>> [!NOTE]
<add>>
<add>> freeCodeCamp has been using Bootstrap components in the UI. However, we are moving away from it and building our own component library, which helps standardize our UX/UI patterns and improve accessibility. The project is tracked in [this GitHub issue](https://github.com/freeCodeCamp/freeCodeCamp/issues/44668).
<add>
<add>The following steps are recommended when working on a new component:
<add>
<add>- Research and planning
<add>- Implement the component
<add>- Display the use cases on Storybook
<add>- Write unit tests
<add>
<add>## Researching and planning
<add>
<add>Before building a component, you need to research and document on how the existing version behaves and looks, to ensure that the new one has matching styles and supports all the current usages. In order to meet the web accesibility requirements, you should also pay attention to the accessibility aspect of the components, see which HTML elements and ARIA attributes are used under the hood.
<add>
<add>Once you have gathered enough information about the component, you can start thinking about the props interface. Ideally, the interface should be as similar to the current version as possible, to ease the adoption later on. Since we are using Bootstrap components, the simplest approach is to mimic [their implementation](https://github.com/react-bootstrap/react-bootstrap/tree/master/src).
<add>
<add>We prefer smaller pull requests rather than a large one, because they speed up the review time and reduce cognitive overload for the reviewers. For that reason, you should think about how you would break down the implementation and come up with a delivery plan.
<add>
<add>We recommend opening a separate GitHub issue for each component and include all the notes in the issue description. It can be used as a place to host all of your working notes, as well as a way to communicate the approach with the reviewers. We will use the issue thread for further discussion if needed. [The issue for Button component](https://github.com/freeCodeCamp/freeCodeCamp/issues/45357) can be used as a reference.
<add>
<add>## Implementing the component
<add>
<add>A new component can be created using the following command from the root directory:
<add>
<add>```bash
<add>cd tools/ui-components
<add>
<add>npm run gen-component MyComponent
<add>```
<add>
<add>The command will generate a new folder inside the `ui-components` directory, with the following files:
<add>
<add>| File name | Purpose |
<add>| -------------------------- | ------------------------------------------------------ |
<add>| `index.ts` | It is used for exporting the component and its types. |
<add>| `my-component.stories.tsx` | It is used for demoing the component on Storybook. |
<add>| `my-component.test.tsx` | It is a test file. |
<add>| `my-component.tsx` | It is where we implement the component. |
<add>| `types.ts` | Is is where we locate component's interface and types. |
<add>
<add>Each component is different, but in general the components should:
<add>
<add>- Support forwarding ref
<add>- Be styled for both light and dark themes
<add>- Be styled internally based on their props (the consumers should not need to restyle the components with the `className` prop)
<add>- Utilize the built-in styling system from Tailwind instead of having custom styles
<add>
<add>### Useful links
<add>
<add>- [Tailwind CSS Configuration](https://tailwindcss.com/docs/configuration)
<add>- [React Bootstrap v0.33 Docs](https://react-bootstrap-v3.netlify.app)
<add>- [Bootstrap 3.3.7 stylesheet](https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/css/bootstrap.css)
<add>- [React Bootstrap current implementation](https://github.com/react-bootstrap/react-bootstrap/tree/master/src)
<add>- [React Bootstrap current tests](https://github.com/react-bootstrap/react-bootstrap/tree/master/test)
<add>
<add>## Displaying the use cases on Storybook
<add>
<add>Use cases of the component should be added to the Storybook file (`.stories.tsx`).
<add>
<add>To start Storybook, run the following command from the root directory:
<add>
<add>```bash
<add>npm run storybook
<add>```
<add>
<add>The Storybook page is available on [http://localhost:6006](http://localhost:6006).
<add>
<add>## Writing unit tests
<add>
<add>We use [React Testing Library](https://testing-library.com/docs/react-testing-library/intro/) to write unit tests. The tests should assert that the components behave as expected and are accessible.
<add>
<add>To run tests against the component library, run the following command from the root directory:
<add>
<add>```bash
<add>npm run test-ui-components
<add>```
<add>
<add>### Useful links
<add>
<add>- [Testing for Accessibility](https://testing-library.com/docs/dom-testing-library/api-accessibility)
<add>- [Order of priority of React Testing Library's queries](https://testing-library.com/docs/queries/about/#priority)
<add>- [Common mistakes with React Testing Library](https://kentcdodds.com/blog/common-mistakes-with-react-testing-library) | 2 |
Text | Text | update amazon ec2 docker installation instructions | da667581cffec4048d848454266b8d3cad55c859 | <ide><path>docs/sources/installation/amazon.md
<ide> page_keywords: amazon ec2, virtualization, cloud, docker, documentation, install
<ide>
<ide> # Amazon EC2
<ide>
<del>There are several ways to install Docker on AWS EC2:
<del>
<del> - [*Amazon QuickStart (Release Candidate - March 2014)*](
<del> #amazon-quickstart-release-candidate-march-2014) or
<del> - [*Amazon QuickStart*](#amazon-quickstart) or
<del> - [*Standard Ubuntu Installation*](#standard-ubuntu-installation)
<add>There are several ways to install Docker on AWS EC2. You can use Amazon Linux, which includes the Docker packages in its Software Repository, or opt for any of the other supported Linux images, for example a [*Standard Ubuntu Installation*](#standard-ubuntu-installation).
<ide>
<ide> **You'll need an** [AWS account](http://aws.amazon.com/) **first, of
<ide> course.**
<ide>
<del>## Amazon QuickStart
<del>
<del>1. **Choose an image:**
<del> - Launch the [Create Instance
<del> Wizard](https://console.aws.amazon.com/ec2/v2/home?#LaunchInstanceWizard:)
<del> menu on your AWS Console.
<del> - Click the `Select` button for a 64Bit Ubuntu
<del> image. For example: Ubuntu Server 12.04.3 LTS
<del> - For testing you can use the default (possibly free)
<del> `t1.micro` instance (more info on
<del> [pricing](http://aws.amazon.com/ec2/pricing/)).
<del> - Click the `Next: Configure Instance Details`
<del> button at the bottom right.
<del>
<del>2. **Tell CloudInit to install Docker:**
<del> - When you're on the "Configure Instance Details" step, expand the
<del> "Advanced Details" section.
<del> - Under "User data", select "As text".
<del> - Enter `#include https://get.docker.com` into
<del> the instance *User Data*.
<del> [CloudInit](https://help.ubuntu.com/community/CloudInit) is part
<del> of the Ubuntu image you chose; it will bootstrap Docker by
<del> running the shell script located at this URL.
<del>
<del>3. After a few more standard choices where defaults are probably ok,
<del> your AWS Ubuntu instance with Docker should be running!
<del>
<del>**If this is your first AWS instance, you may need to set up your
<del>Security Group to allow SSH.** By default all incoming ports to your new
<del>instance will be blocked by the AWS Security Group, so you might just
<del>get timeouts when you try to connect.
<add>## Amazon QuickStart with Amazon Linux AMI 2014.09.1
<ide>
<del>Installing with `get.docker.com` (as above) will
<del>create a service named `lxc-docker`. It will also
<del>set up a [*docker group*](../binaries/#dockergroup) and you may want to
<del>add the *ubuntu* user to it so that you don't have to use
<del>`sudo` for every Docker command.
<del>
<del>Once you`ve got Docker installed, you're ready to try it out – head on
<del>over to the [User Guide](/userguide).
<del>
<del>## Amazon QuickStart (Release Candidate - March 2014)
<del>
<del>Amazon just published new Docker-ready AMIs (2014.03 Release Candidate).
<del>Docker packages can now be installed from Amazon's provided Software
<add>The latest Amazon Linux AMI, 2014.09.1, is Docker ready. Docker packages can be installed from Amazon's provided Software
<ide> Repository.
<ide>
<ide> 1. **Choose an image:**
<ide> - Launch the [Create Instance
<ide> Wizard](https://console.aws.amazon.com/ec2/v2/home?#LaunchInstanceWizard:)
<ide> menu on your AWS Console.
<del> - Click the `Community AMI` menu option on the
<del> left side
<del> - Search for `2014.03` and select one of the Amazon provided AMI,
<del> for example `amzn-ami-pv-2014.03.rc-0.x86_64-ebs`
<add> - In the Quick Start menu, select the Amazon provided AMI for Amazon Linux 2014.09.1
<ide> - For testing you can use the default (possibly free)
<del> `t1.micro` instance (more info on
<add> `t2.micro` instance (more info on
<ide> [pricing](http://aws.amazon.com/ec2/pricing/)).
<ide> - Click the `Next: Configure Instance Details`
<ide> button at the bottom right.
<del>
<ide> 2. After a few more standard choices where defaults are probably ok,
<ide> your Amazon Linux instance should be running!
<ide> 3. SSH to your instance to install Docker :
<ide> `ssh -i <path to your private key> ec2-user@<your public IP address>`
<del>
<ide> 4. Once connected to the instance, type
<ide> `sudo yum install -y docker ; sudo service docker start`
<ide> to install and start Docker
<ide>
<add>**If this is your first AWS instance, you may need to set up your Security Group to allow SSH.** By default all incoming ports to your new instance will be blocked by the AWS Security Group, so you might just get timeouts when you try to connect.
<add>
<add>Once you`ve got Docker installed, you're ready to try it out – head on
<add>over to the [User Guide](/userguide).
<add>
<ide> ## Standard Ubuntu Installation
<ide>
<ide> If you want a more hands-on installation, then you can follow the | 1 |
Ruby | Ruby | remove `json` argument and extend `cachable` | eab0f88c3cf9dfc2396f35281022d14592d7b8dc | <ide><path>Library/Homebrew/api.rb
<ide> require "api/cask"
<ide> require "api/formula"
<ide> require "api/versions"
<add>require "extend/cachable"
<ide>
<ide> module Homebrew
<ide> # Helper functions for using Homebrew's formulae.brew.sh API.
<ide> module Homebrew
<ide> module API
<ide> extend T::Sig
<ide>
<add> extend Cachable
<add>
<ide> module_function
<ide>
<ide> API_DOMAIN = "https://formulae.brew.sh/api"
<ide>
<del> sig { params(endpoint: String, json: T::Boolean).returns(T.any(String, Hash)) }
<del> def fetch(endpoint, json: false)
<del> return @cache[endpoint] if @cache.present? && @cache.key?(endpoint)
<add> sig { params(endpoint: String).returns(T.any(String, Hash)) }
<add> def fetch(endpoint)
<add> return cache[endpoint] if cache.present? && cache.key?(endpoint)
<ide>
<ide> api_url = "#{API_DOMAIN}/#{endpoint}"
<ide> output = Utils::Curl.curl_output("--fail", "--max-time", "5", api_url)
<ide> raise ArgumentError, "No file found at #{Tty.underline}#{api_url}#{Tty.reset}" unless output.success?
<ide>
<del> @cache ||= {}
<del> @cache[endpoint] = if json
<del> JSON.parse(output.stdout)
<del> else
<del> output.stdout
<del> end
<add> cache[endpoint] = JSON.parse(output.stdout)
<ide> rescue JSON::ParserError
<ide> raise ArgumentError, "Invalid JSON file: #{Tty.underline}#{api_url}#{Tty.reset}"
<ide> end
<ide><path>Library/Homebrew/api/analytics.rb
<ide> def analytics_api_path
<ide>
<ide> sig { params(category: String, days: T.any(Integer, String)).returns(Hash) }
<ide> def fetch(category, days)
<del> Homebrew::API.fetch "#{analytics_api_path}/#{category}/#{days}d.json", json: true
<add> Homebrew::API.fetch "#{analytics_api_path}/#{category}/#{days}d.json"
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/api/bottle.rb
<ide> def bottle_api_path
<ide>
<ide> sig { params(name: String).returns(Hash) }
<ide> def fetch(name)
<del> Homebrew::API.fetch "#{bottle_api_path}/#{name}.json", json: true
<add> Homebrew::API.fetch "#{bottle_api_path}/#{name}.json"
<ide> end
<ide>
<ide> sig { params(name: String).returns(T::Boolean) }
<ide><path>Library/Homebrew/api/cask.rb
<ide> module Cask
<ide>
<ide> sig { params(name: String).returns(Hash) }
<ide> def fetch(name)
<del> Homebrew::API.fetch "cask/#{name}.json", json: true
<add> Homebrew::API.fetch "cask/#{name}.json"
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/api/formula.rb
<ide> def formula_api_path
<ide>
<ide> sig { params(name: String).returns(Hash) }
<ide> def fetch(name)
<del> Homebrew::API.fetch "#{formula_api_path}/#{name}.json", json: true
<add> Homebrew::API.fetch "#{formula_api_path}/#{name}.json"
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/api/versions.rb
<ide> module Versions
<ide>
<ide> def formulae
<ide> # The result is cached by Homebrew::API.fetch
<del> Homebrew::API.fetch "versions-formulae.json", json: true
<add> Homebrew::API.fetch "versions-formulae.json"
<ide> end
<ide>
<ide> def linux
<ide> # The result is cached by Homebrew::API.fetch
<del> Homebrew::API.fetch "versions-linux.json", json: true
<add> Homebrew::API.fetch "versions-linux.json"
<ide> end
<ide>
<ide> def casks
<ide> # The result is cached by Homebrew::API.fetch
<del> Homebrew::API.fetch "versions-casks.json", json: true
<add> Homebrew::API.fetch "versions-casks.json"
<ide> end
<ide>
<ide> sig { params(name: String).returns(T.nilable(PkgVersion)) } | 6 |
Javascript | Javascript | convert vars into let/const | 6074664f73c6b1ea1f774f2bc698224e3677cef0 | <ide><path>packages/react-reconciler/index.js
<ide> export type {
<ide> Reconciler,
<ide> } from './src/ReactFiberReconciler';
<ide>
<del>var ReactFiberReconciler = require('./src/ReactFiberReconciler');
<add>const ReactFiberReconciler = require('./src/ReactFiberReconciler');
<ide>
<ide> // TODO: decide on the top-level export form.
<ide> // This is hacky but makes it work with both Rollup and Jest.
<ide><path>packages/react-reconciler/src/ReactDebugCurrentFiber.js
<ide> function setCurrentPhase(phase: LifeCyclePhase | null) {
<ide> ReactDebugCurrentFiber.phase = phase;
<ide> }
<ide>
<del>var ReactDebugCurrentFiber = {
<add>const ReactDebugCurrentFiber = {
<ide> current: (null: Fiber | null),
<ide> phase: (null: LifeCyclePhase | null),
<ide> resetCurrentFiber,
<ide><path>packages/react-reconciler/src/ReactFiber.js
<ide> import {NoContext} from './ReactTypeOfInternalContext';
<ide> if (__DEV__) {
<ide> var hasBadMapPolyfill = false;
<ide> try {
<del> const nonExtensibleObject = Object.preventExtensions({});
<add> var nonExtensibleObject = Object.preventExtensions({});
<ide> /* eslint-disable no-new */
<ide> new Map([[nonExtensibleObject, null]]);
<ide> new Set([nonExtensibleObject]);
<ide> function FiberNode(
<ide> // is faster.
<ide> // 5) It should be easy to port this to a C struct and keep a C implementation
<ide> // compatible.
<del>var createFiber = function(
<add>const createFiber = function(
<ide> tag: TypeOfWork,
<ide> pendingProps: mixed,
<ide> key: null | string,
<ide><path>packages/react-reconciler/src/ReactFiberBeginWork.js
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> // It used to be here.
<ide> }
<ide>
<del> var unmaskedContext = getUnmaskedContext(workInProgress);
<del> var context = getMaskedContext(workInProgress, unmaskedContext);
<add> const unmaskedContext = getUnmaskedContext(workInProgress);
<add> const context = getMaskedContext(workInProgress, unmaskedContext);
<ide>
<del> var nextChildren;
<add> let nextChildren;
<ide>
<ide> if (__DEV__) {
<ide> ReactCurrentOwner.current = workInProgress;
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> 'An indeterminate component should never have mounted. This error is ' +
<ide> 'likely caused by a bug in React. Please file an issue.',
<ide> );
<del> var fn = workInProgress.type;
<del> var props = workInProgress.pendingProps;
<del> var unmaskedContext = getUnmaskedContext(workInProgress);
<del> var context = getMaskedContext(workInProgress, unmaskedContext);
<add> const fn = workInProgress.type;
<add> const props = workInProgress.pendingProps;
<add> const unmaskedContext = getUnmaskedContext(workInProgress);
<add> const context = getMaskedContext(workInProgress, unmaskedContext);
<ide>
<del> var value;
<add> let value;
<ide>
<ide> if (__DEV__) {
<ide> if (fn.prototype && typeof fn.prototype.render === 'function') {
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> }
<ide>
<ide> function updateCallComponent(current, workInProgress, renderExpirationTime) {
<del> var nextCall = (workInProgress.pendingProps: ReactCall);
<add> let nextCall = (workInProgress.pendingProps: ReactCall);
<ide> if (hasContextChanged()) {
<ide> // Normally we can bail out on props equality but if context has changed
<ide> // we don't do the bailout and we have to reuse existing props instead.
<ide><path>packages/react-reconciler/src/ReactFiberCommitWork.js
<ide> import {commitCallbacks} from './ReactFiberUpdateQueue';
<ide> import {onCommitUnmount} from './ReactFiberDevToolsHook';
<ide> import {startPhaseTimer, stopPhaseTimer} from './ReactDebugFiberPerf';
<ide>
<del>var {invokeGuardedCallback, hasCaughtError, clearCaughtError} = ReactErrorUtils;
<add>const {
<add> invokeGuardedCallback,
<add> hasCaughtError,
<add> clearCaughtError,
<add>} = ReactErrorUtils;
<ide>
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> config: HostConfig<T, P, I, TI, HI, PI, C, CC, CX, PL>,
<ide> captureError: (failedFiber: Fiber, error: mixed) => Fiber | null,
<ide> ) {
<ide> const {getPublicInstance, mutation, persistence} = config;
<ide>
<del> var callComponentWillUnmountWithTimer = function(current, instance) {
<add> const callComponentWillUnmountWithTimer = function(current, instance) {
<ide> startPhaseTimer(current, 'componentWillUnmount');
<ide> instance.props = current.memoizedProps;
<ide> instance.state = current.memoizedState;
<ide><path>packages/react-reconciler/src/ReactFiberCompleteWork.js
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> workInProgress: Fiber,
<ide> renderExpirationTime: ExpirationTime,
<ide> ) {
<del> var call = (workInProgress.memoizedProps: ?ReactCall);
<add> const call = (workInProgress.memoizedProps: ?ReactCall);
<ide> invariant(
<ide> call,
<ide> 'Should be resolved by now. This error is likely caused by a bug in ' +
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide>
<ide> // Build up the returns.
<ide> // TODO: Compare this to a generator or opaque helpers like Children.
<del> var returns: Array<mixed> = [];
<add> const returns: Array<mixed> = [];
<ide> appendAllReturns(returns, workInProgress);
<del> var fn = call.handler;
<del> var props = call.props;
<del> var nextChildren = fn(props, returns);
<add> const fn = call.handler;
<add> const props = call.props;
<add> const nextChildren = fn(props, returns);
<ide>
<del> var currentFirstChild = current !== null ? current.child : null;
<add> const currentFirstChild = current !== null ? current.child : null;
<ide> workInProgress.child = reconcileChildFibers(
<ide> workInProgress,
<ide> currentFirstChild,
<ide><path>packages/react-reconciler/src/ReactFiberInstrumentation.js
<ide> // See https://github.com/facebook/react/pull/8033.
<ide> // This is not part of the public API, not even for React DevTools.
<ide> // You may only inject a debugTool if you work on React Fiber itself.
<del>var ReactFiberInstrumentation = {
<add>const ReactFiberInstrumentation = {
<ide> debugTool: null,
<ide> };
<ide>
<ide><path>packages/react-reconciler/src/ReactFiberReconciler.js
<ide> function getContextForSubtree(
<ide> export default function<T, P, I, TI, HI, PI, C, CC, CX, PL>(
<ide> config: HostConfig<T, P, I, TI, HI, PI, C, CC, CX, PL>,
<ide> ): Reconciler<C, I, TI> {
<del> var {getPublicInstance} = config;
<add> const {getPublicInstance} = config;
<ide>
<del> var {
<add> const {
<ide> computeAsyncExpiration,
<ide> computeUniqueAsyncExpiration,
<ide> computeExpirationForFiber,
<ide><path>packages/react-reconciler/src/ReactFiberScheduler.js
<ide> import {AsyncUpdates} from './ReactTypeOfInternalContext';
<ide> import {getUpdateExpirationTime} from './ReactFiberUpdateQueue';
<ide> import {resetContext} from './ReactFiberContext';
<ide>
<del>var {invokeGuardedCallback, hasCaughtError, clearCaughtError} = ReactErrorUtils;
<add>const {
<add> invokeGuardedCallback,
<add> hasCaughtError,
<add> clearCaughtError,
<add>} = ReactErrorUtils;
<ide>
<ide> export type CapturedError = {
<ide> componentName: ?string,
<ide> if (__DEV__) {
<ide> var didWarnStateUpdateForUnmountedComponent = {};
<ide>
<ide> var warnAboutUpdateOnUnmounted = function(fiber: Fiber) {
<del> const componentName = getComponentName(fiber) || 'ReactClass';
<add> var componentName = getComponentName(fiber) || 'ReactClass';
<ide> if (didWarnStateUpdateForUnmountedComponent[componentName]) {
<ide> return;
<ide> }
<ide><path>packages/react-reconciler/src/ReactFiberTreeReflection.js
<ide> import {
<ide> } from 'shared/ReactTypeOfWork';
<ide> import {NoEffect, Placement} from 'shared/ReactTypeOfSideEffect';
<ide>
<del>var MOUNTING = 1;
<del>var MOUNTED = 2;
<del>var UNMOUNTED = 3;
<add>const MOUNTING = 1;
<add>const MOUNTED = 2;
<add>const UNMOUNTED = 3;
<ide>
<ide> function isFiberMountedImpl(fiber: Fiber): number {
<ide> let node = fiber;
<ide> export function isMounted(component: React$Component<any, any>): boolean {
<ide> }
<ide> }
<ide>
<del> var fiber: ?Fiber = ReactInstanceMap.get(component);
<add> const fiber: ?Fiber = ReactInstanceMap.get(component);
<ide> if (!fiber) {
<ide> return false;
<ide> }
<ide><path>packages/react-reconciler/src/__tests__/ReactExpiration-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactExpiration', () => {
<ide> beforeEach(() => {
<ide><path>packages/react-reconciler/src/__tests__/ReactFiberHostContext-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactFiberReconciler;
<add>let React;
<add>let ReactFiberReconciler;
<ide>
<ide> describe('ReactFiberHostContext', () => {
<ide> beforeEach(() => {
<ide> describe('ReactFiberHostContext', () => {
<ide> });
<ide>
<ide> it('works with null host context', () => {
<del> var creates = 0;
<del> var Renderer = ReactFiberReconciler({
<add> let creates = 0;
<add> const Renderer = ReactFiberReconciler({
<ide> prepareForCommit: function() {},
<ide> resetAfterCommit: function() {},
<ide> getRootHostContext: function() {
<ide><path>packages/react-reconciler/src/__tests__/ReactFragment-test.js
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state of children with 1 level nesting', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state between top-level fragments', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state of children nested at same level', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state in non-top-level fragment nesting', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state of children if nested 2 levels without siblings', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state of children if nested 2 levels with siblings', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state between array nested in fragment and fragment', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state between top level fragment and array', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state between array nested in fragment and double nested fragment', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state between array nested in fragment and double nested array', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state between double nested fragment and double nested array', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state of children when the keys are different', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should not preserve state between unkeyed and keyed fragment', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide> });
<ide>
<ide> it('should preserve state with reordering in multiple levels', function() {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide>
<ide> it('should not preserve state when switching to a keyed fragment to an array', function() {
<ide> spyOnDev(console, 'error');
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide> describe('ReactFragment', () => {
<ide>
<ide> it('should preserve state when it does not change positions', function() {
<ide> spyOnDev(console, 'error');
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Stateful extends React.Component {
<ide> componentDidUpdate() {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncremental-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<del>var PropTypes;
<add>let React;
<add>let ReactNoop;
<add>let PropTypes;
<ide>
<ide> describe('ReactIncremental', () => {
<ide> beforeEach(() => {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('should render a simple component, in steps if needed', () => {
<del> var renderCallbackCalled = false;
<del> var barCalled = false;
<add> let renderCallbackCalled = false;
<add> let barCalled = false;
<ide> function Bar() {
<ide> barCalled = true;
<ide> return (
<ide> describe('ReactIncremental', () => {
<ide> );
<ide> }
<ide>
<del> var fooCalled = false;
<add> let fooCalled = false;
<ide> function Foo() {
<ide> fooCalled = true;
<ide> return [<Bar key="a" isBar={true} />, <Bar key="b" isBar={true} />];
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('updates a previous render', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Header() {
<ide> ops.push('Header');
<ide> describe('ReactIncremental', () => {
<ide> return <footer>Bye</footer>;
<ide> }
<ide>
<del> var header = <Header />;
<del> var footer = <Footer />;
<add> const header = <Header />;
<add> const footer = <Footer />;
<ide>
<ide> function Foo(props) {
<ide> ops.push('Foo');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('can cancel partially rendered work and restart', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('can deprioritize unfinished work and resume it later', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('can deprioritize a tree from without dropping work', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('can resume work in a subtree even when a parent bails out', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> return <span>{props.children}</span>;
<ide> }
<ide>
<del> var middleContent = (
<add> const middleContent = (
<ide> <aaa>
<ide> <Tester />
<ide> <bbb hidden={true}>
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('can resume work in a bailed subtree within one pass', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('can reuse work done after being preempted', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> return <span>{props.children}</span>;
<ide> }
<ide>
<del> var middleContent = (
<add> const middleContent = (
<ide> <div>
<ide> <Middle>Hello</Middle>
<ide> <Bar>-</Bar>
<ide> <Middle>World</Middle>
<ide> </div>
<ide> );
<ide>
<del> var step0 = (
<add> const step0 = (
<ide> <div>
<ide> <Middle>Hi</Middle>
<ide> <Bar>{'Foo'}</Bar>
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('can reuse work if shouldComponentUpdate is false, after being preempted', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function Bar(props) {
<ide> ops.push('Bar');
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('can call sCU while resuming a partly mounted component', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<del> var instances = new Set();
<add> const instances = new Set();
<ide>
<ide> class Bar extends React.Component {
<ide> state = {y: 'A'};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('gets new props when setting state on a partly updated component', () => {
<del> var ops = [];
<del> var instances = [];
<add> let ops = [];
<add> const instances = [];
<ide>
<ide> class Bar extends React.Component {
<ide> state = {y: 'A'};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('calls componentWillMount twice if the initial render is aborted', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class LifeCycle extends React.Component {
<ide> state = {x: this.props.x};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('uses state set in componentWillMount even if initial render was aborted', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class LifeCycle extends React.Component {
<ide> constructor(props) {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('calls componentWill* twice if an update render is aborted', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class LifeCycle extends React.Component {
<ide> componentWillMount() {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('does not call componentWillReceiveProps for state-only updates', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<del> var instances = [];
<add> const instances = [];
<ide>
<ide> class LifeCycle extends React.Component {
<ide> state = {x: 0};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> xit('skips will/DidUpdate when bailing unless an update was already in progress', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class LifeCycle extends React.Component {
<ide> componentWillMount() {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('can nest batchedUpdates', () => {
<del> var ops = [];
<del> var instance;
<add> let ops = [];
<add> let instance;
<ide>
<ide> class Foo extends React.Component {
<ide> state = {n: 0};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('can handle if setState callback throws', () => {
<del> var ops = [];
<del> var instance;
<add> let ops = [];
<add> let instance;
<ide>
<ide> class Foo extends React.Component {
<ide> state = {n: 0};
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('merges and masks context', () => {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Intl extends React.Component {
<ide> static childContextTypes = {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('does not leak own context into context provider', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class Recurse extends React.Component {
<ide> static contextTypes = {
<ide> n: PropTypes.number,
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('provides context when reusing work', () => {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Intl extends React.Component {
<ide> static childContextTypes = {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('reads context when setState is below the provider', () => {
<del> var ops = [];
<del> var statefulInst;
<add> const ops = [];
<add> let statefulInst;
<ide>
<ide> class Intl extends React.Component {
<ide> static childContextTypes = {
<ide> describe('ReactIncremental', () => {
<ide> });
<ide>
<ide> it('reads context when setState is above the provider', () => {
<del> var ops = [];
<del> var statefulInst;
<add> const ops = [];
<add> let statefulInst;
<ide>
<ide> class Intl extends React.Component {
<ide> static childContextTypes = {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalErrorHandling-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var PropTypes;
<del>var React;
<del>var ReactNoop;
<add>let PropTypes;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> beforeEach(() => {
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('catches render error in a boundary during partial deferred mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class ErrorBoundary extends React.Component {
<ide> state = {error: null};
<ide> componentDidCatch(error) {
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('catches render error in a boundary during synchronous mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class ErrorBoundary extends React.Component {
<ide> state = {error: null};
<ide> componentDidCatch(error) {
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('catches render error in a boundary during batched mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class ErrorBoundary extends React.Component {
<ide> state = {error: null};
<ide> componentDidCatch(error) {
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('propagates an error from a noop error boundary during full deferred mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class RethrowErrorBoundary extends React.Component {
<ide> componentDidCatch(error) {
<ide> ops.push('RethrowErrorBoundary componentDidCatch');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('propagates an error from a noop error boundary during partial deferred mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class RethrowErrorBoundary extends React.Component {
<ide> componentDidCatch(error) {
<ide> ops.push('RethrowErrorBoundary componentDidCatch');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('propagates an error from a noop error boundary during synchronous mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class RethrowErrorBoundary extends React.Component {
<ide> componentDidCatch(error) {
<ide> ops.push('RethrowErrorBoundary componentDidCatch');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('propagates an error from a noop error boundary during batched mounting', () => {
<del> var ops = [];
<add> const ops = [];
<ide> class RethrowErrorBoundary extends React.Component {
<ide> componentDidCatch(error) {
<ide> ops.push('RethrowErrorBoundary componentDidCatch');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('can schedule updates after uncaught error in render on mount', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function BrokenRender() {
<ide> ops.push('BrokenRender');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('can schedule updates after uncaught error in render on update', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> function BrokenRender(props) {
<ide> ops.push('BrokenRender');
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('can schedule updates after uncaught error during umounting', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class BrokenComponentWillUnmount extends React.Component {
<ide> render() {
<ide> describe('ReactIncrementalErrorHandling', () => {
<ide> });
<ide>
<ide> it('does not interrupt unmounting if detaching a ref throws', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class Bar extends React.Component {
<ide> componentWillUnmount() {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalErrorLogging-test.internal.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalErrorLogging', () => {
<ide> beforeEach(() => {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalReflection-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalReflection', () => {
<ide> beforeEach(() => {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalScheduling-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalScheduling', () => {
<ide> beforeEach(() => {
<ide> describe('ReactIncrementalScheduling', () => {
<ide> });
<ide>
<ide> it('schedules sync updates when inside componentDidMount/Update', () => {
<del> var instance;
<del> var ops = [];
<add> let instance;
<add> let ops = [];
<ide>
<ide> class Foo extends React.Component {
<ide> state = {tick: 0};
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalSideEffects-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> beforeEach(() => {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> return <span prop={props.children} />;
<ide> }
<ide>
<del> var middleContent = (
<add> const middleContent = (
<ide> <div>
<ide> <Bar>Hello</Bar>
<ide> <Bar>World</Bar>
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> ),
<ide> ),
<ide> ]);
<del> var innerSpanA = ReactNoop.getChildren()[0].children[1].children[1];
<add> const innerSpanA = ReactNoop.getChildren()[0].children[1].children[1];
<ide> ReactNoop.render(<Foo tick={2} idx={1} />);
<ide> ReactNoop.flushDeferredPri(30 + 25);
<ide> expect(ReactNoop.getChildren()).toEqual([
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> ),
<ide> ]);
<ide>
<del> var innerSpanB = ReactNoop.getChildren()[0].children[1].children[1];
<add> const innerSpanB = ReactNoop.getChildren()[0].children[1].children[1];
<ide> // This should have been an update to an existing instance, not recreation.
<ide> // We verify that by ensuring that the child instance was the same as
<ide> // before.
<ide> expect(innerSpanA).toBe(innerSpanB);
<ide> });
<ide>
<ide> xit('can defer side-effects and reuse them later - complex', function() {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class Bar extends React.Component {
<ide> shouldComponentUpdate(nextProps) {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> });
<ide>
<ide> it('deprioritizes setStates that happens within a deprioritized tree', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<del> var barInstances = [];
<add> const barInstances = [];
<ide>
<ide> class Bar extends React.Component {
<ide> constructor() {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> // TODO: Test that callbacks are not lost if an update is preempted.
<ide>
<ide> it('calls componentWillUnmount after a deletion, even if nested', () => {
<del> var ops = [];
<add> const ops = [];
<ide>
<ide> class Bar extends React.Component {
<ide> componentWillUnmount() {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> });
<ide>
<ide> it('calls componentDidMount/Update after insertion/update', () => {
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class Bar extends React.Component {
<ide> componentDidMount() {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide>
<ide> it('invokes ref callbacks after insertion/update/unmount', () => {
<ide> spyOnDev(console, 'error');
<del> var classInstance = null;
<add> let classInstance = null;
<ide>
<del> var ops = [];
<add> let ops = [];
<ide>
<ide> class ClassComponent extends React.Component {
<ide> render() {
<ide> describe('ReactIncrementalSideEffects', () => {
<ide> // expected way for aborted and resumed render life-cycles.
<ide>
<ide> it('supports string refs', () => {
<del> var fooInstance = null;
<add> let fooInstance = null;
<ide>
<ide> class Bar extends React.Component {
<ide> componentDidMount() {
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalTriangle-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalTriangle', () => {
<ide> beforeEach(() => {
<ide> describe('ReactIncrementalTriangle', () => {
<ide> simulate(...actions);
<ide> } catch (e) {
<ide> console.error(
<del> `Triangle fuzz tester error! Copy and paste the following line into the test suite:
<add> `Triangle fuzz tester error! Copy and paste the following line into the test suite:
<ide> ${formatActions(actions)}
<ide> `,
<ide> );
<ide><path>packages/react-reconciler/src/__tests__/ReactIncrementalUpdates-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> describe('ReactIncrementalUpdates', () => {
<ide> beforeEach(() => {
<ide><path>packages/react-reconciler/src/__tests__/ReactPersistent-test.internal.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide> let ReactPortal;
<ide>
<ide> describe('ReactPersistent', () => {
<ide> describe('ReactPersistent', () => {
<ide>
<ide> render(<Foo text="Hello" />);
<ide> ReactNoop.flush();
<del> var originalChildren = getChildren();
<add> const originalChildren = getChildren();
<ide> expect(originalChildren).toEqual([div(span())]);
<ide>
<ide> render(<Foo text="World" />);
<ide> ReactNoop.flush();
<del> var newChildren = getChildren();
<add> const newChildren = getChildren();
<ide> expect(newChildren).toEqual([div(span(), span())]);
<ide>
<ide> expect(originalChildren).toEqual([div(span())]);
<ide> describe('ReactPersistent', () => {
<ide>
<ide> render(<Foo text="Hello" />);
<ide> ReactNoop.flush();
<del> var originalChildren = getChildren();
<add> const originalChildren = getChildren();
<ide> expect(originalChildren).toEqual([div(span('Hello'))]);
<ide>
<ide> render(<Foo text="World" />);
<ide> ReactNoop.flush();
<del> var newChildren = getChildren();
<add> const newChildren = getChildren();
<ide> expect(newChildren).toEqual([div(span('Hello'), span('World'))]);
<ide>
<ide> expect(originalChildren).toEqual([div(span('Hello'))]);
<ide> describe('ReactPersistent', () => {
<ide>
<ide> render(<Foo text="Hello" />);
<ide> ReactNoop.flush();
<del> var originalChildren = getChildren();
<add> const originalChildren = getChildren();
<ide> expect(originalChildren).toEqual([div('Hello', span())]);
<ide>
<ide> render(<Foo text="World" />);
<ide> ReactNoop.flush();
<del> var newChildren = getChildren();
<add> const newChildren = getChildren();
<ide> expect(newChildren).toEqual([div('World', span())]);
<ide>
<ide> expect(originalChildren).toEqual([div('Hello', span())]);
<ide> describe('ReactPersistent', () => {
<ide>
<ide> expect(emptyPortalChildSet).toEqual([]);
<ide>
<del> var originalChildren = getChildren();
<add> const originalChildren = getChildren();
<ide> expect(originalChildren).toEqual([div()]);
<del> var originalPortalChildren = portalContainer.children;
<add> const originalPortalChildren = portalContainer.children;
<ide> expect(originalPortalChildren).toEqual([div(span())]);
<ide>
<ide> render(
<ide> describe('ReactPersistent', () => {
<ide> );
<ide> ReactNoop.flush();
<ide>
<del> var newChildren = getChildren();
<add> const newChildren = getChildren();
<ide> expect(newChildren).toEqual([div()]);
<del> var newPortalChildren = portalContainer.children;
<add> const newPortalChildren = portalContainer.children;
<ide> expect(newPortalChildren).toEqual([div(span(), 'Hello ', 'World')]);
<ide>
<ide> expect(originalChildren).toEqual([div()]);
<ide> describe('ReactPersistent', () => {
<ide> render(<Parent />);
<ide> ReactNoop.flush();
<ide>
<del> var clearedPortalChildren = portalContainer.children;
<add> const clearedPortalChildren = portalContainer.children;
<ide> expect(clearedPortalChildren).toEqual([]);
<ide>
<ide> // The original is unchanged.
<ide><path>packages/react-reconciler/src/__tests__/ReactTopLevelFragment-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> // This is a new feature in Fiber so I put it in its own test file. It could
<ide> // probably move to one of the other test files once it is official.
<ide> describe('ReactTopLevelFragment', function() {
<ide> });
<ide>
<ide> it('should preserve state when switching from a single child', function() {
<del> var instance = null;
<add> let instance = null;
<ide>
<ide> class Stateful extends React.Component {
<ide> render() {
<ide> describe('ReactTopLevelFragment', function() {
<ide> ReactNoop.render(<Fragment />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceA = instance;
<add> const instanceA = instance;
<ide>
<ide> expect(instanceA).not.toBe(null);
<ide>
<ide> ReactNoop.render(<Fragment condition={true} />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceB = instance;
<add> const instanceB = instance;
<ide>
<ide> expect(instanceB).toBe(instanceA);
<ide> });
<ide>
<ide> it('should not preserve state when switching to a nested array', function() {
<del> var instance = null;
<add> let instance = null;
<ide>
<ide> class Stateful extends React.Component {
<ide> render() {
<ide> describe('ReactTopLevelFragment', function() {
<ide> ReactNoop.render(<Fragment />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceA = instance;
<add> const instanceA = instance;
<ide>
<ide> expect(instanceA).not.toBe(null);
<ide>
<ide> ReactNoop.render(<Fragment condition={true} />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceB = instance;
<add> const instanceB = instance;
<ide>
<ide> expect(instanceB).not.toBe(instanceA);
<ide> });
<ide>
<ide> it('preserves state if an implicit key slot switches from/to null', function() {
<del> var instance = null;
<add> let instance = null;
<ide>
<ide> class Stateful extends React.Component {
<ide> render() {
<ide> describe('ReactTopLevelFragment', function() {
<ide> ReactNoop.render(<Fragment />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceA = instance;
<add> const instanceA = instance;
<ide>
<ide> expect(instanceA).not.toBe(null);
<ide>
<ide> ReactNoop.render(<Fragment condition={true} />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceB = instance;
<add> const instanceB = instance;
<ide>
<ide> expect(instanceB).toBe(instanceA);
<ide>
<ide> ReactNoop.render(<Fragment condition={false} />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceC = instance;
<add> const instanceC = instance;
<ide>
<ide> expect(instanceC === instanceA).toBe(true);
<ide> });
<ide>
<ide> it('should preserve state in a reorder', function() {
<del> var instance = null;
<add> let instance = null;
<ide>
<ide> class Stateful extends React.Component {
<ide> render() {
<ide> describe('ReactTopLevelFragment', function() {
<ide> ReactNoop.render(<Fragment />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceA = instance;
<add> const instanceA = instance;
<ide>
<ide> expect(instanceA).not.toBe(null);
<ide>
<ide> ReactNoop.render(<Fragment condition={true} />);
<ide> ReactNoop.flush();
<ide>
<del> var instanceB = instance;
<add> const instanceB = instance;
<ide>
<ide> expect(instanceB).toBe(instanceA);
<ide> });
<ide><path>packages/react-reconciler/src/__tests__/ReactTopLevelText-test.js
<ide>
<ide> 'use strict';
<ide>
<del>var React;
<del>var ReactNoop;
<add>let React;
<add>let ReactNoop;
<ide>
<ide> // This is a new feature in Fiber so I put it in its own test file. It could
<ide> // probably move to one of the other test files once it is official. | 24 |
Go | Go | add network attachment methods to cluster provider | 547c342c1b96f66f74533fdd648127098a922faa | <ide><path>libnetwork/cluster/provider.go
<ide> package cluster
<ide>
<add>import (
<add> "github.com/docker/engine-api/types/network"
<add> "golang.org/x/net/context"
<add>)
<add>
<ide> // Provider provides clustering config details
<ide> type Provider interface {
<ide> IsManager() bool
<ide> type Provider interface {
<ide> GetAdvertiseAddress() string
<ide> GetRemoteAddress() string
<ide> ListenClusterEvents() <-chan struct{}
<add> AttachNetwork(string, string, []string) (*network.NetworkingConfig, error)
<add> DetachNetwork(string, string) error
<add> UpdateAttachment(string, string, *network.NetworkingConfig) error
<add> WaitForDetachment(context.Context, string, string, string) error
<ide> }
<ide><path>libnetwork/cmd/dnet/dnet.go
<ide> import (
<ide>
<ide> "github.com/Sirupsen/logrus"
<ide> "github.com/docker/docker/pkg/term"
<add> "github.com/docker/engine-api/types/network"
<ide> "github.com/docker/libnetwork"
<ide> "github.com/docker/libnetwork/api"
<ide> "github.com/docker/libnetwork/config"
<ide> import (
<ide> "github.com/docker/libnetwork/options"
<ide> "github.com/docker/libnetwork/types"
<ide> "github.com/gorilla/mux"
<add> "golang.org/x/net/context"
<ide> )
<ide>
<ide> const (
<ide> func (d *dnetConnection) ListenClusterEvents() <-chan struct{} {
<ide> return d.configEvent
<ide> }
<ide>
<add>func (d *dnetConnection) AttachNetwork(string, string, []string) (*network.NetworkingConfig, error) {
<add> return nil, nil
<add>}
<add>
<add>func (d *dnetConnection) DetachNetwork(string, string) error {
<add> return nil
<add>}
<add>
<add>func (d *dnetConnection) UpdateAttachment(string, string, *network.NetworkingConfig) error {
<add> return nil
<add>}
<add>
<add>func (d *dnetConnection) WaitForDetachment(context.Context, string, string, string) error {
<add> return nil
<add>}
<add>
<ide> func handleSignals(controller libnetwork.NetworkController) {
<ide> c := make(chan os.Signal, 1)
<ide> signals := []os.Signal{os.Interrupt, syscall.SIGTERM, syscall.SIGQUIT} | 2 |
Javascript | Javascript | remove invalid character | 5c203240d484c9803e19ce537adcb70fd9010a30 | <ide><path>src/ng/filter/filter.js
<ide> <hr>
<ide> Any: <input ng-model="search.$"> <br>
<ide> Name only <input ng-model="search.name"><br>
<del> Phone only <input ng-model="search.phone"å><br>
<add> Phone only <input ng-model="search.phone"><br>
<ide> Equality <input type="checkbox" ng-model="strict"><br>
<ide> <table id="searchObjResults">
<ide> <tr><th>Name</th><th>Phone</th></tr> | 1 |
Ruby | Ruby | fix typo in deprecation notice | b0ae34ad72ed8410e6f16c1d9ee4b04b2630208f | <ide><path>activesupport/lib/active_support/message_verifier.rb
<ide> class InvalidSignature < StandardError; end
<ide>
<ide> def initialize(secret, options = {})
<ide> unless options.is_a?(Hash)
<del> ActiveSupport::Deprecation.warn "The second parameter should be an options hash. Use :digest => 'algorithm' to sepcify the digest algorithm."
<add> ActiveSupport::Deprecation.warn "The second parameter should be an options hash. Use :digest => 'algorithm' to specify the digest algorithm."
<ide> options = { :digest => options }
<ide> end
<ide> | 1 |
Javascript | Javascript | use shared builder | 3e4551bb559a9a2e337f14603cf7c6d3bb675a1e | <ide><path>Gruntfile.js
<ide> module.exports = function(grunt) {
<ide> filenames.add(filename);
<ide> });
<ide> });
<del> const buildStuff = require('./build/js/build');
<add> const buildStuff = require('@gfxfundamentals/lesson-builder');
<ide> const settings = Object.assign({}, buildSettings, {
<ide> filenames,
<ide> });
<ide> const finish = this.async();
<del> buildStuff(settings).then(function() {
<del> finish();
<del> }).done();
<add> buildStuff(settings).finally(finish);
<ide> });
<ide>
<ide> grunt.registerTask('buildlessons', function() {
<del> const buildStuff = require('./build/js/build');
<add> const buildStuff = require('@gfxfundamentals/lesson-builder');
<ide> const finish = this.async();
<del> buildStuff(buildSettings).then(function() {
<del> finish();
<del> }).done();
<add> buildStuff(buildSettings).finally(finish);
<ide> });
<ide>
<ide> grunt.task.registerMultiTask('fixthreepaths', 'fix three paths', function() {
<ide><path>build/js/build.js
<del>/* global module require process */
<del>/* eslint no-undef: "error" */
<del>/* eslint no-console: "off" */
<del>
<del>/*
<del>
<del>This entire file is one giant hack and really needs to be cleaned up!
<del>
<del>*/
<del>
<del>'use strict';
<del>
<del>const requiredNodeVersion = 12;
<del>if (parseInt((/^v(\d+)\./).exec(process.version)[1]) < requiredNodeVersion) {
<del> throw Error(`requires at least node: ${requiredNodeVersion}`);
<del>}
<del>
<del>module.exports = function(settings) { // wrapper in case we're in module_context mode
<del>
<del>const hackyProcessSelectFiles = settings.filenames !== undefined;
<del>
<del>const cache = new (require('inmemfilecache'))();
<del>const Feed = require('feed').Feed;
<del>const fs = require('fs');
<del>const glob = require('glob');
<del>const Handlebars = require('handlebars');
<del>const hanson = require('hanson');
<del>const marked = require('marked');
<del>const path = require('path');
<del>const Promise = require('promise');
<del>const sitemap = require('sitemap');
<del>const utils = require('./utils');
<del>const moment = require('moment');
<del>const url = require('url');
<del>
<del>//process.title = 'build';
<del>
<del>let numErrors = 0;
<del>function error(...args) {
<del> ++numErrors;
<del> console.error(...args);
<del>}
<del>
<del>const executeP = Promise.denodeify(utils.execute);
<del>
<del>marked.setOptions({
<del> rawHtml: true,
<del> //pedantic: true,
<del>});
<del>
<del>function applyObject(src, dst) {
<del> Object.keys(src).forEach(function(key) {
<del> dst[key] = src[key];
<del> });
<del> return dst;
<del>}
<del>
<del>function mergeObjects() {
<del> const merged = {};
<del> Array.prototype.slice.call(arguments).forEach(function(src) {
<del> applyObject(src, merged);
<del> });
<del> return merged;
<del>}
<del>
<del>function readFile(fileName) {
<del> try {
<del> return cache.readFileSync(fileName, 'utf-8');
<del> } catch (e) {
<del> console.error('could not read:', fileName);
<del> throw e;
<del> }
<del>}
<del>
<del>function readHANSON(fileName) {
<del> const text = readFile(fileName);
<del> try {
<del> return hanson.parse(text);
<del> } catch (e) {
<del> throw new Error(`can not parse: ${fileName}: ${e}`);
<del> }
<del>}
<del>
<del>function writeFileIfChanged(fileName, content) {
<del> if (fs.existsSync(fileName)) {
<del> const old = readFile(fileName);
<del> if (content === old) {
<del> return;
<del> }
<del> }
<del> fs.writeFileSync(fileName, content);
<del> console.log('Wrote: ' + fileName); // eslint-disable-line
<del>}
<del>
<del>function copyFile(src, dst) {
<del> writeFileIfChanged(dst, readFile(src));
<del>}
<del>
<del>function replaceParams(str, params) {
<del> const template = Handlebars.compile(str);
<del> if (Array.isArray(params)) {
<del> params = mergeObjects.apply(null, params.slice().reverse());
<del> }
<del>
<del> return template(params);
<del>}
<del>
<del>function encodeParams(params) {
<del> const values = Object.values(params).filter(v => v);
<del> if (!values.length) {
<del> return '';
<del> }
<del> return '&' + Object.entries(params).map((kv) => {
<del> return `${encodeURIComponent(kv[0])}=${encodeURIComponent(kv[1])}`;
<del> }).join('&');
<del>}
<del>
<del>function encodeQuery(query) {
<del> if (!query) {
<del> return '';
<del> }
<del> return '?' + query.split('&').map(function(pair) {
<del> return pair.split('=').map(function(kv) {
<del> return encodeURIComponent(decodeURIComponent(kv));
<del> }).join('=');
<del> }).join('&');
<del>}
<del>
<del>function encodeUrl(src) {
<del> const u = url.parse(src);
<del> u.search = encodeQuery(u.query);
<del> return url.format(u);
<del>}
<del>
<del>function TemplateManager() {
<del> const templates = {};
<del>
<del> this.apply = function(filename, params) {
<del> let template = templates[filename];
<del> if (!template) {
<del> template = Handlebars.compile(readFile(filename));
<del> templates[filename] = template;
<del> }
<del>
<del> if (Array.isArray(params)) {
<del> params = mergeObjects.apply(null, params.slice().reverse());
<del> }
<del>
<del> return template(params);
<del> };
<del>}
<del>
<del>const templateManager = new TemplateManager();
<del>
<del>Handlebars.registerHelper('include', function(filename, options) {
<del> let context;
<del> if (options && options.hash && options.hash.filename) {
<del> const varName = options.hash.filename;
<del> filename = options.data.root[varName];
<del> context = Object.assign({}, options.data.root, options.hash);
<del> } else {
<del> context = options.data.root;
<del> }
<del> return templateManager.apply(filename, context);
<del>});
<del>
<del>Handlebars.registerHelper('example', function(options) {
<del> options.hash.width = options.hash.width ? 'width: ' + options.hash.width + 'px;' : '';
<del> options.hash.height = options.hash.height ? 'height: ' + options.hash.height + 'px;' : '';
<del> options.hash.caption = options.hash.caption || options.data.root.defaultExampleCaption;
<del> options.hash.examplePath = options.data.root.examplePath;
<del> options.hash.encodedUrl = encodeURIComponent(encodeUrl(options.hash.url));
<del> options.hash.url = encodeUrl(options.hash.url);
<del> options.hash.params = encodeParams({
<del> startPane: options.hash.startPane,
<del> });
<del> return templateManager.apply('build/templates/example.template', options.hash);
<del>});
<del>
<del>Handlebars.registerHelper('diagram', function(options) {
<del>
<del> options.hash.width = options.hash.width || '400';
<del> options.hash.height = options.hash.height || '300';
<del> options.hash.examplePath = options.data.root.examplePath;
<del> options.hash.className = options.hash.className || '';
<del> options.hash.url = encodeUrl(options.hash.url);
<del>
<del> return templateManager.apply('build/templates/diagram.template', options.hash);
<del>});
<del>
<del>Handlebars.registerHelper('image', function(options) {
<del>
<del> options.hash.examplePath = options.data.root.examplePath;
<del> options.hash.className = options.hash.className || '';
<del> options.hash.caption = options.hash.caption || undefined;
<del>
<del> if (options.hash.url.substring(0, 4) === 'http') {
<del> options.hash.examplePath = '';
<del> }
<del>
<del> return templateManager.apply('build/templates/image.template', options.hash);
<del>});
<del>
<del>Handlebars.registerHelper('selected', function(options) {
<del> const key = options.hash.key;
<del> const value = options.hash.value;
<del> const re = options.hash.re;
<del> const sub = options.hash.sub;
<del>
<del> const a = this[key];
<del> let b = options.data.root[value];
<del>
<del> if (re) {
<del> const r = new RegExp(re);
<del> b = b.replace(r, sub);
<del> }
<del>
<del> return a === b ? 'selected' : '';
<del>});
<del>
<del>function slashify(s) {
<del> return s.replace(/\\/g, '/');
<del>}
<del>
<del>function articleFilter(f) {
<del> if (hackyProcessSelectFiles) {
<del> if (!settings.filenames.has(f)) {
<del> return false;
<del> }
<del> }
<del> return !process.env['ARTICLE_FILTER'] || f.indexOf(process.env['ARTICLE_FILTER']) >= 0;
<del>}
<del>
<del>const Builder = function(outBaseDir, options) {
<del>
<del> const g_articlesByLang = {};
<del> let g_articles = [];
<del> let g_langInfo;
<del> let g_originalLangInfo;
<del> const g_langDB = {};
<del> const g_outBaseDir = outBaseDir;
<del> const g_origPath = options.origPath;
<del> const g_originalByFileName = {};
<del>
<del> const toc = readHANSON('toc.hanson');
<del>
<del> // These are the english articles.
<del> const g_origArticles = glob.sync(path.join(g_origPath, '*.md'))
<del> .map(a => path.basename(a))
<del> .filter(a => a !== 'index.md')
<del> .filter(articleFilter);
<del>
<del> const extractHeader = (function() {
<del> const headerRE = /([A-Z0-9_-]+): (.*?)$/i;
<del>
<del> return function(content) {
<del> const metaData = { };
<del> const lines = content.split('\n');
<del> for (;;) {
<del> const line = lines[0].trim();
<del> const m = headerRE.exec(line);
<del> if (!m) {
<del> break;
<del> }
<del> metaData[m[1].toLowerCase()] = m[2];
<del> lines.shift();
<del> }
<del> return {
<del> content: lines.join('\n'),
<del> headers: metaData,
<del> };
<del> };
<del> }());
<del>
<del> const parseMD = function(content) {
<del> return extractHeader(content);
<del> };
<del>
<del> const loadMD = function(contentFileName) {
<del> const content = cache.readFileSync(contentFileName, 'utf-8');
<del> const data = parseMD(content);
<del> data.link = contentFileName.replace(/\\/g, '/').replace(/\.md$/, '.html');
<del> return data;
<del> };
<del>
<del> function extractHandlebars(content) {
<del> const tripleRE = /\{\{\{.*?\}\}\}/g;
<del> const doubleRE = /\{\{\{.*?\}\}\}/g;
<del>
<del> let numExtractions = 0;
<del> const extractions = {
<del> };
<del>
<del> function saveHandlebar(match) {
<del> const id = '==HANDLEBARS_ID_' + (++numExtractions) + '==';
<del> extractions[id] = match;
<del> return id;
<del> }
<del>
<del> content = content.replace(tripleRE, saveHandlebar);
<del> content = content.replace(doubleRE, saveHandlebar);
<del>
<del> return {
<del> content: content,
<del> extractions: extractions,
<del> };
<del> }
<del>
<del> function insertHandlebars(info, content) {
<del> const handlebarRE = /==HANDLEBARS_ID_\d+==/g;
<del>
<del> function restoreHandlebar(match) {
<del> const value = info.extractions[match];
<del> if (value === undefined) {
<del> throw new Error('no match restoring handlebar for: ' + match);
<del> }
<del> return value;
<del> }
<del>
<del> content = content.replace(handlebarRE, restoreHandlebar);
<del>
<del> return content;
<del> }
<del>
<del> function isSameDomain(url, pageUrl) {
<del> const fdq1 = new URL(pageUrl);
<del> const fdq2 = new URL(url, pageUrl);
<del> return fdq1.origin === fdq2.origin;
<del> }
<del>
<del> function getUrlPath(url) {
<del> // yes, this is a hack
<del> const q = url.indexOf('?');
<del> return q >= 0 ? url.substring(0, q) : url;
<del> }
<del>
<del> // Try top fix relative links. This *should* only
<del> // happen in translations
<del> const iframeLinkRE = /(<iframe[\s\S]*?\s+src=")(.*?)(")/g;
<del> const imgLinkRE = /(<img[\s\S]*?\s+src=")(.*?)(")/g;
<del> const aLinkRE = /(<a[\s\S]*?\s+href=")(.*?)(")/g;
<del> const mdLinkRE = /(\[[\s\S]*?\]\()(.*?)(\))/g;
<del> const handlebarLinkRE = /({{{.*?\s+url=")(.*?)(")/g;
<del> const linkREs = [
<del> iframeLinkRE,
<del> imgLinkRE,
<del> aLinkRE,
<del> mdLinkRE,
<del> handlebarLinkRE,
<del> ];
<del> function hackRelLinks(content, pageUrl) {
<del> // console.log('---> pageUrl:', pageUrl);
<del> function fixRelLink(m, prefix, url, suffix) {
<del> if (isSameDomain(url, pageUrl)) {
<del> // a link that starts with "../" should be "../../" if it's in a translation
<del> // a link that starts with "resources" should be "../resources" if it's in a translation
<del> if (url.startsWith('../') ||
<del> url.startsWith('resources')) {
<del> // console.log(' url:', url);
<del> return `${prefix}../${url}${suffix}`;
<del> }
<del> }
<del> return m;
<del> }
<del>
<del> return content
<del> .replace(imgLinkRE, fixRelLink)
<del> .replace(aLinkRE, fixRelLink)
<del> .replace(iframeLinkRE, fixRelLink);
<del> }
<del>
<del> /**
<del> * Get all the local urls based on a regex that has <prefix><url><suffix>
<del> */
<del> function getUrls(regex, str) {
<del> const links = new Set();
<del> let m;
<del> do {
<del> m = regex.exec(str);
<del> if (m && m[2][0] !== '#' && isSameDomain(m[2], 'http://example.com/a/b/c/d')) {
<del> links.add(getUrlPath(m[2]));
<del> }
<del> } while (m);
<del> return links;
<del> }
<del>
<del> /**
<del> * Get all the local links in content
<del> */
<del> function getLinks(content) {
<del> return new Set(linkREs.map(re => [...getUrls(re, content)]).flat());
<del> }
<del>
<del> function fixUrls(regex, content, origLinks) {
<del> return content.replace(regex, (m, prefix, url, suffix) => {
<del> const q = url.indexOf('?');
<del> const urlPath = q >= 0 ? url.substring(0, q) : url;
<del> const urlQuery = q >= 0 ? url.substring(q) : '';
<del> if (!origLinks.has(urlPath) &&
<del> isSameDomain(urlPath, 'https://foo.com/a/b/c/d.html') &&
<del> !(/\/..\/^/.test(urlPath)) && // hacky test for link to main page. Example /webgl/lessons/ja/
<del> urlPath[0] !== '#') { // test for same page anchor -- bad test :(
<del> for (const origLink of origLinks) {
<del> if (urlPath.endsWith(origLink)) {
<del> const newUrl = `${origLink}${urlQuery}`;
<del> console.log(' fixing:', url, 'to', newUrl);
<del> return `${prefix}${newUrl}${suffix}`;
<del> }
<del> }
<del> error('could not fix:', url);
<del> }
<del> return m;
<del> });
<del> }
<del>
<del> const applyTemplateToContent = function(templatePath, contentFileName, outFileName, opt_extra, data) {
<del> // Call prep's Content which parses the HTML. This helps us find missing tags
<del> // should probably call something else.
<del> //Convert(md_content)
<del> const relativeOutName = slashify(outFileName).substring(g_outBaseDir.length);
<del> const pageUrl = `${settings.baseUrl}${relativeOutName}`;
<del> const metaData = data.headers;
<del> const content = data.content;
<del> //console.log(JSON.stringify(metaData, undefined, ' '));
<del> const info = extractHandlebars(content);
<del> let html = marked(info.content);
<del> // HACK! :-(
<del> // There's probably a way to do this in marked
<del> html = html.replace(/<pre><code/g, '<pre class="prettyprint"><code');
<del> // HACK! :-(
<del> if (opt_extra && opt_extra.home && opt_extra.home.length > 1) {
<del> html = hackRelLinks(html, pageUrl);
<del> }
<del> html = insertHandlebars(info, html);
<del> html = replaceParams(html, [opt_extra, g_langInfo]);
<del> const pathRE = new RegExp(`^\\/${settings.rootFolder}\\/lessons\\/$`);
<del> const langs = Object.keys(g_langDB).map((name) => {
<del> const lang = g_langDB[name];
<del> const url = slashify(path.join(lang.basePath, path.basename(outFileName)))
<del> .replace('index.html', '')
<del> .replace(pathRE, '/');
<del> return {
<del> lang: lang.lang,
<del> language: lang.language,
<del> url: url,
<del> };
<del> });
<del> metaData['content'] = html;
<del> metaData['langs'] = langs;
<del> metaData['src_file_name'] = slashify(contentFileName);
<del> metaData['dst_file_name'] = relativeOutName;
<del> metaData['basedir'] = '';
<del> metaData['toc'] = opt_extra.toc;
<del> metaData['tocHtml'] = g_langInfo.tocHtml;
<del> metaData['templateOptions'] = opt_extra.templateOptions;
<del> metaData['langInfo'] = g_langInfo;
<del> metaData['url'] = pageUrl;
<del> metaData['relUrl'] = relativeOutName;
<del> metaData['screenshot'] = `${settings.baseUrl}/${settings.rootFolder}/lessons/resources/${settings.siteThumbnail}`;
<del> const basename = path.basename(contentFileName, '.md');
<del> ['.jpg', '.png'].forEach(function(ext) {
<del> const filename = path.join(settings.rootFolder, 'lessons', 'screenshots', basename + ext);
<del> if (fs.existsSync(filename)) {
<del> metaData['screenshot'] = `${settings.baseUrl}/${settings.rootFolder}/lessons/screenshots/${basename}${ext}`;
<del> }
<del> });
<del> const output = templateManager.apply(templatePath, metaData);
<del> writeFileIfChanged(outFileName, output);
<del>
<del> return metaData;
<del> };
<del>
<del> const applyTemplateToFile = function(templatePath, contentFileName, outFileName, opt_extra) {
<del> console.log('processing: ', contentFileName); // eslint-disable-line
<del> opt_extra = opt_extra || {};
<del> const data = loadMD(contentFileName);
<del> const metaData = applyTemplateToContent(templatePath, contentFileName, outFileName, opt_extra, data);
<del> g_articles.push(metaData);
<del> };
<del>
<del> const applyTemplateToFiles = function(templatePath, filesSpec, extra) {
<del> const files = glob
<del> .sync(filesSpec)
<del> .sort()
<del> .filter(articleFilter);
<del>
<del> const byFilename = {};
<del> files.forEach((fileName) => {
<del> const data = loadMD(fileName);
<del> if (!data.headers.category) {
<del> throw new Error(`no catgeory for article: ${fileName}`);
<del> }
<del> byFilename[path.basename(fileName)] = data;
<del> });
<del>
<del> // HACK
<del> if (extra.lang === 'en') {
<del> Object.assign(g_originalByFileName, byFilename);
<del> g_originalLangInfo = g_langInfo;
<del> }
<del>
<del> function getLocalizedCategory(category) {
<del> const localizedCategory = g_langInfo.categoryMapping[category];
<del> if (localizedCategory) {
<del> return localizedCategory;
<del> }
<del> console.error(`no localization for category: ${category}`);
<del> const categoryName = g_originalLangInfo.categoryMapping[category];
<del> if (!categoryName) {
<del> throw new Error(`no English mapping for category: ${category}`);
<del> }
<del> return categoryName;
<del> }
<del>
<del> function addLangToLink(link) {
<del> return extra.lang === 'en'
<del> ? link
<del> : `${path.dirname(link)}/${extra.lang}/${path.basename(link)}`;
<del> }
<del>
<del> function tocLink(fileName) {
<del> let data = byFilename[fileName];
<del> let link;
<del> if (data) {
<del> link = data.link;
<del> } else {
<del> data = g_originalByFileName[fileName];
<del> link = addLangToLink(data.link);
<del> }
<del> const toc = data.headers.toc;
<del> if (toc === '#') {
<del> return [...data.content.matchAll(/<a\s*id="(.*?)"\s*data-toc="(.*?)"\s*><\/a>/g)].map(([, id, title]) => {
<del> const hashlink = `${link}#${id}`;
<del> return `<li><a href="/${hashlink}">${title}</a></li>`;
<del> }).join('\n');
<del> }
<del> return `<li><a href="/${link}">${toc}</a></li>`;
<del> }
<del>
<del> function makeToc(toc) {
<del> return `<ul>${
<del> Object.entries(toc).map(([category, files]) => ` <li>${getLocalizedCategory(category)}</li>
<del> <ul>
<del> ${Array.isArray(files)
<del> ? files.map(tocLink).join('\n')
<del> : makeToc(files)
<del> }
<del> </ul>`
<del> ).join('\n')
<del> }</ul>`;
<del> }
<del>
<del> if (!hackyProcessSelectFiles) {
<del> g_langInfo.tocHtml = makeToc(toc);
<del> }
<del>
<del> files.forEach(function(fileName) {
<del> const ext = path.extname(fileName);
<del> const baseName = fileName.substr(0, fileName.length - ext.length);
<del> const outFileName = path.join(outBaseDir, baseName + '.html');
<del> applyTemplateToFile(templatePath, fileName, outFileName, extra);
<del> });
<del>
<del> };
<del>
<del> const addArticleByLang = function(article, lang) {
<del> const filename = path.basename(article.dst_file_name);
<del> let articleInfo = g_articlesByLang[filename];
<del> const url = `${settings.baseUrl}${article.dst_file_name}`;
<del> if (!articleInfo) {
<del> articleInfo = {
<del> url: url,
<del> changefreq: 'monthly',
<del> links: [],
<del> };
<del> g_articlesByLang[filename] = articleInfo;
<del> }
<del> articleInfo.links.push({
<del> url: url,
<del> lang: lang,
<del> });
<del> };
<del>
<del> const getLanguageSelection = function(lang) {
<del> const lessons = lang.lessons;
<del> const langInfo = readHANSON(path.join(lessons, 'langinfo.hanson'));
<del> langInfo.langCode = langInfo.langCode || lang.lang;
<del> langInfo.home = lang.home;
<del> g_langDB[lang.lang] = {
<del> lang: lang.lang,
<del> language: langInfo.language,
<del> basePath: '/' + lessons,
<del> langInfo: langInfo,
<del> };
<del> };
<del>
<del> this.preProcess = function(langs) {
<del> langs.forEach(getLanguageSelection);
<del> };
<del>
<del> this.process = function(options) {
<del> console.log('Processing Lang: ' + options.lang); // eslint-disable-line
<del> g_articles = [];
<del> g_langInfo = g_langDB[options.lang].langInfo;
<del>
<del> applyTemplateToFiles(options.template, path.join(options.lessons, settings.lessonGrep), options);
<del>
<del> const articlesFilenames = g_articles.map(a => path.basename(a.src_file_name));
<del>
<del> // should do this first was easier to add here
<del> if (options.lang !== 'en') {
<del> const existing = g_origArticles.filter(name => articlesFilenames.indexOf(name) >= 0);
<del> existing.forEach((name) => {
<del> const origMdFilename = path.join(g_origPath, name);
<del> const transMdFilename = path.join(g_origPath, options.lang, name);
<del> const origLinks = getLinks(loadMD(origMdFilename).content);
<del> const transLinks = getLinks(loadMD(transMdFilename).content);
<del>
<del> if (process.env['ARTICLE_VERBOSE']) {
<del> console.log('---[', transMdFilename, ']---');
<del> console.log('origLinks: ---\n ', [...origLinks].join('\n '));
<del> console.log('transLinks: ---\n ', [...transLinks].join('\n '));
<del> }
<del>
<del> let show = true;
<del> transLinks.forEach((link) => {
<del> if (!origLinks.has(link)) {
<del> if (show) {
<del> show = false;
<del> error('---[', transMdFilename, ']---');
<del> }
<del> error(' link:[', link, '] not found in English file');
<del> }
<del> });
<del>
<del> if (!show && process.env['ARTICLE_FIX']) {
<del> // there was an error, try to auto-fix
<del> let fixedMd = fs.readFileSync(transMdFilename, {encoding: 'utf8'});
<del> linkREs.forEach((re) => {
<del> fixedMd = fixUrls(re, fixedMd, origLinks);
<del> });
<del> fs.writeFileSync(transMdFilename, fixedMd);
<del> }
<del> });
<del> }
<del>
<del> if (hackyProcessSelectFiles) {
<del> return Promise.resolve();
<del> }
<del>
<del> // generate place holders for non-translated files
<del> const missing = g_origArticles.filter(name => articlesFilenames.indexOf(name) < 0);
<del> missing.forEach(name => {
<del> const ext = path.extname(name);
<del> const baseName = name.substr(0, name.length - ext.length);
<del> const outFileName = path.join(outBaseDir, options.lessons, baseName + '.html');
<del> const data = Object.assign({}, loadMD(path.join(g_origPath, name)));
<del> data.content = g_langInfo.missing;
<del> const extra = {
<del> origLink: '/' + slashify(path.join(g_origPath, baseName + '.html')),
<del> toc: options.toc,
<del> };
<del> console.log(' generating missing:', outFileName); // eslint-disable-line
<del> applyTemplateToContent(
<del> 'build/templates/missing.template',
<del> path.join(options.lessons, 'langinfo.hanson'),
<del> outFileName,
<del> extra,
<del> data);
<del> });
<del>
<del> function utcMomentFromGitLog(result, filename, timeType) {
<del> const dateStr = result.stdout.split('\n')[0].trim();
<del> const utcDateStr = dateStr
<del> .replace(/"/g, '') // WTF to these quotes come from!??!
<del> .replace(' ', 'T')
<del> .replace(' ', '')
<del> .replace(/(\d\d)$/, ':$1');
<del> const m = moment.utc(utcDateStr);
<del> if (m.isValid()) {
<del> return m;
<del> }
<del> const stat = fs.statSync(filename);
<del> return moment(stat[timeType]);
<del> }
<del>
<del> const tasks = g_articles.map((article) => {
<del> return function() {
<del> return executeP('git', [
<del> 'log',
<del> '--format="%ci"',
<del> '--name-only',
<del> '--diff-filter=A',
<del> article.src_file_name,
<del> ]).then((result) => {
<del> article.dateAdded = utcMomentFromGitLog(result, article.src_file_name, 'ctime');
<del> });
<del> };
<del> }).concat(g_articles.map((article) => {
<del> return function() {
<del> return executeP('git', [
<del> 'log',
<del> '--format="%ci"',
<del> '--name-only',
<del> '--max-count=1',
<del> article.src_file_name,
<del> ]).then((result) => {
<del> article.dateModified = utcMomentFromGitLog(result, article.src_file_name, 'mtime');
<del> });
<del> };
<del> }));
<del>
<del> return tasks.reduce(function(cur, next){
<del> return cur.then(next);
<del> }, Promise.resolve()).then(function() {
<del> let articles = g_articles.filter(function(article) {
<del> return article.dateAdded !== undefined;
<del> });
<del> articles = articles.sort(function(a, b) {
<del> return b.dateAdded - a.dateAdded;
<del> });
<del>
<del> if (articles.length) {
<del> const feed = new Feed({
<del> title: g_langInfo.title,
<del> description: g_langInfo.description,
<del> link: g_langInfo.link,
<del> image: `${settings.baseUrl}/${settings.rootFolder}/lessons/resources/${settings.siteThumbnail}`,
<del> date: articles[0].dateModified.toDate(),
<del> published: articles[0].dateModified.toDate(),
<del> updated: articles[0].dateModified.toDate(),
<del> author: {
<del> name: `${settings.siteName} Contributors`,
<del> link: `${settings.baseUrl}/contributors.html`,
<del> },
<del> });
<del>
<del> articles.forEach(function(article) {
<del> feed.addItem({
<del> title: article.title,
<del> link: `${settings.baseUrl}${article.dst_file_name}`,
<del> description: '',
<del> author: [
<del> {
<del> name: `${settings.siteName} Contributors`,
<del> link: `${settings.baseUrl}/contributors.html`,
<del> },
<del> ],
<del> // contributor: [
<del> // ],
<del> date: article.dateModified.toDate(),
<del> published: article.dateAdded.toDate(),
<del> // image: posts[key].image
<del> });
<del>
<del> addArticleByLang(article, options.lang);
<del> });
<del>
<del> try {
<del> const outPath = path.join(g_outBaseDir, options.lessons, 'atom.xml');
<del> console.log('write:', outPath); // eslint-disable-line
<del> writeFileIfChanged(outPath, feed.atom1());
<del> } catch (err) {
<del> return Promise.reject(err);
<del> }
<del> } else {
<del> console.log('no articles!'); // eslint-disable-line
<del> }
<del>
<del> return Promise.resolve();
<del> }).then(function() {
<del> // this used to insert a table of contents
<del> // but it was useless being auto-generated
<del> applyTemplateToFile('build/templates/index.template', path.join(options.lessons, 'index.md'), path.join(g_outBaseDir, options.lessons, 'index.html'), {
<del> table_of_contents: '',
<del> templateOptions: g_langInfo,
<del> tocHtml: g_langInfo.tocHtml,
<del> });
<del> return Promise.resolve();
<del> }, function(err) {
<del> error('ERROR!:');
<del> error(err);
<del> if (err.stack) {
<del> error(err.stack); // eslint-disable-line
<del> }
<del> throw new Error(err.toString());
<del> });
<del> };
<del>
<del> this.writeGlobalFiles = function() {
<del> const sm = sitemap.createSitemap({
<del> hostname: settings.baseUrl,
<del> cacheTime: 600000,
<del> });
<del> const articleLangs = { };
<del> Object.keys(g_articlesByLang).forEach(function(filename) {
<del> const article = g_articlesByLang[filename];
<del> const langs = {};
<del> article.links.forEach(function(link) {
<del> langs[link.lang] = true;
<del> });
<del> articleLangs[filename] = langs;
<del> sm.add(article);
<del> });
<del> // var langInfo = {
<del> // articles: articleLangs,
<del> // langs: g_langDB,
<del> // };
<del> // var langJS = 'window.langDB = ' + JSON.stringify(langInfo, null, 2);
<del> // writeFileIfChanged(path.join(g_outBaseDir, 'langdb.js'), langJS);
<del> writeFileIfChanged(path.join(g_outBaseDir, 'sitemap.xml'), sm.toString());
<del> copyFile(path.join(g_outBaseDir, `${settings.rootFolder}/lessons/atom.xml`), path.join(g_outBaseDir, 'atom.xml'));
<del> copyFile(path.join(g_outBaseDir, `${settings.rootFolder}/lessons/index.html`), path.join(g_outBaseDir, 'index.html'));
<del>
<del> applyTemplateToFile('build/templates/index.template', 'contributors.md', path.join(g_outBaseDir, 'contributors.html'), {
<del> table_of_contents: '',
<del> templateOptions: '',
<del> });
<del>
<del> {
<del> const filename = path.join(settings.outDir, 'link-check.html');
<del> const html = `
<del> <html>
<del> <body>
<del> ${langs.map(lang => `<a href="${lang.home}">${lang.lang}</a>`).join('\n')}
<del> </body>
<del> </html>
<del> `;
<del> writeFileIfChanged(filename, html);
<del> }
<del> };
<del>
<del>
<del>};
<del>
<del>const b = new Builder(settings.outDir, {
<del> origPath: `${settings.rootFolder}/lessons`, // english articles
<del>});
<del>
<del>const readdirs = function(dirpath) {
<del> const dirsOnly = function(filename) {
<del> const stat = fs.statSync(filename);
<del> return stat.isDirectory();
<del> };
<del>
<del> const addPath = function(filename) {
<del> return path.join(dirpath, filename);
<del> };
<del>
<del> return fs.readdirSync(`${settings.rootFolder}/lessons`)
<del> .map(addPath)
<del> .filter(dirsOnly);
<del>};
<del>
<del>const isLangFolder = function(dirname) {
<del> const filename = path.join(dirname, 'langinfo.hanson');
<del> return fs.existsSync(filename);
<del>};
<del>
<del>
<del>const pathToLang = function(filename) {
<del> const lang = path.basename(filename);
<del> const lessonBase = `${settings.rootFolder}/lessons`;
<del> const lessons = `${lessonBase}/${lang}`;
<del> return {
<del> lang,
<del> toc: `${settings.rootFolder}/lessons/${lang}/toc.html`,
<del> lessons: `${lessonBase}/${lang}`,
<del> template: 'build/templates/lesson.template',
<del> examplePath: `/${lessonBase}/`,
<del> home: `/${lessons}/`,
<del> };
<del>};
<del>
<del>let langs = [
<del> // English is special (sorry it's where I started)
<del> {
<del> template: 'build/templates/lesson.template',
<del> lessons: `${settings.rootFolder}/lessons`,
<del> lang: 'en',
<del> toc: `${settings.rootFolder}/lessons/toc.html`,
<del> examplePath: `/${settings.rootFolder}/lessons/`,
<del> home: '/',
<del> },
<del>];
<del>
<del>langs = langs.concat(readdirs(`${settings.rootFolder}/lessons`)
<del> .filter(isLangFolder)
<del> .map(pathToLang));
<del>
<del>b.preProcess(langs);
<del>
<del>if (hackyProcessSelectFiles) {
<del> const langsInFilenames = new Set();
<del> [...settings.filenames].forEach((filename) => {
<del> const m = /lessons\/(\w{2}|\w{5})\//.exec(filename);
<del> const lang = m ? m[1] : 'en';
<del> langsInFilenames.add(lang);
<del> });
<del> langs = langs.filter(lang => langsInFilenames.has(lang.lang));
<del>}
<del>
<del>const tasks = langs.map(function(lang) {
<del> return function() {
<del> return b.process(lang);
<del> };
<del>});
<del>
<del>return tasks.reduce(function(cur, next) {
<del> return cur.then(next);
<del>}, Promise.resolve()).then(function() {
<del> if (!hackyProcessSelectFiles) {
<del> b.writeGlobalFiles(langs);
<del> }
<del> return numErrors ? Promise.reject(new Error(`${numErrors} errors`)) : Promise.resolve();
<del>}).finally(() => {
<del> cache.clear();
<del>});
<del>
<del>};
<del>
<ide><path>build/js/utils.js
<del>/*
<del> * Copyright 2014, Gregg Tavares.
<del> * All rights reserved.
<del> *
<del> * Redistribution and use in source and binary forms, with or without
<del> * modification, are permitted provided that the following conditions are
<del> * met:
<del> *
<del> * * Redistributions of source code must retain the above copyright
<del> * notice, this list of conditions and the following disclaimer.
<del> * * Redistributions in binary form must reproduce the above
<del> * copyright notice, this list of conditions and the following disclaimer
<del> * in the documentation and/or other materials provided with the
<del> * distribution.
<del> * * Neither the name of Gregg Tavares. nor the names of its
<del> * contributors may be used to endorse or promote products derived from
<del> * this software without specific prior written permission.
<del> *
<del> * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
<del> * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
<del> * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
<del> * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
<del> * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
<del> * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
<del> * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
<del> * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
<del> * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
<del> * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
<del> * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
<del> */
<del>
<del>/*eslint-env node*/
<del>/*eslint no-console: 0*/
<del>
<del>'use strict';
<del>
<del>const execute = function(cmd, args, callback) {
<del> const spawn = require('child_process').spawn;
<del>
<del> const proc = spawn(cmd, args);
<del> let stdout = [];
<del> let stderr = [];
<del>
<del> proc.stdout.setEncoding('utf8');
<del> proc.stdout.on('data', function(data) {
<del> const str = data.toString();
<del> const lines = str.split(/(\r?\n)/g);
<del> stdout = stdout.concat(lines);
<del> });
<del>
<del> proc.stderr.setEncoding('utf8');
<del> proc.stderr.on('data', function(data) {
<del> const str = data.toString();
<del> const lines = str.split(/(\r?\n)/g);
<del> stderr = stderr.concat(lines);
<del> });
<del>
<del> proc.on('close', function(code) {
<del> const result = {stdout: stdout.join('\n'), stderr: stderr.join('\n')};
<del> if (parseInt(code) !== 0) {
<del> callback('exit code ' + code, result);
<del> } else {
<del> callback(null, result);
<del> }
<del> });
<del>};
<del>
<del>exports.execute = execute;
<del> | 3 |
PHP | PHP | add getfallbackchannelname method | bb57655967f31b426b968a46ca738d6a245e873d | <ide><path>src/Illuminate/Log/LogManager.php
<ide> protected function formatter()
<ide> });
<ide> }
<ide>
<add> /**
<add> * Get fallback log channel name.
<add> *
<add> * @return string
<add> */
<add> protected function getFallbackChannelName()
<add> {
<add> return $this->app->bound('env') ? $this->app->environment() : 'production';
<add> }
<add>
<ide> /**
<ide> * Get the log connection configuration.
<ide> *
<ide><path>src/Illuminate/Log/LoggerConfiguration.php
<ide> protected function level(array $config)
<ide> protected function parseChannel(array $config)
<ide> {
<ide> if (! isset($config['name'])) {
<del> return $this->app->bound('env') ? $this->app->environment() : 'production';
<add> return $this->getFallbackChannelName();
<ide> }
<ide>
<ide> return $config['name'];
<ide> }
<add>
<add> /**
<add> * Get fallback log channel name.
<add> *
<add> * @return string
<add> */
<add> abstract protected function getFallbackChannelName();
<ide> } | 2 |
Text | Text | add cli reference docs for root-level commands | 9cea26bc7797789e4be2309153bd49c234544d2d | <ide><path>docs/reference/commandline/container.md
<add>
<add>---
<add>title: "container"
<add>description: "The container command description and usage"
<add>keywords: "container"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># container
<add>
<add>```markdown
<add>Usage: docker container COMMAND
<add>
<add>Manage containers
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> attach Attach to a running container
<add> commit Create a new image from a container's changes
<add> cp Copy files/folders between a container and the local filesystem
<add> create Create a new container
<add> diff Inspect changes to files or directories on a container's filesystem
<add> exec Run a command in a running container
<add> export Export a container's filesystem as a tar archive
<add> inspect Display detailed information on one or more containers
<add> kill Kill one or more running containers
<add> logs Fetch the logs of a container
<add> ls List containers
<add> pause Pause all processes within one or more containers
<add> port List port mappings or a specific mapping for the container
<add> prune Remove all stopped containers
<add> rename Rename a container
<add> restart Restart one or more containers
<add> rm Remove one or more containers
<add> run Run a command in a new container
<add> start Start one or more stopped containers
<add> stats Display a live stream of container(s) resource usage statistics
<add> stop Stop one or more running containers
<add> top Display the running processes of a container
<add> unpause Unpause all processes within one or more containers
<add> update Update configuration of one or more containers
<add> wait Block until one or more containers stop, then print their exit codes
<add>
<add>Run 'docker container COMMAND --help' for more information on a command.
<add>
<add>```
<add>
<add>## Description
<add>
<add>Manage containers.
<add>
<ide><path>docs/reference/commandline/image.md
<add>
<add>---
<add>title: "image"
<add>description: "The image command description and usage"
<add>keywords: "image"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># image
<add>
<add>```markdown
<add>Usage: docker image COMMAND
<add>
<add>Manage images
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> build Build an image from a Dockerfile
<add> history Show the history of an image
<add> import Import the contents from a tarball to create a filesystem image
<add> inspect Display detailed information on one or more images
<add> load Load an image from a tar archive or STDIN
<add> ls List images
<add> prune Remove unused images
<add> pull Pull an image or a repository from a registry
<add> push Push an image or a repository to a registry
<add> rm Remove one or more images
<add> save Save one or more images to a tar archive (streamed to STDOUT by default)
<add> tag Create a tag TARGET_IMAGE that refers to SOURCE_IMAGE
<add>
<add>Run 'docker image COMMAND --help' for more information on a command.
<add>
<add>```
<add>
<add>## Description
<add>
<add>Manage images.
<ide><path>docs/reference/commandline/network.md
<add>---
<add>title: "network"
<add>description: "The network command description and usage"
<add>keywords: "network"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># network
<add>
<add>```markdown
<add>Usage: docker network COMMAND
<add>
<add>Manage networks
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> connect Connect a container to a network
<add> create Create a network
<add> disconnect Disconnect a container from a network
<add> inspect Display detailed information on one or more networks
<add> ls List networks
<add> prune Remove all unused networks
<add> rm Remove one or more networks
<add>
<add>Run 'docker network COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage networks. You can use subcommand to create, list, inspect, remove,
<add>connect and disconnect networks.
<add>
<add>## Related commands
<add>
<add>* [network create](network_create.md)
<add>* [network inspect](network_inspect.md)
<add>* [network list](network_list.md)
<add>* [network rm](network_rm.md)
<add>* [network prune](network_prune.md)
<ide><path>docs/reference/commandline/node.md
<add>
<add>---
<add>title: "node"
<add>description: "The node command description and usage"
<add>keywords: "node"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># node
<add>
<add>```markdown
<add>Usage: docker node COMMAND
<add>
<add>Manage Swarm nodes
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> demote Demote one or more nodes from manager in the swarm
<add> inspect Display detailed information on one or more nodes
<add> ls List nodes in the swarm
<add> promote Promote one or more nodes to manager in the swarm
<add> ps List tasks running on one or more nodes, defaults to current node
<add> rm Remove one or more nodes from the swarm
<add> update Update a node
<add>
<add>Run 'docker node COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage nodes.
<add>
<ide><path>docs/reference/commandline/plugin.md
<add>---
<add>title: "plugin"
<add>description: "The plugin command description and usage"
<add>keywords: "plugin"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># plugin
<add>
<add>```markdown
<add>Usage: docker plugin COMMAND
<add>
<add>Manage plugins
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> create Create a plugin from a rootfs and configuration. Plugin data directory must contain config.json and rootfs directory.
<add> disable Disable a plugin
<add> enable Enable a plugin
<add> inspect Display detailed information on one or more plugins
<add> install Install a plugin
<add> ls List plugins
<add> push Push a plugin to a registry
<add> rm Remove one or more plugins
<add> set Change settings for a plugin
<add> upgrade Upgrade an existing plugin
<add>
<add>Run 'docker plugin COMMAND --help' for more information on a command.
<add>
<add>```
<add>
<add>## Description
<add>
<add>Manage plugins.
<ide><path>docs/reference/commandline/secret.md
<add>---
<add>title: "secret"
<add>description: "The secret command description and usage"
<add>keywords: "secret"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># secret
<add>
<add>```markdown
<add>Usage: docker secret COMMAND
<add>
<add>Manage Docker secrets
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> create Create a secret from a file or STDIN as content
<add> inspect Display detailed information on one or more secrets
<add> ls List secrets
<add> rm Remove one or more secrets
<add>
<add>Run 'docker secret COMMAND --help' for more information on a command.
<add>
<add>```
<add>
<add>## Description
<add>
<add>Manage secrets.
<add>
<add>## Related commands
<add>
<add>* [secret create](secret_create.md)
<add>* [secret inspect](secret_inspect.md)
<add>* [secret list](secret_list.md)
<add>* [secret rm](secret_rm.md)
<ide><path>docs/reference/commandline/service.md
<add>---
<add>title: "service"
<add>description: "The service command description and usage"
<add>keywords: "service"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># service
<add>
<add>```markdown
<add>Usage: docker service COMMAND
<add>
<add>Manage services
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> create Create a new service
<add> inspect Display detailed information on one or more services
<add> logs Fetch the logs of a service
<add> ls List services
<add> ps List the tasks of a service
<add> rm Remove one or more services
<add> scale Scale one or multiple replicated services
<add> update Update a service
<add>
<add>Run 'docker service COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage services.
<add>
<ide><path>docs/reference/commandline/stack.md
<add>---
<add>title: "stack"
<add>description: "The stack command description and usage"
<add>keywords: "stack"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># stack
<add>
<add>```markdown
<add>Usage: docker stack COMMAND
<add>
<add>Manage Docker stacks
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> deploy Deploy a new stack or update an existing stack
<add> ls List stacks
<add> ps List the tasks in the stack
<add> rm Remove the stack
<add> services List the services in the stack
<add>
<add>Run 'docker stack COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage stacks.
<add>
<ide><path>docs/reference/commandline/swarm.md
<add>---
<add>title: "swarm"
<add>description: "The swarm command description and usage"
<add>keywords: "swarm"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># swarm
<add>
<add>```markdown
<add>Usage: docker swarm COMMAND
<add>
<add>Manage Swarm
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> init Initialize a swarm
<add> join Join a swarm as a node and/or manager
<add> join-token Manage join tokens
<add> leave Leave the swarm
<add> unlock Unlock swarm
<add> unlock-key Manage the unlock key
<add> update Update the swarm
<add>
<add>Run 'docker swarm COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage the swarm.
<ide><path>docs/reference/commandline/system.md
<add>---
<add>title: "system"
<add>description: "The system command description and usage"
<add>keywords: "system"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># system
<add>
<add>```markdown
<add>Usage: docker system COMMAND
<add>
<add>Manage Docker
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> df Show docker disk usage
<add> events Get real time events from the server
<add> info Display system-wide information
<add> prune Remove unused data
<add>
<add>Run 'docker system COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage docker.
<ide><path>docs/reference/commandline/volume.md
<add>---
<add>title: "volume"
<add>description: "The volume command description and usage"
<add>keywords: "volume"
<add>---
<add>
<add><!-- This file is maintained within the docker/docker Github
<add> repository at https://github.com/docker/docker/. Make all
<add> pull requests against that repo. If you see this file in
<add> another repository, consider it read-only there, as it will
<add> periodically be overwritten by the definitive file. Pull
<add> requests which include edits to this file in other repositories
<add> will be rejected.
<add>-->
<add>
<add># volume
<add>
<add>```markdown
<add>Usage: docker volume COMMAND
<add>
<add>Manage volumes
<add>
<add>Options:
<add> --help Print usage
<add>
<add>Commands:
<add> create Create a volume
<add> inspect Display detailed information on one or more volumes
<add> ls List volumes
<add> prune Remove all unused volumes
<add> rm Remove one or more volumes
<add>
<add>Run 'docker volume COMMAND --help' for more information on a command.
<add>```
<add>
<add>## Description
<add>
<add>Manage volumes. You can use subcommand to create, list, inspect, remove
<add>volumes.
<add>
<add>## Related commands
<add>
<add>* [volume create](volume_create.md)
<add>* [volume inspect](volume_inspect.md)
<add>* [volume list](volume_list.md)
<add>* [volume rm](volume_rm.md)
<add>* [volume prune](volume_prune.md)
<add>* [Understand Data Volumes](https://docs.docker.com/engine/tutorials/dockervolumes/) | 11 |
Python | Python | avoid redundant numpy conversion | 037e2b17bbea65da897b768d7d06f5ea275c1724 | <ide><path>official/vision/beta/evaluation/panoptic_quality_evaluator.py
<ide> def update_state(self, groundtruths, predictions):
<ide> ValueError: if the required prediction or groundtruth fields are not
<ide> present in the incoming `predictions` or `groundtruths`.
<ide> """
<del> groundtruths, predictions = self._convert_to_numpy(groundtruths,
<del> predictions)
<ide> for k in self._required_prediction_fields:
<ide> if k not in predictions:
<ide> raise ValueError(
<ide> def update_state(self, groundtruths, predictions):
<ide> self._pq_metric_module.compare_and_accumulate(
<ide> _groundtruths, _predictions)
<ide> else:
<add> groundtruths, predictions = self._convert_to_numpy(
<add> groundtruths, predictions)
<ide> self._pq_metric_module.compare_and_accumulate(groundtruths, predictions) | 1 |
Text | Text | add a note on usage scope of aliasedbuffer | 447b3907c7b19bf53236538cb53968a7c2aca4b2 | <ide><path>CPP_STYLE_GUIDE.md
<ide> * [Use explicit pointer comparisons](#use-explicit-pointer-comparisons)
<ide> * [Ownership and Smart Pointers](#ownership-and-smart-pointers)
<ide> * [Avoid non-const references](#avoid-non-const-references)
<add> * [Use AliasedBuffers to manipulate TypedArrays](#use-aliasedbuffers-to-manipulate-typedarrays)
<ide> * [Others](#others)
<ide> * [Type casting](#type-casting)
<ide> * [Using `auto`](#using-auto)
<ide> class ExampleClass {
<ide> };
<ide> ```
<ide>
<add>### Use AliasedBuffers to manipulate TypedArrays
<add>
<add>When working with typed arrays that involve direct data modification
<add>from C++, use an `AliasedBuffer` when possible. The API abstraction and
<add>the usage scope of `AliasedBuffer` are documented in [aliased_buffer.h][].
<add>
<add>```c++
<add>// Create an AliasedBuffer.
<add>AliasedBuffer<uint32_t, v8::Uint32Array> data;
<add>...
<add>
<add>// Modify the data through natural operator semantics.
<add>data[0] = 12345;
<add>```
<add>
<ide> ## Others
<ide>
<ide> ### Type casting
<ide> even `try` and `catch` **will** break.
<ide> [Run Time Type Information]: https://en.wikipedia.org/wiki/Run-time_type_information
<ide> [cppref_auto_ptr]: https://en.cppreference.com/w/cpp/memory/auto_ptr
<ide> [without C++ exception handling]: https://gcc.gnu.org/onlinedocs/libstdc++/manual/using_exceptions.html#intro.using.exception.no
<add>[aliased_buffer.h]: https://github.com/nodejs/node/blob/master/src/aliased_buffer.h#L12 | 1 |
PHP | PHP | update docblock type | 25276e58dbcc3ce13d0ef7d17653cf912dc9173b | <ide><path>src/Validation/Validation.php
<ide> public static function mimeType($check, $mimeTypes = []): bool
<ide> * Helper for reading the file out of the various file implementations
<ide> * we accept.
<ide> *
<del> * @param string|array|\Psr\Http\Message\UploadedFileInterface $check The data to read a filename out of.
<add> * @param mixed $check The data to read a filename out of.
<ide> * @return string|false Either the filename or false on failure.
<ide> */
<ide> protected static function getFilename($check) | 1 |
Text | Text | translate ja of texture | db4b9fe0938e533c2793f31234221664f4d97708 | <ide><path>threejs/lessons/ja/threejs-textures.md
<add>Title: Three.jsのテクスチャ
<add>Description: Three.jsのテクスチャの使い方
<add>TOC: テクスチャ
<add>
<add>この記事はthree.jsについてのシリーズ記事の一つです。
<add>最初の記事は[Three.jsの基礎知識](threejs-fundamentals.html)です。
<add>まだ読んでない人は、そちらから先に読んでみるといいかもしれません。
<add>
<add>テクスチャはThree.jsの大きなトピックの一つです。
<add>どのレベルで説明するといいか100%承知してはいませんが、やってみようと思います。
<add>Three.jsにはたくさんのトピックがあり、互いに関係しているので、一度に説明するのが難しいのです。
<add>これがこの記事の内容の早見表です。
<add>
<add><ul>
<add><li><a href="#hello">ハロー・テクスチャ</a></li>
<add><li><a href="#six">立方体の各面に異なる6つのテクスチャを貼り付ける</a></li>
<add><li><a href="#loading">テクスチャの読み込み</a></li>
<add><ul>
<add> <li><a href="#easy">簡単な方法</a></li>
<add> <li><a href="#wait1">テクスチャの読み込みを待つ</a></li>
<add> <li><a href="#waitmany">複数テクスチャの読み込みを待つ</a></li>
<add> <li><a href="#cors">異なるオリジンからのテクスチャの読み込み</a></li>
<add></ul>
<add><li><a href="#memory">メモリ使用</a></li>
<add><li><a href="#format">JPG vs PNG</a></li>
<add><li><a href="#filtering-and-mips">フィルタリングとMIP</a></li>
<add><li><a href="#uvmanipulation">テクスチャの繰り返し、オフセット、回転、ラッピング</a></li>
<add></ul>
<add>
<add>## <a name="hello"></a> ハロー・テクスチャ
<add>
<add>テクスチャは*一般的に*PhotoshopやGIMPのような3rdパーティーのプログラムで最もよく作られる画像です。
<add>例えば、この画像を立方体に乗せてみましょう。
<add>
<add><div class="threejs_center">
<add> <img src="../resources/images/wall.jpg" style="width: 600px;" class="border" >
<add></div>
<add>
<add>最初の例を修正してみましょう。`TextureLoader`を作ることで、必要なことはすべてできます。
<add>[`load`](TextureLoader.load)を画像のURLを引数にして呼び、`color`を設定する代わりに、
<add>マテリアルの`map`属性にその結果を渡してください。
<add>
<add>```js
<add>+const loader = new THREE.TextureLoader();
<add>
<add>const material = new THREE.MeshBasicMaterial({
<add>- color: 0xFF8844,
<add>+ map: loader.load('resources/images/wall.jpg'),
<add>});
<add>```
<add>
<add>`MeshBasicMaterial`を使っているので、光源が必要ないことに注意してください。
<add>
<add>{{{example url="../threejs-textured-cube.html" }}}
<add>
<add>## <a name="six"></a> 立方体の各面に異なる6つのテクスチャを貼り付ける
<add>
<add>立方体の各面に貼り付ける、6つのテクスチャはどのようなものでしょうか。
<add>
<add><div class="threejs_center">
<add> <div>
<add> <img src="../resources/images/flower-1.jpg" style="width: 100px;" class="border" >
<add> <img src="../resources/images/flower-2.jpg" style="width: 100px;" class="border" >
<add> <img src="../resources/images/flower-3.jpg" style="width: 100px;" class="border" >
<add> </div>
<add> <div>
<add> <img src="../resources/images/flower-4.jpg" style="width: 100px;" class="border" >
<add> <img src="../resources/images/flower-5.jpg" style="width: 100px;" class="border" >
<add> <img src="../resources/images/flower-6.jpg" style="width: 100px;" class="border" >
<add> </div>
<add></div>
<add>
<add>`Mesh`を作るときに、単に6つのマテリアルを作り、配列として渡します。
<add>
<add>```js
<add>const loader = new THREE.TextureLoader();
<add>
<add>-const material = new THREE.MeshBasicMaterial({
<add>- map: loader.load('resources/images/wall.jpg'),
<add>-});
<add>+const materials = [
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-1.jpg')}),
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-2.jpg')}),
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-3.jpg')}),
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-4.jpg')}),
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-5.jpg')}),
<add>+ new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-6.jpg')}),
<add>+];
<add>-const cube = new THREE.Mesh(geometry, material);
<add>+const cube = new THREE.Mesh(geometry, materials);
<add>```
<add>
<add>動きました!
<add>
<add>{{{example url="../threejs-textured-cube-6-textures.html" }}}
<add>
<add>ただし、全ての種類のジオメトリが複数のマテリアルに対応しているわけではないことに注意してください。
<add>`BoxGeometry`と`BoxBufferGeometry`は、それぞれの面に6つのマテリアルを使えます。
<add>`ConeGeometry`と`ConeBufferGeometry`は2つのマテリアルを使うことができ、一つは底面、一つは円錐面に適用されます。
<add>`CylinderGeometry`と`CylinderBufferGeometry`は3つのマテリアルを使うことができ、一つは底面、一つは上面、一つは側面に適用されます。
<add>その他のケースでは、カスタムジオメトリのビルドや読み込み、テクスチャの座標の修正が必要になります。
<add>
<add>
<add>1つのジオメトリに複数の画像を適用したいなら、
<add>[テクスチャアトラス](https://en.wikipedia.org/wiki/Texture_atlas)を使うのが、ほかの3Dエンジンでははるかに一般的で、はるかに高性能です。
<add>テクスチャアトラスは、一つのテクスチャに複数の画像を配置し、ジオメトリの頂点の座標を使って
<add>テクスチャのどの部分がジオメトリのおのおのの三角形に使われるか、選択するものです。
<add>
<add>テクスチャの座標とはなんでしょうか?ジオメトリ頂点に与えられたデータのことで、
<add>テクスチャのどの部分がその頂点に対応するか指定するものです。
<add>[カスタムジオメトリの構築](threejs-custom-geometry.html)を始めるときに説明します。
<add>
<add>## <a name="loading"></a> テクスチャの読み込み
<add>
<add>### <a name="easy"></a> 簡単な方法
<add>
<add>このサイトのコードのほとんどは、もっとも簡単なテクスチャの読み込み方を使っています。
<add>`TextureLoader`を作り、その[`load`](TextureLoader.load)メソッドを呼びます。
<add>これは`Texture`オブジェクトを返します。
<add>
<add>
<add>```js
<add>const texture = loader.load('resources/images/flower-1.jpg');
<add>```
<add>
<add>このメソッドを使うと、画像がthree.jsによって非同期的に読み込まれるまで、テクスチャが透明になります。読み込まれた時点で、テクスチャをダウンロードした画像に更新します。
<add>
<add>
<add>この方法では、テクスチャの読み込みを待つ必要がなく、ページをすぐにレンダリングし始めることができるという、大きな利点があります。
<add>多くのケースでこの方法で問題ありませんが、テクスチャをダウンロードし終えたときにthree.jsに通知してもらうこともできます。
<add>
<add>### <a name="wait1"></a> テクスチャの読み込みを待つ
<add>
<add>テクスチャの読み込みを待つために、テクスチャローダーの`load`メソッドは、テクスチャの読み込みが終了したときに呼ばれるコールバックを取ります。
<add>冒頭の例に戻り、このように、`Mesh`を作りシーンに追加する前に、テクスチャの読み込みを待つことができます。
<add>
<add>
<add>```js
<add>const loader = new THREE.TextureLoader();
<add>loader.load('resources/images/wall.jpg', (texture) => {
<add> const material = new THREE.MeshBasicMaterial({
<add> map: texture,
<add> });
<add> const cube = new THREE.Mesh(geometry, material);
<add> scene.add(cube);
<add> cubes.push(cube); // add to our list of cubes to rotate
<add>});
<add>```
<add>
<add>ブラウザのキャッシュをクリアし、接続が遅くならない限り、違いが分かることはないと思いますが、
<add>ちゃんとテクスチャが読み込まれるのを待っているので、安心してください。
<add>
<add>{{{example url="../threejs-textured-cube-wait-for-texture.html" }}}
<add>
<add>### <a name="waitmany"></a> 複数テクスチャの読み込みを待つ
<add>
<add>
<add>全てのテクスチャが読み込まれたことを待つために、`LoadingManager`を使うことができます。
<add>`TextureLoader`を渡すと、[`onLoad`](LoadingManager.onLoad)属性がコールバックに設定されます。
<add>
<add>```js
<add>+const loadManager = new THREE.LoadingManager();
<add>*const loader = new THREE.TextureLoader(loadManager);
<add>
<add>const materials = [
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-1.jpg')}),
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-2.jpg')}),
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-3.jpg')}),
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-4.jpg')}),
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-5.jpg')}),
<add> new THREE.MeshBasicMaterial({map: loader.load('resources/images/flower-6.jpg')}),
<add>];
<add>
<add>+loadManager.onLoad = () => {
<add>+ const cube = new THREE.Mesh(geometry, materials);
<add>+ scene.add(cube);
<add>+ cubes.push(cube); // add to our list of cubes to rotate
<add>+};
<add>```
<add>
<add>`LoadingManager`は[`onProgress`](LoadingManager.onProgress)属性もあり、
<add>プログレスインジケーターを表示するためのコールバックを設定できます。
<add>
<add>まず、HTMLにプログレスバーを追加しましょう。
<add>
<add>```html
<add><body>
<add> <canvas id="c"></canvas>
<add>+ <div id="loading">
<add>+ <div class="progress"><div class="progressbar"></div></div>
<add>+ </div>
<add></body>
<add>```
<add>
<add>そしてCSSにも追加します。
<add>
<add>```css
<add>#loading {
<add> position: fixed;
<add> top: 0;
<add> left: 0;
<add> width: 100%;
<add> height: 100%;
<add> display: flex;
<add> justify-content: center;
<add> align-items: center;
<add>}
<add>#loading .progress {
<add> margin: 1.5em;
<add> border: 1px solid white;
<add> width: 50vw;
<add>}
<add>#loading .progressbar {
<add> margin: 2px;
<add> background: white;
<add> height: 1em;
<add> transform-origin: top left;
<add> transform: scaleX(0);
<add>}
<add>```
<add>
<add>そうすると、コード内で`onProgress`コールバックの`progressbar`のスケールが更新できます。
<add>これは、最後のアイテムが読み込まれるURL、いま読み込まれているアイテムの数、アイテムの合計数を渡して呼ばれます。
<add>
<add>```js
<add>+const loadingElem = document.querySelector('#loading');
<add>+const progressBarElem = loadingElem.querySelector('.progressbar');
<add>
<add>loadManager.onLoad = () => {
<add>+ loadingElem.style.display = 'none';
<add> const cube = new THREE.Mesh(geometry, materials);
<add> scene.add(cube);
<add> cubes.push(cube); // add to our list of cubes to rotate
<add>};
<add>
<add>+loadManager.onProgress = (urlOfLastItemLoaded, itemsLoaded, itemsTotal) => {
<add>+ const progress = itemsLoaded / itemsTotal;
<add>+ progressBarElem.style.transform = `scaleX(${progress})`;
<add>+};
<add>```
<add>
<add>キャッシュを削除して低速なコネクションを作らない限りは、プログレスバーを見ることできないかもしれません。
<add>
<add>{{{example url="../threejs-textured-cube-wait-for-all-textures.html" }}}
<add>
<add>## <a name="cors"></a> 異なるオリジンからのテクスチャの読み込み
<add>
<add>異なるサーバーの画像を使うため、そのサーバーは正しいヘッダーを送る必要があります。
<add>そうしないと、three.jsでその画像を使うことができず、エラーを受け取るでしょう。
<add>もし皆さんが画像を提供するサーバーを運用しているなら、
<add>[正しいヘッダーを送る](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS)を確認してください。
<add>画像をホスティングしているサーバーに手を入れられず、権限用のヘッダーを送ることができないなら、
<add>そのサーバーからの画像を使うことはできません。
<add>
<add>
<add>例えば、[imgur](https://imgur.com)、[flickr](https://flickr.com)、そして
<add>[github](https://github.com)は全て、ホストしている画像を
<add>three.jsで使うことができるようなヘッダーを送っています。
<add>
<add>## <a name="memory"></a>メモリ使用
<add>
<add>多くの場合、テクスチャはthree.jsアプリの中で最もメモリを使っています。
<add>*一般的に*テクスチャは`幅 * 高さ * 4 * 1.33`バイトのメモリを消費していることを理解するのは重要です。
<add>
<add>圧縮については言及していないことに注意してください。.jpgイメージを作り、超高圧縮することもできます。
<add>例えば、家のシーンを作っているとしましょう。家の中には、テーブルがあり、上面に木目のテクスチャを置くことに決めました。
<add>
<add><div class="threejs_center"><img class="border" src="resources/images/compressed-but-large-wood-texture.jpg" align="center" style="width: 300px"></div>
<add>
<add>
<add>このイメージはたった157kなので、比較的速くダウンロードすることができます。しかし、
<add>[ピクセルだと3024 x 3761の大きさ](resources/images/compressed-but-large-wood-texture.jpg)です。
<add>前述した式によると、
<add>
<add> 3024 * 3761 * 4 * 1.33 = 60505764.5
<add>
<add>
<add>となり、three.jsの**60メガのメモリ!**を消費するでしょう。
<add>このようなテクスチャがいくつかあるだけで、メモリリークしてしまうでしょう。
<add>
<add>
<add>この例を持ち出したのは、テクスチャを使用することの隠れたコストを知っているのが重要だからです。
<add>three.jsでテクスチャを使うためには、テクスチャのデータをGPUに渡し、*一般的に*非圧縮にしておく必要があります。
<add>
<add>この話の教訓は、テクスチャをファイルサイズだけでなく、次元も小さくすることです。
<add>ファイルサイズの小ささ = 高速なダウンロードです。次元の小ささ = 省メモリです。
<add>では、どのように小さくできるのでしょうか?
<add>できるだけ小さく、そして十分見えるくらいです。
<add>
<add>## <a name="format"></a> JPG vs PNG
<add>
<add>これは通常のHTMLとほぼ同じで、PNGはロスレス圧縮なので、lossy圧縮のJPGよりも
<add>一般的にダウンロードが遅くなります。
<add>しかし、PNGは透過性があります。PNGは法線マップや後ほど説明する非画像マップのような非画像データにも適したフォーマットです。
<add>
<add>WebGLにおいて、JPGがPNGよりも省メモリではないことを覚えておいてください。上記を参照してください。
<add>
<add>## <a name="filtering-and-mips"></a> フィルタリングとMIP
<add>
<add>この16x16のテクスチャを
<add>
<add><div class="threejs_center"><img src="resources/images/mip-low-res-enlarged.png" class="nobg" align="center"></div>
<add>
<add>立方体に適用してみます。
<add>
<add><div class="spread"><div data-diagram="filterCube"></div></div>
<add>
<add>この立方体をとても小さく描画してみましょう。
<add>
<add><div class="spread"><div data-diagram="filterCubeSmall"></div></div>
<add>
<add>ふーむ、見えにくいです。小さな立方体を拡大してみましょう。
<add>
<add><div class="spread"><div data-diagram="filterCubeSmallLowRes"></div></div>
<add>
<add>GPUは小さな立方体のどのピクセルにどの色を使うか、どうやって知るのでしょうか?
<add>立方体が小さすぎて1、2ピクセルしかないとしたらどうでしょうか?
<add>
<add>フィルタリングとはこういうものです。
<add>
<add>もしフォトショップなら近くの全てのピクセルを平均して、1、2ピクセルの色を見つけます。
<add>これはとても遅い操作です。GPUはミップマップを使ってこの問題を解決します。
<add>
<add>MIPはテクスチャのコピーで、ピクセルがブレンドされて次の小さいMIPを作られます。そのため、前のMIPの半分の幅と半分の高さになっています。
<add>MIPは1x1ピクセルのMIPが得られるまで作られます。
<add>全てのMIP上の画像はこのようになります。
<add>
<add><div class="threejs_center"><img src="resources/images/mipmap-low-res-enlarged.png" class="nobg" align="center"></div>
<add>
<add>さて、立方体が1、2ピクセルの小ささに描かれたとき、どんな色にするか決めるため、GPUは最も小さなMIPレベルか次に小さいMIPか選ぶことができます。
<add>
<add>
<add>three.jsでは、テクスチャが元の大きさより大きく描かれたときと、小さく描かれたときの両方で、処理の設定を選ぶことができます。
<add>
<add>
<add>テクスチャが元の大きさより大きく描かれたときのフィルタ設定として、[`texture.magFilter`](Texture.magFilter)属性に`THREE.NearestFilter`か`THREE.LinearFilter`を設定することができます。
<add>`NearestFilter`は元のテクスチャから最も近い1ピクセルを使用するということです。
<add>低解像度のテクスチャでは、マインクラフトのようにピクセル化された見た目になります。
<add>
<add>`LinearFilter`はテクスチャから、色を決めたいピクセルに最も近い4ピクセルを選び、
<add>実際の点が4つのピクセルからどれだけ離れているかに応じて適切な比率で混ぜ合わせます。
<add>
<add><div class="spread">
<add> <div>
<add> <div data-diagram="filterCubeMagNearest" style="height: 250px;"></div>
<add> <div class="code">Nearest</div>
<add> </div>
<add> <div>
<add> <div data-diagram="filterCubeMagLinear" style="height: 250px;"></div>
<add> <div class="code">Linear</div>
<add> </div>
<add></div>
<add>
<add>元の大きさよりもテクスチャが小さく描画された時のフィルタ設定では、
<add>[`texture.minFilter`](Texture.minFilter)属性を6つの値から一つ設定できます。
<add>
<add>* `THREE.NearestFilter`
<add>
<add> 上と同様に、テクスチャの最も近いピクセルを選ぶ。
<add>
<add>* `THREE.LinearFilter`
<add>
<add> 上と同様に、テクスチャから4ピクセルを選んで混ぜ合わせる。
<add>
<add>* `THREE.NearestMipmapNearestFilter`
<add>
<add> 適切なMIPを選び、ピクセルを一つ選ぶ。
<add>
<add>* `THREE.NearestMipmapLinearFilter`
<add>
<add> 2つMIPを選び、それぞれからピクセルを選んで、その2つを混ぜる。
<add>
<add>* `THREE.LinearMipmapNearestFilter`
<add>
<add> 適切なMIPを選び、4ピクセルを選んで混ぜ合わせる。
<add>
<add>* `THREE.LinearMipmapLinearFilter`
<add>
<add> 2つMIPを選び、それぞれから4ピクセルを選んで、8つ全部を混ぜ合わせて1ピクセルにする。
<add>
<add>ここで6つ全ての設定の例を見せましょう。
<add>
<add><div class="spread">
<add> <div data-diagram="filterModes" style="
<add> height: 450px;
<add> position: relative;
<add> ">
<add> <div style="
<add> width: 100%;
<add> height: 100%;
<add> display: flex;
<add> align-items: center;
<add> justify-content: flex-start;
<add> ">
<add> <div style="
<add> background: rgba(255,0,0,.8);
<add> color: white;
<add> padding: .5em;
<add> margin: 1em;
<add> font-size: small;
<add> border-radius: .5em;
<add> line-height: 1.2;
<add> user-select: none;"
<add> >click to<br/>change<br/>texture</div>
<add> </div>
<add> <div class="filter-caption" style="left: 0.5em; top: 0.5em;">nearest</div>
<add> <div class="filter-caption" style="width: 100%; text-align: center; top: 0.5em;">linear</div>
<add> <div class="filter-caption" style="right: 0.5em; text-align: right; top: 0.5em;">nearest<br/>mipmap<br/>nearest</div>
<add> <div class="filter-caption" style="left: 0.5em; text-align: left; bottom: 0.5em;">nearest<br/>mipmap<br/>linear</div>
<add> <div class="filter-caption" style="width: 100%; text-align: center; bottom: 0.5em;">linear<br/>mipmap<br/>nearest</div>
<add> <div class="filter-caption" style="right: 0.5em; text-align: right; bottom: 0.5em;">linear<br/>mipmap<br/>linear</div>
<add> </div>
<add></div>
<add>
<add>注意することは、左上と中央上は`NearestFilter`を使っていて、`LinearFilter`はMIPを使っていないことです。GPUが元のテクスチャからピクセルを選ぶので、遠くはちらついて見えます。
<add>左側はたった一つのピクセルが選ばれ、中央は4つのピクセルが選ばれて混ぜ合わされます。しかし、
<add>良い色の表現には至っていません。
<add>ほかの4つの中では、右下の`LinearMipmapLinearFilter`が一番良いです。
<add>
<add>上の画像をクリックすると、上で使用しているテクスチャと、MIPレベルごとに色が異なるテクスチャが切り替わります。
<add>
<add><div class="threejs_center">
<add> <div data-texture-diagram="differentColoredMips"></div>
<add></div>
<add>
<add>これで、起きていることが分かりやすいでしょう。
<add>左上と中央上は、最初のMIPがずっと遠くまで使われているのが分かります。
<add>右上と中央下は、別のMIPが使われているのがよく分かります。
<add>
<add>
<add>元のテクスチャに切り替えると、右下が滑らか、つまり高品質であることが分かります。
<add>なぜ常にこのモードにしないのか聞きたいかもしれません。
<add>最も明らかな理由は、レトロ感を出すために、ピクセル化してほしいとかです。
<add>次の理由は、8ピクセルを読み込んで混ぜ合わせることは、1ピクセルを読んで混ぜ合わせるよりも遅いことです。
<add>1つのテクスチャの速度では違いが出るように思えないかもしれませんが、
<add>記事が進むにつれて、最終的に4、5のテクスチャを一度に持つマテリアルが出てくるでしょう。
<add>4テクスチャ * 8ピクセル(テクスチャごと)は、どのピクセルを描画するにも32ピクセル探すことになります。
<add>これはモバイルデバイスで考えるときに特に重要になります。
<add>
<add>## <a name="uvmanipulation"></a> テクスチャの繰り返し、オフセット、回転、ラッピング
<add>
<add>テクスチャは、繰り返し、オフセット、回転の設定があります。
<add>
<add>three.jsのデフォルトのテクスチャは繰り返されません。
<add>テクスチャが繰り返されるかどうかの設定には、2つの属性があります。
<add>水平方向のラッピングに[`wrapS`](Texture.wrapS)と、垂直方向のラッピングに[`wrapT`](Texture.wrapT)です。
<add>
<add>以下のどれかが設定されます:
<add>
<add>* `THREE.ClampToEdgeWrapping`
<add>
<add> それぞれの角の最後のピクセルが永遠に繰り返されます。
<add>
<add>* `THREE.RepeatWrapping`
<add>
<add> テクスチャが繰り返されます。
<add>
<add>* `THREE.MirroredRepeatWrapping`
<add>
<add> テクスチャの鏡像が取られ、繰り返されます。
<add>
<add>例えば、両方向にラッピングすると、
<add>
<add>```js
<add>someTexture.wrapS = THREE.RepeatWrapping;
<add>someTexture.wrapT = THREE.RepeatWrapping;
<add>```
<add>
<add>繰り返しは`repeat`属性で設定されます。
<add>
<add>```js
<add>const timesToRepeatHorizontally = 4;
<add>const timesToRepeatVertically = 2;
<add>someTexture.repeat.set(timesToRepeatHorizontally, timesToRepeatVertically);
<add>```
<add>
<add>テクスチャのオフセットは`offset`属性でできます。
<add>テクスチャは1単位 = 1テクスチャの大きさにオフセットされます。
<add>言い換えると、0 = オフセットなし、1 = テクスチャ全体の大きさということです。
<add>
<add>```js
<add>const xOffset = .5; // offset by half the texture
<add>const yOffset = .25; // offset by 1/4 the texture
<add>someTexture.offset.set(xOffset, yOffset);
<add>```
<add>
<add>テクスチャの回転は、`rotation`属性で、ラジアンで指定します。
<add>同様に `center`属性で回転の中心を指定します。
<add>デフォルトは0,0で、左下の角で回転します。
<add>オフセットと同じように、単位はテクスチャの大きさなので、`.5, .5`に設定すると、
<add>テクスチャの中心での回転になります。
<add>
<add>```js
<add>someTexture.center.set(.5, .5);
<add>someTexture.rotation = THREE.MathUtils.degToRad(45);
<add>```
<add>
<add>最初に取り上げたサンプルでこれらの値を試してみましょう。
<add>
<add>最初に、テクスチャを操作できるように参照を保持しておきます。
<add>
<add>```js
<add>+const texture = loader.load('resources/images/wall.jpg');
<add>const material = new THREE.MeshBasicMaterial({
<add>- map: loader.load('resources/images/wall.jpg');
<add>+ map: texture,
<add>});
<add>```
<add>
<add>ここでも、簡単なインターフェースを提供するために[dat.GUI](https://github.com/dataarts/dat.gui)を使います。
<add>
<add>```js
<add>import {GUI} from '../3rdparty/dat.gui.module.js';
<add>```
<add>
<add>以前のdat.GUIの例でしたように、dat.GUIに度数で操作できるオブジェクトを与え、
<add>ラジアン単位でプロパティを設定する簡単なクラスを使います。
<add>
<add>```js
<add>class DegRadHelper {
<add> constructor(obj, prop) {
<add> this.obj = obj;
<add> this.prop = prop;
<add> }
<add> get value() {
<add> return THREE.MathUtils.radToDeg(this.obj[this.prop]);
<add> }
<add> set value(v) {
<add> this.obj[this.prop] = THREE.MathUtils.degToRad(v);
<add> }
<add>}
<add>```
<add>
<add>`"123"`といった文字列から`123`といった数値に変換するクラスも必要です。
<add>これは、three.jsは`wrapS`や`wrapT`のようなenumの設定として数値が必要ですが、
<add>dat.GUIはenumに文字列のみを使うためです。
<add>
<add>```js
<add>class StringToNumberHelper {
<add> constructor(obj, prop) {
<add> this.obj = obj;
<add> this.prop = prop;
<add> }
<add> get value() {
<add> return this.obj[this.prop];
<add> }
<add> set value(v) {
<add> this.obj[this.prop] = parseFloat(v);
<add> }
<add>}
<add>```
<add>
<add>このクラスを使って、上記設定のための簡単なGUIをセットアップできます。
<add>
<add>```js
<add>const wrapModes = {
<add> 'ClampToEdgeWrapping': THREE.ClampToEdgeWrapping,
<add> 'RepeatWrapping': THREE.RepeatWrapping,
<add> 'MirroredRepeatWrapping': THREE.MirroredRepeatWrapping,
<add>};
<add>
<add>function updateTexture() {
<add> texture.needsUpdate = true;
<add>}
<add>
<add>const gui = new GUI();
<add>gui.add(new StringToNumberHelper(texture, 'wrapS'), 'value', wrapModes)
<add> .name('texture.wrapS')
<add> .onChange(updateTexture);
<add>gui.add(new StringToNumberHelper(texture, 'wrapT'), 'value', wrapModes)
<add> .name('texture.wrapT')
<add> .onChange(updateTexture);
<add>gui.add(texture.repeat, 'x', 0, 5, .01).name('texture.repeat.x');
<add>gui.add(texture.repeat, 'y', 0, 5, .01).name('texture.repeat.y');
<add>gui.add(texture.offset, 'x', -2, 2, .01).name('texture.offset.x');
<add>gui.add(texture.offset, 'y', -2, 2, .01).name('texture.offset.y');
<add>gui.add(texture.center, 'x', -.5, 1.5, .01).name('texture.center.x');
<add>gui.add(texture.center, 'y', -.5, 1.5, .01).name('texture.center.y');
<add>gui.add(new DegRadHelper(texture, 'rotation'), 'value', -360, 360)
<add> .name('texture.rotation');
<add>```
<add>
<add>最後に特記することは、もしテクスチャの`wrapS`や`wrapT`を変えるなら、
<add>three.jsが設定の適用を知るために、[`texture.needsUpdate`](Texture.needsUpdate)も設定しなければならないことです。ほかの設定は自動的に適用されます。
<add>
<add>
<add>{{{example url="../threejs-textured-cube-adjust.html" }}}
<add>
<add>これはテクスチャのトピックへの第一歩にすぎません。
<add>ある時点で、テクスチャの座標や、マテリアルが適用できる別の9種のテクスチャについても説明します。
<add>
<add>今のところは、[光源](threejs-lights.html)に進みましょう。
<add>
<add><!--
<add>alpha
<add>ao
<add>env
<add>light
<add>specular
<add>bumpmap ?
<add>normalmap ?
<add>metalness
<add>roughness
<add>-->
<add>
<add><link rel="stylesheet" href="resources/threejs-textures.css">
<add><script type="module" src="resources/threejs-textures.js"></script> | 1 |
PHP | PHP | implement httpkernelinterface on router | e8a704cf4116feae5ac03845bad732b0e38cc1f9 | <ide><path>src/Illuminate/Routing/Router.php
<ide> use Illuminate\Http\Response;
<ide> use Illuminate\Events\Dispatcher;
<ide> use Illuminate\Container\Container;
<add>use Symfony\Component\HttpKernel\HttpKernelInterface;
<add>use Symfony\Component\HttpFoundation\Request as SymfonyRequest;
<ide> use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
<ide>
<del>class Router implements RouteFiltererInterface {
<add>class Router implements HttpKernelInterface, RouteFiltererInterface {
<ide>
<ide> /**
<ide> * The event dispatcher instance.
<ide> public function setControllerDispatcher(ControllerDispatcher $dispatcher)
<ide> $this->controllerDispatcher = $dispatcher;
<ide> }
<ide>
<add> /**
<add> * Get the response for a given request.
<add> *
<add> * @param \Symfony\Component\HttpFoundation\Request $request
<add> * @return \Symfony\Component\HttpFoundation\Response
<add> */
<add> public function handle(SymfonyRequest $request, $type = HttpKernelInterface::MASTER_REQUEST, $catch = true)
<add> {
<add> return $this->dispatch(Request::createFromBase($request));
<add> }
<add>
<ide> }
<ide>\ No newline at end of file | 1 |
PHP | PHP | add retry helper | e3bd359d52cee0ba8db9673e45a8221c1c1d95d6 | <ide><path>src/Illuminate/Support/helpers.php
<ide> function preg_replace_array($pattern, array $replacements, $subject)
<ide> }
<ide> }
<ide>
<add>if (! function_exists('retry')) {
<add> /**
<add> * Retry an operation a given number of times.
<add> *
<add> * @param int $times
<add> * @param callable $callback
<add> * @return mixed
<add> */
<add> function retry($times, callable $callback)
<add> {
<add> $times--;
<add>
<add> beginning:
<add> try {
<add> return $callback();
<add> } catch (Exception $e) {
<add> if (! $times) {
<add> throw $e;
<add> }
<add>
<add> $times--;
<add>
<add> goto beginning;
<add> }
<add> }
<add>}
<add>
<ide> if (! function_exists('snake_case')) {
<ide> /**
<ide> * Convert a string to snake case. | 1 |
Python | Python | add distilbert support for ner fine-tuning | 2b07b9e5ee14ac37fcef7bac958963d869b3b79a | <ide><path>examples/run_ner.py
<ide> from transformers import AdamW, WarmupLinearSchedule
<ide> from transformers import WEIGHTS_NAME, BertConfig, BertForTokenClassification, BertTokenizer
<ide> from transformers import RobertaConfig, RobertaForTokenClassification, RobertaTokenizer
<add>from transformers import DistilBertConfig, DistilBertForTokenClassification, DistilBertTokenizer
<ide>
<ide> logger = logging.getLogger(__name__)
<ide>
<ide> ALL_MODELS = sum(
<del> (tuple(conf.pretrained_config_archive_map.keys()) for conf in (BertConfig, RobertaConfig)),
<add> (tuple(conf.pretrained_config_archive_map.keys()) for conf in (BertConfig, RobertaConfig, DistilBertConfig)),
<ide> ())
<ide>
<ide> MODEL_CLASSES = {
<ide> "bert": (BertConfig, BertForTokenClassification, BertTokenizer),
<del> "roberta": (RobertaConfig, RobertaForTokenClassification, RobertaTokenizer)
<add> "roberta": (RobertaConfig, RobertaForTokenClassification, RobertaTokenizer),
<add> "distilbert": (DistilBertConfig, DistilBertForTokenClassification, DistilBertTokenizer)
<ide> }
<ide>
<ide>
<ide> def train(args, train_dataset, model, tokenizer, labels, pad_token_label_id):
<ide> batch = tuple(t.to(args.device) for t in batch)
<ide> inputs = {"input_ids": batch[0],
<ide> "attention_mask": batch[1],
<del> "token_type_ids": batch[2] if args.model_type in ["bert", "xlnet"] else None,
<del> # XLM and RoBERTa don"t use segment_ids
<ide> "labels": batch[3]}
<add> if args.model_type != "distilbert":
<add> inputs["token_type_ids"]: batch[2] if args.model_type in ["bert", "xlnet"] else None # XLM and RoBERTa don"t use segment_ids
<add>
<ide> outputs = model(**inputs)
<ide> loss = outputs[0] # model outputs are always tuple in pytorch-transformers (see doc)
<ide>
<ide> def evaluate(args, model, tokenizer, labels, pad_token_label_id, mode, prefix=""
<ide> with torch.no_grad():
<ide> inputs = {"input_ids": batch[0],
<ide> "attention_mask": batch[1],
<del> "token_type_ids": batch[2] if args.model_type in ["bert", "xlnet"] else None,
<del> # XLM and RoBERTa don"t use segment_ids
<ide> "labels": batch[3]}
<add> if args.model_type != "distilbert":
<add> inputs["token_type_ids"]: batch[2] if args.model_type in ["bert", "xlnet"] else None # XLM and RoBERTa don"t use segment_ids
<ide> outputs = model(**inputs)
<ide> tmp_eval_loss, logits = outputs[:2]
<ide>
<ide> def main():
<ide>
<ide> if __name__ == "__main__":
<ide> main()
<add> | 1 |
Go | Go | replace my c code with tianons go code | 524416560a4624d30023db32101c9fe5ebffc895 | <ide><path>pkg/netlink/netlink_linux.go
<ide>
<ide> package netlink
<ide>
<del>/*
<del>#include <string.h>
<del>#include <errno.h>
<del>#include <sys/ioctl.h>
<del>#include <net/if.h>
<del>
<del>static int get_socket(void) {
<del> int s_errno;
<del> int fd;
<del>
<del> fd = socket(PF_INET, SOCK_DGRAM, 0);
<del> if (fd >= 0) {
<del> return fd;
<del> }
<del> s_errno = errno;
<del>
<del> fd = socket(PF_PACKET, SOCK_DGRAM, 0);
<del> if (fd >= 0) {
<del> return fd;
<del> }
<del>
<del> fd = socket(PF_INET6, SOCK_DGRAM, 0);
<del> if (fd >= 0) {
<del> return fd;
<del> }
<del> errno = s_errno;
<del> return -1;
<del>}
<del>
<del>
<del>static int change_name(const char *old_name, const char *new_name) {
<del> struct ifreq ifr;
<del> int err;
<del> int fd;
<del>
<del> fd = get_socket();
<del> if (fd < 0) {
<del> return -1;
<del> }
<del>
<del> strncpy(ifr.ifr_name, old_name, IFNAMSIZ);
<del> strncpy(ifr.ifr_newname, new_name, IFNAMSIZ);
<del>
<del> err = ioctl(fd, SIOCSIFNAME, &ifr);
<del> if (err) {
<del> close(fd);
<del> return -1;
<del> }
<del> close(fd);
<del> return err;
<del>}
<del>*/
<del>import "C"
<del>
<ide> import (
<ide> "encoding/binary"
<ide> "fmt"
<ide> done:
<ide> return res, nil
<ide> }
<ide>
<add>func getIfSocket() (int, error) {
<add> fd, err := syscall.Socket(syscall.AF_INET, syscall.SOCK_DGRAM, 0)
<add> if err == nil {
<add> return fd, err
<add> }
<add> sErr := err
<add>
<add> fd, err = syscall.Socket(syscall.AF_PACKET, syscall.SOCK_DGRAM, 0)
<add> if err == nil {
<add> return fd, err
<add> }
<add>
<add> fd, err = syscall.Socket(syscall.AF_INET6, syscall.SOCK_DGRAM, 0)
<add> if err == nil {
<add> return fd, err
<add> }
<add>
<add> return -1, sErr
<add>}
<add>
<ide> func NetworkChangeName(oldName, newName string) error {
<del> var (
<del> cold = C.CString(oldName)
<del> cnew = C.CString(newName)
<del> )
<add> fd, err := getIfSocket()
<add> if err != nil {
<add> return err
<add> }
<add> defer syscall.Close(fd)
<add> IFNAMSIZ := 16
<add>
<add> data := [32]byte{}
<add> copy(data[:IFNAMSIZ-1], oldName)
<add> copy(data[IFNAMSIZ:IFNAMSIZ*2-1], newName)
<ide>
<del> if errno := int(C.change_name(cold, cnew)); errno != 0 {
<del> return fmt.Errorf("unable to change name %d", errno)
<add> if _, _, errno := syscall.Syscall(syscall.SYS_IOCTL, uintptr(fd), syscall.SIOCSIFNAME, uintptr(unsafe.Pointer(&data[0]))); errno != 0 {
<add> return errno
<ide> }
<add>
<ide> return nil
<ide> } | 1 |
PHP | PHP | handle query string arguments in digest auth data | cdc67116c5e9ebba67a76045434e7b6fb7d1876b | <ide><path>lib/Cake/Controller/Component/Auth/DigestAuthenticate.php
<ide> public function parseAuthData($digest) {
<ide> }
<ide> $keys = $match = array();
<ide> $req = array('nonce' => 1, 'nc' => 1, 'cnonce' => 1, 'qop' => 1, 'username' => 1, 'uri' => 1, 'response' => 1);
<del> preg_match_all('/(\w+)=([\'"]?)([a-zA-Z0-9\:\#\%@=.\/_-]+)\2/', $digest, $match, PREG_SET_ORDER);
<add> preg_match_all('/(\w+)=([\'"]?)([a-zA-Z0-9\:\#\%\?\&@=\.\/_-]+)\2/', $digest, $match, PREG_SET_ORDER);
<ide>
<ide> foreach ($match as $i) {
<ide> $keys[$i[1]] = $i[3];
<ide><path>lib/Cake/Test/Case/Controller/Component/Auth/DigestAuthenticateTest.php
<ide> public function testParseAuthData() {
<ide> Digest username="Mufasa",
<ide> realm="testrealm@host.com",
<ide> nonce="dcd98b7102dd2f0e8b11d0f600bfb0c093",
<del> uri="/dir/index.html",
<add> uri="/dir/index.html?query=string&value=some%20value",
<ide> qop=auth,
<ide> nc=00000001,
<ide> cnonce="0a4f113b",
<ide> public function testParseAuthData() {
<ide> 'username' => 'Mufasa',
<ide> 'realm' => 'testrealm@host.com',
<ide> 'nonce' => 'dcd98b7102dd2f0e8b11d0f600bfb0c093',
<del> 'uri' => '/dir/index.html',
<add> 'uri' => '/dir/index.html?query=string&value=some%20value',
<ide> 'qop' => 'auth',
<ide> 'nc' => '00000001',
<ide> 'cnonce' => '0a4f113b', | 2 |
Text | Text | remove openjsf slack nodejs from support doc | 3697587c31c71e6291adb5a4878b5f0860113385 | <ide><path>.github/SUPPORT.md
<ide> If you didn't find an answer in the resources above, try these unofficial
<ide> resources:
<ide>
<ide> * [Questions tagged 'node.js' on Stack Overflow](https://stackoverflow.com/questions/tagged/node.js)
<del>* [#nodejs](https://openjs-foundation.slack.com/archives/CK9Q4MB53) channel on the OpenJS Foundation Slack ([join here](https://slack-invite.openjsf.org/))
<ide> * [#node.js channel on libera.chat](https://web.libera.chat?channels=node.js&uio=d4)
<ide> * [Node.js Slack Community](https://node-js.slack.com/)
<ide> * To register: [nodeslackers.com](https://www.nodeslackers.com/) | 1 |
PHP | PHP | add dynamodb_endpoint to the cache config | 953093795869beb658bdd9be9442188f39e62b6a | <ide><path>config/cache.php
<ide> 'secret' => env('AWS_SECRET_ACCESS_KEY'),
<ide> 'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
<ide> 'table' => env('DYNAMODB_CACHE_TABLE', 'cache'),
<add> 'endpoint' => env('DYNAMODB_ENDPOINT'),
<ide> ],
<ide>
<ide> ], | 1 |
Javascript | Javascript | update openssl3 error messages for 3.0.0+quic | e016cc79f0b8ddf2b982dcb1bebb2af6f97dec5b | <ide><path>test/parallel/test-crypto-key-objects.js
<ide> const privateDsa = fixtures.readKey('dsa_private_encrypted_1025.pem',
<ide> // Reading an encrypted key without a passphrase should fail.
<ide> assert.throws(() => createPrivateKey(privateDsa), common.hasOpenSSL3 ? {
<ide> name: 'Error',
<del> message: 'error:1E08010C:DECODER routines::unsupported',
<add> message: 'error:07880109:common libcrypto routines::interrupted or ' +
<add> 'cancelled',
<ide> } : {
<ide> name: 'TypeError',
<ide> code: 'ERR_MISSING_PASSPHRASE',
<ide> const privateDsa = fixtures.readKey('dsa_private_encrypted_1025.pem',
<ide> passphrase: Buffer.alloc(1024, 'a')
<ide> }), {
<ide> message: common.hasOpenSSL3 ?
<del> 'error:1E08010C:DECODER routines::unsupported' :
<add> 'error:07880109:common libcrypto routines::interrupted or cancelled' :
<ide> /bad decrypt/
<ide> });
<ide>
<ide><path>test/parallel/test-crypto-keygen.js
<ide> const sec1EncExp = (cipher) => getRegExpForPEM('EC PRIVATE KEY', cipher);
<ide> // Since the private key is encrypted, signing shouldn't work anymore.
<ide> assert.throws(() => testSignVerify(publicKey, privateKey),
<ide> common.hasOpenSSL3 ? {
<del> message: 'error:1E08010C:DECODER routines::unsupported'
<add> message: 'error:07880109:common libcrypto ' +
<add> 'routines::interrupted or cancelled'
<ide> } : {
<ide> name: 'TypeError',
<ide> code: 'ERR_MISSING_PASSPHRASE',
<ide> const sec1EncExp = (cipher) => getRegExpForPEM('EC PRIVATE KEY', cipher);
<ide> // Since the private key is encrypted, signing shouldn't work anymore.
<ide> assert.throws(() => testSignVerify(publicKey, privateKey),
<ide> common.hasOpenSSL3 ? {
<del> message: 'error:1E08010C:DECODER routines::unsupported'
<add> message: 'error:07880109:common libcrypto ' +
<add> 'routines::interrupted or cancelled'
<ide> } : {
<ide> name: 'TypeError',
<ide> code: 'ERR_MISSING_PASSPHRASE', | 2 |
Go | Go | implement docker commit with standalone client lib | 8c9ad7b818c0a7b1e39f8df1fabba243a0961c2d | <ide><path>api/client/commit.go
<ide> package client
<ide>
<ide> import (
<del> "encoding/json"
<ide> "errors"
<ide> "fmt"
<del> "net/url"
<ide>
<ide> "github.com/docker/distribution/reference"
<del> "github.com/docker/docker/api/types"
<add> "github.com/docker/docker/api/client/lib"
<ide> Cli "github.com/docker/docker/cli"
<ide> "github.com/docker/docker/opts"
<ide> flag "github.com/docker/docker/pkg/mflag"
<ide> "github.com/docker/docker/registry"
<del> "github.com/docker/docker/runconfig"
<ide> )
<ide>
<ide> // CmdCommit creates a new image from a container's changes.
<ide> func (cli *DockerCli) CmdCommit(args ...string) error {
<ide> }
<ide> }
<ide>
<del> v := url.Values{}
<del> v.Set("container", name)
<del> v.Set("repo", repositoryName)
<del> v.Set("tag", tag)
<del> v.Set("comment", *flComment)
<del> v.Set("author", *flAuthor)
<del> for _, change := range flChanges.GetAll() {
<del> v.Add("changes", change)
<add> options := lib.ContainerCommitOptions{
<add> ContainerID: name,
<add> RepositoryName: repositoryName,
<add> Tag: tag,
<add> Comment: *flComment,
<add> Author: *flAuthor,
<add> Changes: flChanges.GetAll(),
<add> Pause: *flPause,
<add> JSONConfig: *flConfig,
<ide> }
<ide>
<del> if *flPause != true {
<del> v.Set("pause", "0")
<del> }
<del>
<del> var (
<del> config *runconfig.Config
<del> response types.ContainerCommitResponse
<del> )
<del>
<del> if *flConfig != "" {
<del> config = &runconfig.Config{}
<del> if err := json.Unmarshal([]byte(*flConfig), config); err != nil {
<del> return err
<del> }
<del> }
<del> serverResp, err := cli.call("POST", "/commit?"+v.Encode(), config, nil)
<add> response, err := cli.client.ContainerCommit(options)
<ide> if err != nil {
<ide> return err
<ide> }
<ide>
<del> defer serverResp.body.Close()
<del>
<del> if err := json.NewDecoder(serverResp.body).Decode(&response); err != nil {
<del> return err
<del> }
<del>
<ide> fmt.Fprintln(cli.out, response.ID)
<ide> return nil
<ide> }
<ide><path>api/client/lib/container_commit.go
<add>package lib
<add>
<add>import (
<add> "encoding/json"
<add> "net/url"
<add>
<add> "github.com/docker/docker/api/types"
<add> "github.com/docker/docker/runconfig"
<add>)
<add>
<add>// ContainerCommitOptions hods parameters to commit changes into a container.
<add>type ContainerCommitOptions struct {
<add> ContainerID string
<add> RepositoryName string
<add> Tag string
<add> Comment string
<add> Author string
<add> Changes []string
<add> Pause bool
<add> JSONConfig string
<add>}
<add>
<add>// ContainerCommit applies changes into a container and creates a new tagged image.
<add>func (cli *Client) ContainerCommit(options types.ContainerCommitOptions) (types.ContainerCommitResponse, error) {
<add> query := url.Values{}
<add> query.Set("container", options.ContainerID)
<add> query.Set("repo", options.RepositoryName)
<add> query.Set("tag", options.Tag)
<add> query.Set("comment", options.Comment)
<add> query.Set("author", options.Author)
<add> for _, change := range options.Changes {
<add> query.Add("changes", change)
<add> }
<add> if options.Pause != true {
<add> query.Set("pause", "0")
<add> }
<add>
<add> var (
<add> config *runconfig.Config
<add> response types.ContainerCommitResponse
<add> )
<add>
<add> if options.JSONConfig != "" {
<add> config = &runconfig.Config{}
<add> if err := json.Unmarshal([]byte(options.JSONConfig), config); err != nil {
<add> return response, err
<add> }
<add> }
<add>
<add> resp, err := cli.POST("/commit", query, config, nil)
<add> if err != nil {
<add> return response, err
<add> }
<add>
<add> defer resp.body.Close()
<add>
<add> if err := json.NewDecoder(resp.body).Decode(&response); err != nil {
<add> return response, err
<add> }
<add>
<add> return response, nil
<add>} | 2 |
Python | Python | use api dict to generate old types code | 9d048961def588712d3040956a21f4363521e07c | <ide><path>numpy/core/code_generators/generate_numpy_api.py
<ide> def generate_api(output_dir, force=False):
<ide>
<ide> return targets
<ide>
<del>def generate_type_decl(offset, types, init_list, module_list, extension_list):
<del> for k, atype in enumerate(types):
<del> num = offset + k
<del> astr = " (void *) &Py%sArrType_Type," % types[k]
<del> init_list.append(astr)
<del> astr = """\
<del>#ifdef NPY_ENABLE_SEPARATE_COMPILATION
<del> extern NPY_NO_EXPORT PyTypeObject Py%(type)sArrType_Type;
<del>#else
<del> NPY_NO_EXPORT PyTypeObject Py%(type)sArrType_Type;
<del>#endif
<del>""" % {'type': types[k]}
<del> module_list.append(astr)
<del> astr = "#define Py%sArrType_Type (*(PyTypeObject *)PyArray_API[%d])" % \
<del> (types[k], num)
<del> extension_list.append(astr)
<del>
<ide> class Type:
<ide> def __init__(self, name, index, ptr_cast):
<ide> self.index = index
<ide> def do_generate_api(targets, sources):
<ide> init_list = []
<ide>
<ide> # setup old types
<del> generate_type_decl(fixed, old_types, init_list, module_list, extension_list)
<add> for t in range(fixed, numtypes):
<add> name, index = ordered_types_api.pop(0)
<add> init_list.append(""" (void *) &%s,""" % name)
<add> astr = """\
<add>#ifdef NPY_ENABLE_SEPARATE_COMPILATION
<add> extern NPY_NO_EXPORT PyTypeObject %(type)s;
<add>#else
<add> NPY_NO_EXPORT PyTypeObject %(type)s;
<add>#endif
<add>""" % {'type': name}
<add> module_list.append(astr)
<add> astr = "#define %s (*(PyTypeObject *)PyArray_API[%d])" % \
<add> (name, index)
<add> extension_list.append(astr)
<ide>
<ide> # set up object API
<ide> num = genapi.add_api_list(numtypes, 'PyArray_API', numpyapi_list, | 1 |
Text | Text | remove scaleway info | 2ec186377eef75efafb58c56ea4ebb673e60b9fd | <ide><path>ARM.md
<ide> So for example in order to build a Docker binary one has to
<ide> 1. clone the Docker/Docker repository on an ARM device `git clone git@github.com:docker/docker.git`
<ide> 2. change into the checked out repository with `cd docker`
<ide> 3. execute `make binary` to create a Docker Engine binary for ARM
<del>
<del># Supported devices
<del>
<del>## Scaleway Server C1
<del>A Scaleway C1 server can be easily purchased on demand on the Scaleway website:
<del>
<del>https://www.scaleway.com
<del>
<del>It is a cheap and fast way to get access to a physical ARM server.
<del>It features a 4-cores ARMv7 CPU with 2GB of RAM and a 1 Gbit/s network card.
<del>
<del>Scaleway servers can be started we prepared images from their image hub.
<del>The best image to build a Docker Development Image is:
<del>
<del>https://www.scaleway.com/imagehub/docker/ | 1 |
Go | Go | fix style of swarm test name | 073d8115871f15ad36b6ab34e2af1f8f22ec333f | <ide><path>integration-cli/docker_api_swarm_test.go
<ide> func (s *DockerSwarmSuite) TestAPISwarmServicesUpdate(c *check.C) {
<ide> map[string]int{image1: instances})
<ide> }
<ide>
<del>func (s *DockerSwarmSuite) TestApiSwarmServicesFailedUpdate(c *check.C) {
<add>func (s *DockerSwarmSuite) TestAPISwarmServicesFailedUpdate(c *check.C) {
<ide> const nodeCount = 3
<ide> var daemons [nodeCount]*SwarmDaemon
<ide> for i := 0; i < nodeCount; i++ { | 1 |
Javascript | Javascript | clarify scope types and controlleras | 80a2176e2095eaf441c43c221391093631d34031 | <ide><path>src/ng/compile.js
<ide> *
<ide> * * **falsy:** No scope will be created for the directive. The directive will use its parent's scope.
<ide> *
<del> * * **`true`:** A new scope will be created for the directive's element. If multiple directives on the
<del> * same element request a new scope, only one new scope is created. The new scope rule does not apply
<del> * for the root of the template since the root of the template always gets a new scope.
<add> * * **`true`:** A new child scope that prototypically inherits from its parent will be created for
<add> * the directive's element. If multiple directives on the same element request a new scope,
<add> * only one new scope is created. The new scope rule does not apply for the root of the template
<add> * since the root of the template always gets a new scope.
<ide> *
<ide> * * **`{...}` (an object hash):** A new "isolate" scope is created for the directive's element. The
<ide> * 'isolate' scope differs from normal scope in that it does not prototypically inherit from its parent
<ide> *
<ide> * #### `controllerAs`
<ide> * Identifier name for a reference to the controller in the directive's scope.
<del> * This allows the controller to be referenced from the directive template. The directive
<del> * needs to define a scope for this configuration to be used. Useful in the case when
<del> * directive is used as component.
<add> * This allows the controller to be referenced from the directive template. This is especially
<add> * useful when a directive is used as component, i.e. with an `isolate` scope. It's also possible
<add> * to use it in a directive without an `isolate` / `new` scope, but you need to be aware that the
<add> * `controllerAs` reference might overwrite a property that already exists on the parent scope.
<ide> *
<ide> *
<ide> * #### `restrict` | 1 |
Java | Java | restore prior resolvetypearguments behavior | 1a3ba79071a4e0898ec8d8807ea584ef00bdc3d7 | <ide><path>spring-core/src/main/java/org/springframework/core/GenericTypeResolver.java
<ide> public static Class[] resolveTypeArguments(Class<?> clazz, Class<?> genericIfc)
<ide> if (!type.hasGenerics()) {
<ide> return null;
<ide> }
<del> return type.resolveGenerics();
<add> return type.resolveGenerics(Object.class);
<ide> }
<ide>
<ide> /**
<ide><path>spring-core/src/main/java/org/springframework/core/ResolvableType.java
<ide> public ResolvableType[] getGenerics() {
<ide> * @see #resolve()
<ide> */
<ide> public Class<?>[] resolveGenerics() {
<add> return resolveGenerics(null);
<add> }
<add>
<add> /**
<add> * Convenience method that will {@link #getGenerics() get} and {@link #resolve()
<add> * resolve} generic parameters, using the specified {@code fallback} if any type
<add> * cannot be resolved.
<add> * @param fallback the fallback class to use if resolution fails (may be {@code null})
<add> * @return an array of resolved generic parameters (the resulting array will never be
<add> * {@code null}, but it may contain {@code null} elements})
<add> * @see #getGenerics()
<add> * @see #resolve()
<add> */
<add> public Class<?>[] resolveGenerics(Class<?> fallback) {
<ide> ResolvableType[] generics = getGenerics();
<ide> Class<?>[] resolvedGenerics = new Class<?>[generics.length];
<ide> for (int i = 0; i < generics.length; i++) {
<del> resolvedGenerics[i] = generics[i].resolve();
<add> resolvedGenerics[i] = generics[i].resolve(fallback);
<ide> }
<ide> return resolvedGenerics;
<ide> }
<ide><path>spring-core/src/main/java/org/springframework/core/convert/support/GenericConversionService.java
<ide> /*
<del> * Copyright 2002-2012 the original author or authors.
<add> * Copyright 2002-2013 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import java.util.concurrent.ConcurrentHashMap;
<ide>
<ide> import org.springframework.core.GenericTypeResolver;
<add>import org.springframework.core.ResolvableType;
<ide> import org.springframework.core.convert.ConversionException;
<ide> import org.springframework.core.convert.ConversionFailedException;
<ide> import org.springframework.core.convert.ConversionService;
<ide> protected GenericConverter getDefaultConverter(TypeDescriptor sourceType, TypeDe
<ide> // internal helpers
<ide>
<ide> private GenericConverter.ConvertiblePair getRequiredTypeInfo(Object converter, Class<?> genericIfc) {
<del> Class<?>[] args = GenericTypeResolver.resolveTypeArguments(converter.getClass(), genericIfc);
<del> return (args != null ? new GenericConverter.ConvertiblePair(args[0], args[1]) : null);
<add> ResolvableType resolvableType = ResolvableType.forClass(converter.getClass()).as(genericIfc);
<add> if(resolvableType.hasUnresolvableGenerics()) {
<add> return null;
<add> }
<add> return new GenericConverter.ConvertiblePair(resolvableType.resolveGeneric(0), resolvableType.resolveGeneric(1));
<ide> }
<ide>
<ide> private void invalidateCache() {
<ide><path>spring-core/src/test/java/org/springframework/core/GenericTypeResolverTests.java
<ide> import java.lang.reflect.TypeVariable;
<ide> import java.util.Collection;
<ide> import java.util.HashMap;
<add>import java.util.List;
<ide> import java.util.Map;
<ide>
<ide> import org.junit.Test;
<ide> public void testGetTypeVariableMap() throws Exception {
<ide> assertThat(x, equalTo((Type) Long.class));
<ide> }
<ide>
<add> @Test
<add> public void getGenericsCannotBeResovled() throws Exception {
<add> // SPR-11030
<add> Class[] resolved = GenericTypeResolver.resolveTypeArguments(List.class, Iterable.class);
<add> assertThat(resolved, equalTo(new Class[] { Object.class }));
<add> }
<ide>
<ide> public interface MyInterfaceType<T> {
<ide> } | 4 |
Javascript | Javascript | remove unused packages from xplat/js/package.json | 894f6b3e744679769ba0f6b83bae89354f0f1914 | <ide><path>scripts/android-e2e-test.js
<ide> const wd = require('wd');
<ide> const path = require('path');
<ide> const fs = require('fs');
<ide> const pd = require('pretty-data2').pd;
<del>require('colors');
<add>
<ide> // value in ms to print out screen contents, set this value in CI to debug if tests are failing
<ide> const appiumDebugInterval = process.env.APPIUM_DEBUG_INTERVAL;
<ide> | 1 |
Python | Python | add script to run ud test | 723b328062f02f0833234900314418abc7d372eb | <ide><path>spacy/cli/ud_run_test.py
<add>'''Train for CONLL 2017 UD treebank evaluation. Takes .conllu files, writes
<add>.conllu format for development data, allowing the official scorer to be used.
<add>'''
<add>from __future__ import unicode_literals
<add>import plac
<add>import tqdm
<add>from pathlib import Path
<add>import re
<add>import sys
<add>import json
<add>
<add>import spacy
<add>import spacy.util
<add>from ..tokens import Token, Doc
<add>from ..gold import GoldParse
<add>from ..util import compounding, minibatch_by_words
<add>from ..syntax.nonproj import projectivize
<add>from ..matcher import Matcher
<add>from ..morphology import Fused_begin, Fused_inside
<add>from .. import displacy
<add>from collections import defaultdict, Counter
<add>from timeit import default_timer as timer
<add>
<add>import itertools
<add>import random
<add>import numpy.random
<add>import cytoolz
<add>
<add>from . import conll17_ud_eval
<add>
<add>from .. import lang
<add>from .. import lang
<add>from ..lang import zh
<add>from ..lang import ja
<add>from ..lang import ru
<add>
<add>
<add>################
<add># Data reading #
<add>################
<add>
<add>space_re = re.compile('\s+')
<add>def split_text(text):
<add> return [space_re.sub(' ', par.strip()) for par in text.split('\n\n')]
<add>
<add>
<add>##############
<add># Evaluation #
<add>##############
<add>
<add>def read_conllu(file_):
<add> docs = []
<add> sent = []
<add> doc = []
<add> for line in file_:
<add> if line.startswith('# newdoc'):
<add> if doc:
<add> docs.append(doc)
<add> doc = []
<add> elif line.startswith('#'):
<add> continue
<add> elif not line.strip():
<add> if sent:
<add> doc.append(sent)
<add> sent = []
<add> else:
<add> sent.append(list(line.strip().split('\t')))
<add> if len(sent[-1]) != 10:
<add> print(repr(line))
<add> raise ValueError
<add> if sent:
<add> doc.append(sent)
<add> if doc:
<add> docs.append(doc)
<add> return docs
<add>
<add>
<add>def evaluate(nlp, text_loc, gold_loc, sys_loc, limit=None):
<add> if text_loc.parts[-1].endswith('.conllu'):
<add> docs = []
<add> with text_loc.open() as file_:
<add> for conllu_doc in read_conllu(file_):
<add> for conllu_sent in conllu_doc:
<add> words = [line[1] for line in conllu_sent]
<add> docs.append(Doc(nlp.vocab, words=words))
<add> for name, component in nlp.pipeline:
<add> docs = list(component.pipe(docs))
<add> else:
<add> with text_loc.open('r', encoding='utf8') as text_file:
<add> texts = split_text(text_file.read())
<add> docs = list(nlp.pipe(texts))
<add> with sys_loc.open('w', encoding='utf8') as out_file:
<add> write_conllu(docs, out_file)
<add> with gold_loc.open('r', encoding='utf8') as gold_file:
<add> gold_ud = conll17_ud_eval.load_conllu(gold_file)
<add> with sys_loc.open('r', encoding='utf8') as sys_file:
<add> sys_ud = conll17_ud_eval.load_conllu(sys_file)
<add> scores = conll17_ud_eval.evaluate(gold_ud, sys_ud)
<add> return docs, scores
<add>
<add>
<add>def write_conllu(docs, file_):
<add> merger = Matcher(docs[0].vocab)
<add> merger.add('SUBTOK', None, [{'DEP': 'subtok', 'op': '+'}])
<add> for i, doc in enumerate(docs):
<add> matches = merger(doc)
<add> spans = [doc[start:end+1] for _, start, end in matches]
<add> offsets = [(span.start_char, span.end_char) for span in spans]
<add> for start_char, end_char in offsets:
<add> doc.merge(start_char, end_char)
<add> # TODO: This shuldn't be necessary? Should be handled in merge
<add> for word in doc:
<add> if word.i == word.head.i:
<add> word.dep_ = 'ROOT'
<add> file_.write("# newdoc id = {i}\n".format(i=i))
<add> for j, sent in enumerate(doc.sents):
<add> file_.write("# sent_id = {i}.{j}\n".format(i=i, j=j))
<add> file_.write("# text = {text}\n".format(text=sent.text))
<add> for k, token in enumerate(sent):
<add> file_.write(_get_token_conllu(token, k, len(sent)) + '\n')
<add> file_.write('\n')
<add> for word in sent:
<add> if word.head.i == word.i and word.dep_ == 'ROOT':
<add> break
<add> else:
<add> print("Rootless sentence!")
<add> print(sent)
<add> print(i)
<add> for w in sent:
<add> print(w.i, w.text, w.head.text, w.head.i, w.dep_)
<add> raise ValueError
<add>
<add>
<add>def _get_token_conllu(token, k, sent_len):
<add> if token.check_morph(Fused_begin) and (k+1 < sent_len):
<add> n = 1
<add> text = [token.text]
<add> while token.nbor(n).check_morph(Fused_inside):
<add> text.append(token.nbor(n).text)
<add> n += 1
<add> id_ = '%d-%d' % (k+1, (k+n))
<add> fields = [id_, ''.join(text)] + ['_'] * 8
<add> lines = ['\t'.join(fields)]
<add> else:
<add> lines = []
<add> if token.head.i == token.i:
<add> head = 0
<add> else:
<add> head = k + (token.head.i - token.i) + 1
<add> fields = [str(k+1), token.text, token.lemma_, token.pos_, token.tag_, '_',
<add> str(head), token.dep_.lower(), '_', '_']
<add> if token.check_morph(Fused_begin) and (k+1 < sent_len):
<add> if k == 0:
<add> fields[1] = token.norm_[0].upper() + token.norm_[1:]
<add> else:
<add> fields[1] = token.norm_
<add> elif token.check_morph(Fused_inside):
<add> fields[1] = token.norm_
<add> elif token._.split_start is not None:
<add> split_start = token._.split_start
<add> split_end = token._.split_end
<add> split_len = (split_end.i - split_start.i) + 1
<add> n_in_split = token.i - split_start.i
<add> subtokens = guess_fused_orths(split_start.text, [''] * split_len)
<add> fields[1] = subtokens[n_in_split]
<add>
<add> lines.append('\t'.join(fields))
<add> return '\n'.join(lines)
<add>
<add>
<add>def guess_fused_orths(word, ud_forms):
<add> '''The UD data 'fused tokens' don't necessarily expand to keys that match
<add> the form. We need orths that exact match the string. Here we make a best
<add> effort to divide up the word.'''
<add> if word == ''.join(ud_forms):
<add> # Happy case: we get a perfect split, with each letter accounted for.
<add> return ud_forms
<add> elif len(word) == sum(len(subtoken) for subtoken in ud_forms):
<add> # Unideal, but at least lengths match.
<add> output = []
<add> remain = word
<add> for subtoken in ud_forms:
<add> assert len(subtoken) >= 1
<add> output.append(remain[:len(subtoken)])
<add> remain = remain[len(subtoken):]
<add> assert len(remain) == 0, (word, ud_forms, remain)
<add> return output
<add> else:
<add> # Let's say word is 6 long, and there are three subtokens. The orths
<add> # *must* equal the original string. Arbitrarily, split [4, 1, 1]
<add> first = word[:len(word)-(len(ud_forms)-1)]
<add> output = [first]
<add> remain = word[len(first):]
<add> for i in range(1, len(ud_forms)):
<add> assert remain
<add> output.append(remain[:1])
<add> remain = remain[1:]
<add> assert len(remain) == 0, (word, output, remain)
<add> return output
<add>
<add>
<add>
<add>def print_results(name, ud_scores):
<add> fields = {}
<add> if ud_scores is not None:
<add> fields.update({
<add> 'words': ud_scores['Words'].f1 * 100,
<add> 'sents': ud_scores['Sentences'].f1 * 100,
<add> 'tags': ud_scores['XPOS'].f1 * 100,
<add> 'uas': ud_scores['UAS'].f1 * 100,
<add> 'las': ud_scores['LAS'].f1 * 100,
<add> })
<add> else:
<add> fields.update({
<add> 'words': 0.0,
<add> 'sents': 0.0,
<add> 'tags': 0.0,
<add> 'uas': 0.0,
<add> 'las': 0.0
<add> })
<add> tpl = '\t'.join((
<add> name,
<add> '{las:.1f}',
<add> '{uas:.1f}',
<add> '{tags:.1f}',
<add> '{sents:.1f}',
<add> '{words:.1f}',
<add> ))
<add> print(tpl.format(**fields))
<add> return fields
<add>
<add>
<add>def get_token_split_start(token):
<add> if token.text == '':
<add> assert token.i != 0
<add> i = -1
<add> while token.nbor(i).text == '':
<add> i -= 1
<add> return token.nbor(i)
<add> elif (token.i+1) < len(token.doc) and token.nbor(1).text == '':
<add> return token
<add> else:
<add> return None
<add>
<add>
<add>def get_token_split_end(token):
<add> if (token.i+1) == len(token.doc):
<add> return token if token.text == '' else None
<add> elif token.text != '' and token.nbor(1).text != '':
<add> return None
<add> i = 1
<add> while (token.i+i) < len(token.doc) and token.nbor(i).text == '':
<add> i += 1
<add> return token.nbor(i-1)
<add>
<add>
<add>Token.set_extension('split_start', getter=get_token_split_start)
<add>Token.set_extension('split_end', getter=get_token_split_end)
<add>Token.set_extension('begins_fused', default=False)
<add>Token.set_extension('inside_fused', default=False)
<add>
<add>
<add>##################
<add># Initialization #
<add>##################
<add>
<add>
<add>def load_nlp(experiments_dir, corpus):
<add> nlp = spacy.load(experiments_dir / corpus / 'best-model')
<add> return nlp
<add>
<add>def initialize_pipeline(nlp, docs, golds, config, device):
<add> nlp.add_pipe(nlp.create_pipe('parser'))
<add> return nlp
<add>
<add>
<add>@plac.annotations(
<add> test_data_dir=("Path to Universal Dependencies test data", "positional", None, Path),
<add> experiment_dir=("Parent directory with output model", "positional", None, Path),
<add> corpus=("UD corpus to evaluate, e.g. UD_English, UD_Spanish, etc", "positional", None, str),
<add>)
<add>def main(test_data_dir, experiment_dir, corpus):
<add> lang.zh.Chinese.Defaults.use_jieba = False
<add> lang.ja.Japanese.Defaults.use_janome = False
<add> lang.ru.Russian.Defaults.use_pymorphy2 = False
<add>
<add> nlp = load_nlp(experiment_dir, corpus)
<add>
<add> treebank_code = nlp.meta['treebank']
<add> for section in ('test', 'dev'):
<add> if section == 'dev':
<add> section_dir = 'conll17-ud-development-2017-03-19'
<add> else:
<add> section_dir = 'conll17-ud-test-2017-05-09'
<add> text_path = test_data_dir / 'input' / section_dir / (treebank_code+'.txt')
<add> udpipe_path = test_data_dir / 'input' / section_dir / (treebank_code+'-udpipe.conllu')
<add> gold_path = test_data_dir / 'gold' / section_dir / (treebank_code+'.conllu')
<add>
<add> header = [section, 'LAS', 'UAS', 'TAG', 'SENT', 'WORD']
<add> print('\t'.join(header))
<add> inputs = {'gold': gold_path, 'udp': udpipe_path, 'raw': text_path}
<add> for input_type in ('udp', 'raw'):
<add> input_path = inputs[input_type]
<add> output_path = experiment_dir / corpus / '{section}.conllu'.format(section=section)
<add>
<add> parsed_docs, test_scores = evaluate(nlp, input_path, gold_path, output_path)
<add>
<add> accuracy = print_results(input_type, test_scores)
<add> acc_path = experiment_dir / corpus / '{section}-accuracy.json'.format(section=section)
<add> with open(acc_path, 'w') as file_:
<add> file_.write(json.dumps(accuracy, indent=2))
<add>
<add>
<add>if __name__ == '__main__':
<add> plac.call(main) | 1 |
PHP | PHP | use short closure | 7b17f5f32623c2ee75f2bff57a42bb8f180ac779 | <ide><path>database/factories/UserFactory.php
<ide> public function definition()
<ide> */
<ide> public function unverified()
<ide> {
<del> return $this->state(function (array $attributes) {
<del> return [
<del> 'email_verified_at' => null,
<del> ];
<del> });
<add> return $this->state(fn (array $attributes) => [
<add> 'email_verified_at' => null,
<add> ]);
<ide> }
<ide> } | 1 |
Ruby | Ruby | add hook for add_resource_route | 17b8fd5540fd7f6e3cbfa48c0daa4051674647b7 | <ide><path>railties/lib/rails/generators.rb
<ide> module Generators
<ide> :orm => false,
<ide> :performance_tool => nil,
<ide> :resource_controller => :controller,
<add> :resource_route => true,
<ide> :scaffold_controller => :scaffold_controller,
<ide> :stylesheets => true,
<ide> :stylesheet_engine => :css,
<ide> def self.hidden_namespaces
<ide>
<ide> [
<ide> "rails",
<add> "resource_route",
<ide> "#{orm}:migration",
<ide> "#{orm}:model",
<ide> "#{orm}:observer",
<ide><path>railties/lib/rails/generators/rails/resource/resource_generator.rb
<ide> class ResourceGenerator < ModelGenerator #metagenerator
<ide> class_option :http, :type => :boolean, :default => false,
<ide> :desc => "Generate resource with HTTP actions only"
<ide>
<del> def add_resource_route
<del> return if options[:actions].present?
<del> route_config = regular_class_path.collect{ |namespace| "namespace :#{namespace} do " }.join(" ")
<del> route_config << "resources :#{file_name.pluralize}"
<del> route_config << " end" * regular_class_path.size
<del> route route_config
<del> end
<add> hook_for :resource_route, :required => true
<ide> end
<ide> end
<ide> end
<ide><path>railties/lib/rails/generators/rails/resource_route/resource_route_generator.rb
<add>module Rails
<add> module Generators
<add> class ResourceRouteGenerator < NamedBase
<add> def add_resource_route
<add> return if options[:actions].present?
<add> route_config = regular_class_path.collect{ |namespace| "namespace :#{namespace} do " }.join(" ")
<add> route_config << "resources :#{file_name.pluralize}"
<add> route_config << " end" * regular_class_path.size
<add> route route_config
<add> end
<add> end
<add> end
<add>end | 3 |
Javascript | Javascript | change var to let | dde8fd3fab1ea849984d328f129691e32a055f21 | <ide><path>lib/fs.js
<ide> if (isWindows) {
<ide> };
<ide> } else {
<ide> splitRoot = function splitRoot(str) {
<del> for (var i = 0; i < str.length; ++i) {
<add> for (let i = 0; i < str.length; ++i) {
<ide> if (str.charCodeAt(i) !== CHAR_FORWARD_SLASH)
<ide> return str.slice(0, i);
<ide> } | 1 |
Python | Python | return scalars for scalar inputs to np.real/imag | d268966200adbe0e2afdd40aa92bb83c5a931e30 | <ide><path>numpy/lib/tests/test_type_check.py
<ide> def test_real(self):
<ide> y = np.random.rand(10,)
<ide> assert_array_equal(y, np.real(y))
<ide>
<add> y = np.array(1)
<add> out = np.real(y)
<add> assert_array_equal(y, out)
<add> assert_(isinstance(out, np.ndarray))
<add>
<add> y = 1
<add> out = np.real(y)
<add> assert_equal(y, out)
<add> assert_(not isinstance(out, np.ndarray))
<add>
<ide> def test_cmplx(self):
<ide> y = np.random.rand(10,)+1j*np.random.rand(10,)
<ide> assert_array_equal(y.real, np.real(y))
<ide>
<add> y = np.array(1 + 1j)
<add> out = np.real(y)
<add> assert_array_equal(y.real, out)
<add> assert_(isinstance(out, np.ndarray))
<add>
<add> y = 1 + 1j
<add> out = np.real(y)
<add> assert_equal(1.0, out)
<add> assert_(not isinstance(out, np.ndarray))
<add>
<ide>
<ide> class TestImag(TestCase):
<ide>
<ide> def test_real(self):
<ide> y = np.random.rand(10,)
<ide> assert_array_equal(0, np.imag(y))
<ide>
<add> y = np.array(1)
<add> out = np.imag(y)
<add> assert_array_equal(0, out)
<add> assert_(isinstance(out, np.ndarray))
<add>
<add> y = 1
<add> out = np.imag(y)
<add> assert_equal(0, out)
<add> assert_(not isinstance(out, np.ndarray))
<add>
<ide> def test_cmplx(self):
<ide> y = np.random.rand(10,)+1j*np.random.rand(10,)
<ide> assert_array_equal(y.imag, np.imag(y))
<ide>
<add> y = np.array(1 + 1j)
<add> out = np.imag(y)
<add> assert_array_equal(y.imag, out)
<add> assert_(isinstance(out, np.ndarray))
<add>
<add> y = 1 + 1j
<add> out = np.imag(y)
<add> assert_equal(1.0, out)
<add> assert_(not isinstance(out, np.ndarray))
<add>
<ide>
<ide> class TestIscomplex(TestCase):
<ide>
<ide><path>numpy/lib/type_check.py
<ide> 'common_type']
<ide>
<ide> import numpy.core.numeric as _nx
<del>from numpy.core.numeric import asarray, asanyarray, array, isnan, \
<del> obj2sctype, zeros
<add>from numpy.core.numeric import asarray, asanyarray, array, isnan, zeros
<ide> from .ufunclike import isneginf, isposinf
<ide>
<ide> _typecodes_by_elsize = 'GDFgdfQqLlIiHhBb?'
<ide> def asfarray(a, dtype=_nx.float_):
<ide> dtype = _nx.float_
<ide> return asarray(a, dtype=dtype)
<ide>
<add>
<ide> def real(val):
<ide> """
<del> Return the real part of the elements of the array.
<add> Return the real part of the complex argument.
<ide>
<ide> Parameters
<ide> ----------
<ide> def real(val):
<ide>
<ide> Returns
<ide> -------
<del> out : ndarray
<del> Output array. If `val` is real, the type of `val` is used for the
<del> output. If `val` has complex elements, the returned type is float.
<add> out : ndarray or scalar
<add> The real component of the complex argument. If `val` is real, the type
<add> of `val` is used for the output. If `val` has complex elements, the
<add> returned type is float.
<ide>
<ide> See Also
<ide> --------
<ide> def real(val):
<ide> >>> a.real = np.array([9, 8, 7])
<ide> >>> a
<ide> array([ 9.+2.j, 8.+4.j, 7.+6.j])
<add> >>> np.real(1 + 1j)
<add> 1.0
<ide>
<ide> """
<del> return asanyarray(val).real
<add> try:
<add> return val.real
<add> except AttributeError:
<add> return asanyarray(val).real
<add>
<ide>
<ide> def imag(val):
<ide> """
<del> Return the imaginary part of the elements of the array.
<add> Return the imaginary part of the complex argument.
<ide>
<ide> Parameters
<ide> ----------
<ide> def imag(val):
<ide>
<ide> Returns
<ide> -------
<del> out : ndarray
<del> Output array. If `val` is real, the type of `val` is used for the
<del> output. If `val` has complex elements, the returned type is float.
<add> out : ndarray or scalar
<add> The imaginary component of the complex argument. If `val` is real,
<add> the type of `val` is used for the output. If `val` has complex
<add> elements, the returned type is float.
<ide>
<ide> See Also
<ide> --------
<ide> def imag(val):
<ide> >>> a.imag = np.array([8, 10, 12])
<ide> >>> a
<ide> array([ 1. +8.j, 3.+10.j, 5.+12.j])
<add> >>> np.imag(1 + 1j)
<add> 1.0
<ide>
<ide> """
<del> return asanyarray(val).imag
<add> try:
<add> return val.imag
<add> except AttributeError:
<add> return asanyarray(val).imag
<add>
<ide>
<ide> def iscomplex(x):
<ide> """ | 2 |
Javascript | Javascript | use assert.strictequal and fix settimeout | f7a35df171f16a6e79de3343ccbebb0dd8df124e | <ide><path>test/parallel/test-domain-timers.js
<ide> var timeout;
<ide> var timeoutd = domain.create();
<ide>
<ide> timeoutd.on('error', common.mustCall(function(e) {
<del> assert.equal(e.message, 'Timeout UNREFd', 'Domain should catch timer error');
<add> assert.strictEqual(e.message, 'Timeout UNREFd',
<add> 'Domain should catch timer error');
<ide> clearTimeout(timeout);
<ide> }));
<ide>
<ide> timeoutd.run(function() {
<ide> setTimeout(function() {
<ide> throw new Error('Timeout UNREFd');
<del> }).unref();
<add> }, 0).unref();
<ide> });
<ide>
<ide> var immediated = domain.create();
<ide>
<ide> immediated.on('error', common.mustCall(function(e) {
<del> assert.equal(e.message, 'Immediate Error',
<del> 'Domain should catch immediate error');
<add> assert.strictEqual(e.message, 'Immediate Error',
<add> 'Domain should catch immediate error');
<ide> }));
<ide>
<ide> immediated.run(function() { | 1 |
Text | Text | add changelog to active model too [ci skip] | eddcdb0f1de6e7b1b503a6df8d60cd6a145ce080 | <ide><path>activemodel/CHANGELOG.md
<add>* Introduce `validate` as an alias for `valid?`.
<add>
<add> This is more intuitive when you want to run validations but don't care about the return value.
<add>
<add> *Henrik Nyh*
<add>
<ide> Please check [4-1-stable](https://github.com/rails/rails/blob/4-1-stable/activemodel/CHANGELOG.md) for previous changes. | 1 |
Ruby | Ruby | add pinned version to outdated json output | 996dcdee2cacdfee90602387d5c5142d749f00de | <ide><path>Library/Homebrew/cmd/outdated.rb
<ide> def print_outdated_json(formulae)
<ide>
<ide> json << { name: f.full_name,
<ide> installed_versions: outdated_versions.collect(&:to_s),
<del> current_version: current_version }
<add> current_version: current_version,
<add> pinned: f.pinned?,
<add> pinned_version: f.pinned_version }
<ide> end
<ide> puts JSON.generate(json)
<ide>
<ide><path>Library/Homebrew/test/cmd/outdated_spec.rb
<ide> .and be_a_success
<ide> end
<ide> end
<add>
<add> context "json output" do
<add> it "includes pinned version in the json output" do
<add> setup_test_formula "testball"
<add> (HOMEBREW_CELLAR/"testball/0.0.1/foo").mkpath
<add>
<add> shutup do
<add> expect { brew "pin", "testball" }.to be_a_success
<add> end
<add>
<add> expected_json = [
<add> {
<add> name: "testball",
<add> installed_versions: ["0.0.1"],
<add> current_version: "0.1",
<add> pinned: true,
<add> pinned_version: "0.0.1",
<add> },
<add> ].to_json
<add>
<add> expect { brew "outdated", "--json=v1" }
<add> .to output(expected_json + "\n").to_stdout
<add> .and not_to_output.to_stderr
<add> .and be_a_success
<add> end
<add>
<add> it "has no pinned version when the formula isn't pinned" do
<add> setup_test_formula "testball"
<add> (HOMEBREW_CELLAR/"testball/0.0.1/foo").mkpath
<add>
<add> expected_json = [
<add> {
<add> name: "testball",
<add> installed_versions: ["0.0.1"],
<add> current_version: "0.1",
<add> pinned: false,
<add> pinned_version: nil,
<add> },
<add> ].to_json
<add>
<add> expect { brew "outdated", "--json=v1" }
<add> .to output(expected_json + "\n").to_stdout
<add> .and not_to_output.to_stderr
<add> .and be_a_success
<add> end
<add> end
<ide> end | 2 |
Javascript | Javascript | remove bad assertion | 1df736e2561b9e061a218529fb1b47cfc55a5d84 | <ide><path>i18n/spec/closureI18nExtractorSpec.js
<ide> describe('findLocaleId', function() {
<ide> it('should not find localeId if data is missing', function() {
<ide> expect(findLocaleId('', 'num')).toBeUndefined();
<ide> expect(findLocaleId('aa', 'datetime')).toBeUndefined();
<del> expect(findLocaleId('aa', 'randomType')).toBeUndefined();
<ide> expect(findLocaleId('NumberFormatSymbols_en', 'datetime')).toBeUndefined();
<ide> expect(findLocaleId('DateTimeSymbols_en', 'num')).toBeUndefined();
<ide> }); | 1 |
Ruby | Ruby | build predicatebuilder object only when needed | 2ff73039bdb6880af5586d8e4d6960b34cdf00ce | <ide><path>activerecord/lib/active_record/relation/query_methods.rb
<ide> def build_arel
<ide> def build_where(*args)
<ide> return if args.blank?
<ide>
<del> builder = PredicateBuilder.new(table.engine)
<del>
<ide> opts = args.first
<ide> case opts
<ide> when String, Array
<ide> @klass.send(:sanitize_sql, args.size > 1 ? args : opts)
<ide> when Hash
<ide> attributes = @klass.send(:expand_hash_conditions_for_aggregates, opts)
<del> builder.build_from_hash(attributes, table)
<add> PredicateBuilder.new(table.engine).build_from_hash(attributes, table)
<ide> else
<ide> opts
<ide> end | 1 |
Javascript | Javascript | increase coverage of string-decoder | 492163c74cf8ed5bf581231a348d75e61809b9a2 | <ide><path>test/parallel/test-string-decoder-end.js
<ide> const assert = require('assert');
<ide> const SD = require('string_decoder').StringDecoder;
<ide> const encodings = ['base64', 'hex', 'utf8', 'utf16le', 'ucs2'];
<ide>
<del>const bufs = [ '☃💩', 'asdf' ].map(function(b) {
<del> return Buffer.from(b);
<del>});
<add>const bufs = [ '☃💩', 'asdf' ].map((b) => Buffer.from(b));
<ide>
<ide> // also test just arbitrary bytes from 0-15.
<ide> for (let i = 1; i <= 16; i++) {
<del> const bytes = new Array(i).join('.').split('.').map(function(_, j) {
<del> return j + 0x78;
<del> });
<add> const bytes = new Array(i).join('.').split('.').map((_, j) => j + 0x78);
<ide> bufs.push(Buffer.from(bytes));
<ide> }
<ide>
<ide> encodings.forEach(testEncoding);
<ide> console.log('ok');
<ide>
<ide> function testEncoding(encoding) {
<del> bufs.forEach(function(buf) {
<add> bufs.forEach((buf) => {
<ide> testBuf(encoding, buf);
<ide> });
<ide> }
<ide><path>test/parallel/test-string-decoder.js
<ide> assert.strictEqual(decoder.write(Buffer.from('3DD8', 'hex')), '');
<ide> assert.strictEqual(decoder.write(Buffer.from('4D', 'hex')), '');
<ide> assert.strictEqual(decoder.end(), '\ud83d');
<ide>
<add>assert.throws(() => {
<add> new StringDecoder(1);
<add>}, /^Error: Unknown encoding: 1$/);
<add>
<add>assert.throws(() => {
<add> new StringDecoder('test');
<add>}, /^Error: Unknown encoding: test$/);
<add>
<ide> // test verifies that StringDecoder will correctly decode the given input
<ide> // buffer with the given encoding to the expected output. It will attempt all
<ide> // possible ways to write() the input buffer, see writeSequences(). The
<ide> function test(encoding, input, expected, singleSequence) {
<ide> } else {
<ide> sequences = [singleSequence];
<ide> }
<del> sequences.forEach(function(sequence) {
<add> sequences.forEach((sequence) => {
<ide> const decoder = new StringDecoder(encoding);
<ide> let output = '';
<del> sequence.forEach(function(write) {
<add> sequence.forEach((write) => {
<ide> output += decoder.write(input.slice(write[0], write[1]));
<ide> });
<ide> output += decoder.end(); | 2 |
Javascript | Javascript | support texture name in gltfloader | 9dbb37e2342ce9523d1559c00244905745352f14 | <ide><path>examples/js/loaders/GLTFLoader.js
<ide> THREE.GLTFLoader = ( function () {
<ide>
<ide> _texture.flipY = false;
<ide>
<add> if ( texture.name !== undefined ) _texture.name = texture.name;
<add>
<ide> if ( texture.sampler ) {
<ide>
<ide> var sampler = json.samplers[ texture.sampler ]; | 1 |
PHP | PHP | list artisan | e0cf00fd1319733666d6c9c1e24468c66f6f58c7 | <ide><path>src/Illuminate/Foundation/Console/RouteListCommand.php
<ide> protected function getRouteInformation(Route $route)
<ide> 'method' => implode('|', $route->methods()),
<ide> 'uri' => $route->uri(),
<ide> 'name' => $route->getName(),
<del> 'action' => $route->getActionName(),
<add> 'action' => ltrim($route->getActionName(), '\\'),
<ide> 'middleware' => $this->getMiddleware($route),
<ide> ]);
<ide> } | 1 |
Ruby | Ruby | publish bottles on bintray | 675991eb611af0a970dce6945f5b882c4343875f | <ide><path>Library/Homebrew/cmd/pull.rb
<ide> def pull
<ide> safe_system "git", "merge", "--ff-only", "--no-edit", bottle_branch
<ide> safe_system "git", "branch", "-D", bottle_branch
<ide>
<del> # TODO: publish on bintray
<del> # safe_system "curl", "-u#{user}:#{key}", "-X", "POST",
<del> # "https://api.bintray.com/content/homebrew/#{repo}/#{formula}/#{version}"
<add> # Publish bottles on Bintray
<add> bintray_user = ENV["BINTRAY_USER"]
<add> bintray_key = ENV["BINTRAY_KEY"]
<add> bintray_repo = if tap_name
<add> tap_name.sub("/", "-") + "-bottles"
<add> else
<add> "bottles"
<add> end
<add>
<add> # Skip taps for now until we're using Bintray for Homebrew/homebrew
<add> if bintray_user && bintray_key && !tap_name
<add> changed_formulae.each do |f|
<add> ohai "Publishing on Bintray:"
<add> safe_system "curl", "--silent", "--fail",
<add> "-u#{bintray_user}:#{bintray_key}", "-X", "POST",
<add> "https://api.bintray.com/content/homebrew/#{bintray_repo}/#{f.name}/#{f.version}/publish"
<add> puts
<add> end
<add> end
<ide> end
<ide>
<ide> ohai 'Patch changed:' | 1 |
Javascript | Javascript | make syncwritestream non-enumerable | 53a4eb319893d722cd614bacde98856b1f7c37cb | <ide><path>lib/fs.js
<ide> util.inherits(SyncWriteStream, Stream);
<ide>
<ide>
<ide> // Export
<del>fs.SyncWriteStream = SyncWriteStream;
<del>
<add>Object.defineProperty(fs, 'SyncWriteStream', {
<add> configurable: true,
<add> writable: true,
<add> value: SyncWriteStream
<add>});
<ide>
<ide> SyncWriteStream.prototype.write = function(data, arg1, arg2) {
<ide> var encoding, cb; | 1 |
Python | Python | improve initialization for mutually textcat | d13b9373bf04133eae11873782ee992cbc048726 | <ide><path>spacy/_ml.py
<ide> def finish_update(d_X, sgd=None):
<ide>
<ide>
<ide> def _zero_init(model):
<del> def _zero_init_impl(self, X, y):
<add> def _zero_init_impl(self, *args, **kwargs):
<ide> self.W.fill(0)
<ide>
<del> model.on_data_hooks.append(_zero_init_impl)
<add> model.on_init_hooks.append(_zero_init_impl)
<ide> if model.W is not None:
<ide> model.W.fill(0.0)
<ide> return model
<ide> def build_simple_cnn_text_classifier(tok2vec, nr_class, exclusive_classes=False,
<ide> if exclusive_classes:
<ide> output_layer = Softmax(nr_class, tok2vec.nO)
<ide> else:
<del> output_layer = zero_init(Affine(nr_class, tok2vec.nO)) >> logistic
<add> output_layer = zero_init(Affine(nr_class, tok2vec.nO, drop_factor=0.0)) >> logistic
<ide> model = tok2vec >> flatten_add_lengths >> Pooling(mean_pool) >> output_layer
<ide> model.tok2vec = chain(tok2vec, flatten)
<ide> model.nO = nr_class | 1 |
Javascript | Javascript | stack trace for critical uglifyjs errors | 3597cdb1496bc14547b92118a76c5e168f590dca | <ide><path>lib/optimize/UglifyJsPlugin.js
<ide> UglifyJsPlugin.prototype.apply = function(compiler) {
<ide> compilation.errors.push(new Error(file + " from UglifyJs\n" + err.message + " [" + file + ":" + err.line + "," + err.col + "]"));
<ide> }
<ide> } else
<del> compilation.errors.push(new Error(file + " from UglifyJs\n" + err.message));
<add> compilation.errors.push(new Error(file + " from UglifyJs\n" + err.stack));
<ide> } finally {
<ide> uglify.AST_Node.warn_function = oldWarnFunction;
<ide> } | 1 |
Python | Python | fix conv_rnn bug with cntk/theano | 510f9f0b1d97ca3567c858e0300c913a4bc2bf53 | <ide><path>keras/layers/convolutional_recurrent.py
<ide> def get_initial_state(self, inputs):
<ide> import tensorflow as tf
<ide> kernel = tf.zeros(tuple(shape))
<ide> else:
<del> K.zeros(tuple(shape))
<add> kernel = K.zeros(tuple(shape))
<ide> initial_state = self.cell.input_conv(initial_state,
<ide> kernel,
<ide> padding=self.cell.padding) | 1 |
PHP | PHP | allow string payloads for get requests | 2b6648164cdb817c775c53021387d67d229dafbf | <ide><path>src/Network/Http/Client.php
<ide> public function cookies()
<ide> * @param array $options Additional options for the request.
<ide> * @return \Cake\Network\Http\Response
<ide> */
<del> public function get($url, array $data = [], array $options = [])
<add> public function get($url, $data = [], array $options = [])
<ide> {
<ide> $options = $this->_mergeOptions($options);
<ide> $body = [];
<ide> public function send(Request $request, $options = [])
<ide> * Generate a URL based on the scoped client options.
<ide> *
<ide> * @param string $url Either a full URL or just the path.
<del> * @param array $query The query data for the URL.
<add> * @param string|array $query The query data for the URL.
<ide> * @param array $options The config options stored with Client::config()
<ide> * @return string A complete url with scheme, port, host, path.
<ide> */
<ide> public function buildUrl($url, $query = [], $options = [])
<ide> }
<ide> if ($query) {
<ide> $q = (strpos($url, '?') === false) ? '?' : '&';
<del> $url .= $q . http_build_query($query);
<add> $url .= $q;
<add> $url .= is_string($query) ? $query : http_build_query($query);
<ide> }
<ide> if (preg_match('#^https?://#', $url)) {
<ide> return $url;
<ide><path>tests/TestCase/Network/Http/ClientTest.php
<ide> public function testGetQuerystring()
<ide> $this->assertSame($result, $response);
<ide> }
<ide>
<add> /**
<add> * test get request with string of query data.
<add> *
<add> * @return void
<add> */
<add> public function testGetQuerystringString()
<add> {
<add> $response = new Response();
<add>
<add> $mock = $this->getMock('Cake\Network\Http\Adapter\Stream', ['send']);
<add> $mock->expects($this->once())
<add> ->method('send')
<add> ->with($this->logicalAnd(
<add> $this->isInstanceOf('Cake\Network\Http\Request'),
<add> $this->attributeEqualTo('_url', 'http://cakephp.org/search?q=hi+there&Category%5Bid%5D%5B0%5D=2&Category%5Bid%5D%5B1%5D=3')
<add> ))
<add> ->will($this->returnValue([$response]));
<add>
<add> $http = new Client([
<add> 'host' => 'cakephp.org',
<add> 'adapter' => $mock
<add> ]);
<add> $data = [
<add> 'q' => 'hi there',
<add> 'Category' => ['id' => [2, 3]]
<add> ];
<add> $result = $http->get('/search', http_build_query($data));
<add> $this->assertSame($result, $response);
<add> }
<add>
<ide> /**
<ide> * Test a GET with a request body. Services like
<ide> * elasticsearch use this feature. | 2 |
Python | Python | add tests for reformer tokenizer | c9454507cf57d38fd863c2544300c88583fc60e3 | <ide><path>tests/test_tokenization_reformer.py
<add># coding=utf-8
<add># Copyright 2018 The Google AI Language Team Authors.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># http://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>
<add>
<add>import os
<add>import unittest
<add>
<add>from transformers.file_utils import cached_property
<add>from transformers.testing_utils import require_torch, slow
<add>from transformers.tokenization_reformer import SPIECE_UNDERLINE, ReformerTokenizer
<add>
<add>from .test_tokenization_common import TokenizerTesterMixin
<add>
<add>
<add>SAMPLE_VOCAB = os.path.join(os.path.dirname(os.path.abspath(__file__)), "fixtures/test_sentencepiece.model")
<add>
<add>
<add>class ReformerTokenizationTest(TokenizerTesterMixin, unittest.TestCase):
<add>
<add> tokenizer_class = ReformerTokenizer
<add>
<add> def setUp(self):
<add> super().setUp()
<add>
<add> tokenizer = ReformerTokenizer(SAMPLE_VOCAB, keep_accents=True)
<add> tokenizer.save_pretrained(self.tmpdirname)
<add>
<add> def test_full_tokenizer(self):
<add> tokenizer = ReformerTokenizer(SAMPLE_VOCAB, keep_accents=True)
<add>
<add> tokens = tokenizer.tokenize("This is a test")
<add> self.assertListEqual(tokens, ["▁This", "▁is", "▁a", "▁t", "est"])
<add>
<add> self.assertListEqual(
<add> tokenizer.convert_tokens_to_ids(tokens), [285, 46, 10, 170, 382],
<add> )
<add>
<add> tokens = tokenizer.tokenize("I was born in 92000, and this is falsé.")
<add> self.assertListEqual(
<add> tokens,
<add> [
<add> SPIECE_UNDERLINE + "I",
<add> SPIECE_UNDERLINE + "was",
<add> SPIECE_UNDERLINE + "b",
<add> "or",
<add> "n",
<add> SPIECE_UNDERLINE + "in",
<add> SPIECE_UNDERLINE + "",
<add> "9",
<add> "2",
<add> "0",
<add> "0",
<add> "0",
<add> ",",
<add> SPIECE_UNDERLINE + "and",
<add> SPIECE_UNDERLINE + "this",
<add> SPIECE_UNDERLINE + "is",
<add> SPIECE_UNDERLINE + "f",
<add> "al",
<add> "s",
<add> "é",
<add> ".",
<add> ],
<add> )
<add> ids = tokenizer.convert_tokens_to_ids(tokens)
<add> self.assertListEqual(
<add> ids, [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
<add> )
<add>
<add> back_tokens = tokenizer.convert_ids_to_tokens(ids)
<add> self.assertListEqual(
<add> back_tokens,
<add> [
<add> SPIECE_UNDERLINE + "I",
<add> SPIECE_UNDERLINE + "was",
<add> SPIECE_UNDERLINE + "b",
<add> "or",
<add> "n",
<add> SPIECE_UNDERLINE + "in",
<add> SPIECE_UNDERLINE + "",
<add> "<unk>",
<add> "2",
<add> "0",
<add> "0",
<add> "0",
<add> ",",
<add> SPIECE_UNDERLINE + "and",
<add> SPIECE_UNDERLINE + "this",
<add> SPIECE_UNDERLINE + "is",
<add> SPIECE_UNDERLINE + "f",
<add> "al",
<add> "s",
<add> "<unk>",
<add> ".",
<add> ],
<add> )
<add>
<add> @cached_property
<add> def big_tokenizer(self):
<add> return ReformerTokenizer.from_pretrained("google/reformer-crime-and-punishment")
<add>
<add> @slow
<add> def test_tokenization_base_easy_symbols(self):
<add> symbols = "Hello World!"
<add> original_tokenizer_encodings = [126, 32, 262, 152, 38, 72, 287]
<add>
<add> self.assertListEqual(original_tokenizer_encodings, self.big_tokenizer.encode(symbols))
<add>
<add> @slow
<add> def test_tokenization_base_hard_symbols(self):
<add> symbols = 'This is a very long text with a lot of weird characters, such as: . , ~ ? ( ) " [ ] ! : - . Also we will add words that should not exsist and be tokenized to <unk>, such as saoneuhaoesuth'
<add> original_tokenizer_encodings = [
<add> 108,
<add> 265,
<add> 24,
<add> 111,
<add> 4,
<add> 258,
<add> 156,
<add> 35,
<add> 28,
<add> 275,
<add> 3,
<add> 259,
<add> 297,
<add> 260,
<add> 84,
<add> 4,
<add> 35,
<add> 110,
<add> 44,
<add> 8,
<add> 259,
<add> 91,
<add> 268,
<add> 21,
<add> 11,
<add> 209,
<add> 274,
<add> 109,
<add> 266,
<add> 277,
<add> 117,
<add> 86,
<add> 93,
<add> 315,
<add> 258,
<add> 278,
<add> 258,
<add> 277,
<add> 258,
<add> 0,
<add> 258,
<add> 288,
<add> 258,
<add> 319,
<add> 258,
<add> 0,
<add> 258,
<add> 0,
<add> 258,
<add> 0,
<add> 258,
<add> 0,
<add> 258,
<add> 287,
<add> 258,
<add> 315,
<add> 258,
<add> 289,
<add> 258,
<add> 278,
<add> 99,
<add> 269,
<add> 266,
<add> 262,
<add> 8,
<add> 259,
<add> 241,
<add> 4,
<add> 217,
<add> 230,
<add> 268,
<add> 266,
<add> 55,
<add> 168,
<add> 106,
<add> 75,
<add> 193,
<add> 266,
<add> 223,
<add> 27,
<add> 49,
<add> 26,
<add> 282,
<add> 25,
<add> 264,
<add> 299,
<add> 19,
<add> 26,
<add> 0,
<add> 258,
<add> 277,
<add> 117,
<add> 86,
<add> 93,
<add> 176,
<add> 183,
<add> 270,
<add> 11,
<add> 262,
<add> 42,
<add> 61,
<add> 265,
<add> ]
<add>
<add> self.assertListEqual(original_tokenizer_encodings, self.big_tokenizer.encode(symbols))
<add>
<add> @slow
<add> @require_torch
<add> def test_torch_encode_plus_sent_to_model(self):
<add> import torch
<add> from transformers import ReformerModel, ReformerConfig
<add>
<add> # Build sequence
<add> first_ten_tokens = list(self.big_tokenizer.get_vocab().keys())[:10]
<add> sequence = " ".join(first_ten_tokens)
<add> encoded_sequence = self.big_tokenizer.encode_plus(sequence, return_tensors="pt")
<add> batch_encoded_sequence = self.big_tokenizer.batch_encode_plus([sequence, sequence], return_tensors="pt")
<add>
<add> config = ReformerConfig()
<add> # The input gets padded during training so adjust the axial position encodings from the pretrained model value of (512, 1024)
<add> config.axial_pos_shape = encoded_sequence["input_ids"].shape
<add> model = ReformerModel(config)
<add>
<add> # Reformer has config.vocab_size == tokenizer.vocab_size == len(tokenizer) - 1 = 320; len(tokenizer) is 321 (including a pad token with id 320)
<add> assert model.get_input_embeddings().weight.shape[0] >= self.big_tokenizer.vocab_size
<add>
<add> with torch.no_grad():
<add> model(**encoded_sequence)
<add> model(**batch_encoded_sequence) | 1 |
Go | Go | remove deprecated types | f02cdb50be3c99dcb5609efcb44c2af51e612f7a | <ide><path>api/types/volume/deprecated.go
<del>package volume // import "github.com/docker/docker/api/types/volume"
<del>
<del>// VolumeCreateBody Volume configuration
<del>//
<del>// Deprecated: use CreateOptions
<del>type VolumeCreateBody = CreateOptions
<del>
<del>// VolumeListOKBody Volume list response
<del>//
<del>// Deprecated: use ListResponse
<del>type VolumeListOKBody = ListResponse | 1 |
Text | Text | add translation for titles [portuguese] | b8d0f2c82fece5427fd980b181cd1c3c12c3f00f | <ide><path>curriculum/challenges/portuguese/03-front-end-libraries/jquery/clone-an-element-using-jquery.portuguese.md
<ide> videoUrl: ''
<ide> localeTitle: Clone um elemento usando jQuery
<ide> ---
<ide>
<del>## Description
<add>## Descrição
<ide> <section id="description"> Além de mover elementos, você também pode copiá-los de um lugar para outro. jQuery tem uma função chamada <code>clone()</code> que faz uma cópia de um elemento. Por exemplo, se quiséssemos copiar o <code>target2</code> do nosso <code>target2</code> da <code>left-well</code> para o nosso da <code>right-well</code> , <code>$("#target2").clone().appendTo("#right-well");</code> : <code>$("#target2").clone().appendTo("#right-well");</code> Você notou que isso envolve juntar duas funções jQuery? Isso é chamado de <code>function chaining</code> e é uma maneira conveniente de fazer as coisas com o jQuery. Clone seu elemento <code>target5</code> e anexe-o à sua <code>left-well</code> . </section>
<ide>
<del>## Instructions
<add>## Intruções
<ide> <section id="instructions">
<ide> </section>
<ide>
<del>## Tests
<add>## Testes
<ide> <section id='tests'>
<ide>
<ide> ```yml
<ide> tests:
<ide>
<ide> </section>
<ide>
<del>## Challenge Seed
<add>## Desafio
<ide> <section id='challengeSeed'>
<ide>
<ide> <div id='html-seed'>
<ide> tests:
<ide>
<ide> </section>
<ide>
<del>## Solution
<add>## Solução
<ide> <section id='solution'>
<ide>
<ide> ```js | 1 |
Python | Python | add regression test for #686 | 444d665f9dd9befe00a60f3e3ead9418546bd24e | <ide><path>spacy/tests/regression/test_issue686.py
<add># coding: utf8
<add>from __future__ import unicode_literals
<add>
<add>import pytest
<add>
<add>
<add>@pytest.mark.xfail
<add>@pytest.mark.models
<add>@pytest.mark.parametrize('text', ["He is the man.", "They are the men."])
<add>def test_issue686(EN, text):
<add> """Test that pronoun lemmas are assigned correctly."""
<add> tokens = EN(text)
<add> assert tokens[0].lemma_ == "-PRON-" | 1 |
Text | Text | remove all-caps shouting from child_process.md | 266234bd62ff2d6f60e9127508f89641f8ada7c8 | <ide><path>doc/api/child_process.md
<ide> See also: [`child_process.exec()`][] and [`child_process.fork()`][].
<ide> ## Synchronous Process Creation
<ide>
<ide> The [`child_process.spawnSync()`][], [`child_process.execSync()`][], and
<del>[`child_process.execFileSync()`][] methods are **synchronous** and **WILL**
<add>[`child_process.execFileSync()`][] methods are **synchronous** and **will**
<ide> block the Node.js event loop, pausing execution of any additional code until the
<ide> spawned process exits.
<ide> | 1 |
Javascript | Javascript | add tests for fs/promises chown functions | a4ce449bb7b0e2f9edba75baf2a2f652a6750454 | <ide><path>test/parallel/test-fs-promises.js
<ide> const fsPromises = fs.promises;
<ide> const {
<ide> access,
<ide> chmod,
<add> chown,
<ide> copyFile,
<add> lchown,
<ide> link,
<ide> lchmod,
<ide> lstat,
<ide> function verifyStatObject(stat) {
<ide> await chmod(dest, (0o10777));
<ide> await handle.chmod(0o10777);
<ide>
<add> if (!common.isWindows) {
<add> await chown(dest, process.getuid(), process.getgid());
<add> await handle.chown(process.getuid(), process.getgid());
<add> }
<add>
<add> assert.rejects(
<add> async () => {
<add> await chown(dest, 1, -1);
<add> },
<add> {
<add> code: 'ERR_OUT_OF_RANGE',
<add> name: 'RangeError [ERR_OUT_OF_RANGE]',
<add> message: 'The value of "gid" is out of range. ' +
<add> 'It must be >= 0 && < 4294967296. Received -1'
<add> });
<add>
<add> assert.rejects(
<add> async () => {
<add> await handle.chown(1, -1);
<add> },
<add> {
<add> code: 'ERR_OUT_OF_RANGE',
<add> name: 'RangeError [ERR_OUT_OF_RANGE]',
<add> message: 'The value of "gid" is out of range. ' +
<add> 'It must be >= 0 && < 4294967296. Received -1'
<add> });
<add>
<ide> await utimes(dest, new Date(), new Date());
<ide>
<ide> try {
<ide> function verifyStatObject(stat) {
<ide> if (common.canCreateSymLink()) {
<ide> const newLink = path.resolve(tmpDir, 'baz3.js');
<ide> await symlink(newPath, newLink);
<add> if (!common.isWindows) {
<add> await lchown(newLink, process.getuid(), process.getgid());
<add> }
<ide> stats = await lstat(newLink);
<ide> verifyStatObject(stats);
<ide> | 1 |
Text | Text | fix typo in shallow routing doc | 191d65889be4fdaedc4442ff9cf4435ee71705d9 | <ide><path>readme.md
<ide> Router.onAppUpdated = (nextUrl) => {
<ide>
<ide> Shallow routing allows you to change the URL without running `getInitialProps`. You'll receive the updated `pathname` and the `query` via the `url` prop of the same page that's loaded, without losing state.
<ide>
<del>You can do this by invoking the eith `Router.push` or `Router.replace` with `shallow: true` option. Here's an example:
<add>You can do this by invoking either `Router.push` or `Router.replace` with the `shallow: true` option. Here's an example:
<ide>
<ide> ```jsx
<ide> // Current URL is "/" | 1 |
Ruby | Ruby | detect uncompressed tars | 428781723b9612198b05b5fb30ce60622f363011 | <ide><path>Library/Homebrew/download_strategy.rb
<ide> def stage
<ide> when :zip
<ide> quiet_safe_system '/usr/bin/unzip', {:quiet_flag => '-qq'}, @tarball_path
<ide> chdir
<del> when :gzip, :bzip2, :compress
<add> when :gzip, :bzip2, :compress, :tar
<ide> # Assume these are also tarred
<ide> # TODO check if it's really a tar archive
<ide> safe_system '/usr/bin/tar', 'xf', @tarball_path
<ide><path>Library/Homebrew/extend/pathname.rb
<ide> def compression_type
<ide> # OS X installer package
<ide> return :pkg if self.extname == '.pkg'
<ide>
<del> # get the first six bytes
<add> # Get enough of the file to detect common file types
<add> # POSIX tar magic has a 257 byte offset
<ide> magic_bytes = nil
<del> File.open(self) { |f| magic_bytes = f.read(6) }
<add> File.open(self) { |f| magic_bytes = f.read(262) }
<ide>
<ide> # magic numbers stolen from /usr/share/file/magic/
<ide> case magic_bytes
<ide> when /^PK\003\004/ then :zip
<ide> when /^\037\213/ then :gzip
<ide> when /^BZh/ then :bzip2
<ide> when /^\037\235/ then :compress
<add> when /^.{257}ustar/ then :tar
<ide> when /^\xFD7zXZ\x00/ then :xz
<ide> when /^Rar!/ then :rar
<ide> else | 2 |
Java | Java | add way to override default multipartreader | e7cca7792d9281aacecb422bc51a758d67d27e6c | <ide><path>spring-web/src/main/java/org/springframework/http/codec/ServerCodecConfigurer.java
<ide> /*
<del> * Copyright 2002-2018 the original author or authors.
<add> * Copyright 2002-2019 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> package org.springframework.http.codec;
<ide>
<ide> import org.springframework.core.codec.Encoder;
<add>import org.springframework.http.codec.multipart.Part;
<ide>
<ide> /**
<ide> * Extension of {@link CodecConfigurer} for HTTP message reader and writer
<ide> interface ServerDefaultCodecs extends DefaultCodecs {
<ide> /**
<ide> * Configure the {@code Encoder} to use for Server-Sent Events.
<ide> * <p>By default if this is not set, and Jackson is available, the
<del> * {@link #jackson2JsonEncoder} override is used instead. Use this property
<del> * if you want to further customize the SSE encoder.
<add> * {@link #jackson2JsonEncoder} override is used instead. Use this method
<add> * to customize the SSE encoder.
<ide> */
<ide> void serverSentEventEncoder(Encoder<?> encoder);
<add>
<add> /**
<add> * Configure the {@code HttpMessageReader} to use for multipart messages
<add> * (i.e. file uploads).
<add> * <p>By default if this is not set, the
<add> * {@link org.springframework.http.codec.multipart.DefaultMultipartMessageReader} is used.
<add> * Use this method to customize the multipart reader, for instance to use
<add> * {@link org.springframework.http.codec.multipart.SynchronossPartHttpMessageReader}
<add> * instead.
<add> */
<add> void multipartReader(HttpMessageReader<Part> multipartReader);
<ide> }
<ide>
<ide> }
<ide><path>spring-web/src/main/java/org/springframework/http/codec/support/ServerDefaultCodecsImpl.java
<ide> /*
<del> * Copyright 2002-2018 the original author or authors.
<add> * Copyright 2002-2019 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import org.springframework.core.codec.Encoder;
<ide> import org.springframework.http.codec.HttpMessageReader;
<ide> import org.springframework.http.codec.HttpMessageWriter;
<add>import org.springframework.http.codec.LoggingCodecSupport;
<ide> import org.springframework.http.codec.ServerCodecConfigurer;
<ide> import org.springframework.http.codec.ServerSentEventHttpMessageWriter;
<add>import org.springframework.http.codec.multipart.DefaultMultipartMessageReader;
<ide> import org.springframework.http.codec.multipart.MultipartHttpMessageReader;
<del>import org.springframework.http.codec.multipart.SynchronossPartHttpMessageReader;
<add>import org.springframework.http.codec.multipart.Part;
<ide> import org.springframework.lang.Nullable;
<del>import org.springframework.util.ClassUtils;
<ide>
<ide> /**
<ide> * Default implementation of {@link ServerCodecConfigurer.ServerDefaultCodecs}.
<ide> */
<ide> class ServerDefaultCodecsImpl extends BaseDefaultCodecs implements ServerCodecConfigurer.ServerDefaultCodecs {
<ide>
<del> private static final boolean synchronossMultipartPresent =
<del> ClassUtils.isPresent("org.synchronoss.cloud.nio.multipart.NioMultipartParser",
<del> DefaultServerCodecConfigurer.class.getClassLoader());
<del>
<del>
<ide> @Nullable
<ide> private Encoder<?> sseEncoder;
<ide>
<add> @Nullable
<add> private HttpMessageReader<Part> multipartReader;
<ide>
<ide> @Override
<ide> public void serverSentEventEncoder(Encoder<?> encoder) {
<ide> this.sseEncoder = encoder;
<ide> }
<ide>
<add> @Override
<add> public void multipartReader(HttpMessageReader<Part> multipartReader) {
<add> this.multipartReader = multipartReader;
<add> }
<add>
<ide>
<ide> @Override
<ide> protected void extendTypedReaders(List<HttpMessageReader<?>> typedReaders) {
<del> if (synchronossMultipartPresent) {
<del> boolean enable = isEnableLoggingRequestDetails();
<ide>
<del> SynchronossPartHttpMessageReader partReader = new SynchronossPartHttpMessageReader();
<del> partReader.setEnableLoggingRequestDetails(enable);
<del> typedReaders.add(partReader);
<add> HttpMessageReader<Part> partReader = getMultipartReader();
<ide>
<del> MultipartHttpMessageReader reader = new MultipartHttpMessageReader(partReader);
<del> reader.setEnableLoggingRequestDetails(enable);
<del> typedReaders.add(reader);
<add> boolean logRequestDetails = isEnableLoggingRequestDetails();
<add> if (partReader instanceof LoggingCodecSupport) {
<add> ((LoggingCodecSupport) partReader).setEnableLoggingRequestDetails(logRequestDetails);
<ide> }
<add> typedReaders.add(partReader);
<add>
<add> MultipartHttpMessageReader reader = new MultipartHttpMessageReader(partReader);
<add> reader.setEnableLoggingRequestDetails(logRequestDetails);
<add> typedReaders.add(reader);
<add> }
<add>
<add> private HttpMessageReader<Part> getMultipartReader() {
<add> return this.multipartReader != null ? this.multipartReader : new DefaultMultipartMessageReader();
<ide> }
<ide>
<ide> @Override
<ide><path>spring-web/src/test/java/org/springframework/http/codec/support/ServerCodecConfigurerTests.java
<ide> import org.springframework.http.codec.json.Jackson2JsonEncoder;
<ide> import org.springframework.http.codec.json.Jackson2SmileDecoder;
<ide> import org.springframework.http.codec.json.Jackson2SmileEncoder;
<add>import org.springframework.http.codec.multipart.DefaultMultipartMessageReader;
<ide> import org.springframework.http.codec.multipart.MultipartHttpMessageReader;
<del>import org.springframework.http.codec.multipart.SynchronossPartHttpMessageReader;
<ide> import org.springframework.http.codec.protobuf.ProtobufDecoder;
<ide> import org.springframework.http.codec.protobuf.ProtobufHttpMessageWriter;
<ide> import org.springframework.http.codec.xml.Jaxb2XmlDecoder;
<ide> import org.springframework.http.codec.xml.Jaxb2XmlEncoder;
<ide> import org.springframework.util.MimeTypeUtils;
<ide>
<ide> import static org.junit.Assert.*;
<del>import static org.springframework.core.ResolvableType.*;
<add>import static org.springframework.core.ResolvableType.forClass;
<ide>
<ide> /**
<ide> * Unit tests for {@link ServerCodecConfigurer}.
<ide> public void defaultReaders() {
<ide> assertStringDecoder(getNextDecoder(readers), true);
<ide> assertEquals(ProtobufDecoder.class, getNextDecoder(readers).getClass());
<ide> assertEquals(FormHttpMessageReader.class, readers.get(this.index.getAndIncrement()).getClass());
<del> assertEquals(SynchronossPartHttpMessageReader.class, readers.get(this.index.getAndIncrement()).getClass());
<add> assertEquals(DefaultMultipartMessageReader.class, readers.get(this.index.getAndIncrement()).getClass());
<ide> assertEquals(MultipartHttpMessageReader.class, readers.get(this.index.getAndIncrement()).getClass());
<ide> assertEquals(Jackson2JsonDecoder.class, getNextDecoder(readers).getClass());
<ide> assertEquals(Jackson2SmileDecoder.class, getNextDecoder(readers).getClass()); | 3 |
Javascript | Javascript | improve readability on conditional assignment | 345ed5d1c73f38a6032c4c2b869742502da807ea | <ide><path>src/ng/parse.js
<ide> Lexer.prototype = {
<ide> string += String.fromCharCode(parseInt(hex, 16));
<ide> } else {
<ide> var rep = ESCAPE[ch];
<del> if (rep) {
<del> string += rep;
<del> } else {
<del> string += ch;
<del> }
<add> string = string + (rep || ch);
<ide> }
<ide> escape = false;
<ide> } else if (ch === '\\') { | 1 |
Python | Python | fix separate compilation with distutils | dd63849d8c97e7dcf963c587d91711f0dfeeea40 | <ide><path>numpy/core/setup.py
<ide> def generate_umath_c(ext,build_dir):
<ide> multiarray_deps = [
<ide> join('src', 'multiarray', 'arrayobject.h'),
<ide> join('src', 'multiarray', 'arraytypes.h'),
<add> join('src', 'multiarray', 'buffer.h'),
<ide> join('src', 'multiarray', 'calculation.h'),
<ide> join('src', 'multiarray', 'common.h'),
<ide> join('src', 'multiarray', 'convert_datatype.h'),
<ide> def generate_umath_c(ext,build_dir):
<ide> multiarray_src = [join('src', 'multiarray', 'multiarraymodule.c'),
<ide> join('src', 'multiarray', 'hashdescr.c'),
<ide> join('src', 'multiarray', 'arrayobject.c'),
<add> join('src', 'multiarray', 'buffer.c'),
<ide> join('src', 'multiarray', 'numpyos.c'),
<ide> join('src', 'multiarray', 'conversion_utils.c'),
<ide> join('src', 'multiarray', 'flagsobject.c'), | 1 |
Javascript | Javascript | allow suspending and resuming watchers on scope | 41d5c90f170cc054b0f8f88220c22ef1ef6cc0a6 | <ide><path>src/ng/rootScope.js
<ide> function $RootScopeProvider() {
<ide> this.$$watchersCount = 0;
<ide> this.$id = nextUid();
<ide> this.$$ChildScope = null;
<add> this.$$suspended = false;
<ide> }
<ide> ChildScope.prototype = parent;
<ide> return ChildScope;
<ide> function $RootScopeProvider() {
<ide> this.$$childHead = this.$$childTail = null;
<ide> this.$root = this;
<ide> this.$$destroyed = false;
<add> this.$$suspended = false;
<ide> this.$$listeners = {};
<ide> this.$$listenerCount = {};
<ide> this.$$watchersCount = 0;
<ide> function $RootScopeProvider() {
<ide>
<ide> traverseScopesLoop:
<ide> do { // "traverse the scopes" loop
<del> if ((watchers = current.$$watchers)) {
<add> if ((watchers = !current.$$suspended && current.$$watchers)) {
<ide> // process our watches
<ide> watchers.$$digestWatchIndex = watchers.length;
<ide> while (watchers.$$digestWatchIndex--) {
<ide> function $RootScopeProvider() {
<ide> // Insanity Warning: scope depth-first traversal
<ide> // yes, this code is a bit crazy, but it works and we have tests to prove it!
<ide> // this piece should be kept in sync with the traversal in $broadcast
<del> if (!(next = ((current.$$watchersCount && current.$$childHead) ||
<add> // (though it differs due to having the extra check for $$suspended and does not
<add> // check $$listenerCount)
<add> if (!(next = ((!current.$$suspended && current.$$watchersCount && current.$$childHead) ||
<ide> (current !== target && current.$$nextSibling)))) {
<ide> while (current !== target && !(next = current.$$nextSibling)) {
<ide> current = current.$parent;
<ide> function $RootScopeProvider() {
<ide> $browser.$$checkUrlChange();
<ide> },
<ide>
<add> /**
<add> * @ngdoc method
<add> * @name $rootScope.Scope#$suspend
<add> * @kind function
<add> *
<add> * @description
<add> * Suspend watchers of this scope subtree so that they will not be invoked during digest.
<add> *
<add> * This can be used to optimize your application when you know that running those watchers
<add> * is redundant.
<add> *
<add> * **Warning**
<add> *
<add> * Suspending scopes from the digest cycle can have unwanted and difficult to debug results.
<add> * Only use this approach if you are confident that you know what you are doing and have
<add> * ample tests to ensure that bindings get updated as you expect.
<add> *
<add> * Some of the things to consider are:
<add> *
<add> * * Any external event on a directive/component will not trigger a digest while the hosting
<add> * scope is suspended - even if the event handler calls `$apply()` or `$rootScope.$digest()`.
<add> * * Transcluded content exists on a scope that inherits from outside a directive but exists
<add> * as a child of the directive's containing scope. If the containing scope is suspended the
<add> * transcluded scope will also be suspended, even if the scope from which the transcluded
<add> * scope inherits is not suspended.
<add> * * Multiple directives trying to manage the suspended status of a scope can confuse each other:
<add> * * A call to `$suspend()` on an already suspended scope is a no-op.
<add> * * A call to `$resume()` on a non-suspended scope is a no-op.
<add> * * If two directives suspend a scope, then one of them resumes the scope, the scope will no
<add> * longer be suspended. This could result in the other directive believing a scope to be
<add> * suspended when it is not.
<add> * * If a parent scope is suspended then all its descendants will be also excluded from future
<add> * digests whether or not they have been suspended themselves. Note that this also applies to
<add> * isolate child scopes.
<add> * * Calling `$digest()` directly on a descendant of a suspended scope will still run the watchers
<add> * for that scope and its descendants. When digesting we only check whether the current scope is
<add> * locally suspended, rather than checking whether it has a suspended ancestor.
<add> * * Calling `$resume()` on a scope that has a suspended ancestor will not cause the scope to be
<add> * included in future digests until all its ancestors have been resumed.
<add> * * Resolved promises, e.g. from explicit `$q` deferreds and `$http` calls, trigger `$apply()`
<add> * against the `$rootScope` and so will still trigger a global digest even if the promise was
<add> * initiated by a component that lives on a suspended scope.
<add> */
<add> $suspend: function() {
<add> this.$$suspended = true;
<add> },
<add>
<add> /**
<add> * @ngdoc method
<add> * @name $rootScope.Scope#$isSuspended
<add> * @kind function
<add> *
<add> * @description
<add> * Call this method to determine if this scope has been explicitly suspended. It will not
<add> * tell you whether an ancestor has been suspended.
<add> * To determine if this scope will be excluded from a digest triggered at the $rootScope,
<add> * for example, you must check all its ancestors:
<add> *
<add> * ```
<add> * function isExcludedFromDigest(scope) {
<add> * while(scope) {
<add> * if (scope.$isSuspended()) return true;
<add> * scope = scope.$parent;
<add> * }
<add> * return false;
<add> * ```
<add> *
<add> * Be aware that a scope may not be included in digests if it has a suspended ancestor,
<add> * even if `$isSuspended()` returns false.
<add> *
<add> * @returns true if the current scope has been suspended.
<add> */
<add> $isSuspended: function() {
<add> return this.$$suspended;
<add> },
<add>
<add> /**
<add> * @ngdoc method
<add> * @name $rootScope.Scope#$resume
<add> * @kind function
<add> *
<add> * @description
<add> * Resume watchers of this scope subtree in case it was suspended.
<add> *
<add> * See {@link $rootScope.Scope#$suspend} for information about the dangers of using this approach.
<add> */
<add> $resume: function() {
<add> this.$$suspended = false;
<add> },
<ide>
<ide> /**
<ide> * @ngdoc event
<ide> function $RootScopeProvider() {
<ide> // Insanity Warning: scope depth-first traversal
<ide> // yes, this code is a bit crazy, but it works and we have tests to prove it!
<ide> // this piece should be kept in sync with the traversal in $digest
<del> // (though it differs due to having the extra check for $$listenerCount)
<add> // (though it differs due to having the extra check for $$listenerCount and
<add> // does not check $$suspended)
<ide> if (!(next = ((current.$$listenerCount[name] && current.$$childHead) ||
<ide> (current !== target && current.$$nextSibling)))) {
<ide> while (current !== target && !(next = current.$$nextSibling)) {
<ide><path>test/ng/rootScopeSpec.js
<ide> describe('Scope', function() {
<ide> });
<ide> });
<ide>
<add>
<add> describe('$suspend/$resume/$isSuspended', function() {
<add> it('should suspend watchers on scope', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> $rootScope.$watch(watchSpy);
<add> $rootScope.$suspend();
<add> $rootScope.$digest();
<add> expect(watchSpy).not.toHaveBeenCalled();
<add> }));
<add>
<add> it('should resume watchers on scope', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> $rootScope.$watch(watchSpy);
<add> $rootScope.$suspend();
<add> $rootScope.$resume();
<add> $rootScope.$digest();
<add> expect(watchSpy).toHaveBeenCalled();
<add> }));
<add>
<add> it('should suspend watchers on child scope', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> var scope = $rootScope.$new(true);
<add> scope.$watch(watchSpy);
<add> $rootScope.$suspend();
<add> $rootScope.$digest();
<add> expect(watchSpy).not.toHaveBeenCalled();
<add> }));
<add>
<add> it('should resume watchers on child scope', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> var scope = $rootScope.$new(true);
<add> scope.$watch(watchSpy);
<add> $rootScope.$suspend();
<add> $rootScope.$resume();
<add> $rootScope.$digest();
<add> expect(watchSpy).toHaveBeenCalled();
<add> }));
<add>
<add> it('should resume digesting immediately if `$resume` is called from an ancestor scope watch handler', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> var scope = $rootScope.$new();
<add>
<add> // Setup a handler that will toggle the scope suspension
<add> $rootScope.$watch('a', function(a) { if (a) scope.$resume(); else scope.$suspend(); });
<add>
<add> // Spy on the scope watches being called
<add> scope.$watch(watchSpy);
<add>
<add> // Trigger a digest that should suspend the scope from within the watch handler
<add> $rootScope.$apply('a = false');
<add> // The scope is suspended before it gets to do a digest
<add> expect(watchSpy).not.toHaveBeenCalled();
<add>
<add> // Trigger a digest that should resume the scope from within the watch handler
<add> $rootScope.$apply('a = true');
<add> // The watch handler that resumes the scope is in the parent, so the resumed scope will digest immediately
<add> expect(watchSpy).toHaveBeenCalled();
<add> }));
<add>
<add> it('should resume digesting immediately if `$resume` is called from a non-ancestor scope watch handler', inject(function($rootScope) {
<add> var watchSpy = jasmine.createSpy('watchSpy');
<add> var scope = $rootScope.$new();
<add> var sibling = $rootScope.$new();
<add>
<add> // Setup a handler that will toggle the scope suspension
<add> sibling.$watch('a', function(a) { if (a) scope.$resume(); else scope.$suspend(); });
<add>
<add> // Spy on the scope watches being called
<add> scope.$watch(watchSpy);
<add>
<add> // Trigger a digest that should suspend the scope from within the watch handler
<add> $rootScope.$apply('a = false');
<add> // The scope is suspended by the sibling handler after the scope has already digested
<add> expect(watchSpy).toHaveBeenCalled();
<add> watchSpy.calls.reset();
<add>
<add> // Trigger a digest that should resume the scope from within the watch handler
<add> $rootScope.$apply('a = true');
<add> // The watch handler that resumes the scope marks the digest as dirty, so it will run an extra digest
<add> expect(watchSpy).toHaveBeenCalled();
<add> }));
<add>
<add> it('should not suspend watchers on parent or sibling scopes', inject(function($rootScope) {
<add> var watchSpyParent = jasmine.createSpy('watchSpyParent');
<add> var watchSpyChild = jasmine.createSpy('watchSpyChild');
<add> var watchSpySibling = jasmine.createSpy('watchSpySibling');
<add>
<add> var parent = $rootScope.$new();
<add> parent.$watch(watchSpyParent);
<add> var child = parent.$new();
<add> child.$watch(watchSpyChild);
<add> var sibling = parent.$new();
<add> sibling.$watch(watchSpySibling);
<add>
<add> child.$suspend();
<add> $rootScope.$digest();
<add> expect(watchSpyParent).toHaveBeenCalled();
<add> expect(watchSpyChild).not.toHaveBeenCalled();
<add> expect(watchSpySibling).toHaveBeenCalled();
<add> }));
<add>
<add> it('should return true from `$isSuspended()` when a scope is suspended', inject(function($rootScope) {
<add> $rootScope.$suspend();
<add> expect($rootScope.$isSuspended()).toBe(true);
<add> $rootScope.$resume();
<add> expect($rootScope.$isSuspended()).toBe(false);
<add> }));
<add>
<add> it('should return false from `$isSuspended()` for a non-suspended scope that has a suspended ancestor', inject(function($rootScope) {
<add> var childScope = $rootScope.$new();
<add> $rootScope.$suspend();
<add> expect(childScope.$isSuspended()).toBe(false);
<add> childScope.$suspend();
<add> expect(childScope.$isSuspended()).toBe(true);
<add> childScope.$resume();
<add> expect(childScope.$isSuspended()).toBe(false);
<add> $rootScope.$resume();
<add> expect(childScope.$isSuspended()).toBe(false);
<add> }));
<add> });
<add>
<add>
<ide> describe('optimizations', function() {
<ide>
<ide> function setupWatches(scope, log) { | 2 |
Java | Java | avoid duplicate flush when closing outputstream | eec22f5072011259bee02cd23cd6d0692f1cb187 | <ide><path>spring-web/src/main/java/org/springframework/remoting/httpinvoker/HttpInvokerServiceExporter.java
<ide> /*
<del> * Copyright 2002-2012 the original author or authors.
<add> * Copyright 2002-2016 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> package org.springframework.remoting.httpinvoker;
<ide>
<add>import java.io.FilterOutputStream;
<ide> import java.io.IOException;
<ide> import java.io.InputStream;
<ide> import java.io.ObjectInputStream;
<ide> protected void writeRemoteInvocationResult(
<ide> HttpServletRequest request, HttpServletResponse response, RemoteInvocationResult result, OutputStream os)
<ide> throws IOException {
<ide>
<del> ObjectOutputStream oos = createObjectOutputStream(decorateOutputStream(request, response, os));
<add> ObjectOutputStream oos =
<add> createObjectOutputStream(new FlushGuardedOutputStream(decorateOutputStream(request, response, os)));
<ide> try {
<ide> doWriteRemoteInvocationResult(result, oos);
<ide> }
<ide> protected OutputStream decorateOutputStream(
<ide> return os;
<ide> }
<ide>
<add> /**
<add> * Decorate an OutputStream to guard against {@code flush()} calls, which
<add> * are turned into no-ops.
<add> * <p>Because {@link ObjectOutputStream#close()} will in fact flush/drain
<add> * the underlying stream twice, this {@link FilterOutputStream} will
<add> * guard against individual flush calls. Multiple flush calls can lead
<add> * to performance issues, since writes aren't gathered as they should be.
<add> *
<add> * @see <a href="https://jira.spring.io/browse/SPR-14040">SPR-14040</a>
<add> */
<add> class FlushGuardedOutputStream extends FilterOutputStream {
<add> public FlushGuardedOutputStream(OutputStream out) {
<add> super(out);
<add> }
<add>
<add> @Override
<add> public void flush() throws IOException {
<add> // Do nothing
<add> }
<add> }
<add>
<ide> } | 1 |
Javascript | Javascript | remove self written clone non recursive | 06d8dc87199a2ac6c74e2ea6ed7703cf2fcdcb1b | <ide><path>examples/js/loaders/GLTF2Loader.js
<ide> THREE.GLTF2Loader = ( function () {
<ide> }
<ide>
<ide> //do not clone children as they will be replaced anyway
<del> var children = group.children
<del> group.children=[];
<del> var clonedgroup = group.clone();
<del> group.children=children;
<add> var clonedgroup = group.clone( false );
<ide> for ( var childrenId in group.children ) {
<ide>
<ide> var child = group.children[ childrenId ]; | 1 |
Ruby | Ruby | upgrade virtualenv to 16.7.4 | 22a7b4bc1e51e3fd32e9916bed25fcd45f178ff8 | <ide><path>Library/Homebrew/language/python_virtualenv_constants.rb
<ide> # frozen_string_literal: true
<ide>
<ide> PYTHON_VIRTUALENV_URL =
<del> "https://files.pythonhosted.org/packages/a9/8a" \
<del> "/580c7176f01540615c2eb3f3ab5462613b4beac4aa63410be89ecc7b7472" \
<del> "/virtualenv-16.7.2.tar.gz"
<add> "https://files.pythonhosted.org/packages/11/74" \
<add> "/2c151a13ef41ab9fb43b3c4ff9e788e0496ed7923b2078d42cab30622bdf" \
<add> "/virtualenv-16.7.4.tar.gz"
<ide> PYTHON_VIRTUALENV_SHA256 =
<del> "909fe0d3f7c9151b2df0a2cb53e55bdb7b0d61469353ff7a49fd47b0f0ab9285"
<add> "94a6898293d07f84a98add34c4df900f8ec64a570292279f6d91c781d37fd305" | 1 |
Javascript | Javascript | fix typo in cff_parser_spec.js | 23236f1b0b9dc9883f78a44a38ce2a9964b4513d | <ide><path>test/unit/cff_parser_spec.js
<ide> describe("CFFParser", function () {
<ide> privateDict: privateDictStub,
<ide> }).charStrings;
<ide> expect(charStrings.count).toEqual(1);
<del> // shoudn't be sanitized
<add> // shouldn't be sanitized
<ide> expect(charStrings.get(0).length).toEqual(38);
<ide> });
<ide> | 1 |
Python | Python | remove print statements from test | 3bc53905ccfd7f0c1c444766fb37070899582a20 | <ide><path>spacy/tests/doc/test_retokenize_merge.py
<ide> def test_doc_retokenize_spans_entity_merge_iob():
<ide> retokenizer.merge(doc[2:4])
<ide> retokenizer.merge(doc[4:6])
<ide> retokenizer.merge(doc[7:9])
<del> for token in doc:
<del> print(token)
<del> print(token.ent_iob)
<ide> assert len(doc) == 6
<ide> assert doc[3].ent_iob_ == "B"
<ide> assert doc[4].ent_iob_ == "I" | 1 |
Text | Text | change http to https in links | 1b0851c9b75f0d0a15427898ae49a2f67d076f81 | <ide><path>INTHEWILD.md
<ide> Currently, **officially** using Airflow:
<ide> 1. [Adobe](https://www.adobe.com/) [[@mishikaSingh](https://github.com/mishikaSingh), [@ramandumcs](https://github.com/ramandumcs), [@vardancse](https://github.com/vardancse)]
<ide> 1. [Agari](https://github.com/agaridata) [[@r39132](https://github.com/r39132)]
<ide> 1. [Agoda](https://agoda.com) [[@akki](https://github.com/akki)]
<del>1. [Airbnb](http://airbnb.io/) [[@mistercrunch](https://github.com/mistercrunch), [@artwr](https://github.com/artwr)]
<add>1. [Airbnb](https://airbnb.io/) [[@mistercrunch](https://github.com/mistercrunch), [@artwr](https://github.com/artwr)]
<ide> 1. [AirDNA](https://www.airdna.co)
<ide> 1. [Airfinity](https://www.airfinity.com) [[@sibowyer](https://github.com/sibowyer)]
<ide> 1. [Airtel](https://www.airtel.in/) [[@harishbisht](https://github.com/harishbisht)]
<ide> Currently, **officially** using Airflow:
<ide> 1. [Arrive](https://www.arrive.com/)
<ide> 1. [Artelys](https://www.artelys.com/) [[@fortierq](https://github.com/fortierq)]
<ide> 1. [Asana](https://asana.com/) [[@chang](https://github.com/chang), [@dima-asana](https://github.com/dima-asana), [@jdavidheiser](https://github.com/jdavidheiser), [@ricardoandresrojas](https://github.com/ricardoandresrojas)]
<del>1. [Astronomer](http://www.astronomer.io) [[@schnie](https://github.com/schnie), [@ashb](https://github.com/ashb), [@kaxil](https://github.com/kaxil), [@dimberman](https://github.com/dimberman), [@andriisoldatenko](https://github.com/andriisoldatenko), [@ryw](https://github.com/ryw), [@ryanahamilton](https://github.com/ryanahamilton), [@jhtimmins](https://github.com/jhtimmins), [@vikramkoka](https://github.com/vikramkoka)]
<add>1. [Astronomer](https://www.astronomer.io) [[@schnie](https://github.com/schnie), [@ashb](https://github.com/ashb), [@kaxil](https://github.com/kaxil), [@dimberman](https://github.com/dimberman), [@andriisoldatenko](https://github.com/andriisoldatenko), [@ryw](https://github.com/ryw), [@ryanahamilton](https://github.com/ryanahamilton), [@jhtimmins](https://github.com/jhtimmins), [@vikramkoka](https://github.com/vikramkoka)]
<ide> 1. [Auth0](https://auth0.com) [[@scottypate](https://github.com/scottypate)], [[@dm03514](https://github.com/dm03514)], [[@karangale](https://github.com/karangale)]
<ide> 1. [Automattic](https://automattic.com/) [[@anandnalya](https://github.com/anandnalya), [@bperson](https://github.com/bperson), [@khrol](https://github.com/Khrol), [@xyu](https://github.com/xyu)]
<ide> 1. [Avesta Technologies](https://avestatechnologies.com) [[@TheRum](https://github.com/TheRum)]
<ide> Currently, **officially** using Airflow:
<ide> 1. [Beamly](https://www.beamly.com/) [[@christopheralcock](https://github.com/christopheralcock)]
<ide> 1. [Beeswax](https://beeswax.com/)
<ide> 1. [Bellhops](https://github.com/bellhops)
<del>1. [BelugaDB](https://belugadb.com) [[@fabio-nukui](https://github.com/fabio-nukui) & [@joao-sallaberry](http://github.com/joao-sallaberry) & [@lucianoviola](https://github.com/lucianoviola) & [@tmatuki](https://github.com/tmatuki)]
<add>1. [BelugaDB](https://belugadb.com) [[@fabio-nukui](https://github.com/fabio-nukui) & [@joao-sallaberry](https://github.com/joao-sallaberry) & [@lucianoviola](https://github.com/lucianoviola) & [@tmatuki](https://github.com/tmatuki)]
<ide> 1. [Betterment](https://www.betterment.com/) [[@betterment](https://github.com/Betterment)]
<ide> 1. [Bexs Bank](https://www.bexs.com.br/en) [[@felipefb](https://github.com/felipefb) & [@ilarsen](https://github.com/ishvann)]
<ide> 1. [BigQuant](https://bigquant.com/) [[@bigquant](https://github.com/bigquant)]
<ide> Currently, **officially** using Airflow:
<ide> 1. [Capital One](https://www.capitalone.com) [[@anoopengineer](https://github.com/anoopengineer)]
<ide> 1. [Carbonite](https://www.carbonite.com) [[@ajbosco](https://github.com/ajbosco)]
<ide> 1. [CarLabs](https://www.carlabs.ai/) [[@sganz](https://github.com/sganz) & [@odannyc](https://github.com/odannyc)]
<del>1. [CAVA](https://www.cava.com) [[@minh5](http://github.com/minh5) & [@patchus](http://github.com/patchus)]
<add>1. [CAVA](https://www.cava.com) [[@minh5](https://github.com/minh5) & [@patchus](https://github.com/patchus)]
<ide> 1. [Celect](http://www.celect.com) [[@superdosh](https://github.com/superdosh) & [@chadcelect](https://github.com/chadcelect)]
<ide> 1. [Censys](https://censys.io) [[@zakird](https://github.com/zakird), [@dadrian](https://github.com/dadrian), & [@andrewsardone](https://github.com/andrewsardone)]
<ide> 1. [Change.org](https://www.change.org) [[@change](https://github.com/change), [@vijaykramesh](https://github.com/vijaykramesh)]
<ide> Currently, **officially** using Airflow:
<ide> 1. [GameWisp](https://gamewisp.com) [[@tjbiii](https://github.com/TJBIII) & [@theryanwalls](https://github.com/theryanwalls)]
<ide> 1. [Geekie](https://www.geekie.com.br) [[@wolney](https://github.com/wolney)]
<ide> 1. [GeneCards](https://www.genecards.org) [[@oferze](https://github.com/oferze)]
<del>1. [Gentner Lab](http://github.com/gentnerlab) [[@neuromusic](https://github.com/neuromusic)]
<add>1. [Gentner Lab](https://github.com/gentnerlab) [[@neuromusic](https://github.com/neuromusic)]
<ide> 1. [Get Simpl](https://getsimpl.com/) [[@rootcss](https://github.com/rootcss)]
<ide> 1. [Getir](https://www.getir.com/) [[@mpolatcan](https://github.com/mpolatcan)]
<ide> 1. [GitLab](https://about.gitlab.com/) [[@tayloramurphy](https://gitlab.com/tayloramurphy) & [@m_walker](https://gitlab.com/m_walker)]
<ide> Currently, **officially** using Airflow:
<ide> 1. [Pronto Tools](http://www.prontotools.io/) [[@zkan](https://github.com/zkan) & [@mesodiar](https://github.com/mesodiar)]
<ide> 1. [proton.ai](https://proton.ai/) [[@prmsolutions](https://github.com/prmsolutions)]
<ide> 1. [PubNub](https://pubnub.com) [[@jzucker2](https://github.com/jzucker2)]
<del>1. [PXYData](https://www.pxydata.com) [[@patchus](http://github.com/patchus)]
<del>1. [Qliro](https://www.qliro.com) [[@kvackkvackanka](http://github.com/kvackkvackanka)]
<add>1. [PXYData](https://www.pxydata.com) [[@patchus](https://github.com/patchus)]
<add>1. [Qliro](https://www.qliro.com) [[@kvackkvackanka](https://github.com/kvackkvackanka)]
<ide> 1. [Qoala](https://www.qoala.id) [[@gnomeria](https://github.com/gnomeria), [@qoala-engineering](https://github.com/qoala-engineering)]
<ide> 1. [Qplum](https://qplum.co) [[@manti](https://github.com/manti)]
<del>1. [Quantopian](https://www.quantopian.com/) [[@eronarn](http://github.com/eronarn)]
<add>1. [Quantopian](https://www.quantopian.com/) [[@eronarn](https://github.com/eronarn)]
<ide> 1. [Qubole](https://qubole.com) [[@msumit](https://github.com/msumit)]
<ide> 1. [QuintoAndar](https://quintoandar.com.br) [[@quintoandar](https://github.com/quintoandar)]
<ide> 1. [Quizlet](https://quizlet.com) [[@quizlet](https://github.com/quizlet)]
<ide><path>README.md
<ide> For information on installing backport providers check [backport-providers.rst](
<ide>
<ide> ## Official source code
<ide>
<del>Apache Airflow is an [Apache Software Foundation](http://www.apache.org) (ASF) project,
<add>Apache Airflow is an [Apache Software Foundation](https://www.apache.org) (ASF) project,
<ide> and our official source code releases:
<ide>
<del>- Follow the [ASF Release Policy](http://www.apache.org/legal/release-policy.html)
<add>- Follow the [ASF Release Policy](https://www.apache.org/legal/release-policy.html)
<ide> - Can be downloaded from [the ASF Distribution Directory](https://downloads.apache.org/airflow)
<ide> - Are cryptographically signed by the release manager
<ide> - Are officially voted on by the PMC members during the
<del> [Release Approval Process](http://www.apache.org/legal/release-policy.html#release-approval)
<add> [Release Approval Process](https://www.apache.org/legal/release-policy.html#release-approval)
<ide>
<ide> Following the ASF rules, the source packages released must be sufficient for a user to build and test the
<ide> release provided they have access to the appropriate platform and tools. | 2 |
Javascript | Javascript | pass runtimetemplate to dependencytemplate | 758a388f691bba849b199964640cbab1cab8f91d | <ide><path>lib/Compilation.js
<ide> const MainTemplate = require("./MainTemplate");
<ide> const ChunkTemplate = require("./ChunkTemplate");
<ide> const HotUpdateChunkTemplate = require("./HotUpdateChunkTemplate");
<ide> const ModuleTemplate = require("./ModuleTemplate");
<add>const RuntimeTemplate = require("./RuntimeTemplate");
<ide> const Dependency = require("./Dependency");
<ide> const ChunkRenderError = require("./ChunkRenderError");
<ide> const AsyncDependencyToInitialChunkWarning = require("./AsyncDependencyToInitialChunkWarning");
<ide> class Compilation extends Tapable {
<ide> this.mainTemplate = new MainTemplate(this.outputOptions);
<ide> this.chunkTemplate = new ChunkTemplate(this.outputOptions);
<ide> this.hotUpdateChunkTemplate = new HotUpdateChunkTemplate(this.outputOptions);
<add> this.runtimeTemplate = new RuntimeTemplate(this.outputOptions, this.requestShortener);
<ide> this.moduleTemplates = {
<del> javascript: new ModuleTemplate(this.outputOptions, this.requestShortener),
<del> webassembly: new ModuleTemplate(this.outputOptions, this.requestShortener)
<add> javascript: new ModuleTemplate(this.runtimeTemplate),
<add> webassembly: new ModuleTemplate(this.runtimeTemplate)
<ide> };
<ide>
<ide> this.semaphore = new Semaphore(options.parallelism || 100);
<ide><path>lib/ContextModule.js
<ide> webpackAsyncContext.id = ${JSON.stringify(id)};
<ide> module.exports = webpackAsyncContext;`;
<ide> }
<ide>
<del> getLazyOnceSource(block, dependencies, id, outputOptions, requestShortener) {
<del> const promise = DepBlockHelpers.getDepBlockPromise(block, outputOptions, requestShortener, "lazy-once context");
<add> getLazyOnceSource(block, dependencies, id, runtimeTemplate) {
<add> const promise = DepBlockHelpers.getDepBlockPromise(block, runtimeTemplate, "lazy-once context");
<ide> const map = this.getUserRequestMap(dependencies);
<ide> const fakeMap = this.getFakeMap(dependencies);
<ide> const thenFunction = fakeMap ?
<ide> module.exports = webpackEmptyAsyncContext;
<ide> webpackEmptyAsyncContext.id = ${JSON.stringify(id)};`;
<ide> }
<ide>
<del> getSourceString(asyncMode, outputOptions, requestShortener) {
<add> getSourceString(asyncMode, runtimeTemplate) {
<ide> if(asyncMode === "lazy") {
<ide> if(this.blocks && this.blocks.length > 0) {
<ide> return this.getLazySource(this.blocks, this.id);
<ide> webpackEmptyAsyncContext.id = ${JSON.stringify(id)};`;
<ide> if(asyncMode === "lazy-once") {
<ide> const block = this.blocks[0];
<ide> if(block) {
<del> return this.getLazyOnceSource(block, block.dependencies, this.id, outputOptions, requestShortener);
<add> return this.getLazyOnceSource(block, block.dependencies, this.id, runtimeTemplate);
<ide> }
<ide> return this.getSourceForEmptyAsyncContext(this.id);
<ide> }
<ide> webpackEmptyAsyncContext.id = ${JSON.stringify(id)};`;
<ide> return new RawSource(sourceString);
<ide> }
<ide>
<del> source(dependencyTemplates, outputOptions, requestShortener) {
<add> source(dependencyTemplates, runtimeTemplate) {
<ide> return this.getSource(
<del> this.getSourceString(this.options.mode, outputOptions, requestShortener)
<add> this.getSourceString(this.options.mode, runtimeTemplate)
<ide> );
<ide> }
<ide>
<ide><path>lib/DependenciesBlockVariable.js
<ide> class DependenciesBlockVariable {
<ide> });
<ide> }
<ide>
<del> expressionSource(dependencyTemplates, outputOptions, requestShortener) {
<add> expressionSource(dependencyTemplates, runtimeTemplate) {
<ide> const source = new ReplaceSource(new RawSource(this.expression));
<ide> this.dependencies.forEach(dep => {
<ide> const template = dependencyTemplates.get(dep.constructor);
<ide> if(!template) throw new Error(`No template for dependency: ${dep.constructor.name}`);
<del> template.apply(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> template.apply(dep, source, runtimeTemplate, dependencyTemplates);
<ide> });
<ide> return source;
<ide> }
<ide><path>lib/EvalDevToolModuleTemplatePlugin.js
<ide> class EvalDevToolModuleTemplatePlugin {
<ide> const str = ModuleFilenameHelpers.createFilename(module, {
<ide> moduleFilenameTemplate: this.moduleFilenameTemplate,
<ide> namespace: this.namespace
<del> }, moduleTemplate.requestShortener);
<add> }, moduleTemplate.runtimeTemplate.requestShortener);
<ide> const footer = ["\n",
<del> ModuleFilenameHelpers.createFooter(module, moduleTemplate.requestShortener),
<add> ModuleFilenameHelpers.createFooter(module, moduleTemplate.runtimeTemplate.requestShortener),
<ide> this.sourceUrlComment.replace(/\[url\]/g, encodeURI(str).replace(/%2F/g, "/").replace(/%20/g, "_").replace(/%5E/g, "^").replace(/%5C/g, "\\").replace(/^\//, ""))
<ide> ].join("\n");
<ide> const result = new RawSource(`eval(${JSON.stringify(content + footer)});`);
<ide><path>lib/EvalSourceMapDevToolModuleTemplatePlugin.js
<ide> class EvalSourceMapDevToolModuleTemplatePlugin {
<ide> return ModuleFilenameHelpers.createFilename(module, {
<ide> moduleFilenameTemplate: self.moduleFilenameTemplate,
<ide> namespace: self.namespace
<del> }, moduleTemplate.requestShortener);
<add> }, moduleTemplate.runtimeTemplate.requestShortener);
<ide> });
<ide> moduleFilenames = ModuleFilenameHelpers.replaceDuplicates(moduleFilenames, (filename, i, n) => {
<ide> for(let j = 0; j < n; j++)
<ide> class EvalSourceMapDevToolModuleTemplatePlugin {
<ide> sourceMap.sources = moduleFilenames;
<ide> if(sourceMap.sourcesContent) {
<ide> sourceMap.sourcesContent = sourceMap.sourcesContent.map((content, i) => {
<del> return typeof content === "string" ? `${content}\n\n\n${ModuleFilenameHelpers.createFooter(modules[i], moduleTemplate.requestShortener)}` : null;
<add> return typeof content === "string" ? `${content}\n\n\n${ModuleFilenameHelpers.createFooter(modules[i], moduleTemplate.runtimeTemplate.requestShortener)}` : null;
<ide> });
<ide> }
<ide> sourceMap.sourceRoot = options.sourceRoot || "";
<ide><path>lib/FunctionModuleTemplatePlugin.js
<ide> class FunctionModuleTemplatePlugin {
<ide> });
<ide>
<ide> moduleTemplate.plugin("package", (moduleSource, module) => {
<del> if(moduleTemplate.outputOptions.pathinfo) {
<add> if(moduleTemplate.runtimeTemplate.outputOptions.pathinfo) {
<ide> const source = new ConcatSource();
<del> const req = module.readableIdentifier(moduleTemplate.requestShortener);
<add> const req = module.readableIdentifier(moduleTemplate.runtimeTemplate.requestShortener);
<ide> source.add("/*!****" + req.replace(/./g, "*") + "****!*\\\n");
<ide> source.add(" !*** " + req.replace(/\*\//g, "*_/") + " ***!\n");
<ide> source.add(" \\****" + req.replace(/./g, "*") + "****/\n");
<ide> class FunctionModuleTemplatePlugin {
<ide> source.add(Template.toComment("all exports used") + "\n");
<ide> if(module.optimizationBailout) {
<ide> module.optimizationBailout.forEach(text => {
<del> if(typeof text === "function") text = text(moduleTemplate.requestShortener);
<add> if(typeof text === "function") text = text(moduleTemplate.runtimeTemplate.requestShortener);
<ide> source.add(Template.toComment(`${text}`) + "\n");
<ide> });
<ide> }
<ide><path>lib/ModuleTemplate.js
<ide> const SyncWaterfallHook = require("tapable").SyncWaterfallHook;
<ide> const SyncHook = require("tapable").SyncHook;
<ide>
<ide> module.exports = class ModuleTemplate extends Tapable {
<del> constructor(outputOptions, requestShortener) {
<add> constructor(runtimeTemplate) {
<ide> super();
<del> this.outputOptions = outputOptions || {};
<del> this.requestShortener = requestShortener;
<add> this.runtimeTemplate = runtimeTemplate;
<ide> this.hooks = {
<ide> content: new SyncWaterfallHook(["source", "module", "options", "dependencyTemplates"]),
<ide> module: new SyncWaterfallHook(["source", "module", "options", "dependencyTemplates"]),
<ide> module.exports = class ModuleTemplate extends Tapable {
<ide> }
<ide>
<ide> render(module, dependencyTemplates, options) {
<del> const moduleSource = module.source(dependencyTemplates, this.outputOptions, this.requestShortener);
<add> const moduleSource = module.source(dependencyTemplates, this.runtimeTemplate);
<ide> const moduleSourcePostContent = this.hooks.content.call(moduleSource, module, options, dependencyTemplates);
<ide> const moduleSourcePostModule = this.hooks.module.call(moduleSourcePostContent, module, options, dependencyTemplates);
<ide> const moduleSourcePostRender = this.hooks.render.call(moduleSourcePostModule, module, options, dependencyTemplates);
<ide><path>lib/MultiModule.js
<ide> class MultiModule extends Module {
<ide> super.updateHash(hash);
<ide> }
<ide>
<del> source(dependencyTemplates, outputOptions) {
<add> source(dependencyTemplates, runtimeTemplate) {
<ide> const str = [];
<ide> this.dependencies.forEach((dep, idx) => {
<ide> if(dep.module) {
<ide> if(idx === this.dependencies.length - 1)
<ide> str.push("module.exports = ");
<ide> str.push("__webpack_require__(");
<del> if(outputOptions.pathinfo)
<add> if(runtimeTemplate.outputOptions.pathinfo)
<ide> str.push(Template.toComment(dep.request));
<ide> str.push(`${JSON.stringify(dep.module.id)}`);
<ide> str.push(")");
<ide><path>lib/NormalModule.js
<ide> class NormalModule extends Module {
<ide> return `${this.hash}-${dtHash}`;
<ide> }
<ide>
<del> sourceDependency(dependency, dependencyTemplates, source, outputOptions, requestShortener) {
<add> sourceDependency(dependency, dependencyTemplates, source, runtimeTemplate) {
<ide> const template = dependencyTemplates.get(dependency.constructor);
<ide> if(!template) throw new Error("No template for dependency: " + dependency.constructor.name);
<del> template.apply(dependency, source, outputOptions, requestShortener, dependencyTemplates);
<add> template.apply(dependency, source, runtimeTemplate, dependencyTemplates);
<ide> }
<ide>
<del> sourceVariables(variable, availableVars, dependencyTemplates, outputOptions, requestShortener) {
<add> sourceVariables(variable, availableVars, dependencyTemplates, runtimeTemplate) {
<ide> const name = variable.name;
<del> const expr = variable.expressionSource(dependencyTemplates, outputOptions, requestShortener);
<add> const expr = variable.expressionSource(dependencyTemplates, runtimeTemplate);
<ide>
<ide> if(availableVars.some(v => v.name === name && v.expression.source() === expr.source())) {
<ide> return;
<ide> class NormalModule extends Module {
<ide> }, startState);
<ide> }
<ide>
<del> sourceBlock(block, availableVars, dependencyTemplates, source, outputOptions, requestShortener) {
<add> sourceBlock(block, availableVars, dependencyTemplates, source, runtimeTemplate) {
<ide> block.dependencies.forEach((dependency) => this.sourceDependency(
<del> dependency, dependencyTemplates, source, outputOptions, requestShortener));
<add> dependency, dependencyTemplates, source, runtimeTemplate));
<ide>
<ide> /**
<ide> * Get the variables of all blocks that we need to inject.
<ide> class NormalModule extends Module {
<ide> */
<ide> const vars = block.variables.reduce((result, value) => {
<ide> const variable = this.sourceVariables(
<del> value, availableVars, dependencyTemplates, outputOptions, requestShortener);
<add> value, availableVars, dependencyTemplates, runtimeTemplate);
<ide>
<ide> if(variable) {
<ide> result.push(variable);
<ide> class NormalModule extends Module {
<ide> availableVars.concat(vars),
<ide> dependencyTemplates,
<ide> source,
<del> outputOptions,
<del> requestShortener
<add> runtimeTemplate
<ide> )
<ide> );
<ide> }
<ide>
<del> source(dependencyTemplates, outputOptions, requestShortener) {
<add> source(dependencyTemplates, runtimeTemplate) {
<ide> if(this.type.startsWith("javascript")) {
<ide> const hashDigest = this.getHashDigest(dependencyTemplates);
<ide> if(this._cachedSourceHash === hashDigest) {
<ide> class NormalModule extends Module {
<ide>
<ide> const source = new ReplaceSource(this._source);
<ide>
<del> this.sourceBlock(this, [], dependencyTemplates, source, outputOptions, requestShortener);
<add> this.sourceBlock(this, [], dependencyTemplates, source, runtimeTemplate);
<ide>
<ide> const cachedSource = new CachedSource(source);
<ide> this._cachedSource = cachedSource;
<ide><path>lib/RuntimeTemplate.js
<add>/*
<add> MIT License http://www.opensource.org/licenses/mit-license.php
<add> Author Tobias Koppers @sokra
<add>*/
<add>"use strict";
<add>
<add>module.exports = class RuntimeTemplate {
<add> constructor(outputOptions, requestShortener) {
<add> this.outputOptions = outputOptions || {};
<add> this.requestShortener = requestShortener;
<add> }
<add>};
<ide><path>lib/dependencies/AMDRequireArrayDependency.js
<ide> class AMDRequireArrayDependency extends Dependency {
<ide> }
<ide>
<ide> AMDRequireArrayDependency.Template = class AMDRequireArrayDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const content = this.getContent(dep, outputOptions, requestShortener);
<add> apply(dep, source, runtime) {
<add> const content = this.getContent(dep, runtime);
<ide> source.replace(dep.range[0], dep.range[1] - 1, content);
<ide> }
<ide>
<del> getContent(dep, outputOptions, requestShortener) {
<add> getContent(dep, runtime) {
<ide> const requires = dep.depsArray.map((dependency) => {
<del> const optionalComment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dependency.request)) : "";
<add> const optionalComment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dependency.request)) : "";
<ide> return this.contentForDependency(dependency, optionalComment);
<ide> });
<ide> return `[${requires.join(", ")}]`;
<ide><path>lib/dependencies/AMDRequireDependency.js
<ide> class AMDRequireDependency extends NullDependency {
<ide> }
<ide>
<ide> AMDRequireDependency.Template = class AMDRequireDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> const depBlock = dep.block;
<del> const wrapper = DepBlockHelpers.getLoadDepBlockWrapper(depBlock, outputOptions, requestShortener, "require");
<add> const wrapper = DepBlockHelpers.getLoadDepBlockWrapper(depBlock, runtime, "require");
<ide>
<ide> // has array range but no function range
<ide> if(depBlock.arrayRange && !depBlock.functionRange) {
<ide><path>lib/dependencies/ContextDependencyTemplateAsId.js
<ide> const Template = require("../Template");
<ide>
<ide> class ContextDependencyTemplateAsId {
<ide>
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dep.request)) + " " : "";
<add> apply(dep, source, runtime) {
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dep.request)) + " " : "";
<ide>
<ide> if(dep.module && dep.module.dependencies && dep.module.dependencies.length > 0) {
<ide> if(dep.valueRange) {
<ide><path>lib/dependencies/ContextDependencyTemplateAsRequireCall.js
<ide> const Template = require("../Template");
<ide>
<ide> class ContextDependencyTemplateAsRequireCall {
<ide>
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dep.options.request)) + " " : "";
<add> apply(dep, source, runtime) {
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dep.options.request)) + " " : "";
<ide>
<ide> const containsDeps = dep.module && dep.module.dependencies && dep.module.dependencies.length > 0;
<ide> const isAsync = dep.options.mode !== "sync" && dep.options.mode !== "weak";
<ide><path>lib/dependencies/DepBlockHelpers.js
<ide> const Template = require("../Template");
<ide>
<ide> const DepBlockHelpers = exports;
<ide>
<del>DepBlockHelpers.getLoadDepBlockWrapper = (depBlock, outputOptions, requestShortener, name) => {
<del> const promiseCode = DepBlockHelpers.getDepBlockPromise(depBlock, outputOptions, requestShortener, name);
<add>DepBlockHelpers.getLoadDepBlockWrapper = (depBlock, runtimeTemplate, name) => {
<add> const promiseCode = DepBlockHelpers.getDepBlockPromise(depBlock, runtimeTemplate, name);
<ide> return [
<ide> promiseCode + ".then(",
<ide> ").catch(",
<ide> ")"
<ide> ];
<ide> };
<ide>
<del>DepBlockHelpers.getDepBlockPromise = (depBlock, outputOptions, requestShortener, name) => {
<add>DepBlockHelpers.getDepBlockPromise = (depBlock, runtimeTemplate, name) => {
<ide> if(depBlock.chunks) {
<ide> const chunks = depBlock.chunks.filter(chunk => !chunk.hasRuntime() && chunk.id !== null);
<del> const pathChunkCheck = outputOptions.pathinfo && depBlock.chunkName;
<del> const shortChunkName = requestShortener.shorten(depBlock.chunkName);
<add> const pathChunkCheck = runtimeTemplate.outputOptions.pathinfo && depBlock.chunkName;
<add> const shortChunkName = runtimeTemplate.requestShortener.shorten(depBlock.chunkName);
<ide> const chunkReason = Template.toNormalComment(depBlock.chunkReason);
<ide> const requireChunkId = chunk => "__webpack_require__.e(" + JSON.stringify(chunk.id) + ")";
<ide> name = Template.toNormalComment(name);
<ide><path>lib/dependencies/HarmonyAcceptDependency.js
<ide> class HarmonyAcceptDependency extends NullDependency {
<ide> }
<ide>
<ide> HarmonyAcceptDependency.Template = class HarmonyAcceptDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> const content = dep.dependencies
<ide> .filter(dependency => HarmonyImportDependency.Template.isImportEmitted(dependency, source))
<del> .map(dependency => dependency.getImportStatement(false, outputOptions, requestShortener))
<add> .map(dependency => dependency.getImportStatement(false, runtime))
<ide> .join("");
<ide>
<ide> if(dep.hasCallback) {
<ide><path>lib/dependencies/HarmonyAcceptImportDependency.js
<ide> class HarmonyAcceptImportDependency extends HarmonyImportDependency {
<ide> }
<ide>
<ide> HarmonyAcceptImportDependency.Template = class HarmonyAcceptImportDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {}
<add> apply(dep, source, runtime) {}
<ide> };
<ide>
<ide> module.exports = HarmonyAcceptImportDependency;
<ide><path>lib/dependencies/HarmonyExportImportedSpecifierDependency.js
<ide> class HarmonyExportImportedSpecifierDependency extends HarmonyImportDependency {
<ide> module.exports = HarmonyExportImportedSpecifierDependency;
<ide>
<ide> HarmonyExportImportedSpecifierDependency.Template = class HarmonyExportImportedSpecifierDependencyTemplate extends HarmonyImportDependency.Template {
<del> harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<del> super.harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates);
<del> const importVar = dep.getImportVar(requestShortener);
<add> harmonyInit(dep, source, runtime, dependencyTemplates) {
<add> super.harmonyInit(dep, source, runtime, dependencyTemplates);
<add> const importVar = dep.getImportVar();
<ide> const content = this.getContent(dep, importVar);
<ide> source.insert(-1, content);
<ide> }
<ide><path>lib/dependencies/HarmonyExportSpecifierDependency.js
<ide> HarmonyExportSpecifierDependency.Template = class HarmonyExportSpecifierDependen
<ide> return 0;
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener) {
<add> harmonyInit(dep, source, runtime) {
<ide> const content = this.getContent(dep);
<ide> source.insert(-1, content);
<ide> }
<ide><path>lib/dependencies/HarmonyImportDependency.js
<ide> class HarmonyImportDependency extends ModuleDependency {
<ide> return importVar;
<ide> }
<ide>
<del> getImportStatement(declare, outputOptions, requestShortener) {
<add> getImportStatement(declare, runtime) {
<ide> const module = this.module;
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(this.request)) : "";
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(this.request)) : "";
<ide> const optDeclaration = declare ? "var " : "";
<ide> const optNewline = declare ? "\n" : " ";
<ide>
<ide> HarmonyImportDependency.Template = class HarmonyImportDependencyTemplate {
<ide> return key && sourceInfo.emittedImports.get(key);
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener) {
<add> harmonyInit(dep, source, runtime) {
<ide> let sourceInfo = importEmittedMap.get(source);
<ide> if(!sourceInfo) {
<ide> importEmittedMap.set(source, sourceInfo = {
<ide> HarmonyImportDependency.Template = class HarmonyImportDependencyTemplate {
<ide> const key = dep.module || dep.request;
<ide> if(key && sourceInfo.emittedImports.get(key)) return;
<ide> sourceInfo.emittedImports.set(key, true);
<del> const content = dep.getImportStatement(true, outputOptions, requestShortener);
<add> const content = dep.getImportStatement(true, runtime);
<ide> source.insert(-1, content);
<ide> }
<ide> };
<ide><path>lib/dependencies/HarmonyImportSpecifierDependency.js
<ide> class HarmonyImportSpecifierDependency extends HarmonyImportDependency {
<ide> }
<ide>
<ide> HarmonyImportSpecifierDependency.Template = class HarmonyImportSpecifierDependencyTemplate extends HarmonyImportDependency.Template {
<del> apply(dep, source, outputOptions, requestShortener) {
<del> super.apply(dep, source, outputOptions, requestShortener);
<del> const importedVar = dep.getImportVar(requestShortener);
<add> apply(dep, source, runtime) {
<add> super.apply(dep, source, runtime);
<add> const importedVar = dep.getImportVar();
<ide> const content = this.getContent(dep, importedVar);
<ide> source.replace(dep.range[0], dep.range[1] - 1, content);
<ide> }
<ide><path>lib/dependencies/HarmonyInitDependency.js
<ide> class HarmonyInitDependency extends NullDependency {
<ide> module.exports = HarmonyInitDependency;
<ide>
<ide> HarmonyInitDependency.Template = class HarmonyInitDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> const module = dep.originModule;
<ide> const list = [];
<ide> for(const dependency of module.dependencies) {
<ide> HarmonyInitDependency.Template = class HarmonyInitDependencyTemplate {
<ide> });
<ide>
<ide> for(const item of list) {
<del> item.template.harmonyInit(item.dependency, source, outputOptions, requestShortener, dependencyTemplates);
<add> item.template.harmonyInit(item.dependency, source, runtime, dependencyTemplates);
<ide> }
<ide> }
<ide> };
<ide><path>lib/dependencies/ImportDependency.js
<ide> class ImportDependency extends ModuleDependency {
<ide> }
<ide>
<ide> ImportDependency.Template = class ImportDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> const depBlock = dep.block;
<del> const promise = DepBlockHelpers.getDepBlockPromise(depBlock, outputOptions, requestShortener, "import()");
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dep.request)) : "";
<add> const promise = DepBlockHelpers.getDepBlockPromise(depBlock, runtime, "import()");
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dep.request)) : "";
<ide>
<ide> const content = this.getContent(promise, dep, comment);
<ide> source.replace(depBlock.range[0], depBlock.range[1] - 1, content);
<ide><path>lib/dependencies/ImportEagerDependency.js
<ide> class ImportEagerDependency extends ModuleDependency {
<ide> }
<ide>
<ide> ImportEagerDependency.Template = class ImportEagerDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dep.request)) : "";
<add> apply(dep, source, runtime) {
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dep.request)) : "";
<ide>
<ide> const content = this.getContent(dep, comment);
<ide> source.replace(dep.range[0], dep.range[1] - 1, content);
<ide><path>lib/dependencies/ImportWeakDependency.js
<ide> class ImportWeakDependency extends ModuleDependency {
<ide> }
<ide>
<ide> ImportWeakDependency.Template = class ImportDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const comment = outputOptions.pathinfo ? Template.toComment(requestShortener.shorten(dep.request)) : "";
<add> apply(dep, source, runtime) {
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(runtime.requestShortener.shorten(dep.request)) : "";
<ide>
<ide> const content = this.getContent(dep, comment);
<ide> source.replace(dep.range[0], dep.range[1] - 1, content);
<ide><path>lib/dependencies/ModuleDependencyTemplateAsId.js
<ide> const Template = require("../Template");
<ide>
<ide> class ModuleDependencyTemplateAsId {
<ide>
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> if(!dep.range) return;
<del> const comment = outputOptions.pathinfo ?
<del> Template.toComment(requestShortener.shorten(dep.request)) + " " : "";
<add> const comment = runtime.outputOptions.pathinfo ?
<add> Template.toComment(runtime.requestShortener.shorten(dep.request)) + " " : "";
<ide> let content;
<ide> if(dep.module)
<ide> content = comment + JSON.stringify(dep.module.id);
<ide><path>lib/dependencies/ModuleDependencyTemplateAsRequireId.js
<ide> const Template = require("../Template");
<ide>
<ide> class ModuleDependencyTemplateAsRequireId {
<ide>
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> if(!dep.range) return;
<del> const comment = outputOptions.pathinfo ?
<del> Template.toComment(requestShortener.shorten(dep.request)) + " " : "";
<add> const comment = runtime.outputOptions.pathinfo ?
<add> Template.toComment(runtime.requestShortener.shorten(dep.request)) + " " : "";
<ide> let content;
<ide> if(dep.module)
<ide> content = `__webpack_require__(${comment}${JSON.stringify(dep.module.id)})`;
<ide><path>lib/dependencies/RequireEnsureDependency.js
<ide> class RequireEnsureDependency extends NullDependency {
<ide> }
<ide>
<ide> RequireEnsureDependency.Template = class RequireEnsureDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> const depBlock = dep.block;
<del> const wrapper = DepBlockHelpers.getLoadDepBlockWrapper(depBlock, outputOptions, requestShortener, "require.ensure");
<add> const wrapper = DepBlockHelpers.getLoadDepBlockWrapper(depBlock, runtime, "require.ensure");
<ide> const errorCallbackExists = depBlock.expr.arguments.length === 4 || (!depBlock.chunkName && depBlock.expr.arguments.length === 3);
<ide> const startBlock = wrapper[0] + "(";
<ide> const middleBlock = `).bind(null, __webpack_require__)${wrapper[1]}`;
<ide><path>lib/dependencies/RequireIncludeDependency.js
<ide> class RequireIncludeDependency extends ModuleDependency {
<ide> }
<ide>
<ide> RequireIncludeDependency.Template = class RequireIncludeDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<del> const comment = outputOptions.pathinfo ? Template.toComment(`require.include ${requestShortener.shorten(dep.request)}`) : "";
<add> apply(dep, source, runtime) {
<add> const comment = runtime.outputOptions.pathinfo ? Template.toComment(`require.include ${runtime.requestShortener.shorten(dep.request)}`) : "";
<ide> source.replace(dep.range[0], dep.range[1] - 1, `undefined${comment}`);
<ide> }
<ide> };
<ide><path>lib/dependencies/UnsupportedDependency.js
<ide> class UnsupportedDependency extends NullDependency {
<ide> }
<ide>
<ide> UnsupportedDependency.Template = class UnsupportedDependencyTemplate {
<del> apply(dep, source, outputOptions, requestShortener) {
<add> apply(dep, source, runtime) {
<ide> source.replace(dep.range[0], dep.range[1], webpackMissingModule(dep.request));
<ide> }
<ide> };
<ide><path>lib/optimize/ConcatenatedModule.js
<ide> class ConcatenatedModule extends Module {
<ide> return this.rootModule.identifier() + " " + hash.digest("hex");
<ide> }
<ide>
<del> source(dependencyTemplates, outputOptions, requestShortener) {
<add> source(dependencyTemplates, runtimeTemplate) {
<add> const requestShortener = runtimeTemplate.requestShortener;
<ide> // Metainfo for each module
<ide> const modulesWithInfo = this._orderedConcatenationList.map((info, idx) => {
<ide> switch(info.type) {
<ide> class ConcatenatedModule extends Module {
<ide> modulesWithInfo.forEach(info => {
<ide> if(info.type === "concatenated") {
<ide> const m = info.module;
<del> const source = m.source(innerDependencyTemplates, outputOptions, requestShortener);
<add> const source = m.source(innerDependencyTemplates, runtimeTemplate);
<ide> const code = source.source();
<ide> let ast;
<ide> try {
<ide> class HarmonyImportSpecifierDependencyConcatenatedTemplate {
<ide> return NaN;
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> harmonyInit(dep, source, runtimeTemplate, dependencyTemplates) {
<ide> const module = dep.module;
<ide> const info = this.modulesMap.get(module);
<ide> if(!info) {
<del> this.originalTemplate.harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.harmonyInit(dep, source, runtimeTemplate, dependencyTemplates);
<ide> return;
<ide> }
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> const module = dep.module;
<ide> const info = this.modulesMap.get(module);
<ide> if(!info) {
<del> this.originalTemplate.apply(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.apply(dep, source, runtime, dependencyTemplates);
<ide> return;
<ide> }
<ide> let content;
<ide> class HarmonyImportSideEffectDependencyConcatenatedTemplate {
<ide> return NaN;
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> harmonyInit(dep, source, runtime, dependencyTemplates) {
<ide> const module = dep.module;
<ide> const info = this.modulesMap.get(module);
<ide> if(!info) {
<del> this.originalTemplate.harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.harmonyInit(dep, source, runtime, dependencyTemplates);
<ide> return;
<ide> }
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> const module = dep.module;
<ide> const info = this.modulesMap.get(module);
<ide> if(!info) {
<del> this.originalTemplate.apply(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.apply(dep, source, runtime, dependencyTemplates);
<ide> return;
<ide> }
<ide> }
<ide> class HarmonyExportSpecifierDependencyConcatenatedTemplate {
<ide> return NaN;
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> harmonyInit(dep, source, runtime, dependencyTemplates) {
<ide> if(dep.originModule === this.rootModule) {
<del> this.originalTemplate.harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.harmonyInit(dep, source, runtime, dependencyTemplates);
<ide> return;
<ide> }
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> if(dep.originModule === this.rootModule) {
<del> this.originalTemplate.apply(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.apply(dep, source, runtime, dependencyTemplates);
<ide> }
<ide> }
<ide> }
<ide> class HarmonyExportExpressionDependencyConcatenatedTemplate {
<ide> this.rootModule = rootModule;
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> let content = "/* harmony default export */ var __WEBPACK_MODULE_DEFAULT_EXPORT__ = ";
<ide> if(dep.originModule === this.rootModule) {
<ide> const used = dep.originModule.isUsed("default");
<ide> class HarmonyExportImportedSpecifierDependencyConcatenatedTemplate {
<ide> return NaN;
<ide> }
<ide>
<del> harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> harmonyInit(dep, source, runtime, dependencyTemplates) {
<ide> const module = dep.module;
<ide> const info = this.modulesMap.get(module);
<ide> if(!info) {
<del> this.originalTemplate.harmonyInit(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.harmonyInit(dep, source, runtime, dependencyTemplates);
<ide> return;
<ide> }
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> if(dep.originModule === this.rootModule) {
<ide> if(this.modulesMap.get(dep.module)) {
<ide> const exportDefs = this.getExports(dep);
<ide> class HarmonyExportImportedSpecifierDependencyConcatenatedTemplate {
<ide> source.insert(-1, content);
<ide> });
<ide> } else {
<del> this.originalTemplate.apply(dep, source, outputOptions, requestShortener, dependencyTemplates);
<add> this.originalTemplate.apply(dep, source, runtime, dependencyTemplates);
<ide> }
<ide> }
<ide> }
<ide> class HarmonyCompatibilityDependencyConcatenatedTemplate {
<ide> this.modulesMap = modulesMap;
<ide> }
<ide>
<del> apply(dep, source, outputOptions, requestShortener, dependencyTemplates) {
<add> apply(dep, source, runtime, dependencyTemplates) {
<ide> // do nothing
<ide> }
<ide> }
<ide><path>test/ContextDependencyTemplateAsId.unittest.js
<ide> require("should");
<ide> const sinon = require("sinon");
<ide> const ContextDependencyTemplateAsId = require("../lib/dependencies/ContextDependencyTemplateAsId");
<ide>
<del>const requestShortenerMock = {
<del> shorten: (request) => `shortened ${request}`
<del>};
<del>
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide> let env;
<ide>
<del> const applyContextDependencyTemplateAsId = function() {
<del> const contextDependencyTemplateAsId = new ContextDependencyTemplateAsId();
<del> const args = [].slice.call(arguments).concat(requestShortenerMock);
<del> contextDependencyTemplateAsId.apply.apply(contextDependencyTemplateAsId, args);
<del> };
<del>
<ide> beforeEach(() => {
<ide> env = {
<ide> source: {
<ide> replace: sinon.stub()
<ide> },
<del> outputOptions: {
<del> pathinfo: true
<add> runtimeTemplate: {
<add> outputOptions: {
<add> pathinfo: true
<add> },
<add> requestShortener: {
<add> shorten: (request) => `shortened ${request}`
<add> }
<ide> },
<ide> module: {
<ide> id: "123",
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide> describe("when applied", () => {
<ide> describe("with module missing depedencies", () => {
<ide> beforeEach(() => {
<del> applyContextDependencyTemplateAsId(env.baseDependency, env.source, env.outputOptions);
<add> new ContextDependencyTemplateAsId().apply(env.baseDependency, env.source, env.runtimeTemplate);
<ide> });
<ide>
<ide> it("replaces source with missing module error", () => {
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide>
<ide> describe("and path info true", function() {
<ide> beforeEach(function() {
<del> env.outputOptions.pathinfo = true;
<del> applyContextDependencyTemplateAsId(env.dependency, env.source, env.outputOptions);
<add> env.runtimeTemplate.outputOptions.pathinfo = true;
<add> new ContextDependencyTemplateAsId().apply(env.dependency, env.source, env.runtimeTemplate);
<ide> });
<ide>
<ide> it("replaces source with webpack require with comment", () => {
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide>
<ide> describe("and path info false", function() {
<ide> beforeEach(function() {
<del> env.outputOptions.pathinfo = false;
<del> applyContextDependencyTemplateAsId(env.dependency, env.source, env.outputOptions);
<add> env.runtimeTemplate.outputOptions.pathinfo = false;
<add> new ContextDependencyTemplateAsId().apply(env.dependency, env.source, env.runtimeTemplate);
<ide> });
<ide>
<ide> it("replaces source with webpack require without comment", () => {
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide> module: env.module
<ide> });
<ide>
<del> applyContextDependencyTemplateAsId(dependency, env.source, env.outputOptions);
<add> new ContextDependencyTemplateAsId().apply(dependency, env.source, env.runtimeTemplate);
<ide> });
<ide>
<ide> it("replaces source with webpack require and wraps value", () => {
<ide> describe("ContextDependencyTemplateAsId", () => {
<ide> module: env.module
<ide> });
<ide>
<del> applyContextDependencyTemplateAsId(dependency, env.source, env.outputOptions);
<add> new ContextDependencyTemplateAsId().apply(dependency, env.source, env.runtimeTemplate);
<ide> });
<ide>
<ide> it("replaces source with webpack require, wraps value and make replacements", () => { | 32 |
Python | Python | allow longer pod names for k8s executor / kpo | d93240696beeca7d28542d0fe0b53871b3d6612c | <ide><path>airflow/kubernetes/kubernetes_helper_functions.py
<ide> from __future__ import annotations
<ide>
<ide> import logging
<add>import secrets
<add>import string
<ide>
<ide> import pendulum
<ide> from slugify import slugify
<ide>
<ide> log = logging.getLogger(__name__)
<ide>
<add>alphanum_lower = string.ascii_lowercase + string.digits
<ide>
<del>def create_pod_id(dag_id: str | None = None, task_id: str | None = None) -> str:
<add>
<add>def rand_str(num):
<add> """Generate random lowercase alphanumeric string of length num.
<add>
<add> :meta private:
<add> """
<add> return "".join(secrets.choice(alphanum_lower) for _ in range(num))
<add>
<add>
<add>def add_pod_suffix(*, pod_name, rand_len=8, max_len=80):
<add> """Add random string to pod name while staying under max len"""
<add> suffix = "-" + rand_str(rand_len)
<add> return pod_name[: max_len - len(suffix)].strip("-.") + suffix
<add>
<add>
<add>def create_pod_id(
<add> dag_id: str | None = None,
<add> task_id: str | None = None,
<add> *,
<add> max_length: int = 80,
<add> unique: bool = True,
<add>) -> str:
<ide> """
<del> Generates the kubernetes safe pod_id. Note that this is
<del> NOT the full ID that will be launched to k8s. We will add a uuid
<del> to ensure uniqueness.
<add> Generates unique pod ID given a dag_id and / or task_id.
<ide>
<ide> :param dag_id: DAG ID
<ide> :param task_id: Task ID
<del> :return: The non-unique pod_id for this task/DAG pairing
<add> :param max_length: max number of characters
<add> :param unique: whether a random string suffix should be added
<add> :return: A valid identifier for a kubernetes pod name
<ide> """
<add> if not (dag_id or task_id):
<add> raise ValueError("Must supply either dag_id or task_id.")
<ide> name = ""
<ide> if dag_id:
<ide> name += dag_id
<ide> if task_id:
<ide> if name:
<ide> name += "-"
<ide> name += task_id
<del> return slugify(name, lowercase=True)[:253].strip("-.")
<add> base_name = slugify(name, lowercase=True)[:max_length].strip(".-")
<add> if unique:
<add> return add_pod_suffix(pod_name=base_name, rand_len=8, max_len=max_length)
<add> else:
<add> return base_name
<ide>
<ide>
<ide> def annotations_to_key(annotations: dict[str, str]) -> TaskInstanceKey:
<ide><path>airflow/kubernetes/pod_generator.py
<ide> import logging
<ide> import os
<ide> import re
<del>import uuid
<ide> import warnings
<ide> from functools import reduce
<ide>
<ide> from kubernetes.client.api_client import ApiClient
<ide>
<ide> from airflow.exceptions import AirflowConfigException, PodReconciliationError, RemovedInAirflow3Warning
<add>from airflow.kubernetes.kubernetes_helper_functions import add_pod_suffix, rand_str
<ide> from airflow.kubernetes.pod_generator_deprecated import PodDefaults, PodGenerator as PodGeneratorDeprecated
<ide> from airflow.utils import yaml
<ide> from airflow.version import version as airflow_version
<ide> def __init__(
<ide>
<ide> def gen_pod(self) -> k8s.V1Pod:
<ide> """Generates pod"""
<add> warnings.warn("This function is deprecated. ", RemovedInAirflow3Warning)
<ide> result = self.ud_pod
<ide>
<del> result.metadata.name = self.make_unique_pod_id(result.metadata.name)
<add> result.metadata.name = add_pod_suffix(pod_name=result.metadata.name)
<ide>
<ide> if self.extract_xcom:
<ide> result = self.add_xcom_sidecar(result)
<ide> def construct_pod(
<ide> - executor_config
<ide> - dynamic arguments
<ide> """
<add> if len(pod_id) > 253:
<add> warnings.warn(
<add> "pod_id supplied is longer than 253 characters; truncating and adding unique suffix."
<add> )
<add> pod_id = add_pod_suffix(pod_name=pod_id, max_len=253)
<ide> try:
<ide> image = pod_override_object.spec.containers[0].image # type: ignore
<ide> if not image:
<ide> def construct_pod(
<ide> metadata=k8s.V1ObjectMeta(
<ide> namespace=namespace,
<ide> annotations=annotations,
<del> name=PodGenerator.make_unique_pod_id(pod_id),
<add> name=pod_id,
<ide> labels=labels,
<ide> ),
<ide> spec=k8s.V1PodSpec(
<ide> def make_unique_pod_id(pod_id: str) -> str | None:
<ide> r"""
<ide> Kubernetes pod names must consist of one or more lowercase
<ide> rfc1035/rfc1123 labels separated by '.' with a maximum length of 253
<del> characters. Each label has a maximum length of 63 characters.
<add> characters.
<ide>
<ide> Name must pass the following regex for validation
<ide> ``^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$``
<ide>
<ide> For more details, see:
<ide> https://github.com/kubernetes/kubernetes/blob/release-1.1/docs/design/identifiers.md
<ide>
<del> :param pod_id: a dag_id with only alphanumeric characters
<add> :param pod_id: requested pod name
<ide> :return: ``str`` valid Pod name of appropriate length
<ide> """
<add> warnings.warn(
<add> "This function is deprecated. Use `add_pod_suffix` in `kubernetes_helper_functions`.",
<add> RemovedInAirflow3Warning,
<add> )
<add>
<ide> if not pod_id:
<ide> return None
<ide>
<del> safe_uuid = uuid.uuid4().hex # safe uuid will always be less than 63 chars
<del>
<del> # Get prefix length after subtracting the uuid length. Clean up '.' and '-' from
<del> # end of podID ('.' can't be followed by '-').
<del> label_prefix_length = MAX_LABEL_LEN - len(safe_uuid) - 1 # -1 for separator
<del> trimmed_pod_id = pod_id[:label_prefix_length].rstrip("-.")
<del>
<del> # previously used a '.' as the separator, but this could create errors in some situations
<del> return f"{trimmed_pod_id}-{safe_uuid}"
<add> max_pod_id_len = 100 # arbitrarily chosen
<add> suffix = rand_str(8) # 8 seems good enough
<add> base_pod_id_len = max_pod_id_len - len(suffix) - 1 # -1 for separator
<add> trimmed_pod_id = pod_id[:base_pod_id_len].rstrip("-.")
<add> return f"{trimmed_pod_id}-{suffix}"
<ide>
<ide>
<ide> def merge_objects(base_obj, client_obj):
<ide><path>airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py
<ide> import json
<ide> import logging
<ide> import re
<add>import secrets
<add>import string
<ide> import warnings
<ide> from contextlib import AbstractContextManager
<ide> from typing import TYPE_CHECKING, Any, Sequence
<ide>
<ide> from kubernetes.client import CoreV1Api, models as k8s
<add>from slugify import slugify
<ide>
<ide> from airflow.compat.functools import cached_property
<ide> from airflow.exceptions import AirflowException
<ide>
<ide> from airflow.utils.context import Context
<ide>
<add>alphanum_lower = string.ascii_lowercase + string.digits
<ide>
<del>def _task_id_to_pod_name(val: str) -> str:
<add>
<add>def _rand_str(num):
<add> """Generate random lowercase alphanumeric string of length num.
<add>
<add> TODO: when min airflow version >= 2.5, delete this function and import from kubernetes_helper_functions.
<add>
<add> :meta private:
<add> """
<add> return "".join(secrets.choice(alphanum_lower) for _ in range(num))
<add>
<add>
<add>def _add_pod_suffix(*, pod_name, rand_len=8, max_len=253):
<add> """Add random string to pod name while staying under max len
<add>
<add> TODO: when min airflow version >= 2.5, delete this function and import from kubernetes_helper_functions.
<add>
<add> :meta private:
<ide> """
<del> Given a task_id, convert it to a pod name.
<del> Adds a 0 if start or end char is invalid.
<del> Replaces any other invalid char with `-`.
<add> suffix = "-" + _rand_str(rand_len)
<add> return pod_name[: max_len - len(suffix)].strip("-.") + suffix
<ide>
<del> :param val: non-empty string, presumed to be a task id
<del> :return valid kubernetes object name.
<add>
<add>def _create_pod_id(
<add> dag_id: str | None = None,
<add> task_id: str | None = None,
<add> *,
<add> max_length: int = 80,
<add> unique: bool = True,
<add>) -> str:
<ide> """
<del> if not val:
<del> raise ValueError("_task_id_to_pod_name requires non-empty string.")
<del> val = val.lower()
<del> if not re.match(r"[a-z0-9]", val[0]):
<del> val = f"0{val}"
<del> if not re.match(r"[a-z0-9]", val[-1]):
<del> val = f"{val}0"
<del> val = re.sub(r"[^a-z0-9\-.]", "-", val)
<del> if len(val) > 253:
<del> raise ValueError(
<del> f"Pod name {val} is longer than 253 characters. "
<del> "See https://kubernetes.io/docs/concepts/overview/working-with-objects/names/."
<del> )
<del> return val
<add> Generates unique pod ID given a dag_id and / or task_id.
<add>
<add> TODO: when min airflow version >= 2.5, delete this function and import from kubernetes_helper_functions.
<add>
<add> :param dag_id: DAG ID
<add> :param task_id: Task ID
<add> :param max_length: max number of characters
<add> :param unique: whether a random string suffix should be added
<add> :return: A valid identifier for a kubernetes pod name
<add> """
<add> if not (dag_id or task_id):
<add> raise ValueError("Must supply either dag_id or task_id.")
<add> name = ""
<add> if dag_id:
<add> name += dag_id
<add> if task_id:
<add> if name:
<add> name += "-"
<add> name += task_id
<add> base_name = slugify(name, lowercase=True)[:max_length].strip(".-")
<add> if unique:
<add> return _add_pod_suffix(pod_name=base_name, max_len=max_length)
<add> else:
<add> return base_name
<ide>
<ide>
<ide> class PodReattachFailure(AirflowException):
<ide> def __init__(
<ide> namespace: str | None = None,
<ide> image: str | None = None,
<ide> name: str | None = None,
<del> random_name_suffix: bool | None = True,
<add> random_name_suffix: bool = True,
<ide> cmds: list[str] | None = None,
<ide> arguments: list[str] | None = None,
<ide> ports: list[k8s.V1ContainerPort] | None = None,
<ide> def build_pod_request_obj(self, context: Context | None = None) -> k8s.V1Pod:
<ide> pod = PodGenerator.reconcile_pods(pod_template, pod)
<ide>
<ide> if not pod.metadata.name:
<del> pod.metadata.name = _task_id_to_pod_name(self.task_id)
<del>
<del> if self.random_name_suffix:
<del> pod.metadata.name = PodGenerator.make_unique_pod_id(pod.metadata.name)
<add> pod.metadata.name = _create_pod_id(task_id=self.task_id, unique=self.random_name_suffix)
<add> elif self.random_name_suffix:
<add> # user has supplied pod name, we're just adding suffix
<add> pod.metadata.name = _add_pod_suffix(pod_name=pod.metadata.name)
<ide>
<ide> if not pod.metadata.namespace:
<ide> # todo: replace with call to `hook.get_namespace` in 6.0, when it doesn't default to `default`.
<ide><path>tests/kubernetes/models/test_secret.py
<ide> def test_only_mount_sub_secret(self, mock_uuid):
<ide> )
<ide>
<ide> @mock.patch("uuid.uuid4")
<del> def test_attach_to_pod(self, mock_uuid):
<add> @mock.patch("airflow.kubernetes.pod_generator.rand_str")
<add> def test_attach_to_pod(self, mock_rand_str, mock_uuid):
<ide> static_uuid = uuid.UUID("cf4a56d2-8101-4217-b027-2af6216feb48")
<ide> mock_uuid.return_value = static_uuid
<add> rand_str = "abcd1234"
<add> mock_rand_str.return_value = rand_str
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base.yaml"
<del> pod = PodGenerator(pod_template_file=path).gen_pod()
<add> pod = PodGenerator(pod_template_file=path).ud_pod
<ide> secrets = [
<ide> # This should be a secretRef
<ide> Secret("env", None, "secret_a"),
<ide> def test_attach_to_pod(self, mock_uuid):
<ide> "kind": "Pod",
<ide> "metadata": {
<ide> "labels": {"app": "myapp"},
<del> "name": "myapp-pod-cf4a56d281014217b0272af6216feb48",
<add> "name": "myapp-pod",
<ide> "namespace": "default",
<ide> },
<ide> "spec": {
<ide> def test_attach_to_pod(self, mock_uuid):
<ide> "ports": [{"containerPort": 1234, "name": "foo"}],
<ide> "resources": {"limits": {"memory": "200Mi"}, "requests": {"memory": "100Mi"}},
<ide> "volumeMounts": [
<del> {"mountPath": "/airflow/xcom", "name": "xcom"},
<ide> {
<ide> "mountPath": "/etc/foo",
<ide> "name": "secretvol" + str(static_uuid),
<ide> "readOnly": True,
<ide> },
<ide> ],
<ide> },
<del> {
<del> "command": ["sh", "-c", 'trap "exit 0" INT; while true; do sleep 30; done;'],
<del> "image": "alpine",
<del> "name": "airflow-xcom-sidecar",
<del> "resources": {"requests": {"cpu": "1m"}},
<del> "volumeMounts": [{"mountPath": "/airflow/xcom", "name": "xcom"}],
<del> },
<ide> ],
<ide> "hostNetwork": True,
<ide> "imagePullSecrets": [{"name": "pull_secret_a"}, {"name": "pull_secret_b"}],
<ide> "securityContext": {"fsGroup": 2000, "runAsUser": 1000},
<ide> "volumes": [
<del> {"emptyDir": {}, "name": "xcom"},
<ide> {"name": "secretvol" + str(static_uuid), "secret": {"secretName": "secret_b"}},
<ide> ],
<ide> },
<ide><path>tests/kubernetes/test_kubernetes_helper_functions.py
<ide> import re
<ide>
<ide> import pytest
<add>from pytest import param
<ide>
<ide> from airflow.kubernetes.kubernetes_helper_functions import create_pod_id
<add>from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import _create_pod_id
<ide>
<ide> pod_name_regex = r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$"
<ide>
<ide>
<add># todo: when cncf provider min airflow version >= 2.5 remove this parameterization
<add># we added this function to provider temporarily until min airflow version catches up
<add># meanwhile, we use this one test to test both core and provider
<ide> @pytest.mark.parametrize(
<del> "val, expected",
<del> [
<del> ("task-id", "task-id"), # no problem
<del> ("task_id", "task-id"), # underscores
<del> ("---task.id---", "task-id"), # dots
<del> (".task.id", "task-id"), # leading dot invalid
<del> ("**task.id", "task-id"), # leading dot invalid
<del> ("-90Abc*&", "90abc"), # invalid ends
<del> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a"), # weird unicode
<del> ],
<add> "create_pod_id", [param(_create_pod_id, id="provider"), param(create_pod_id, id="core")]
<ide> )
<del>def test_create_pod_id_task_only(val, expected):
<del> actual = create_pod_id(task_id=val)
<del> assert actual == expected
<del> assert re.match(pod_name_regex, actual)
<add>class TestCreatePodId:
<add> @pytest.mark.parametrize(
<add> "val, expected",
<add> [
<add> ("task-id", "task-id"), # no problem
<add> ("task_id", "task-id"), # underscores
<add> ("---task.id---", "task-id"), # dots
<add> (".task.id", "task-id"), # leading dot invalid
<add> ("**task.id", "task-id"), # leading dot invalid
<add> ("-90Abc*&", "90abc"), # invalid ends
<add> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a"), # weird unicode
<add> ],
<add> )
<add> def test_create_pod_id_task_only(self, val, expected, create_pod_id):
<add> actual = create_pod_id(task_id=val, unique=False)
<add> assert actual == expected
<add> assert re.match(pod_name_regex, actual)
<ide>
<add> @pytest.mark.parametrize(
<add> "val, expected",
<add> [
<add> ("dag-id", "dag-id"), # no problem
<add> ("dag_id", "dag-id"), # underscores
<add> ("---dag.id---", "dag-id"), # dots
<add> (".dag.id", "dag-id"), # leading dot invalid
<add> ("**dag.id", "dag-id"), # leading dot invalid
<add> ("-90Abc*&", "90abc"), # invalid ends
<add> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a"), # weird unicode
<add> ],
<add> )
<add> def test_create_pod_id_dag_only(self, val, expected, create_pod_id):
<add> actual = create_pod_id(dag_id=val, unique=False)
<add> assert actual == expected
<add> assert re.match(pod_name_regex, actual)
<ide>
<del>@pytest.mark.parametrize(
<del> "val, expected",
<del> [
<del> ("dag-id", "dag-id"), # no problem
<del> ("dag_id", "dag-id"), # underscores
<del> ("---dag.id---", "dag-id"), # dots
<del> (".dag.id", "dag-id"), # leading dot invalid
<del> ("**dag.id", "dag-id"), # leading dot invalid
<del> ("-90Abc*&", "90abc"), # invalid ends
<del> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a"), # weird unicode
<del> ],
<del>)
<del>def test_create_pod_id_dag_only(val, expected):
<del> actual = create_pod_id(dag_id=val)
<del> assert actual == expected
<del> assert re.match(pod_name_regex, actual)
<del>
<add> @pytest.mark.parametrize(
<add> "dag_id, task_id, expected",
<add> [
<add> ("dag-id", "task-id", "dag-id-task-id"), # no problem
<add> ("dag_id", "task_id", "dag-id-task-id"), # underscores
<add> ("dag.id", "task.id", "dag-id-task-id"), # dots
<add> (".dag.id", ".---task.id", "dag-id-task-id"), # leading dot invalid
<add> ("**dag.id", "**task.id", "dag-id-task-id"), # leading dot invalid
<add> ("-90Abc*&", "-90Abc*&", "90abc-90abc"), # invalid ends
<add> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a-90aclb-c-ssss-c-a"), # ugly
<add> ],
<add> )
<add> def test_create_pod_id_dag_and_task(self, dag_id, task_id, expected, create_pod_id):
<add> actual = create_pod_id(dag_id=dag_id, task_id=task_id, unique=False)
<add> assert actual == expected
<add> assert re.match(pod_name_regex, actual)
<ide>
<del>@pytest.mark.parametrize(
<del> "dag_id, task_id, expected",
<del> [
<del> ("dag-id", "task-id", "dag-id-task-id"), # no problem
<del> ("dag_id", "task_id", "dag-id-task-id"), # underscores
<del> ("dag.id", "task.id", "dag-id-task-id"), # dots
<del> (".dag.id", ".---task.id", "dag-id-task-id"), # leading dot invalid
<del> ("**dag.id", "**task.id", "dag-id-task-id"), # leading dot invalid
<del> ("-90Abc*&", "-90Abc*&", "90abc-90abc"), # invalid ends
<del> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90AçLbˆˆç˙ßߘ˜˙c*a", "90aclb-c-ssss-c-a-90aclb-c-ssss-c-a"), # ugly
<del> ],
<del>)
<del>def test_create_pod_id_dag_and_task(dag_id, task_id, expected):
<del> actual = create_pod_id(dag_id=dag_id, task_id=task_id)
<del> assert actual == expected
<del> assert re.match(pod_name_regex, actual)
<add> def test_create_pod_id_dag_too_long_with_suffix(self, create_pod_id):
<add> actual = create_pod_id("0" * 254)
<add> assert re.match(r"0{71}-[a-z0-9]{8}", actual)
<add> assert re.match(pod_name_regex, actual)
<ide>
<add> def test_create_pod_id_dag_too_long_non_unique(self, create_pod_id):
<add> actual = create_pod_id("0" * 254, unique=False)
<add> assert re.match(r"0{80}", actual)
<add> assert re.match(pod_name_regex, actual)
<ide>
<del>def test_create_pod_id_dag_too_long():
<del> actual = create_pod_id("0" * 254)
<del> assert actual == "0" * 253
<del> assert re.match(pod_name_regex, actual)
<add> @pytest.mark.parametrize("unique", [True, False])
<add> @pytest.mark.parametrize("length", [25, 100, 200, 300])
<add> def test_create_pod_id(self, create_pod_id, length, unique):
<add> """Test behavior of max_length and unique."""
<add> dag_id = "dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-dag-"
<add> task_id = "task-task-task-task-task-task-task-task-task-task-task-task-task-task-task-task-task-"
<add> actual = create_pod_id(
<add> dag_id=dag_id,
<add> task_id=task_id,
<add> max_length=length,
<add> unique=unique,
<add> )
<add> base = f"{dag_id}{task_id}".strip("-")
<add> if unique:
<add> assert actual[:-9] == base[: length - 9].strip("-")
<add> assert re.match(r"-[a-z0-9]{8}", actual[-9:])
<add> else:
<add> assert actual == base[:length]
<ide><path>tests/kubernetes/test_pod_generator.py
<ide> import os
<ide> import re
<ide> import sys
<del>import uuid
<ide> from unittest import mock
<ide> from unittest.mock import MagicMock
<ide>
<ide> import pytest
<ide> from dateutil import parser
<ide> from kubernetes.client import ApiClient, models as k8s
<del>from parameterized import parameterized
<add>from pytest import param
<ide>
<ide> from airflow import __version__
<ide> from airflow.exceptions import AirflowConfigException, PodReconciliationError
<ide>
<ide> class TestPodGenerator:
<ide> def setup_method(self):
<del> self.static_uuid = uuid.UUID("cf4a56d2-8101-4217-b027-2af6216feb48")
<add> self.rand_str = "abcd1234"
<ide> self.deserialize_result = {
<ide> "apiVersion": "v1",
<ide> "kind": "Pod",
<ide> def setup_method(self):
<ide> }
<ide> self.metadata = {
<ide> "labels": self.labels,
<del> "name": "pod_id-" + self.static_uuid.hex,
<add> "name": "pod_id-" + self.rand_str,
<ide> "namespace": "namespace",
<ide> "annotations": self.annotations,
<ide> }
<ide> def setup_method(self):
<ide> kind="Pod",
<ide> metadata=k8s.V1ObjectMeta(
<ide> namespace="default",
<del> name="myapp-pod-" + self.static_uuid.hex,
<add> name="myapp-pod-" + self.rand_str,
<ide> labels={"app": "myapp"},
<ide> ),
<ide> spec=k8s.V1PodSpec(
<ide> def setup_method(self):
<ide> ),
<ide> )
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_gen_pod_extract_xcom(self, mock_uuid):
<del> mock_uuid.return_value = self.static_uuid
<add> @mock.patch("airflow.kubernetes.kubernetes_helper_functions.rand_str")
<add> def test_gen_pod_extract_xcom(self, mock_rand_str):
<add> """
<add> Method gen_pod is used nowhere in codebase and is deprecated.
<add> This test is only retained for backcompat.
<add> """
<add> mock_rand_str.return_value = self.rand_str
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<ide>
<ide> pod_generator = PodGenerator(pod_template_file=path, extract_xcom=True)
<ide> result = pod_generator.gen_pod()
<del> result_dict = self.k8s_client.sanitize_for_serialization(result)
<ide> container_two = {
<ide> "name": "airflow-xcom-sidecar",
<ide> "image": "alpine",
<ide> def test_gen_pod_extract_xcom(self, mock_uuid):
<ide> )
<ide> result_dict = self.k8s_client.sanitize_for_serialization(result)
<ide> expected_dict = self.k8s_client.sanitize_for_serialization(self.expected)
<del>
<ide> assert result_dict == expected_dict
<ide>
<ide> def test_from_obj(self):
<ide> def test_from_obj(self):
<ide> },
<ide> } == result
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_reconcile_pods_empty_mutator_pod(self, mock_uuid):
<del> mock_uuid.return_value = self.static_uuid
<add> def test_reconcile_pods_empty_mutator_pod(self):
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<del>
<ide> pod_generator = PodGenerator(pod_template_file=path, extract_xcom=True)
<del> base_pod = pod_generator.gen_pod()
<add> base_pod = pod_generator.ud_pod
<ide> mutator_pod = None
<del> name = "name1-" + self.static_uuid.hex
<del>
<del> base_pod.metadata.name = name
<del>
<ide> result = PodGenerator.reconcile_pods(base_pod, mutator_pod)
<ide> assert base_pod == result
<ide>
<ide> mutator_pod = k8s.V1Pod()
<ide> result = PodGenerator.reconcile_pods(base_pod, mutator_pod)
<ide> assert base_pod == result
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_reconcile_pods(self, mock_uuid):
<del> mock_uuid.return_value = self.static_uuid
<add> @mock.patch("airflow.kubernetes.kubernetes_helper_functions.rand_str")
<add> def test_reconcile_pods(self, mock_rand_str):
<add> mock_rand_str.return_value = self.rand_str
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<ide>
<del> base_pod = PodGenerator(pod_template_file=path, extract_xcom=False).gen_pod()
<add> base_pod = PodGenerator(pod_template_file=path, extract_xcom=False).ud_pod
<ide>
<ide> mutator_pod = k8s.V1Pod(
<ide> metadata=k8s.V1ObjectMeta(
<ide> def test_reconcile_pods(self, mock_uuid):
<ide> @pytest.mark.parametrize(
<ide> "config_image, expected_image",
<ide> [
<del> pytest.param("my_image:my_tag", "my_image:my_tag", id="image_in_cfg"),
<del> pytest.param(None, "busybox", id="no_image_in_cfg"),
<add> param("my_image:my_tag", "my_image:my_tag", id="image_in_cfg"),
<add> param(None, "busybox", id="no_image_in_cfg"),
<ide> ],
<ide> )
<del> @mock.patch("uuid.uuid4")
<del> def test_construct_pod(self, mock_uuid, config_image, expected_image):
<add> def test_construct_pod(self, config_image, expected_image):
<ide> template_file = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<ide> worker_config = PodGenerator.deserialize_model_file(template_file)
<del> mock_uuid.return_value = self.static_uuid
<ide> executor_config = k8s.V1Pod(
<ide> spec=k8s.V1PodSpec(
<ide> containers=[
<ide> def test_construct_pod(self, mock_uuid, config_image, expected_image):
<ide> expected.metadata.labels = self.labels
<ide> expected.metadata.labels["app"] = "myapp"
<ide> expected.metadata.annotations = self.annotations
<del> expected.metadata.name = "pod_id-" + self.static_uuid.hex
<add> expected.metadata.name = "pod_id"
<ide> expected.metadata.namespace = "test_namespace"
<ide> expected.spec.containers[0].args = ["command"]
<ide> expected.spec.containers[0].image = expected_image
<ide> def test_construct_pod(self, mock_uuid, config_image, expected_image):
<ide>
<ide> assert expected_dict == result_dict
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_construct_pod_mapped_task(self, mock_uuid):
<add> def test_construct_pod_mapped_task(self):
<ide> template_file = sys.path[0] + "/tests/kubernetes/pod_generator_base.yaml"
<ide> worker_config = PodGenerator.deserialize_model_file(template_file)
<del> mock_uuid.return_value = self.static_uuid
<del>
<ide> result = PodGenerator.construct_pod(
<ide> dag_id=self.dag_id,
<ide> task_id=self.task_id,
<ide> def test_construct_pod_mapped_task(self, mock_uuid):
<ide> expected.metadata.labels["map_index"] = "0"
<ide> expected.metadata.annotations = self.annotations
<ide> expected.metadata.annotations["map_index"] = "0"
<del> expected.metadata.name = "pod_id-" + self.static_uuid.hex
<add> expected.metadata.name = "pod_id"
<ide> expected.metadata.namespace = "test_namespace"
<ide> expected.spec.containers[0].args = ["command"]
<ide> del expected.spec.containers[0].env_from[1:]
<ide> def test_construct_pod_mapped_task(self, mock_uuid):
<ide>
<ide> assert expected_dict == result_dict
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_construct_pod_empty_executor_config(self, mock_uuid):
<add> def test_construct_pod_empty_executor_config(self):
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<ide> worker_config = PodGenerator.deserialize_model_file(path)
<del> mock_uuid.return_value = self.static_uuid
<ide> executor_config = None
<ide>
<ide> result = PodGenerator.construct_pod(
<ide> def test_construct_pod_empty_executor_config(self, mock_uuid):
<ide> worker_config.metadata.annotations = self.annotations
<ide> worker_config.metadata.labels = self.labels
<ide> worker_config.metadata.labels["app"] = "myapp"
<del> worker_config.metadata.name = "pod_id-" + self.static_uuid.hex
<add> worker_config.metadata.name = "pod_id"
<ide> worker_config.metadata.namespace = "namespace"
<ide> worker_config.spec.containers[0].env.append(
<ide> k8s.V1EnvVar(name="AIRFLOW_IS_K8S_EXECUTOR_POD", value="True")
<ide> )
<ide> worker_config_result = self.k8s_client.sanitize_for_serialization(worker_config)
<del> assert worker_config_result == sanitized_result
<add> assert sanitized_result == worker_config_result
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_construct_pod_attribute_error(self, mock_uuid):
<add> @mock.patch("airflow.kubernetes.kubernetes_helper_functions.rand_str")
<add> def test_construct_pod_attribute_error(self, mock_rand_str):
<ide> """
<ide> After upgrading k8s library we might get attribute error.
<ide> In this case it should raise PodReconciliationError
<ide> """
<ide> path = sys.path[0] + "/tests/kubernetes/pod_generator_base_with_secrets.yaml"
<ide> worker_config = PodGenerator.deserialize_model_file(path)
<del> mock_uuid.return_value = self.static_uuid
<add> mock_rand_str.return_value = self.rand_str
<ide> executor_config = MagicMock()
<ide> executor_config.side_effect = AttributeError("error")
<ide>
<ide> def test_construct_pod_attribute_error(self, mock_uuid):
<ide> scheduler_job_id="uuid",
<ide> )
<ide>
<del> @mock.patch("uuid.uuid4")
<del> def test_ensure_max_label_length(self, mock_uuid):
<del> mock_uuid.return_value = self.static_uuid
<add> @mock.patch("airflow.kubernetes.kubernetes_helper_functions.rand_str")
<add> def test_ensure_max_identifier_length(self, mock_rand_str):
<add> mock_rand_str.return_value = self.rand_str
<ide> path = os.path.join(os.path.dirname(__file__), "pod_generator_base_with_secrets.yaml")
<ide> worker_config = PodGenerator.deserialize_model_file(path)
<ide>
<ide> def test_ensure_max_label_length(self, mock_uuid):
<ide> base_worker_pod=worker_config,
<ide> )
<ide>
<del> assert result.metadata.name == "a" * 30 + "-" + self.static_uuid.hex
<add> assert result.metadata.name == "a" * 244 + "-" + self.rand_str
<ide> for _, v in result.metadata.labels.items():
<ide> assert len(v) <= 63
<ide>
<ide> def test_deserialize_non_existent_model_file(self, caplog):
<ide> assert len(caplog.records) == 1
<ide> assert "does not exist" in caplog.text
<ide>
<del> @parameterized.expand(
<add> @pytest.mark.parametrize(
<add> "input",
<ide> (
<del> ("max_label_length", "a" * 63),
<del> ("max_subdomain_length", "a" * 253),
<del> (
<del> "tiny",
<del> "aaa",
<del> ),
<del> )
<add> param("a" * 70, id="max_label_length"),
<add> param("a" * 253, id="max_subdomain_length"),
<add> param("a" * 95, id="close to max"),
<add> param("aaa", id="tiny"),
<add> ),
<ide> )
<del> def test_pod_name_confirm_to_max_length(self, _, pod_id):
<del> name = PodGenerator.make_unique_pod_id(pod_id)
<del> assert len(name) <= 253
<del> parts = name.split("-")
<del>
<del> # 63 is the MAX_LABEL_LEN in pod_generator.py
<del> # 33 is the length of uuid4 + 1 for the separating '-' (32 + 1)
<del> # 30 is the max length of the prefix
<del> # so 30 = 63 - (32 + 1)
<del> assert len(parts[0]) <= 30
<del> assert len(parts[1]) == 32
<del>
<del> @parameterized.expand(
<add> def test_pod_name_confirm_to_max_length(self, input):
<add> actual = PodGenerator.make_unique_pod_id(input)
<add> assert len(actual) <= 100
<add> actual_base, actual_suffix = actual.rsplit("-", maxsplit=1)
<add> # we limit pod id length to 100
<add> # random suffix is 8 chars plus the '-' separator
<add> # so actual pod id base should first 91 chars of requested pod id
<add> assert actual_base == input[:91]
<add> # suffix should always be 8, the random alphanum
<add> assert re.match(r"^[a-z0-9]{8}$", actual_suffix)
<add>
<add> @pytest.mark.parametrize(
<add> "pod_id, expected_starts_with",
<ide> (
<add> (
<add> "somewhat-long-pod-name-maybe-longer-than-previously-supported-with-hyphen-",
<add> "somewhat-long-pod-name-maybe-longer-than-previously-supported-with-hyphen",
<add> ),
<ide> ("pod-name-with-hyphen-", "pod-name-with-hyphen"),
<ide> ("pod-name-with-double-hyphen--", "pod-name-with-double-hyphen"),
<ide> ("pod0-name", "pod0-name"),
<ide> ("simple", "simple"),
<ide> ("pod-name-with-dot.", "pod-name-with-dot"),
<ide> ("pod-name-with-double-dot..", "pod-name-with-double-dot"),
<ide> ("pod-name-with-hyphen-dot-.", "pod-name-with-hyphen-dot"),
<del> )
<add> ),
<ide> )
<ide> def test_pod_name_is_valid(self, pod_id, expected_starts_with):
<del> name = PodGenerator.make_unique_pod_id(pod_id)
<del>
<add> """
<add> `make_unique_pod_id` doesn't actually guarantee that the regex passes for any input.
<add> But I guess this test verifies that an otherwise valid pod_id doesn't get _screwed up_.
<add> """
<add> actual = PodGenerator.make_unique_pod_id(pod_id)
<add> assert len(actual) <= 253
<add> assert actual == actual.lower(), "not lowercase"
<add> # verify using official k8s regex
<ide> regex = r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$"
<del> assert (
<del> len(name) <= 253 and all(ch.lower() == ch for ch in name) and re.match(regex, name)
<del> ), "pod_id is invalid - fails allowed regex check"
<del>
<del> assert name.rsplit("-", 1)[0] == expected_starts_with
<add> assert re.match(regex, actual), "pod_id is invalid - fails allowed regex check"
<add> assert actual.rsplit("-", 1)[0] == expected_starts_with
<add> # verify ends with 8 char lowercase alphanum string
<add> assert re.match(rf"^{expected_starts_with}-[a-z0-9]{{8}}$", actual), "doesn't match expected pattern"
<ide>
<ide> def test_validate_pod_generator(self):
<ide> with pytest.raises(AirflowConfigException):
<ide><path>tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py
<ide> from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
<ide> KubernetesPodOperator,
<ide> _optionally_suppress,
<del> _task_id_to_pod_name,
<ide> )
<ide> from airflow.utils import timezone
<ide> from airflow.utils.session import create_session
<ide> def test_task_id_as_name(self):
<ide> random_name_suffix=False,
<ide> )
<ide> pod = k.build_pod_request_obj({})
<del> assert pod.metadata.name == "0.hi.--09hi"
<add> assert pod.metadata.name == "hi-09hi"
<ide>
<ide> def test_task_id_as_name_with_suffix(self):
<ide> k = KubernetesPodOperator(
<ide> task_id=".hi.-_09HI",
<ide> random_name_suffix=True,
<ide> )
<ide> pod = k.build_pod_request_obj({})
<del> expected = "0.hi.--09hi"
<del> assert pod.metadata.name.startswith(expected)
<del> assert re.match(rf"{expected}-[a-z0-9-]+", pod.metadata.name) is not None
<add> expected = "hi-09hi"
<add> assert pod.metadata.name[: len(expected)] == expected
<add> assert re.match(rf"{expected}-[a-z0-9]{{8}}", pod.metadata.name) is not None
<ide>
<ide> def test_task_id_as_name_with_suffix_very_long(self):
<ide> k = KubernetesPodOperator(
<ide> task_id="a" * 250,
<ide> random_name_suffix=True,
<ide> )
<ide> pod = k.build_pod_request_obj({})
<del> assert re.match(r"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa-[a-z0-9-]+", pod.metadata.name) is not None
<add> assert (
<add> re.match(
<add> r"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa-[a-z0-9]{8}",
<add> pod.metadata.name,
<add> )
<add> is not None
<add> )
<ide>
<ide> def test_task_id_as_name_dag_id_is_ignored(self):
<ide> dag = DAG(dag_id="this_is_a_dag_name", start_date=pendulum.now())
<ide> def test__suppress_no_error(self, caplog):
<ide> with _optionally_suppress():
<ide> print("hi")
<ide> assert caplog.text == ""
<del>
<del>
<del>@pytest.mark.parametrize(
<del> "val, expected",
<del> [
<del> ("task-id", "task-id"), # no problem
<del> ("task_id", "task-id"), # underscores
<del> ("task.id", "task.id"), # dots ok
<del> (".task.id", "0.task.id"), # leading dot invalid
<del> ("-90Abc*&", "0-90abc--0"), # invalid ends
<del> ("90AçLbˆˆç˙ßߘ˜˙c*a", "90a-lb---------c-a"), # weird unicode
<del> ],
<del>)
<del>def test_task_id_to_pod_name(val, expected):
<del> assert _task_id_to_pod_name(val) == expected
<del>
<del>
<del>def test_task_id_to_pod_name_long():
<del> with pytest.raises(ValueError, match="longer than 253"):
<del> _task_id_to_pod_name("0" * 254) | 7 |
Ruby | Ruby | start work towards making ar include-able | b2c9ce341a1c907041f55461aefebb0321280cb5 | <ide><path>activerecord/lib/active_record.rb
<ide> module ActiveRecord
<ide>
<ide> autoload :Base
<ide> autoload :Callbacks
<add> autoload :Core
<ide> autoload :CounterCache
<ide> autoload :DynamicMatchers
<ide> autoload :DynamicFinderMatch
<ide> module ActiveRecord
<ide> autoload :Integration
<ide> autoload :Migration
<ide> autoload :Migrator, 'active_record/migration'
<add> autoload :Model
<ide> autoload :ModelSchema
<ide> autoload :NestedAttributes
<ide> autoload :Observer
<ide><path>activerecord/lib/active_record/base.rb
<ide> module ActiveRecord #:nodoc:
<ide> # So it's possible to assign a logger to the class through <tt>Base.logger=</tt> which will then be used by all
<ide> # instances in the current object space.
<ide> class Base
<del> ##
<del> # :singleton-method:
<del> # Accepts a logger conforming to the interface of Log4r or the default Ruby 1.8+ Logger class,
<del> # which is then passed on to any new database connections made and which can be retrieved on both
<del> # a class and instance level by calling +logger+.
<del> cattr_accessor :logger, :instance_writer => false
<del>
<del> ##
<del> # :singleton-method:
<del> # Contains the database configuration - as is typically stored in config/database.yml -
<del> # as a Hash.
<del> #
<del> # For example, the following database.yml...
<del> #
<del> # development:
<del> # adapter: sqlite3
<del> # database: db/development.sqlite3
<del> #
<del> # production:
<del> # adapter: sqlite3
<del> # database: db/production.sqlite3
<del> #
<del> # ...would result in ActiveRecord::Base.configurations to look like this:
<del> #
<del> # {
<del> # 'development' => {
<del> # 'adapter' => 'sqlite3',
<del> # 'database' => 'db/development.sqlite3'
<del> # },
<del> # 'production' => {
<del> # 'adapter' => 'sqlite3',
<del> # 'database' => 'db/production.sqlite3'
<del> # }
<del> # }
<del> cattr_accessor :configurations, :instance_writer => false
<del> @@configurations = {}
<del>
<del> ##
<del> # :singleton-method:
<del> # Determines whether to use Time.local (using :local) or Time.utc (using :utc) when pulling
<del> # dates and times from the database. This is set to :local by default.
<del> cattr_accessor :default_timezone, :instance_writer => false
<del> @@default_timezone = :local
<del>
<del> ##
<del> # :singleton-method:
<del> # Specifies the format to use when dumping the database schema with Rails'
<del> # Rakefile. If :sql, the schema is dumped as (potentially database-
<del> # specific) SQL statements. If :ruby, the schema is dumped as an
<del> # ActiveRecord::Schema file which can be loaded into any database that
<del> # supports migrations. Use :ruby if you want to have different database
<del> # adapters for, e.g., your development and test environments.
<del> cattr_accessor :schema_format , :instance_writer => false
<del> @@schema_format = :ruby
<del>
<del> ##
<del> # :singleton-method:
<del> # Specify whether or not to use timestamps for migration versions
<del> cattr_accessor :timestamped_migrations , :instance_writer => false
<del> @@timestamped_migrations = true
<del>
<del> class << self # Class methods
<del> def inherited(child_class) #:nodoc:
<del> # force attribute methods to be higher in inheritance hierarchy than other generated methods
<del> child_class.generated_attribute_methods
<del> child_class.generated_feature_methods
<del> super
<del> end
<del>
<del> def generated_feature_methods
<del> @generated_feature_methods ||= begin
<del> mod = const_set(:GeneratedFeatureMethods, Module.new)
<del> include mod
<del> mod
<del> end
<del> end
<del>
<del> # Returns a string like 'Post(id:integer, title:string, body:text)'
<del> def inspect
<del> if self == Base
<del> super
<del> elsif abstract_class?
<del> "#{super}(abstract)"
<del> elsif table_exists?
<del> attr_list = columns.map { |c| "#{c.name}: #{c.type}" } * ', '
<del> "#{super}(#{attr_list})"
<del> else
<del> "#{super}(Table doesn't exist)"
<del> end
<del> end
<del>
<del> # Overwrite the default class equality method to provide support for association proxies.
<del> def ===(object)
<del> object.is_a?(self)
<del> end
<del>
<del> def arel_table
<del> @arel_table ||= Arel::Table.new(table_name, arel_engine)
<del> end
<del>
<del> def arel_engine
<del> @arel_engine ||= begin
<del> if self == ActiveRecord::Base
<del> ActiveRecord::Base
<del> else
<del> connection_handler.connection_pools[name] ? self : superclass.arel_engine
<del> end
<del> end
<del> end
<del>
<del> private
<del>
<del> def relation #:nodoc:
<del> @relation ||= Relation.new(self, arel_table)
<del>
<del> if finder_needs_type_condition?
<del> @relation.where(type_condition).create_with(inheritance_column.to_sym => sti_name)
<del> else
<del> @relation
<del> end
<del> end
<del> end
<del>
<del> public
<del> # New objects can be instantiated as either empty (pass no construction parameter) or pre-set with
<del> # attributes but not yet saved (pass a hash with key names matching the associated table column names).
<del> # In both instances, valid attribute keys are determined by the column names of the associated table --
<del> # hence you can't have attributes that aren't part of the table columns.
<del> #
<del> # +initialize+ respects mass-assignment security and accepts either +:as+ or +:without_protection+ options
<del> # in the +options+ parameter.
<del> #
<del> # ==== Examples
<del> # # Instantiates a single new object
<del> # User.new(:first_name => 'Jamie')
<del> #
<del> # # Instantiates a single new object using the :admin mass-assignment security role
<del> # User.new({ :first_name => 'Jamie', :is_admin => true }, :as => :admin)
<del> #
<del> # # Instantiates a single new object bypassing mass-assignment security
<del> # User.new({ :first_name => 'Jamie', :is_admin => true }, :without_protection => true)
<del> def initialize(attributes = nil, options = {})
<del> @attributes = self.class.initialize_attributes(self.class.column_defaults.dup)
<del> @association_cache = {}
<del> @aggregation_cache = {}
<del> @attributes_cache = {}
<del> @new_record = true
<del> @readonly = false
<del> @destroyed = false
<del> @marked_for_destruction = false
<del> @previously_changed = {}
<del> @changed_attributes = {}
<del> @relation = nil
<del>
<del> ensure_proper_type
<del>
<del> populate_with_current_scope_attributes
<del>
<del> assign_attributes(attributes, options) if attributes
<del>
<del> yield self if block_given?
<del> run_callbacks :initialize
<del> end
<del>
<del> # Initialize an empty model object from +coder+. +coder+ must contain
<del> # the attributes necessary for initializing an empty model object. For
<del> # example:
<del> #
<del> # class Post < ActiveRecord::Base
<del> # end
<del> #
<del> # post = Post.allocate
<del> # post.init_with('attributes' => { 'title' => 'hello world' })
<del> # post.title # => 'hello world'
<del> def init_with(coder)
<del> @attributes = self.class.initialize_attributes(coder['attributes'])
<del> @relation = nil
<del>
<del> @attributes_cache, @previously_changed, @changed_attributes = {}, {}, {}
<del> @association_cache = {}
<del> @aggregation_cache = {}
<del> @readonly = @destroyed = @marked_for_destruction = false
<del> @new_record = false
<del> run_callbacks :find
<del> run_callbacks :initialize
<del>
<del> self
<del> end
<del>
<del> # Duped objects have no id assigned and are treated as new records. Note
<del> # that this is a "shallow" copy as it copies the object's attributes
<del> # only, not its associations. The extent of a "deep" copy is application
<del> # specific and is therefore left to the application to implement according
<del> # to its need.
<del> # The dup method does not preserve the timestamps (created|updated)_(at|on).
<del> def initialize_dup(other)
<del> cloned_attributes = other.clone_attributes(:read_attribute_before_type_cast)
<del> cloned_attributes.delete(self.class.primary_key)
<del>
<del> @attributes = cloned_attributes
<del>
<del> _run_after_initialize_callbacks if respond_to?(:_run_after_initialize_callbacks)
<del>
<del> @changed_attributes = {}
<del> self.class.column_defaults.each do |attr, orig_value|
<del> @changed_attributes[attr] = orig_value if field_changed?(attr, orig_value, @attributes[attr])
<del> end
<del>
<del> @aggregation_cache = {}
<del> @association_cache = {}
<del> @attributes_cache = {}
<del> @new_record = true
<del>
<del> ensure_proper_type
<del> populate_with_current_scope_attributes
<del> super
<del> end
<del>
<del> # Populate +coder+ with attributes about this record that should be
<del> # serialized. The structure of +coder+ defined in this method is
<del> # guaranteed to match the structure of +coder+ passed to the +init_with+
<del> # method.
<del> #
<del> # Example:
<del> #
<del> # class Post < ActiveRecord::Base
<del> # end
<del> # coder = {}
<del> # Post.new.encode_with(coder)
<del> # coder # => { 'id' => nil, ... }
<del> def encode_with(coder)
<del> coder['attributes'] = attributes
<del> end
<del>
<del> # Returns true if +comparison_object+ is the same exact object, or +comparison_object+
<del> # is of the same type and +self+ has an ID and it is equal to +comparison_object.id+.
<del> #
<del> # Note that new records are different from any other record by definition, unless the
<del> # other record is the receiver itself. Besides, if you fetch existing records with
<del> # +select+ and leave the ID out, you're on your own, this predicate will return false.
<del> #
<del> # Note also that destroying a record preserves its ID in the model instance, so deleted
<del> # models are still comparable.
<del> def ==(comparison_object)
<del> super ||
<del> comparison_object.instance_of?(self.class) &&
<del> id.present? &&
<del> comparison_object.id == id
<del> end
<del> alias :eql? :==
<del>
<del> # Delegates to id in order to allow two records of the same type and id to work with something like:
<del> # [ Person.find(1), Person.find(2), Person.find(3) ] & [ Person.find(1), Person.find(4) ] # => [ Person.find(1) ]
<del> def hash
<del> id.hash
<del> end
<del>
<del> # Freeze the attributes hash such that associations are still accessible, even on destroyed records.
<del> def freeze
<del> @attributes.freeze; self
<del> end
<del>
<del> # Returns +true+ if the attributes hash has been frozen.
<del> def frozen?
<del> @attributes.frozen?
<del> end
<del>
<del> # Allows sort on objects
<del> def <=>(other_object)
<del> if other_object.is_a?(self.class)
<del> self.to_key <=> other_object.to_key
<del> else
<del> nil
<del> end
<del> end
<del>
<del> # Returns +true+ if the record is read only. Records loaded through joins with piggy-back
<del> # attributes will be marked as read only since they cannot be saved.
<del> def readonly?
<del> @readonly
<del> end
<del>
<del> # Marks this record as read only.
<del> def readonly!
<del> @readonly = true
<del> end
<del>
<del> # Returns the contents of the record as a nicely formatted string.
<del> def inspect
<del> inspection = if @attributes
<del> self.class.column_names.collect { |name|
<del> if has_attribute?(name)
<del> "#{name}: #{attribute_for_inspect(name)}"
<del> end
<del> }.compact.join(", ")
<del> else
<del> "not initialized"
<del> end
<del> "#<#{self.class} #{inspection}>"
<del> end
<del>
<del> # Hackery to accomodate Syck. Remove for 4.0.
<del> def to_yaml(opts = {}) #:nodoc:
<del> if YAML.const_defined?(:ENGINE) && !YAML::ENGINE.syck?
<del> super
<del> else
<del> coder = {}
<del> encode_with(coder)
<del> YAML.quick_emit(self, opts) do |out|
<del> out.map(taguri, to_yaml_style) do |map|
<del> coder.each { |k, v| map.add(k, v) }
<del> end
<del> end
<del> end
<del> end
<del>
<del> # Hackery to accomodate Syck. Remove for 4.0.
<del> def yaml_initialize(tag, coder) #:nodoc:
<del> init_with(coder)
<del> end
<del>
<del> private
<del>
<del> # Under Ruby 1.9, Array#flatten will call #to_ary (recursively) on each of the elements
<del> # of the array, and then rescues from the possible NoMethodError. If those elements are
<del> # ActiveRecord::Base's, then this triggers the various method_missing's that we have,
<del> # which significantly impacts upon performance.
<del> #
<del> # So we can avoid the method_missing hit by explicitly defining #to_ary as nil here.
<del> #
<del> # See also http://tenderlovemaking.com/2011/06/28/til-its-ok-to-return-nil-from-to_ary/
<del> def to_ary # :nodoc:
<del> nil
<del> end
<del>
<del> include ActiveRecord::Persistence
<del> extend ActiveModel::Naming
<del> extend QueryCache::ClassMethods
<del> extend ActiveSupport::Benchmarkable
<del> extend ActiveSupport::DescendantsTracker
<del>
<del> extend Querying
<del> include ReadonlyAttributes
<del> include ModelSchema
<del> extend Translation
<del> include Inheritance
<del> include Scoping
<del> extend DynamicMatchers
<del> include Sanitization
<del> include Integration
<del> include AttributeAssignment
<del> include ActiveModel::Conversion
<del> include Validations
<del> extend CounterCache
<del> include Locking::Optimistic, Locking::Pessimistic
<del> include AttributeMethods
<del> include Callbacks, ActiveModel::Observing, Timestamp
<del> include Associations
<del> include IdentityMap
<del> include ActiveModel::SecurePassword
<del> extend Explain
<del>
<del> # AutosaveAssociation needs to be included before Transactions, because we want
<del> # #save_with_autosave_associations to be wrapped inside a transaction.
<del> include AutosaveAssociation, NestedAttributes
<del> include Aggregations, Transactions, Reflection, Serialization, Store
<add> include ActiveRecord::Model
<ide> end
<ide> end
<del>
<del>require 'active_record/connection_adapters/abstract/connection_specification'
<del>ActiveSupport.run_load_hooks(:active_record, ActiveRecord::Base)
<ide><path>activerecord/lib/active_record/core.rb
<add>require 'active_support/concern'
<add>
<add>module ActiveRecord
<add> module Core
<add> extend ActiveSupport::Concern
<add>
<add> included do
<add> ##
<add> # :singleton-method:
<add> # Accepts a logger conforming to the interface of Log4r or the default Ruby 1.8+ Logger class,
<add> # which is then passed on to any new database connections made and which can be retrieved on both
<add> # a class and instance level by calling +logger+.
<add> cattr_accessor :logger, :instance_writer => false
<add>
<add> ##
<add> # :singleton-method:
<add> # Contains the database configuration - as is typically stored in config/database.yml -
<add> # as a Hash.
<add> #
<add> # For example, the following database.yml...
<add> #
<add> # development:
<add> # adapter: sqlite3
<add> # database: db/development.sqlite3
<add> #
<add> # production:
<add> # adapter: sqlite3
<add> # database: db/production.sqlite3
<add> #
<add> # ...would result in ActiveRecord::Base.configurations to look like this:
<add> #
<add> # {
<add> # 'development' => {
<add> # 'adapter' => 'sqlite3',
<add> # 'database' => 'db/development.sqlite3'
<add> # },
<add> # 'production' => {
<add> # 'adapter' => 'sqlite3',
<add> # 'database' => 'db/production.sqlite3'
<add> # }
<add> # }
<add> cattr_accessor :configurations, :instance_writer => false
<add> self.configurations = {}
<add>
<add> ##
<add> # :singleton-method:
<add> # Determines whether to use Time.local (using :local) or Time.utc (using :utc) when pulling
<add> # dates and times from the database. This is set to :local by default.
<add> cattr_accessor :default_timezone, :instance_writer => false
<add> self.default_timezone = :local
<add>
<add> ##
<add> # :singleton-method:
<add> # Specifies the format to use when dumping the database schema with Rails'
<add> # Rakefile. If :sql, the schema is dumped as (potentially database-
<add> # specific) SQL statements. If :ruby, the schema is dumped as an
<add> # ActiveRecord::Schema file which can be loaded into any database that
<add> # supports migrations. Use :ruby if you want to have different database
<add> # adapters for, e.g., your development and test environments.
<add> cattr_accessor :schema_format , :instance_writer => false
<add> self.schema_format = :ruby
<add>
<add> ##
<add> # :singleton-method:
<add> # Specify whether or not to use timestamps for migration versions
<add> cattr_accessor :timestamped_migrations , :instance_writer => false
<add> self.timestamped_migrations = true
<add> end
<add>
<add> module ClassMethods
<add> def inherited(child_class) #:nodoc:
<add> # force attribute methods to be higher in inheritance hierarchy than other generated methods
<add> child_class.generated_attribute_methods
<add> child_class.generated_feature_methods
<add> super
<add> end
<add>
<add> def generated_feature_methods
<add> @generated_feature_methods ||= begin
<add> mod = const_set(:GeneratedFeatureMethods, Module.new)
<add> include mod
<add> mod
<add> end
<add> end
<add>
<add> # Returns a string like 'Post(id:integer, title:string, body:text)'
<add> def inspect
<add> if self == Base
<add> super
<add> elsif abstract_class?
<add> "#{super}(abstract)"
<add> elsif table_exists?
<add> attr_list = columns.map { |c| "#{c.name}: #{c.type}" } * ', '
<add> "#{super}(#{attr_list})"
<add> else
<add> "#{super}(Table doesn't exist)"
<add> end
<add> end
<add>
<add> # Overwrite the default class equality method to provide support for association proxies.
<add> def ===(object)
<add> object.is_a?(self)
<add> end
<add>
<add> def arel_table
<add> @arel_table ||= Arel::Table.new(table_name, arel_engine)
<add> end
<add>
<add> def arel_engine
<add> @arel_engine ||= begin
<add> if self == ActiveRecord::Base
<add> ActiveRecord::Base
<add> else
<add> connection_handler.connection_pools[name] ? self : superclass.arel_engine
<add> end
<add> end
<add> end
<add>
<add> private
<add>
<add> def relation #:nodoc:
<add> @relation ||= Relation.new(self, arel_table)
<add>
<add> if finder_needs_type_condition?
<add> @relation.where(type_condition).create_with(inheritance_column.to_sym => sti_name)
<add> else
<add> @relation
<add> end
<add> end
<add> end
<add>
<add> # New objects can be instantiated as either empty (pass no construction parameter) or pre-set with
<add> # attributes but not yet saved (pass a hash with key names matching the associated table column names).
<add> # In both instances, valid attribute keys are determined by the column names of the associated table --
<add> # hence you can't have attributes that aren't part of the table columns.
<add> #
<add> # +initialize+ respects mass-assignment security and accepts either +:as+ or +:without_protection+ options
<add> # in the +options+ parameter.
<add> #
<add> # ==== Examples
<add> # # Instantiates a single new object
<add> # User.new(:first_name => 'Jamie')
<add> #
<add> # # Instantiates a single new object using the :admin mass-assignment security role
<add> # User.new({ :first_name => 'Jamie', :is_admin => true }, :as => :admin)
<add> #
<add> # # Instantiates a single new object bypassing mass-assignment security
<add> # User.new({ :first_name => 'Jamie', :is_admin => true }, :without_protection => true)
<add> def initialize(attributes = nil, options = {})
<add> @attributes = self.class.initialize_attributes(self.class.column_defaults.dup)
<add> @association_cache = {}
<add> @aggregation_cache = {}
<add> @attributes_cache = {}
<add> @new_record = true
<add> @readonly = false
<add> @destroyed = false
<add> @marked_for_destruction = false
<add> @previously_changed = {}
<add> @changed_attributes = {}
<add> @relation = nil
<add>
<add> ensure_proper_type
<add>
<add> populate_with_current_scope_attributes
<add>
<add> assign_attributes(attributes, options) if attributes
<add>
<add> yield self if block_given?
<add> run_callbacks :initialize
<add> end
<add>
<add> # Initialize an empty model object from +coder+. +coder+ must contain
<add> # the attributes necessary for initializing an empty model object. For
<add> # example:
<add> #
<add> # class Post < ActiveRecord::Base
<add> # end
<add> #
<add> # post = Post.allocate
<add> # post.init_with('attributes' => { 'title' => 'hello world' })
<add> # post.title # => 'hello world'
<add> def init_with(coder)
<add> @attributes = self.class.initialize_attributes(coder['attributes'])
<add> @relation = nil
<add>
<add> @attributes_cache, @previously_changed, @changed_attributes = {}, {}, {}
<add> @association_cache = {}
<add> @aggregation_cache = {}
<add> @readonly = @destroyed = @marked_for_destruction = false
<add> @new_record = false
<add> run_callbacks :find
<add> run_callbacks :initialize
<add>
<add> self
<add> end
<add>
<add> # Duped objects have no id assigned and are treated as new records. Note
<add> # that this is a "shallow" copy as it copies the object's attributes
<add> # only, not its associations. The extent of a "deep" copy is application
<add> # specific and is therefore left to the application to implement according
<add> # to its need.
<add> # The dup method does not preserve the timestamps (created|updated)_(at|on).
<add> def initialize_dup(other)
<add> cloned_attributes = other.clone_attributes(:read_attribute_before_type_cast)
<add> cloned_attributes.delete(self.class.primary_key)
<add>
<add> @attributes = cloned_attributes
<add>
<add> _run_after_initialize_callbacks if respond_to?(:_run_after_initialize_callbacks)
<add>
<add> @changed_attributes = {}
<add> self.class.column_defaults.each do |attr, orig_value|
<add> @changed_attributes[attr] = orig_value if field_changed?(attr, orig_value, @attributes[attr])
<add> end
<add>
<add> @aggregation_cache = {}
<add> @association_cache = {}
<add> @attributes_cache = {}
<add> @new_record = true
<add>
<add> ensure_proper_type
<add> populate_with_current_scope_attributes
<add> super
<add> end
<add>
<add> # Populate +coder+ with attributes about this record that should be
<add> # serialized. The structure of +coder+ defined in this method is
<add> # guaranteed to match the structure of +coder+ passed to the +init_with+
<add> # method.
<add> #
<add> # Example:
<add> #
<add> # class Post < ActiveRecord::Base
<add> # end
<add> # coder = {}
<add> # Post.new.encode_with(coder)
<add> # coder # => { 'id' => nil, ... }
<add> def encode_with(coder)
<add> coder['attributes'] = attributes
<add> end
<add>
<add> # Returns true if +comparison_object+ is the same exact object, or +comparison_object+
<add> # is of the same type and +self+ has an ID and it is equal to +comparison_object.id+.
<add> #
<add> # Note that new records are different from any other record by definition, unless the
<add> # other record is the receiver itself. Besides, if you fetch existing records with
<add> # +select+ and leave the ID out, you're on your own, this predicate will return false.
<add> #
<add> # Note also that destroying a record preserves its ID in the model instance, so deleted
<add> # models are still comparable.
<add> def ==(comparison_object)
<add> super ||
<add> comparison_object.instance_of?(self.class) &&
<add> id.present? &&
<add> comparison_object.id == id
<add> end
<add> alias :eql? :==
<add>
<add> # Delegates to id in order to allow two records of the same type and id to work with something like:
<add> # [ Person.find(1), Person.find(2), Person.find(3) ] & [ Person.find(1), Person.find(4) ] # => [ Person.find(1) ]
<add> def hash
<add> id.hash
<add> end
<add>
<add> # Freeze the attributes hash such that associations are still accessible, even on destroyed records.
<add> def freeze
<add> @attributes.freeze; self
<add> end
<add>
<add> # Returns +true+ if the attributes hash has been frozen.
<add> def frozen?
<add> @attributes.frozen?
<add> end
<add>
<add> # Allows sort on objects
<add> def <=>(other_object)
<add> if other_object.is_a?(self.class)
<add> self.to_key <=> other_object.to_key
<add> else
<add> nil
<add> end
<add> end
<add>
<add> # Returns +true+ if the record is read only. Records loaded through joins with piggy-back
<add> # attributes will be marked as read only since they cannot be saved.
<add> def readonly?
<add> @readonly
<add> end
<add>
<add> # Marks this record as read only.
<add> def readonly!
<add> @readonly = true
<add> end
<add>
<add> # Returns the contents of the record as a nicely formatted string.
<add> def inspect
<add> inspection = if @attributes
<add> self.class.column_names.collect { |name|
<add> if has_attribute?(name)
<add> "#{name}: #{attribute_for_inspect(name)}"
<add> end
<add> }.compact.join(", ")
<add> else
<add> "not initialized"
<add> end
<add> "#<#{self.class} #{inspection}>"
<add> end
<add>
<add> # Hackery to accomodate Syck. Remove for 4.0.
<add> def to_yaml(opts = {}) #:nodoc:
<add> if YAML.const_defined?(:ENGINE) && !YAML::ENGINE.syck?
<add> super
<add> else
<add> coder = {}
<add> encode_with(coder)
<add> YAML.quick_emit(self, opts) do |out|
<add> out.map(taguri, to_yaml_style) do |map|
<add> coder.each { |k, v| map.add(k, v) }
<add> end
<add> end
<add> end
<add> end
<add>
<add> # Hackery to accomodate Syck. Remove for 4.0.
<add> def yaml_initialize(tag, coder) #:nodoc:
<add> init_with(coder)
<add> end
<add>
<add> private
<add>
<add> # Under Ruby 1.9, Array#flatten will call #to_ary (recursively) on each of the elements
<add> # of the array, and then rescues from the possible NoMethodError. If those elements are
<add> # ActiveRecord::Base's, then this triggers the various method_missing's that we have,
<add> # which significantly impacts upon performance.
<add> #
<add> # So we can avoid the method_missing hit by explicitly defining #to_ary as nil here.
<add> #
<add> # See also http://tenderlovemaking.com/2011/06/28/til-its-ok-to-return-nil-from-to_ary/
<add> def to_ary # :nodoc:
<add> nil
<add> end
<add> end
<add>end
<ide><path>activerecord/lib/active_record/model.rb
<add>module ActiveRecord
<add> module Model
<add> def self.included(base)
<add> base.class_eval do
<add> include ActiveRecord::Persistence
<add> extend ActiveModel::Naming
<add> extend QueryCache::ClassMethods
<add> extend ActiveSupport::Benchmarkable
<add> extend ActiveSupport::DescendantsTracker
<add>
<add> extend Querying
<add> include ReadonlyAttributes
<add> include ModelSchema
<add> extend Translation
<add> include Inheritance
<add> include Scoping
<add> extend DynamicMatchers
<add> include Sanitization
<add> include Integration
<add> include AttributeAssignment
<add> include ActiveModel::Conversion
<add> include Validations
<add> extend CounterCache
<add> include Locking::Optimistic, Locking::Pessimistic
<add> include AttributeMethods
<add> include Callbacks, ActiveModel::Observing, Timestamp
<add> include Associations
<add> include IdentityMap
<add> include ActiveModel::SecurePassword
<add> extend Explain
<add>
<add> # AutosaveAssociation needs to be included before Transactions, because we want
<add> # #save_with_autosave_associations to be wrapped inside a transaction.
<add> include AutosaveAssociation, NestedAttributes
<add> include Aggregations, Transactions, Reflection, Serialization, Store
<add>
<add> include Core
<add> end
<add> end
<add> end
<add>end
<add>
<add>require 'active_record/connection_adapters/abstract/connection_specification'
<add>ActiveSupport.run_load_hooks(:active_record, ActiveRecord::Base) | 4 |
PHP | PHP | remove obsolete attribute | 4cc5ebc340045a287a29e82f43ccd37480b4f202 | <ide><path>src/Illuminate/Http/Request.php
<ide> class Request extends SymfonyRequest implements ArrayAccess
<ide> */
<ide> protected $json;
<ide>
<del> /**
<del> * The Illuminate session store implementation.
<del> *
<del> * @var \Illuminate\Session\Store
<del> */
<del> protected $sessionStore;
<del>
<ide> /**
<ide> * The user resolver callback.
<ide> * | 1 |
Java | Java | improve test coverage of rdbmsoperation | 608be54a58d4f4b94a23d37033877dfd0ce7b295 | <ide><path>spring-jdbc/src/test/java/org/springframework/jdbc/object/RdbmsOperationTests.java
<ide> import org.springframework.jdbc.core.SqlParameter;
<ide> import org.springframework.jdbc.datasource.DriverManagerDataSource;
<ide>
<add>import static java.util.Map.entry;
<ide> import static org.assertj.core.api.Assertions.assertThat;
<ide> import static org.assertj.core.api.Assertions.assertThatExceptionOfType;
<ide>
<ide> public void emptySql() {
<ide> operation::compile);
<ide> }
<ide>
<add> @Test
<add> public void getSql() {
<add> String sql = "select * from mytable";
<add> operation.setDataSource(new DriverManagerDataSource());
<add> operation.setSql(sql);
<add> String strGotten = operation.getSql();
<add> assertThat(strGotten.equals(sql));
<add> }
<ide> @Test
<ide> public void setTypeAfterCompile() {
<ide> operation.setDataSource(new DriverManagerDataSource());
<ide> public void tooManyParameters() {
<ide> assertThatExceptionOfType(InvalidDataAccessApiUsageException.class).isThrownBy(() ->
<ide> operation.validateParameters(new Object[] { 1, 2 }));
<ide> }
<add> @Test
<add> public void tooManyMapParameters() {
<add> operation.setSql("select * from mytable");
<add> assertThatExceptionOfType(InvalidDataAccessApiUsageException.class).isThrownBy(() ->
<add> operation.validateNamedParameters(Map.ofEntries(
<add> entry("a", "b"),
<add> entry("c", "d")
<add> ) ));
<add> }
<ide>
<ide> @Test
<ide> public void unspecifiedMapParameters() { | 1 |
Ruby | Ruby | run postinstall on dependencies | 1bb9c56e9c19206404e5cfcf1ff9535a13848c95 | <ide><path>Library/Homebrew/cmd/test-bot.rb
<ide> def formula(formula_name)
<ide> end
<ide>
<ide> test "brew", "fetch", "--retry", *unchanged_dependencies unless unchanged_dependencies.empty?
<del> test "brew", "fetch", "--retry", "--build-bottle", *changed_dependences unless changed_dependences.empty?
<del> # Install changed dependencies as new bottles so we don't have checksum problems.
<del> test "brew", "install", "--build-bottle", *changed_dependences unless changed_dependences.empty?
<add>
<add> unless changed_dependences.empty?
<add> test "brew", "fetch", "--retry", "--build-bottle", *changed_dependences
<add> # Install changed dependencies as new bottles so we don't have checksum problems.
<add> test "brew", "install", "--build-bottle", *changed_dependences
<add> # Run postinstall on them because the tested formula might depend on
<add> # this step
<add> test "brew", "postinstall", *changed_dependences
<add> end
<ide> formula_fetch_options = []
<ide> formula_fetch_options << "--build-bottle" unless ARGV.include? "--no-bottle"
<ide> formula_fetch_options << "--force" if ARGV.include? "--cleanup" | 1 |
Javascript | Javascript | avoid unneeded geometry allocations | 67d67bfcc3ce96a80d18da4f90ad7e8792402ac8 | <ide><path>src/helpers/ArrowHelper.js
<ide> import { Mesh } from '../objects/Mesh';
<ide> import { Line } from '../objects/Line';
<ide> import { Vector3 } from '../math/Vector3';
<ide>
<del>var lineGeometry = new BufferGeometry();
<del>lineGeometry.addAttribute( 'position', new Float32BufferAttribute( [ 0, 0, 0, 0, 1, 0 ], 3 ) );
<del>
<del>var coneGeometry = new CylinderBufferGeometry( 0, 0.5, 1, 5, 1 );
<del>coneGeometry.translate( 0, - 0.5, 0 );
<add>var lineGeometry, coneGeometry;
<ide>
<ide> function ArrowHelper( dir, origin, length, color, headLength, headWidth ) {
<ide>
<ide> function ArrowHelper( dir, origin, length, color, headLength, headWidth ) {
<ide> if ( headLength === undefined ) headLength = 0.2 * length;
<ide> if ( headWidth === undefined ) headWidth = 0.2 * headLength;
<ide>
<add> if ( lineGeometry === undefined ) {
<add>
<add> lineGeometry = new BufferGeometry();
<add> lineGeometry.addAttribute( 'position', new Float32BufferAttribute( [ 0, 0, 0, 0, 1, 0 ], 3 ) );
<add>
<add> coneGeometry = new CylinderBufferGeometry( 0, 0.5, 1, 5, 1 );
<add> coneGeometry.translate( 0, - 0.5, 0 );
<add>
<add> }
<add>
<ide> this.position.copy( origin );
<ide>
<ide> this.line = new Line( lineGeometry, new LineBasicMaterial( { color: color } ) );
<ide><path>src/renderers/WebGLRenderer.js
<ide> function WebGLRenderer( parameters ) {
<ide>
<ide> //
<ide>
<del> var backgroundCamera = new OrthographicCamera( - 1, 1, 1, - 1, 0, 1 );
<del> var backgroundCamera2 = new PerspectiveCamera();
<del> var backgroundPlaneMesh = new Mesh(
<del> new PlaneBufferGeometry( 2, 2 ),
<del> new MeshBasicMaterial( { depthTest: false, depthWrite: false, fog: false } )
<del> );
<del> var backgroundBoxShader = ShaderLib[ 'cube' ];
<del> var backgroundBoxMesh = new Mesh(
<del> new BoxBufferGeometry( 5, 5, 5 ),
<del> new ShaderMaterial( {
<del> uniforms: backgroundBoxShader.uniforms,
<del> vertexShader: backgroundBoxShader.vertexShader,
<del> fragmentShader: backgroundBoxShader.fragmentShader,
<del> side: BackSide,
<del> depthTest: false,
<del> depthWrite: false,
<del> fog: false
<del> } )
<del> );
<add> var backgroundPlaneCamera, backgroundPlaneMesh;
<add> var backgroundBoxCamera, backgroundBoxMesh;
<ide>
<ide> //
<ide>
<ide> function WebGLRenderer( parameters ) {
<ide>
<ide> if ( background && background.isCubeTexture ) {
<ide>
<del> backgroundCamera2.projectionMatrix.copy( camera.projectionMatrix );
<add> if ( backgroundBoxCamera === undefined ) {
<add>
<add> backgroundBoxCamera = new PerspectiveCamera();
<add>
<add> backgroundBoxMesh = new Mesh(
<add> new BoxBufferGeometry( 5, 5, 5 ),
<add> new ShaderMaterial( {
<add> uniforms: ShaderLib.cube.uniforms,
<add> vertexShader: ShaderLib.cube.vertexShader,
<add> fragmentShader: ShaderLib.cube.fragmentShader,
<add> side: BackSide,
<add> depthTest: false,
<add> depthWrite: false,
<add> fog: false
<add> } )
<add> );
<add>
<add> }
<add>
<add> backgroundBoxCamera.projectionMatrix.copy( camera.projectionMatrix );
<add>
<add> backgroundBoxCamera.matrixWorld.extractRotation( camera.matrixWorld );
<add> backgroundBoxCamera.matrixWorldInverse.getInverse( backgroundBoxCamera.matrixWorld );
<ide>
<del> backgroundCamera2.matrixWorld.extractRotation( camera.matrixWorld );
<del> backgroundCamera2.matrixWorldInverse.getInverse( backgroundCamera2.matrixWorld );
<ide>
<ide> backgroundBoxMesh.material.uniforms[ "tCube" ].value = background;
<del> backgroundBoxMesh.modelViewMatrix.multiplyMatrices( backgroundCamera2.matrixWorldInverse, backgroundBoxMesh.matrixWorld );
<add> backgroundBoxMesh.modelViewMatrix.multiplyMatrices( backgroundBoxCamera.matrixWorldInverse, backgroundBoxMesh.matrixWorld );
<ide>
<ide> objects.update( backgroundBoxMesh );
<ide>
<del> _this.renderBufferDirect( backgroundCamera2, null, backgroundBoxMesh.geometry, backgroundBoxMesh.material, backgroundBoxMesh, null );
<add> _this.renderBufferDirect( backgroundBoxCamera, null, backgroundBoxMesh.geometry, backgroundBoxMesh.material, backgroundBoxMesh, null );
<ide>
<ide> } else if ( background && background.isTexture ) {
<ide>
<add> if ( backgroundPlaneCamera === undefined ) {
<add>
<add> backgroundPlaneCamera = new OrthographicCamera( - 1, 1, 1, - 1, 0, 1 );
<add>
<add> backgroundPlaneMesh = new Mesh(
<add> new PlaneBufferGeometry( 2, 2 ),
<add> new MeshBasicMaterial( { depthTest: false, depthWrite: false, fog: false } )
<add> );
<add>
<add> }
<add>
<ide> backgroundPlaneMesh.material.map = background;
<ide>
<ide> objects.update( backgroundPlaneMesh );
<ide>
<del> _this.renderBufferDirect( backgroundCamera, null, backgroundPlaneMesh.geometry, backgroundPlaneMesh.material, backgroundPlaneMesh, null );
<add> _this.renderBufferDirect( backgroundPlaneCamera, null, backgroundPlaneMesh.geometry, backgroundPlaneMesh.material, backgroundPlaneMesh, null );
<ide>
<ide> }
<ide> | 2 |
Text | Text | fix storage driver options in man page | 13deb4a245ce508c8eb6bbe065d0f560472c68e2 | <ide><path>man/docker-daemon.8.md
<ide> internals) to create writable containers from images. Many of these
<ide> backends use operating system level technologies and can be
<ide> configured.
<ide>
<del>Specify options to the storage backend with **--storage-opt** flags. The only
<del>backend that currently takes options is *devicemapper*. Therefore use these
<del>flags with **-s=**devicemapper.
<add>Specify options to the storage backend with **--storage-opt** flags. The
<add>backends that currently take options are *devicemapper* and *zfs*.
<add>Options for *devicemapper* are prefixed with *dm* and options for *zfs*
<add>start with *zfs*.
<ide>
<ide> Specifically for devicemapper, the default is a "loopback" model which
<ide> requires no pre-configuration, but is extremely inefficient. Do not
<ide> more information see `man lvmthin`. Then, use `--storage-opt
<ide> dm.thinpooldev` to tell the Docker engine to use that pool for
<ide> allocating images and container snapshots.
<ide>
<del>Here is the list of *devicemapper* options:
<add>##Devicemapper options:
<ide>
<ide> #### dm.thinpooldev
<ide>
<ide> this topic, see
<ide> Otherwise, set this flag for migrating existing Docker daemons to a
<ide> daemon with a supported environment.
<ide>
<add>##ZFS options
<add>
<add>#### zfs.fsname
<add>
<add>Set zfs filesystem under which docker will create its own datasets.
<add>By default docker will pick up the zfs filesystem where docker graph
<add>(`/var/lib/docker`) is located.
<add>
<add>Example use: `docker daemon -s zfs --storage-opt zfs.fsname=zroot/docker`
<add>
<ide> # CLUSTER STORE OPTIONS
<ide>
<ide> The daemon uses libkv to advertise | 1 |
Javascript | Javascript | move logic to overflow section | 83fd2870e0241cc7390d44b3d94980a8dfe7ee93 | <ide><path>src/lib/create/from-array.js
<ide> export function configFromArray (config) {
<ide> }
<ide>
<ide> //if the day of the year is set, figure out what it is
<del> if (config._dayOfYear) {
<add> if (config._dayOfYear != null) {
<ide> yearToUse = defaults(config._a[YEAR], currentDate[YEAR]);
<ide>
<del> if (config._dayOfYear > daysInYear(yearToUse)) {
<add> if (config._dayOfYear > daysInYear(yearToUse) || config._dayOfYear === 0) {
<ide> getParsingFlags(config)._overflowDayOfYear = true;
<ide> }
<ide>
<ide><path>src/lib/parse/regex.js
<ide> export var match1to3 = /\d{1,3}/; // 0 - 999
<ide> export var match1to4 = /\d{1,4}/; // 0 - 9999
<ide> export var match1to6 = /[+-]?\d{1,6}/; // -999999 - 999999
<ide>
<del>export var match1to3nonzero = /[1-9]\d{0,2}|\d[1-9]\d{0,1}|\d{0,2}[1-9]/; // 1 - 999
<del>export var match3nonzero = /[1-9]\d\d|\d[1-9]\d|\d\d[1-9]/; // 001 - 999
<del>
<ide> export var matchUnsigned = /\d+/; // 0 - inf
<ide> export var matchSigned = /[+-]?\d+/; // -inf - inf
<ide>
<ide><path>src/lib/units/day-of-year.js
<ide> import { addFormatToken } from '../format/format';
<ide> import { addUnitAlias } from './aliases';
<ide> import { addUnitPriority } from './priorities';
<del>import { addRegexToken, match3nonzero, match1to3nonzero } from '../parse/regex';
<add>import { addRegexToken, match3, match1to3 } from '../parse/regex';
<ide> import { daysInYear } from './year';
<ide> import { createUTCDate } from '../create/date-from-array';
<ide> import { addParseToken } from '../parse/token';
<ide> addUnitPriority('dayOfYear', 4);
<ide>
<ide> // PARSING
<ide>
<del>addRegexToken('DDD', match1to3nonzero);
<del>addRegexToken('DDDD', match3nonzero);
<add>addRegexToken('DDD', match1to3);
<add>addRegexToken('DDDD', match3);
<ide> addParseToken(['DDD', 'DDDD'], function (input, array, config) {
<ide> config._dayOfYear = toInt(input);
<ide> });
<ide><path>src/test/moment/days_in_year.js
<add>import { module, test } from '../qunit';
<add>import moment from '../../moment';
<add>
<add>module('days in year');
<add>
<add>// https://github.com/moment/moment/issues/3717
<add>test('YYYYDDD should not parse DDD=000', function (assert) {
<add> assert.equal(moment(7000000, moment.ISO_8601, true).isValid(), false);
<add> assert.equal(moment('7000000', moment.ISO_8601, true).isValid(), false);
<add> assert.equal(moment(7000000, moment.ISO_8601, false).isValid(), false);
<add>});
<ide><path>src/test/moment/regex.js
<del>import { module, test } from '../qunit';
<del>import moment from '../../moment';
<del>import {match1to3nonzero} from '../../lib/parse/regex';
<del>
<del>module('regex');
<del>
<del>test('match1to3nonzero works properly', function (assert) {
<del> assert.equal('0'.match(match1to3nonzero), null);
<del> assert.equal('00'.match(match1to3nonzero), null);
<del> assert.equal('000'.match(match1to3nonzero), null);
<del>
<del> for (var i = 1; i < 1000; i++) {
<del> assert.notEqual(String(i).match(match1to3nonzero), null);
<del> if (i < 100) {
<del> assert.notEqual('0' + String(i).match(match1to3nonzero), null);
<del> if (i < 10) {
<del> assert.notEqual('00' + String(i).match(match1to3nonzero), null);
<del> }
<del> }
<del> }
<del>});
<del>
<del>// https://github.com/moment/moment/issues/3717
<del>test('YYYYDDD should not parse DDD=000 (strict mode)', function (assert) {
<del> assert.equal(moment(7000000, moment.ISO_8601, true).isValid(), false);
<del> assert.equal(moment('7000000', moment.ISO_8601, true).isValid(), false);
<del>});
<del>
<del>// https://github.com/moment/moment/issues/3717
<del>test('YYYYDDD will parse DDD=000 (non-strict mode)', function (assert) {
<del> assert.equal(moment(7000000, moment.ISO_8601, false).isValid(), true);
<del> assert.equal(moment(7000000, moment.ISO_8601, false)._pf.unusedTokens.length, 1);
<del> assert.equal(moment(7000000, moment.ISO_8601, false)._pf.unusedTokens[0], 'DDD');
<del>}); | 5 |
Javascript | Javascript | fix performance regression | ff6656338d35f01da6e3a93b70baf54e1d669fd6 | <ide><path>lib/serialization/BinaryMiddleware.js
<ide> class BinaryMiddleware extends SerializerMiddleware {
<ide> currentBuffer = leftOverBuffer;
<ide> leftOverBuffer = null;
<ide> } else {
<del> currentBuffer = Buffer.allocUnsafeSlow(
<del> Math.max(bytesNeeded, buffersTotalLength, 16384)
<add> currentBuffer = Buffer.allocUnsafe(
<add> Math.max(
<add> bytesNeeded,
<add> Math.min(Math.max(buffersTotalLength, 1024), 16384)
<add> )
<ide> );
<ide> }
<ide> }; | 1 |
Text | Text | add 2.2.0-beta.3 to changelog.md | c5492eab98599172af1529e2da29075f8632722c | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### v2.3.0-beta.3 (December 19, 2015)
<add>
<add>- [#12659](https://github.com/emberjs/ember.js/pull/12659) [BUGFIX] Update HTMLBars to 0.14.7.
<add>- [#12666](https://github.com/emberjs/ember.js/pull/12666) [BUGFIX] Prevent triggering V8 memory leak bug through registry / resolver access.
<add>- [#12677](https://github.com/emberjs/ember.js/pull/12677) [BUGFIX] Remove FastBoot monkeypatches.
<add>- [#12680](https://github.com/emberjs/ember.js/pull/12680) [BUGFIX] Clear cached instances when factories are unregistered.
<add>- [#12682](https://github.com/emberjs/ember.js/pull/12682) [BUGFIX] Fix rerendering contextual components when invoked with dot syntax and block form.
<add>- [#12686](https://github.com/emberjs/ember.js/pull/12686) [BUGFIX] Ensure HTML safe warning is not thrown for `null` and `undefined` values.
<add>- [#12699](https://github.com/emberjs/ember.js/pull/12699) [BUGFIX] Only add deprecated container after create when present (prevents errors when non-extendable factory is frozen after creation).
<add>- [#12705](https://github.com/emberjs/ember.js/pull/12705) [BUGFIX] Fix FastBoot URL parsing crash.
<add>- [#12728](https://github.com/emberjs/ember.js/pull/12728) [BUGFIX] Fix incorrect export for `Ember.computed.collect`.
<add>- [#12731](https://github.com/emberjs/ember.js/pull/12731) [BUGFIX] Ensure `container` can still be provided to `.create` (prevents an error and provides a helpful deprecation).
<add>
<ide> ### v2.3.0-beta.2 (November 29, 2015)
<ide>
<ide> - [#12626](https://github.com/emberjs/ember.js/pull/12626) [BUGFIX] Fix "rest" style positional params in contextual components when using dot syntax. | 1 |
Text | Text | remove linking of url text to url | 0b27cb4a7c91a3ae3269f93a536b1bba170abc14 | <ide><path>doc/onboarding.md
<ide> needs to be pointed out separately during the onboarding.
<ide> ## Exercise: Make a PR adding yourself to the README
<ide>
<ide> * Example:
<del> [https://github.com/nodejs/node/commit/ce986de829457c39257cd205067602e765768fb0][]
<add> https://github.com/nodejs/node/commit/ce986de829457c39257cd205067602e765768fb0
<ide> * For raw commit message: `git log ce986de829457c39257cd205067602e765768fb0
<ide> -1`
<ide> * Collaborators are in alphabetical order by GitHub username.
<ide> needs to be pointed out separately during the onboarding.
<ide> [`git-node`]: https://github.com/nodejs/node-core-utils/blob/master/docs/git-node.md
<ide> [`node-core-utils`]: https://github.com/nodejs/node-core-utils
<ide> [Landing Pull Requests]: https://github.com/nodejs/node/blob/master/COLLABORATOR_GUIDE.md#landing-pull-requests
<del>[https://github.com/nodejs/node/commit/ce986de829457c39257cd205067602e765768fb0]: https://github.com/nodejs/node/commit/ce986de829457c39257cd205067602e765768fb0
<ide> [Publicizing or hiding organization membership]: https://help.github.com/articles/publicizing-or-hiding-organization-membership/
<ide> [set up the credentials]: https://github.com/nodejs/node-core-utils#setting-up-credentials
<ide> [two-factor authentication]: https://help.github.com/articles/securing-your-account-with-two-factor-authentication-2fa/ | 1 |
Text | Text | fix json examples in the api reference | 7296c845b739ae4133e3d02d56cb7e8df29eb020 | <ide><path>docs/reference/api/docker_remote_api_v1.17.md
<ide> Create a container
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id":"e90e34656806"
<add> "Id":"e90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide><path>docs/reference/api/docker_remote_api_v1.18.md
<ide> Create a container
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id":"e90e34656806"
<add> "Id":"e90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide> Sets up an exec instance in a running container `id`
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id": "f90e34656806"
<add> "Id": "f90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide><path>docs/reference/api/docker_remote_api_v1.19.md
<ide> Create a container
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id":"e90e34656806"
<add> "Id":"e90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide> Sets up an exec instance in a running container `id`
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id": "f90e34656806"
<add> "Id": "f90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide><path>docs/reference/api/docker_remote_api_v1.20.md
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "9cd87474be90",
<del> "Names":["/coolName"]
<add> "Names":["/coolName"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 222222",
<ide> "Created": 1367854155,
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "3176a2479c92",
<del> "Names":["/sleepy_dog"]
<add> "Names":["/sleepy_dog"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 3333333333333333",
<ide> "Created": 1367854154,
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "4cb07b47f9fb",
<del> "Names":["/running_cat"]
<add> "Names":["/running_cat"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 444444444444444444444444444444444",
<ide> "Created": 1367854152,
<ide> Create a container
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id":"e90e34656806"
<add> "Id":"e90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide> Sets up an exec instance in a running container `id`
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id": "f90e34656806"
<add> "Id": "f90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide><path>docs/reference/api/docker_remote_api_v1.21.md
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "9cd87474be90",
<del> "Names":["/coolName"]
<add> "Names":["/coolName"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 222222",
<ide> "Created": 1367854155,
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "3176a2479c92",
<del> "Names":["/sleepy_dog"]
<add> "Names":["/sleepy_dog"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 3333333333333333",
<ide> "Created": 1367854154,
<ide> List containers
<ide> },
<ide> {
<ide> "Id": "4cb07b47f9fb",
<del> "Names":["/running_cat"]
<add> "Names":["/running_cat"],
<ide> "Image": "ubuntu:latest",
<ide> "Command": "echo 444444444444444444444444444444444",
<ide> "Created": 1367854152,
<ide> Create a container
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id":"e90e34656806"
<add> "Id":"e90e34656806",
<ide> "Warnings":[]
<ide> }
<ide>
<ide> Sets up an exec instance in a running container `id`
<ide> Content-Type: application/json
<ide>
<ide> {
<del> "Id": "f90e34656806"
<add> "Id": "f90e34656806",
<ide> "Warnings":[]
<ide> }
<ide> | 5 |
PHP | PHP | register connectionresolverinterface as core alias | 647f59fd66999bea333554befc930c7acea1601a | <ide><path>src/Illuminate/Foundation/Application.php
<ide> public function registerCoreContainerAliases()
<ide> 'config' => [\Illuminate\Config\Repository::class, \Illuminate\Contracts\Config\Repository::class],
<ide> 'cookie' => [\Illuminate\Cookie\CookieJar::class, \Illuminate\Contracts\Cookie\Factory::class, \Illuminate\Contracts\Cookie\QueueingFactory::class],
<ide> 'encrypter' => [\Illuminate\Encryption\Encrypter::class, \Illuminate\Contracts\Encryption\Encrypter::class],
<del> 'db' => [\Illuminate\Database\DatabaseManager::class],
<add> 'db' => [\Illuminate\Database\DatabaseManager::class, \Illuminate\Database\ConnectionResolverInterface::class],
<ide> 'db.connection' => [\Illuminate\Database\Connection::class, \Illuminate\Database\ConnectionInterface::class],
<ide> 'events' => [\Illuminate\Events\Dispatcher::class, \Illuminate\Contracts\Events\Dispatcher::class],
<ide> 'files' => [\Illuminate\Filesystem\Filesystem::class],
<ide><path>tests/Integration/Foundation/CoreContainerAliasesTest.php
<add><?php
<add>
<add>namespace Illuminate\Tests\Integration\Foundation;
<add>
<add>use Orchestra\Testbench\TestCase;
<add>use Illuminate\Database\DatabaseManager;
<add>use Illuminate\Database\ConnectionResolverInterface;
<add>
<add>class CoreContainerAliasesTest extends TestCase
<add>{
<add> public function test_it_can_resolve_core_container_aliases()
<add> {
<add> $this->assertInstanceOf(DatabaseManager::class, $this->app->make(ConnectionResolverInterface::class));
<add> }
<add>} | 2 |
Ruby | Ruby | fix output location, also run pristine | 141db031933eec278ff6815226fd97de5ac6f3cc | <ide><path>Library/Homebrew/dev-cmd/vendor-gems.rb
<ide> def vendor_gems
<ide>
<ide> Homebrew.install_bundler!
<ide>
<del> ohai "cd #{HOMEBREW_LIBRARY_PATH}/vendor"
<add> ohai "cd #{HOMEBREW_LIBRARY_PATH}"
<ide> HOMEBREW_LIBRARY_PATH.cd do
<add> ohai "bundle pristine"
<add> safe_system "bundle", "pristine"
<add>
<ide> ohai "bundle install --standalone"
<ide> safe_system "bundle", "install", "--standalone"
<ide> | 1 |
Text | Text | add link to 4.0 branch | 712eca41a187bbcfd36ede4817a378da2acd1866 | <ide><path>README.md
<ide> Want to learn more? [See the wiki.](https://github.com/mbostock/d3/wiki)
<ide>
<ide> For examples, [see the gallery](https://github.com/mbostock/d3/wiki/Gallery) and [mbostock’s bl.ocks](http://bl.ocks.org/mbostock).
<add>
<add>## D3 4.0
<add>
<add>The next major release of D3, 4.0, is coming! Checkout the [4.0 development branch](https://github.com/mbostock/d3/tree/4) and read the [new API reference](https://github.com/mbostock/d3/blob/4/README.md). | 1 |
Python | Python | remove unused backcompat method in k8s hook | 3aadc44a13d0d100778792691a0341818723c51c | <ide><path>airflow/providers/cncf/kubernetes/hooks/kubernetes.py
<ide> def _get_field(self, field_name):
<ide> prefixed_name = f"extra__kubernetes__{field_name}"
<ide> return self.conn_extras.get(prefixed_name) or None
<ide>
<del> @staticmethod
<del> def _deprecation_warning_core_param(deprecation_warnings):
<del> settings_list_str = "".join([f"\n\t{k}={v!r}" for k, v in deprecation_warnings])
<del> warnings.warn(
<del> f"\nApplying core Airflow settings from section [kubernetes] with the following keys:"
<del> f"{settings_list_str}\n"
<del> "In a future release, KubernetesPodOperator will no longer consider core\n"
<del> "Airflow settings; define an Airflow connection instead.",
<del> DeprecationWarning,
<del> )
<del>
<ide> def get_conn(self) -> client.ApiClient:
<ide> """Returns kubernetes api session for use with requests"""
<ide> in_cluster = self._coalesce_param(self.in_cluster, self._get_field("in_cluster")) | 1 |
Javascript | Javascript | update output to include exit code & signal | da7d92e3f030c3054b12457f70aa4c54a7ebc82b | <ide><path>test/parallel/test-cluster-server-restart-none.js
<ide> if (cluster.isMaster) {
<ide> worker1.on('listening', common.mustCall(() => {
<ide> const worker2 = cluster.fork();
<ide> worker2.on('exit', (code, signal) => {
<del> assert.strictEqual(code, 0, 'worker2 did not exit normally');
<del> assert.strictEqual(signal, null, 'worker2 did not exit normally');
<add> assert.strictEqual(code, 0,
<add> 'worker2 did not exit normally. ' +
<add> `exited with code ${code}`);
<add> assert.strictEqual(signal, null,
<add> 'worker2 did not exit normally. ' +
<add> `exited with signal ${signal}`);
<ide> worker1.disconnect();
<ide> });
<ide> }));
<ide>
<ide> worker1.on('exit', common.mustCall((code, signal) => {
<del> assert.strictEqual(code, 0, 'worker1 did not exit normally');
<del> assert.strictEqual(signal, null, 'worker1 did not exit normally');
<add> assert.strictEqual(code, 0,
<add> 'worker1 did not exit normally. ' +
<add> `exited with code ${code}`);
<add> assert.strictEqual(signal, null,
<add> 'worker1 did not exit normally. ' +
<add> `exited with signal ${signal}`);
<ide> }));
<ide> } else {
<ide> const net = require('net'); | 1 |
Java | Java | add jdbctestutils.deleterowsintablewhere method | 720714b43456d3216f12fa679546cf29bb4b2c65 | <ide><path>spring-test/src/main/java/org/springframework/test/jdbc/JdbcTestUtils.java
<ide> /*
<del> * Copyright 2002-2012 the original author or authors.
<add> * Copyright 2002-2013 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> import org.apache.commons.logging.Log;
<ide> import org.apache.commons.logging.LogFactory;
<del>
<ide> import org.springframework.core.io.Resource;
<ide> import org.springframework.core.io.ResourceLoader;
<ide> import org.springframework.core.io.support.EncodedResource;
<ide> import org.springframework.dao.DataAccessException;
<ide> import org.springframework.dao.DataAccessResourceFailureException;
<ide> import org.springframework.jdbc.core.JdbcTemplate;
<add>import org.springframework.jdbc.core.SqlParameterValue;
<ide> import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;
<ide> import org.springframework.util.StringUtils;
<ide>
<ide> public static int deleteFromTables(JdbcTemplate jdbcTemplate, String... tableNam
<ide> return totalRowCount;
<ide> }
<ide>
<add> /**
<add> * Delete rows from the given table, using the provided {@code WHERE} clause.
<add> * <p>If the provided {@code WHERE} clause contains text, it will be prefixed
<add> * with {@code " WHERE "} and then appended to the generated {@code DELETE}
<add> * statement. For example, if the provided table name is {@code "person"} and
<add> * the provided where clause is {@code "name = 'Bob' and age > 25"}, the
<add> * resulting SQL statement to execute will be
<add> * {@code "DELETE FROM person WHERE name = 'Bob' and age > 25"}.
<add> * <p>As an alternative to hard-coded values, the {@code "?"} placeholder can
<add> * be used within the {@code WHERE} clause, binding to the given arguments.
<add> * @param jdbcTemplate the JdbcTemplate with which to perform JDBC operations
<add> * @param tableName the name of the table to delete rows in
<add> * @param whereClause the {@code WHERE} clause to append to the query
<add> * @param args arguments to bind to the query (leaving it to the PreparedStatement
<add> * to guess the corresponding SQL type); may also contain {@link SqlParameterValue}
<add> * objects which indicate not only the argument value but also the SQL type and
<add> * optionally the scale.
<add> * @return the number of rows deleted from the table
<add> */
<add> public static int deleteFromTableWhere(JdbcTemplate jdbcTemplate, String tableName,
<add> String whereClause, Object... args) {
<add> String sql = "DELETE FROM " + tableName;
<add> if(StringUtils.hasText(whereClause)) {
<add> sql += " WHERE " + whereClause;
<add> }
<add> int rowCount = (args != null && args.length > 0 ? jdbcTemplate.update(sql, args)
<add> : jdbcTemplate.update(sql));
<add> if (logger.isInfoEnabled()) {
<add> logger.info("Deleted " + rowCount + " rows from table " + tableName);
<add> }
<add> return rowCount;
<add> }
<add>
<ide> /**
<ide> * Drop the specified tables.
<ide> * @param jdbcTemplate the JdbcTemplate with which to perform JDBC operations
<ide><path>spring-test/src/test/java/org/springframework/test/jdbc/JdbcTestUtilsTests.java
<ide> /*
<del> * Copyright 2002-2012 the original author or authors.
<add> * Copyright 2002-2013 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> package org.springframework.test.jdbc;
<ide>
<del>import static org.junit.Assert.*;
<add>import static org.hamcrest.Matchers.equalTo;
<add>import static org.junit.Assert.assertEquals;
<add>import static org.junit.Assert.assertThat;
<add>import static org.junit.Assert.assertTrue;
<add>import static org.mockito.BDDMockito.given;
<ide>
<ide> import java.io.LineNumberReader;
<ide> import java.util.ArrayList;
<ide> import java.util.List;
<ide>
<ide> import org.junit.Test;
<add>import org.junit.runner.RunWith;
<add>import org.mockito.Mock;
<add>import org.mockito.runners.MockitoJUnitRunner;
<ide> import org.springframework.core.io.ClassPathResource;
<ide> import org.springframework.core.io.support.EncodedResource;
<add>import org.springframework.jdbc.core.JdbcTemplate;
<ide>
<ide> /**
<ide> * Unit tests for {@link JdbcTestUtils}.
<ide> *
<ide> * @author Thomas Risberg
<ide> * @author Sam Brannen
<add> * @author Phillip Webb
<ide> * @since 2.5.4
<ide> */
<add>@RunWith(MockitoJUnitRunner.class)
<ide> public class JdbcTestUtilsTests {
<ide>
<add> @Mock
<add> private JdbcTemplate jdbcTemplate;
<add>
<ide> @Test
<ide> public void containsDelimiters() {
<ide> assertTrue("test with ';' is wrong", !JdbcTestUtils.containsSqlScriptDelimiters("select 1\n select ';'", ';'));
<ide> public void readAndSplitScriptContainingComments() throws Exception {
<ide> assertEquals("statement 4 not split correctly", statement4, statements.get(3));
<ide> }
<ide>
<add> @Test
<add> public void testDeleteNoWhere() throws Exception {
<add> given(jdbcTemplate.update("DELETE FROM person")).willReturn(10);
<add> int deleted = JdbcTestUtils.deleteFromTableWhere(jdbcTemplate, "person", null);
<add> assertThat(deleted, equalTo(10));
<add> }
<add>
<add> @Test
<add> public void testDeleteWhere() throws Exception {
<add> given(jdbcTemplate.update("DELETE FROM person WHERE name = 'Bob' and age > 25")).willReturn(10);
<add> int deleted = JdbcTestUtils.deleteFromTableWhere(jdbcTemplate, "person", "name = 'Bob' and age > 25");
<add> assertThat(deleted, equalTo(10));
<add> }
<add>
<add> @Test
<add> public void deleteWhereAndArguments() throws Exception {
<add> given(jdbcTemplate.update("DELETE FROM person WHERE name = ? and age > ?", "Bob", 25)).willReturn(10);
<add> int deleted = JdbcTestUtils.deleteFromTableWhere(jdbcTemplate, "person", "name = ? and age > ?", "Bob", 25);
<add> assertThat(deleted, equalTo(10));
<add> }
<add>
<add>
<ide> } | 2 |
Javascript | Javascript | remove vjs-seeking on src change | 83cbeec424fc3537524621284a57b4b6c7a157b4 | <ide><path>src/js/player.js
<ide> class Player extends Component {
<ide> // TODO: Update to use `emptied` event instead. See #1277.
<ide>
<ide> this.removeClass('vjs-ended');
<add> this.removeClass('vjs-seeking');
<ide>
<ide> // reset the error state
<ide> this.error(null); | 1 |
PHP | PHP | apply fixes from styleci | 8913981ce443ed0d57518800fe407ae4be5567b7 | <ide><path>src/Illuminate/Foundation/Providers/ArtisanServiceProvider.php
<ide> use Illuminate\Foundation\Console\RouteListCommand;
<ide> use Illuminate\Foundation\Console\ViewClearCommand;
<ide> use Illuminate\Session\Console\SessionTableCommand;
<add>use Illuminate\Foundation\Console\BladeCacheCommand;
<ide> use Illuminate\Foundation\Console\PolicyMakeCommand;
<ide> use Illuminate\Foundation\Console\RouteCacheCommand;
<del>use Illuminate\Foundation\Console\BladeCacheCommand;
<ide> use Illuminate\Foundation\Console\RouteClearCommand;
<ide> use Illuminate\Console\Scheduling\ScheduleRunCommand;
<ide> use Illuminate\Foundation\Console\ChannelMakeCommand; | 1 |
Javascript | Javascript | replace buffer with fastbuffer in deserialize | 82db6729915e031814425360d2b8e7c83328697d | <ide><path>lib/v8.js
<ide> const arrayBufferViewTypeToIndex = new Map();
<ide> }
<ide> }
<ide>
<del>const bufferConstructorIndex = arrayBufferViewTypes.push(Buffer) - 1;
<add>const bufferConstructorIndex = arrayBufferViewTypes.push(FastBuffer) - 1;
<ide>
<ide> class DefaultSerializer extends Serializer {
<ide> constructor() {
<ide><path>test/parallel/test-v8-deserialize-buffer.js
<add>// Flags: --pending-deprecation --no-warnings
<add>'use strict';
<add>
<add>const common = require('../common');
<add>const v8 = require('v8');
<add>
<add>process.on('warning', common.mustNotCall());
<add>v8.deserialize(v8.serialize(Buffer.alloc(0))); | 2 |
Ruby | Ruby | describe what we are protecting | 21e5fd4a2a1c162ad33708d3e01b1fda165f204d | <ide><path>actioncable/lib/action_cable/connection/base.rb
<ide> def on_close(reason, code) # :nodoc:
<ide> send_async :handle_close
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> # The request that initiated the WebSocket connection is available here. This gives access to the environment, cookies, etc.
<ide> def request
<ide><path>actioncable/lib/action_cable/connection/message_buffer.rb
<ide> def process!
<ide> receive_buffered_messages
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> attr_reader :connection
<ide> attr_reader :buffered_messages
<ide><path>actioncable/lib/action_cable/connection/subscriptions.rb
<ide> def unsubscribe_from_all
<ide> subscriptions.each { |id, channel| remove_subscription(channel) }
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> attr_reader :connection, :subscriptions
<ide>
<ide><path>actioncable/lib/action_cable/connection/web_socket.rb
<ide> def rack_response
<ide> websocket.rack_response
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> attr_reader :websocket
<ide> end
<ide><path>actionpack/lib/action_dispatch/http/mime_type.rb
<ide> def html?
<ide>
<ide> def all?; false; end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :string, :synonyms
<ide><path>actionview/lib/action_view/helpers/tags/translator.rb
<ide> def translate
<ide> translated_attribute || human_attribute_name
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :object_name, :method_and_value, :scope, :model
<ide><path>activemodel/lib/active_model/type/integer.rb
<ide> def serialize(value)
<ide> result
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :range
<ide><path>activemodel/lib/active_model/type/registry.rb
<ide> def lookup(symbol, *args)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :registrations
<ide> def matches?(type_name, *args, **kwargs)
<ide> type_name == name
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :name, :block
<ide><path>activemodel/lib/active_model/validations/acceptance.rb
<ide> def define_on(klass)
<ide> klass.send(:attr_writer, *attr_writers)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :attributes
<ide><path>activemodel/test/cases/attribute_assignment_test.rb
<ide> def broken_attribute=(value)
<ide> raise ErrorFromAttributeWriter
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_writer :metadata
<ide><path>activerecord/lib/active_record/associations/association_scope.rb
<ide> def self.get_bind_values(owner, chain)
<ide> binds
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :value_transformation
<ide><path>activerecord/lib/active_record/attribute.rb
<ide> def encode_with(coder)
<ide> coder["value"] = value if defined?(@value)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :original_attribute
<ide><path>activerecord/lib/active_record/attribute/user_provided_default.rb
<ide> def with_type(type)
<ide> self.class.new(name, user_provided_value, type, original_attribute)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :user_provided_value
<ide><path>activerecord/lib/active_record/attribute_mutation_tracker.rb
<ide> def force_change(attr_name)
<ide> forced_changes << attr_name.to_s
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :attributes, :forced_changes
<ide><path>activerecord/lib/active_record/attribute_set.rb
<ide> def ==(other)
<ide> attributes == other.attributes
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :attributes
<ide><path>activerecord/lib/active_record/attribute_set/builder.rb
<ide> def marshal_load(delegate_hash)
<ide> @materialized = true
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :types, :values, :additional_types, :delegate_hash, :default
<ide><path>activerecord/lib/active_record/attribute_set/yaml_encoder.rb
<ide> def decode(coder)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :default_types
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/schema_definitions.rb
<ide> def add_to(table)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :name, :polymorphic, :index, :foreign_key, :type, :options
<ide><path>activerecord/lib/active_record/connection_adapters/postgresql/oid/bit.rb
<ide> def hex?
<ide> /\A[0-9A-F]*\Z/i.match?(value)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :value
<ide><path>activerecord/lib/active_record/enum.rb
<ide> def assert_valid_value(value)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :name, :mapping, :subtype
<ide><path>activerecord/lib/active_record/railties/controller_runtime.rb
<ide> module Railties # :nodoc:
<ide> module ControllerRuntime #:nodoc:
<ide> extend ActiveSupport::Concern
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_internal :db_runtime
<ide><path>activerecord/lib/active_record/relation/predicate_builder.rb
<ide> def build(attribute, value)
<ide> handler_for(value).call(attribute, value)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :table
<ide><path>activerecord/lib/active_record/relation/predicate_builder/array_handler.rb
<ide> def call(attribute, value)
<ide> array_predicates.inject { |composite, predicate| composite.or(predicate) }
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicate_builder
<ide><path>activerecord/lib/active_record/relation/predicate_builder/association_query_handler.rb
<ide> def call(attribute, value)
<ide> predicate_builder.build_from_hash(queries)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicate_builder
<ide><path>activerecord/lib/active_record/relation/predicate_builder/base_handler.rb
<ide> def call(attribute, value)
<ide> predicate_builder.build(attribute, value.id)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicate_builder
<ide><path>activerecord/lib/active_record/relation/predicate_builder/class_handler.rb
<ide> def call(attribute, value)
<ide> predicate_builder.build(attribute, value.name)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicate_builder
<ide><path>activerecord/lib/active_record/relation/predicate_builder/polymorphic_array_handler.rb
<ide> def call(attribute, value)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicate_builder
<ide><path>activerecord/lib/active_record/relation/where_clause.rb
<ide> def self.empty
<ide> @empty ||= new([], [])
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :predicates
<ide><path>activerecord/lib/active_record/relation/where_clause_factory.rb
<ide> def build(opts, other)
<ide> WhereClause.new(parts, binds || [])
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :klass, :predicate_builder
<ide><path>activerecord/lib/active_record/table_metadata.rb
<ide> def polymorphic_association?
<ide> association && association.polymorphic?
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :klass, :arel_table, :association
<ide><path>activerecord/lib/active_record/type/adapter_specific_registry.rb
<ide> def <=>(other)
<ide> priority <=> other.priority
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :name, :block, :adapter, :override
<ide> def priority
<ide> super | 4
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :options, :klass
<ide><path>activerecord/lib/active_record/type_caster/connection.rb
<ide> def type_cast_for_database(attribute_name, value)
<ide> connection.type_cast_from_column(column, value)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :table_name
<ide><path>activerecord/lib/active_record/type_caster/map.rb
<ide> def type_cast_for_database(attr_name, value)
<ide> type.serialize(value)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :types
<ide><path>activerecord/lib/rails/generators/active_record/migration/migration_generator.rb
<ide> def create_migration_file
<ide> migration_template @migration_template, "db/migrate/#{file_name}.rb"
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> attr_reader :migration_action, :join_tables
<ide>
<ide><path>activesupport/lib/active_support/subscriber.rb
<ide> def subscribers
<ide> @@subscribers ||= []
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :subscriber, :notifier, :namespace
<ide><path>railties/lib/rails/generators/named_base.rb
<ide> def js_template(source, destination)
<ide> end
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide> attr_reader :file_name
<ide>
<ide><path>railties/lib/rails/generators/resource_helpers.rb
<ide> def initialize(*args) #:nodoc:
<ide> assign_controller_names!(controller_name.pluralize)
<ide> end
<ide>
<add> # TODO Change this to private once we've dropped Ruby 2.2 support.
<add> # Workaround for Ruby 2.2 "private attribute?" warning.
<ide> protected
<ide>
<ide> attr_reader :controller_name, :controller_file_name | 37 |
Python | Python | return instance from registry | 5d91eaa6dde8b054ca24b848789d5319d0e4ed17 | <ide><path>celery/decorators.py
<ide> """
<ide> from inspect import getargspec
<ide>
<add>from celery import registry
<ide> from celery.task.base import Task, PeriodicTask
<ide> from celery.utils.functional import wraps
<ide>
<ide> def run(self, *args, **kwargs):
<ide> cls_dict = dict(options, run=run,
<ide> __module__=fun.__module__,
<ide> __doc__=fun.__doc__)
<del> return type(fun.__name__, (base, ), cls_dict)()
<add> T = type(fun.__name__, (base, ), cls_dict)()
<add> return registry.tasks[T.name] # global instance.
<ide>
<ide> return _create_task_cls
<ide> | 1 |
Text | Text | remove videos.md description | 5502e03f55a7d3f339f0ff8b3584ebf683f8e7b0 | <ide><path>docs/tutorials/videos.md
<ide> id: videos
<ide> title: 'Videos'
<ide> sidebar_label: 'Videos'
<del>description: 'The official Fundamentals tutorial for Redux: learn the fundamentals of using Redux'
<ide> hide_title: true
<ide> ---
<ide> | 1 |
Javascript | Javascript | increase timeout for testcases to 10s | 291be20a3de31a5c3cb3a39b85dc98ec04076173 | <ide><path>test/TestCases.template.js
<ide> const describeCases = config => {
<ide> return;
<ide>
<ide> function _it(title, fn) {
<del> exportedTests.push({ title, fn, timeout: 5000 });
<add> exportedTests.push({ title, fn, timeout: 10000 });
<ide> }
<ide>
<ide> function _require(module) { | 1 |
Javascript | Javascript | fix indentation in uglifyjs change | 6c3cd7750cd9cd30309cbdb7f73c230c7398169a | <ide><path>lib/optimize/UglifyJsPlugin.js
<ide> UglifyJsPlugin.prototype.apply = function(compiler) {
<ide> warnings.push(warning);
<ide> };
<ide> }
<del> uglify.base54.reset();
<add> uglify.base54.reset();
<ide> var ast = uglify.parse(input, {
<ide> filename: file
<ide> }); | 1 |
Ruby | Ruby | fix rubocop warnings | b9b07fc082f72dd0c3a886735cbcffe7f7244733 | <ide><path>Library/Homebrew/cmd/search.rb
<ide> def search
<ide> puts
<ide> end
<ide> puts msg
<del> elsif count == 0
<add> elsif count.zero?
<ide> puts "No formula found for #{query.inspect}."
<ide> begin
<ide> GitHub.print_pull_requests_matching(query)
<ide> def search
<ide>
<ide> SEARCHABLE_TAPS = OFFICIAL_TAPS.map { |tap| ["Homebrew", tap] } + [
<ide> %w[Caskroom cask],
<del> %w[Caskroom versions]
<add> %w[Caskroom versions],
<ide> ]
<ide>
<ide> def query_regexp(query)
<ide> def search_tap(user, repo, rx)
<ide> names = remote_tap_formulae["#{user}/#{repo}"]
<ide> user = user.downcase if user == "Homebrew" # special handling for the Homebrew organization
<ide> names.select { |name| rx === name }.map { |name| "#{user}/#{repo}/#{name}" }
<del> rescue GitHub::HTTPNotFoundError => e
<add> rescue GitHub::HTTPNotFoundError
<ide> opoo "Failed to search tap: #{user}/#{repo}. Please run `brew update`"
<ide> []
<ide> rescue GitHub::Error => e | 1 |
Java | Java | constrain targets for @filter declaration | df7bda5637a56c6d877a82d68e76027f7055fef2 | <ide><path>org.springframework.context/src/main/java/org/springframework/context/annotation/ComponentScan.java
<ide> * Declares the type filter to be used as an {@linkplain ComponentScan#includeFilters()
<ide> * include filter} or {@linkplain ComponentScan#includeFilters() exclude filter}.
<ide> */
<del> @Retention(RetentionPolicy.SOURCE)
<add> @Retention(RetentionPolicy.RUNTIME)
<add> @Target({})
<ide> @interface Filter {
<ide> /**
<ide> * The type of filter to use. | 1 |
Java | Java | copy cookies in built serverresponse | 0db317575b5eead35896530338ace74f9354574f | <ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/function/DefaultServerResponseBuilder.java
<ide> public DefaultServerResponseBuilder(ServerResponse other) {
<ide> this.statusCode = (other instanceof AbstractServerResponse ?
<ide> ((AbstractServerResponse) other).statusCode : other.statusCode().value());
<ide> this.headers.addAll(other.headers());
<add> this.cookies.addAll(other.cookies());
<ide> }
<ide>
<ide> public DefaultServerResponseBuilder(HttpStatus status) {
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/function/DefaultServerResponseBuilderTests.java
<ide> public void status() {
<ide>
<ide> @Test
<ide> public void from() {
<del> ServerResponse other = ServerResponse.ok().header("foo", "bar").build();
<add> Cookie cookie = new Cookie("foo", "bar");
<add> ServerResponse other = ServerResponse.ok()
<add> .header("foo", "bar")
<add> .cookie(cookie)
<add> .build();
<ide> ServerResponse result = ServerResponse.from(other).build();
<ide> assertEquals(HttpStatus.OK, result.statusCode());
<ide> assertEquals("bar", result.headers().getFirst("foo"));
<add> assertEquals(cookie, result.cookies().getFirst("foo"));
<ide> }
<ide>
<ide> | 2 |
Python | Python | fix training with preset vectors | e93d43a43a03ed207a7d9efe5817adb0afb0ef82 | <ide><path>spacy/cli/train.py
<ide> n_iter=("number of iterations", "option", "n", int),
<ide> n_sents=("number of sentences", "option", "ns", int),
<ide> use_gpu=("Use GPU", "option", "g", int),
<del> resume=("Whether to resume training", "flag", "R", bool),
<add> vectors=("Model to load vectors from", "option", "v"),
<ide> no_tagger=("Don't train tagger", "flag", "T", bool),
<ide> no_parser=("Don't train parser", "flag", "P", bool),
<ide> no_entities=("Don't train NER", "flag", "N", bool),
<ide> gold_preproc=("Use gold preprocessing", "flag", "G", bool),
<ide> )
<ide> def train(cmd, lang, output_dir, train_data, dev_data, n_iter=20, n_sents=0,
<del> use_gpu=-1, resume=False, no_tagger=False, no_parser=False, no_entities=False,
<add> use_gpu=-1, vectors=None, no_tagger=False, no_parser=False, no_entities=False,
<ide> gold_preproc=False):
<ide> """
<ide> Train a model. Expects data in spaCy's JSON format.
<ide> def train(cmd, lang, output_dir, train_data, dev_data, n_iter=20, n_sents=0,
<ide> corpus = GoldCorpus(train_path, dev_path, limit=n_sents)
<ide> n_train_words = corpus.count_train()
<ide>
<del> if not resume:
<del> lang_class = util.get_lang_class(lang)
<del> nlp = lang_class(pipeline=pipeline)
<del> optimizer = nlp.begin_training(lambda: corpus.train_tuples, device=use_gpu)
<del> else:
<del> print("Load resume")
<del> util.use_gpu(use_gpu)
<del> nlp = _resume_model(lang, pipeline, corpus)
<del> optimizer = nlp.resume_training(device=use_gpu)
<del> lang_class = nlp.__class__
<del>
<add> lang_class = util.get_lang_class(lang)
<add> nlp = lang_class(pipeline=pipeline)
<add> if vectors:
<add> util.load_model(vectors, vocab=nlp.vocab)
<add> optimizer = nlp.begin_training(lambda: corpus.train_tuples, device=use_gpu)
<ide> nlp._optimizer = None
<ide>
<ide> print("Itn.\tLoss\tUAS\tNER P.\tNER R.\tNER F.\tTag %\tToken %")
<ide> try:
<add> train_docs = corpus.train_docs(nlp, projectivize=True, noise_level=0.0,
<add> gold_preproc=gold_preproc, max_length=0)
<add> train_docs = list(train_docs)
<ide> for i in range(n_iter):
<ide> with tqdm.tqdm(total=n_train_words, leave=False) as pbar:
<del> train_docs = corpus.train_docs(nlp, projectivize=True, noise_level=0.0,
<del> gold_preproc=gold_preproc, max_length=0)
<ide> losses = {}
<ide> for batch in minibatch(train_docs, size=batch_sizes):
<ide> docs, golds = zip(*batch)
<ide> def train(cmd, lang, output_dir, train_data, dev_data, n_iter=20, n_sents=0,
<ide> except:
<ide> pass
<ide>
<del>
<del>def _resume_model(lang, pipeline, corpus):
<del> nlp = util.load_model(lang)
<del> pipes = {getattr(pipe, 'name', None) for pipe in nlp.pipeline}
<del> for name in pipeline:
<del> if name not in pipes:
<del> factory = nlp.Defaults.factories[name]
<del> for pipe in factory(nlp):
<del> if hasattr(pipe, 'begin_training'):
<del> pipe.begin_training(corpus.train_tuples,
<del> pipeline=nlp.pipeline)
<del> nlp.pipeline.append(pipe)
<del> nlp.meta['pipeline'] = pipeline
<del> if nlp.vocab.vectors.data.shape[1] >= 1:
<del> nlp.vocab.vectors.data = Model.ops.asarray(
<del> nlp.vocab.vectors.data)
<del>
<del> return nlp
<del>
<del>
<ide> def _render_parses(i, to_render):
<ide> to_render[0].user_data['title'] = "Batch %d" % i
<ide> with Path('/tmp/entities.html').open('w') as file_: | 1 |
Ruby | Ruby | move localpodspecpatch to dedicated file | 8fe2b591c7e073d629e95cd7b67aa1dfa96ece38 | <ide><path>scripts/cocoapods/__tests__/local_podspec_patch-test.rb
<add># Copyright (c) Meta Platforms, Inc. and affiliates.
<add>#
<add># This source code is licensed under the MIT license found in the
<add># LICENSE file in the root directory of this source tree.
<add>
<add>require "test/unit"
<add>require "json"
<add>require_relative "../local_podspec_patch.rb"
<add>require_relative "./test_utils/FileMock.rb"
<add>require_relative "./test_utils/DirMock.rb"
<add>require_relative "./test_utils/PodMock.rb"
<add>require_relative "./test_utils/LocalPodspecPatchMock.rb"
<add>
<add>class LocalPodspecPatchTests < Test::Unit::TestCase
<add> def setup
<add> File.enable_testing_mode!
<add> Dir.enable_testing_mode!
<add> end
<add>
<add> def teardown
<add> File.reset()
<add> Dir.reset()
<add> end
<add>
<add> # =================== #
<add> # Test - Pods To Update #
<add> # =================== #
<add>
<add> def test_podsToUpdate_whenNoFilesExists_returnLocalPodspecs
<add> # Arrange
<add> react_native_path = "../node_modules/react-native"
<add> globs = ["a/path/to/boost.podspec", "a/path/to/DoubleConversion.podspec"]
<add> mocked_pwd = "a/path/to"
<add> Dir.mocked_existing_globs(globs, "#{react_native_path}/third-party-podspecs/*")
<add> Dir.set_pwd(mocked_pwd)
<add>
<add> # Act
<add> local_podspec = LocalPodspecPatch.pods_to_update(:react_native_path => react_native_path)
<add>
<add> # Assert
<add> assert_equal(local_podspec, [])
<add> assert_equal(Dir.glob_invocation, ["#{react_native_path}/third-party-podspecs/*"])
<add> assert_equal(File.exist_invocation_params, [
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "boost.podspec.json"),
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "DoubleConversion.podspec.json"),
<add> ])
<add> end
<add>
<add> def test_podsToUpdate_whenFilesExistsWithSameVersions_returnsEmpty
<add> # Arrange
<add> react_native_path = "../node_modules/react-native"
<add> globs = ["a/path/to/boost.podspec", "a/path/to/DoubleConversion.podspec"]
<add> mocked_pwd = "a/path/to"
<add> prepare_PodsToUpdate_test_withMatchingVersions(react_native_path, globs, mocked_pwd)
<add>
<add> # Act
<add> local_podspec = LocalPodspecPatch.pods_to_update(:react_native_path => react_native_path)
<add>
<add> # Assert
<add> assert_equal(local_podspec, [])
<add> assert_equal(Dir.glob_invocation, ["#{react_native_path}/third-party-podspecs/*"])
<add> assert_equal(File.exist_invocation_params, [
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "boost.podspec.json"),
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "DoubleConversion.podspec.json"),
<add> ])
<add> end
<add>
<add> def test_podsToUpdate_whenFilesExistsWithDifferentVersions_returnsThem
<add> # Arrange
<add> react_native_path = "../node_modules/react-native"
<add> globs = ["a/path/to/boost.podspec", "a/path/to/DoubleConversion.podspec"]
<add> mocked_pwd = "a/path/to"
<add> prepare_PodsToUpdate_test_withDifferentVersions(react_native_path, globs, mocked_pwd)
<add>
<add> # Act
<add> local_podspec = LocalPodspecPatch.pods_to_update(:react_native_path => react_native_path)
<add>
<add> # Assert
<add> assert_equal(local_podspec, [
<add> "boost",
<add> "DoubleConversion"
<add> ])
<add> assert_equal(Dir.glob_invocation, ["#{react_native_path}/third-party-podspecs/*"])
<add> assert_equal(File.exist_invocation_params, [
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "boost.podspec.json"),
<add> File.join(mocked_pwd, "Pods/Local Podspecs", "DoubleConversion.podspec.json"),
<add> ])
<add> end
<add>
<add> # ======================================== #
<add> # Test - Patch Detect Changes With Podfile #
<add> # ======================================== #
<add> def test_patchDetectChangesWithPodfile_whenAlreadyChanged_returnSameChangeSet()
<add> local_pods = [
<add> "boost",
<add> "DoubleConversion"
<add> ]
<add> LocalPodspecPatch.mock_local_podspecs(local_pods)
<add> changes = {
<add> :unchanged => ["some_pod"],
<add> :changed => ["boost", "DoubleConversion", "another_pod"]
<add> }
<add>
<add> Pod::Lockfile.prepend(LocalPodspecPatch)
<add>
<add> new_changes = Pod::Lockfile.new().patch_detect_changes_with_podfile(changes)
<add>
<add> assert_equal(new_changes, {
<add> :unchanged => ["some_pod"],
<add> :changed => ["boost", "DoubleConversion", "another_pod"]
<add> })
<add> end
<add>
<add> def test_patchDetectChangesWithPodfile_whenLocalPodsUnchanged_movesLocalPodsToChangeSet()
<add> pods = [
<add> "boost",
<add> "DoubleConversion"
<add> ]
<add> LocalPodspecPatch.mock_local_podspecs(pods)
<add> changes = {
<add> :unchanged => ["first_pod", "boost", "DoubleConversion"],
<add> :changed => ["another_pod"]
<add> }
<add>
<add> Pod::Lockfile.prepend(LocalPodspecPatch)
<add>
<add> new_changes = Pod::Lockfile.new().patch_detect_changes_with_podfile(changes)
<add>
<add> assert_equal(new_changes, {
<add> :unchanged => ["first_pod"],
<add> :changed => ["another_pod", "boost", "DoubleConversion"]
<add> })
<add> end
<add>
<add> # ========= #
<add> # Utilities #
<add> # ========= #
<add> def prepare_PodsToUpdate_test_withMatchingVersions(react_native_path, globs, mocked_pwd)
<add> File.mocked_existing_files([
<add> "a/path/to/Pods/Local Podspecs/boost.podspec.json",
<add> "a/path/to/Pods/Local Podspecs/DoubleConversion.podspec.json"
<add> ])
<add> File.files_to_read({
<add> "a/path/to/Pods/Local Podspecs/boost.podspec.json" => "{ \"version\": \"0.0.1\"}",
<add> "a/path/to/Pods/Local Podspecs/DoubleConversion.podspec.json" => "{ \"version\": \"1.0.1\"}",
<add> })
<add> Dir.mocked_existing_globs(globs, "#{react_native_path}/third-party-podspecs/*")
<add> Dir.set_pwd(mocked_pwd)
<add> Pod::Specification.specs_from_file({
<add> "../node_modules/react-native/third-party-podspecs/boost.podspec" => Pod::PodSpecMock.new(:version => "0.0.1"),
<add> "../node_modules/react-native/third-party-podspecs/DoubleConversion.podspec" => Pod::PodSpecMock.new(:version => "1.0.1"),
<add> })
<add> end
<add>
<add> def prepare_PodsToUpdate_test_withDifferentVersions(react_native_path, globs, mocked_pwd)
<add> File.mocked_existing_files([
<add> "a/path/to/Pods/Local Podspecs/boost.podspec.json",
<add> "a/path/to/Pods/Local Podspecs/DoubleConversion.podspec.json"
<add> ])
<add> File.files_to_read({
<add> "a/path/to/Pods/Local Podspecs/boost.podspec.json" => "{ \"version\": \"0.0.1\"}",
<add> "a/path/to/Pods/Local Podspecs/DoubleConversion.podspec.json" => "{ \"version\": \"1.0.1\"}",
<add> })
<add> Dir.mocked_existing_globs(globs, "#{react_native_path}/third-party-podspecs/*")
<add> Dir.set_pwd(mocked_pwd)
<add> Pod::Specification.specs_from_file({
<add> "../node_modules/react-native/third-party-podspecs/boost.podspec" => Pod::PodSpecMock.new(:version => "0.1.1"),
<add> "../node_modules/react-native/third-party-podspecs/DoubleConversion.podspec" => Pod::PodSpecMock.new(:version => "1.1.1"),
<add> })
<add> end
<add>end
<ide><path>scripts/cocoapods/__tests__/test_utils/DirMock.rb
<ide> class Dir
<ide> @@exist_invocation_params = []
<ide> @@mocked_existing_dirs = []
<ide>
<add> @@glob_invocation = []
<add> @@mocked_existing_globs = {}
<add>
<add> @@pwd = nil
<add>
<ide> # Monkey patched exists? method.
<ide> # It is used also by the test runner, so it can't start monkey patched
<ide> # To use this, invoke the `is_testing` method before starting your test.
<ide> def self.mocked_existing_dirs(dirs)
<ide> @@mocked_existing_dirs = dirs
<ide> end
<ide>
<add> # Set what the `glob` function should return
<add> def self.mocked_existing_globs(globs, path)
<add> @@mocked_existing_globs[path] = globs
<add> end
<add>
<add> def self.glob_invocation
<add> return @@glob_invocation
<add> end
<add>
<add> def self.glob(path)
<add> @@glob_invocation.push(path)
<add> return @@mocked_existing_globs[path]
<add> end
<add>
<add> def self.set_pwd(pwd)
<add> @@pwd = pwd
<add> end
<add>
<add> def self.pwd
<add> if @@pwd != nil
<add> return @@pwd
<add> end
<add> return pwd
<add> end
<add>
<ide> # Turn on the mocking features of the File mock
<ide> def self.enable_testing_mode!()
<ide> @@is_testing = true
<ide> end
<ide>
<ide> # Resets all the settings for the File mock
<ide> def self.reset()
<add> @@pwd = nil
<ide> @@mocked_existing_dirs = []
<ide> @@is_testing = false
<ide> @@exist_invocation_params = []
<add> @@glob_invocation = []
<add> @@mocked_existing_globs = {}
<ide> end
<ide> end
<ide><path>scripts/cocoapods/__tests__/test_utils/FileMock.rb
<ide> class File
<ide> @@open_invocation_count = 0
<ide>
<ide> @@open_files = []
<add>
<add> @@files_to_read = {}
<ide> attr_reader :collected_write
<ide> attr_reader :fsync_invocation_count
<ide>
<ide> def self.open_files
<ide> return @@open_files
<ide> end
<ide>
<add> def self.file_invocation_params
<add> return @@file_invocation_params
<add> end
<add>
<ide> def write(text)
<ide> @collected_write.push(text.to_s)
<ide> end
<ide> def fsync()
<ide> @fsync_invocation_count += 1
<ide> end
<ide>
<add>
<add> def self.files_to_read(files)
<add> @@files_to_read = files
<add> end
<add>
<add> def self.read(filepath)
<add> return @@files_to_read[filepath]
<add> end
<add>
<ide> # Resets all the settings for the File mock
<ide> def self.reset()
<ide> @@delete_invocation_count = 0
<ide> def self.reset()
<ide> @@open_invocation_count = 0
<ide> @@mocked_existing_files = []
<ide> @@is_testing = false
<add> @@file_invocation_params = []
<ide> @@exist_invocation_params = []
<add> @@files_to_read = {}
<ide> end
<ide>
<ide>
<ide><path>scripts/cocoapods/__tests__/test_utils/LocalPodspecPatchMock.rb
<add># Copyright (c) Meta Platforms, Inc. and affiliates.
<add>#
<add># This source code is licensed under the MIT license found in the
<add># LICENSE file in the root directory of this source tree.
<add>
<add>module LocalPodspecPatch
<add> def self.mock_local_podspecs(pods)
<add> @@local_podspecs = pods
<add> end
<add>
<add> def reset()
<add> @@local_podspecs = []
<add> end
<add>end
<ide><path>scripts/cocoapods/__tests__/test_utils/PodMock.rb
<ide> def self.reset()
<ide> @@executed_commands = []
<ide> end
<ide> end
<add>
<add> class Specification
<add> @@specs_from_file = {}
<add>
<add> def self.specs_from_file(specs)
<add> @@specs_from_file = specs
<add> end
<add>
<add> def self.from_file(path)
<add> return @@specs_from_file[path]
<add> end
<add>
<add> def reset()
<add> @@specs_from_file = {}
<add> end
<add> end
<add>
<add> class PodSpecMock
<add> attr_reader :version
<add>
<add> def initialize(version: "0.0.1")
<add> @version = version
<add> end
<add> end
<add>
<add> class Lockfile
<add> def initialize()
<add> end
<add> end
<ide> end
<ide><path>scripts/cocoapods/local_podspec_patch.rb
<add># Copyright (c) Meta Platforms, Inc. and affiliates.
<add>#
<add># This source code is licensed under the MIT license found in the
<add># LICENSE file in the root directory of this source tree.
<add>
<add># Monkeypatch of `Pod::Lockfile` to ensure automatic update of dependencies integrated with a local podspec when their version changed.
<add># This is necessary because local podspec dependencies must be otherwise manually updated.
<add>module LocalPodspecPatch
<add> # Returns local podspecs whose versions differ from the one in the `react-native` package.
<add> def self.pods_to_update(react_native_path: "../node_modules/react-native")
<add> @@local_podspecs = Dir.glob("#{react_native_path}/third-party-podspecs/*").map { |file| File.basename(file, ".podspec") }
<add> local_podspecs.select do |podspec_name|
<add>
<add> # Read local podspec to determine the cached version
<add> local_podspec_path = File.join(
<add> Dir.pwd, "Pods/Local Podspecs/#{podspec_name}.podspec.json"
<add> )
<add>
<add> # Local podspec cannot be outdated if it does not exist, yet
<add> next unless File.exist?(local_podspec_path)
<add>
<add> local_podspec = File.read(local_podspec_path)
<add> local_podspec_json = JSON.parse(local_podspec)
<add> local_version = local_podspec_json["version"]
<add>
<add> # Read the version from a podspec from the `react-native` package
<add> podspec_path = "#{react_native_path}/third-party-podspecs/#{podspec_name}.podspec"
<add> current_podspec = Pod::Specification.from_file(podspec_path)
<add> current_version = current_podspec.version.to_s
<add> current_version != local_version
<add> end
<add> @@local_podspecs
<add> end
<add>
<add> # Patched `detect_changes_with_podfile` method
<add> def detect_changes_with_podfile(podfile)
<add> Pod::UI.puts "Invoke detect_changes_with_podfile patched method".red
<add> changes = super(podfile)
<add> return patch_detect_changes_with_podfile(changes)
<add> end
<add>
<add> def patch_detect_changes_with_podfile(changes)
<add> @@local_podspecs.each do |local_podspec|
<add> next unless changes[:unchanged].include?(local_podspec)
<add>
<add> changes[:unchanged].delete(local_podspec)
<add> changes[:changed] << local_podspec
<add> end
<add> changes
<add> end
<add>end
<ide><path>scripts/react_native_pods.rb
<ide> require_relative './cocoapods/codegen.rb'
<ide> require_relative './cocoapods/utils.rb'
<ide> require_relative './cocoapods/new_architecture.rb'
<add>require_relative './cocoapods/local_podspec_patch.rb'
<ide>
<ide> $CODEGEN_OUTPUT_DIR = 'build/generated/ios'
<ide> $CODEGEN_COMPONENT_DIR = 'react/renderer/components'
<ide> def use_react_native! (options={})
<ide> use_flipper_pods(flipper_configuration.versions, :configurations => flipper_configuration.configurations)
<ide> end
<ide>
<del> pods_to_update = LocalPodspecPatch.pods_to_update(options)
<add> pods_to_update = LocalPodspecPatch.pods_to_update(:react_native_path => prefix)
<ide> if !pods_to_update.empty?
<ide> if Pod::Lockfile.public_instance_methods.include?(:detect_changes_with_podfile)
<ide> Pod::Lockfile.prepend(LocalPodspecPatch)
<ide> def __apply_Xcode_12_5_M1_post_install_workaround(installer)
<ide> time_header = "#{Pod::Config.instance.installation_root.to_s}/Pods/RCT-Folly/folly/portability/Time.h"
<ide> `sed -i -e $'s/ && (__IPHONE_OS_VERSION_MIN_REQUIRED < __IPHONE_10_0)//' #{time_header}`
<ide> end
<del>
<del># Monkeypatch of `Pod::Lockfile` to ensure automatic update of dependencies integrated with a local podspec when their version changed.
<del># This is necessary because local podspec dependencies must be otherwise manually updated.
<del>module LocalPodspecPatch
<del> # Returns local podspecs whose versions differ from the one in the `react-native` package.
<del> def self.pods_to_update(react_native_options)
<del> prefix = react_native_options[:path] ||= "../node_modules/react-native"
<del> @@local_podspecs = Dir.glob("#{prefix}/third-party-podspecs/*").map { |file| File.basename(file, ".podspec") }
<del> local_podspecs.select do |podspec_name|
<del> # Read local podspec to determine the cached version
<del> local_podspec_path = File.join(
<del> Dir.pwd, "Pods/Local Podspecs/#{podspec_name}.podspec.json"
<del> )
<del>
<del> # Local podspec cannot be outdated if it does not exist, yet
<del> next unless File.file?(local_podspec_path)
<del>
<del> local_podspec = File.read(local_podspec_path)
<del> local_podspec_json = JSON.parse(local_podspec)
<del> local_version = local_podspec_json["version"]
<del>
<del> # Read the version from a podspec from the `react-native` package
<del> podspec_path = "#{prefix}/third-party-podspecs/#{podspec_name}.podspec"
<del> current_podspec = Pod::Specification.from_file(podspec_path)
<del>
<del> current_version = current_podspec.version.to_s
<del> current_version != local_version
<del> end
<del> @@local_podspecs
<del> end
<del>
<del> # Patched `detect_changes_with_podfile` method
<del> def detect_changes_with_podfile(podfile)
<del> changes = super(podfile)
<del> @@local_podspecs.each do |local_podspec|
<del> next unless changes[:unchanged].include?(local_podspec)
<del>
<del> changes[:unchanged].delete(local_podspec)
<del> changes[:changed] << local_podspec
<del> end
<del> changes
<del> end
<del>end | 7 |
Javascript | Javascript | add a test case | 0e3795bf77066684c3e56aeb2d737cd70cbb810c | <ide><path>test/configCases/module-name/different-issuers-for-same-module/a.js
<add>module.exports = require("./c");
<ide><path>test/configCases/module-name/different-issuers-for-same-module/b.js
<add>module.exports = require("./c");
<ide><path>test/configCases/module-name/different-issuers-for-same-module/c.js
<add>module.exports = "c";
<ide><path>test/configCases/module-name/different-issuers-for-same-module/loader-a.js
<add>module.exports = function(src) {
<add> return `module.exports = "loader-a"`;
<add>};
<ide><path>test/configCases/module-name/different-issuers-for-same-module/loader-b.js
<add>module.exports = function(src) {
<add> return `module.exports = "loader-b"`;
<add>};
<ide><path>test/configCases/module-name/different-issuers-for-same-module/test.js
<add>it("should assign different names to the same module with different issuers ", function() {
<add> var fs = require("fs");
<add> var path = require("path");
<add> var bundle = fs.readFileSync(path.join(__dirname, "bundle0.js"), "utf-8");
<add> bundle.should.containEql("./a.js~./c.js");
<add> bundle.should.containEql("./a.js~./c.js");
<add>});
<ide><path>test/configCases/module-name/different-issuers-for-same-module/webpack.config.js
<add>module.exports = {
<add> mode: "development",
<add> entry: {
<add> main: ["./a", "./b", "./test"]
<add> },
<add> module: {
<add> rules: [
<add> {
<add> test: /c\.js/,
<add> issuer: /a\.js/,
<add> loader: "./loader-a"
<add> },
<add> {
<add> test: /c\.js/,
<add> issuer: /b\.js/,
<add> loader: "./loader-b"
<add> }
<add> ]
<add> },
<add> node: {
<add> __dirname: false
<add> }
<add>}; | 7 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.