| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
Python | Python | fix syntax error in setup.py | 535833a38de02ae544f83ec001154ac6dfd8f33d | <ide><path>setup.py
<ide> def setup_package():
<ide> 'plac<1.0.0,>=0.9.6',
<ide> 'pathlib',
<ide> 'ujson>=1.35',
<del> 'dill>=0.2,<0.3',
<add> 'dill>=0.2,<0.3'],
<ide> setup_requires=['wheel'],
<ide> classifiers=[
<ide> 'Development Status :: 5 - Production/Stable', | 1 |
Text | Text | update 3 chinese challenges | a9092776b5d637595b068dda93d69b0ff8ee320a | <ide><path>curriculum/challenges/chinese/05-apis-and-microservices/apis-and-microservices-projects/url-shortener-microservice.md
<ide> tests:
<ide> </section>
<ide>
<ide> ## Challenge Seed
<del><section id='challengeSeed'
<add><section id='challengeSeed'>
<ide>
<ide> </section>
<ide>
<ide><path>curriculum/challenges/chinese/06-quality-assurance/advanced-node-and-express/send-and-display-chat-messages.md
<ide> socket.emit('chat message', messageToSend);
<ide> tests:
<ide> - text: 服务端应监听 <code>'chat message'</code>,且应在监听到后 emit。
<ide> testString: getUserInput => $.get(getUserInput('url')+ '/_api/server.js') .then(data => { assert.match(data, /socket.on.*('|")chat message('|")[^]*io.emit.*('|")chat message('|").*name.*message/gis, 'Your server should listen to the socket for "chat message" then emit to all users "chat message" with name and message in the data object'); }, xhr => { throw new Error(xhr.statusText); })
<del> - text: '客户端应正确处理和展示从 <code>'chat message'</code> 事件发来的新数据。
<add> - text: 客户端应正确处理和展示从 <code>'chat message'</code> 事件发来的新数据。
<ide> testString: 'getUserInput => $.get(getUserInput(''url'')+ ''/public/client.js'') .then(data => { assert.match(data, /socket.on.*(''|")chat message(''|")[^]*messages.*li/gis, ''You should append a list item to #messages on your client within the "chat message" event listener to display the new message''); }, xhr => { throw new Error(xhr.statusText); })'
<ide> ```
<ide>
<add><path>curriculum/challenges/chinese/08-data-analysis-with-python/data-analysis-with-python-course/pandas-conditional-selection-and-modifying-dataframes.md
<del><path>curriculum/challenges/chinese/08-data-analysis-with-python/data-analysis-with-python-course/pandas-condtitional-selection-and-modifying-dataframes.md
<ide> ---
<ide> id: 5e9a093a74c4063ca6f7c15b
<del>title: Pandas Condtitional Selection and Modifying DataFrames
<add>title: Pandas Conditional Selection and Modifying DataFrames
<ide> challengeType: 11
<ide> isHidden: false
<ide> videoId: BFlH0fN5xRQ | 3 |
Text | Text | fix the typo of link in devmapper | f7701d90a94922fa6854cc80b2218392d6a4ac76 | <ide><path>daemon/graphdriver/devmapper/README.md
<ide> This uses the `dm` prefix and would be used something like `docker daemon --stor
<ide>
<ide> These options are currently documented both in [the man
<ide> page](../../../man/docker.1.md) and in [the online
<del>documentation](https://docs.docker.com/reference/commandline/daemon/#storage-driver-options).
<add>documentation](https://docs.docker.com/engine/userguide/storagedriver/device-mapper-driver/).
<ide> If you add an options, update both the `man` page and the documentation. | 1 |
Python | Python | add example using tensorboard standalone projector | 4eeb1788562a31af4e910a3288b5abcd55e4c870 | <ide><path>examples/vectors_tensorboard_standalone.py
<add>#!/usr/bin/env python
<add># coding: utf8
<add>"""Export spaCy model vectors for use in TensorBoard's standalone embedding projector.
<add>https://github.com/tensorflow/embedding-projector-standalone
<add>
<add>Usage:
<add>
<add> python vectors_tensorboard_standalone.py ./myVectorModel ./output [name]
<add>
<add>This outputs two files that have to be copied into the "oss_data" of the standalone projector:
<add>
<add> [name]_labels.tsv - metadata such as human readable labels for vectors
<add> [name]_tensors.bytes - numpy.ndarray of numpy.float32 precision vectors
<add>
<add>"""
<add>from __future__ import unicode_literals
<add>
<add>import json
<add>import math
<add>from os import path
<add>
<add>import numpy
<add>import plac
<add>import spacy
<add>import tqdm
<add>
<add>
<add>@plac.annotations(
<add> vectors_loc=("Path to spaCy model that contains vectors", "positional", None, str),
<add> out_loc=("Path to output folder writing tensors and labels data", "positional", None, str),
<add> name=("Human readable name for tsv file and vectors tensor", "positional", None, str),
<add>)
<add>def main(vectors_loc, out_loc, name="spaCy_vectors"):
<add> # A tab-separated file that contains information about the vectors for visualization
<add> #
<add> # Learn more: https://www.tensorflow.org/programmers_guide/embedding#metadata
<add> meta_file = "{}_labels.tsv".format(name)
<add> out_meta_file = path.join(out_loc, meta_file)
<add>
<add> print('Loading spaCy vectors model: {}'.format(vectors_loc))
<add> model = spacy.load(vectors_loc)
<add>
<add> print('Finding lexemes with vectors attached: {}'.format(vectors_loc))
<add> voacb_strings = [
<add> w for w in tqdm.tqdm(model.vocab.strings, total=len(model.vocab.strings), leave=False)
<add> if model.vocab.has_vector(w)
<add> ]
<add> vector_count = len(voacb_strings)
<add>
<add> print('Building Projector labels for {} vectors: {}'.format(vector_count, out_meta_file))
<add> vector_dimensions = model.vocab.vectors.shape[1]
<add> tf_vectors_variable = numpy.zeros((vector_count, vector_dimensions), dtype=numpy.float32)
<add>
<add> # Write a tab-separated file that contains information about the vectors for visualization
<add> #
<add> # Reference: https://www.tensorflow.org/programmers_guide/embedding#metadata
<add> with open(out_meta_file, 'wb') as file_metadata:
<add> # Define columns in the first row
<add> file_metadata.write("Text\tFrequency\n".encode('utf-8'))
<add> # Write out a row for each vector that we add to the tensorflow variable we created
<add> vec_index = 0
<add>
<add> for text in tqdm.tqdm(voacb_strings, total=len(voacb_strings), leave=False):
<add> # https://github.com/tensorflow/tensorflow/issues/9094
<add> text = '<Space>' if text.lstrip() == '' else text
<add> lex = model.vocab[text]
<add>
<add> # Store vector data and metadata
<add> tf_vectors_variable[vec_index] = numpy.float64(model.vocab.get_vector(text))
<add> file_metadata.write("{}\t{}\n".format(text, math.exp(lex.prob) * len(voacb_strings)).encode('utf-8'))
<add> vec_index += 1
<add>
<add> # Write out "[name]_tensors.bytes" file for standalone embeddings projector to load
<add> tensor_path = '{}_tensors.bytes'.format(name)
<add> tf_vectors_variable.tofile(path.join(out_loc, tensor_path))
<add>
<add> print('Done.')
<add> print('Add the following entry to "oss_data/oss_demo_projector_config.json"')
<add> print(json.dumps({
<add> "tensorName": name,
<add> "tensorShape": [vector_count, vector_dimensions],
<add> "tensorPath": 'oss_data/{}'.format(tensor_path),
<add> "metadataPath": 'oss_data/{}'.format(meta_file)
<add> }, indent=2))
<add>
<add>
<add>if __name__ == '__main__':
<add> plac.call(main) | 1 |
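The `[name]_tensors.bytes` file produced by the script above is a raw dump of float32 values (`numpy.ndarray.tofile`) with no header, so a consumer must already know the shape — which is why the script prints `"tensorShape": [vector_count, vector_dimensions]` in the projector config. As a sketch of that byte layout using only the standard library (file name and shape here are illustrative, not from the commit):

```python
import array
import os
import tempfile

# Write little-endian float32 values back to back, the same layout that
# numpy's ndarray.tofile produces for a float32 array.
vector_count, vector_dimensions = 2, 3
vectors = array.array("f", [0.0, 1.0, 2.0, 3.0, 4.0, 5.0])

path = os.path.join(tempfile.mkdtemp(), "demo_tensors.bytes")
with open(path, "wb") as f:
    vectors.tofile(f)

# Reading it back requires the shape from the projector config JSON.
restored = array.array("f")
with open(path, "rb") as f:
    restored.fromfile(f, vector_count * vector_dimensions)

# Re-chunk the flat buffer into rows of vector_dimensions values.
rows = [restored[i * vector_dimensions:(i + 1) * vector_dimensions]
        for i in range(vector_count)]
assert rows[1].tolist() == [3.0, 4.0, 5.0]
```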
Javascript | Javascript | add test for exception handlings in debugger | eda2498dfd2595c313d155bc85a44ea60c3e43f0 | <ide><path>test/parallel/test-debugger-invalid-json.js
<add>'use strict';
<add>const common = require('../common');
<add>const startCLI = require('../common/debugger');
<add>
<add>common.skipIfInspectorDisabled();
<add>
<add>const assert = require('assert');
<add>const http = require('http');
<add>
<add>const host = '127.0.0.1';
<add>
<add>{
<add> const server = http.createServer((req, res) => {
<add> res.statusCode = 400;
<add> res.end('Bad Request');
<add> });
<add> server.listen(0, common.mustCall(() => {
<add> const port = server.address().port;
<add> const cli = startCLI([`${host}:${port}`]);
<add> cli.quit().then(common.mustCall((code) => {
<add> assert.strictEqual(code, 1);
<add> })).finally(() => {
<add> server.close();
<add> });
<add> }));
<add>}
<add>
<add>{
<add> const server = http.createServer((req, res) => {
<add> res.statusCode = 200;
<add> res.end('some data that is invalid json');
<add> });
<add> server.listen(0, host, common.mustCall(() => {
<add> const port = server.address().port;
<add> const cli = startCLI([`${host}:${port}`]);
<add> cli.quit().then(common.mustCall((code) => {
<add> assert.strictEqual(code, 1);
<add> })).finally(() => {
<add> server.close();
<add> });
<add> }));
<add>} | 1 |
Ruby | Ruby | recognize xcode 5.1 and clt 5.1 | 5bd5e055415c31d13cd96742a4109e35b9523502 | <ide><path>Library/Homebrew/os/mac.rb
<ide> def preferred_arch
<ide> "5.0" => { :clang => "5.0", :clang_build => 500 },
<ide> "5.0.1" => { :clang => "5.0", :clang_build => 500 },
<ide> "5.0.2" => { :clang => "5.0", :clang_build => 500 },
<add> "5.1" => { :clang => "5.1", :clang_build => 503 },
<ide> }
<ide>
<ide> def compilers_standard?
<ide><path>Library/Homebrew/os/mac/xcode.rb
<ide> def latest_version
<ide> when "10.5" then "3.1.4"
<ide> when "10.6" then "3.2.6"
<ide> when "10.7" then "4.6.3"
<del> when "10.8" then "5.0.2"
<del> when "10.9" then "5.0.2"
<add> when "10.8" then "5.1"
<add> when "10.9" then "5.1"
<ide> else
<ide> # Default to newest known version of Xcode for unreleased OSX versions.
<ide> if MacOS.version > "10.9"
<del> "5.0.2"
<add> "5.1"
<ide> else
<ide> raise "Mac OS X '#{MacOS.version}' is invalid"
<ide> end
<ide> def uncached_version
<ide> when 41 then "4.5"
<ide> when 42 then "4.6"
<ide> when 50 then "5.0"
<del> else "5.0"
<add> when 51 then "5.1"
<add> else "5.1"
<ide> end
<ide> end
<ide> end | 2 |
Javascript | Javascript | improve the cleanup functionality for thumbnails | 1a22836dcbc404bda6057d47427cec0a9aa6cf1e | <ide><path>web/pdf_thumbnail_view.js
<ide> class PDFThumbnailView {
<ide> }
<ide> });
<ide> }
<del>
<del> static cleanup() {
<del> TempImageFactory.destroyCanvas();
<del> }
<ide> }
<ide>
<del>export { PDFThumbnailView };
<add>export { PDFThumbnailView, TempImageFactory };
<ide><path>web/pdf_thumbnail_viewer.js
<ide> import {
<ide> scrollIntoView,
<ide> watchScroll,
<ide> } from "./ui_utils.js";
<del>import { PDFThumbnailView } from "./pdf_thumbnail_view.js";
<add>import { PDFThumbnailView, TempImageFactory } from "./pdf_thumbnail_view.js";
<add>import { RenderingStates } from "./pdf_rendering_queue.js";
<ide>
<ide> const THUMBNAIL_SCROLL_MARGIN = -19;
<ide> const THUMBNAIL_SELECTED_CLASS = "selected";
<ide> class PDFThumbnailViewer {
<ide> }
<ide>
<ide> cleanup() {
<del> PDFThumbnailView.cleanup();
<add> for (let i = 0, ii = this._thumbnails.length; i < ii; i++) {
<add> if (
<add> this._thumbnails[i] &&
<add> this._thumbnails[i].renderingState !== RenderingStates.FINISHED
<add> ) {
<add> this._thumbnails[i].reset();
<add> }
<add> }
<add> TempImageFactory.destroyCanvas();
<ide> }
<ide>
<ide> /** | 2 |
Python | Python | fix typo in package command message (closes ) | 270fcfd925cb71c77038a50cec76347d07471493 | <ide><path>spacy/cli/package.py
<ide> def generate_meta(model_path, existing_meta):
<ide> meta['vectors'] = {'width': nlp.vocab.vectors_length,
<ide> 'vectors': len(nlp.vocab.vectors),
<ide> 'keys': nlp.vocab.vectors.n_keys}
<del> prints(Messages.M047, title=Messages.Mo46)
<add> prints(Messages.M047, title=Messages.M046)
<ide> for setting, desc, default in settings:
<ide> response = util.get_raw_input(desc, default)
<ide> meta[setting] = default if response == '' and default else response | 1 |
Mixed | Text | add option to auto-configure blkdev for devmapper | 5ef07d79c4712d5b1ff4f0c896932ea8902a129c | <ide><path>daemon/graphdriver/devmapper/README.md
<ide> The device mapper graphdriver uses the device mapper thin provisioning
<ide> module (dm-thinp) to implement CoW snapshots. The preferred model is
<ide> to have a thin pool reserved outside of Docker and passed to the
<del>daemon via the `--storage-opt dm.thinpooldev` option.
<add>daemon via the `--storage-opt dm.thinpooldev` option. Alternatively,
<add>the device mapper graphdriver can setup a block device to handle this
<add>for you via the `--storage-opt dm.directlvm_device` option.
<ide>
<ide> As a fallback if no thin pool is provided, loopback files will be
<ide> created. Loopback is very slow, but can be used without any
<ide><path>daemon/graphdriver/devmapper/device_setup.go
<add>package devmapper
<add>
<add>import (
<add> "bufio"
<add> "bytes"
<add> "encoding/json"
<add> "fmt"
<add> "io/ioutil"
<add> "os"
<add> "os/exec"
<add> "path/filepath"
<add> "reflect"
<add> "strings"
<add>
<add> "github.com/Sirupsen/logrus"
<add> "github.com/pkg/errors"
<add>)
<add>
<add>type directLVMConfig struct {
<add> Device string
<add> ThinpPercent uint64
<add> ThinpMetaPercent uint64
<add> AutoExtendPercent uint64
<add> AutoExtendThreshold uint64
<add>}
<add>
<add>var (
<add> errThinpPercentMissing = errors.New("must set both `dm.thinp_percent` and `dm.thinp_metapercent` if either is specified")
<add> errThinpPercentTooBig = errors.New("combined `dm.thinp_percent` and `dm.thinp_metapercent` must not be greater than 100")
<add> errMissingSetupDevice = errors.New("must provide device path in `dm.setup_device` in order to configure direct-lvm")
<add>)
<add>
<add>func validateLVMConfig(cfg directLVMConfig) error {
<add> if reflect.DeepEqual(cfg, directLVMConfig{}) {
<add> return nil
<add> }
<add> if cfg.Device == "" {
<add> return errMissingSetupDevice
<add> }
<add> if (cfg.ThinpPercent > 0 && cfg.ThinpMetaPercent == 0) || cfg.ThinpMetaPercent > 0 && cfg.ThinpPercent == 0 {
<add> return errThinpPercentMissing
<add> }
<add>
<add> if cfg.ThinpPercent+cfg.ThinpMetaPercent > 100 {
<add> return errThinpPercentTooBig
<add> }
<add> return nil
<add>}
<add>
<add>func checkDevAvailable(dev string) error {
<add> lvmScan, err := exec.LookPath("lvmdiskscan")
<add> if err != nil {
<add> logrus.Debug("could not find lvmdiskscan")
<add> return nil
<add> }
<add>
<add> out, err := exec.Command(lvmScan).CombinedOutput()
<add> if err != nil {
<add> logrus.WithError(err).Error(string(out))
<add> return nil
<add> }
<add>
<add> if !bytes.Contains(out, []byte(dev)) {
<add> return errors.Errorf("%s is not available for use with devicemapper", dev)
<add> }
<add> return nil
<add>}
<add>
<add>func checkDevInVG(dev string) error {
<add> pvDisplay, err := exec.LookPath("pvdisplay")
<add> if err != nil {
<add> logrus.Debug("could not find pvdisplay")
<add> return nil
<add> }
<add>
<add> out, err := exec.Command(pvDisplay, dev).CombinedOutput()
<add> if err != nil {
<add> logrus.WithError(err).Error(string(out))
<add> return nil
<add> }
<add>
<add> scanner := bufio.NewScanner(bytes.NewReader(bytes.TrimSpace(out)))
<add> for scanner.Scan() {
<add> fields := strings.SplitAfter(strings.TrimSpace(scanner.Text()), "VG Name")
<add> if len(fields) > 1 {
<add> // got "VG Name" line"
<add> vg := strings.TrimSpace(fields[1])
<add> if len(vg) > 0 {
<add> return errors.Errorf("%s is already part of a volume group %q: must remove this device from any volume group or provide a different device", dev, vg)
<add> }
<add> logrus.Error(fields)
<add> break
<add> }
<add> }
<add> return nil
<add>}
<add>
<add>func checkDevHasFS(dev string) error {
<add> blkid, err := exec.LookPath("blkid")
<add> if err != nil {
<add> logrus.Debug("could not find blkid")
<add> return nil
<add> }
<add>
<add> out, err := exec.Command(blkid, dev).CombinedOutput()
<add> if err != nil {
<add> logrus.WithError(err).Error(string(out))
<add> return nil
<add> }
<add>
<add> fields := bytes.Fields(out)
<add> for _, f := range fields {
<add> kv := bytes.Split(f, []byte{'='})
<add> if bytes.Equal(kv[0], []byte("TYPE")) {
<add> v := bytes.Trim(kv[1], "\"")
<add> if len(v) > 0 {
<add> return errors.Errorf("%s has a filesystem already, use dm.directlvm_device_force=true if you want to wipe the device", dev)
<add> }
<add> return nil
<add> }
<add> }
<add> return nil
<add>}
<add>
<add>func verifyBlockDevice(dev string, force bool) error {
<add> if err := checkDevAvailable(dev); err != nil {
<add> return err
<add> }
<add> if err := checkDevInVG(dev); err != nil {
<add> return err
<add> }
<add>
<add> if force {
<add> return nil
<add> }
<add>
<add> if err := checkDevHasFS(dev); err != nil {
<add> return err
<add> }
<add> return nil
<add>}
<add>
<add>func readLVMConfig(root string) (directLVMConfig, error) {
<add> var cfg directLVMConfig
<add>
<add> p := filepath.Join(root, "setup-config.json")
<add> b, err := ioutil.ReadFile(p)
<add> if err != nil {
<add> if os.IsNotExist(err) {
<add> return cfg, nil
<add> }
<add> return cfg, errors.Wrap(err, "error reading existing setup config")
<add> }
<add>
<add> // check if this is just an empty file, no need to produce a json error later if so
<add> if len(b) == 0 {
<add> return cfg, nil
<add> }
<add>
<add> err = json.Unmarshal(b, &cfg)
<add> return cfg, errors.Wrap(err, "error unmarshaling previous device setup config")
<add>}
<add>
<add>func writeLVMConfig(root string, cfg directLVMConfig) error {
<add> p := filepath.Join(root, "setup-config.json")
<add> b, err := json.Marshal(cfg)
<add> if err != nil {
<add> return errors.Wrap(err, "error marshalling direct lvm config")
<add> }
<add> err = ioutil.WriteFile(p, b, 0600)
<add> return errors.Wrap(err, "error writing direct lvm config to file")
<add>}
<add>
<add>func setupDirectLVM(cfg directLVMConfig) error {
<add> pvCreate, err := exec.LookPath("pvcreate")
<add> if err != nil {
<add> return errors.Wrap(err, "error lookuping up command `pvcreate` while setting up direct lvm")
<add> }
<add>
<add> vgCreate, err := exec.LookPath("vgcreate")
<add> if err != nil {
<add> return errors.Wrap(err, "error lookuping up command `vgcreate` while setting up direct lvm")
<add> }
<add>
<add> lvCreate, err := exec.LookPath("lvcreate")
<add> if err != nil {
<add> return errors.Wrap(err, "error lookuping up command `lvcreate` while setting up direct lvm")
<add> }
<add>
<add> lvConvert, err := exec.LookPath("lvconvert")
<add> if err != nil {
<add> return errors.Wrap(err, "error lookuping up command `lvconvert` while setting up direct lvm")
<add> }
<add>
<add> lvChange, err := exec.LookPath("lvchange")
<add> if err != nil {
<add> return errors.Wrap(err, "error lookuping up command `lvchange` while setting up direct lvm")
<add> }
<add>
<add> if cfg.AutoExtendPercent == 0 {
<add> cfg.AutoExtendPercent = 20
<add> }
<add>
<add> if cfg.AutoExtendThreshold == 0 {
<add> cfg.AutoExtendThreshold = 80
<add> }
<add>
<add> if cfg.ThinpPercent == 0 {
<add> cfg.ThinpPercent = 95
<add> }
<add> if cfg.ThinpMetaPercent == 0 {
<add> cfg.ThinpMetaPercent = 1
<add> }
<add>
<add> out, err := exec.Command(pvCreate, "-f", cfg.Device).CombinedOutput()
<add> if err != nil {
<add> return errors.Wrap(err, string(out))
<add> }
<add>
<add> out, err = exec.Command(vgCreate, "docker", cfg.Device).CombinedOutput()
<add> if err != nil {
<add> return errors.Wrap(err, string(out))
<add> }
<add>
<add> out, err = exec.Command(lvCreate, "--wipesignatures", "y", "-n", "thinpool", "docker", "--extents", fmt.Sprintf("%d%%VG", cfg.ThinpPercent)).CombinedOutput()
<add> if err != nil {
<add> return errors.Wrap(err, string(out))
<add> }
<add> out, err = exec.Command(lvCreate, "--wipesignatures", "y", "-n", "thinpoolmeta", "docker", "--extents", fmt.Sprintf("%d%%VG", cfg.ThinpMetaPercent)).CombinedOutput()
<add> if err != nil {
<add> return errors.Wrap(err, string(out))
<add> }
<add>
<add> out, err = exec.Command(lvConvert, "-y", "--zero", "n", "-c", "512K", "--thinpool", "docker/thinpool", "--poolmetadata", "docker/thinpoolmeta").CombinedOutput()
<add> if err != nil {
<add> return errors.Wrap(err, string(out))
<add> }
<add>
<add> profile := fmt.Sprintf("activation{\nthin_pool_autoextend_threshold=%d\nthin_pool_autoextend_percent=%d\n}", cfg.AutoExtendThreshold, cfg.AutoExtendPercent)
<add> err = ioutil.WriteFile("/etc/lvm/profile/docker-thinpool.profile", []byte(profile), 0600)
<add> if err != nil {
<add> return errors.Wrap(err, "error writing docker thinp autoextend profile")
<add> }
<add>
<add> out, err = exec.Command(lvChange, "--metadataprofile", "docker-thinpool", "docker/thinpool").CombinedOutput()
<add> return errors.Wrap(err, string(out))
<add>}
<ide><path>daemon/graphdriver/devmapper/deviceset.go
<ide> package devmapper
<ide> import (
<ide> "bufio"
<ide> "encoding/json"
<del> "errors"
<ide> "fmt"
<ide> "io"
<ide> "io/ioutil"
<ide> "os"
<ide> "os/exec"
<ide> "path"
<ide> "path/filepath"
<add> "reflect"
<ide> "strconv"
<ide> "strings"
<ide> "sync"
<ide> import (
<ide> "github.com/docker/docker/pkg/mount"
<ide> "github.com/docker/docker/pkg/parsers"
<ide> units "github.com/docker/go-units"
<add> "github.com/pkg/errors"
<ide>
<ide> "github.com/opencontainers/runc/libcontainer/label"
<ide> )
<ide> var (
<ide> enableDeferredDeletion = false
<ide> userBaseSize = false
<ide> defaultMinFreeSpacePercent uint32 = 10
<add> lvmSetupConfigForce bool
<ide> )
<ide>
<ide> const deviceSetMetaFile string = "deviceset-metadata"
<ide> type DeviceSet struct {
<ide> gidMaps []idtools.IDMap
<ide> minFreeSpacePercent uint32 //min free space percentage in thinpool
<ide> xfsNospaceRetries string // max retries when xfs receives ENOSPC
<add> lvmSetupConfig directLVMConfig
<ide> }
<ide>
<ide> // DiskUsage contains information about disk usage and is used when reporting Status of a device.
<ide> func (devices *DeviceSet) initDevmapper(doInit bool) error {
<ide> return err
<ide> }
<ide>
<del> // Set the device prefix from the device id and inode of the docker root dir
<add> prevSetupConfig, err := readLVMConfig(devices.root)
<add> if err != nil {
<add> return err
<add> }
<add>
<add> if !reflect.DeepEqual(devices.lvmSetupConfig, directLVMConfig{}) {
<add> if devices.thinPoolDevice != "" {
<add> return errors.New("cannot setup direct-lvm when `dm.thinpooldev` is also specified")
<add> }
<add>
<add> if !reflect.DeepEqual(prevSetupConfig, devices.lvmSetupConfig) {
<add> if !reflect.DeepEqual(prevSetupConfig, directLVMConfig{}) {
<add> return errors.New("changing direct-lvm config is not supported")
<add> }
<add> logrus.WithField("storage-driver", "devicemapper").WithField("direct-lvm-config", devices.lvmSetupConfig).Debugf("Setting up direct lvm mode")
<add> if err := verifyBlockDevice(devices.lvmSetupConfig.Device, lvmSetupConfigForce); err != nil {
<add> return err
<add> }
<add> if err := setupDirectLVM(devices.lvmSetupConfig); err != nil {
<add> return err
<add> }
<add> if err := writeLVMConfig(devices.root, devices.lvmSetupConfig); err != nil {
<add> return err
<add> }
<add> }
<add> devices.thinPoolDevice = "docker-thinpool"
<add> logrus.WithField("storage-driver", "devicemapper").Debugf("Setting dm.thinpooldev to %q", devices.thinPoolDevice)
<add> }
<ide>
<add> // Set the device prefix from the device id and inode of the docker root dir
<ide> st, err := os.Stat(devices.root)
<ide> if err != nil {
<ide> return fmt.Errorf("devmapper: Error looking up dir %s: %s", devices.root, err)
<ide> func NewDeviceSet(root string, doInit bool, options []string, uidMaps, gidMaps [
<ide> }
<ide>
<ide> foundBlkDiscard := false
<add> var lvmSetupConfig directLVMConfig
<ide> for _, option := range options {
<ide> key, val, err := parsers.ParseKeyValueOpt(option)
<ide> if err != nil {
<ide> func NewDeviceSet(root string, doInit bool, options []string, uidMaps, gidMaps [
<ide> return nil, err
<ide> }
<ide> devices.xfsNospaceRetries = val
<add> case "dm.directlvm_device":
<add> lvmSetupConfig.Device = val
<add> case "dm.directlvm_device_force":
<add> lvmSetupConfigForce, err = strconv.ParseBool(val)
<add> if err != nil {
<add> return nil, err
<add> }
<add> case "dm.thinp_percent":
<add> per, err := strconv.ParseUint(strings.TrimSuffix(val, "%"), 10, 32)
<add> if err != nil {
<add> return nil, errors.Wrapf(err, "could not parse `dm.thinp_percent=%s`", val)
<add> }
<add> if per >= 100 {
<add> return nil, errors.New("dm.thinp_percent must be greater than 0 and less than 100")
<add> }
<add> lvmSetupConfig.ThinpPercent = per
<add> case "dm.thinp_metapercent":
<add> per, err := strconv.ParseUint(strings.TrimSuffix(val, "%"), 10, 32)
<add> if err != nil {
<add> return nil, errors.Wrapf(err, "could not parse `dm.thinp_metapercent=%s`", val)
<add> }
<add> if per >= 100 {
<add> return nil, errors.New("dm.thinp_metapercent must be greater than 0 and less than 100")
<add> }
<add> lvmSetupConfig.ThinpMetaPercent = per
<add> case "dm.thinp_autoextend_percent":
<add> per, err := strconv.ParseUint(strings.TrimSuffix(val, "%"), 10, 32)
<add> if err != nil {
<add> return nil, errors.Wrapf(err, "could not parse `dm.thinp_autoextend_percent=%s`", val)
<add> }
<add> if per > 100 {
<add> return nil, errors.New("dm.thinp_autoextend_percent must be greater than 0 and less than 100")
<add> }
<add> lvmSetupConfig.AutoExtendPercent = per
<add> case "dm.thinp_autoextend_threshold":
<add> per, err := strconv.ParseUint(strings.TrimSuffix(val, "%"), 10, 32)
<add> if err != nil {
<add> return nil, errors.Wrapf(err, "could not parse `dm.thinp_autoextend_threshold=%s`", val)
<add> }
<add> if per > 100 {
<add> return nil, errors.New("dm.thinp_autoextend_threshold must be greater than 0 and less than 100")
<add> }
<add> lvmSetupConfig.AutoExtendThreshold = per
<ide> default:
<ide> return nil, fmt.Errorf("devmapper: Unknown option %s\n", key)
<ide> }
<ide> }
<ide>
<add> if err := validateLVMConfig(lvmSetupConfig); err != nil {
<add> return nil, err
<add> }
<add>
<add> devices.lvmSetupConfig = lvmSetupConfig
<add>
<ide> // By default, don't do blk discard hack on raw devices, its rarely useful and is expensive
<ide> if !foundBlkDiscard && (devices.dataDevice != "" || devices.thinPoolDevice != "") {
<ide> devices.doBlkDiscard = false
<ide><path>docs/reference/commandline/dockerd.md
<ide> not use loopback in production. Ensure your Engine daemon has a
<ide> $ sudo dockerd --storage-opt dm.thinpooldev=/dev/mapper/thin-pool
<ide> ```
<ide>
<add>##### `dm.directlvm_device`
<add>
<add>As an alternative to providing a thin pool as above, Docker can setup a block
<add>device for you.
<add>
<add>###### Example:
<add>
<add>```bash
<add>$ sudo dockerd --storage-opt dm.directlvm_device=/dev/xvdf
<add>```
<add>
<add>##### `dm.thinp_percent`
<add>
<add>Sets the percentage of passed in block device to use for storage.
<add>
<add>###### Example:
<add>
<add>```bash
<add>$ sudo dockerd --storage-opt dm.thinp_percent=95
<add>```
<add>
<add>##### `dm.thinp_metapercent`
<add>
<add>Sets the percentage of the passed in block device to use for metadata storage.
<add>
<add>###### Example:
<add>
<add>```bash
<add>$ sudo dockerd --storage-opt dm.thinp_metapercent=1
<add>```
<add>
<add>##### `dm.thinp_autoextend_threshold`
<add>
<add>Sets the value of the percentage of space used before `lvm` attempts to
<add>autoextend the available space [100 = disabled]
<add>
<add>###### Example:
<add>
<add>```bash
<add>$ sudo dockerd --storage-opt dm.thinp_autoextend_threshold=80
<add>```
<add>
<add>##### `dm.thinp_autoextend_percent`
<add>
<add>Sets the value percentage value to increase the thin pool by when when `lvm`
<add>attempts to autoextend the available space [100 = disabled]
<add>
<add>###### Example:
<add>
<add>```bash
<add>$ sudo dockerd --storage-opt dm.thinp_autoextend_percent=20
<add>```
<add>
<add>
<ide> ##### `dm.basesize`
<ide>
<ide> Specifies the size to use when creating the base device, which limits the
<ide><path>man/dockerd.8.md
<ide> Example use:
<ide> $ dockerd \
<ide> --storage-opt dm.thinpooldev=/dev/mapper/thin-pool
<ide>
<add>#### dm.directlvm_device
<add>
<add>As an alternative to manually creating a thin pool as above, Docker can
<add>automatically configure a block device for you.
<add>
<add>Example use:
<add>
<add> $ dockerd \
<add> --storage-opt dm.directlvm_device=/dev/xvdf
<add>
<add>##### dm.thinp_percent
<add>
<add>Sets the percentage of passed in block device to use for storage.
<add>
<add>###### Example:
<add>
<add> $ sudo dockerd \
<add> --storage-opt dm.thinp_percent=95
<add>
<add>##### `dm.thinp_metapercent`
<add>
<add>Sets the percentage of the passed in block device to use for metadata storage.
<add>
<add>###### Example:
<add>
<add> $ sudo dockerd \
<add> --storage-opt dm.thinp_metapercent=1
<add>
<add>##### dm.thinp_autoextend_threshold
<add>
<add>Sets the value of the percentage of space used before `lvm` attempts to
<add>autoextend the available space [100 = disabled]
<add>
<add>###### Example:
<add>
<add> $ sudo dockerd \
<add> --storage-opt dm.thinp_autoextend_threshold=80
<add>
<add>##### dm.thinp_autoextend_percent
<add>
<add>Sets the value percentage value to increase the thin pool by when when `lvm`
<add>attempts to autoextend the available space [100 = disabled]
<add>
<add>###### Example:
<add>
<add> $ sudo dockerd \
<add> --storage-opt dm.thinp_autoextend_percent=20
<add>
<ide> #### dm.basesize
<ide>
<ide> Specifies the size to use when creating the base device, which limits | 5 |
Javascript | Javascript | restore properties correctly | 1f8c70bd0af18afbfb20517b76c8752625872e3f | <ide><path>packages/ember-metal/lib/watch_key.js
<ide> export function watchKey(obj, keyName, meta) {
<ide>
<ide>
<ide> if (isEnabled('mandatory-setter')) {
<add> // Future traveler, although this code looks scary. It merely exists in
<add> // development to aid in development asertions. Production builds of
<add> // ember strip this entire block out
<ide> handleMandatorySetter = function handleMandatorySetter(m, obj, keyName) {
<ide> let descriptor = lookupDescriptor(obj, keyName);
<ide> var configurable = descriptor ? descriptor.configurable : true;
<ide> export function unwatchKey(obj, keyName, meta) {
<ide> m.writeWatching(keyName, 0);
<ide>
<ide> var possibleDesc = obj[keyName];
<del> var desc = (possibleDesc !== null && typeof possibleDesc === 'object' && possibleDesc.isDescriptor) ? possibleDesc : undefined;
<add> var desc = (possibleDesc !== null &&
<add> typeof possibleDesc === 'object' &&
<add> possibleDesc.isDescriptor) ? possibleDesc : undefined;
<add>
<ide> if (desc && desc.didUnwatch) { desc.didUnwatch(obj, keyName); }
<ide>
<ide> if ('function' === typeof obj.didUnwatchProperty) {
<ide> export function unwatchKey(obj, keyName, meta) {
<ide> // that occurs, and attempt to provide more helpful feedback. The alternative
<ide> // is tricky to debug partially observable properties.
<ide> if (!desc && keyName in obj) {
<del> var maybeMandatoryDescriptor = lookupDescriptor(obj, keyName);
<add> let maybeMandatoryDescriptor = lookupDescriptor(obj, keyName);
<add>
<ide> if (maybeMandatoryDescriptor.set && maybeMandatoryDescriptor.set.isMandatorySetter) {
<add> let isEnumerable = Object.prototype.propertyIsEnumerable.call(obj, keyName);
<ide> Object.defineProperty(obj, keyName, {
<ide> configurable: true,
<del> enumerable: Object.prototype.propertyIsEnumerable.call(obj, keyName),
<del> set(val) {
<del> // redefine to set as enumerable
<del> Object.defineProperty(obj, keyName, {
<del> configurable: true,
<del> writable: true,
<del> enumerable: true,
<del> value: val
<del> });
<del> m.deleteFromValues(keyName);
<del> },
<del> get: DEFAULT_GETTER_FUNCTION(keyName)
<add> enumerable: isEnumerable,
<add> writable: true,
<add> value: m.peekValues(keyName)
<ide> });
<add> m.deleteFromValues(keyName);
<ide> }
<ide> }
<ide> }
<ide><path>packages/ember-metal/tests/accessors/mandatory_setters_test.js
<ide> import { meta as metaFor } from 'ember-metal/meta';
<ide> QUnit.module('mandatory-setters');
<ide>
<ide> function hasMandatorySetter(object, property) {
<add> try {
<add> return Object.getOwnPropertyDescriptor(object, property).set.isMandatorySetter === true;
<add> } catch(e) {
<add> return false;
<add> }
<add>}
<add>
<add>function hasMetaValue(object, property) {
<ide> return metaFor(object).hasInValues(property);
<ide> }
<ide>
<ide> if (isEnabled('mandatory-setter')) {
<ide> ok(!(hasMandatorySetter(obj, 'someProp')), 'blastix');
<ide> });
<ide>
<add> QUnit.test('ensure after watch the property is restored (and the value is no-longer stored in meta) [non-enumerable]', function() {
<add> var obj = {
<add> someProp: null,
<add> toString() {
<add> return 'custom-object';
<add> }
<add> };
<add>
<add> Object.defineProperty(obj, 'someProp', {
<add> configurable: true,
<add> enumerable: false,
<add> value: 'blastix'
<add> });
<add>
<add> watch(obj, 'someProp');
<add> equal(hasMandatorySetter(obj, 'someProp'), true, 'should have a mandatory setter');
<add>
<add> let descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(descriptor.enumerable, false, 'property should remain non-enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(obj.someProp, 'blastix', 'expected value to be the getter');
<add>
<add> equal(descriptor.value, undefined, 'expected existing value to NOT remain');
<add>
<add> ok(hasMetaValue(obj, 'someProp'), 'someProp is stored in meta.values');
<add>
<add> unwatch(obj, 'someProp');
<add>
<add> ok(!hasMetaValue(obj, 'someProp'), 'someProp is no longer stored in meta.values');
<add>
<add> descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(hasMandatorySetter(obj, 'someProp'), false, 'should no longer have a mandatory setter');
<add>
<add> equal(descriptor.enumerable, false, 'property should remain non-enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(obj.someProp, 'blastix', 'expected value to be the getter');
<add> equal(descriptor.value, 'blastix', 'expected existing value to remain');
<add>
<add> obj.someProp = 'new value';
<add>
<add> // make sure the descriptor remains correct (nothing funky, like a redefine, happened in the setter);
<add> descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(descriptor.enumerable, false, 'property should remain non-enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(descriptor.value, 'new value', 'expected existing value to NOT remain');
<add> equal(obj.someProp, 'new value', 'expected value to be the getter');
<add> equal(obj.someProp, 'new value');
<add> });
<add>
<add> QUnit.test('ensure after watch the property is restored (and the value is no-longer stored in meta) [enumerable]', function() {
<add> var obj = {
<add> someProp: null,
<add> toString() {
<add> return 'custom-object';
<add> }
<add> };
<add>
<add> Object.defineProperty(obj, 'someProp', {
<add> configurable: true,
<add> enumerable: true,
<add> value: 'blastix'
<add> });
<add>
<add> watch(obj, 'someProp');
<add> equal(hasMandatorySetter(obj, 'someProp'), true, 'should have a mandatory setter');
<add>
<add> let descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(descriptor.enumerable, true, 'property should remain enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(obj.someProp, 'blastix', 'expected value to be the getter');
<add>
<add> equal(descriptor.value, undefined, 'expected existing value to NOT remain');
<add>
<add> ok(hasMetaValue(obj, 'someProp'), 'someProp is stored in meta.values');
<add>
<add> unwatch(obj, 'someProp');
<add>
<add> ok(!hasMetaValue(obj, 'someProp'), 'someProp is no longer stored in meta.values');
<add>
<add> descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(hasMandatorySetter(obj, 'someProp'), false, 'should no longer have a mandatory setter');
<add>
<add> equal(descriptor.enumerable, true, 'property should remain enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(obj.someProp, 'blastix', 'expected value to be the getter');
<add> equal(descriptor.value, 'blastix', 'expected existing value to remain');
<add>
<add> obj.someProp = 'new value';
<add>
<add> // make sure the descriptor remains correct (nothing funky, like a redefine, happened in the setter);
<add> descriptor = Object.getOwnPropertyDescriptor(obj, 'someProp');
<add>
<add> equal(descriptor.enumerable, true, 'property should remain enumerable');
<add> equal(descriptor.configurable, true, 'property should remain configurable');
<add> equal(descriptor.value, 'new value', 'expected existing value to NOT remain');
<add> equal(obj.someProp, 'new value');
<add> });
<add>
<ide> QUnit.test('sets up mandatory-setter if property comes from prototype', function() {
<ide> expect(2);
<ide> | 2 |
Text | Text | add a missing period | 11b308adc399277a4407c44ccecf8b2f392f0f39 | <ide><path>curriculum/challenges/english/05-back-end-development-and-apis/managing-packages-with-npm/remove-a-package-from-your-dependencies.md
<ide> You have now tested a few ways you can manage dependencies of your project by us
<ide>
<ide> But what if you want to remove an external package that you no longer need? You might already have guessed it, just remove the corresponding key-value pair for that package from your dependencies.
<ide>
<del>This same method applies to removing other fields in your package.json as well
<add>This same method applies to removing other fields in your package.json as well.
<ide>
<ide> # --instructions--
<ide> | 1 |
Python | Python | fix typo at datasets/cifar10.py | d47d046ec69105dc814a014263037e6b1e21daa5 | <ide><path>slim/datasets/cifar10.py
<ide> """Provides data for the Cifar10 dataset.
<ide>
<ide> The dataset scripts used to create the dataset can be found at:
<del>tensorflow/models/slim/data/create_cifar10_dataset.py
<add>tensorflow/models/slim/datasets/download_and_convert_cifar10.py
<ide> """
<ide>
<ide> from __future__ import absolute_import | 1 |
Ruby | Ruby | ask the formula object for the opt path | 87e77350125602e6114e43ccc349f0287f8f5349 | <ide><path>Library/Homebrew/extend/ENV/shared.rb
<ide> def gcc_version_formula(name)
<ide> version = name[GNU_GCC_REGEXP, 1]
<ide> gcc_version_name = "gcc#{version.delete('.')}"
<ide>
<del> if HOMEBREW_PREFIX.join("opt", "gcc", "bin", name).exist?
<del> Formulary.factory("gcc")
<add> gcc = Formulary.factory("gcc")
<add> if gcc.opt_bin.join(name).exist?
<add> gcc
<ide> else
<ide> Formulary.factory(gcc_version_name)
<ide> end | 1 |
Python | Python | revise backend tests for dtype | d789bd94a3a1bdc18390b8bff742936dcc70c54f | <ide><path>tests/keras/backend/backend_test.py
<ide> import warnings
<ide>
<ide> from keras import backend as K
<del>from keras.backend import floatx, set_floatx, variable
<ide> from keras.utils.conv_utils import convert_kernel
<ide> from keras.backend import numpy_backend as KNP
<ide>
<ide>
<ide>
<ide> def check_dtype(var, dtype):
<del> if K.backend() == 'theano':
<del> assert var.dtype == dtype
<add> if K.backend() == 'tensorflow':
<add> assert dtype in str(var.dtype.name)
<ide> else:
<del> assert var.dtype.name == '%s_ref' % dtype
<add> assert dtype in str(var.dtype)
<ide>
<ide>
<ide> def cntk_func_tensors(function_name, shapes_or_vals, **kwargs):
<ide> def test_in_test_phase(self, training):
<ide> check_two_tensor_operation('in_test_phase', (2, 3), (2, 3), WITH_NP,
<ide> training=training)
<ide>
<del> def test_setfloatx_incorrect_values(self):
<add> @pytest.mark.parametrize('dtype', ['', 'beerfloat', 123])
<add> def test_setfloatx_incorrect_values(self, dtype):
<ide> # Keep track of the old value
<del> old_floatx = floatx()
<del> # Try some incorrect values
<del> initial = floatx()
<del> for value in ['', 'beerfloat', 123]:
<del> with pytest.raises(ValueError):
<del> set_floatx(value)
<del> assert floatx() == initial
<del> # Restore old value
<del> set_floatx(old_floatx)
<add> old_floatx = K.floatx()
<add> with pytest.raises(ValueError):
<add> K.set_floatx(dtype)
<add> assert K.floatx() == old_floatx
<ide>
<del> def test_setfloatx_correct_values(self):
<add> @pytest.mark.parametrize('dtype', ['float16', 'float32', 'float64'])
<add> def test_setfloatx_correct_values(self, dtype):
<ide> # Keep track of the old value
<del> old_floatx = floatx()
<add> old_floatx = K.floatx()
<ide> # Check correct values
<del> for value in ['float16', 'float32', 'float64']:
<del> set_floatx(value)
<del> assert floatx() == value
<add> K.set_floatx(dtype)
<add> assert K.floatx() == dtype
<add> # Make sure that changes to the global floatx are effectively
<add> # taken into account by the backend.
<add> check_dtype(K.variable([10]), dtype)
<ide> # Restore old value
<del> set_floatx(old_floatx)
<del>
<del> @pytest.mark.skipif((K.backend() == 'cntk'),
<del> reason='cntk does not support float16')
<del> def test_set_floatx(self):
<del> """
<del> Make sure that changes to the global floatx are effectively
<del> taken into account by the backend.
<del> """
<del> # Keep track of the old value
<del> old_floatx = floatx()
<del>
<del> set_floatx('float16')
<del> var = variable([10])
<del> check_dtype(var, 'float16')
<add> K.set_floatx(old_floatx)
<ide>
<del> set_floatx('float64')
<del> var = variable([10])
<del> check_dtype(var, 'float64')
<del>
<del> # Restore old value
<del> set_floatx(old_floatx)
<del>
<del> def test_dtype(self):
<del> assert K.dtype(K.variable(1, dtype='float64')) == 'float64'
<del> assert K.dtype(K.variable(1, dtype='float32')) == 'float32'
<del> assert K.dtype(K.variable(1, dtype='float16')) == 'float16'
<add> @pytest.mark.parametrize('dtype', ['float16', 'float32', 'float64'])
<add> def test_dtype(self, dtype):
<add> assert K.dtype(K.variable(1, dtype=dtype)) == dtype
<ide>
<add> @pytest.mark.skipif(K.backend() == 'cntk', reason='Not supported')
<ide> def test_variable_support_bool_dtype(self):
<del> # Github issue: 7819
<del> if K.backend() == 'tensorflow':
<del> assert K.dtype(K.variable(1, dtype='int16')) == 'int16'
<del> assert K.dtype(K.variable(False, dtype='bool')) == 'bool'
<del> with pytest.raises(TypeError):
<del> K.variable('', dtype='unsupported')
<add> assert K.dtype(K.variable(1, dtype='int16')) == 'int16'
<add> assert K.dtype(K.variable(False, dtype='bool')) == 'bool'
<add> with pytest.raises(TypeError):
<add> K.variable('', dtype='unsupported')
<ide>
<ide> @pytest.mark.parametrize('shape', [(4, 2), (2, 3)])
<ide> def test_clip_supports_tensor_arguments(self, shape): | 1 |
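The revised `check_dtype` helper relies on a substring match against the stringified dtype so a single check works on every backend. A minimal stand-in sketch (pure Python; `FakeVariable` is a hypothetical placeholder for a real backend tensor) showing why the substring test is backend-agnostic:

```python
class FakeVariable:
    """Hypothetical stand-in for a backend variable; real backends expose
    a dtype whose string form contains the dtype name."""
    def __init__(self, dtype_str):
        self.dtype = dtype_str  # e.g. 'float32', or a decorated form like 'float32_ref'

def check_dtype(var, dtype):
    # Substring match tolerates backend decorations such as a '_ref' suffix.
    return dtype in str(var.dtype)

assert check_dtype(FakeVariable('float32'), 'float32')
assert check_dtype(FakeVariable('float32_ref'), 'float32')  # decorated name still matches
assert not check_dtype(FakeVariable('float64'), 'float32')
```

The exact-equality checks it replaces would reject decorated dtype names, which is what made the old helper backend-specific.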
Java | Java | add single.delay overload that delays errors | ad8ca8e587be6f7836e5c76707248ac955af54e3 | <ide><path>src/main/java/io/reactivex/Single.java
<ide> public final Flowable<T> concatWith(SingleSource<? extends T> other) {
<ide> }
<ide>
<ide> /**
<del> * Delays the emission of the success or error signal from the current Single by
<del> * the specified amount.
<add> * Delays the emission of the success signal from the current Single by the specified amount.
<add> * An error signal will not be delayed.
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code delay} operates by default on the {@code computation} {@link Scheduler}.</dd>
<ide> * </dl>
<ide> *
<del> * @param time the time amount to delay the signals
<add> * @param time the amount of time the success signal should be delayed for
<ide> * @param unit the time unit
<ide> * @return the new Single instance
<ide> * @since 2.0
<ide> */
<ide> @CheckReturnValue
<ide> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<ide> public final Single<T> delay(long time, TimeUnit unit) {
<del> return delay(time, unit, Schedulers.computation());
<add> return delay(time, unit, Schedulers.computation(), false);
<add> }
<add>
<add> /**
<add> * Delays the emission of the success or error signal from the current Single by the specified amount.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>{@code delay} operates by default on the {@code computation} {@link Scheduler}.</dd>
<add> * </dl>
<add> *
<add> * @param time the amount of time the success or error signal should be delayed for
<add> * @param unit the time unit
<add> * @param delayError if true, both success and error signals are delayed; if false, only success signals are delayed.
<add> * @return the new Single instance
<add> * @since 2.1.5 - experimental
<add> */
<add> @Experimental
<add> @CheckReturnValue
<add> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<add> public final Single<T> delay(long time, TimeUnit unit, boolean delayError) {
<add> return delay(time, unit, Schedulers.computation(), delayError);
<ide> }
<ide>
<ide> /**
<ide> * Delays the emission of the success signal from the current Single by the specified amount.
<add> * An error signal will not be delayed.
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>you specify the {@link Scheduler} where the non-blocking wait and emission happens</dd>
<ide> * </dl>
<ide> *
<del> * @param time the time amount to delay the emission of the success signal
<add> * @param time the amount of time the success signal should be delayed for
<ide> * @param unit the time unit
<ide> * @param scheduler the target scheduler to use for the non-blocking wait and emission
<ide> * @return the new Single instance
<ide> public final Single<T> delay(long time, TimeUnit unit) {
<ide> @CheckReturnValue
<ide> @SchedulerSupport(SchedulerSupport.CUSTOM)
<ide> public final Single<T> delay(final long time, final TimeUnit unit, final Scheduler scheduler) {
<add> return delay(time, unit, scheduler, false);
<add> }
<add>
<add> /**
<add> * Delays the emission of the success or error signal from the current Single by the specified amount.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>you specify the {@link Scheduler} where the non-blocking wait and emission happens</dd>
<add> * </dl>
<add> *
<add> * @param time the amount of time the success or error signal should be delayed for
<add> * @param unit the time unit
<add> * @param scheduler the target scheduler to use for the non-blocking wait and emission
<add> * @param delayError if true, both success and error signals are delayed; if false, only success signals are delayed.
<add> * @return the new Single instance
<add> * @throws NullPointerException
<add> * if unit is null, or
<add> * if scheduler is null
<add> * @since 2.1.5 - experimental
<add> */
<add> @Experimental
<add> @CheckReturnValue
<add> @SchedulerSupport(SchedulerSupport.CUSTOM)
<add> public final Single<T> delay(final long time, final TimeUnit unit, final Scheduler scheduler, boolean delayError) {
<ide> ObjectHelper.requireNonNull(unit, "unit is null");
<ide> ObjectHelper.requireNonNull(scheduler, "scheduler is null");
<del> return RxJavaPlugins.onAssembly(new SingleDelay<T>(this, time, unit, scheduler));
<add> return RxJavaPlugins.onAssembly(new SingleDelay<T>(this, time, unit, scheduler, delayError));
<ide> }
<ide>
<ide> /**
<ide><path>src/main/java/io/reactivex/internal/operators/single/SingleDelay.java
<ide>
<ide> public final class SingleDelay<T> extends Single<T> {
<ide>
<del>
<ide> final SingleSource<? extends T> source;
<ide> final long time;
<ide> final TimeUnit unit;
<ide> final Scheduler scheduler;
<add> final boolean delayError;
<ide>
<del> public SingleDelay(SingleSource<? extends T> source, long time, TimeUnit unit, Scheduler scheduler) {
<add> public SingleDelay(SingleSource<? extends T> source, long time, TimeUnit unit, Scheduler scheduler, boolean delayError) {
<ide> this.source = source;
<ide> this.time = time;
<ide> this.unit = unit;
<ide> this.scheduler = scheduler;
<add> this.delayError = delayError;
<ide> }
<ide>
<ide> @Override
<ide> public void onSuccess(final T value) {
<ide>
<ide> @Override
<ide> public void onError(final Throwable e) {
<del> sd.replace(scheduler.scheduleDirect(new OnError(e), 0, unit));
<add> sd.replace(scheduler.scheduleDirect(new OnError(e), delayError ? time : 0, unit));
<ide> }
<ide>
<ide> final class OnSuccess implements Runnable {
<ide><path>src/test/java/io/reactivex/ParamValidationCheckerTest.java
<ide> public void checkParallelFlowable() {
<ide>
<ide> // negative time is considered as zero time
<ide> addOverride(new ParamOverride(Single.class, 0, ParamMode.ANY, "delay", Long.TYPE, TimeUnit.class));
<add> addOverride(new ParamOverride(Single.class, 0, ParamMode.ANY, "delay", Long.TYPE, TimeUnit.class, Boolean.TYPE));
<ide> addOverride(new ParamOverride(Single.class, 0, ParamMode.ANY, "delay", Long.TYPE, TimeUnit.class, Scheduler.class));
<add> addOverride(new ParamOverride(Single.class, 0, ParamMode.ANY, "delay", Long.TYPE, TimeUnit.class, Scheduler.class, Boolean.TYPE));
<ide>
<ide>
<ide> // zero repeat is allowed
<ide><path>src/test/java/io/reactivex/internal/operators/single/SingleDelayTest.java
<ide> import io.reactivex.exceptions.TestException;
<ide> import io.reactivex.functions.*;
<ide> import io.reactivex.internal.subscriptions.BooleanSubscription;
<add>import io.reactivex.observers.TestObserver;
<ide> import io.reactivex.plugins.RxJavaPlugins;
<ide> import io.reactivex.schedulers.Schedulers;
<add>import io.reactivex.schedulers.TestScheduler;
<ide> import io.reactivex.subjects.PublishSubject;
<ide>
<ide> public class SingleDelayTest {
<ide> @Test
<del> public void delay() throws Exception {
<del> final AtomicInteger value = new AtomicInteger();
<add> public void delayOnSuccess() {
<add> final TestScheduler scheduler = new TestScheduler();
<add> final TestObserver<Integer> observer = Single.just(1)
<add> .delay(5, TimeUnit.SECONDS, scheduler)
<add> .test();
<ide>
<del> Single.just(1).delay(200, TimeUnit.MILLISECONDS)
<del> .subscribe(new BiConsumer<Integer, Throwable>() {
<del> @Override
<del> public void accept(Integer v, Throwable e) throws Exception {
<del> value.set(v);
<del> }
<del> });
<add> scheduler.advanceTimeTo(2, TimeUnit.SECONDS);
<add> observer.assertNoValues();
<ide>
<del> Thread.sleep(100);
<add> scheduler.advanceTimeTo(5, TimeUnit.SECONDS);
<add> observer.assertValue(1);
<add> }
<add>
<add> @Test
<add> public void delayOnError() {
<add> final TestScheduler scheduler = new TestScheduler();
<add> final TestObserver<?> observer = Single.error(new TestException())
<add> .delay(5, TimeUnit.SECONDS, scheduler)
<add> .test();
<add>
<add> scheduler.triggerActions();
<add> observer.assertError(TestException.class);
<add> }
<ide>
<del> assertEquals(0, value.get());
<add> @Test
<add> public void delayedErrorOnSuccess() {
<add> final TestScheduler scheduler = new TestScheduler();
<add> final TestObserver<Integer> observer = Single.just(1)
<add> .delay(5, TimeUnit.SECONDS, scheduler, true)
<add> .test();
<ide>
<del> Thread.sleep(200);
<add> scheduler.advanceTimeTo(2, TimeUnit.SECONDS);
<add> observer.assertNoValues();
<ide>
<del> assertEquals(1, value.get());
<add> scheduler.advanceTimeTo(5, TimeUnit.SECONDS);
<add> observer.assertValue(1);
<ide> }
<ide>
<ide> @Test
<del> public void delayError() {
<del> Single.error(new TestException()).delay(5, TimeUnit.SECONDS)
<del> .test()
<del> .awaitDone(1, TimeUnit.SECONDS)
<del> .assertFailure(TestException.class);
<add> public void delayedErrorOnError() {
<add> final TestScheduler scheduler = new TestScheduler();
<add> final TestObserver<?> observer = Single.error(new TestException())
<add> .delay(5, TimeUnit.SECONDS, scheduler, true)
<add> .test();
<add>
<add> scheduler.advanceTimeTo(2, TimeUnit.SECONDS);
<add> observer.assertNoErrors();
<add>
<add> scheduler.advanceTimeTo(5, TimeUnit.SECONDS);
<add> observer.assertError(TestException.class);
<ide> }
<ide>
<ide> @Test | 4 |
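The new `delayError` flag only changes the error path: a success signal is always delayed, while an error is delayed only when the flag is set (`delayError ? time : 0`). A toy synchronous sketch of that scheduling decision (not RxJava; the function name here is illustrative):

```python
def schedule_delay(signal, time, delay_error):
    """Return (signal, delay) the way the delay operator schedules it:
    success is always delayed; error is delayed only if delay_error."""
    if signal == 'success':
        return (signal, time)
    return (signal, time if delay_error else 0)

assert schedule_delay('success', 5, False) == ('success', 5)
assert schedule_delay('error', 5, False) == ('error', 0)   # errors pass through immediately
assert schedule_delay('error', 5, True) == ('error', 5)    # errors delayed too
```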
Python | Python | handle complex special values and negative zero | 0d3aeb0dde468a15ad15a44017842115bc3313e4 | <ide><path>numpy/testing/tests/test_utils.py
<ide> def test_non_numeric(self):
<ide> self._assert_func('ab', 'ab')
<ide> self._test_not_equal('ab', 'abb')
<ide>
<add> def test_complex_item(self):
<add> self._assert_func(complex(1, 2), complex(1, 2))
<add> self._assert_func(complex(1, np.nan), complex(1, np.nan))
<add> self._test_not_equal(complex(1, np.nan), complex(1, 2))
<add> self._test_not_equal(complex(np.nan, 1), complex(1, np.nan))
<add> self._test_not_equal(complex(np.nan, np.inf), complex(np.nan, 2))
<add>
<add> def test_negative_zero(self):
<add> self._test_not_equal(np.PZERO, np.NZERO)
<add>
<add> def test_complex(self):
<add> x = np.array([complex(1, 2), complex(1, np.nan)])
<add> y = np.array([complex(1, 2), complex(1, 2)])
<add> self._assert_func(x, x)
<add> self._test_not_equal(x, y)
<add>
<ide> class TestArrayAlmostEqual(_GenericTest, unittest.TestCase):
<ide> def setUp(self):
<ide> self._assert_func = assert_array_almost_equal
<ide><path>numpy/testing/utils.py
<ide> def assert_equal(actual,desired,err_msg='',verbose=True):
<ide> for k in range(len(desired)):
<ide> assert_equal(actual[k], desired[k], 'item=%r\n%s' % (k,err_msg), verbose)
<ide> return
<del> from numpy.core import ndarray, isscalar
<add> from numpy.core import ndarray, isscalar, signbit
<add> from numpy.lib import iscomplexobj, real, imag
<ide> if isinstance(actual, ndarray) or isinstance(desired, ndarray):
<ide> return assert_array_equal(actual, desired, err_msg, verbose)
<ide> msg = build_err_msg([actual, desired], err_msg, verbose=verbose)
<ide>
<add> # Handle complex numbers: separate into real/imag to handle
<add> # nan/inf/negative zero correctly
<add> # XXX: catch ValueError for subclasses of ndarray where iscomplexobj fails
<ide> try:
<add> usecomplex = iscomplexobj(actual) or iscomplexobj(desired)
<add> except ValueError:
<add> usecomplex = False
<add>
<add> if usecomplex:
<add> if iscomplexobj(actual):
<add> actualr = real(actual)
<add> actuali = imag(actual)
<add> else:
<add> actualr = actual
<add> actuali = 0
<add> if iscomplexobj(desired):
<add> desiredr = real(desired)
<add> desiredi = imag(desired)
<add> else:
<add> desiredr = desired
<add> desiredi = 0
<add> try:
<add> assert_equal(actualr, desiredr)
<add> assert_equal(actuali, desiredi)
<add> except AssertionError:
<add> raise AssertionError("Items are not equal:\n" \
<add> "ACTUAL: %s\n" \
<add> "DESIRED: %s\n" % (str(actual), str(desired)))
<add>
<add> # Inf/nan/negative zero handling
<add> try:
<add> # isscalar test to check cases such as [np.nan] != np.nan
<add> if isscalar(desired) != isscalar(actual):
<add> raise AssertionError(msg)
<add>
<ide> # If one of desired/actual is not finite, handle it specially here:
<ide> # check that both are nan if any is a nan, and test for equality
<ide> # otherwise
<ide> if not (gisfinite(desired) and gisfinite(actual)):
<ide> isdesnan = gisnan(desired)
<ide> isactnan = gisnan(actual)
<ide> if isdesnan or isactnan:
<del> # isscalar test to check so that [np.nan] != np.nan
<del> if not (isdesnan and isactnan) \
<del> or (isscalar(isdesnan) != isscalar(isactnan)):
<add> if not (isdesnan and isactnan):
<ide> raise AssertionError(msg)
<ide> else:
<ide> if not desired == actual:
<ide> raise AssertionError(msg)
<ide> return
<add> elif desired == 0 and actual == 0:
<add> if not signbit(desired) == signbit(actual):
<add> raise AssertionError(msg)
<ide> # If TypeError or ValueError raised while using isnan and co, just handle
<ide> # as before
<ide> except TypeError:
<ide> def isnumber(x):
<ide> verbose=verbose, header=header,
<ide> names=('x', 'y'))
<ide> raise AssertionError(msg)
<add> # If only one item, it was a nan, so just return
<add> if x.size == y.size == 1:
<add> return
<ide> val = comparison(x[~xnanid], y[~ynanid])
<ide> else:
<ide> val = comparison(x,y) | 2 |
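The patch makes `assert_equal` treat `0.0` and `-0.0` as unequal via `signbit`, and match NaN only with NaN. A stdlib-only sketch of the same comparison rules, with `math.copysign` standing in for `numpy.signbit`:

```python
import math

def strict_float_equal(a, b):
    """Mirror assert_equal's special-value rules: NaN equals only NaN,
    and 0.0 != -0.0 (the sign bits must match)."""
    if math.isnan(a) or math.isnan(b):
        return math.isnan(a) and math.isnan(b)
    if a == b == 0.0:
        # copysign(1, x) is -1.0 for -0.0, so this distinguishes the two zeros
        return math.copysign(1.0, a) == math.copysign(1.0, b)
    return a == b

assert strict_float_equal(float('nan'), float('nan'))
assert not strict_float_equal(float('nan'), 1.0)
assert not strict_float_equal(0.0, -0.0)   # plain == would say they are equal
assert strict_float_equal(-0.0, -0.0)
assert strict_float_equal(1.5, 1.5)
```

Plain `==` cannot express either rule (`nan != nan` and `0.0 == -0.0` in IEEE 754), which is why the special-casing is needed.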
Go | Go | memoize seccomp value for sysinfo | df7031b669a9224b1409213fd01bee4722eae895 | <ide><path>pkg/sysinfo/sysinfo_linux.go
<ide> import (
<ide> "os"
<ide> "path"
<ide> "strings"
<add> "sync"
<ide>
<ide> "github.com/opencontainers/runc/libcontainer/cgroups"
<ide> "github.com/sirupsen/logrus"
<ide> func applyCgroupNsInfo(info *SysInfo, _ map[string]string) []string {
<ide> return warnings
<ide> }
<ide>
<add>var (
<add> seccompOnce sync.Once
<add> seccompEnabled bool
<add>)
<add>
<ide> // applySeccompInfo checks if Seccomp is supported, via CONFIG_SECCOMP.
<ide> func applySeccompInfo(info *SysInfo, _ map[string]string) []string {
<ide> var warnings []string
<del> // Check if Seccomp is supported, via CONFIG_SECCOMP.
<del> if err := unix.Prctl(unix.PR_GET_SECCOMP, 0, 0, 0, 0); err != unix.EINVAL {
<del> // Make sure the kernel has CONFIG_SECCOMP_FILTER.
<del> if err := unix.Prctl(unix.PR_SET_SECCOMP, unix.SECCOMP_MODE_FILTER, 0, 0, 0); err != unix.EINVAL {
<del> info.Seccomp = true
<add> seccompOnce.Do(func() {
<add> // Check if Seccomp is supported, via CONFIG_SECCOMP.
<add> if err := unix.Prctl(unix.PR_GET_SECCOMP, 0, 0, 0, 0); err != unix.EINVAL {
<add> // Make sure the kernel has CONFIG_SECCOMP_FILTER.
<add> if err := unix.Prctl(unix.PR_SET_SECCOMP, unix.SECCOMP_MODE_FILTER, 0, 0, 0); err != unix.EINVAL {
<add> seccompEnabled = true
<add> }
<ide> }
<del> }
<add> })
<add> info.Seccomp = seccompEnabled
<ide> return warnings
<ide> }
<ide> | 1 |
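The Go change wraps the seccomp probe in `sync.Once` so the prctl probing runs at most once per process. The same once-only memoization pattern in Python, using a module-level guard and a lock (a sketch; `probe()` stands in for the prctl calls):

```python
import threading

_once_lock = threading.Lock()
_probed = False
_seccomp_enabled = False
probe_calls = 0  # instrumentation to show the probe runs exactly once

def probe():
    global probe_calls
    probe_calls += 1
    return True  # pretend the kernel supports seccomp filtering

def seccomp_supported():
    global _probed, _seccomp_enabled
    with _once_lock:            # serializes racing callers, like sync.Once
        if not _probed:
            _seccomp_enabled = probe()
            _probed = True
    return _seccomp_enabled

assert seccomp_supported() is True
assert seccomp_supported() is True
assert probe_calls == 1  # memoized: the probe executed once despite two calls
```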
Go | Go | reuse a single devicesetdm for all the tests | 52294192b2a008b624862701cbf8491ad19b0798 | <ide><path>runtime_test.go
<ide> func TestRestore(t *testing.T) {
<ide>
<ide> // Here are are simulating a docker restart - that is, reloading all containers
<ide> // from scratch
<del> runtime2, err := NewRuntimeFromDirectory(runtime1.root, devmapper.NewDeviceSetDM(runtime1.root), false)
<add> runtime2, err := NewRuntimeFromDirectory(runtime1.root, runtime1.deviceSet, false)
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<ide><path>utils_test.go
<ide> package docker
<ide>
<ide> import (
<ide> "github.com/dotcloud/docker/utils"
<del> "github.com/dotcloud/docker/devmapper"
<add> "path/filepath"
<ide> "io"
<ide> "io/ioutil"
<ide> "os"
<ide> func newTestRuntime() (*Runtime, error) {
<ide> return nil, err
<ide> }
<ide>
<del> runtime, err := NewRuntimeFromDirectory(root, devmapper.NewDeviceSetDM(root), false)
<add> runtime, err := NewRuntimeFromDirectory(root, NewDeviceSetWrapper(globalRuntime.deviceSet, filepath.Base(root)), false)
<ide> if err != nil {
<ide> return nil, err
<ide> } | 2 |
Javascript | Javascript | handle goal reached 🎉 | 2e578a7af828c9ee199e6cf0eb17263336992486 | <ide><path>client/src/components/Supporters.js
<ide> const supportersLocale = supporterGoal.toLocaleString();
<ide>
<ide> function Supporters({ isDonating, activeDonations }) {
<ide> const donationsLocale = activeDonations.toLocaleString();
<add> const isGoalReached = activeDonations >= supporterGoal;
<ide> return (
<ide> <Fragment>
<ide> <FullWidthRow>
<ide> <h2>Support an open future.</h2>
<ide> </FullWidthRow>
<del> <FullWidthRow>
<del> <div id='supporter-progress-wrapper'>
<del> <ProgressBar max={10000} now={activeDonations} />
<del> <div id='progress-label-wrapper'>
<del> <span className='progress-label'>
<del> {donationsLocale} supporters out of {supportersLocale} supporter
<del> goal
<del> </span>
<add> {isGoalReached ? (
<add> <FullWidthRow>
<add> <Spacer />
<add> <p>
<add> 🎉 {donationsLocale} supporters help keep freeCodeCamp.org free to
<add> use
<add> </p>
<add> </FullWidthRow>
<add> ) : (
<add> <FullWidthRow>
<add> <div id='supporter-progress-wrapper'>
<add> <ProgressBar max={supporterGoal} now={activeDonations} />
<add> <div id='progress-label-wrapper'>
<add> <span className='progress-label'>
<add> {donationsLocale} supporters out of {supportersLocale} supporter
<add> goal
<add> </span>
<add> </div>
<ide> </div>
<del> </div>
<del> </FullWidthRow>
<add> </FullWidthRow>
<add> )}
<ide> <Spacer />
<ide> <FullWidthRow>
<ide> <b> | 1 |
Python | Python | set version to 2.1.0 | f0c1efcb00177345b8d08821ae24591707306164 | <ide><path>spacy/about.py
<ide> # fmt: off
<ide>
<ide> __title__ = "spacy"
<del>__version__ = "2.1.0.dev1"
<add>__version__ = "2.1.0"
<ide> __summary__ = "Industrial-strength Natural Language Processing (NLP) with Python and Cython"
<ide> __uri__ = "https://spacy.io"
<ide> __author__ = "Explosion AI" | 1 |
Javascript | Javascript | evaluate stuff for renaming to capture more cases | 23d28ddd43285392c4cbe6a79dc598619ac9fd7c | <ide><path>lib/Parser.js
<ide> Parser.prototype.initializeEvaluating = function() {
<ide> if(expr.operator == "&&") {
<ide> var left = this.evaluateExpression(expr.left);
<ide> var leftAsBool = left && left.asBool();
<del> if(leftAsBool === false) return new BasicEvaluatedExpression().setBoolean(false).setRange(expr.range);
<add> if(leftAsBool === false) return left.setRange(expr.range);
<ide> if(leftAsBool !== true) return;
<ide> var right = this.evaluateExpression(expr.right);
<del> var rightAsBool = right && right.asBool();
<del> if(typeof rightAsBool === "boolean")
<del> return new BasicEvaluatedExpression().setBoolean(rightAsBool).setRange(expr.range);
<add> return right.setRange(expr.range);
<ide> } else if(expr.operator == "||") {
<ide> var left = this.evaluateExpression(expr.left);
<ide> var leftAsBool = left && left.asBool();
<del> if(leftAsBool === true) return new BasicEvaluatedExpression().setBoolean(true).setRange(expr.range);
<add> if(leftAsBool === true) return left.setRange(expr.range);
<ide> if(leftAsBool !== false) return;
<ide> var right = this.evaluateExpression(expr.right);
<del> var rightAsBool = right && right.asBool();
<del> if(typeof rightAsBool === "boolean")
<del> return new BasicEvaluatedExpression().setBoolean(rightAsBool).setRange(expr.range);
<add> return right.setRange(expr.range);
<ide> }
<ide> });
<ide> this.plugin("evaluate BinaryExpression", function(expr) {
<ide> Parser.prototype.initializeEvaluating = function() {
<ide> return new BasicEvaluatedExpression().setItems(items).setRange(expr.range);
<ide> });
<ide> }
<add>
<add>Parser.prototype.getRenameIdentifier = function getRenameIdentifier(expr) {
<add> var result = this.evaluateExpression(expr);
<add> if(!result) return;
<add> if(result.isIdentifier()) return result.identifier;
<add> return;
<add>};
<add>
<ide> Parser.prototype.walkStatements = function walkStatements(statements) {
<ide> statements.forEach(function(statement) {
<ide> this.walkStatement(statement);
<ide> Parser.prototype.walkVariableDeclarators = function walkVariableDeclarators(decl
<ide> declarators.forEach(function(declarator) {
<ide> switch(declarator.type) {
<ide> case "VariableDeclarator":
<del> var initToIdentifier = declarator.init && declarator.init.type === "Identifier";
<del> if(initToIdentifier && declarator.id.type === "Identifier" && !this.applyPluginsBailResult("rename " + declarator.init.name, declarator.init)) {
<add> var renameIdentifier = declarator.init && this.getRenameIdentifier(declarator.init);
<add> if(renameIdentifier && declarator.id.type === "Identifier" && !this.applyPluginsBailResult("rename " + renameIdentifier, declarator.init)) {
<ide> // renaming with "var a = b;"
<del> this.scope.renames["$"+declarator.id.name] = this.scope.renames["$"+declarator.init.name] || declarator.init.name;
<add> this.scope.renames["$"+declarator.id.name] = this.scope.renames["$"+renameIdentifier] || renameIdentifier;
<ide> var idx = this.scope.definitions.indexOf(declarator.id.name);
<ide> if(idx >= 0) this.scope.definitions.splice(idx, 1);
<ide> } else if(declarator.id.type === "Identifier" && !this.applyPluginsBailResult("var " + declarator.id.name, declarator)) {
<ide> Parser.prototype.walkExpression = function walkExpression(expression) {
<ide> this.walkExpression(expression.right);
<ide> break;
<ide> case "AssignmentExpression":
<del> if(expression.left.type === "Identifier" && expression.right.type === "Identifier" && !this.applyPluginsBailResult("rename " + expression.right.name, expression.right)) {
<add> var renameIdentifier = this.getRenameIdentifier(expression.right);
<add> if(expression.left.type === "Identifier" && renameIdentifier && !this.applyPluginsBailResult("rename " + renameIdentifier, expression.right)) {
<ide> // renaming "a = b;"
<del> this.scope.renames["$"+expression.left.name] = expression.right.name;
<add> this.scope.renames["$"+expression.left.name] = renameIdentifier;
<ide> var idx = this.scope.definitions.indexOf(expression.left.name);
<ide> if(idx >= 0) this.scope.definitions.splice(idx, 1);
<ide> } else if(expression.left.type === "Identifier") {
<ide> Parser.prototype.walkExpression = function walkExpression(expression) {
<ide> function walkIIFE(functionExpression, args) {
<ide> var params = functionExpression.params;
<ide> var args = args.map(function(arg, idx) {
<del> var result = this.evaluateExpression(arg);
<del> if(result && !result.isIdentifier()) result = undefined;
<del> if(!result) {
<add> var renameIdentifier = this.getRenameIdentifier(arg);
<add> if(!renameIdentifier) {
<ide> this.walkExpression(arg);
<ide> return;
<del> } else if(this.applyPluginsBailResult("rename " + result.identifier, arg)) {
<add> } else if(this.applyPluginsBailResult("rename " + renameIdentifier, arg)) {
<ide> return;
<ide> }
<del> return result.identifier;
<add> return renameIdentifier;
<ide> }, this);
<ide> this.inScope(params.filter(function(identifier, idx) {
<ide> return !args[idx];
<ide><path>test/cases/parsing/renaming/index.js
<ide> it("should be able to rename require by var", function() {
<ide> var cjsRequire; // just to make it difficult
<del> var cjsRequire = require;
<add> var cjsRequire = require, cjsRequire2 = typeof require !== "undefined" && require;
<ide> cjsRequire("./file").should.be.eql("ok");
<add> cjsRequire2("./file").should.be.eql("ok");
<ide> });
<ide>
<ide> it("should be able to rename require by assign", function() {
<del> var cjsRequire;
<add> var cjsRequire, cjsRequire2;
<ide> (function() {
<ide> cjsRequire = require;
<add> cjsRequire2 = typeof require === "function" && require;
<ide> cjsRequire("./file").should.be.eql("ok");
<add> cjsRequire2("./file").should.be.eql("ok");
<ide> }());
<ide> });
<ide> | 2 |
Javascript | Javascript | fix exception when starting to drag text | e99545ee419838d56310c79806836c0e4136f3d0 | <ide><path>spec/dock-spec.js
<ide> describe('Dock', () => {
<ide> })
<ide> })
<ide>
<del> describe('when dragging an item over an empty dock', () => {
<del> it('has the preferred size of the item', () => {
<add> describe('drag handling', () => {
<add> it('expands docks to match the preferred size of the dragged item', () => {
<ide> jasmine.attachToDOM(atom.workspace.getElement())
<ide>
<del> const item = {
<del> element: document.createElement('div'),
<add> const element = document.createElement('div')
<add> element.setAttribute('is', 'tabs-tab')
<add> element.item = {
<add> element,
<ide> getDefaultLocation() { return 'left' },
<del> getPreferredWidth() { return 144 },
<del> serialize: () => ({deserializer: 'DockTestItem'})
<add> getPreferredWidth() { return 144 }
<ide> }
<del> const dock = atom.workspace.getLeftDock()
<del> const dockElement = dock.getElement()
<ide>
<del> dock.setDraggingItem(item)
<del> expect(dock.wrapperElement.offsetWidth).toBe(144)
<add> const dragEvent = new DragEvent('dragstart')
<add> Object.defineProperty(dragEvent, 'target', {value: element})
<add>
<add> atom.workspace.getElement().handleDragStart(dragEvent)
<add> expect(atom.workspace.getLeftDock().wrapperElement.offsetWidth).toBe(144)
<add> })
<add>
<add> it('does nothing when text nodes are dragged', () => {
<add> jasmine.attachToDOM(atom.workspace.getElement())
<add>
<add> const textNode = document.createTextNode('hello')
<add>
<add> const dragEvent = new DragEvent('dragstart')
<add> Object.defineProperty(dragEvent, 'target', {value: textNode})
<add>
<add> expect(() => atom.workspace.getElement().handleDragStart(dragEvent)).not.toThrow()
<ide> })
<ide> })
<ide> })
<ide><path>src/workspace-element.js
<ide> module.exports = document.registerElement('atom-workspace', {prototype: Workspac
<ide> function isTab (element) {
<ide> let el = element
<ide> while (el != null) {
<del> if (el.getAttribute('is') === 'tabs-tab') { return true }
<add> if (el.getAttribute && el.getAttribute('is') === 'tabs-tab') return true
<ide> el = el.parentElement
<ide> }
<ide> return false | 2 |
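The fix above guards the ancestor walk with `el.getAttribute &&` because text nodes reach the drag handler too, and a DOM `Text` node has no `getAttribute`. A rough Python analogue of that defensive walk — the `Element`/`TextNode` classes here are made up for illustration, not part of any real DOM library:

```python
class Element:
    """Stand-in for a DOM element that carries attributes."""
    def __init__(self, attrs=None, parent=None):
        self.attrs = attrs or {}
        self.parent = parent

    def get_attribute(self, name):
        return self.attrs.get(name)


class TextNode:
    """Stand-in for a DOM Text node: has a parent, but no get_attribute."""
    def __init__(self, parent=None):
        self.parent = parent


def is_tab(node):
    """Walk up the ancestor chain looking for an element with is="tabs-tab"."""
    el = node
    while el is not None:
        # Mirrors the `el.getAttribute && ...` guard: calling
        # get_attribute directly on a TextNode would raise AttributeError.
        getter = getattr(el, "get_attribute", None)
        if getter is not None and getter("is") == "tabs-tab":
            return True
        el = el.parent
    return False
```

Without the `getattr` guard, starting the walk from a `TextNode` would blow up exactly the way the original `handleDragStart` did.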
PHP | PHP | remove lies in doc blocks | 5f95a813c634288327c60ba548fc5d5ae6e2e3e1 | <ide><path>Cake/Cache/Engine/WincacheEngine.php
<ide> class WincacheEngine extends CacheEngine {
<ide> *
<ide> * @param array $config array of setting for the engine
<ide> * @return boolean True if the engine has been successfully initialized, false if not
<del> * @see CacheEngine::__defaults
<ide> */
<ide> public function init($config = []) {
<ide> if (!isset($config['prefix'])) { | 1 |
Go | Go | extract code from exportchanges to addtarfile() | 5b77e51e0a15eddefcb40380673df8c0c24f95d1 | <ide><path>archive/archive.go
<ide> import (
<ide> "path/filepath"
<ide> "strings"
<ide> "syscall"
<add> "time"
<ide> )
<ide>
<ide> type Archive io.Reader
<ide> func (compression *Compression) Extension() string {
<ide> return ""
<ide> }
<ide>
<add>func addTarFile(path, name string, tw *tar.Writer) error {
<add> var stat syscall.Stat_t
<add> if err := syscall.Lstat(path, &stat); err != nil {
<add> return err
<add> }
<add>
<add> mtim := getLastModification(&stat)
<add> atim := getLastAccess(&stat)
<add> hdr := &tar.Header{
<add> Name: name,
<add> Mode: int64(stat.Mode & 07777),
<add> Uid: int(stat.Uid),
<add> Gid: int(stat.Gid),
<add> ModTime: time.Unix(int64(mtim.Sec), int64(mtim.Nsec)),
<add> AccessTime: time.Unix(int64(atim.Sec), int64(atim.Nsec)),
<add> }
<add>
<add> if stat.Mode&syscall.S_IFDIR == syscall.S_IFDIR {
<add> hdr.Typeflag = tar.TypeDir
<add> } else if stat.Mode&syscall.S_IFLNK == syscall.S_IFLNK {
<add> hdr.Typeflag = tar.TypeSymlink
<add> if link, err := os.Readlink(path); err != nil {
<add> return err
<add> } else {
<add> hdr.Linkname = link
<add> }
<add> } else if stat.Mode&syscall.S_IFBLK == syscall.S_IFBLK ||
<add> stat.Mode&syscall.S_IFCHR == syscall.S_IFCHR {
<add> if stat.Mode&syscall.S_IFBLK == syscall.S_IFBLK {
<add> hdr.Typeflag = tar.TypeBlock
<add> } else {
<add> hdr.Typeflag = tar.TypeChar
<add> }
<add> hdr.Devmajor = int64(major(uint64(stat.Rdev)))
<add> hdr.Devminor = int64(minor(uint64(stat.Rdev)))
<add> } else if stat.Mode&syscall.S_IFIFO == syscall.S_IFIFO ||
<add> stat.Mode&syscall.S_IFSOCK == syscall.S_IFSOCK {
<add> hdr.Typeflag = tar.TypeFifo
<add> } else if stat.Mode&syscall.S_IFREG == syscall.S_IFREG {
<add> hdr.Typeflag = tar.TypeReg
<add> hdr.Size = stat.Size
<add> } else {
<add> return fmt.Errorf("Unknown file type: %s\n", path)
<add> }
<add>
<add> if err := tw.WriteHeader(hdr); err != nil {
<add> return err
<add> }
<add>
<add> if hdr.Typeflag == tar.TypeReg {
<add> if file, err := os.Open(path); err != nil {
<add> return err
<add> } else {
<add> _, err := io.Copy(tw, file)
<add> if err != nil {
<add> return err
<add> }
<add> file.Close()
<add> }
<add> }
<add>
<add> return nil
<add>}
<add>
<ide> func createTarFile(path, extractDir string, hdr *tar.Header, reader *tar.Reader) error {
<ide> switch hdr.Typeflag {
<ide> case tar.TypeDir:
<ide><path>archive/changes.go
<ide> func ExportChanges(dir string, changes []Change) (Archive, error) {
<ide> }
<ide> } else {
<ide> path := filepath.Join(dir, change.Path)
<del>
<del> var stat syscall.Stat_t
<del> if err := syscall.Lstat(path, &stat); err != nil {
<del> utils.Debugf("Can't stat source file: %s\n", err)
<del> continue
<del> }
<del>
<del> mtim := getLastModification(&stat)
<del> atim := getLastAccess(&stat)
<del> hdr := &tar.Header{
<del> Name: change.Path[1:],
<del> Mode: int64(stat.Mode & 07777),
<del> Uid: int(stat.Uid),
<del> Gid: int(stat.Gid),
<del> ModTime: time.Unix(int64(mtim.Sec), int64(mtim.Nsec)),
<del> AccessTime: time.Unix(int64(atim.Sec), int64(atim.Nsec)),
<del> }
<del>
<del> if stat.Mode&syscall.S_IFDIR == syscall.S_IFDIR {
<del> hdr.Typeflag = tar.TypeDir
<del> } else if stat.Mode&syscall.S_IFLNK == syscall.S_IFLNK {
<del> hdr.Typeflag = tar.TypeSymlink
<del> if link, err := os.Readlink(path); err != nil {
<del> utils.Debugf("Can't readlink source file: %s\n", err)
<del> continue
<del> } else {
<del> hdr.Linkname = link
<del> }
<del> } else if stat.Mode&syscall.S_IFBLK == syscall.S_IFBLK ||
<del> stat.Mode&syscall.S_IFCHR == syscall.S_IFCHR {
<del> if stat.Mode&syscall.S_IFBLK == syscall.S_IFBLK {
<del> hdr.Typeflag = tar.TypeBlock
<del> } else {
<del> hdr.Typeflag = tar.TypeChar
<del> }
<del> hdr.Devmajor = int64(major(uint64(stat.Rdev)))
<del> hdr.Devminor = int64(minor(uint64(stat.Rdev)))
<del> } else if stat.Mode&syscall.S_IFIFO == syscall.S_IFIFO ||
<del> stat.Mode&syscall.S_IFSOCK == syscall.S_IFSOCK {
<del> hdr.Typeflag = tar.TypeFifo
<del> } else if stat.Mode&syscall.S_IFREG == syscall.S_IFREG {
<del> hdr.Typeflag = tar.TypeReg
<del> hdr.Size = stat.Size
<del> } else {
<del> utils.Debugf("Unknown file type: %s\n", path)
<del> continue
<del> }
<del>
<del> if err := tw.WriteHeader(hdr); err != nil {
<del> utils.Debugf("Can't write tar header: %s\n", err)
<del> }
<del> if hdr.Typeflag == tar.TypeReg {
<del> if file, err := os.Open(path); err != nil {
<del> utils.Debugf("Can't open file: %s\n", err)
<del> } else {
<del> _, err := io.Copy(tw, file)
<del> if err != nil {
<del> utils.Debugf("Can't copy file: %s\n", err)
<del> }
<del> file.Close()
<del> }
<add> if err := addTarFile(path, change.Path[1:], tw); err != nil {
<add> utils.Debugf("Can't add file %s to tar: %s\n", path, err)
<ide> }
<ide> }
<ide> }
<add>
<ide> // Make sure to check the error on Close.
<ide> if err := tw.Close(); err != nil {
<ide> utils.Debugf("Can't close layer: %s\n", err) | 2 |
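The extracted `addTarFile` helper stats the path, builds a tar header by file type, and streams regular-file bodies after the header. A loose sketch of the same steps using Python's `tarfile` module — this is an analogy to the Go flow, not a translation of Docker's code:

```python
import os
import tarfile


def add_tar_file(path, name, tw):
    """Stat `path`, build a header named `name`, and append it to writer `tw`."""
    st = os.lstat(path)
    info = tarfile.TarInfo(name=name)
    info.mode = st.st_mode & 0o7777        # permission bits only, like stat.Mode & 07777
    info.uid, info.gid = st.st_uid, st.st_gid
    info.mtime = int(st.st_mtime)
    if os.path.islink(path):
        info.type = tarfile.SYMTYPE
        info.linkname = os.readlink(path)
        tw.addfile(info)                   # symlinks carry no body
    elif os.path.isdir(path):
        info.type = tarfile.DIRTYPE
        tw.addfile(info)
    else:
        info.type = tarfile.REGTYPE
        info.size = st.st_size
        with open(path, "rb") as f:        # header first, then the file body
            tw.addfile(info, f)
```

Keeping "build header, then copy body" in one helper is what lets `ExportChanges` shrink to a single `addTarFile` call per change.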
Text | Text | remove dynamic assetprefix from docs | 2a70b268c8184918bd217e78cc9449edaccb4224 | <ide><path>docs/advanced-features/custom-server.md
<ide> module.exports = {
<ide> > Note that `useFileSystemPublicRoutes` simply disables filename routes from SSR; client-side routing may still access those paths. When using this option, you should guard against navigation to routes you do not want programmatically.
<ide>
<ide> > You may also wish to configure the client-side Router to disallow client-side redirects to filename routes; for that refer to [`Router.beforePopState`](/docs/api-reference/next/router.md#router.beforePopState).
<del>
<del>## Dynamic `assetPrefix`
<del>
<del>Sometimes you may need to set the [`assetPrefix`](/docs/api-reference/next.config.js/cdn-support-with-asset-prefix.md) dynamically. This is useful when changing the `assetPrefix` based on incoming requests.
<del>
<del>For that, you can use `app.setAssetPrefix`, as in the following example:
<del>
<del>```js
<del>const next = require('next')
<del>const http = require('http')
<del>
<del>const dev = process.env.NODE_ENV !== 'production'
<del>const app = next({ dev })
<del>const handleNextRequests = app.getRequestHandler()
<del>
<del>app.prepare().then(() => {
<del> const server = new http.Server((req, res) => {
<del> // Add assetPrefix support based on the hostname
<del> if (req.headers.host === 'my-app.com') {
<del> app.setAssetPrefix('http://cdn.com/myapp')
<del> } else {
<del> app.setAssetPrefix('')
<del> }
<del>
<del> handleNextRequests(req, res)
<del> })
<del>
<del> server.listen(port, err => {
<del> if (err) {
<del> throw err
<del> }
<del>
<del> console.log(`> Ready on http://localhost:${port}`)
<del> })
<del>})
<del>``` | 1 |
Mixed | Ruby | remove deprecated assertion files | 92e27d30d8112962ee068f7b14aa7b10daf0c976 | <ide><path>actionpack/CHANGELOG.md
<add>* Remove deprecated assertion files.
<add>
<add> *Rafael Mendonça França*
<add>
<ide> * Remove deprecated usage of string keys in URL helpers.
<ide>
<ide> *Rafael Mendonça França*
<ide><path>actionpack/lib/action_dispatch/testing/assertions/dom.rb
<del>require 'active_support/deprecation'
<del>
<del>ActiveSupport::Deprecation.warn("ActionDispatch::Assertions::DomAssertions has been extracted to the rails-dom-testing gem.")
<ide>\ No newline at end of file
<ide><path>actionpack/lib/action_dispatch/testing/assertions/selector.rb
<del>require 'active_support/deprecation'
<del>
<del>ActiveSupport::Deprecation.warn("ActionDispatch::Assertions::SelectorAssertions has been extracted to the rails-dom-testing gem.")
<ide><path>actionpack/lib/action_dispatch/testing/assertions/tag.rb
<del>require 'active_support/deprecation'
<del>
<del>ActiveSupport::Deprecation.warn('`ActionDispatch::Assertions::TagAssertions` has been extracted to the rails-dom-testing gem.') | 4 |
Java | Java | adapt test to changes in rsocket java | 2002a1689a544bd49001062afcde7564ce010cea | <ide><path>spring-messaging/src/test/java/org/springframework/messaging/rsocket/DefaultRSocketRequesterBuilderTests.java
<ide> public Flux<ByteBuf> receive() {
<ide> return Flux.empty();
<ide> }
<ide>
<add> @Override
<add> public ByteBufAllocator alloc() {
<add> return ByteBufAllocator.DEFAULT;
<add> }
<add>
<ide> @Override
<ide> public Mono<Void> onClose() {
<ide> return Mono.empty(); | 1 |
Python | Python | standardize dynamodb naming | 1fc0fa5ea96913faf78a5bf5d5f75f1d2fb91e97 | <ide><path>airflow/providers/amazon/aws/hooks/dynamodb.py
<ide>
<ide>
<ide> """This module contains the AWS DynamoDB hook"""
<add>import warnings
<ide> from typing import Iterable, List, Optional
<ide>
<ide> from airflow.exceptions import AirflowException
<ide> from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
<ide>
<ide>
<del>class AwsDynamoDBHook(AwsBaseHook):
<add>class DynamoDBHook(AwsBaseHook):
<ide> """
<ide> Interact with AWS DynamoDB.
<ide>
<ide> def write_batch_data(self, items: Iterable) -> bool:
<ide> return True
<ide> except Exception as general_error:
<ide> raise AirflowException(f"Failed to insert items in dynamodb, error: {str(general_error)}")
<add>
<add>
<add>class AwsDynamoDBHook(DynamoDBHook):
<add> """
<add> This class is deprecated.
<add> Please use :class:`airflow.providers.amazon.aws.hooks.dynamodb.DynamoDBHook`.
<add> """
<add>
<add> def __init__(self, *args, **kwargs):
<add> warnings.warn(
<add> "This class is deprecated. "
<add> "Please use :class:`airflow.providers.amazon.aws.hooks.dynamodb.DynamoDBHook`.",
<add> DeprecationWarning,
<add> stacklevel=2,
<add> )
<add> super().__init__(*args, **kwargs)
<ide><path>airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py
<ide> from uuid import uuid4
<ide>
<ide> from airflow.models import BaseOperator
<del>from airflow.providers.amazon.aws.hooks.dynamodb import AwsDynamoDBHook
<add>from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
<ide> from airflow.providers.amazon.aws.hooks.s3 import S3Hook
<ide>
<ide> if TYPE_CHECKING:
<ide> def __init__(
<ide> self.aws_conn_id = aws_conn_id
<ide>
<ide> def execute(self, context: 'Context') -> None:
<del> hook = AwsDynamoDBHook(aws_conn_id=self.aws_conn_id)
<add> hook = DynamoDBHook(aws_conn_id=self.aws_conn_id)
<ide> table = hook.get_conn().Table(self.dynamodb_table_name)
<ide>
<ide> scan_kwargs = copy(self.dynamodb_scan_kwargs) if self.dynamodb_scan_kwargs else {}
<ide><path>airflow/providers/amazon/aws/transfers/hive_to_dynamodb.py
<ide> from typing import TYPE_CHECKING, Callable, Optional, Sequence
<ide>
<ide> from airflow.models import BaseOperator
<del>from airflow.providers.amazon.aws.hooks.dynamodb import AwsDynamoDBHook
<add>from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
<ide> from airflow.providers.apache.hive.hooks.hive import HiveServer2Hook
<ide>
<ide> if TYPE_CHECKING:
<ide> def execute(self, context: 'Context'):
<ide> self.log.info(self.sql)
<ide>
<ide> data = hive.get_pandas_df(self.sql, schema=self.schema)
<del> dynamodb = AwsDynamoDBHook(
<add> dynamodb = DynamoDBHook(
<ide> aws_conn_id=self.aws_conn_id,
<ide> table_name=self.table_name,
<ide> table_keys=self.table_keys,
<ide><path>tests/deprecated_classes.py
<ide> 'airflow.contrib.hooks.aws_hook.AwsHook',
<ide> ),
<ide> (
<del> 'airflow.providers.amazon.aws.hooks.dynamodb.AwsDynamoDBHook',
<add> 'airflow.providers.amazon.aws.hooks.dynamodb.DynamoDBHook',
<ide> 'airflow.contrib.hooks.aws_dynamodb_hook.AwsDynamoDBHook',
<ide> ),
<ide> (
<ide><path>tests/providers/amazon/aws/hooks/test_dynamodb.py
<ide> import unittest
<ide> import uuid
<ide>
<del>from airflow.providers.amazon.aws.hooks.dynamodb import AwsDynamoDBHook
<add>from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
<ide>
<ide> try:
<ide> from moto import mock_dynamodb2
<ide> class TestDynamoDBHook(unittest.TestCase):
<ide> @unittest.skipIf(mock_dynamodb2 is None, 'mock_dynamodb2 package not present')
<ide> @mock_dynamodb2
<ide> def test_get_conn_returns_a_boto3_connection(self):
<del> hook = AwsDynamoDBHook(aws_conn_id='aws_default')
<add> hook = DynamoDBHook(aws_conn_id='aws_default')
<ide> assert hook.get_conn() is not None
<ide>
<ide> @unittest.skipIf(mock_dynamodb2 is None, 'mock_dynamodb2 package not present')
<ide> @mock_dynamodb2
<ide> def test_insert_batch_items_dynamodb_table(self):
<ide>
<del> hook = AwsDynamoDBHook(
<add> hook = DynamoDBHook(
<ide> aws_conn_id='aws_default', table_name='test_airflow', table_keys=['id'], region_name='us-east-1'
<ide> )
<ide>
<ide><path>tests/providers/amazon/aws/transfers/test_dynamodb_to_s3.py
<ide> def mock_upload_file(self, Filename, Bucket, Key):
<ide> self.output_queue.append(json.loads(line))
<ide>
<ide> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.S3Hook')
<del> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.AwsDynamoDBHook')
<add> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBHook')
<ide> def test_dynamodb_to_s3_success(self, mock_aws_dynamodb_hook, mock_s3_hook):
<ide> responses = [
<ide> {
<ide> def test_dynamodb_to_s3_success(self, mock_aws_dynamodb_hook, mock_s3_hook):
<ide> assert [{'a': 1}, {'b': 2}, {'c': 3}] == self.output_queue
<ide>
<ide> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.S3Hook')
<del> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.AwsDynamoDBHook')
<add> @patch('airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBHook')
<ide> def test_dynamodb_to_s3_with_different_aws_conn_id(self, mock_aws_dynamodb_hook, mock_s3_hook):
<ide> responses = [
<ide> {
<ide><path>tests/providers/amazon/aws/transfers/test_hive_to_dynamodb.py
<ide>
<ide> import airflow.providers.amazon.aws.transfers.hive_to_dynamodb
<ide> from airflow.models.dag import DAG
<del>from airflow.providers.amazon.aws.hooks.dynamodb import AwsDynamoDBHook
<add>from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
<ide>
<ide> DEFAULT_DATE = datetime.datetime(2015, 1, 1)
<ide> DEFAULT_DATE_ISO = DEFAULT_DATE.isoformat()
<ide> def setUp(self):
<ide> dag = DAG('test_dag_id', default_args=args)
<ide> self.dag = dag
<ide> self.sql = 'SELECT 1'
<del> self.hook = AwsDynamoDBHook(aws_conn_id='aws_default', region_name='us-east-1')
<add> self.hook = DynamoDBHook(aws_conn_id='aws_default', region_name='us-east-1')
<ide>
<ide> @staticmethod
<ide> def process_data(data, *args, **kwargs):
<ide> def process_data(data, *args, **kwargs):
<ide> @unittest.skipIf(mock_dynamodb2 is None, 'mock_dynamodb2 package not present')
<ide> @mock_dynamodb2
<ide> def test_get_conn_returns_a_boto3_connection(self):
<del> hook = AwsDynamoDBHook(aws_conn_id='aws_default')
<add> hook = DynamoDBHook(aws_conn_id='aws_default')
<ide> assert hook.get_conn() is not None
<ide>
<ide> @mock.patch( | 7 |
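The `AwsDynamoDBHook` shim in the patch keeps the old import path working while steering callers to `DynamoDBHook`. The same deprecation-shim pattern in isolation, with hypothetical class names standing in for the hooks:

```python
import warnings


class NewHook:
    """The renamed class; all real behaviour lives here."""
    def __init__(self, conn_id="default"):
        self.conn_id = conn_id


class OldHook(NewHook):
    """Deprecated alias: identical behaviour, but warns on construction."""
    def __init__(self, *args, **kwargs):
        warnings.warn(
            "OldHook is deprecated. Please use NewHook.",
            DeprecationWarning,
            stacklevel=2,  # attribute the warning to the caller, not this shim
        )
        super().__init__(*args, **kwargs)
```

Because the old name subclasses the new one, existing `isinstance` checks and subclass registrations keep working for the duration of the deprecation window.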
Java | Java | fix failing tests | aa51ed19403129bd9f0b8f5373391ba6bc8baaaa | <ide><path>spring-web/src/main/java/org/springframework/web/cors/CorsConfiguration.java
<ide> private List<OriginPattern> combinePatterns(
<ide>
<ide> /**
<ide> * Check the origin of the request against the configured allowed origins.
<del> * @param requestOrigin the origin to check
<add> * @param origin the origin to check
<ide> * @return the origin to use for the response, or {@code null} which
<ide> * means the request origin is not allowed
<ide> */
<ide> @Nullable
<del> public String checkOrigin(@Nullable String requestOrigin) {
<del> if (!StringUtils.hasText(requestOrigin)) {
<add> public String checkOrigin(@Nullable String origin) {
<add> if (!StringUtils.hasText(origin)) {
<ide> return null;
<ide> }
<del> requestOrigin = trimTrailingSlash(requestOrigin);
<add> String originToCheck = trimTrailingSlash(origin);
<ide> if (!ObjectUtils.isEmpty(this.allowedOrigins)) {
<ide> if (this.allowedOrigins.contains(ALL)) {
<ide> validateAllowCredentials();
<ide> return ALL;
<ide> }
<ide> for (String allowedOrigin : this.allowedOrigins) {
<del> if (requestOrigin.equalsIgnoreCase(allowedOrigin)) {
<del> return requestOrigin;
<add> if (originToCheck.equalsIgnoreCase(allowedOrigin)) {
<add> return origin;
<ide> }
<ide> }
<ide> }
<ide> if (!ObjectUtils.isEmpty(this.allowedOriginPatterns)) {
<ide> for (OriginPattern p : this.allowedOriginPatterns) {
<del> if (p.getDeclaredPattern().equals(ALL) || p.getPattern().matcher(requestOrigin).matches()) {
<del> return requestOrigin;
<add> if (p.getDeclaredPattern().equals(ALL) || p.getPattern().matcher(originToCheck).matches()) {
<add> return origin;
<ide> }
<ide> }
<ide> }
<ide><path>spring-web/src/test/java/org/springframework/web/cors/CorsConfigurationTests.java
<ide> public void checkOriginAllowed() {
<ide> // specific origin matches Origin header with or without trailing "/"
<ide> config.setAllowedOrigins(Collections.singletonList("https://domain.com"));
<ide> assertThat(config.checkOrigin("https://domain.com")).isEqualTo("https://domain.com");
<del> assertThat(config.checkOrigin("https://domain.com/")).isEqualTo("https://domain.com");
<add> assertThat(config.checkOrigin("https://domain.com/")).isEqualTo("https://domain.com/");
<ide>
<ide> // specific origin with trailing "/" matches Origin header with or without trailing "/"
<ide> config.setAllowedOrigins(Collections.singletonList("https://domain.com/"));
<ide> assertThat(config.checkOrigin("https://domain.com")).isEqualTo("https://domain.com");
<del> assertThat(config.checkOrigin("https://domain.com/")).isEqualTo("https://domain.com");
<add> assertThat(config.checkOrigin("https://domain.com/")).isEqualTo("https://domain.com/");
<ide>
<ide> config.setAllowCredentials(false);
<ide> assertThat(config.checkOrigin("https://domain.com")).isEqualTo("https://domain.com");
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/mvc/method/annotation/CrossOriginTests.java
<ide> void classLevelComposedAnnotation(TestRequestMappingInfoHandlerMapping mapping)
<ide> CorsConfiguration config = getCorsConfiguration(chain, false);
<ide> assertThat(config).isNotNull();
<ide> assertThat(config.getAllowedMethods()).containsExactly("GET");
<del> assertThat(config.getAllowedOrigins()).containsExactly("http://www.foo.example/");
<add> assertThat(config.getAllowedOrigins()).containsExactly("http://www.foo.example");
<ide> assertThat(config.getAllowCredentials()).isTrue();
<ide> }
<ide>
<ide> void methodLevelComposedAnnotation(TestRequestMappingInfoHandlerMapping mapping)
<ide> CorsConfiguration config = getCorsConfiguration(chain, false);
<ide> assertThat(config).isNotNull();
<ide> assertThat(config.getAllowedMethods()).containsExactly("GET");
<del> assertThat(config.getAllowedOrigins()).containsExactly("http://www.foo.example/");
<add> assertThat(config.getAllowedOrigins()).containsExactly("http://www.foo.example");
<ide> assertThat(config.getAllowCredentials()).isTrue();
<ide> }
<ide> | 3 |
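The CORS change above makes `checkOrigin` compare origins with the trailing slash trimmed but echo back the caller's *original* `Origin` value — which is why the test now expects `"https://domain.com/"` in, `"https://domain.com/"` out. A hypothetical Python sketch of that matching rule (not Spring's actual code):

```python
def _trim_trailing_slash(value):
    return value[:-1] if value.endswith("/") else value


def check_origin(origin, allowed_origins):
    """Return the value for Access-Control-Allow-Origin, or None if not allowed."""
    if not origin:
        return None
    origin_to_check = _trim_trailing_slash(origin)
    for allowed in allowed_origins:
        if allowed == "*":
            return "*"
        # Match case-insensitively with trailing slashes ignored, but hand
        # back the caller's original Origin header value unchanged.
        if origin_to_check.lower() == _trim_trailing_slash(allowed).lower():
            return origin
    return None
```

Returning `origin` rather than the trimmed `origin_to_check` is the one-line behavioural difference the updated assertions pin down.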
Text | Text | fix links in documentations | f16d8a32e2641bae8483a5194b7ddb1ec56ab5fa | <ide><path>docs/axes/cartesian/time.md
<ide> The following options are provided by the time scale. They are all located in th
<ide> | `max` | [Time](#date-formats) | | If defined, this will override the data maximum
<ide> | `min` | [Time](#date-formats) | | If defined, this will override the data minimum
<ide> | `parser` | `String` or `Function` | | Custom parser for dates. [more...](#parser)
<del>| `round` | `String` | `false` | If defined, dates will be rounded to the start of this unit. See [Time Units](#scales-time-units) below for the allowed units.
<add>| `round` | `String` | `false` | If defined, dates will be rounded to the start of this unit. See [Time Units](#time-units) below for the allowed units.
<ide> | `tooltipFormat` | `String` | | The moment js format string to use for the tooltip.
<del>| `unit` | `String` | `false` | If defined, will force the unit to be a certain type. See [Time Units](#scales-time-units) section below for details.
<add>| `unit` | `String` | `false` | If defined, will force the unit to be a certain type. See [Time Units](#time-units) section below for details.
<ide> | `stepSize` | `Number` | `1` | The number of units between grid lines.
<ide> | `minUnit` | `String` | `'millisecond'` | The minimum display format to be used for a time unit.
<ide>
<ide><path>docs/charts/bar.md
<ide> Some properties can be specified as an array. If these are set to an array value
<ide> | `backgroundColor` | `Color/Color[]` | The fill color of the bar. See [Colors](../general/colors.md#colors)
<ide> | `borderColor` | `Color/Color[]` | The color of the bar border. See [Colors](../general/colors.md#colors)
<ide> | `borderWidth` | `Number/Number[]` | The stroke width of the bar in pixels.
<del>| `borderSkipped` | `String` | Which edge to skip drawing the border for. [more...](#borderSkipped)
<add>| `borderSkipped` | `String` | Which edge to skip drawing the border for. [more...](#borderskipped)
<ide> | `hoverBackgroundColor` | `Color/Color[]` | The fill colour of the bars when hovered.
<ide> | `hoverBorderColor` | `Color/Color[]` | The stroke colour of the bars when hovered.
<ide> | `hoverBorderWidth` | `Number/Number[]` | The stroke width of the bars when hovered.
<ide> The bar chart defines the following configuration options. These options are mer
<ide>
<ide> | Name | Type | Default | Description
<ide> | ---- | ---- | ------- | -----------
<del>| `barPercentage` | `Number` | `0.9` | Percent (0-1) of the available width each bar should be within the category percentage. 1.0 will take the whole category width and put the bars right next to each other. [more...](#bar-chart-barpercentage-vs-categorypercentage)
<del>| `categoryPercentage` | `Number` | `0.8` | Percent (0-1) of the available width (the space between the gridlines for small datasets) for each data-point to use for the bars. [more...](#bar-chart-barpercentage-vs-categorypercentage)
<add>| `barPercentage` | `Number` | `0.9` | Percent (0-1) of the available width each bar should be within the category percentage. 1.0 will take the whole category width and put the bars right next to each other. [more...](#barpercentage-vs-categorypercentage)
<add>| `categoryPercentage` | `Number` | `0.8` | Percent (0-1) of the available width (the space between the gridlines for small datasets) for each data-point to use for the bars. [more...](#barpercentage-vs-categorypercentage)
<ide> | `barThickness` | `Number` | | Manually set width of each bar in pixels. If not set, the bars are sized automatically using `barPercentage` and `categoryPercentage`;
<ide> | `maxBarThickness` | `Number` | | Set this to ensure that the automatically sized bars are not sized thicker than this. Only works if barThickness is not set (automatic sizing is enabled).
<del>| `gridLines.offsetGridLines` | `Boolean` | `true` | If true, the bars for a particular data point fall between the grid lines. If false, the grid line will go right down the middle of the bars. [more...](#offsetGridLines)
<add>| `gridLines.offsetGridLines` | `Boolean` | `true` | If true, the bars for a particular data point fall between the grid lines. If false, the grid line will go right down the middle of the bars. [more...](#offsetgridlines)
<ide>
<ide> ### offsetGridLines
<ide> If true, the bars for a particular data point fall between the grid lines. If false, the grid line will go right down the middle of the bars. It is unlikely that this will ever need to be changed in practice. It exists more as a way to reuse the axis code by configuring the existing axis slightly differently.
<ide> var myBarChart = new Chart(ctx, {
<ide> ```
<ide>
<ide> ## Config Options
<del>The configuration options for the horizontal bar chart are the same as for the [bar chart](../bar/config-options.md#config-options). However, any options specified on the x axis in a bar chart, are applied to the y axis in a horizontal bar chart.
<add>The configuration options for the horizontal bar chart are the same as for the [bar chart](#configuration-options). However, any options specified on the x axis in a bar chart, are applied to the y axis in a horizontal bar chart.
<ide>
<del>The default horizontal bar configuration is specified in `Chart.defaults.horizontalBar`.
<ide>\ No newline at end of file
<add>The default horizontal bar configuration is specified in `Chart.defaults.horizontalBar`.
<ide><path>docs/charts/bubble.md
<ide> All properties, except `label` can be specified as an array. If these are set to
<ide>
<ide> ## Config Options
<ide>
<del>The bubble chart has no unique configuration options. To configure options common to all of the bubbles, the [point element options](../configuration/elements/point.md#point-configuration) are used.
<add>The bubble chart has no unique configuration options. To configure options common to all of the bubbles, the [point element options](../configuration/elements.md#point-configuration) are used.
<ide>
<ide> ## Default Options
<ide>
<ide> Data for the bubble chart is passed in the form of an object. The object must im
<ide> // Radius of bubble. This is not scaled.
<ide> r: <Number>
<ide> }
<del>```
<ide>\ No newline at end of file
<add>```
<ide><path>docs/charts/line.md
<ide> All point* properties can be specified as an array. If these are set to an array
<ide> | `borderDashOffset` | `Number` | Offset for line dashes. See [MDN](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/lineDashOffset)
<ide> | `borderCapStyle` | `String` | Cap style of the line. See [MDN](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/lineCap)
<ide> | `borderJoinStyle` | `String` | Line joint style. See [MDN](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/lineJoin)
<del>| `cubicInterpolationMode` | `String` | Algorithm used to interpolate a smooth curve from the discrete data points. [more...](#cubicInterpolationMode)
<add>| `cubicInterpolationMode` | `String` | Algorithm used to interpolate a smooth curve from the discrete data points. [more...](#cubicinterpolationmode)
<ide> | `fill` | `Boolean/String` | How to fill the area under the line. See [area charts](area.md)
<ide> | `lineTension` | `Number` | Bezier curve tension of the line. Set to 0 to draw straightlines. This option is ignored if monotone cubic interpolation is used.
<ide> | `pointBackgroundColor` | `Color/Color[]` | The fill color for points.
<ide> | `pointBorderColor` | `Color/Color[]` | The border color for points.
<ide> | `pointBorderWidth` | `Number/Number[]` | The width of the point border in pixels.
<ide> | `pointRadius` | `Number/Number[]` | The radius of the point shape. If set to 0, the point is not rendered.
<del>| `pointStyle` | `String/String[]/Image/Image[]` | Style of the point. [more...](#pointStyle)
<add>| `pointStyle` | `String/String[]/Image/Image[]` | Style of the point. [more...](#pointstyle)
<ide> | `pointHitRadius` | `Number/Number[]` | The pixel size of the non-displayed point that reacts to mouse events.
<ide> | `pointHoverBackgroundColor` | `Color/Color[]` | Point background color when hovered.
<ide> | `pointHoverBorderColor` | `Color/Color[]` | Point border color when hovered.
<ide><path>docs/charts/polar.md
<ide> The following options can be included in a polar area chart dataset to configure
<ide>
<ide> ## Config Options
<ide>
<del>These are the customisation options specific to Polar Area charts. These options are merged with the [global chart configuration options](#global-chart-configuration), and form the options of the chart.
<add>These are the customisation options specific to Polar Area charts. These options are merged with the [global chart default options](#default-options), and form the options of the chart.
<ide>
<ide> | Name | Type | Default | Description
<ide> | ---- | ---- | ------- | -----------
<ide> data = {
<ide> 'Blue'
<ide> ]
<ide> };
<del>```
<ide>\ No newline at end of file
<add>```
<ide><path>docs/charts/radar.md
<ide> All point* properties can be specified as an array. If these are set to an array
<ide> | `pointBorderColor` | `Color/Color[]` | The border color for points.
<ide> | `pointBorderWidth` | `Number/Number[]` | The width of the point border in pixels.
<ide> | `pointRadius` | `Number/Number[]` | The radius of the point shape. If set to 0, the point is not rendered.
<del>| `pointStyle` | `String/String[]/Image/Image[]` | Style of the point. [more...](#pointStyle)
<add>| `pointStyle` | `String/String[]/Image/Image[]` | Style of the point. [more...](#pointstyle)
<ide> | `pointHitRadius` | `Number/Number[]` | The pixel size of the non-displayed point that reacts to mouse events.
<ide> | `pointHoverBackgroundColor` | `Color/Color[]` | Point background color when hovered.
<ide> | `pointHoverBorderColor` | `Color/Color[]` | Point border color when hovered.
<ide><path>docs/configuration/legend.md
<ide> The legend label configuration is nested below the legend configuration using th
<ide> | `fontColor` | Color | `'#666'` | Color of text
<ide> | `fontFamily` | `String` | `"'Helvetica Neue', 'Helvetica', 'Arial', sans-serif"` | Font family of legend text.
<ide> | `padding` | `Number` | `10` | Padding between rows of colored boxes.
<del>| `generateLabels` | `Function` | | Generates legend items for each thing in the legend. Default implementation returns the text + styling for the color box. See [Legend Item](#chart-configuration-legend-item-interface) for details.
<del>| `filter` | `Function` | `null` | Filters legend items out of the legend. Receives 2 parameters, a [Legend Item](#chart-configuration-legend-item-interface) and the chart data.
<add>| `generateLabels` | `Function` | | Generates legend items for each thing in the legend. Default implementation returns the text + styling for the color box. See [Legend Item](#legend-item-interface) for details.
<add>| `filter` | `Function` | `null` | Filters legend items out of the legend. Receives 2 parameters, a [Legend Item](#legend-item-interface) and the chart data.
<ide> | `usePointStyle` | `Boolean` | `false` | Label style will match corresponding point style (size is based on fontSize, boxWidth is not used in this case).
<ide>
<ide> ## Legend Item Interface
<ide> var chart = new Chart(ctx, {
<ide> }
<ide> }
<ide> });
<del>```
<ide>\ No newline at end of file
<add>```
<ide><path>docs/configuration/tooltip.md
<ide> The tooltip configuration is passed into the `options.tooltips` namespace. The g
<ide> | Name | Type | Default | Description
<ide> | -----| ---- | --------| -----------
<ide> | `enabled` | `Boolean` | `true` | Are tooltips enabled
<del>| `custom` | `Function` | `null` | See [custom tooltip](#custom-tooltips) section.
<add>| `custom` | `Function` | `null` | See [custom tooltip](#external-custom-tooltips) section.
<ide> | `mode` | `String` | `'nearest'` | Sets which elements appear in the tooltip. [more...](../general/interactions/modes.md#interaction-modes).
<ide> | `intersect` | `Boolean` | `true` | if true, the tooltip mode applies only when the mouse position intersects with an element. If false, the mode will be applied at all times.
<ide> | `position` | `String` | `'average'` | The mode for positioning the tooltip. [more...](#position-modes)
<ide> New modes can be defined by adding functions to the Chart.Tooltip.positioners ma
<ide>
<ide> ### Sort Callback
<ide>
<del>Allows sorting of [tooltip items](#chart-configuration-tooltip-item-interface). Must implement at minimum a function that can be passed to [Array.prototype.sort](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort). This function can also accept a third parameter that is the data object passed to the chart.
<add>Allows sorting of [tooltip items](#tooltip-item-interface). Must implement at minimum a function that can be passed to [Array.prototype.sort](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort). This function can also accept a third parameter that is the data object passed to the chart.
<ide>
<ide> ### Filter Callback
<ide>
<del>Allows filtering of [tooltip items](#chart-configuration-tooltip-item-interface). Must implement at minimum a function that can be passed to [Array.prototype.filter](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Array/filter). This function can also accept a second parameter that is the data object passed to the chart.
<add>Allows filtering of [tooltip items](#tooltip-item-interface). Must implement at minimum a function that can be passed to [Array.prototype.filter](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Array/filter). This function can also accept a second parameter that is the data object passed to the chart.
<ide>
<ide> ## Tooltip Callbacks
<ide>
<ide> The tooltip label configuration is nested below the tooltip configuration using the `callbacks` key. The tooltip has the following callbacks for providing text. For all functions, 'this' will be the tooltip object created from the Chart.Tooltip constructor.
<ide>
<del>All functions are called with the same arguments: a [tooltip item](#chart-configuration-tooltip-item-interface) and the data object passed to the chart. All functions must return either a string or an array of strings. Arrays of strings are treated as multiple lines of text.
<add>All functions are called with the same arguments: a [tooltip item](#tooltip-item-interface) and the data object passed to the chart. All functions must return either a string or an array of strings. Arrays of strings are treated as multiple lines of text.
<ide>
<ide> | Name | Arguments | Description
<ide> | ---- | --------- | -----------
<ide><path>docs/developers/plugins.md
<ide> # Plugins
<ide>
<del>Plugins are the most efficient way to customize or change the default behavior of a chart. They have been introduced at [version 2.1.0](https://github.com/chartjs/Chart.js/releases/tag/2.1.0) (global plugins only) and extended at [version 2.5.0](https://github.com/chartjs/Chart.js/releases/tag/2.5.0) (per chart plugins and options).
<add>Plugins are the most efficient way to customize or change the default behavior of a chart. They have been introduced at [version 2.1.0](https://github.com/chartjs/Chart.js/releases/tag/2.1.0) (global plugins only) and extended at [version 2.5.0](https://github.com/chartjs/Chart.js/releases/tag/v2.5.0) (per chart plugins and options).
<ide>
<ide> ## Using plugins
<ide> | 9 |
PHP | PHP | add default logging config in config/bootstrap.php | 5c4f7415295e2588810abfa8044e0b9ffb3c8985 | <ide><path>app/Config/bootstrap.php
<ide> 'AssetDispatcher',
<ide> 'CacheDispatcher'
<ide> ));
<add>
<add>/**
<add> * Configures default file logging options
<add> */
<add>CakeLog::config('debug', array(
<add> 'engine' => 'FileLog',
<add> 'types' => array('notice', 'info', 'debug'),
<add> 'file' => 'debug',
<add>));
<add>CakeLog::config('error', array(
<add> 'engine' => 'FileLog',
<add> 'types' => array('error', 'warning'),
<add> 'file' => 'error',
<add>));
<ide><path>lib/Cake/Console/Templates/skel/Config/bootstrap.php
<ide> 'AssetDispatcher',
<ide> 'CacheDispatcher'
<ide> ));
<add>
<add>/**
<add> * Configures default file logging options
<add> */
<add>CakeLog::config('debug', array(
<add> 'engine' => 'FileLog',
<add> 'scopes' => array('notice', 'info', 'debug'),
<add> 'file' => 'debug',
<add>));
<add>CakeLog::config('error', array(
<add> 'engine' => 'FileLog',
<add> 'scopes' => array('error', 'warning'),
<add> 'file' => 'error',
<add>)); | 2 |
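The bootstrap change registers two `FileLog` engines — one catching notice/info/debug, the other error/warning. As a loose analogy (Python stdlib `logging`, not CakePHP), the same severity split can be sketched like this:

```python
import logging

debug_sink, error_sink = [], []  # stand-ins for the 'debug' and 'error' log files

class SinkHandler(logging.Handler):
    """Collect messages whose severity falls between lo and hi."""
    def __init__(self, sink, lo, hi):
        super().__init__(lo)
        self.sink, self.hi = sink, hi
    def emit(self, record):
        if record.levelno <= self.hi:
            self.sink.append(record.getMessage())

logger = logging.getLogger("bootstrap-demo")
logger.setLevel(logging.DEBUG)
logger.propagate = False
logger.addHandler(SinkHandler(debug_sink, logging.DEBUG, logging.INFO))     # ~ notice/info/debug
logger.addHandler(SinkHandler(error_sink, logging.WARNING, logging.ERROR))  # ~ error/warning

logger.debug("cache miss")
logger.error("db down")
print(debug_sink, error_sink)  # -> ['cache miss'] ['db down']
```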
Text | Text | fix links to agility cms examples | 4e6c326a56ec39c73971397e8ce2a80e2091e953 | <ide><path>docs/advanced-features/preview-mode.md
<ide> description: Next.js has the preview mode for statically generated pages. You ca
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-prismic">Prismic Example</a> (<a href="https://next-blog-prismic.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-contentful">Contentful Example</a> (<a href="https://next-blog-contentful.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-strapi">Strapi Example</a> (<a href="https://next-blog-strapi.now.sh/">Demo</a>)</li>
<del> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agility-cms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<add> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agilitycms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-cosmic">Cosmic Example</a> (<a href="https://next-blog-cosmic.now.sh/">Demo</a>)</li>
<ide> </ul>
<ide> </details>
<ide><path>docs/basic-features/data-fetching.md
<ide> description: 'Next.js has 2 pre-rendering modes: Static Generation and Server-si
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-prismic">Prismic Example</a> (<a href="https://next-blog-prismic.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-contentful">Contentful Example</a> (<a href="https://next-blog-contentful.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-strapi">Strapi Example</a> (<a href="https://next-blog-strapi.now.sh/">Demo</a>)</li>
<del> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agility-cms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<add> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agilitycms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-cosmic">Cosmic Example</a> (<a href="https://next-blog-cosmic.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://static-tweet.now.sh/">Static Tweet Demo</a></li>
<ide> </ul>
<ide><path>docs/basic-features/pages.md
<ide> Finally, you can always use **Client-side Rendering** along with Static Generati
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-prismic">Prismic Example</a> (<a href="https://next-blog-prismic.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-contentful">Contentful Example</a> (<a href="https://next-blog-contentful.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-strapi">Strapi Example</a> (<a href="https://next-blog-strapi.now.sh/">Demo</a>)</li>
<del> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agility-cms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<add> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-agilitycms">Agility CMS Example</a> (<a href="https://next-blog-agilitycms.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://github.com/vercel/next.js/tree/canary/examples/cms-cosmic">Cosmic Example</a> (<a href="https://next-blog-cosmic.now.sh/">Demo</a>)</li>
<ide> <li><a href="https://static-tweet.now.sh/">Static Tweet Demo</a></li>
<ide> </ul> | 3 |
Go | Go | fix use of **/ in .dockerignore | 376bb84c5c8f6183dbb88dae705e1de4182da4c9 | <ide><path>pkg/fileutils/fileutils.go
<ide> func regexpMatch(pattern, path string) (bool, error) {
<ide> // is some flavor of "**"
<ide> scan.Next()
<ide>
<add> // Treat **/ as ** so eat the "/"
<add> if string(scan.Peek()) == sl {
<add> scan.Next()
<add> }
<add>
<ide> if scan.Peek() == scanner.EOF {
<ide> // is "**EOF" - to align with .gitignore just accept all
<ide> regStr += ".*"
<ide> } else {
<ide> // is "**"
<del> regStr += "((.*" + escSL + ")|([^" + escSL + "]*))"
<del> }
<del>
<del> // Treat **/ as ** so eat the "/"
<del> if string(scan.Peek()) == sl {
<del> scan.Next()
<add> // Note that this allows for any # of /'s (even 0) because
<add> // the .* will eat everything, even /'s
<add> regStr += "(.*" + escSL + ")?"
<ide> }
<ide> } else {
<ide> // is "*" so map it to anything but "/"
<ide><path>pkg/fileutils/fileutils_test.go
<ide> func TestMatches(t *testing.T) {
<ide> {"**", "/", true},
<ide> {"**/", "/", true},
<ide> {"**", "dir/file", true},
<del> {"**/", "dir/file", false},
<add> {"**/", "dir/file", true},
<ide> {"**", "dir/file/", true},
<ide> {"**/", "dir/file/", true},
<ide> {"**/**", "dir/file", true},
<ide> func TestMatches(t *testing.T) {
<ide> {"abc/**", "abc", false},
<ide> {"abc/**", "abc/def", true},
<ide> {"abc/**", "abc/def/ghi", true},
<add> {"**/.foo", ".foo", true},
<add> {"**/.foo", "bar.foo", false},
<ide> }
<ide>
<ide> for _, test := range tests { | 2 |
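The fix above translates `**` (with an optional trailing `/`) into the regex fragment `(.*/)?`, so zero leading path segments match as well — which is why `**/.foo` now matches `.foo` but still rejects `bar.foo`. A toy Python version of just that translation, a hypothetical helper covering only `**/`, `*`, and literals, nowhere near full `.dockerignore` semantics:

```python
import re

def pattern_to_regex(pattern: str) -> str:
    out, i = "", 0
    while i < len(pattern):
        if pattern.startswith("**/", i):
            out += "(.*/)?"   # the fragment introduced by the patch above
            i += 3
        elif pattern[i] == "*":
            out += "[^/]*"    # a plain '*' never crosses a '/'
            i += 1
        else:
            out += re.escape(pattern[i])
            i += 1
    return out

def matches(pat: str, path: str) -> bool:
    return re.fullmatch(pattern_to_regex(pat), path) is not None

print(matches("**/.foo", ".foo"))      # True  -- zero segments consumed
print(matches("**/.foo", "dir/.foo"))  # True
print(matches("**/.foo", "bar.foo"))   # False -- the new test case above
```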
Java | Java | restore formatting change following revert | ad88c02bc3bc031916e65b0a777cbce561848bca | <ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/AbstractMessageConverterMethodProcessor.java
<ide> else if (mediaType.equals(MediaType.ALL) || mediaType.equals(MEDIA_TYPE_APPLICAT
<ide> inputMessage, outputMessage);
<ide> if (body != null) {
<ide> if (logger.isDebugEnabled()) {
<del> Object formatted = (body instanceof CharSequence ? "\"" + body + "\"" : body);
<del> logger.debug("Writing [" + formatted + "]");
<add> logger.debug("Writing [" + formatValue(body) + "]");
<ide> }
<ide> addContentDispositionHeader(inputMessage, outputMessage);
<ide> if (genericConverter != null) {
<ide> private MediaType getMostSpecificMediaType(MediaType acceptType, MediaType produ
<ide> return (MediaType.SPECIFICITY_COMPARATOR.compare(acceptType, produceTypeToUse) <= 0 ? acceptType : produceTypeToUse);
<ide> }
<ide>
<add> static String formatValue(Object body) {
<add> return (body instanceof CharSequence ? "\"" + body + "\"" : body.toString());
<add> }
<add>
<ide> /**
<ide> * Check if the path has a file extension and whether the extension is
<ide> * either {@link #WHITELISTED_EXTENSIONS whitelisted} or explicitly
<ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/RequestMappingHandlerAdapter.java
<ide> protected ModelAndView invokeHandlerMethod(HttpServletRequest request,
<ide> mavContainer = (ModelAndViewContainer) asyncManager.getConcurrentResultContext()[0];
<ide> asyncManager.clearConcurrentResult();
<ide> if (logger.isDebugEnabled()) {
<del> logger.debug("Resume with async result [" +
<del> (result instanceof CharSequence ? "\"" + result + "\"" : result) + "]");
<add> String formatted = AbstractMessageConverterMethodProcessor.formatValue(result);
<add> logger.debug("Resume with async result [" + formatted + "]");
<ide> }
<ide> invocableMethod = invocableMethod.wrapConcurrentResult(result);
<ide> } | 2 |
Ruby | Ruby | remove support for legacy homebrew repo. (#287) | f946693b561bae7f065d06776751a19d199f7031 | <ide><path>Library/Homebrew/cmd/test-bot.rb
<ide> # --junit: Generate a JUnit XML test results file.
<ide> # --no-bottle: Run brew install without --build-bottle.
<ide> # --keep-old: Run brew bottle --keep-old to build new bottles for a single platform.
<del># --legacy Build formula from legacy Homebrew/legacy-homebrew repo.
<del># (TODO remove it when it's not longer necessary)
<ide> # --HEAD: Run brew install with --HEAD.
<ide> # --local: Ask Homebrew to write verbose logs under ./logs/ and set HOME to ./home/.
<ide> # --tap=<tap>: Use the git repository of the given tap.
<ide> def diff_formulae(start_revision, end_revision, path, filter)
<ide> # the right commit to BrewTestBot.
<ide> unless travis_pr
<ide> diff_start_sha1 = current_sha1
<del> if ARGV.include?("--legacy")
<del> test "brew", "pull", "--clean", "--legacy", @url
<del> else
<del> test "brew", "pull", "--clean", @url
<del> end
<add> test "brew", "pull", "--clean", @url
<ide> diff_end_sha1 = current_sha1
<ide> end
<ide> @short_url = @url.gsub("https://github.com/", "")
<ide> def test_ci_upload(tap)
<ide> ENV["HUDSON_COOKIE"] = nil
<ide>
<ide> ARGV << "--verbose"
<del> ARGV << "--legacy" if ENV["UPSTREAM_BOTTLE_LEGACY"]
<ide>
<ide> bottles = Dir["#{jenkins}/jobs/#{job}/configurations/axis-version/*/builds/#{id}/archive/*.bottle*.*"]
<ide> return if bottles.empty?
<ide> def test_ci_upload(tap)
<ide> safe_system "brew", "update"
<ide>
<ide> if pr
<del> if ARGV.include?("--legacy")
<del> pull_pr = "https://github.com/Homebrew/legacy-homebrew/pull/#{pr}"
<del> safe_system "brew", "pull", "--clean", "--legacy", pull_pr
<del> else
<del> pull_pr = "https://github.com/#{tap.user}/homebrew-#{tap.repo}/pull/#{pr}"
<del> safe_system "brew", "pull", "--clean", pull_pr
<del> end
<add> pull_pr = "https://github.com/#{tap.user}/homebrew-#{tap.repo}/pull/#{pr}"
<add> safe_system "brew", "pull", "--clean", pull_pr
<ide> end
<ide>
<ide> json_files = Dir.glob("*.bottle.json") | 1 |
Python | Python | change default tag for 動詞,非自立可能 | 234a8a75917aa01c48e06ed4767d6f47cdfead22 | <ide><path>spacy/ja/tag_map.py
<ide>
<ide> "代名詞,*,*,*":{POS: PRON},
<ide> "動詞,一般,*,*":{POS: VERB},
<del> "動詞,非自立可能,*,*":{POS: AUX}, # XXX VERB if alone, AUX otherwise
<add> "動詞,非自立可能,*,*":{POS: VERB}, # XXX VERB if alone, AUX otherwise
<ide> "動詞,非自立可能,*,*,AUX":{POS: AUX},
<ide> "動詞,非自立可能,*,*,VERB":{POS: VERB},
<ide> "副詞,*,*,*":{POS: ADV}, | 1 |
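After the change, the bare tag defaults to VERB while the `,AUX` / `,VERB` suffixed keys still override it in context. The lookup idea amounts to most-specific-key-first — an illustrative sketch, not spaCy's actual code:

```python
TAG_MAP = {
    "動詞,非自立可能,*,*": "VERB",       # new default when the token stands alone
    "動詞,非自立可能,*,*,AUX": "AUX",    # context-specific override
    "動詞,非自立可能,*,*,VERB": "VERB",
}

def resolve(tag: str, role: str = None) -> str:
    """Prefer the key extended with the inferred role, fall back to the bare tag."""
    if role and f"{tag},{role}" in TAG_MAP:
        return TAG_MAP[f"{tag},{role}"]
    return TAG_MAP[tag]

print(resolve("動詞,非自立可能,*,*"))          # -> VERB (standalone default)
print(resolve("動詞,非自立可能,*,*", "AUX"))   # -> AUX  (used as auxiliary)
```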
Ruby | Ruby | use classes for instance doubles | 78d4db2755995ff56736582758ad97f8d76c814c | <ide><path>Library/Homebrew/test/cache_store_spec.rb
<ide> describe "#close_if_open!" do
<ide> context "database open" do
<ide> before(:each) do
<del> subject.instance_variable_set(:@db, instance_double("db", close: nil))
<add> subject.instance_variable_set(:@db, instance_double(DBM, close: nil))
<ide> end
<ide>
<ide> it "does not raise an error when `close` is called on the database" do
<ide><path>Library/Homebrew/test/cask/artifact/installer_spec.rb
<ide> describe Hbc::Artifact::Installer, :cask do
<ide> let(:staged_path) { mktmpdir }
<del> let(:cask) { instance_double("Cask", staged_path: staged_path, config: nil) }
<add> let(:cask) { instance_double(Hbc::Cask, staged_path: staged_path, config: nil) }
<ide> subject(:installer) { described_class.new(cask, **args) }
<ide> let(:command) { SystemCommand }
<ide> | 2 |
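Passing real classes (`DBM`, `Hbc::Cask`) instead of strings lets RSpec verify the double against the actual interface. Python's `unittest.mock` gives the same protection via `spec` — an analogy only; the `Cask` class below is a made-up stand-in, not Homebrew's:

```python
from unittest import mock

class Cask:                    # hypothetical stand-in, not Homebrew's class
    def config(self):
        return {}

double = mock.Mock(spec=Cask)  # like instance_double(Hbc::Cask, ...)
double.config()                # fine: the method exists on Cask
try:
    double.confg()             # typo: rejected instead of silently stubbed
except AttributeError:
    print("typo caught")       # -> typo caught
```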
Javascript | Javascript | remove the unuse file. | b1016a4d34b526a83a09ea92209bf6c99516e59a | <ide><path>examples/real-world/src/routes.js
<del>import React from 'react'
<del>import { Route } from 'react-router'
<del>import App from './containers/App'
<del>import UserPage from './containers/UserPage'
<del>import RepoPage from './containers/RepoPage'
<del>
<del>export default <Route path="/" component={App}>
<del> <Route path="/:login/:name"
<del> component={RepoPage} />
<del> <Route path="/:login"
<del> component={UserPage} />
<del></Route> | 1 |
Python | Python | fix transfo xl integration test | b189226e8cb31361916d638c5a1e2540e3208a61 | <ide><path>tests/test_modeling_tf_transfo_xl.py
<ide> def test_model_from_pretrained(self):
<ide>
<ide> @require_tf
<ide> class TFTransfoXLModelLanguageGenerationTest(unittest.TestCase):
<add> @unittest.skip("Skip test until #12651 is resolved.")
<ide> @slow
<ide> def test_lm_generate_transfo_xl_wt103(self):
<ide> model = TFTransfoXLLMHeadModel.from_pretrained("transfo-xl-wt103")
<del> input_ids = tf.convert_to_tensor(
<del> [
<del> [
<del> 33,
<del> 1297,
<del> 2,
<del> 1,
<del> 1009,
<del> 4,
<del> 1109,
<del> 11739,
<del> 4762,
<del> 358,
<del> 5,
<del> 25,
<del> 245,
<del> 22,
<del> 1706,
<del> 17,
<del> 20098,
<del> 5,
<del> 3215,
<del> 21,
<del> 37,
<del> 1110,
<del> 3,
<del> 13,
<del> 1041,
<del> 4,
<del> 24,
<del> 603,
<del> 490,
<del> 2,
<del> 71477,
<del> 20098,
<del> 104447,
<del> 2,
<del> 20961,
<del> 1,
<del> 2604,
<del> 4,
<del> 1,
<del> 329,
<del> 3,
<del> 6224,
<del> 831,
<del> 16002,
<del> 2,
<del> 8,
<del> 603,
<del> 78967,
<del> 29546,
<del> 23,
<del> 803,
<del> 20,
<del> 25,
<del> 416,
<del> 5,
<del> 8,
<del> 232,
<del> 4,
<del> 277,
<del> 6,
<del> 1855,
<del> 4601,
<del> 3,
<del> 29546,
<del> 54,
<del> 8,
<del> 3609,
<del> 5,
<del> 57211,
<del> 49,
<del> 4,
<del> 1,
<del> 277,
<del> 18,
<del> 8,
<del> 1755,
<del> 15691,
<del> 3,
<del> 341,
<del> 25,
<del> 416,
<del> 693,
<del> 42573,
<del> 71,
<del> 17,
<del> 401,
<del> 94,
<del> 31,
<del> 17919,
<del> 2,
<del> 29546,
<del> 7873,
<del> 18,
<del> 1,
<del> 435,
<del> 23,
<del> 11011,
<del> 755,
<del> 5,
<del> 5167,
<del> 3,
<del> 7983,
<del> 98,
<del> 84,
<del> 2,
<del> 29546,
<del> 3267,
<del> 8,
<del> 3609,
<del> 4,
<del> 1,
<del> 4865,
<del> 1075,
<del> 2,
<del> 6087,
<del> 71,
<del> 6,
<del> 346,
<del> 8,
<del> 5854,
<del> 3,
<del> 29546,
<del> 824,
<del> 1400,
<del> 1868,
<del> 2,
<del> 19,
<del> 160,
<del> 2,
<del> 311,
<del> 8,
<del> 5496,
<del> 2,
<del> 20920,
<del> 17,
<del> 25,
<del> 15097,
<del> 3,
<del> 24,
<del> 24,
<del> 0,
<del> ]
<del> ],
<del> dtype=tf.int32,
<del> )
<add> # fmt: off
<add> input_ids = tf.convert_to_tensor([[33,1297,2,1,1009,4,1109,11739,4762,358,5,25,245,22,1706,17,20098,5,3215,21,37,1110,3,13,1041,4,24,603,490,2,71477,20098,104447,2,20961,1,2604,4,1,329,3,6224,831,16002,2,8,603,78967,29546,23,803,20,25,416,5,8,232,4,277,6,1855,4601,3,29546,54,8,3609,5,57211,49,4,1,277,18,8,1755,15691,3,341,25,416,693,42573,71,17,401,94,31,17919,2,29546,7873,18,1,435,23,11011,755,5,5167,3,7983,98,84,2,29546,3267,8,3609,4,1,4865,1075,2,6087,71,6,346,8,5854,3,29546,824,1400,1868,2,19,160,2,311,8,5496,2,20920,17,25,15097,3,24,24,0]],dtype=tf.int32) # noqa: E231
<add> # fmt: on
<ide> # In 1991 , the remains of Russian Tsar Nicholas II and his family
<ide> # ( except for Alexei and Maria ) are discovered .
<ide> # The voice of Nicholas's young son , Tsarevich Alexei Nikolaevich , narrates the
<ide> def test_lm_generate_transfo_xl_wt103(self):
<ide> # the Virgin Mary , prompting him to become a priest . Rasputin quickly becomes famous ,
<ide> # with people , even a bishop , begging for his blessing . <eod> </s> <eos>
<ide>
<del> expected_output_ids = [
<del> 33,
<del> 1297,
<del> 2,
<del> 1,
<del> 1009,
<del> 4,
<del> 1109,
<del> 11739,
<del> 4762,
<del> 358,
<del> 5,
<del> 25,
<del> 245,
<del> 22,
<del> 1706,
<del> 17,
<del> 20098,
<del> 5,
<del> 3215,
<del> 21,
<del> 37,
<del> 1110,
<del> 3,
<del> 13,
<del> 1041,
<del> 4,
<del> 24,
<del> 603,
<del> 490,
<del> 2,
<del> 71477,
<del> 20098,
<del> 104447,
<del> 2,
<del> 20961,
<del> 1,
<del> 2604,
<del> 4,
<del> 1,
<del> 329,
<del> 3,
<del> 6224,
<del> 831,
<del> 16002,
<del> 2,
<del> 8,
<del> 603,
<del> 78967,
<del> 29546,
<del> 23,
<del> 803,
<del> 20,
<del> 25,
<del> 416,
<del> 5,
<del> 8,
<del> 232,
<del> 4,
<del> 277,
<del> 6,
<del> 1855,
<del> 4601,
<del> 3,
<del> 29546,
<del> 54,
<del> 8,
<del> 3609,
<del> 5,
<del> 57211,
<del> 49,
<del> 4,
<del> 1,
<del> 277,
<del> 18,
<del> 8,
<del> 1755,
<del> 15691,
<del> 3,
<del> 341,
<del> 25,
<del> 416,
<del> 693,
<del> 42573,
<del> 71,
<del> 17,
<del> 401,
<del> 94,
<del> 31,
<del> 17919,
<del> 2,
<del> 29546,
<del> 7873,
<del> 18,
<del> 1,
<del> 435,
<del> 23,
<del> 11011,
<del> 755,
<del> 5,
<del> 5167,
<del> 3,
<del> 7983,
<del> 98,
<del> 84,
<del> 2,
<del> 29546,
<del> 3267,
<del> 8,
<del> 3609,
<del> 4,
<del> 1,
<del> 4865,
<del> 1075,
<del> 2,
<del> 6087,
<del> 71,
<del> 6,
<del> 346,
<del> 8,
<del> 5854,
<del> 3,
<del> 29546,
<del> 824,
<del> 1400,
<del> 1868,
<del> 2,
<del> 19,
<del> 160,
<del> 2,
<del> 311,
<del> 8,
<del> 5496,
<del> 2,
<del> 20920,
<del> 17,
<del> 25,
<del> 15097,
<del> 3,
<del> 24,
<del> 24,
<del> 0,
<del> 33,
<del> 1,
<del> 1857,
<del> 2,
<del> 1,
<del> 1009,
<del> 4,
<del> 1109,
<del> 11739,
<del> 4762,
<del> 358,
<del> 5,
<del> 25,
<del> 245,
<del> 28,
<del> 1110,
<del> 3,
<del> 13,
<del> 1041,
<del> 4,
<del> 24,
<del> 603,
<del> 490,
<del> 2,
<del> 71477,
<del> 20098,
<del> 104447,
<del> 2,
<del> 20961,
<del> 1,
<del> 2604,
<del> 4,
<del> 1,
<del> 329,
<del> 3,
<del> 0,
<del> ]
<add> # fmt: off
<add> expected_output_ids = [33,1297,2,1,1009,4,1109,11739,4762,358,5,25,245,22,1706,17,20098,5,3215,21,37,1110,3,13,1041,4,24,603,490,2,71477,20098,104447,2,20961,1,2604,4,1,329,3,6224,831,16002,2,8,603,78967,29546,23,803,20,25,416,5,8,232,4,277,6,1855,4601,3,29546,54,8,3609,5,57211,49,4,1,277,18,8,1755,15691,3,341,25,416,693,42573,71,17,401,94,31,17919,2,29546,7873,18,1,435,23,11011,755,5,5167,3,7983,98,84,2,29546,3267,8,3609,4,1,4865,1075,2,6087,71,6,346,8,5854,3,29546,824,1400,1868,2,19,160,2,311,8,5496,2,20920,17,25,15097,3,24,24,0,33,1,1857,2,1,1009,4,1109,11739,4762,358,5,25,245,28,1110,3,13,1041,4,24,603,490,2,71477,20098,104447,2,20961,1,2604,4,1,329,3,0] # noqa: E231
<add> # fmt: on
<ide> # In 1991, the remains of Russian Tsar Nicholas II and his family (
<ide> # except for Alexei and Maria ) are discovered. The voice of young son,
<ide> # Tsarevich Alexei Nikolaevich, narrates the remainder of the story.
<ide><path>tests/test_modeling_transfo_xl.py
<ide> class TransfoXLModelLanguageGenerationTest(unittest.TestCase):
<ide> def test_lm_generate_transfo_xl_wt103(self):
<ide> model = TransfoXLLMHeadModel.from_pretrained("transfo-xl-wt103")
<ide> model.to(torch_device)
<del> input_ids = torch.tensor(
<del> [
<del> [
<del> 33,
<del> 1297,
<del> 2,
<del> 1,
<del> 1009,
<del> 4,
<del> 1109,
<del> 11739,
<del> 4762,
<del> 358,
<del> 5,
<del> 25,
<del> 245,
<del> 22,
<del> 1706,
<del> 17,
<del> 20098,
<del> 5,
<del> 3215,
<del> 21,
<del> 37,
<del> 1110,
<del> 3,
<del> 13,
<del> 1041,
<del> 4,
<del> 24,
<del> 603,
<del> 490,
<del> 2,
<del> 71477,
<del> 20098,
<del> 104447,
<del> 2,
<del> 20961,
<del> 1,
<del> 2604,
<del> 4,
<del> 1,
<del> 329,
<del> 3,
<del> 6224,
<del> 831,
<del> 16002,
<del> 2,
<del> 8,
<del> 603,
<del> 78967,
<del> 29546,
<del> 23,
<del> 803,
<del> 20,
<del> 25,
<del> 416,
<del> 5,
<del> 8,
<del> 232,
<del> 4,
<del> 277,
<del> 6,
<del> 1855,
<del> 4601,
<del> 3,
<del> 29546,
<del> 54,
<del> 8,
<del> 3609,
<del> 5,
<del> 57211,
<del> 49,
<del> 4,
<del> 1,
<del> 277,
<del> 18,
<del> 8,
<del> 1755,
<del> 15691,
<del> 3,
<del> 341,
<del> 25,
<del> 416,
<del> 693,
<del> 42573,
<del> 71,
<del> 17,
<del> 401,
<del> 94,
<del> 31,
<del> 17919,
<del> 2,
<del> 29546,
<del> 7873,
<del> 18,
<del> 1,
<del> 435,
<del> 23,
<del> 11011,
<del> 755,
<del> 5,
<del> 5167,
<del> 3,
<del> 7983,
<del> 98,
<del> 84,
<del> 2,
<del> 29546,
<del> 3267,
<del> 8,
<del> 3609,
<del> 4,
<del> 1,
<del> 4865,
<del> 1075,
<del> 2,
<del> 6087,
<del> 71,
<del> 6,
<del> 346,
<del> 8,
<del> 5854,
<del> 3,
<del> 29546,
<del> 824,
<del> 1400,
<del> 1868,
<del> 2,
<del> 19,
<del> 160,
<del> 2,
<del> 311,
<del> 8,
<del> 5496,
<del> 2,
<del> 20920,
<del> 17,
<del> 25,
<del> 15097,
<del> 3,
<del> 24,
<del> 24,
<del> 0,
<del> ]
<del> ],
<del> dtype=torch.long,
<del> device=torch_device,
<del> )
<add>
<add> # fmt: off
<add> input_ids = torch.tensor([[33,1297,2,1,1009,4,1109,11739,4762,358,5,25,245,22,1706,17,20098,5,3215,21,37,1110,3,13,1041,4,24,603,490,2,71477,20098,104447,2,20961,1,2604,4,1,329,3,6224,831,16002,2,8,603,78967,29546,23,803,20,25,416,5,8,232,4,277,6,1855,4601,3,29546,54,8,3609,5,57211,49,4,1,277,18,8,1755,15691,3,341,25,416,693,42573,71,17,401,94,31,17919,2,29546,7873,18,1,435,23,11011,755,5,5167,3,7983,98,84,2,29546,3267,8,3609,4,1,4865,1075,2,6087,71,6,346,8,5854,3,29546,824,1400,1868,2,19,160,2,311,8,5496,2,20920,17,25,15097,3,24,24,0]],dtype=torch.long,device=torch_device) # noqa: E231
<add> # fmt: on
<ide> # In 1991 , the remains of Russian Tsar Nicholas II and his family
<ide> # ( except for Alexei and Maria ) are discovered .
<ide> # The voice of Nicholas's young son , Tsarevich Alexei Nikolaevich , narrates the
<ide> def test_lm_generate_transfo_xl_wt103(self):
<ide> # the Virgin Mary , prompting him to become a priest . Rasputin quickly becomes famous ,
<ide> # with people , even a bishop , begging for his blessing . <eod> </s> <eos>
<ide>
<del> expected_output_ids = [
<del> 33,
<del> 1297,
<del> 2,
<del> 1,
<del> 1009,
<del> 4,
<del> 1109,
<del> 11739,
<del> 4762,
<del> 358,
<del> 5,
<del> 25,
<del> 245,
<del> 22,
<del> 1706,
<del> 17,
<del> 20098,
<del> 5,
<del> 3215,
<del> 21,
<del> 37,
<del> 1110,
<del> 3,
<del> 13,
<del> 1041,
<del> 4,
<del> 24,
<del> 603,
<del> 490,
<del> 2,
<del> 71477,
<del> 20098,
<del> 104447,
<del> 2,
<del> 20961,
<del> 1,
<del> 2604,
<del> 4,
<del> 1,
<del> 329,
<del> 3,
<del> 6224,
<del> 831,
<del> 16002,
<del> 2,
<del> 8,
<del> 603,
<del> 78967,
<del> 29546,
<del> 23,
<del> 803,
<del> 20,
<del> 25,
<del> 416,
<del> 5,
<del> 8,
<del> 232,
<del> 4,
<del> 277,
<del> 6,
<del> 1855,
<del> 4601,
<del> 3,
<del> 29546,
<del> 54,
<del> 8,
<del> 3609,
<del> 5,
<del> 57211,
<del> 49,
<del> 4,
<del> 1,
<del> 277,
<del> 18,
<del> 8,
<del> 1755,
<del> 15691,
<del> 3,
<del> 341,
<del> 25,
<del> 416,
<del> 693,
<del> 42573,
<del> 71,
<del> 17,
<del> 401,
<del> 94,
<del> 31,
<del> 17919,
<del> 2,
<del> 29546,
<del> 7873,
<del> 18,
<del> 1,
<del> 435,
<del> 23,
<del> 11011,
<del> 755,
<del> 5,
<del> 5167,
<del> 3,
<del> 7983,
<del> 98,
<del> 84,
<del> 2,
<del> 29546,
<del> 3267,
<del> 8,
<del> 3609,
<del> 4,
<del> 1,
<del> 4865,
<del> 1075,
<del> 2,
<del> 6087,
<del> 71,
<del> 6,
<del> 346,
<del> 8,
<del> 5854,
<del> 3,
<del> 29546,
<del> 824,
<del> 1400,
<del> 1868,
<del> 2,
<del> 19,
<del> 160,
<del> 2,
<del> 311,
<del> 8,
<del> 5496,
<del> 2,
<del> 20920,
<del> 17,
<del> 25,
<del> 15097,
<del> 3,
<del> 24,
<del> 24,
<del> 0,
<del> 33,
<del> 1,
<del> 142,
<del> 1298,
<del> 188,
<del> 2,
<del> 29546,
<del> 113,
<del> 8,
<del> 3654,
<del> 4,
<del> 1,
<del> 1109,
<del> 7136,
<del> 833,
<del> 3,
<del> 13,
<del> 1645,
<del> 4,
<del> 29546,
<del> 11,
<del> 104,
<del> 7,
<del> 1,
<del> 1109,
<del> 532,
<del> 7129,
<del> 2,
<del> 10,
<del> 83507,
<del> 2,
<del> 1162,
<del> 1123,
<del> 2,
<del> 6,
<del> 7245,
<del> 10,
<del> 2,
<del> 5,
<del> 11,
<del> 104,
<del> 7,
<del> 1,
<del> 1109,
<del> 532,
<del> 7129,
<del> 2,
<del> 10,
<del> 24,
<del> 24,
<del> 10,
<del> 22,
<del> 10,
<del> 13,
<del> 770,
<del> 5863,
<del> 4,
<del> 7245,
<del> 10,
<del> ]
<add> # fmt: off
<add> expected_output_ids = [33,1297,2,1,1009,4,1109,11739,4762,358,5,25,245,22,1706,17,20098,5,3215,21,37,1110,3,13,1041,4,24,603,490,2,71477,20098,104447,2,20961,1,2604,4,1,329,3,6224,831,16002,2,8,603,78967,29546,23,803,20,25,416,5,8,232,4,277,6,1855,4601,3,29546,54,8,3609,5,57211,49,4,1,277,18,8,1755,15691,3,341,25,416,693,42573,71,17,401,94,31,17919,2,29546,7873,18,1,435,23,11011,755,5,5167,3,7983,98,84,2,29546,3267,8,3609,4,1,4865,1075,2,6087,71,6,346,8,5854,3,29546,824,1400,1868,2,19,160,2,311,8,5496,2,20920,17,25,15097,3,24,24,0,33,1,142,1298,188,2,29546,113,8,3654,4,1,1109,7136,833,3,13,1645,4,29546,11,104,7,1,1109,532,7129,2,10,83507,2,1162,1123,2,6,7245,10,2,5,11,104,7,1,1109,532,7129,2,10,24,24,10,22,10,13,770,5863,4,7245,10] # noqa: E231
<add> # fmt: on
<ide> # In 1991, the remains of Russian Tsar Nicholas II and his family ( except for
<ide> # Alexei and Maria ) are discovered. The voice of young son, Tsarevich Alexei
<ide> # Nikolaevich, narrates the remainder of the story. 1883 Western Siberia, a young | 2 |
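The rewritten tests use Black's `# fmt: off` / `# fmt: on` directives (plus `# noqa: E231` for flake8) to keep the long token-id literals on single lines. The directives only suppress reformatting; the values are untouched:

```python
# fmt: off
compact = [33, 1297, 2, 1, 1009]  # noqa: E231 -- the formatter leaves this line alone
# fmt: on

expanded = [
    33,
    1297,
    2,
    1,
    1009,
]

print(compact == expanded)  # True -- formatting directives never change semantics
```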
Java | Java | fix spel withtypedrootobject() test | 59c6b7e44543d3fdadfb78ae3c6a7c76de2fce91 | <ide><path>spring-expression/src/test/java/org/springframework/expression/spel/PropertyAccessTests.java
<ide> public void propertyAccessWithInstanceMethodResolver() {
<ide>
<ide> @Test
<ide> public void propertyAccessWithInstanceMethodResolverAndTypedRootObject() {
<del> Person target = new Person("p1");
<add> Person rootObject = new Person("p1");
<ide> EvaluationContext context = SimpleEvaluationContext.forReadOnlyDataBinding().
<del> withInstanceMethods().withTypedRootObject(target, TypeDescriptor.valueOf(Object.class)).build();
<add> withInstanceMethods().withTypedRootObject(rootObject, TypeDescriptor.valueOf(Object.class)).build();
<ide>
<del> assertThat(parser.parseExpression("name.substring(1)").getValue(context, target)).isEqualTo("1");
<del> assertThat(context.getRootObject().getValue()).isSameAs(target);
<add> assertThat(parser.parseExpression("name.substring(1)").getValue(context)).isEqualTo("1");
<add> assertThat(context.getRootObject().getValue()).isSameAs(rootObject);
<ide> assertThat(context.getRootObject().getTypeDescriptor().getType()).isSameAs(Object.class);
<ide> }
<ide> | 1 |
Java | Java | consolidate tests related to @config inheritance | 8cb5c36512e6355d379f533d75d2bef63d0de6bd | <ide><path>org.springframework.context/src/test/java/org/springframework/context/annotation/BeanMethodPolymorphismTests.java
<ide> import static org.junit.Assert.assertTrue;
<ide> import static org.junit.Assert.fail;
<ide>
<add>import java.lang.annotation.Inherited;
<add>
<ide> import org.junit.Test;
<ide> import org.springframework.beans.factory.parsing.BeanDefinitionParsingException;
<add>import org.springframework.beans.factory.support.DefaultListableBeanFactory;
<add>import org.springframework.beans.factory.support.RootBeanDefinition;
<ide>
<ide> /**
<ide> * Tests regarding overloading and overriding of bean methods.
<ide> public void beanMethodOverloadingWithInheritance() {
<ide> @Bean Integer anInt() { return 5; }
<ide> @Bean String aString(Integer dependency) { return "overloaded"+dependency; }
<ide> }
<del>
<add>
<ide> /**
<ide> * When inheritance is not involved, it is still possible to override a bean method from
<ide> * the container's point of view. This is not strictly 'overloading' of a method per se,
<ide> public void beanMethodShadowing() {
<ide> @Bean String aString() { return "shadow"; }
<ide> }
<ide>
<add> /**
<add> * Tests that polymorphic Configuration classes need not explicitly redeclare the
<add> * {@link Configuration} annotation. This respects the {@link Inherited} nature
<add> * of the Configuration annotation, even though it's being detected via ASM.
<add> */
<add> @Test
<add> public void beanMethodsDetectedOnSuperClass() {
<add> DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory();
<add> beanFactory.registerBeanDefinition("config", new RootBeanDefinition(Config.class));
<add> ConfigurationClassPostProcessor pp = new ConfigurationClassPostProcessor();
<add> pp.postProcessBeanFactory(beanFactory);
<add> beanFactory.getBean("testBean", TestBean.class);
<add> }
<add>
<add>
<add> @Configuration
<add> static class BaseConfig {
<add>
<add> @Bean
<add> public TestBean testBean() {
<add> return new TestBean();
<add> }
<add> }
<add>
<add>
<add> @Configuration
<add> static class Config extends BaseConfig {
<add> }
<add>
<ide> }
<ide><path>org.springframework.context/src/test/java/org/springframework/context/annotation/configuration/PolymorphicConfigurationTests.java
<del>/*
<del> * Copyright 2002-2009 the original author or authors.
<del> *
<del> * Licensed under the Apache License, Version 2.0 (the "License");
<del> * you may not use this file except in compliance with the License.
<del> * You may obtain a copy of the License at
<del> *
<del> * http://www.apache.org/licenses/LICENSE-2.0
<del> *
<del> * Unless required by applicable law or agreed to in writing, software
<del> * distributed under the License is distributed on an "AS IS" BASIS,
<del> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<del> * See the License for the specific language governing permissions and
<del> * limitations under the License.
<del> */
<del>
<del>package org.springframework.context.annotation.configuration;
<del>
<del>import java.lang.annotation.Inherited;
<del>
<del>import org.junit.Test;
<del>import test.beans.TestBean;
<del>
<del>import org.springframework.beans.factory.support.DefaultListableBeanFactory;
<del>import org.springframework.beans.factory.support.RootBeanDefinition;
<del>import org.springframework.context.annotation.Bean;
<del>import org.springframework.context.annotation.Configuration;
<del>import org.springframework.context.annotation.ConfigurationClassPostProcessor;
<del>
<del>/**
<del> * Tests that polymorphic Configuration classes need not explicitly redeclare the
<del> * {@link Configuration} annotation. This respects the {@link Inherited} nature
<del> * of the Configuration annotation, even though it's being detected via ASM.
<del> *
<del> * @author Chris Beams
<del> */
<del>public class PolymorphicConfigurationTests {
<del>
<del> @Test
<del> public void beanMethodsDetectedOnSuperClass() {
<del> DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory();
<del> beanFactory.registerBeanDefinition("config", new RootBeanDefinition(Config.class));
<del> ConfigurationClassPostProcessor pp = new ConfigurationClassPostProcessor();
<del> pp.postProcessBeanFactory(beanFactory);
<del> beanFactory.getBean("testBean", TestBean.class);
<del> }
<del>
<del>
<del> @Configuration
<del> static class SuperConfig {
<del>
<del> @Bean
<del> public TestBean testBean() {
<del> return new TestBean();
<del> }
<del> }
<del>
<del>
<del> @Configuration
<del> static class Config extends SuperConfig {
<del> }
<del>
<del>} | 2 |
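The consolidated test relies on `@Configuration` being `@Inherited`: bean methods declared on `BaseConfig` are still discovered on `Config`, which redeclares nothing. A loose Python analogue of that inherited-discovery idea (illustrative names, not Spring's mechanism):

```python
# Rough Python analogue of the inherited-configuration behavior the test
# asserts: a "bean method" declared on a base config class is still
# discoverable on a subclass that does not redeclare anything.

def bean(method):
    method._is_bean = True  # mark the method, loosely like @Bean
    return method

class BaseConfig:
    @bean
    def test_bean(self):
        return {"kind": "TestBean"}

class Config(BaseConfig):
    # No redeclaration needed; the bean method is inherited.
    pass

def find_bean_methods(config_cls):
    # Walk attributes (including inherited ones) and collect marked methods.
    return [name for name in dir(config_cls)
            if getattr(getattr(config_cls, name), "_is_bean", False)]

print(find_bean_methods(Config))  # ['test_bean']
```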
Javascript | Javascript | add documentation for ember.enumerableutils | 33c89fc3ff143820cfb0b5629080817c6cb62bd6 | <ide><path>packages/ember-metal/lib/enumerable_utils.js
<ide> indexOf = Array.prototype.indexOf || Ember.ArrayPolyfills.indexOf;
<ide> filter = Array.prototype.filter || Ember.ArrayPolyfills.filter;
<ide> splice = Array.prototype.splice;
<ide>
<add>/**
<add> * Defines some convenience methods for working with Enumerables.
<add> * `Ember.EnumerableUtils` uses `Ember.ArrayPolyfills` when necessary.
<add> *
<add> * @class EnumerableUtils
<add> * @namespace Ember
<add> * @static
<add> */
<ide> var utils = Ember.EnumerableUtils = {
<add> /**
<add> * Calls the map function on the passed object with a specified callback. This
<add> * uses the map method from `Ember.ArrayPolyfills` when necessary.
<add> *
<add> * @method map
<add> * @param {Object} obj The object that should be mapped
<add> * @param {Function} callback The callback to execute
<add> * @param {Object} thisArg Value to use as this when executing *callback*
<add> *
<add> * @return {Array} An array of mapped values.
<add> */
<ide> map: function(obj, callback, thisArg) {
<ide> return obj.map ? obj.map.call(obj, callback, thisArg) : map.call(obj, callback, thisArg);
<ide> },
<ide>
<add> /**
<add> * Calls the forEach function on the passed object with a specified callback. This
<add> * uses the forEach method from `Ember.ArrayPolyfills` when necessary.
<add> *
<add> * @method forEach
<add> * @param {Object} obj The object to call forEach on
<add> * @param {Function} callback The callback to execute
<add> * @param {Object} thisArg Value to use as this when executing *callback*
<add> *
<add> */
<ide> forEach: function(obj, callback, thisArg) {
<ide> return obj.forEach ? obj.forEach.call(obj, callback, thisArg) : forEach.call(obj, callback, thisArg);
<ide> },
<ide>
<add> /**
<add> * Calls the filter function on the passed object with a specified callback. This
<add> * uses the filter method from `Ember.ArrayPolyfills` when necessary.
<add> *
<add> * @method filter
<add> * @param {Object} obj The object to call filter on
<add> * @param {Function} callback The callback to execute
<add> * @param {Object} thisArg Value to use as this when executing *callback*
<add> *
<add> * @return {Array} An array containing the filtered values
<add> */
<ide> filter: function(obj, callback, thisArg) {
<ide> return obj.filter ? obj.filter.call(obj, callback, thisArg) : filter.call(obj, callback, thisArg);
<ide> },
<ide>
<add> /**
<add> * Calls the indexOf function on the passed object with the specified element. This
<add> * uses the indexOf method from `Ember.ArrayPolyfills` when necessary.
<add> *
<add> * @method indexOf
<add> * @param {Object} obj The object to call indexOf on
<add> * @param {Object} element The element to search for
<add> * @param {Number} index The index to start searching from
<add> *
<add> */
<ide> indexOf: function(obj, element, index) {
<ide> return obj.indexOf ? obj.indexOf.call(obj, element, index) : indexOf.call(obj, element, index);
<ide> },
<ide>
<add> /**
<add> * Returns an array of indexes of the first occurrences of the passed elements
<add> * on the passed object.
<add> *
<add> * ```javascript
<add> * var array = [1, 2, 3, 4, 5];
<add> * Ember.EnumerableUtils.indexesOf(array, [2, 5]); // [1, 4]
<add> *
<add> * var fubar = "Fubarr";
<add> * Ember.EnumerableUtils.indexesOf(fubar, ['b', 'r']); // [2, 4]
<add> * ```
<add> *
<add> * @method indexesOf
<add> * @param {Object} obj The object to check for element indexes
<add> * @param {Array} elements The elements to search for on *obj*
<add> *
<add> * @return {Array} An array of indexes.
<add> *
<add> */
<ide> indexesOf: function(obj, elements) {
<ide> return elements === undefined ? [] : utils.map(elements, function(item) {
<ide> return utils.indexOf(obj, item);
<ide> });
<ide> },
<ide>
<add> /**
<add> * Adds an object to an array. If the array already includes the object this
<add> * method has no effect.
<add> *
<add> * @method addObject
<add> * @param {Array} array The array the passed item should be added to
<add> * @param {Object} item The item to add to the passed array
<add> *
<add> * @return 'undefined'
<add> */
<ide> addObject: function(array, item) {
<ide> var index = utils.indexOf(array, item);
<ide> if (index === -1) { array.push(item); }
<ide> },
<ide>
<add> /**
<add> * Removes an object from an array. If the array does not contain the passed
<add> * object this method has no effect.
<add> *
<add> * @method removeObject
<add> * @param {Array} array The array to remove the item from.
<add> * @param {Object} item The item to remove from the passed array.
<add> *
<add> * @return 'undefined'
<add> */
<ide> removeObject: function(array, item) {
<ide> var index = utils.indexOf(array, item);
<ide> if (index !== -1) { array.splice(index, 1); }
<ide> var utils = Ember.EnumerableUtils = {
<ide> return ret;
<ide> },
<ide>
<add> /**
<add> * Replaces objects in an array with the passed objects.
<add> *
<add> * ```javascript
<add> * var array = [1,2,3];
<add> * Ember.EnumerableUtils.replace(array, 1, 2, [4, 5]); // [1, 4, 5]
<add> *
<add> * var array = [1,2,3];
<add> * Ember.EnumerableUtils.replace(array, 1, 1, [4, 5]); // [1, 4, 5, 3]
<add> *
<add> * var array = [1,2,3];
<add> * Ember.EnumerableUtils.replace(array, 10, 1, [4, 5]); // [1, 2, 3, 4, 5]
<add> * ```
<add> *
<add> * @method replace
<add> * @param {Array} array The array the objects should be inserted into.
<add> * @param {Number} idx Starting index in the array to replace. If *idx* >=
<add> * length, then append to the end of the array.
<add> * @param {Number} amt Number of elements that should be remove from the array,
<add> * starting at *idx*
<add> * @param {Array} objects An array of zero or more objects that should be
<add> * inserted into the array at *idx*
<add> *
<add> * @return {Array} The changed array.
<add> */
<ide> replace: function(array, idx, amt, objects) {
<ide> if (array.replace) {
<ide> return array.replace(idx, amt, objects);
<ide> var utils = Ember.EnumerableUtils = {
<ide> }
<ide> },
<ide>
<add> /**
<add> * Calculates the intersection of two arrays. This method returns a new array
<add> * filled with the records that the two passed arrays share with each other.
<add> * If there is no intersection, an empty array will be returned.
<add> *
<add> * ```javascript
<add> * var array1 = [1, 2, 3, 4, 5];
<add> * var array2 = [1, 3, 5, 6, 7];
<add> *
<add> * Ember.EnumerableUtils.intersection(array1, array2); // [1, 3, 5]
<add> *
<add> * var array1 = [1, 2, 3];
<add> * var array2 = [4, 5, 6];
<add> *
<add> * Ember.EnumerableUtils.intersection(array1, array2); // []
<add> * ```
<add> *
<add> * @method intersection
<add> * @param {Array} array1 The first array
<add> * @param {Array} array2 The second array
<add> *
<add> * @return {Array} The intersection of the two passed arrays.
<add> */
<ide> intersection: function(array1, array2) {
<ide> var intersection = [];
<ide> | 1 |
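The examples in the new JSDoc can be sanity-checked with small reference implementations. A Python sketch mirroring the documented semantics of `indexesOf`, `replace`, and `intersection` (illustrative only; the real implementations are the JavaScript ones above):

```python
# Python sketch of the semantics documented above for indexesOf, replace,
# and intersection. Illustrative only; the real implementations live in
# ember-metal's JavaScript.

def indexes_of(obj, elements):
    # Index of the first occurrence of each element (-1 when absent).
    return [obj.index(e) if e in obj else -1 for e in elements]

def replace(array, idx, amt, objects):
    # Remove `amt` items starting at `idx` (appending when idx >= len)
    # and splice in `objects`; mutates and returns the array.
    array[idx:idx + amt] = objects
    return array

def intersection(array1, array2):
    # Items the two arrays share, preserving array1's order.
    return [x for x in array1 if x in array2]

print(indexes_of("Fubarr", ["b", "r"]))                 # [2, 4]
print(replace([1, 2, 3], 1, 2, [4, 5]))                 # [1, 4, 5]
print(replace([1, 2, 3], 1, 1, [4, 5]))                 # [1, 4, 5, 3]
print(replace([1, 2, 3], 10, 1, [4, 5]))                # [1, 2, 3, 4, 5]
print(intersection([1, 2, 3, 4, 5], [1, 3, 5, 6, 7]))   # [1, 3, 5]
```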
Ruby | Ruby | allow compare with string | b100ad223d24bef79bc2e40491b68aa06c434465 | <ide><path>Library/Homebrew/tap.rb
<ide> def formula_renames
<ide> end
<ide> end
<ide>
<add> def ==(other)
<add> other = Tap.fetch(other) if other.is_a?(String)
<add> self.class == other.class && self.name == other.name
<add> end
<add>
<ide> def self.each
<ide> return unless TAP_DIRECTORY.directory?
<ide> | 1 |
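The Ruby change makes a `Tap` compare equal to its name string by coercing the string through `Tap.fetch` before comparing. A hedged Python sketch of the same coerce-then-compare equality (the registry details here are hypothetical):

```python
# Sketch of equality that coerces a plain string into the domain object
# before comparing, mirroring the Tap#== change. Names are illustrative.

class Tap:
    _registry = {}

    def __init__(self, name):
        self.name = name

    @classmethod
    def fetch(cls, name):
        # Return one canonical instance per name, like a tap registry lookup.
        return cls._registry.setdefault(name, cls(name))

    def __eq__(self, other):
        if isinstance(other, str):
            other = Tap.fetch(other)
        return type(self) is type(other) and self.name == other.name

    def __hash__(self):
        return hash(self.name)

tap = Tap.fetch("homebrew/core")
print(tap == "homebrew/core")            # True
print(tap == Tap.fetch("homebrew/core")) # True
print(tap == "homebrew/cask")            # False
```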
Go | Go | move caps and device spec utils to `oci` pkg | b940cc5cff325ecba2dc2a950f14f098e1519511 | <ide><path>daemon/exec_linux.go
<ide> package daemon // import "github.com/docker/docker/daemon"
<ide>
<ide> import (
<ide> "github.com/docker/docker/container"
<del> "github.com/docker/docker/daemon/caps"
<ide> "github.com/docker/docker/daemon/exec"
<add> "github.com/docker/docker/oci/caps"
<ide> "github.com/opencontainers/runc/libcontainer/apparmor"
<ide> "github.com/opencontainers/runtime-spec/specs-go"
<ide> )
<ide><path>daemon/oci_linux.go
<ide> func setDevices(s *specs.Spec, c *container.Container) error {
<ide> }
<ide>
<ide> var err error
<del> devPermissions, err = appendDevicePermissionsFromCgroupRules(devPermissions, c.HostConfig.DeviceCgroupRules)
<add> devPermissions, err = oci.AppendDevicePermissionsFromCgroupRules(devPermissions, c.HostConfig.DeviceCgroupRules)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> func (daemon *Daemon) createSpec(c *container.Container) (retSpec *specs.Spec, e
<ide> if err := setNamespaces(daemon, &s, c); err != nil {
<ide> return nil, fmt.Errorf("linux spec namespaces: %v", err)
<ide> }
<del> if err := setCapabilities(&s, c); err != nil {
<add> if err := oci.SetCapabilities(&s, c.HostConfig.CapAdd, c.HostConfig.CapDrop, c.HostConfig.Privileged); err != nil {
<ide> return nil, fmt.Errorf("linux spec capabilities: %v", err)
<ide> }
<ide> if err := setSeccomp(daemon, &s, c); err != nil {
<ide><path>daemon/oci_windows.go
<ide> func (daemon *Daemon) createSpecLinuxFields(c *container.Container, s *specs.Spe
<ide> }
<ide> s.Root.Path = "rootfs"
<ide> s.Root.Readonly = c.HostConfig.ReadonlyRootfs
<del> if err := setCapabilities(s, c); err != nil {
<add> if err := oci.SetCapabilities(s, c.HostConfig.CapAdd, c.HostConfig.CapDrop, c.HostConfig.Privileged); err != nil {
<ide> return fmt.Errorf("linux spec capabilities: %v", err)
<ide> }
<del> devPermissions, err := appendDevicePermissionsFromCgroupRules(nil, c.HostConfig.DeviceCgroupRules)
<add> devPermissions, err := oci.AppendDevicePermissionsFromCgroupRules(nil, c.HostConfig.DeviceCgroupRules)
<ide> if err != nil {
<ide> return fmt.Errorf("linux runtime spec devices: %v", err)
<ide> }
<add><path>oci/caps/utils.go
<del><path>daemon/caps/utils.go
<del>package caps // import "github.com/docker/docker/daemon/caps"
<add>package caps // import "github.com/docker/docker/oci/caps"
<ide>
<ide> import (
<ide> "fmt"
<add><path>oci/oci.go
<del><path>daemon/oci.go
<del>package daemon // import "github.com/docker/docker/daemon"
<add>package oci // import "github.com/docker/docker/oci"
<ide>
<ide> import (
<ide> "fmt"
<ide> "regexp"
<ide> "strconv"
<ide>
<del> "github.com/docker/docker/container"
<del> "github.com/docker/docker/daemon/caps"
<add> "github.com/docker/docker/oci/caps"
<ide> specs "github.com/opencontainers/runtime-spec/specs-go"
<ide> )
<ide>
<ide> // nolint: gosimple
<del>var (
<del> deviceCgroupRuleRegex = regexp.MustCompile("^([acb]) ([0-9]+|\\*):([0-9]+|\\*) ([rwm]{1,3})$")
<del>)
<add>var deviceCgroupRuleRegex = regexp.MustCompile("^([acb]) ([0-9]+|\\*):([0-9]+|\\*) ([rwm]{1,3})$")
<ide>
<del>func setCapabilities(s *specs.Spec, c *container.Container) error {
<del> var caplist []string
<del> var err error
<del> if c.HostConfig.Privileged {
<add>// SetCapabilities sets the provided capabilities on the spec
<add>// All capabilities are added if privileged is true
<add>func SetCapabilities(s *specs.Spec, add, drop []string, privileged bool) error {
<add> var (
<add> caplist []string
<add> err error
<add> )
<add> if privileged {
<ide> caplist = caps.GetAllCapabilities()
<ide> } else {
<del> caplist, err = caps.TweakCapabilities(s.Process.Capabilities.Bounding, c.HostConfig.CapAdd, c.HostConfig.CapDrop)
<add> caplist, err = caps.TweakCapabilities(s.Process.Capabilities.Bounding, add, drop)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> func setCapabilities(s *specs.Spec, c *container.Container) error {
<ide> return nil
<ide> }
<ide>
<del>func appendDevicePermissionsFromCgroupRules(devPermissions []specs.LinuxDeviceCgroup, rules []string) ([]specs.LinuxDeviceCgroup, error) {
<add>// AppendDevicePermissionsFromCgroupRules takes rules for the devices cgroup to append to the default set
<add>func AppendDevicePermissionsFromCgroupRules(devPermissions []specs.LinuxDeviceCgroup, rules []string) ([]specs.LinuxDeviceCgroup, error) {
<ide> for _, deviceCgroupRule := range rules {
<ide> ss := deviceCgroupRuleRegex.FindAllStringSubmatch(deviceCgroupRule, -1)
<ide> if len(ss[0]) != 5 { | 5 |
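The moved `AppendDevicePermissionsFromCgroupRules` validates each rule against the regex shown in the diff: a device type (`a`, `c`, or `b`), a `major:minor` pair where either side may be `*`, and an access string of one to three of `r`/`w`/`m`. A Python sketch parsing the same rule syntax:

```python
# Python sketch of parsing device-cgroup rules with the same pattern the
# moved Go code uses: "<type> <major>:<minor> <access>", e.g. "c 1:3 rwm".

import re

DEVICE_RULE = re.compile(r"^([acb]) ([0-9]+|\*):([0-9]+|\*) ([rwm]{1,3})$")

def parse_device_rule(rule):
    m = DEVICE_RULE.match(rule)
    if m is None:
        raise ValueError("invalid device cgroup rule: %r" % rule)
    dev_type, major, minor, access = m.groups()
    return {"type": dev_type, "major": major, "minor": minor, "access": access}

print(parse_device_rule("c 1:3 rwm"))
# {'type': 'c', 'major': '1', 'minor': '3', 'access': 'rwm'}
print(parse_device_rule("b *:* rw")["access"])  # 'rw'
```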
Go | Go | add detach event | 83ad006d4724929ccbde4bdf768374fad0eeab44 | <ide><path>container/container.go
<ide> var (
<ide> errInvalidNetwork = fmt.Errorf("invalid network settings while building port map info")
<ide> )
<ide>
<add>// AttachError represents errors of attach
<add>type AttachError interface {
<add> IsDetached() bool
<add>}
<add>
<add>type detachError struct{}
<add>
<add>func (e detachError) IsDetached() bool {
<add> return true
<add>}
<add>
<add>func (e detachError) Error() string {
<add> return "detached from container"
<add>}
<add>
<ide> // CommonContainer holds the fields for a container which are
<ide> // applicable across all platforms supported by the daemon.
<ide> type CommonContainer struct {
<ide> func AttachStreams(ctx context.Context, streamConfig *runconfig.StreamConfig, op
<ide> _, err = copyEscapable(cStdin, stdin, keys)
<ide> } else {
<ide> _, err = io.Copy(cStdin, stdin)
<del>
<ide> }
<ide> if err == io.ErrClosedPipe {
<ide> err = nil
<ide> func copyEscapable(dst io.Writer, src io.ReadCloser, keys []byte) (written int64
<ide> break
<ide> }
<ide> if i == len(keys)-1 {
<del> if err := src.Close(); err != nil {
<del> return 0, err
<del> }
<del> return 0, nil
<add> src.Close()
<add> return 0, detachError{}
<ide> }
<ide> nr, er = src.Read(buf)
<ide> }
<ide><path>daemon/attach.go
<ide> func (daemon *Daemon) ContainerAttachRaw(prefixOrName string, stdin io.ReadClose
<ide> return daemon.containerAttach(container, stdin, stdout, stderr, false, stream, nil)
<ide> }
<ide>
<del>func (daemon *Daemon) containerAttach(container *container.Container, stdin io.ReadCloser, stdout, stderr io.Writer, logs, stream bool, keys []byte) error {
<add>func (daemon *Daemon) containerAttach(c *container.Container, stdin io.ReadCloser, stdout, stderr io.Writer, logs, stream bool, keys []byte) error {
<ide> if logs {
<del> logDriver, err := daemon.getLogger(container)
<add> logDriver, err := daemon.getLogger(c)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> func (daemon *Daemon) containerAttach(container *container.Container, stdin io.R
<ide> }
<ide> }
<ide>
<del> daemon.LogContainerEvent(container, "attach")
<add> daemon.LogContainerEvent(c, "attach")
<ide>
<ide> //stream
<ide> if stream {
<ide> func (daemon *Daemon) containerAttach(container *container.Container, stdin io.R
<ide> }()
<ide> stdinPipe = r
<ide> }
<del> <-container.Attach(stdinPipe, stdout, stderr, keys)
<add> err := <-c.Attach(stdinPipe, stdout, stderr, keys)
<add> if err != nil {
<add> e, ok := err.(container.AttachError)
<add> if ok && e.IsDetached() {
<add> daemon.LogContainerEvent(c, "detach")
<add> } else {
<add> logrus.Errorf("attach failed with error: %v", err)
<add> }
<add> }
<add>
<ide> // If we are in stdinonce mode, wait for the process to end
<ide> // otherwise, simply return
<del> if container.Config.StdinOnce && !container.Config.Tty {
<del> container.WaitStop(-1 * time.Second)
<add> if c.Config.StdinOnce && !c.Config.Tty {
<add> c.WaitStop(-1 * time.Second)
<ide> }
<ide> }
<ide> return nil
<ide><path>daemon/exec.go
<ide> func (d *Daemon) ContainerExecStart(ctx context.Context, name string, stdin io.R
<ide> return fmt.Errorf("context cancelled")
<ide> case err := <-attachErr:
<ide> if err != nil {
<del> return fmt.Errorf("attach failed with error: %v", err)
<add> e, ok := err.(container.AttachError)
<add> if !ok || !e.IsDetached() {
<add> return fmt.Errorf("attach failed with error: %v", err)
<add> }
<add> d.LogContainerEvent(c, "detach")
<ide> }
<ide> }
<ide> return nil
<ide><path>integration-cli/docker_cli_run_unix_test.go
<ide> func (s *DockerSuite) TestRunAttachDetach(c *check.C) {
<ide>
<ide> running := inspectField(c, name, "State.Running")
<ide> c.Assert(running, checker.Equals, "true", check.Commentf("expected container to still be running"))
<add>
<add> out, _ = dockerCmd(c, "events", "--since=0", "--until", daemonUnixTime(c), "-f", "container="+name)
<add> // attach and detach event should be monitored
<add> c.Assert(out, checker.Contains, "attach")
<add> c.Assert(out, checker.Contains, "detach")
<ide> }
<ide>
<ide> // TestRunDetach checks attaching and detaching with the escape sequence specified via flags. | 4 |
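The key behavioral change is in `copyEscapable`: instead of silently closing stdin when the detach keys are seen, it now returns a typed `detachError`, which callers check via `IsDetached()` so they can log a `detach` event rather than an attach failure. A Python sketch of that escape-sequence detection in a copy loop (simplified relative to the Go code):

```python
# Sketch of the detach-key handling in copyEscapable: copy bytes from a
# source, and raise a typed error once the full escape sequence (e.g.
# Ctrl-P Ctrl-Q) has been seen, so the caller can emit a "detach" event
# instead of treating it as a plain error or EOF. Simplified sketch.

class DetachError(Exception):
    def is_detached(self):
        return True

def copy_escapable(src_bytes, keys):
    """Return the bytes written before detaching; raise DetachError on detach."""
    written = bytearray()
    i = 0  # how many escape keys have matched so far
    for b in src_bytes:
        while True:
            if b == keys[i]:
                i += 1
                if i == len(keys):
                    raise DetachError("detached from container")
                break
            if i == 0:
                written.append(b)
                break
            # Partial escape turned out not to be one: flush it, retry b.
            written.extend(keys[:i])
            i = 0
    written.extend(keys[:i])  # flush a trailing partial escape
    return bytes(written)

KEYS = b"\x10\x11"  # Ctrl-P, Ctrl-Q
try:
    copy_escapable(b"ls\n\x10\x11", KEYS)
except DetachError as e:
    print(e.is_detached())  # True
```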
Text | Text | remove zerowidth space at the beginning of file | 1b86246d64854240abd291fb4bd36f7b3dd765c2 | <ide><path>docs/docs/conferences.md
<del>---
<add>---
<ide> id: conferences
<ide> title: Conferences
<ide> permalink: conferences.html | 1 |
Java | Java | add set/getcontentlanguage() to httpheaders | 36da299f96e7be182ea956f3094d0cf1715da6c5 | <ide><path>spring-web/src/main/java/org/springframework/http/HttpHeaders.java
<ide> public ContentDisposition getContentDisposition() {
<ide> return ContentDisposition.empty();
<ide> }
<ide>
<add> /**
<add> * Set the {@link Locale} of the content language,
<add> * as specified by the {@literal Content-Language} header.
<add> * <p>Use {@code set(CONTENT_LANGUAGE, ...)} if you need
<add> * to set multiple content languages.</p>
<add> */
<add> public void setContentLanguage(Locale locale) {
<add> Assert.notNull(locale, "'locale' must not be null");
<add> set(CONTENT_LANGUAGE, locale.toLanguageTag());
<add> }
<add>
<add> /**
<add> * Return the first {@link Locale} of the content languages,
<add> * as specified by the {@literal Content-Language} header.
<add> * <p>Returns {@code null} when the content language is unknown.
<add> * <p>Use {@code getValuesAsList(CONTENT_LANGUAGE)} if you need
<add> * to get multiple content languages.</p>
<add> */
<add> public Locale getContentLanguage() {
<add> return getValuesAsList(CONTENT_LANGUAGE)
<add> .stream()
<add> .findFirst()
<add> .map(Locale::forLanguageTag)
<add> .orElse(null);
<add> }
<add>
<ide> /**
<ide> * Set the length of the body in bytes, as specified by the
<ide> * {@code Content-Length} header.
<ide><path>spring-web/src/test/java/org/springframework/http/HttpHeadersTests.java
<ide> public void acceptLanguage() {
<ide> assertArrayEquals(languageArray, languages.toArray());
<ide> }
<ide>
<add> @Test
<add> public void contentLanguage() {
<add> assertNull(headers.getContentLanguage());
<add> headers.setContentLanguage(Locale.FRANCE);
<add> assertEquals(Locale.FRANCE, headers.getContentLanguage());
<add> assertEquals("fr-FR", headers.getFirst(HttpHeaders.CONTENT_LANGUAGE));
<add> headers.clear();
<add> headers.set(HttpHeaders.CONTENT_LANGUAGE, Locale.GERMAN.toLanguageTag() + ", " + Locale.CANADA);
<add> assertEquals(Locale.GERMAN, headers.getContentLanguage());
<add> }
<add>
<ide> } | 2 |
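As the Javadoc notes, `getContentLanguage()` returns only the first tag of a possibly comma-separated `Content-Language` value. A Python sketch of that first-tag behavior (illustrative; Spring's version parses the tag into a `java.util.Locale`):

```python
# Sketch of the Content-Language accessors added above: the setter writes
# one language tag, the getter returns only the first tag of a possibly
# comma-separated header value, or None when the header is absent.

class Headers:
    def __init__(self):
        self._headers = {}

    def set_content_language(self, tag):
        self._headers["Content-Language"] = tag

    def get_content_language(self):
        value = self._headers.get("Content-Language")
        if not value:
            return None
        # Multiple languages may be listed; only the first is returned.
        return value.split(",")[0].strip()

h = Headers()
h.set_content_language("fr-FR")
print(h.get_content_language())  # 'fr-FR'
h._headers["Content-Language"] = "de, en-CA"
print(h.get_content_language())  # 'de'
```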
Python | Python | fix typo in test | 163ae9681e4ec19de40be444ebf4cdcbca759307 | <ide><path>keras/tests/integration_test.py
<ide> def densify(x, y):
<ide> keras.layers.Conv1D(4, 5, padding='same', activation='relu'),
<ide> keras.layers.Conv1D(8, 5, padding='same'),
<ide> keras.layers.BatchNormalization(),
<del> keras.layers.Conv2D(3, 5, padding='same', activation='softmax'),
<add> keras.layers.Conv1D(3, 5, padding='same', activation='softmax'),
<ide> ]
<ide> model = test_utils.get_model_from_layers(
<ide> layers, input_shape=(None,)) | 1 |
Mixed | Text | change the default adapter from inline to async | 625baa69d14881ac49ba2e5c7d9cac4b222d7022 | <ide><path>activejob/CHANGELOG.md
<add>* Change the default adapter from inline to async. It's a better default as tests will then not mistakenly
<add> come to rely on behavior happening synchronously. This is especially important with things like jobs kicked off
<add> in Active Record lifecycle callbacks.
<add>
<add> *DHH*
<add>
<add>
<ide> ## Rails 5.0.0.beta2 (February 01, 2016) ##
<ide>
<ide> * No changes.
<ide><path>activejob/lib/active_job/queue_adapter.rb
<ide> module QueueAdapter #:nodoc:
<ide>
<ide> included do
<ide> class_attribute :_queue_adapter, instance_accessor: false, instance_predicate: false
<del> self.queue_adapter = :inline
<add> self.queue_adapter = :async
<ide> end
<ide>
<ide> # Includes the setter method for changing the active queue adapter.
<ide> module ClassMethods
<ide> # Returns the backend queue provider. The default queue adapter
<del> # is the +:inline+ queue. See QueueAdapters for more information.
<add> # is the +:async+ queue. See QueueAdapters for more information.
<ide> def queue_adapter
<ide> _queue_adapter
<ide> end
<ide>
<ide> # Specify the backend queue provider. The default queue adapter
<del> # is the +:inline+ queue. See QueueAdapters for more
<add> # is the +:async+ queue. See QueueAdapters for more
<ide> # information.
<ide> def queue_adapter=(name_or_adapter_or_class)
<ide> self._queue_adapter = interpret_adapter(name_or_adapter_or_class)
<ide><path>guides/source/active_job_basics.md
<ide> That's it!
<ide> Job Execution
<ide> -------------
<ide>
<del>For enqueuing and executing jobs you need to set up a queuing backend, that is to
<del>say you need to decide for a 3rd-party queuing library that Rails should use.
<del>Rails itself does not provide a sophisticated queuing system and just executes the
<del>job immediately if no adapter is set.
<add>For enqueuing and executing jobs in production you need to set up a queuing backend,
<add>that is to say you need to decide on a 3rd-party queuing library that Rails should use.
<add>Rails itself only provides an in-process queuing system, which only keeps the jobs in RAM.
<add>If the process crashes or the machine is reset, then all outstanding jobs are lost with the
<add>default async back-end. This may be fine for smaller apps or non-critical jobs, but most
<add>production apps will need to pick a persistent backend.
<ide>
<ide> ### Backends
<ide> | 3 |
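The trade-off the guide describes — inline runs jobs synchronously, async queues them in process memory where a crash loses them — can be sketched in a few lines of Python (illustrative adapters, not Active Job's API):

```python
# Sketch contrasting an "inline" adapter (runs the job immediately, in the
# caller) with an in-process "async" adapter (queues the job in RAM and runs
# it on a background thread). Jobs sitting in the async queue disappear if
# the process dies, which is why production apps pick a persistent backend.

from concurrent.futures import ThreadPoolExecutor

class InlineAdapter:
    def enqueue(self, job):
        job()  # executed synchronously, before enqueue returns

class AsyncAdapter:
    def __init__(self):
        self._pool = ThreadPoolExecutor(max_workers=1)  # in-RAM queue
    def enqueue(self, job):
        return self._pool.submit(job)  # returns before the job runs

log = []
InlineAdapter().enqueue(lambda: log.append("inline done"))
print(log)  # ['inline done'] - already ran

future = AsyncAdapter().enqueue(lambda: log.append("async done"))
future.result()  # block only so the example is deterministic
print(log)  # ['inline done', 'async done']
```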
Javascript | Javascript | remove multimaterial from webgldeferredrenderer | 8349b07b5d3d1ebb55c7ee01ff27544604facd2e | <ide><path>examples/js/renderers/WebGLDeferredRenderer.js
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide> var _lightPrePass = false;
<ide> var _cacheKeepAlive = false;
<ide>
<del> var _invisibleMaterial = new THREE.ShaderMaterial( { visible: false } );
<del>
<add> var _tmpMaterial = new THREE.ShaderMaterial( { visible: false } );
<ide> var _tmpVector3 = new THREE.Vector3();
<ide>
<ide> // scene/material/light cache for deferred rendering.
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide> var _lightScenesCache = {};
<ide> var _lightFullscreenScenesCache = {};
<ide>
<del> // originalMaterial.uuid -> deferredMaterial
<del> // (no mapping from children of MultiMaterial)
<add> // object.material.uuid -> deferredMaterial or
<add> // object.material[ n ].uuid -> deferredMaterial
<ide> var _normalDepthMaterialsCache = {};
<ide> var _normalDepthShininessMaterialsCache = {};
<ide> var _colorMaterialsCache = {};
<ide> var _reconstructionMaterialsCache = {};
<del> var _invisibleMultiMaterialsCache = {};
<ide>
<ide> // originalLight.uuid -> deferredLight
<ide> var _deferredLightsCache = {};
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> var _removeThresholdCount = 60;
<ide>
<del> // object.uuid -> originalMaterial
<del> // deferred materials.uuid -> originalMaterial
<add> // deferredMaterials.uuid -> object.material or
<add> // deferredMaterials.uuid -> object.material[ n ]
<ide> // save before render and release after render.
<ide> var _originalMaterialsTable = {};
<ide>
<ide> // object.uuid -> originalOnBeforeRender
<ide> // save before render and release after render.
<ide> var _originalOnBeforeRendersTable = {};
<ide>
<add> // object.material.uuid -> object.material.visible or
<add> // object.material[ i ].uuid -> object.material[ i ].visible or
<add> // save before render and release after render.
<add> var _originalVisibleTable = {};
<add>
<ide> // external properties
<ide>
<ide> this.renderer = undefined;
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> }
<ide>
<del> function getMaterialFromCacheOrCreate( originalMaterial, cache, func ) {
<add> function getMaterialFromCacheOrCreate( originalMaterial, cache, createFunc, updateFunc ) {
<ide>
<ide> var data = cache[ originalMaterial.uuid ];
<ide>
<ide> if ( data === undefined ) {
<ide>
<ide> data = createCacheData();
<add> data.material = createFunc( originalMaterial );
<add> cache[ originalMaterial.uuid ] = data;
<ide>
<del> var material;
<add> }
<ide>
<del> if ( originalMaterial.isMultiMaterial === true ) {
<add> data.used = true;
<ide>
<del> var materials = [];
<add> updateFunc( data.material, originalMaterial );
<ide>
<del> for ( var i = 0, il = originalMaterial.materials.length; i < il; i ++ ) {
<add> _originalMaterialsTable[ data.material.uuid ] = originalMaterial;
<ide>
<del> materials.push( func( originalMaterial.materials[ i ] ) );
<add> return data.material;
<ide>
<del> }
<add> }
<ide>
<del> material = new THREE.MultiMaterial( materials );
<add> function overrideMaterialAndOnBeforeRender( object, getMaterialFunc, onBeforeRender ) {
<ide>
<del> } else {
<add> if ( object.material === undefined ) return;
<add>
<add> if ( Array.isArray( object.material ) ) {
<ide>
<del> material = func( originalMaterial );
<add> for ( var i = 0, il = object.material.length; i < il; i ++ ) {
<add>
<add> object.material[ i ] = getMaterialFunc( object.material[ i ] );
<ide>
<ide> }
<ide>
<del> data.material = material;
<add> } else {
<ide>
<del> cache[ originalMaterial.uuid ] = data;
<add> object.material = getMaterialFunc( object.material );
<ide>
<ide> }
<ide>
<del> return data.material;
<add> object.onBeforeRender = onBeforeRender;
<ide>
<ide> }
<ide>
<del> function setMaterialNormalDepth( object ) {
<add> function restoreOriginalMaterial( object ) {
<ide>
<ide> if ( object.material === undefined ) return;
<ide>
<del> var originalMaterial = _originalMaterialsTable[ object.uuid ];
<del> var material = getNormalDepthMaterial( originalMaterial );
<del>
<del> _originalMaterialsTable[ material.uuid ] = originalMaterial;
<add> if ( Array.isArray( object.material ) ) {
<ide>
<del> if ( material.isMultiMaterial === true ) {
<add> for ( var i = 0, il = object.material.length; i < il; i ++ ) {
<ide>
<del> for ( var i = 0, il = material.materials.length; i < il; i ++ ) {
<del>
<del> _originalMaterialsTable[ material.materials[ i ].uuid ] = originalMaterial.materials[ i ];
<del> updateDeferredNormalDepthMaterial( material.materials[ i ], originalMaterial.materials[ i ] );
<add> object.material[ i ] = _originalMaterialsTable[ object.material[ i ].uuid ];
<ide>
<ide> }
<ide>
<ide> } else {
<ide>
<del> updateDeferredNormalDepthMaterial( material, originalMaterial );
<add> object.material = _originalMaterialsTable[ object.material.uuid ];
<ide>
<ide> }
<ide>
<del> object.material = material;
<del> object.onBeforeRender = updateDeferredNormalDepthUniforms;
<add> }
<add>
<add> function setMaterialNormalDepth( object ) {
<add>
<add> overrideMaterialAndOnBeforeRender( object, getNormalDepthMaterial, updateDeferredNormalDepthUniforms );
<ide>
<ide> }
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide> return getMaterialFromCacheOrCreate(
<ide> originalMaterial,
<ide> ( _lightPrePass ) ? _normalDepthShininessMaterialsCache : _normalDepthMaterialsCache,
<del> createDeferredNormalDepthMaterial
<add> createDeferredNormalDepthMaterial,
<add> updateDeferredNormalDepthMaterial
<ide> );
<ide>
<ide> }
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> function setMaterialColor( object ) {
<ide>
<del> if ( object.material === undefined ) return;
<del>
<del> var originalMaterial = _originalMaterialsTable[ object.uuid ];
<del> var material = getColorMaterial( originalMaterial );
<del>
<del> _originalMaterialsTable[ material.uuid ] = originalMaterial;
<del>
<del> if ( originalMaterial.isMultiMaterial === true ) {
<del>
<del> for ( var i = 0, il = originalMaterial.materials.length; i < il; i ++ ) {
<del>
<del> _originalMaterialsTable[ material.materials[ i ].uuid ] = originalMaterial.materials[ i ];
<del> updateDeferredColorMaterial( material.materials[ i ], originalMaterial.materials[ i ] );
<del>
<del> }
<del>
<del> } else {
<del>
<del> updateDeferredColorMaterial( material, originalMaterial );
<del>
<del> }
<del>
<del> object.material = material;
<del> object.onBeforeRender = updateDeferredColorUniforms;
<add> overrideMaterialAndOnBeforeRender( object, getColorMaterial, updateDeferredColorUniforms );
<ide>
<ide> }
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide> return getMaterialFromCacheOrCreate(
<ide> originalMaterial,
<ide> _colorMaterialsCache,
<del> createDeferredColorMaterial
<add> createDeferredColorMaterial,
<add> updateDeferredColorMaterial
<ide> );
<ide>
<ide> }
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> function setMaterialReconstruction( object ) {
<ide>
<del> if ( object.material === undefined ) return;
<del>
<del> var originalMaterial = _originalMaterialsTable[ object.uuid ];
<del>
<del> if ( originalMaterial.transparent === true ) {
<del>
<del> object.material = originalMaterial;
<del> object.onBeforeRender = _originalOnBeforeRendersTable[ object.uuid ];
<del>
<del> return;
<del>
<del> }
<del>
<del> var material = getReconstructionMaterial( originalMaterial );
<del> _originalMaterialsTable[ material.uuid ] = originalMaterial;
<add> overrideMaterialAndOnBeforeRender( object, getReconstructionMaterial, updateDeferredReconstructionUniforms );
<ide>
<del> if ( originalMaterial.isMultiMaterial === true ) {
<del>
<del> for ( var i = 0, il = originalMaterial.materials.length; i < il; i ++ ) {
<del>
<del> _originalMaterialsTable[ material.materials[ i ].uuid ] = originalMaterial.materials[ i ];
<del> updateDeferredReconstructionMaterial( material.materials[ i ], originalMaterial.materials[ i ] );
<add> }
<ide>
<del> }
<add> function getReconstructionMaterial( originalMaterial ) {
<ide>
<del> } else {
<add> if ( originalMaterial.transparent === true ) {
<ide>
<del> updateDeferredReconstructionMaterial( material, originalMaterial );
<add> _originalMaterialsTable[ originalMaterial.uuid ] = originalMaterial;
<add> return originalMaterial;
<ide>
<ide> }
<ide>
<del> object.material = material;
<del> object.onBeforeRender = updateDeferredReconstructionUniforms;
<del>
<del> }
<del>
<del> function getReconstructionMaterial( originalMaterial ) {
<del>
<ide> return getMaterialFromCacheOrCreate(
<ide> originalMaterial,
<ide> _reconstructionMaterialsCache,
<del> createDeferredReconstructionMaterial
<add> createDeferredReconstructionMaterial,
<add> updateDeferredReconstructionMaterial
<ide> );
<ide>
<ide> }
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> function updateDeferredReconstructionUniforms( renderer, scene, camera, geometry, material, group ) {
<ide>
<add> if ( material.transparent === true ) {
<add>
<add>			// 'this' is the object here because this method is assigned as object.onBeforeRender()
<add> var onBeforeRender = _originalOnBeforeRendersTable[ this.uuid ];
<add>
<add> if ( onBeforeRender ) {
<add>
<add> onBeforeRender.call( this, renderer, scene, camera, geometry, material, group );
<add>
<add> }
<add>
<add> return;
<add>
<add> }
<add>
<ide> updateDeferredColorUniforms( renderer, scene, camera, geometry, material, group );
<ide>
<ide> material.uniforms.samplerLight.value = _compLight.renderTarget2.texture;
<ide>
<ide> }
<ide>
<del> function setMaterialForwardRendering( object ) {
<add> function setVisibleForForwardRendering( object ) {
<ide>
<ide> if ( object.material === undefined ) return;
<ide>
<del> var originalMaterial = _originalMaterialsTable[ object.uuid ];
<add> if ( Array.isArray( object.material ) ) {
<ide>
<del> if ( originalMaterial.isMultiMaterial === true ) {
<add> for ( var i = 0, il = object.material.length; i < il; i ++ ) {
<ide>
<del> var material = getInvisibleMultiMaterial( originalMaterial );
<add> if ( _originalVisibleTable[ object.material[ i ].uuid ] === undefined ) {
<ide>
<del> for ( var i = 0, il = originalMaterial.materials.length; i < il; i ++ ) {
<add> _originalVisibleTable[ object.material[ i ].uuid ] = object.material[ i ].visible;
<add> object.material[ i ].visible = object.material[ i ].transparent && object.material[ i ].visible;
<ide>
<del> material.materials[ i ] = getForwardRenderingMaterial( originalMaterial.materials[ i ] );
<add> }
<ide>
<ide> }
<ide>
<del> object.material = material;
<del>
<ide> } else {
<ide>
<del> object.material = getForwardRenderingMaterial( originalMaterial );
<add> if ( _originalVisibleTable[ object.material.uuid ] === undefined ) {
<ide>
<del> }
<add> _originalVisibleTable[ object.material.uuid ] = object.material.visible;
<add> object.material.visible = object.material.transparent && object.material.visible;
<ide>
<del> object.onBeforeRender = _originalOnBeforeRendersTable[ object.uuid ];
<del>
<del> }
<del>
<del> function getInvisibleMultiMaterial( originalMaterial ) {
<add> }
<ide>
<del> return getMaterialFromCacheOrCreate(
<del> originalMaterial,
<del> _invisibleMultiMaterialsCache,
<del> createInvisibleMaterial
<del> );
<add> }
<ide>
<ide> }
<ide>
<del> function createInvisibleMaterial( originalMaterial ) {
<add> function restoreVisible( object ) {
<ide>
<del> return _invisibleMaterial;
<add> if ( object.material === undefined ) return;
<ide>
<del> }
<add> if ( Array.isArray( object.material ) ) {
<ide>
<del> function getForwardRenderingMaterial( originalMaterial ) {
<add> for ( var i = 0, il = object.material.length; i < il; i ++ ) {
<ide>
<del> if ( originalMaterial.transparent === true && originalMaterial.visible === true ) {
<add> object.material[ i ].visible = _originalVisibleTable[ object.material[ i ].uuid ];
<ide>
<del> return originalMaterial;
<add> }
<ide>
<ide> } else {
<ide>
<del> return _invisibleMaterial;
<add> object.material.visible = _originalVisibleTable[ object.material.uuid ];
<ide>
<ide> }
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> data = createCacheData();
<ide> data.material = createDeferredLightMaterial( light.userData.originalLight );
<del>
<ide> cache[ light.uuid ] = data;
<ide>
<ide> }
<ide>
<add> data.used = true;
<add>
<ide> return data.material;
<ide>
<ide> }
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> function createDeferredLightMesh( light, geometry ) {
<ide>
<del> var mesh = new THREE.Mesh( geometry, _invisibleMaterial );
<add> var mesh = new THREE.Mesh( geometry, _tmpMaterial );
<ide>
<ide> mesh.userData.originalLight = light;
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> }
<ide>
<del> function saveOriginalMaterialAndCheckTransparency( object ) {
<add> function saveOriginalOnBeforeRenderAndCheckTransparency( object ) {
<ide>
<ide> if ( object.material === undefined ) return;
<ide>
<del> _originalMaterialsTable[ object.uuid ] = object.material;
<ide> _originalOnBeforeRendersTable[ object.uuid ] = object.onBeforeRender;
<ide>
<ide> // _hasTransparentObject is used only for Classic Deferred Rendering
<ide> if ( _hasTransparentObject || _lightPrePass ) return;
<ide>
<del> if ( object.material.isMultiMaterial === true ) {
<add> if ( ! object.visible ) return;
<add>
<add> if ( Array.isArray( object.material ) ) {
<ide>
<del> for ( var i = 0, il = object.material.materials.length; i < il; i ++ ) {
<add> for ( var i = 0, il = object.material.length; i < il; i ++ ) {
<ide>
<del> if ( object.material.materials[ i ].transparent === true ) {
<add> if ( object.material[ i ].visible === true && object.material[ i ].transparent === true ) {
<ide>
<ide> _hasTransparentObject = true;
<ide> break;
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> } else {
<ide>
<del> if ( object.material.transparent === true ) _hasTransparentObject = true;
<add> if ( object.material.visible === true && object.material.transparent === true ) _hasTransparentObject = true;
<ide>
<ide> }
<ide>
<ide> }
<ide>
<del> function restoreOriginalMaterial( object ) {
<add> function restoreOriginalOnBeforeRender( object ) {
<ide>
<ide> if ( object.material === undefined ) return;
<ide>
<del> object.material = _originalMaterialsTable[ object.uuid ];
<ide> object.onBeforeRender = _originalOnBeforeRendersTable[ object.uuid ];
<ide>
<ide> }
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> data = createCacheData();
<ide> data.light = createDeferredLight( object );
<del>
<ide> _deferredLightsCache[ object.uuid ] = data;
<ide>
<ide> }
<ide>
<add> data.used = true;
<add>
<ide> var light = data.light;
<ide>
<ide> if ( light === null ) return;
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> function cleanupTable( table ) {
<ide>
<del> var keys = Object.keys( cache );
<add> var keys = Object.keys( table );
<ide>
<ide> for ( var i = 0, il = keys.length; i < il; i ++ ) {
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide> cleanupCache( _normalDepthShininessMaterialsCache );
<ide> cleanupCache( _colorMaterialsCache );
<ide> cleanupCache( _reconstructionMaterialsCache );
<del> cleanupCache( _invisibleMultiMaterialsCache );
<ide> cleanupCache( _classicDeferredLightMaterialsCache );
<ide> cleanupCache( _lightPrePassMaterialsCache );
<ide> cleanupCache( _deferredLightsCache );
<ide>
<ide> cleanupTable( _originalMaterialsTable );
<ide> cleanupTable( _originalOnBeforeRendersTable );
<add> cleanupTable( _originalVisibleTable );
<ide>
<ide> }
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> _compNormalDepth.render();
<ide>
<add> scene.traverse( restoreOriginalMaterial );
<add>
<ide> }
<ide>
<ide> /*
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> _compColor.render();
<ide>
<add> scene.traverse( restoreOriginalMaterial );
<add>
<ide> }
<ide>
<ide> /*
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> _gl.disable( _gl.STENCIL_TEST );
<ide>
<add> scene.traverse( restoreOriginalMaterial );
<add>
<ide> }
<ide>
<ide> /*
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> if ( ! _lightPrePass && _hasTransparentObject ) {
<ide>
<del> scene.traverse( setMaterialForwardRendering );
<add> scene.traverse( setVisibleForForwardRendering );
<add> scene.traverse( restoreOriginalOnBeforeRender );
<ide>
<ide> _passForward.scene = scene;
<ide> _passForward.camera = camera;
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> _compFinal.render();
<ide>
<add> if ( ! _lightPrePass && _hasTransparentObject ) {
<add>
<add> scene.traverse( restoreVisible );
<add>
<add> }
<add>
<ide> }
<ide>
<ide> // external APIs
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> _hasTransparentObject = false;
<ide>
<del> scene.traverse( saveOriginalMaterialAndCheckTransparency );
<add> scene.traverse( saveOriginalOnBeforeRenderAndCheckTransparency );
<ide>
<ide> updateDeferredCommonUniforms( camera );
<ide>
<ide> THREE.WebGLDeferredRenderer = function ( parameters ) {
<ide>
<ide> renderFinal( scene, camera );
<ide>
<del> scene.traverse( restoreOriginalMaterial );
<add> scene.traverse( restoreOriginalOnBeforeRender );
<add>
<add> cleanupCaches();
<ide>
<ide> scene.autoUpdate = currentSceneAutoUpdate;
<ide> this.renderer.autoClearColor = currentAutoClearColor; | 1 |
Text | Text | add demo url for the datocms example | 745494a9a3ef630a87d6ee2fca602b222d1f3ba9 | <ide><path>docs/advanced-features/preview-mode.md
<ide> https://<your-site>/api/preview?secret=<token>&slug=<path>
<ide>
<ide> Take a look at the following examples to learn more:
<ide>
<del>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms)
<add>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms) ([Demo](https://next-blog-datocms.now.sh/))
<ide>
<ide> ## More Details
<ide>
<ide><path>docs/basic-features/data-fetching.md
<ide> function Profile() {
<ide>
<ide> Take a look at the following examples to learn more:
<ide>
<del>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms)
<add>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms) ([Demo](https://next-blog-datocms.now.sh/))
<ide>
<ide> ## Learn more
<ide>
<ide><path>docs/basic-features/pages.md
<ide> We've discussed two forms of pre-rendering for Next.js.
<ide>
<ide> Take a look at the following examples to learn more:
<ide>
<del>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms)
<add>- [DatoCMS Example](https://github.com/zeit/next.js/tree/canary/examples/cms-datocms) ([Demo](https://next-blog-datocms.now.sh/))
<ide>
<ide> ## Learn more
<ide>
<ide><path>examples/cms-datocms/README.md
<ide>
<ide> This example showcases Next.js's [Static Generation](/docs/basic-features/pages.md) feature using [DatoCMS](https://www.datocms.com/) as the data source.
<ide>
<add>## Demo
<add>
<add>### [https://next-blog-datocms.now.sh/](https://next-blog-datocms.now.sh/)
<add>
<ide> ## How to use
<ide>
<ide> ### Using `create-next-app` | 4 |
Go | Go | remove unused repositorydata and imgdata | 02ed26585451fac98281b564f6afe054c20d776c | <ide><path>registry/types.go
<ide> import (
<ide> registrytypes "github.com/docker/docker/api/types/registry"
<ide> )
<ide>
<del>// RepositoryData tracks the image list, list of endpoints for a repository
<del>type RepositoryData struct {
<del> // ImgList is a list of images in the repository
<del> ImgList map[string]*ImgData
<del> // Endpoints is a list of endpoints returned in X-Docker-Endpoints
<del> Endpoints []string
<del>}
<del>
<del>// ImgData is used to transfer image checksums to and from the registry
<del>type ImgData struct {
<del> // ID is an opaque string that identifies the image
<del> ID string `json:"id"`
<del> Checksum string `json:"checksum,omitempty"`
<del> ChecksumPayload string `json:"-"`
<del> Tag string `json:",omitempty"`
<del>}
<del>
<ide> // PingResult contains the information returned when pinging a registry. It
<ide> // indicates the registry's version and whether the registry claims to be a
<ide> // standalone registry. | 1 |
Python | Python | modify example debug_task to ignore result | 4627b9364891af55c72e509eb1b7630114b1bb82 | <ide><path>examples/django/proj/celery.py
<ide> app.autodiscover_tasks()
<ide>
<ide>
<del>@app.task(bind=True)
<add>@app.task(bind=True, ignore_result=True)
<ide> def debug_task(self):
<ide> print(f'Request: {self.request!r}') | 1 |
Javascript | Javascript | fix comment docks | c96b4a9774ecb63bb8d4b93eedf673d4ce97a4fa | <ide><path>src/workspace.js
<ide> const ALL_LOCATIONS = ['center', 'left', 'right', 'bottom'];
<ide> // Returns a {String} containing a longer version of the title to display in
<ide> // places like the window title or on tabs whose short titles are ambiguous.
<ide> //
<del>// #### `onDidChangeTitle`
<add>// #### `onDidChangeTitle(callback)`
<ide> //
<ide> // Called by the workspace so it can be notified when the item's title changes.
<ide> // Must return a {Disposable}. | 1 |
Go | Go | remove unused functions from archive | 967ef7e6d2bd88a5d7010863f3d7138ca61b1939 | <ide><path>container/container_unix.go
<ide> func copyExistingContents(source, destination string) error {
<ide> }
<ide> if len(srcList) == 0 {
<ide> // If the source volume is empty, copies files from the root into the volume
<del> if err := chrootarchive.CopyWithTar(source, destination); err != nil {
<add> if err := chrootarchive.NewArchiver(nil).CopyWithTar(source, destination); err != nil {
<ide> return err
<ide> }
<ide> }
<ide><path>daemon/archive.go
<ide> func (daemon *Daemon) CopyOnBuild(cID, destPath, srcRoot, srcPath string, decomp
<ide> destExists = false
<ide> }
<ide>
<del> archiver := &archive.Archiver{
<del> Untar: chrootarchive.Untar,
<del> UIDMaps: daemon.idMappings.UIDs(),
<del> GIDMaps: daemon.idMappings.GIDs(),
<del> }
<del>
<add> archiver := chrootarchive.NewArchiver(daemon.idMappings)
<ide> src, err := os.Stat(fullSrcPath)
<ide> if err != nil {
<ide> return err
<ide><path>daemon/archive_tarcopyoptions_unix.go
<ide> func (daemon *Daemon) tarCopyOptions(container *container.Container, noOverwrite
<ide>
<ide> return &archive.TarOptions{
<ide> NoOverwriteDirNonDir: noOverwriteDirNonDir,
<del> ChownOpts: &archive.TarChownOptions{
<del> UID: user.Uid,
<del> GID: user.Gid,
<del> },
<add> ChownOpts: idtools.IDPair{UID: user.Uid, GID: user.Gid},
<ide> }, nil
<ide> }
<ide><path>daemon/graphdriver/vfs/driver.go
<ide> import (
<ide>
<ide> var (
<ide> // CopyWithTar defines the copy method to use.
<del> CopyWithTar = chrootarchive.CopyWithTar
<add> CopyWithTar = chrootarchive.NewArchiver(nil).CopyWithTar
<ide> )
<ide>
<ide> func init() {
<ide><path>distribution/registry_unit_test.go
<ide> func testDirectory(templateDir string) (dir string, err error) {
<ide> return
<ide> }
<ide> if templateDir != "" {
<del> if err = archive.CopyWithTar(templateDir, dir); err != nil {
<add> if err = archive.NewDefaultArchiver().CopyWithTar(templateDir, dir); err != nil {
<ide> return
<ide> }
<ide> }
<ide><path>layer/layer_test.go
<ide> import (
<ide>
<ide> func init() {
<ide> graphdriver.ApplyUncompressedLayer = archive.UnpackLayer
<del> vfs.CopyWithTar = archive.CopyWithTar
<add> defaultArchiver := archive.NewDefaultArchiver()
<add> vfs.CopyWithTar = defaultArchiver.CopyWithTar
<ide> }
<ide>
<ide> func newVFSGraphDriver(td string) (graphdriver.Driver, error) {
<ide><path>pkg/archive/archive.go
<ide> import (
<ide> "bytes"
<ide> "compress/bzip2"
<ide> "compress/gzip"
<del> "errors"
<ide> "fmt"
<ide> "io"
<ide> "io/ioutil"
<ide> type (
<ide> Compression int
<ide> // WhiteoutFormat is the format of whiteouts unpacked
<ide> WhiteoutFormat int
<del> // TarChownOptions wraps the chown options UID and GID.
<del> TarChownOptions struct {
<del> UID, GID int
<del> }
<ide>
<ide> // TarOptions wraps the tar options.
<ide> TarOptions struct {
<ide> type (
<ide> NoLchown bool
<ide> UIDMaps []idtools.IDMap
<ide> GIDMaps []idtools.IDMap
<del> ChownOpts *TarChownOptions
<add> ChownOpts idtools.IDPair
<ide> IncludeSourceDir bool
<ide> // WhiteoutFormat is the expected on disk format for whiteout files.
<ide> // This format will be converted to the standard format on pack
<ide> type (
<ide> RebaseNames map[string]string
<ide> InUserNS bool
<ide> }
<del>
<del> // Archiver allows the reuse of most utility functions of this package
<del> // with a pluggable Untar function. Also, to facilitate the passing of
<del> // specific id mappings for untar, an archiver can be created with maps
<del> // which will then be passed to Untar operations
<del> Archiver struct {
<del> Untar func(io.Reader, string, *TarOptions) error
<del> UIDMaps []idtools.IDMap
<del> GIDMaps []idtools.IDMap
<del> }
<del>
<del> // breakoutError is used to differentiate errors related to breaking out
<del> // When testing archive breakout in the unit tests, this error is expected
<del> // in order for the test to pass.
<del> breakoutError error
<ide> )
<ide>
<del>var (
<del> // ErrNotImplemented is the error message of function not implemented.
<del> ErrNotImplemented = errors.New("Function not implemented")
<del> defaultArchiver = &Archiver{Untar: Untar, UIDMaps: nil, GIDMaps: nil}
<del>)
<add>// Archiver allows the reuse of most utility functions of this package
<add>// with a pluggable Untar function. Also, to facilitate the passing of
<add>// specific id mappings for untar, an archiver can be created with maps
<add>// which will then be passed to Untar operations
<add>type Archiver struct {
<add> Untar func(io.Reader, string, *TarOptions) error
<add> IDMappings *idtools.IDMappings
<add>}
<ide>
<del>const (
<del> // HeaderSize is the size in bytes of a tar header
<del> HeaderSize = 512
<del>)
<add>// NewDefaultArchiver returns a new Archiver without any IDMappings
<add>func NewDefaultArchiver() *Archiver {
<add> return &Archiver{Untar: Untar, IDMappings: &idtools.IDMappings{}}
<add>}
<add>
<add>// breakoutError is used to differentiate errors related to breaking out
<add>// When testing archive breakout in the unit tests, this error is expected
<add>// in order for the test to pass.
<add>type breakoutError error
<ide>
<ide> const (
<ide> // Uncompressed represents the uncompressed.
<ide> const (
<ide> OverlayWhiteoutFormat
<ide> )
<ide>
<del>// IsArchive checks for the magic bytes of a tar or any supported compression
<del>// algorithm.
<del>func IsArchive(header []byte) bool {
<del> compression := DetectCompression(header)
<del> if compression != Uncompressed {
<del> return true
<del> }
<del> r := tar.NewReader(bytes.NewBuffer(header))
<del> _, err := r.Next()
<del> return err == nil
<del>}
<del>
<ide> // IsArchivePath checks if the (possibly compressed) file at the given path
<ide> // starts with a tar file header.
<ide> func IsArchivePath(path string) bool {
<ide> func (ta *tarAppender) addTarFile(path, name string) error {
<ide> return nil
<ide> }
<ide>
<del>func createTarFile(path, extractDir string, hdr *tar.Header, reader io.Reader, Lchown bool, chownOpts *TarChownOptions, inUserns bool) error {
<add>func createTarFile(path, extractDir string, hdr *tar.Header, reader io.Reader, Lchown bool, chownOpts *idtools.IDPair, inUserns bool) error {
<ide> // hdr.Mode is in linux format, which we can use for sycalls,
<ide> // but for os.Foo() calls we need the mode converted to os.FileMode,
<ide> // so use hdrInfo.Mode() (they differ for e.g. setuid bits)
<ide> func createTarFile(path, extractDir string, hdr *tar.Header, reader io.Reader, L
<ide> // Lchown is not supported on Windows.
<ide> if Lchown && runtime.GOOS != "windows" {
<ide> if chownOpts == nil {
<del> chownOpts = &TarChownOptions{UID: hdr.Uid, GID: hdr.Gid}
<add> chownOpts = &idtools.IDPair{UID: hdr.Uid, GID: hdr.Gid}
<ide> }
<ide> if err := os.Lchown(path, chownOpts.UID, chownOpts.GID); err != nil {
<ide> return err
<ide> loop:
<ide> }
<ide> }
<ide>
<del> if err := createTarFile(path, dest, hdr, trBuf, !options.NoLchown, options.ChownOpts, options.InUserNS); err != nil {
<add> if err := createTarFile(path, dest, hdr, trBuf, !options.NoLchown, &options.ChownOpts, options.InUserNS); err != nil {
<ide> return err
<ide> }
<ide>
<ide> func (archiver *Archiver) TarUntar(src, dst string) error {
<ide> return err
<ide> }
<ide> defer archive.Close()
<del>
<del> var options *TarOptions
<del> if archiver.UIDMaps != nil || archiver.GIDMaps != nil {
<del> options = &TarOptions{
<del> UIDMaps: archiver.UIDMaps,
<del> GIDMaps: archiver.GIDMaps,
<del> }
<add> options := &TarOptions{
<add> UIDMaps: archiver.IDMappings.UIDs(),
<add> GIDMaps: archiver.IDMappings.GIDs(),
<ide> }
<ide> return archiver.Untar(archive, dst, options)
<ide> }
<ide>
<del>// TarUntar is a convenience function which calls Tar and Untar, with the output of one piped into the other.
<del>// If either Tar or Untar fails, TarUntar aborts and returns the error.
<del>func TarUntar(src, dst string) error {
<del> return defaultArchiver.TarUntar(src, dst)
<del>}
<del>
<ide> // UntarPath untar a file from path to a destination, src is the source tar file path.
<ide> func (archiver *Archiver) UntarPath(src, dst string) error {
<ide> archive, err := os.Open(src)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> defer archive.Close()
<del> var options *TarOptions
<del> if archiver.UIDMaps != nil || archiver.GIDMaps != nil {
<del> options = &TarOptions{
<del> UIDMaps: archiver.UIDMaps,
<del> GIDMaps: archiver.GIDMaps,
<del> }
<add> options := &TarOptions{
<add> UIDMaps: archiver.IDMappings.UIDs(),
<add> GIDMaps: archiver.IDMappings.GIDs(),
<ide> }
<ide> return archiver.Untar(archive, dst, options)
<ide> }
<ide>
<del>// UntarPath is a convenience function which looks for an archive
<del>// at filesystem path `src`, and unpacks it at `dst`.
<del>func UntarPath(src, dst string) error {
<del> return defaultArchiver.UntarPath(src, dst)
<del>}
<del>
<ide> // CopyWithTar creates a tar archive of filesystem path `src`, and
<ide> // unpacks it at filesystem path `dst`.
<ide> // The archive is streamed directly with fixed buffering and no
<ide> func (archiver *Archiver) CopyWithTar(src, dst string) error {
<ide> // if this archiver is set up with ID mapping we need to create
<ide> // the new destination directory with the remapped root UID/GID pair
<ide> // as owner
<del> rootUID, rootGID, err := idtools.GetRootUIDGID(archiver.UIDMaps, archiver.GIDMaps)
<add> rootIDs, err := archiver.IDMappings.RootPair()
<ide> if err != nil {
<ide> return err
<ide> }
<ide> // Create dst, copy src's content into it
<ide> logrus.Debugf("Creating dest directory: %s", dst)
<del> if err := idtools.MkdirAllNewAs(dst, 0755, rootUID, rootGID); err != nil {
<add> if err := idtools.MkdirAllAndChownNew(dst, 0755, rootIDs); err != nil {
<ide> return err
<ide> }
<ide> logrus.Debugf("Calling TarUntar(%s, %s)", src, dst)
<ide> return archiver.TarUntar(src, dst)
<ide> }
<ide>
<del>// CopyWithTar creates a tar archive of filesystem path `src`, and
<del>// unpacks it at filesystem path `dst`.
<del>// The archive is streamed directly with fixed buffering and no
<del>// intermediary disk IO.
<del>func CopyWithTar(src, dst string) error {
<del> return defaultArchiver.CopyWithTar(src, dst)
<del>}
<del>
<ide> // CopyFileWithTar emulates the behavior of the 'cp' command-line
<ide> // for a single file. It copies a regular file from path `src` to
<ide> // path `dst`, and preserves all its metadata.
<ide> func (archiver *Archiver) CopyFileWithTar(src, dst string) (err error) {
<ide> hdr.Name = filepath.Base(dst)
<ide> hdr.Mode = int64(chmodTarEntry(os.FileMode(hdr.Mode)))
<ide>
<del> remappedRootUID, remappedRootGID, err := idtools.GetRootUIDGID(archiver.UIDMaps, archiver.GIDMaps)
<add> remappedRootIDs, err := archiver.IDMappings.RootPair()
<ide> if err != nil {
<ide> return err
<ide> }
<ide>
<ide> // only perform mapping if the file being copied isn't already owned by the
<ide> // uid or gid of the remapped root in the container
<del> if remappedRootUID != hdr.Uid {
<del> xUID, err := idtools.ToHost(hdr.Uid, archiver.UIDMaps)
<add> if remappedRootIDs.UID != hdr.Uid {
<add> hdr.Uid, err = archiver.IDMappings.UIDToHost(hdr.Uid)
<ide> if err != nil {
<ide> return err
<ide> }
<del> hdr.Uid = xUID
<ide> }
<del> if remappedRootGID != hdr.Gid {
<del> xGID, err := idtools.ToHost(hdr.Gid, archiver.GIDMaps)
<add> if remappedRootIDs.GID != hdr.Gid {
<add> hdr.Gid, err = archiver.IDMappings.GIDToHost(hdr.Gid)
<ide> if err != nil {
<ide> return err
<ide> }
<del> hdr.Gid = xGID
<ide> }
<ide>
<ide> tw := tar.NewWriter(w)
<ide> func (archiver *Archiver) CopyFileWithTar(src, dst string) (err error) {
<ide> return err
<ide> }
<ide>
<del>// CopyFileWithTar emulates the behavior of the 'cp' command-line
<del>// for a single file. It copies a regular file from path `src` to
<del>// path `dst`, and preserves all its metadata.
<del>//
<del>// Destination handling is in an operating specific manner depending
<del>// where the daemon is running. If `dst` ends with a trailing slash
<del>// the final destination path will be `dst/base(src)` (Linux) or
<del>// `dst\base(src)` (Windows).
<del>func CopyFileWithTar(src, dst string) (err error) {
<del> return defaultArchiver.CopyFileWithTar(src, dst)
<del>}
<del>
<ide> // cmdStream executes a command, and returns its stdout as a stream.
<ide> // If the command fails to run or doesn't complete successfully, an error
<ide> // will be returned, including anything written on stderr.
<ide><path>pkg/archive/archive_test.go
<ide> func init() {
<ide> }
<ide> }
<ide>
<del>func TestIsArchiveNilHeader(t *testing.T) {
<del> out := IsArchive(nil)
<del> if out {
<del> t.Fatalf("isArchive should return false as nil is not a valid archive header")
<del> }
<add>var defaultArchiver = NewDefaultArchiver()
<add>
<add>func defaultTarUntar(src, dst string) error {
<add> return defaultArchiver.TarUntar(src, dst)
<ide> }
<ide>
<del>func TestIsArchiveInvalidHeader(t *testing.T) {
<del> header := []byte{0x00, 0x01, 0x02}
<del> out := IsArchive(header)
<del> if out {
<del> t.Fatalf("isArchive should return false as %s is not a valid archive header", header)
<del> }
<add>func defaultUntarPath(src, dst string) error {
<add> return defaultArchiver.UntarPath(src, dst)
<ide> }
<ide>
<del>func TestIsArchiveBzip2(t *testing.T) {
<del> header := []byte{0x42, 0x5A, 0x68}
<del> out := IsArchive(header)
<del> if !out {
<del> t.Fatalf("isArchive should return true as %s is a bz2 header", header)
<del> }
<add>func defaultCopyFileWithTar(src, dst string) (err error) {
<add> return defaultArchiver.CopyFileWithTar(src, dst)
<ide> }
<ide>
<del>func TestIsArchive7zip(t *testing.T) {
<del> header := []byte{0x50, 0x4b, 0x03, 0x04}
<del> out := IsArchive(header)
<del> if out {
<del> t.Fatalf("isArchive should return false as %s is a 7z header and it is not supported", header)
<del> }
<add>func defaultCopyWithTar(src, dst string) error {
<add> return defaultArchiver.CopyWithTar(src, dst)
<ide> }
<ide>
<ide> func TestIsArchivePathDir(t *testing.T) {
<ide> func TestUntarPathWithInvalidDest(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide>
<del> err = UntarPath(tarFile, invalidDestFolder)
<add> err = defaultUntarPath(tarFile, invalidDestFolder)
<ide> if err == nil {
<ide> t.Fatalf("UntarPath with invalid destination path should throw an error.")
<ide> }
<ide> func TestUntarPathWithInvalidSrc(t *testing.T) {
<ide> t.Fatalf("Fail to create the destination file")
<ide> }
<ide> defer os.RemoveAll(dest)
<del> err = UntarPath("/invalid/path", dest)
<add> err = defaultUntarPath("/invalid/path", dest)
<ide> if err == nil {
<ide> t.Fatalf("UntarPath with invalid src path should throw an error.")
<ide> }
<ide> func TestUntarPath(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide>
<del> err = UntarPath(tarFile, destFolder)
<add> err = defaultUntarPath(tarFile, destFolder)
<ide> if err != nil {
<ide> t.Fatalf("UntarPath shouldn't throw an error, %s.", err)
<ide> }
<ide> func TestUntarPathWithDestinationFile(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatalf("Fail to create the destination file")
<ide> }
<del> err = UntarPath(tarFile, destFile)
<add> err = defaultUntarPath(tarFile, destFile)
<ide> if err == nil {
<ide> t.Fatalf("UntarPath should throw an error if the destination if a file")
<ide> }
<ide> func TestUntarPathWithDestinationSrcFileAsFolder(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<del> err = UntarPath(tarFile, destFolder)
<add> err = defaultUntarPath(tarFile, destFolder)
<ide> if err != nil {
<ide> t.Fatalf("UntarPath should throw not throw an error if the extracted file already exists and is a folder")
<ide> }
<ide> func TestCopyWithTarInvalidSrc(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<del> err = CopyWithTar(invalidSrc, destFolder)
<add> err = defaultCopyWithTar(invalidSrc, destFolder)
<ide> if err == nil {
<ide> t.Fatalf("archiver.CopyWithTar with invalid src path should throw an error.")
<ide> }
<ide> func TestCopyWithTarInexistentDestWillCreateIt(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<del> err = CopyWithTar(srcFolder, inexistentDestFolder)
<add> err = defaultCopyWithTar(srcFolder, inexistentDestFolder)
<ide> if err != nil {
<ide> t.Fatalf("CopyWithTar with an inexistent folder shouldn't fail.")
<ide> }
<ide> func TestCopyWithTarSrcFile(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> ioutil.WriteFile(src, []byte("content"), 0777)
<del> err = CopyWithTar(src, dest)
<add> err = defaultCopyWithTar(src, dest)
<ide> if err != nil {
<ide> t.Fatalf("archiver.CopyWithTar shouldn't throw an error, %s.", err)
<ide> }
<ide> func TestCopyWithTarSrcFolder(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> ioutil.WriteFile(filepath.Join(src, "file"), []byte("content"), 0777)
<del> err = CopyWithTar(src, dest)
<add> err = defaultCopyWithTar(src, dest)
<ide> if err != nil {
<ide> t.Fatalf("archiver.CopyWithTar shouldn't throw an error, %s.", err)
<ide> }
<ide> func TestCopyFileWithTarInvalidSrc(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> invalidFile := filepath.Join(tempFolder, "doesnotexists")
<del> err = CopyFileWithTar(invalidFile, destFolder)
<add> err = defaultCopyFileWithTar(invalidFile, destFolder)
<ide> if err == nil {
<ide> t.Fatalf("archiver.CopyWithTar with invalid src path should throw an error.")
<ide> }
<ide> func TestCopyFileWithTarInexistentDestWillCreateIt(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<del> err = CopyFileWithTar(srcFile, inexistentDestFolder)
<add> err = defaultCopyFileWithTar(srcFile, inexistentDestFolder)
<ide> if err != nil {
<ide> t.Fatalf("CopyWithTar with an inexistent folder shouldn't fail.")
<ide> }
<ide> func TestCopyFileWithTarSrcFolder(t *testing.T) {
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<del> err = CopyFileWithTar(src, dest)
<add> err = defaultCopyFileWithTar(src, dest)
<ide> if err == nil {
<ide> t.Fatalf("CopyFileWithTar should throw an error with a folder.")
<ide> }
<ide> func TestCopyFileWithTarSrcFile(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> ioutil.WriteFile(src, []byte("content"), 0777)
<del> err = CopyWithTar(src, dest+"/")
<add> err = defaultCopyWithTar(src, dest+"/")
<ide> if err != nil {
<ide> t.Fatalf("archiver.CopyFileWithTar shouldn't throw an error, %s.", err)
<ide> }
<ide> func checkNoChanges(fileNum int, hardlinks bool) error {
<ide> return err
<ide> }
<ide>
<del> err = TarUntar(srcDir, destDir)
<add> err = defaultTarUntar(srcDir, destDir)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> func BenchmarkTarUntar(b *testing.B) {
<ide> b.ResetTimer()
<ide> b.SetBytes(int64(n))
<ide> for n := 0; n < b.N; n++ {
<del> err := TarUntar(origin, target)
<add> err := defaultTarUntar(origin, target)
<ide> if err != nil {
<ide> b.Fatal(err)
<ide> }
<ide> func BenchmarkTarUntarWithLinks(b *testing.B) {
<ide> b.ResetTimer()
<ide> b.SetBytes(int64(n))
<ide> for n := 0; n < b.N; n++ {
<del> err := TarUntar(origin, target)
<add> err := defaultTarUntar(origin, target)
<ide> if err != nil {
<ide> b.Fatal(err)
<ide> }
<ide><path>pkg/archive/archive_windows_test.go
<ide> func TestCopyFileWithInvalidDest(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> ioutil.WriteFile(src, []byte("content"), 0777)
<del> err = CopyWithTar(src, dest)
<add> err = defaultCopyWithTar(src, dest)
<ide> if err == nil {
<ide> t.Fatalf("archiver.CopyWithTar should throw an error on invalid dest.")
<ide> }
<ide><path>pkg/chrootarchive/archive.go
<ide> import (
<ide> "github.com/docker/docker/pkg/idtools"
<ide> )
<ide>
<del>var chrootArchiver = &archive.Archiver{Untar: Untar}
<add>// NewArchiver returns a new Archiver which uses chrootarchive.Untar
<add>func NewArchiver(idMappings *idtools.IDMappings) *archive.Archiver {
<add> if idMappings == nil {
<add> idMappings = &idtools.IDMappings{}
<add> }
<add> return &archive.Archiver{Untar: Untar, IDMappings: idMappings}
<add>}
<ide>
<ide> // Untar reads a stream of bytes from `archive`, parses it as a tar archive,
<ide> // and unpacks it into the directory at `dest`.
<ide> func UntarUncompressed(tarArchive io.Reader, dest string, options *archive.TarOp
<ide>
<ide> // Handler for teasing out the automatic decompression
<ide> func untarHandler(tarArchive io.Reader, dest string, options *archive.TarOptions, decompress bool) error {
<del>
<ide> if tarArchive == nil {
<ide> return fmt.Errorf("Empty archive")
<ide> }
<ide> func untarHandler(tarArchive io.Reader, dest string, options *archive.TarOptions
<ide>
<ide> return invokeUnpack(r, dest, options)
<ide> }
<del>
<del>// TarUntar is a convenience function which calls Tar and Untar, with the output of one piped into the other.
<del>// If either Tar or Untar fails, TarUntar aborts and returns the error.
<del>func TarUntar(src, dst string) error {
<del> return chrootArchiver.TarUntar(src, dst)
<del>}
<del>
<del>// CopyWithTar creates a tar archive of filesystem path `src`, and
<del>// unpacks it at filesystem path `dst`.
<del>// The archive is streamed directly with fixed buffering and no
<del>// intermediary disk IO.
<del>func CopyWithTar(src, dst string) error {
<del> return chrootArchiver.CopyWithTar(src, dst)
<del>}
<del>
<del>// CopyFileWithTar emulates the behavior of the 'cp' command-line
<del>// for a single file. It copies a regular file from path `src` to
<del>// path `dst`, and preserves all its metadata.
<del>//
<del>// If `dst` ends with a trailing slash '/' ('\' on Windows), the final
<del>// destination path will be `dst/base(src)` or `dst\base(src)`
<del>func CopyFileWithTar(src, dst string) (err error) {
<del> return chrootArchiver.CopyFileWithTar(src, dst)
<del>}
<del>
<del>// UntarPath is a convenience function which looks for an archive
<del>// at filesystem path `src`, and unpacks it at `dst`.
<del>func UntarPath(src, dst string) error {
<del> return chrootArchiver.UntarPath(src, dst)
<del>}
<ide><path>pkg/chrootarchive/archive_test.go
<ide> func init() {
<ide> reexec.Init()
<ide> }
<ide>
<add>var chrootArchiver = NewArchiver(nil)
<add>
<add>func TarUntar(src, dst string) error {
<add> return chrootArchiver.TarUntar(src, dst)
<add>}
<add>
<add>func CopyFileWithTar(src, dst string) (err error) {
<add> return chrootArchiver.CopyFileWithTar(src, dst)
<add>}
<add>
<add>func UntarPath(src, dst string) error {
<add> return chrootArchiver.UntarPath(src, dst)
<add>}
<add>
<add>func CopyWithTar(src, dst string) error {
<add> return chrootArchiver.CopyWithTar(src, dst)
<add>}
<add>
<ide> func TestChrootTarUntar(t *testing.T) {
<ide> tmpdir, err := ioutil.TempDir("", "docker-TestChrootTarUntar")
<ide> if err != nil {
<ide><path>pkg/idtools/idtools.go
<ide> func ToContainer(hostID int, idMap []IDMap) (int, error) {
<ide> // ToHost takes an id mapping and a remapped ID, and translates the
<ide> // ID to the mapped host ID. If no map is provided, then the translation
<ide> // assumes a 1-to-1 mapping and returns the passed in id #
<add>// Depercated: use IDMappings.UIDToHost and IDMappings.GIDToHost
<ide> func ToHost(contID int, idMap []IDMap) (int, error) {
<ide> if idMap == nil {
<ide> return contID, nil
<ide> func (i *IDMappings) RootPair() (IDPair, error) {
<ide> return IDPair{UID: uid, GID: gid}, err
<ide> }
<ide>
<add>// UIDToHost returns the host UID for the container uid
<add>func (i *IDMappings) UIDToHost(uid int) (int, error) {
<add> return ToHost(uid, i.uids)
<add>}
<add>
<add>// GIDToHost returns the host GID for the container gid
<add>func (i *IDMappings) GIDToHost(gid int) (int, error) {
<add> return ToHost(gid, i.gids)
<add>}
<add>
<ide> // UIDs return the UID mapping
<ide> // TODO: remove this once everything has been refactored to use pairs
<ide> func (i *IDMappings) UIDs() []IDMap { | 12 |
Javascript | Javascript | eliminate unnecessary exports | b855dadae00b616c7669530a75d4e410f1b7d601 | <ide><path>lib/internal/util.js
<ide> const prefix = `(${process.release.name}:${process.pid}) `;
<ide> const kArrowMessagePrivateSymbolIndex = binding['arrow_message_private_symbol'];
<ide> const kDecoratedPrivateSymbolIndex = binding['decorated_private_symbol'];
<ide>
<del>exports.getHiddenValue = binding.getHiddenValue;
<del>exports.setHiddenValue = binding.setHiddenValue;
<del>
<ide> // The `buffer` module uses this. Defining it here instead of in the public
<ide> // `util` module makes it accessible without having to `require('util')` there.
<ide> exports.customInspectSymbol = Symbol('util.inspect.custom');
<ide> exports._deprecate = function(fn, msg, code) {
<ide>
<ide> exports.decorateErrorStack = function decorateErrorStack(err) {
<ide> if (!(exports.isError(err) && err.stack) ||
<del> exports.getHiddenValue(err, kDecoratedPrivateSymbolIndex) === true)
<add> binding.getHiddenValue(err, kDecoratedPrivateSymbolIndex) === true)
<ide> return;
<ide>
<del> const arrow = exports.getHiddenValue(err, kArrowMessagePrivateSymbolIndex);
<add> const arrow = binding.getHiddenValue(err, kArrowMessagePrivateSymbolIndex);
<ide>
<ide> if (arrow) {
<ide> err.stack = arrow + err.stack;
<del> exports.setHiddenValue(err, kDecoratedPrivateSymbolIndex, true);
<add> binding.setHiddenValue(err, kDecoratedPrivateSymbolIndex, true);
<ide> }
<ide> };
<ide>
<ide><path>test/parallel/test-internal-util-decorate-error-stack.js
<ide> const path = require('path');
<ide> const kArrowMessagePrivateSymbolIndex = binding['arrow_message_private_symbol'];
<ide> const kDecoratedPrivateSymbolIndex = binding['decorated_private_symbol'];
<ide>
<add>const decorateErrorStack = internalUtil.decorateErrorStack;
<add>
<ide> assert.doesNotThrow(function() {
<del> internalUtil.decorateErrorStack();
<del> internalUtil.decorateErrorStack(null);
<del> internalUtil.decorateErrorStack(1);
<del> internalUtil.decorateErrorStack(true);
<add> decorateErrorStack();
<add> decorateErrorStack(null);
<add> decorateErrorStack(1);
<add> decorateErrorStack(true);
<ide> });
<ide>
<ide> // Verify that a stack property is not added to non-Errors
<ide> const obj = {};
<del>internalUtil.decorateErrorStack(obj);
<add>decorateErrorStack(obj);
<ide> assert.strictEqual(obj.stack, undefined);
<ide>
<ide> // Verify that the stack is decorated when possible
<ide> assert(typeof err, 'object');
<ide> checkStack(err.stack);
<ide>
<ide> // Verify that the stack is only decorated once
<del>internalUtil.decorateErrorStack(err);
<del>internalUtil.decorateErrorStack(err);
<add>decorateErrorStack(err);
<add>decorateErrorStack(err);
<ide> checkStack(err.stack);
<ide>
<ide> // Verify that the stack is only decorated once for uncaught exceptions
<ide> checkStack(result.stderr);
<ide> // Verify that the stack is unchanged when there is no arrow message
<ide> err = new Error('foo');
<ide> let originalStack = err.stack;
<del>internalUtil.decorateErrorStack(err);
<add>decorateErrorStack(err);
<ide> assert.strictEqual(originalStack, err.stack);
<ide>
<ide> // Verify that the arrow message is added to the start of the stack when it
<ide> const arrowMessage = 'arrow_message';
<ide> err = new Error('foo');
<ide> originalStack = err.stack;
<ide>
<del>internalUtil.setHiddenValue(err, kArrowMessagePrivateSymbolIndex, arrowMessage);
<del>internalUtil.decorateErrorStack(err);
<add>binding.setHiddenValue(err, kArrowMessagePrivateSymbolIndex, arrowMessage);
<add>decorateErrorStack(err);
<ide>
<ide> assert.strictEqual(err.stack, `${arrowMessage}${originalStack}`);
<del>assert.strictEqual(internalUtil
<del> .getHiddenValue(err, kDecoratedPrivateSymbolIndex), true);
<add>assert.strictEqual(
<add> binding.getHiddenValue(err, kDecoratedPrivateSymbolIndex), true);
<ide><path>test/parallel/test-util-internal.js
<ide> const common = require('../common');
<ide> const path = require('path');
<ide> const assert = require('assert');
<del>const internalUtil = require('internal/util');
<ide> const spawnSync = require('child_process').spawnSync;
<ide>
<ide> const binding = process.binding('util');
<ide> const kArrowMessagePrivateSymbolIndex = binding['arrow_message_private_symbol'];
<ide>
<ide> function getHiddenValue(obj, index) {
<ide> return function() {
<del> internalUtil.getHiddenValue(obj, index);
<add> binding.getHiddenValue(obj, index);
<ide> };
<ide> }
<ide>
<ide> function setHiddenValue(obj, index, val) {
<ide> return function() {
<del> internalUtil.setHiddenValue(obj, index, val);
<add> binding.setHiddenValue(obj, index, val);
<ide> };
<ide> }
<ide>
<ide> assert.throws(getHiddenValue({}), /index must be an uint32/);
<ide> assert.throws(getHiddenValue({}, null), /index must be an uint32/);
<ide> assert.throws(getHiddenValue({}, []), /index must be an uint32/);
<ide> assert.deepStrictEqual(
<del> internalUtil.getHiddenValue({}, kArrowMessagePrivateSymbolIndex),
<add> binding.getHiddenValue({}, kArrowMessagePrivateSymbolIndex),
<ide> undefined);
<ide>
<ide> assert.throws(setHiddenValue(), /obj must be an object/);
<ide> assert.throws(setHiddenValue({}, null), /index must be an uint32/);
<ide> assert.throws(setHiddenValue({}, []), /index must be an uint32/);
<ide> const obj = {};
<ide> assert.strictEqual(
<del> internalUtil.setHiddenValue(obj, kArrowMessagePrivateSymbolIndex, 'bar'),
<add> binding.setHiddenValue(obj, kArrowMessagePrivateSymbolIndex, 'bar'),
<ide> true);
<ide> assert.strictEqual(
<del> internalUtil.getHiddenValue(obj, kArrowMessagePrivateSymbolIndex),
<add> binding.getHiddenValue(obj, kArrowMessagePrivateSymbolIndex),
<ide> 'bar');
<ide>
<ide> let arrowMessage;
<ide> try {
<ide> require(path.join(common.fixturesDir, 'syntax', 'bad_syntax'));
<ide> } catch (err) {
<ide> arrowMessage =
<del> internalUtil.getHiddenValue(err, kArrowMessagePrivateSymbolIndex);
<add> binding.getHiddenValue(err, kArrowMessagePrivateSymbolIndex);
<ide> }
<ide>
<ide> assert(/bad_syntax\.js:1/.test(arrowMessage)); | 3 |
Text | Text | add missing metadata for dns.lookup | a02b98b0b5d2c732fdb22f84f07335b2b5866697 | <ide><path>doc/api/dns.md
<ide> section if a custom port is used.
<ide> <!-- YAML
<ide> added: v0.1.90
<ide> changes:
<add> - version: v8.5.0
<add> pr-url: https://github.com/nodejs/node/pull/14731
<add> description: The `verbatim` option is supported now.
<ide> - version: v1.2.0
<ide> pr-url: https://github.com/nodejs/node/pull/744
<ide> description: The `all` option is supported now. | 1 |
Python | Python | update docstrings for nttc | 6ae3a29ba5522c566b893f6994cc1748c1c7b149 | <ide><path>libcloud/compute/drivers/nttcis.py
<ide> def ex_create_consistency_group(self, name, journal_size_gb,
<ide>
<ide> :param name: Name of consistency group
<ide> :type name: ``str``
<add>
<ide> :param journal_size_gb: Journal size in GB
<ide> :type journal_size_gb: ``str``
<add>
<ide> :param source_server_id: Id of the server to copy
<ide> :type source_server_id: ``str``
<add>
<ide> :param target_server_id: Id of the server to receive the copy
<ide> :type: target_server_id: ``str``
<add>
<ide> :param description: (Optional) Description of consistency group
<ide> :type: description: ``str``
<del> :return: :class: `NttCisConsistenccyGroup`
<add>
<add> :rtype: :class:`NttCisConsistencyGroup`
<ide> """
<ide>
<ide> consistency_group_elm = ET.Element('createConsistencyGroup',
<ide> def ex_list_consistency_groups(self, params={}):
<ide> * state=
<ide> * operation_status=
<ide> * drs_infrastructure_status=
<del> :return: `list` of :class: `NttCisConsistencyGroup`
<add> :rtype: `list` of :class: `NttCisConsistencyGroup`
<ide> """
<ide>
<ide> response = self.connection.request_with_orgId_api_2(
<ide> def ex_get_consistency_group(self, consistency_group_id):
<ide> """
<ide> Retrieves a Consistency by it's id and is more efficient thatn listing
<ide> all consistency groups and filtering that result.
<add>
<ide> :param consistency_group_id: An id of a consistency group
<ide> :type consistency_group_id: ``str``
<del> :return: :class: `NttCisConsistencygroup`
<add>
<add> :rtype: :class:`NttCisConsistencygroup`
<ide> """
<ide> response = self.connection.request_with_orgId_api_2(
<ide> "consistencyGroup/consistencyGroup/%s" % consistency_group_id
<ide> def ex_list_consistency_group_snapshots(self, consistency_group_id,
<ide>
<ide> :param consistency_group_id: The id of consistency group
<ide> :type consistency_group_id: ``str``
<add>
<ide> :param create_time_min: (Optional) in form YYYY-MM-DDT00:00.00.00Z or
<ide> substitute time offset for Z, i.e,
<ide> -05:00
<ide> :type create_time_min: ``str``
<add>
<ide> :param create_time_max: (Optional) in form YYYY-MM-DDT00:00:00.000Z or
<ide> substitute time offset for Z, i.e,
<ide> -05:00
<ide> :type create_time_max: ``str``
<del> :return: `list` of :class" `NttCisSnapshots`
<add>
<add> :rtype: `list` of :class:`NttCisSnapshots`
<ide> """
<ide>
<ide> if create_time_min is None and create_time_max is None:
<ide> def ex_list_consistency_group_snapshots(self, consistency_group_id,
<ide>
<ide> def ex_expand_journal(self, consistency_group_id, size_gb):
<ide> """
<del> Expand the consistency group's journhal size in 100Gb increments
<add> Expand the consistency group's journhal size in 100Gb increments.
<add>
<ide> :param consistency_group_id: The consistency group's UUID
<del> :type consistency_group_id: ``str``
<add> :type consistency_group_id: ``str``
<add>
<ide> :param size_gb: Gb in 100 Gb increments
<del> :type size_gb: ``str``
<del> :return: True/False
<add> :type size_gb: ``str``
<add>
<add> :return: True if response_code contains either 'IN_PROGRESS' or 'OK'
<add> otherwise False
<ide> :rtype: ``bool``
<ide> """
<ide>
<ide> def ex_expand_journal(self, consistency_group_id, size_gb):
<ide> def ex_start_drs_failover_preview(self, consistency_group_id,
<ide> snapshot_id):
<ide> """
<del> Brings a Consistency Group into PREVIEWING_SNAPSHOT mode
<add> Brings a Consistency Group into PREVIEWING_SNAPSHOT mode.
<ide>
<ide> :param consistency_group_id: Id of the Consistency Group to put into
<ide> PRIEVEW_MODE
<ide> :type consistency_group_id: ``str``
<add>
<ide> :param snapshot_id: Id of the Snapshot to preview
<ide> :type snapshot_id: ``str``
<del> :return: True if response_code contains eiether 'IN_PROGRESS' or 'OK'
<add>
<add> :return: True if response_code contains either 'IN_PROGRESS' or 'OK'
<ide> otherwise False
<ide> :rtype: ``bool``
<ide> """
<ide> def ex_stop_drs_failover_preview(self, consistency_group_id):
<ide>
<ide> :param consistency_group_id: Consistency Group's Id
<ide> :type ``str``
<del> :return: True if response_code contains eiether 'IN_PROGRESS' or 'OK'
<del> otherwise False
<add>
<add> :return: True if response_code contains either 'IN_PROGRESS' or 'OK'
<add> otherwise False
<ide> :rtype: ``bool``
<ide> """
<ide> preview_elm = ET.Element("stopPreviewSnapshot",
<ide> def ex_initiate_drs_failover(self, consistency_group_id):
<ide>
<ide> :param consistency_group_id: Consistency Group's Id to failover
<ide> :type consistency_group_id: ``str``
<del> :return: :return: True if response_code contains eiether
<del> IN_PROGRESS' or 'OK' otherwise False
<add>
<add> :return: True if response_code contains either
<add> IN_PROGRESS' or 'OK' otherwise False
<ide> :rtype: ``bool``
<ide> """
<ide> failover_elm = ET.Element("initiateFailover",
<ide> def ex_delete_consistency_group(self, consistency_group_id):
<ide>
<ide> :param consistency_group_id: Id of Consistency Group to delete
<ide> :type ``str``
<del> :return: True if response_code contains eiether
<del> IN_PROGRESS' or 'OK' otherwise False
<add> :return: True if response_code contains either
<add> IN_PROGRESS' or 'OK' otherwise False
<ide> :rtype: ``bool``
<ide> """
<ide> delete_elm = ET.Element("deleteConsistencyGroup", | 1 |
Text | Text | add an article on authentication | 2960824eabfcd429ea85215606f2ea7b040d0661 | <ide><path>docs/introduction/Ecosystem.md
<ide> On this page we will only feature a few of them that the Redux maintainers have
<ide> * [Understanding Redux](http://www.youhavetolearncomputers.com/blog/2015/9/15/a-conceptual-overview-of-redux-or-how-i-fell-in-love-with-a-javascript-state-container) — Learn the basic concepts of Redux
<ide> * [Handcrafting an Isomorphic Redux Application (With Love)](https://medium.com/@bananaoomarang/handcrafting-an-isomorphic-redux-application-with-love-40ada4468af4) — A guide to creating a universal app with data fetching and routing
<ide> * [Full-Stack Redux Tutorial](http://teropa.info/blog/2015/09/10/full-stack-redux-tutorial.html) — A comprehensive guide to test-first development with Redux, React, and Immutable
<add>* [Secure Your React and Redux App with JWT Authentication](https://auth0.com/blog/2016/01/04/secure-your-react-and-redux-app-with-jwt-authentication/) — Learn how to add JWT authentication to your React and Redux app
<ide> * [Understanding Redux Middleware](https://medium.com/@meagle/understanding-87566abcfb7a#.l033pyr02) — In-depth guide to implementing Redux middleware
<ide> * [Angular 2 — Introduction to Redux](https://medium.com/google-developer-experts/angular-2-introduction-to-redux-1cf18af27e6e) — An introduction to Redux fundamental concepts with an example in Angular 2
<ide> * [Working with VK API (in Russian)](https://www.gitbook.com/book/maxfarseer/redux-course-ru/details) — A tutorial in Russian that demonstrates creating an app that consumes VK API | 1 |
Go | Go | create layer folders with correct acl | ed10ac6ee93cf5c389a735c0c97b08d5d5dff3a9 | <ide><path>api/common.go
<ide> func MatchesContentType(contentType, expectedType string) bool {
<ide> // LoadOrCreateTrustKey attempts to load the libtrust key at the given path,
<ide> // otherwise generates a new one
<ide> func LoadOrCreateTrustKey(trustKeyPath string) (libtrust.PrivateKey, error) {
<del> err := system.MkdirAll(filepath.Dir(trustKeyPath), 0700)
<add> err := system.MkdirAll(filepath.Dir(trustKeyPath), 0700, "")
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide><path>container/container_windows.go
<ide> func (container *Container) CreateSecretSymlinks() error {
<ide> if err != nil {
<ide> return err
<ide> }
<del> if err := system.MkdirAll(filepath.Dir(resolvedPath), 0); err != nil {
<add> if err := system.MkdirAll(filepath.Dir(resolvedPath), 0, ""); err != nil {
<ide> return err
<ide> }
<ide> if err := os.Symlink(filepath.Join(containerInternalSecretMountPath, r.SecretID), resolvedPath); err != nil {
<ide> func (container *Container) CreateConfigSymlinks() error {
<ide> if err != nil {
<ide> return err
<ide> }
<del> if err := system.MkdirAll(filepath.Dir(resolvedPath), 0); err != nil {
<add> if err := system.MkdirAll(filepath.Dir(resolvedPath), 0, ""); err != nil {
<ide> return err
<ide> }
<ide> if err := os.Symlink(filepath.Join(containerInternalConfigsDirPath, configRef.ConfigID), resolvedPath); err != nil {
<ide><path>daemon/container_operations_windows.go
<ide> func (daemon *Daemon) setupConfigDir(c *container.Container) (setupErr error) {
<ide> logrus.Debugf("configs: setting up config dir: %s", localPath)
<ide>
<ide> // create local config root
<del> if err := system.MkdirAllWithACL(localPath, 0); err != nil {
<add> if err := system.MkdirAllWithACL(localPath, 0, system.SddlAdministratorsLocalSystem); err != nil {
<ide> return errors.Wrap(err, "error creating config dir")
<ide> }
<ide>
<ide> func (daemon *Daemon) setupSecretDir(c *container.Container) (setupErr error) {
<ide> logrus.Debugf("secrets: setting up secret dir: %s", localMountPath)
<ide>
<ide> // create local secret root
<del> if err := system.MkdirAllWithACL(localMountPath, 0); err != nil {
<add> if err := system.MkdirAllWithACL(localMountPath, 0, system.SddlAdministratorsLocalSystem); err != nil {
<ide> return errors.Wrap(err, "error creating secret local directory")
<ide> }
<ide>
<ide><path>daemon/daemon.go
<ide> func NewDaemon(config *config.Config, registryService registry.Service, containe
<ide> }
<ide>
<ide> if runtime.GOOS == "windows" {
<del> if err := system.MkdirAll(filepath.Join(config.Root, "credentialspecs"), 0); err != nil && !os.IsExist(err) {
<add> if err := system.MkdirAll(filepath.Join(config.Root, "credentialspecs"), 0, ""); err != nil && !os.IsExist(err) {
<ide> return nil, err
<ide> }
<ide> }
<ide> func NewDaemon(config *config.Config, registryService registry.Service, containe
<ide>
<ide> trustDir := filepath.Join(config.Root, "trust")
<ide>
<del> if err := system.MkdirAll(trustDir, 0700); err != nil {
<add> if err := system.MkdirAll(trustDir, 0700, ""); err != nil {
<ide> return nil, err
<ide> }
<ide>
<ide><path>daemon/daemon_windows.go
<ide> func setupRemappedRoot(config *config.Config) (*idtools.IDMappings, error) {
<ide> func setupDaemonRoot(config *config.Config, rootDir string, rootIDs idtools.IDPair) error {
<ide> config.Root = rootDir
<ide> // Create the root directory if it doesn't exists
<del> if err := system.MkdirAllWithACL(config.Root, 0); err != nil && !os.IsExist(err) {
<add> if err := system.MkdirAllWithACL(config.Root, 0, system.SddlAdministratorsLocalSystem); err != nil && !os.IsExist(err) {
<ide> return err
<ide> }
<ide> return nil
<ide><path>daemon/graphdriver/lcow/lcow.go
<ide> func (d *Driver) Create(id, parent string, opts *graphdriver.CreateOpts) error {
<ide> // Make sure layers are created with the correct ACL so that VMs can access them.
<ide> layerPath := d.dir(id)
<ide> logrus.Debugf("lcowdriver: create: id %s: creating layerPath %s", id, layerPath)
<add> // Make sure the layers are created with the correct ACL so that VMs can access them.
<ide> if err := system.MkdirAllWithACL(layerPath, 755, system.SddlNtvmAdministratorsLocalSystem); err != nil {
<ide> return err
<ide> }
<ide><path>libcontainerd/remote_unix.go
<ide> func New(stateDir string, options ...RemoteOption) (_ Remote, err error) {
<ide> }
<ide> }
<ide>
<del> if err := system.MkdirAll(stateDir, 0700); err != nil {
<add> if err := system.MkdirAll(stateDir, 0700, ""); err != nil {
<ide> return nil, err
<ide> }
<ide>
<ide><path>pkg/archive/archive.go
<ide> func (archiver *Archiver) CopyFileWithTar(src, dst string) (err error) {
<ide> dst = filepath.Join(dst, filepath.Base(src))
<ide> }
<ide> // Create the holding directory if necessary
<del> if err := system.MkdirAll(filepath.Dir(dst), 0700); err != nil {
<add> if err := system.MkdirAll(filepath.Dir(dst), 0700, ""); err != nil {
<ide> return err
<ide> }
<ide>
<ide><path>pkg/archive/diff.go
<ide> func UnpackLayer(dest string, layer io.Reader, options *TarOptions) (size int64,
<ide> parentPath := filepath.Join(dest, parent)
<ide>
<ide> if _, err := os.Lstat(parentPath); err != nil && os.IsNotExist(err) {
<del> err = system.MkdirAll(parentPath, 0600)
<add> err = system.MkdirAll(parentPath, 0600, "")
<ide> if err != nil {
<ide> return 0, err
<ide> }
<ide><path>pkg/chrootarchive/archive_test.go
<ide> func TestChrootTarUntar(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if err := ioutil.WriteFile(filepath.Join(src, "toto"), []byte("hello toto"), 0644); err != nil {
<ide> func TestChrootTarUntar(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> dest := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(dest, 0700); err != nil {
<add> if err := system.MkdirAll(dest, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if err := Untar(stream, dest, &archive.TarOptions{ExcludePatterns: []string{"lolo"}}); err != nil {
<ide> func TestChrootUntarWithHugeExcludesList(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if err := ioutil.WriteFile(filepath.Join(src, "toto"), []byte("hello toto"), 0644); err != nil {
<ide> func TestChrootUntarWithHugeExcludesList(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> dest := filepath.Join(tmpdir, "dest")
<del> if err := system.MkdirAll(dest, 0700); err != nil {
<add> if err := system.MkdirAll(dest, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> options := &archive.TarOptions{}
<ide> func TestChrootTarUntarWithSymlink(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if _, err := prepareSourceDirectory(10, src, false); err != nil {
<ide> func TestChrootCopyWithTar(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if _, err := prepareSourceDirectory(10, src, true); err != nil {
<ide> func TestChrootCopyFileWithTar(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if _, err := prepareSourceDirectory(10, src, true); err != nil {
<ide> func TestChrootUntarPath(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if _, err := prepareSourceDirectory(10, src, false); err != nil {
<ide> func TestChrootUntarEmptyArchiveFromSlowReader(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> dest := filepath.Join(tmpdir, "dest")
<del> if err := system.MkdirAll(dest, 0700); err != nil {
<add> if err := system.MkdirAll(dest, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> stream := &slowEmptyTarReader{size: 10240, chunkSize: 1024}
<ide> func TestChrootApplyEmptyArchiveFromSlowReader(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> dest := filepath.Join(tmpdir, "dest")
<del> if err := system.MkdirAll(dest, 0700); err != nil {
<add> if err := system.MkdirAll(dest, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> stream := &slowEmptyTarReader{size: 10240, chunkSize: 1024}
<ide> func TestChrootApplyDotDotFile(t *testing.T) {
<ide> }
<ide> defer os.RemoveAll(tmpdir)
<ide> src := filepath.Join(tmpdir, "src")
<del> if err := system.MkdirAll(src, 0700); err != nil {
<add> if err := system.MkdirAll(src, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if err := ioutil.WriteFile(filepath.Join(src, "..gitme"), []byte(""), 0644); err != nil {
<ide> func TestChrootApplyDotDotFile(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> dest := filepath.Join(tmpdir, "dest")
<del> if err := system.MkdirAll(dest, 0700); err != nil {
<add> if err := system.MkdirAll(dest, 0700, ""); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> if _, err := ApplyLayer(dest, stream); err != nil {
<ide><path>pkg/idtools/idtools_unix.go
<ide> func mkdirAs(path string, mode os.FileMode, ownerUID, ownerGID int, mkAll, chown
<ide> paths = append(paths, dirPath)
<ide> }
<ide> }
<del> if err := system.MkdirAll(path, mode); err != nil && !os.IsExist(err) {
<add> if err := system.MkdirAll(path, mode, ""); err != nil && !os.IsExist(err) {
<ide> return err
<ide> }
<ide> } else {
<ide><path>pkg/idtools/idtools_windows.go
<ide> import (
<ide> // Platforms such as Windows do not support the UID/GID concept. So make this
<ide> // just a wrapper around system.MkdirAll.
<ide> func mkdirAs(path string, mode os.FileMode, ownerUID, ownerGID int, mkAll, chownExisting bool) error {
<del> if err := system.MkdirAll(path, mode); err != nil && !os.IsExist(err) {
<add> if err := system.MkdirAll(path, mode, ""); err != nil && !os.IsExist(err) {
<ide> return err
<ide> }
<ide> return nil
<ide><path>pkg/pidfile/pidfile.go
<ide> func New(path string) (*PIDFile, error) {
<ide> return nil, err
<ide> }
<ide> // Note MkdirAll returns nil if a directory already exists
<del> if err := system.MkdirAll(filepath.Dir(path), os.FileMode(0755)); err != nil {
<add> if err := system.MkdirAll(filepath.Dir(path), os.FileMode(0755), ""); err != nil {
<ide> return nil, err
<ide> }
<ide> if err := ioutil.WriteFile(path, []byte(fmt.Sprintf("%d", os.Getpid())), 0644); err != nil {
<ide><path>pkg/system/filesys.go
<ide> import (
<ide> "path/filepath"
<ide> )
<ide>
<del>// MkdirAllWithACL is a wrapper for MkdirAll that creates a directory
<del>// ACL'd for Builtin Administrators and Local System.
<del>func MkdirAllWithACL(path string, perm os.FileMode) error {
<del> return MkdirAll(path, perm)
<add>// MkdirAllWithACL is a wrapper for MkdirAll on unix systems.
<add>func MkdirAllWithACL(path string, perm os.FileMode, sddl string) error {
<add> return MkdirAll(path, perm, sddl)
<ide> }
<ide>
<ide> // MkdirAll creates a directory named path along with any necessary parents,
<ide> // with permission specified by attribute perm for all dir created.
<del>func MkdirAll(path string, perm os.FileMode) error {
<add>func MkdirAll(path string, perm os.FileMode, sddl string) error {
<ide> return os.MkdirAll(path, perm)
<ide> }
<ide>
<ide><path>pkg/system/filesys_windows.go
<ide> import (
<ide> winio "github.com/Microsoft/go-winio"
<ide> )
<ide>
<add>const (
<add> // SddlAdministratorsLocalSystem is local administrators plus NT AUTHORITY\System
<add> SddlAdministratorsLocalSystem = "D:P(A;OICI;GA;;;BA)(A;OICI;GA;;;SY)"
<add> // SddlNtvmAdministratorsLocalSystem is NT VIRTUAL MACHINE\Virtual Machines plus local administrators plus NT AUTHORITY\System
<add> SddlNtvmAdministratorsLocalSystem = "D:P(A;OICI;GA;;;S-1-5-83-0)(A;OICI;GA;;;BA)(A;OICI;GA;;;SY)"
<add>)
<add>
<ide> // MkdirAllWithACL is a wrapper for MkdirAll that creates a directory
<del>// ACL'd for Builtin Administrators and Local System.
<del>func MkdirAllWithACL(path string, perm os.FileMode) error {
<del> return mkdirall(path, true)
<add>// with an appropriate SDDL defined ACL.
<add>func MkdirAllWithACL(path string, perm os.FileMode, sddl string) error {
<add> return mkdirall(path, true, sddl)
<ide> }
<ide>
<ide> // MkdirAll implementation that is volume path aware for Windows.
<del>func MkdirAll(path string, _ os.FileMode) error {
<del> return mkdirall(path, false)
<add>func MkdirAll(path string, _ os.FileMode, sddl string) error {
<add> return mkdirall(path, false, sddl)
<ide> }
<ide>
<ide> // mkdirall is a custom version of os.MkdirAll modified for use on Windows
<ide> // so that it is both volume path aware, and can create a directory with
<ide> // a DACL.
<del>func mkdirall(path string, adminAndLocalSystem bool) error {
<add>func mkdirall(path string, applyACL bool, sddl string) error {
<ide> if re := regexp.MustCompile(`^\\\\\?\\Volume{[a-z0-9-]+}$`); re.MatchString(path) {
<ide> return nil
<ide> }
<ide> func mkdirall(path string, adminAndLocalSystem bool) error {
<ide>
<ide> if j > 1 {
<ide> // Create parent
<del> err = mkdirall(path[0:j-1], false)
<add> err = mkdirall(path[0:j-1], false, sddl)
<ide> if err != nil {
<ide> return err
<ide> }
<ide> }
<ide>
<ide> // Parent now exists; invoke os.Mkdir or mkdirWithACL and use its result.
<del> if adminAndLocalSystem {
<del> err = mkdirWithACL(path)
<add> if applyACL {
<add> err = mkdirWithACL(path, sddl)
<ide> } else {
<ide> err = os.Mkdir(path, 0)
<ide> }
<ide> func mkdirall(path string, adminAndLocalSystem bool) error {
<ide> // in golang to cater for creating a directory with an ACL permitting full
<ide> // access, with inheritance, to any subfolder/file for Built-in Administrators
<ide> // and Local System.
<del>func mkdirWithACL(name string) error {
<add>func mkdirWithACL(name string, sddl string) error {
<ide> sa := syscall.SecurityAttributes{Length: 0}
<del> sddl := "D:P(A;OICI;GA;;;BA)(A;OICI;GA;;;SY)"
<add>
<ide> sd, err := winio.SddlToSecurityDescriptor(sddl)
<ide> if err != nil {
<ide> return &os.PathError{Op: "mkdir", Path: name, Err: err} | 15 |
Mixed | Ruby | support direct uploads to multiple services | 193289dbbe146c56ec16faf8dd1a2c88611feb83 | <ide><path>actiontext/app/assets/javascripts/actiontext.js
<ide> var activestorage = {exports: {}};
<ide> }
<ide> }
<ide> class BlobRecord {
<del> constructor(file, checksum, url) {
<add> constructor(file, checksum, url, directUploadToken, attachmentName) {
<ide> this.file = file;
<ide> this.attributes = {
<ide> filename: file.name,
<ide> content_type: file.type || "application/octet-stream",
<ide> byte_size: file.size,
<ide>          checksum: checksum
<ide> };
<add> this.directUploadToken = directUploadToken;
<add> this.attachmentName = attachmentName;
<ide> this.xhr = new XMLHttpRequest;
<ide> this.xhr.open("POST", url, true);
<ide> this.xhr.responseType = "json";
<ide> var activestorage = {exports: {}};
<ide> create(callback) {
<ide> this.callback = callback;
<ide> this.xhr.send(JSON.stringify({
<del> blob: this.attributes
<add> blob: this.attributes,
<add> direct_upload_token: this.directUploadToken,
<add> attachment_name: this.attachmentName
<ide> }));
<ide> }
<ide> requestDidLoad(event) {
<ide> var activestorage = {exports: {}};
<ide> }
<ide> let id = 0;
<ide> class DirectUpload {
<del> constructor(file, url, delegate) {
<add> constructor(file, url, directUploadToken, attachmentName, delegate) {
<ide> this.id = ++id;
<ide> this.file = file;
<ide> this.url = url;
<add> this.directUploadToken = directUploadToken;
<add> this.attachmentName = attachmentName;
<ide> this.delegate = delegate;
<ide> }
<ide> create(callback) {
<ide> var activestorage = {exports: {}};
<ide> callback(error);
<ide> return;
<ide> }
<del> const blob = new BlobRecord(this.file, checksum, this.url);
<add> const blob = new BlobRecord(this.file, checksum, this.url, this.directUploadToken, this.attachmentName);
<ide> notify(this.delegate, "directUploadWillCreateBlobWithXHR", blob.xhr);
<ide> blob.create((error => {
<ide> if (error) {
<ide> var activestorage = {exports: {}};
<ide> constructor(input, file) {
<ide> this.input = input;
<ide> this.file = file;
<del> this.directUpload = new DirectUpload(this.file, this.url, this);
<add> this.directUpload = new DirectUpload(this.file, this.url, this.directUploadToken, this.attachmentName, this);
<ide> this.dispatch("initialize");
<ide> }
<ide> start(callback) {
<ide> var activestorage = {exports: {}};
<ide> get url() {
<ide> return this.input.getAttribute("data-direct-upload-url");
<ide> }
<add> get directUploadToken() {
<add> return this.input.getAttribute("data-direct-upload-token");
<add> }
<add> get attachmentName() {
<add> return this.input.getAttribute("data-direct-upload-attachment-name");
<add> }
<ide> dispatch(name, detail = {}) {
<ide> detail.file = this.file;
<ide> detail.id = this.directUpload.id;
<ide> class AttachmentUpload {
<ide> constructor(attachment, element) {
<ide> this.attachment = attachment;
<ide> this.element = element;
<del> this.directUpload = new activestorage.exports.DirectUpload(attachment.file, this.directUploadUrl, this);
<add> this.directUpload = new activestorage.exports.DirectUpload(attachment.file, this.directUploadUrl, this.directUploadToken, this.directUploadAttachmentName, this);
<ide> }
<ide>
<ide> start() {
<ide> class AttachmentUpload {
<ide> return this.element.dataset.directUploadUrl
<ide> }
<ide>
<add> get directUploadToken() {
<add> return this.element.dataset.directUploadToken
<add> }
<add>
<add> get directUploadAttachmentName() {
<add> return this.element.dataset.directUploadAttachmentName
<add> }
<add>
<ide> get blobUrlTemplate() {
<ide> return this.element.dataset.blobUrlTemplate
<ide> }
<ide><path>actiontext/app/helpers/action_text/tag_helper.rb
<ide> def rich_text_area_tag(name, value = nil, options = {})
<ide> options[:data][:direct_upload_url] ||= main_app.rails_direct_uploads_url
<ide> options[:data][:blob_url_template] ||= main_app.rails_service_blob_url(":signed_id", ":filename")
<ide>
<add> class_with_attachment = "ActionText::RichText#embeds"
<add> options[:data][:direct_upload_attachment_name] ||= class_with_attachment
<add> options[:data][:direct_upload_token] = ActiveStorage::DirectUploadToken.generate_direct_upload_token(
<add> class_with_attachment,
<add> ActiveStorage::Blob.service.name,
<add> session
<add> )
<add>
<ide> editor_tag = content_tag("trix-editor", "", options)
<ide> input_tag = hidden_field_tag(name, value.try(:to_trix_html) || value, id: options[:input], form: form)
<ide>
<ide><path>actiontext/test/template/form_helper_test.rb
<ide> # frozen_string_literal: true
<ide>
<ide> require "test_helper"
<add>require "minitest/mock"
<ide>
<ide> class ActionText::FormHelperTest < ActionView::TestCase
<ide> tests ActionText::TagHelper
<ide> def form_with(*, **)
<ide> test "rich text area tag" do
<ide> message = Message.new
<ide>
<del> form_with model: message, scope: :message do |form|
<del> rich_text_area_tag :content, message.content, { input: "trix_input_1" }
<del> end
<add> with_stub_token do
<add> form_with model: message, scope: :message do |form|
<add> rich_text_area_tag :content, message.content, { input: "trix_input_1" }
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="content" id="trix_input_1" autocomplete="off" />' \
<del> '<trix-editor input="trix_input_1" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="content" id="trix_input_1" autocomplete="off" />' \
<add> '<trix-editor input="trix_input_1" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :content
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :content
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area having class" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :content, class: "custom-class"
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :content, class: "custom-class"
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor id="message_content" input="message_content_trix_input_message" class="custom-class" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor id="message_content" input="message_content_trix_input_message" class="custom-class" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area for non-attribute" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :not_an_attribute
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :not_an_attribute
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[not_an_attribute]" id="message_not_an_attribute_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor id="message_not_an_attribute" input="message_not_an_attribute_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[not_an_attribute]" id="message_not_an_attribute_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor id="message_not_an_attribute" input="message_not_an_attribute_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "modelless form with rich text area" do
<del> form_with url: "/messages", scope: :message do |form|
<del> form.rich_text_area :content, { input: "trix_input_2" }
<del> end
<add> with_stub_token do
<add> form_with url: "/messages", scope: :message do |form|
<add> form.rich_text_area :content, { input: "trix_input_2" }
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="trix_input_2" autocomplete="off" />' \
<del> '<trix-editor id="message_content" input="trix_input_2" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="trix_input_2" autocomplete="off" />' \
<add> '<trix-editor id="message_content" input="trix_input_2" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area having placeholder without locale" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :content, placeholder: true
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :content, placeholder: true
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor placeholder="Content" id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor placeholder="Content" id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area having placeholder with locale" do
<ide>     I18n.with_locale :placeholder do
<del>      form_with model: Message.new, scope: :message do |form|
<del>        form.rich_text_area :title, placeholder: true
<del>      end
<del>    end
<add>      with_stub_token do
<add>        form_with model: Message.new, scope: :message do |form|
<add>          form.rich_text_area :title, placeholder: true
<add>        end
<ide>
<del>    assert_dom_equal \
<del>      '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del>      '<input type="hidden" name="message[title]" id="message_title_trix_input_message" autocomplete="off" />' \
<del>      '<trix-editor placeholder="Story title" id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del>      "</trix-editor>" \
<del>      "</form>",
<del>      output_buffer
<add>        assert_dom_equal \
<add>          '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add>          '<input type="hidden" name="message[title]" id="message_title_trix_input_message" autocomplete="off" />' \
<add>          '<trix-editor placeholder="Story title" id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add>          "</trix-editor>" \
<add>          "</form>",
<add>          output_buffer
<add>      end
<ide>     end
<ide>   end
<ide>
<ide> test "form with rich text area with value" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :title, value: "<h1>hello world</h1>"
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :title, value: "<h1>hello world</h1>"
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[title]" id="message_title_trix_input_message" value="<h1>hello world</h1>" autocomplete="off" />' \
<del> '<trix-editor id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[title]" id="message_title_trix_input_message" value="<h1>hello world</h1>" autocomplete="off" />' \
<add> '<trix-editor id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area with form attribute" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :title, form: "other_form"
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :title, form: "other_form"
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[title]" id="message_title_trix_input_message" form="other_form" autocomplete="off" />' \
<del> '<trix-editor id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[title]" id="message_title_trix_input_message" form="other_form" autocomplete="off" />' \
<add> '<trix-editor id="message_title" input="message_title_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area with data[direct_upload_url]" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :content, data: { direct_upload_url: "http://test.host/direct_uploads" }
<del> end
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :content, data: { direct_upload_url: "http://test.host/direct_uploads" }
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/direct_uploads" data-blob-url-template="http://test.host/rails/active_storage/blobs/redirect/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<add> end
<ide> end
<ide>
<ide> test "form with rich text area with data[blob_url_template]" do
<del> form_with model: Message.new, scope: :message do |form|
<del> form.rich_text_area :content, data: { blob_url_template: "http://test.host/blobs/:signed_id/:filename" }
<add> with_stub_token do
<add> form_with model: Message.new, scope: :message do |form|
<add> form.rich_text_area :content, data: { blob_url_template: "http://test.host/blobs/:signed_id/:filename" }
<add> end
<add>
<add> assert_dom_equal \
<add> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<add> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<add> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/blobs/:signed_id/:filename" data-direct-upload-attachment-name="ActionText::RichText#embeds" data-direct-upload-token="token">' \
<add> "</trix-editor>" \
<add> "</form>",
<add> output_buffer
<ide> end
<add> end
<ide>
<del> assert_dom_equal \
<del> '<form action="/messages" accept-charset="UTF-8" method="post">' \
<del> '<input type="hidden" name="message[content]" id="message_content_trix_input_message" autocomplete="off" />' \
<del> '<trix-editor id="message_content" input="message_content_trix_input_message" class="trix-content" data-direct-upload-url="http://test.host/rails/active_storage/direct_uploads" data-blob-url-template="http://test.host/blobs/:signed_id/:filename">' \
<del> "</trix-editor>" \
<del> "</form>",
<del> output_buffer
<add> def with_stub_token(&block)
<add> ActiveStorage::DirectUploadToken.stub(:generate_direct_upload_token, "token", &block)
<ide> end
<ide> end
<ide><path>actionview/lib/action_view/helpers/form_helper.rb
<ide> def hidden_field(object_name, method, options = {})
<ide> # file_field(:attachment, :file, class: 'file_input')
<ide> # # => <input type="file" id="attachment_file" name="attachment[file]" class="file_input" />
<ide> def file_field(object_name, method, options = {})
<del> Tags::FileField.new(object_name, method, self, convert_direct_upload_option_to_url(options.dup)).render
<add> Tags::FileField.new(object_name, method, self, convert_direct_upload_option_to_url(method, options.dup)).render
<ide> end
<ide>
<ide> # Returns a textarea opening and closing tag set tailored for accessing a specified attribute (identified by +method+)
<ide><path>actionview/lib/action_view/helpers/form_tag_helper.rb
<ide> def hidden_field_tag(name, value = nil, options = {})
<ide> # file_field_tag 'file', accept: 'text/html', class: 'upload', value: 'index.html'
<ide> # # => <input accept="text/html" class="upload" id="file" name="file" type="file" value="index.html" />
<ide> def file_field_tag(name, options = {})
<del> text_field_tag(name, nil, convert_direct_upload_option_to_url(options.merge(type: :file)))
<add> text_field_tag(name, nil, convert_direct_upload_option_to_url(name, options.merge(type: :file)))
<ide> end
<ide>
<ide> # Creates a password field, a masked text field that will hide the users input behind a mask character.
<ide> def set_default_disable_with(value, tag_options)
<ide> tag_options.delete("data-disable-with")
<ide> end
<ide>
<del> def convert_direct_upload_option_to_url(options)
<add> def convert_direct_upload_option_to_url(name, options)
<ide> if options.delete(:direct_upload) && respond_to?(:rails_direct_uploads_url)
<ide> options["data-direct-upload-url"] = rails_direct_uploads_url
<add>
<add> if options[:object] && options[:object].class.respond_to?(:reflect_on_attachment)
<add> attachment_reflection = options[:object].class.reflect_on_attachment(name)
<add>
<add> class_with_attachment = "#{options[:object].class.name.underscore}##{name}"
<add> options["data-direct-upload-attachment-name"] = class_with_attachment
<add>
<add> service_name = attachment_reflection.options[:service_name] || ActiveStorage::Blob.service.name
<add> options["data-direct-upload-token"] = ActiveStorage::DirectUploadToken.generate_direct_upload_token(
<add> class_with_attachment,
<add> service_name,
<add> session
<add> )
<add> end
<ide> end
<ide> options
<ide> end
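The `data-direct-upload-token` emitted by the helper above is what the controller later verifies before honoring a client-supplied service name. As an editorial sketch of that sign-then-verify round trip (hypothetical helper, not the actual `ActiveStorage::DirectUploadToken` implementation, which additionally scopes the token to the user's session):

```ruby
require "json"
require "openssl"

# Minimal sketch of a signed direct-upload token: the server signs the
# attachment name together with its service name, and later refuses any
# token whose signature or attachment name does not match. Hypothetical
# module, illustrating the idea only.
module DirectUploadTokenSketch
  SEPARATOR = "--"

  def self.generate(attachment_name, service_name, secret)
    payload = [JSON.generate("attachment_name" => attachment_name,
                             "service_name" => service_name)].pack("m0")
    payload + SEPARATOR + hmac(payload, secret)
  end

  # Returns the verified service name, or raises if the token was
  # tampered with or issued for a different attachment.
  def self.verify!(token, attachment_name, secret)
    payload, digest = token.split(SEPARATOR)
    # A real implementation should use a constant-time comparison here.
    raise "invalid token signature" unless digest == hmac(payload, secret)

    data = JSON.parse(payload.unpack1("m0"))
    raise "token issued for another attachment" unless data["attachment_name"] == attachment_name

    data["service_name"]
  end

  def self.hmac(data, secret)
    OpenSSL::HMAC.hexdigest("SHA256", secret, data)
  end
end

token = DirectUploadTokenSketch.generate("user#avatar", "s3", "secret")
puts DirectUploadTokenSketch.verify!(token, "user#avatar", "secret") # => s3
```

Because the service name rides inside the signed payload rather than in a free-form parameter, a client cannot redirect its upload to a service the form never offered.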
<ide><path>activestorage/CHANGELOG.md
<add>* Support direct uploads to multiple services.
<add>
<add> *Dmitry Tsepelev*
<add>
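As an editorial illustration of the entry above (model and service names hypothetical; `:s3` is assumed to be declared in `config/storage.yml` per the multiple-services feature), each attachment can target its own service and still use direct uploads, with the form field carrying a token bound to that service:

```ruby
# Hypothetical app code, not part of this commit.
class User < ApplicationRecord
  has_one_attached :avatar                     # default service
  has_one_attached :highlights, service: :s3   # dedicated service
end

# In the view, a direct-upload field now also emits the attachment name
# and a signed token naming the verified service:
#
#   form.file_field :highlights, direct_upload: true
#   # => <input ... data-direct-upload-attachment-name="user#highlights"
#   #              data-direct-upload-token="...">
```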
<ide> * Invalid default content types are deprecated
<ide>
<ide> Blobs created with content_type `image/jpg`, `image/pjpeg`, `image/bmp`, `text/javascript` will now produce
<ide><path>activestorage/app/assets/javascripts/activestorage.esm.js
<ide> function toArray(value) {
<ide> }
<ide>
<ide> class BlobRecord {
<del> constructor(file, checksum, url) {
<add> constructor(file, checksum, url, directUploadToken, attachmentName) {
<ide> this.file = file;
<ide> this.attributes = {
<ide> filename: file.name,
<ide> content_type: file.type || "application/octet-stream",
<ide> byte_size: file.size,
<ide> checksum: checksum
<ide> };
<add> this.directUploadToken = directUploadToken;
<add> this.attachmentName = attachmentName;
<ide> this.xhr = new XMLHttpRequest;
<ide> this.xhr.open("POST", url, true);
<ide> this.xhr.responseType = "json";
<ide> class BlobRecord {
<ide> create(callback) {
<ide> this.callback = callback;
<ide> this.xhr.send(JSON.stringify({
<del> blob: this.attributes
<add> blob: this.attributes,
<add> direct_upload_token: this.directUploadToken,
<add> attachment_name: this.attachmentName
<ide> }));
<ide> }
<ide> requestDidLoad(event) {
<ide> class BlobUpload {
<ide> let id = 0;
<ide>
<ide> class DirectUpload {
<del> constructor(file, url, delegate) {
<add>  constructor(file, url, directUploadToken, attachmentName, delegate) {
<ide>    this.id = ++id;
<ide>    this.file = file;
<ide>    this.url = url;
<add>    this.directUploadToken = directUploadToken;
<add>    this.attachmentName = attachmentName;
<ide>    this.delegate = delegate;
<ide>  }
<ide>  create(callback) {
<ide> class DirectUpload {
<ide>    callback(error);
<ide>    return;
<ide>   }
<del>   const blob = new BlobRecord(this.file, checksum, this.url);
<add>   const blob = new BlobRecord(this.file, checksum, this.url, this.directUploadToken, this.attachmentName);
<ide> notify(this.delegate, "directUploadWillCreateBlobWithXHR", blob.xhr);
<ide> blob.create((error => {
<ide> if (error) {
<ide> class DirectUploadController {
<ide> constructor(input, file) {
<ide> this.input = input;
<ide> this.file = file;
<del> this.directUpload = new DirectUpload(this.file, this.url, this);
<add> this.directUpload = new DirectUpload(this.file, this.url, this.directUploadToken, this.attachmentName, this);
<ide> this.dispatch("initialize");
<ide> }
<ide> start(callback) {
<ide> class DirectUploadController {
<ide> get url() {
<ide> return this.input.getAttribute("data-direct-upload-url");
<ide> }
<add> get directUploadToken() {
<add> return this.input.getAttribute("data-direct-upload-token");
<add> }
<add> get attachmentName() {
<add> return this.input.getAttribute("data-direct-upload-attachment-name");
<add> }
<ide> dispatch(name, detail = {}) {
<ide> detail.file = this.file;
<ide> detail.id = this.directUpload.id;
<ide><path>activestorage/app/assets/javascripts/activestorage.js
<ide> }
<ide> }
<ide> class BlobRecord {
<del> constructor(file, checksum, url) {
<add> constructor(file, checksum, url, directUploadToken, attachmentName) {
<ide> this.file = file;
<ide> this.attributes = {
<ide> filename: file.name,
<ide> content_type: file.type || "application/octet-stream",
<ide> byte_size: file.size,
<ide> checksum: checksum
<ide> };
<add> this.directUploadToken = directUploadToken;
<add> this.attachmentName = attachmentName;
<ide> this.xhr = new XMLHttpRequest;
<ide> this.xhr.open("POST", url, true);
<ide> this.xhr.responseType = "json";
<ide> create(callback) {
<ide> this.callback = callback;
<ide> this.xhr.send(JSON.stringify({
<del> blob: this.attributes
<add> blob: this.attributes,
<add> direct_upload_token: this.directUploadToken,
<add> attachment_name: this.attachmentName
<ide> }));
<ide> }
<ide> requestDidLoad(event) {
<ide> }
<ide> let id = 0;
<ide> class DirectUpload {
<del> constructor(file, url, delegate) {
<add>  constructor(file, url, directUploadToken, attachmentName, delegate) {
<ide>    this.id = ++id;
<ide>    this.file = file;
<ide>    this.url = url;
<add>    this.directUploadToken = directUploadToken;
<add>    this.attachmentName = attachmentName;
<ide>    this.delegate = delegate;
<ide>  }
<ide>  create(callback) {
<ide>    callback(error);
<ide>    return;
<ide>   }
<del>   const blob = new BlobRecord(this.file, checksum, this.url);
<add>   const blob = new BlobRecord(this.file, checksum, this.url, this.directUploadToken, this.attachmentName);
<ide> notify(this.delegate, "directUploadWillCreateBlobWithXHR", blob.xhr);
<ide> blob.create((error => {
<ide> if (error) {
<ide> constructor(input, file) {
<ide> this.input = input;
<ide> this.file = file;
<del> this.directUpload = new DirectUpload(this.file, this.url, this);
<add> this.directUpload = new DirectUpload(this.file, this.url, this.directUploadToken, this.attachmentName, this);
<ide> this.dispatch("initialize");
<ide> }
<ide> start(callback) {
<ide> get url() {
<ide> return this.input.getAttribute("data-direct-upload-url");
<ide> }
<add> get directUploadToken() {
<add> return this.input.getAttribute("data-direct-upload-token");
<add> }
<add> get attachmentName() {
<add> return this.input.getAttribute("data-direct-upload-attachment-name");
<add> }
<ide> dispatch(name, detail = {}) {
<ide> detail.file = this.file;
<ide> detail.id = this.directUpload.id;
<ide><path>activestorage/app/controllers/active_storage/direct_uploads_controller.rb
<ide> # When the client-side upload is completed, the signed_blob_id can be submitted as part of the form to reference
<ide> # the blob that was created up front.
<ide> class ActiveStorage::DirectUploadsController < ActiveStorage::BaseController
<add> include ActiveStorage::DirectUploadToken
<add>
<ide> def create
<del> blob = ActiveStorage::Blob.create_before_direct_upload!(**blob_args)
<add> blob = ActiveStorage::Blob.create_before_direct_upload!(**blob_args.merge(service_name: verified_service_name))
<ide> render json: direct_upload_json(blob)
<ide> end
<ide>
<ide> def blob_args
<ide> params.require(:blob).permit(:filename, :byte_size, :checksum, :content_type, metadata: {}).to_h.symbolize_keys
<ide> end
<ide>
<add> def verified_service_name
<add> ActiveStorage::DirectUploadToken.verify_direct_upload_token(params[:direct_upload_token], params[:attachment_name], session)
<add> end
<add>
<ide> def direct_upload_json(blob)
<ide> blob.as_json(root: false, methods: :signed_id).merge(direct_upload: {
<ide> url: blob.service_url_for_direct_upload,
<ide><path>activestorage/app/javascript/activestorage/blob_record.js
<ide> import { getMetaValue } from "./helpers"
<ide>
<ide> export class BlobRecord {
<del> constructor(file, checksum, url) {
<add> constructor(file, checksum, url, directUploadToken, attachmentName) {
<ide> this.file = file
<ide>
<ide> this.attributes = {
<ide> filename: file.name,
<ide> content_type: file.type || "application/octet-stream",
<ide> byte_size: file.size,
<del> checksum: checksum
<add> checksum: checksum,
<ide> }
<ide>
<add> this.directUploadToken = directUploadToken
<add> this.attachmentName = attachmentName
<add>
<ide> this.xhr = new XMLHttpRequest
<ide> this.xhr.open("POST", url, true)
<ide> this.xhr.responseType = "json"
<ide> export class BlobRecord {
<ide>
<ide> create(callback) {
<ide> this.callback = callback
<del> this.xhr.send(JSON.stringify({ blob: this.attributes }))
<add> this.xhr.send(JSON.stringify({
<add> blob: this.attributes,
<add> direct_upload_token: this.directUploadToken,
<add> attachment_name: this.attachmentName
<add> }))
<ide> }
<ide>
<ide> requestDidLoad(event) {
<ide><path>activestorage/app/javascript/activestorage/direct_upload.js
<ide> import { BlobUpload } from "./blob_upload"
<ide> let id = 0
<ide>
<ide> export class DirectUpload {
<del> constructor(file, url, delegate) {
<add> constructor(file, url, serviceName, attachmentName, delegate) {
<ide> this.id = ++id
<ide> this.file = file
<ide> this.url = url
<add> this.serviceName = serviceName
<add> this.attachmentName = attachmentName
<ide> this.delegate = delegate
<ide> }
<ide>
<ide> export class DirectUpload {
<ide> return
<ide> }
<ide>
<del> const blob = new BlobRecord(this.file, checksum, this.url)
<add> const blob = new BlobRecord(this.file, checksum, this.url, this.serviceName, this.attachmentName)
<ide> notify(this.delegate, "directUploadWillCreateBlobWithXHR", blob.xhr)
<ide>
<ide> blob.create(error => {
<ide><path>activestorage/app/javascript/activestorage/direct_upload_controller.js
<ide> export class DirectUploadController {
<ide> constructor(input, file) {
<ide> this.input = input
<ide> this.file = file
<del> this.directUpload = new DirectUpload(this.file, this.url, this)
<add> this.directUpload = new DirectUpload(this.file, this.url, this.directUploadToken, this.attachmentName, this)
<ide> this.dispatch("initialize")
<ide> }
<ide>
<ide> export class DirectUploadController {
<ide> return this.input.getAttribute("data-direct-upload-url")
<ide> }
<ide>
<add> get directUploadToken() {
<add> return this.input.getAttribute("data-direct-upload-token")
<add> }
<add>
<add> get attachmentName() {
<add> return this.input.getAttribute("data-direct-upload-attachment-name")
<add> }
<add>
<ide> dispatch(name, detail = {}) {
<ide> detail.file = this.file
<ide> detail.id = this.directUpload.id
<ide><path>activestorage/lib/active_storage.rb
<ide> module ActiveStorage
<ide> autoload :Service
<ide> autoload :Previewer
<ide> autoload :Analyzer
<add> autoload :DirectUploadToken
<ide>
<ide> mattr_accessor :logger
<ide> mattr_accessor :verifier
<ide><path>activestorage/lib/active_storage/direct_upload_token.rb
<add># frozen_string_literal: true
<add>
<add>module ActiveStorage
<add> module DirectUploadToken
<add> extend self
<add>
<add> SEPARATOR = "."
<add> DIRECT_UPLOAD_TOKEN_LENGTH = 32
<add>
<add> def generate_direct_upload_token(attachment_name, service_name, session)
<add> token = direct_upload_token(session, attachment_name)
<add> encode_direct_upload_token([service_name, token].join(SEPARATOR))
<add> end
<add>
<add> def verify_direct_upload_token(token, attachment_name, session)
<add> raise ActiveStorage::InvalidDirectUploadTokenError if token.nil?
<add>
<add> service_name, *token_components = decode_token(token).split(SEPARATOR)
<add> decoded_token = token_components.join(SEPARATOR)
<add>
<add> return service_name if valid_direct_upload_token?(decoded_token, attachment_name, session)
<add>
<add> raise ActiveStorage::InvalidDirectUploadTokenError
<add> end
<add>
<add> private
<add> def direct_upload_token(session, attachment_name) # :doc:
<add> direct_upload_token_hmac(session, "direct_upload##{attachment_name}")
<add> end
<add>
<add> def valid_direct_upload_token?(token, attachment_name, session) # :doc:
<add> correct_token = direct_upload_token(session, attachment_name)
<add> ActiveSupport::SecurityUtils.fixed_length_secure_compare(token, correct_token)
<add> rescue ArgumentError
<add> raise ActiveStorage::InvalidDirectUploadTokenError
<add> end
<add>
<add> def direct_upload_token_hmac(session, identifier) # :doc:
<add> OpenSSL::HMAC.digest(
<add> OpenSSL::Digest::SHA256.new,
<add> real_direct_upload_token(session),
<add> identifier
<add> )
<add> end
<add>
<add> def real_direct_upload_token(session) # :doc:
<add> session[:_direct_upload_token] ||= SecureRandom.urlsafe_base64(DIRECT_UPLOAD_TOKEN_LENGTH, padding: false)
<add> encode_direct_upload_token(session[:_direct_upload_token])
<add> end
<add>
<add> def decode_token(encoded_token) # :nodoc:
<add> Base64.urlsafe_decode64(encoded_token)
<add> end
<add>
<add> def encode_direct_upload_token(raw_token) # :nodoc:
<add> Base64.urlsafe_encode64(raw_token)
<add> end
<add> end
<add>end
<ide><path>activestorage/lib/active_storage/errors.rb
<ide> class FileNotFoundError < Error; end
<ide>
<ide> # Raised when a Previewer is unable to generate a preview image.
<ide> class PreviewError < Error; end
<add>
<add> # Raised when direct upload fails because of an invalid token
<add> class InvalidDirectUploadTokenError < Error; end
<ide> end
<ide><path>activestorage/test/controllers/direct_uploads_controller_test.rb
<ide>
<ide> require "test_helper"
<ide> require "database/setup"
<add>require "minitest/mock"
<ide>
<ide> if SERVICE_CONFIGURATIONS[:s3] && SERVICE_CONFIGURATIONS[:s3][:access_key_id].present?
<ide> class ActiveStorage::S3DirectUploadsControllerTest < ActionDispatch::IntegrationTest
<ide> class ActiveStorage::S3DirectUploadsControllerTest < ActionDispatch::Integration
<ide> "library_ID": "12345"
<ide> }
<ide>
<del> post rails_direct_uploads_url, params: { blob: {
<del> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "local") do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<ide>
<ide> response.parsed_body.tap do |details|
<ide> assert_equal ActiveStorage::Blob.find(details["id"]), ActiveStorage::Blob.find_signed!(details["signed_id"])
<ide> class ActiveStorage::GCSDirectUploadsControllerTest < ActionDispatch::Integratio
<ide> "library_ID": "12345"
<ide> }
<ide>
<del> post rails_direct_uploads_url, params: { blob: {
<del> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "local") do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<ide>
<ide> @response.parsed_body.tap do |details|
<ide> assert_equal ActiveStorage::Blob.find(details["id"]), ActiveStorage::Blob.find_signed!(details["signed_id"])
<ide> class ActiveStorage::AzureStorageDirectUploadsControllerTest < ActionDispatch::I
<ide> "library_ID": "12345"
<ide> }
<ide>
<del> post rails_direct_uploads_url, params: { blob: {
<del> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "local") do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<ide>
<ide> @response.parsed_body.tap do |details|
<ide> assert_equal ActiveStorage::Blob.find(details["id"]), ActiveStorage::Blob.find_signed!(details["signed_id"])
<ide> class ActiveStorage::DiskDirectUploadsControllerTest < ActionDispatch::Integrati
<ide> "library_ID": "12345"
<ide> }
<ide>
<del> post rails_direct_uploads_url, params: { blob: {
<del> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> with_valid_service_name do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<ide>
<ide> @response.parsed_body.tap do |details|
<ide> assert_equal ActiveStorage::Blob.find(details["id"]), ActiveStorage::Blob.find_signed!(details["signed_id"])
<ide> class ActiveStorage::DiskDirectUploadsControllerTest < ActionDispatch::Integrati
<ide> "library_ID": "12345"
<ide> }
<ide>
<del> set_include_root_in_json(true) do
<del> post rails_direct_uploads_url, params: { blob: {
<del> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> with_valid_service_name do
<add> set_include_root_in_json(true) do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<ide> end
<ide>
<ide> @response.parsed_body.tap do |details|
<ide> class ActiveStorage::DiskDirectUploadsControllerTest < ActionDispatch::Integrati
<ide> end
<ide> end
<ide>
<add> test "handling direct upload with custom service name" do
<add> checksum = OpenSSL::Digest::MD5.base64digest("Hello")
<add> metadata = {
<add> "foo": "bar",
<add> "my_key_1": "my_value_1",
<add> "my_key_2": "my_value_2",
<add> "platform": "my_platform",
<add> "library_ID": "12345"
<add> }
<add>
<add> with_valid_service_name do
<add> post rails_direct_uploads_url, params: { blob: {
<add> filename: "hello.txt", byte_size: 6, checksum: checksum, content_type: "text/plain", metadata: metadata } }
<add> end
<add>
<add> @response.parsed_body.tap do |details|
<add> assert_equal ActiveStorage::Blob.find(details["id"]), ActiveStorage::Blob.find_signed!(details["signed_id"])
<add> assert_equal "hello.txt", details["filename"]
<add> assert_equal 6, details["byte_size"]
<add> assert_equal checksum, details["checksum"]
<add> assert_equal metadata, details["metadata"].transform_keys(&:to_sym)
<add> assert_equal "text/plain", details["content_type"]
<add> assert_match(/rails\/active_storage\/disk/, details["direct_upload"]["url"])
<add> assert_equal({ "Content-Type" => "text/plain" }, details["direct_upload"]["headers"])
<add> end
<add> end
<add>
<ide> private
<ide> def set_include_root_in_json(value)
<ide> original = ActiveRecord::Base.include_root_in_json
<ide> def set_include_root_in_json(value)
<ide> ensure
<ide> ActiveRecord::Base.include_root_in_json = original
<ide> end
<add>
<add> def with_valid_service_name(&block)
<add> ActiveStorage::DirectUploadToken.stub(:verify_direct_upload_token, "local", &block)
<add> end
<ide> end
<ide><path>activestorage/test/direct_upload_token_test.rb
<add># frozen_string_literal: true
<add>
<add>require "test_helper"
<add>
<add>class DirectUploadTokenTest < ActionController::TestCase
<add> setup do
<add> @session = {}
<add> @service_name = "local"
<add> @avatar_attachment_name = "user#avatar"
<add> end
<add>
<add> def test_validates_correct_token
<add> token = ActiveStorage::DirectUploadToken.generate_direct_upload_token(@avatar_attachment_name, @service_name, @session)
<add> verified_service_name = ActiveStorage::DirectUploadToken.verify_direct_upload_token(token, @avatar_attachment_name, @session)
<add>
<add> assert_equal verified_service_name, @service_name
<add> end
<add>
<add> def test_not_validates_nil_token
<add> token = nil
<add>
<add> assert_raises(ActiveStorage::InvalidDirectUploadTokenError) do
<add> ActiveStorage::DirectUploadToken.verify_direct_upload_token(token, @avatar_attachment_name, @session)
<add> end
<add> end
<add>
<add> def test_not_validates_token_when_session_is_empty
<add> token = ActiveStorage::DirectUploadToken.generate_direct_upload_token(@avatar_attachment_name, @service_name, {})
<add>
<add> assert_raises(ActiveStorage::InvalidDirectUploadTokenError) do
<add> ActiveStorage::DirectUploadToken.verify_direct_upload_token(token, @avatar_attachment_name, @session)
<add> end
<add> end
<add>
<add> def test_not_validates_token_from_different_attachment
<add> background_attachment_name = "user#background"
<add> token = ActiveStorage::DirectUploadToken.generate_direct_upload_token(background_attachment_name, @service_name, @session)
<add>
<add> assert_raises(ActiveStorage::InvalidDirectUploadTokenError) do
<add> ActiveStorage::DirectUploadToken.verify_direct_upload_token(token, @avatar_attachment_name, @session)
<add> end
<add> end
<add>
<add> def test_not_validates_token_from_different_session
<add> token = ActiveStorage::DirectUploadToken.generate_direct_upload_token(@avatar_attachment_name, @service_name, @session)
<add>
<add> another_session = {}
<add> ActiveStorage::DirectUploadToken.generate_direct_upload_token(@avatar_attachment_name, @service_name, another_session)
<add>
<add> assert_raises(ActiveStorage::InvalidDirectUploadTokenError) do
<add> ActiveStorage::DirectUploadToken.verify_direct_upload_token(token, @avatar_attachment_name, another_session)
<add> end
<add> end
<add>end
<ide><path>guides/source/active_storage_overview.md
<ide> input.addEventListener('change', (event) => {
<ide>
<ide> const uploadFile = (file) => {
<ide> // your form needs the file_field direct_upload: true, which
<del> // provides data-direct-upload-url
<add> // provides data-direct-upload-url, data-direct-upload-token
<add> // and data-direct-upload-attachment-name
<ide> const url = input.dataset.directUploadUrl
<del> const upload = new DirectUpload(file, url)
<add> const token = input.dataset.directUploadToken
<add> const attachmentName = input.dataset.directUploadAttachmentName
<add> const upload = new DirectUpload(file, url, token, attachmentName)
<ide>
<ide> upload.create((error, blob) => {
<ide> if (error) {
<ide> const uploadFile = (file) => {
<ide> }
<ide> ```
<ide>
<del>If you need to track the progress of the file upload, you can pass a third
<add>If you need to track the progress of the file upload, you can pass a fifth
<ide> parameter to the `DirectUpload` constructor. During the upload, DirectUpload
<ide> will call the object's `directUploadWillStoreFileWithXHR` method. You can then
<ide> bind your own progress handler on the XHR.
<ide> bind your own progress handler on the XHR.
<ide> import { DirectUpload } from "@rails/activestorage"
<ide>
<ide> class Uploader {
<del> constructor(file, url) {
<del> this.upload = new DirectUpload(this.file, this.url, this)
<add> constructor(file, url, token, attachmentName) {
<add> this.upload = new DirectUpload(file, url, token, attachmentName, this)
<ide> }
<ide>
<ide> upload(file) { | 18 |
Javascript | Javascript | fix typo in application.initialize assertion | 48da23b0956d0f0eeef8ddcd212e2e17ba79b3f7 | <ide><path>packages/ember-application/lib/system/application.js
<ide> Ember.Application = Ember.Namespace.extend(
<ide> @param router {Ember.Router}
<ide> */
<ide> initialize: function(router) {
<del> Ember.assert("Application initialize may only be call once", !this.isInitialized);
<add> Ember.assert("Application initialize may only be called once", !this.isInitialized);
<ide> Ember.assert("Application not destroyed", !this.isDestroyed);
<ide>
<ide> router = this.setupRouter(router); | 1 |
Javascript | Javascript | change locale comments | 4b3fc58ddc777ec7d65fd3a85bd0ae64632e3dfe | <ide><path>src/locale/af.js
<ide> //! moment.js locale configuration
<del>//! locale : afrikaans (af)
<add>//! locale : Afrikaans [af]
<ide> //! author : Werner Mollentze : https://github.com/wernerm
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ar-ma.js
<ide> //! moment.js locale configuration
<del>//! locale : Moroccan Arabic (ar-ma)
<add>//! locale : Arabic (Morocco) [ar-ma]
<ide> //! author : ElFadili Yassine : https://github.com/ElFadiliY
<ide> //! author : Abdel Said : https://github.com/abdelsaid
<ide>
<ide><path>src/locale/ar-sa.js
<ide> //! moment.js locale configuration
<del>//! locale : Arabic Saudi Arabia (ar-sa)
<add>//! locale : Arabic (Saudi Arabia) [ar-sa]
<ide> //! author : Suhail Alkowaileet : https://github.com/xsoh
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ar-tn.js
<ide> //! moment.js locale configuration
<del>//! locale : Tunisian Arabic (ar-tn)
<add>//! locale : Arabic (Tunisia) [ar-tn]
<ide>
<ide> import moment from '../moment';
<ide>
<ide><path>src/locale/ar.js
<ide> //! moment.js locale configuration
<del>//! Locale: Arabic (ar)
<add>//! Locale: Arabic [ar]
<ide> //! Author: Abdel Said: https://github.com/abdelsaid
<ide> //! Changes in months, weekdays: Ahmed Elkhatib
<ide> //! Native plural forms: forabi https://github.com/forabi
<ide><path>src/locale/az.js
<ide> //! moment.js locale configuration
<del>//! locale : azerbaijani (az)
<add>//! locale : Azerbaijani [az]
<ide> //! author : topchiyev : https://github.com/topchiyev
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/be.js
<ide> //! moment.js locale configuration
<del>//! locale : belarusian (be)
<add>//! locale : Belarusian [be]
<ide> //! author : Dmitry Demidov : https://github.com/demidov91
<ide> //! author: Praleska: http://praleska.pro/
<ide> //! Author : Menelion Elensúle : https://github.com/Oire
<ide><path>src/locale/bg.js
<ide> //! moment.js locale configuration
<del>//! locale : bulgarian (bg)
<add>//! locale : Bulgarian [bg]
<ide> //! author : Krasen Borisov : https://github.com/kraz
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/bn.js
<ide> //! moment.js locale configuration
<del>//! locale : Bengali (bn)
<add>//! locale : Bengali [bn]
<ide> //! author : Kaushik Gandhi : https://github.com/kaushikgandhi
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/bo.js
<ide> //! moment.js locale configuration
<del>//! locale : tibetan (bo)
<add>//! locale : Tibetan [bo]
<ide> //! author : Thupten N. Chakrishar : https://github.com/vajradog
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/br.js
<ide> //! moment.js locale configuration
<del>//! locale : breton (br)
<add>//! locale : Breton [br]
<ide> //! author : Jean-Baptiste Le Duigou : https://github.com/jbleduigou
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/bs.js
<ide> //! moment.js locale configuration
<del>//! locale : bosnian (bs)
<add>//! locale : Bosnian [bs]
<ide> //! author : Nedim Cholich : https://github.com/frontyard
<ide> //! based on (hr) translation by Bojan Marković
<ide>
<ide><path>src/locale/ca.js
<ide> //! moment.js locale configuration
<del>//! locale : catalan (ca)
<add>//! locale : Catalan [ca]
<ide> //! author : Juan G. Hurtado : https://github.com/juanghurtado
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/cs.js
<ide> //! moment.js locale configuration
<del>//! locale : czech (cs)
<add>//! locale : Czech [cs]
<ide> //! author : petrbela : https://github.com/petrbela
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/cv.js
<ide> //! moment.js locale configuration
<del>//! locale : chuvash (cv)
<add>//! locale : Chuvash [cv]
<ide> //! author : Anatoly Mironov : https://github.com/mirontoli
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/cy.js
<ide> //! moment.js locale configuration
<del>//! locale : Welsh (cy)
<add>//! locale : Welsh [cy]
<ide> //! author : Robert Allen
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/da.js
<ide> //! moment.js locale configuration
<del>//! locale : danish (da)
<add>//! locale : Danish [da]
<ide> //! author : Ulrik Nielsen : https://github.com/mrbase
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/de-at.js
<ide> //! moment.js locale configuration
<del>//! locale : austrian german (de-at)
<add>//! locale : German (Austria) [de-at]
<ide> //! author : lluchs : https://github.com/lluchs
<ide> //! author: Menelion Elensúle: https://github.com/Oire
<ide> //! author : Martin Groller : https://github.com/MadMG
<ide><path>src/locale/de.js
<ide> //! moment.js locale configuration
<del>//! locale : german (de)
<add>//! locale : German [de]
<ide> //! author : lluchs : https://github.com/lluchs
<ide> //! author: Menelion Elensúle: https://github.com/Oire
<ide> //! author : Mikolaj Dadela : https://github.com/mik01aj
<ide><path>src/locale/dv.js
<ide> //! moment.js locale configuration
<del>//! locale : dhivehi (dv)
<add>//! locale : Maldivian [dv]
<ide> //! author : Jawish Hameed : https://github.com/jawish
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/el.js
<ide> //! moment.js locale configuration
<del>//! locale : modern greek (el)
<add>//! locale : Greek [el]
<ide> //! author : Aggelos Karalias : https://github.com/mehiel
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/en-au.js
<ide> //! moment.js locale configuration
<del>//! locale : australian english (en-au)
<add>//! locale : English (Australia) [en-au]
<ide>
<ide> import moment from '../moment';
<ide>
<ide><path>src/locale/en-ca.js
<ide> //! moment.js locale configuration
<del>//! locale : canadian english (en-ca)
<add>//! locale : English (Canada) [en-ca]
<ide> //! author : Jonathan Abourbih : https://github.com/jonbca
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/en-gb.js
<ide> //! moment.js locale configuration
<del>//! locale : great britain english (en-gb)
<add>//! locale : English (United Kingdom) [en-gb]
<ide> //! author : Chris Gedrim : https://github.com/chrisgedrim
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/en-ie.js
<ide> //! moment.js locale configuration
<del>//! locale : Irish english (en-ie)
<add>//! locale : English (Ireland) [en-ie]
<ide> //! author : Chris Cartlidge : https://github.com/chriscartlidge
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/en-nz.js
<ide> //! moment.js locale configuration
<del>//! locale : New Zealand english (en-nz)
<add>//! locale : English (New Zealand) [en-nz]
<ide>
<ide> import moment from '../moment';
<ide>
<ide><path>src/locale/eo.js
<ide> //! moment.js locale configuration
<del>//! locale : esperanto (eo)
<add>//! locale : Esperanto [eo]
<ide> //! author : Colin Dean : https://github.com/colindean
<ide> //! komento: Mi estas malcerta se mi korekte traktis akuzativojn en tiu traduko.
<ide> //! Se ne, bonvolu korekti kaj avizi min por ke mi povas lerni!
<ide><path>src/locale/es-do.js
<ide> //! moment.js locale configuration
<del>//! locale : dominican republic spanish (es-do)
<add>//! locale : Spanish (Dominican Republic) [es-do]
<ide>
<ide> import moment from '../moment';
<ide>
<ide><path>src/locale/es.js
<ide> //! moment.js locale configuration
<del>//! locale : spanish (es)
<add>//! locale : Spanish [es]
<ide> //! author : Julio Napurí : https://github.com/julionc
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/et.js
<ide> //! moment.js locale configuration
<del>//! locale : estonian (et)
<add>//! locale : Estonian [et]
<ide> //! author : Henry Kehlmann : https://github.com/madhenry
<ide> //! improvements : Illimar Tambek : https://github.com/ragulka
<ide>
<ide><path>src/locale/eu.js
<ide> //! moment.js locale configuration
<del>//! locale : euskara (eu)
<add>//! locale : Basque [eu]
<ide> //! author : Eneko Illarramendi : https://github.com/eillarra
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fa.js
<ide> //! moment.js locale configuration
<del>//! locale : Persian (fa)
<add>//! locale : Persian [fa]
<ide> //! author : Ebrahim Byagowi : https://github.com/ebraminio
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fi.js
<ide> //! moment.js locale configuration
<del>//! locale : finnish (fi)
<add>//! locale : Finnish [fi]
<ide> //! author : Tarmo Aidantausta : https://github.com/bleadof
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fo.js
<ide> //! moment.js locale configuration
<del>//! locale : faroese (fo)
<add>//! locale : Faroese [fo]
<ide> //! author : Ragnar Johannesen : https://github.com/ragnar123
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fr-ca.js
<ide> //! moment.js locale configuration
<del>//! locale : canadian french (fr-ca)
<add>//! locale : French (Canada) [fr-ca]
<ide> //! author : Jonathan Abourbih : https://github.com/jonbca
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fr-ch.js
<ide> //! moment.js locale configuration
<del>//! locale : swiss french (fr)
<add>//! locale : French (Switzerland) [fr-ch]
<ide> //! author : Gaspard Bucher : https://github.com/gaspard
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fr.js
<ide> //! moment.js locale configuration
<del>//! locale : french (fr)
<add>//! locale : French [fr]
<ide> //! author : John Fischer : https://github.com/jfroffice
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/fy.js
<ide> //! moment.js locale configuration
<del>//! locale : frisian (fy)
<add>//! locale : Frisian [fy]
<ide> //! author : Robin van der Vliet : https://github.com/robin0van0der0v
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/gd.js
<ide> //! moment.js locale configuration
<del>//! locale : great britain scottish gealic (gd)
<add>//! locale : Scottish Gaelic [gd]
<ide> //! author : Jon Ashdown : https://github.com/jonashdown
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/gl.js
<ide> //! moment.js locale configuration
<del>//! locale : galician (gl)
<add>//! locale : Galician [gl]
<ide> //! author : Juan G. Hurtado : https://github.com/juanghurtado
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/he.js
<ide> //! moment.js locale configuration
<del>//! locale : Hebrew (he)
<add>//! locale : Hebrew [he]
<ide> //! author : Tomer Cohen : https://github.com/tomer
<ide> //! author : Moshe Simantov : https://github.com/DevelopmentIL
<ide> //! author : Tal Ater : https://github.com/TalAter
<ide><path>src/locale/hi.js
<ide> //! moment.js locale configuration
<del>//! locale : hindi (hi)
<add>//! locale : Hindi [hi]
<ide> //! author : Mayank Singhal : https://github.com/mayanksinghal
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/hr.js
<ide> //! moment.js locale configuration
<del>//! locale : hrvatski (hr)
<add>//! locale : Croatian [hr]
<ide> //! author : Bojan Marković : https://github.com/bmarkovic
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/hu.js
<ide> //! moment.js locale configuration
<del>//! locale : hungarian (hu)
<add>//! locale : Hungarian [hu]
<ide> //! author : Adam Brunner : https://github.com/adambrunner
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/hy-am.js
<ide> //! moment.js locale configuration
<del>//! locale : Armenian (hy-am)
<add>//! locale : Armenian [hy-am]
<ide> //! author : Armendarabyan : https://github.com/armendarabyan
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/id.js
<ide> //! moment.js locale configuration
<del>//! locale : Bahasa Indonesia (id)
<add>//! locale : Indonesian [id]
<ide> //! author : Mohammad Satrio Utomo : https://github.com/tyok
<ide> //! reference: http://id.wikisource.org/wiki/Pedoman_Umum_Ejaan_Bahasa_Indonesia_yang_Disempurnakan
<ide>
<ide><path>src/locale/is.js
<ide> //! moment.js locale configuration
<del>//! locale : icelandic (is)
<add>//! locale : Icelandic [is]
<ide> //! author : Hinrik Örn Sigurðsson : https://github.com/hinrik
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/it.js
<ide> //! moment.js locale configuration
<del>//! locale : italian (it)
<add>//! locale : Italian [it]
<ide> //! author : Lorenzo : https://github.com/aliem
<ide> //! author: Mattia Larentis: https://github.com/nostalgiaz
<ide>
<ide><path>src/locale/ja.js
<ide> //! moment.js locale configuration
<del>//! locale : japanese (ja)
<add>//! locale : Japanese [ja]
<ide> //! author : LI Long : https://github.com/baryon
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/jv.js
<ide> //! moment.js locale configuration
<del>//! locale : Boso Jowo (jv)
<add>//! locale : Javanese [jv]
<ide> //! author : Rony Lantip : https://github.com/lantip
<ide> //! reference: http://jv.wikipedia.org/wiki/Basa_Jawa
<ide>
<ide><path>src/locale/ka.js
<ide> //! moment.js locale configuration
<del>//! locale : Georgian (ka)
<add>//! locale : Georgian [ka]
<ide> //! author : Irakli Janiashvili : https://github.com/irakli-janiashvili
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/kk.js
<ide> //! moment.js locale configuration
<del>//! locale : kazakh (kk)
<add>//! locale : Kazakh [kk]
<ide> //! authors : Nurlan Rakhimzhanov : https://github.com/nurlan
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/km.js
<ide> //! moment.js locale configuration
<del>//! locale : khmer (km)
<add>//! locale : Cambodian [km]
<ide> //! author : Kruy Vanna : https://github.com/kruyvanna
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ko.js
<ide> //! moment.js locale configuration
<del>//! locale : korean (ko)
<add>//! locale : Korean [ko]
<ide> //!
<ide> //! authors
<ide> //!
<ide><path>src/locale/ky.js
<ide> //! moment.js locale configuration
<del>//! locale : kyrgyz (ky)
<add>//! locale : Kyrgyz [ky]
<ide> //! author : Chyngyz Arystan uulu : https://github.com/chyngyz
<ide>
<ide>
<ide><path>src/locale/lb.js
<ide> //! moment.js locale configuration
<del>//! locale : Luxembourgish (lb)
<add>//! locale : Luxembourgish [lb]
<ide> //! author : mweimerskirch : https://github.com/mweimerskirch, David Raison : https://github.com/kwisatz
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/lo.js
<ide> //! moment.js locale configuration
<del>//! locale : lao (lo)
<add>//! locale : Lao [lo]
<ide> //! author : Ryan Hart : https://github.com/ryanhart2
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/lt.js
<ide> //! moment.js locale configuration
<del>//! locale : Lithuanian (lt)
<add>//! locale : Lithuanian [lt]
<ide> //! author : Mindaugas Mozūras : https://github.com/mmozuras
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/lv.js
<ide> //! moment.js locale configuration
<del>//! locale : latvian (lv)
<add>//! locale : Latvian [lv]
<ide> //! author : Kristaps Karlsons : https://github.com/skakri
<ide> //! author : Jānis Elmeris : https://github.com/JanisE
<ide>
<ide><path>src/locale/me.js
<ide> //! moment.js locale configuration
<del>//! locale : Montenegrin (me)
<add>//! locale : Montenegrin [me]
<ide> //! author : Miodrag Nikač <miodrag@restartit.me> : https://github.com/miodragnikac
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/mk.js
<ide> //! moment.js locale configuration
<del>//! locale : macedonian (mk)
<add>//! locale : Macedonian [mk]
<ide> //! author : Borislav Mickov : https://github.com/B0k0
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ml.js
<ide> //! moment.js locale configuration
<del>//! locale : malayalam (ml)
<add>//! locale : Malayalam [ml]
<ide> //! author : Floyd Pink : https://github.com/floydpink
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/mr.js
<ide> //! moment.js locale configuration
<del>//! locale : Marathi (mr)
<add>//! locale : Marathi [mr]
<ide> //! author : Harshad Kale : https://github.com/kalehv
<ide> //! author : Vivek Athalye : https://github.com/vnathalye
<ide>
<ide><path>src/locale/ms-my.js
<ide> //! moment.js locale configuration
<del>//! locale : Bahasa Malaysia (ms-MY)
<add>//! locale : Malay [ms-my]
<add>//! note : DEPRECATED, the correct one is [ms]
<ide> //! author : Weldan Jamili : https://github.com/weldan
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ms.js
<ide> //! moment.js locale configuration
<del>//! locale : Bahasa Malaysia (ms-MY)
<add>//! locale : Malay [ms]
<ide> //! author : Weldan Jamili : https://github.com/weldan
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/my.js
<ide> //! moment.js locale configuration
<del>//! locale : Burmese (my)
<add>//! locale : Burmese [my]
<ide> //! author : Squar team, mysquar.com
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/nb.js
<ide> //! moment.js locale configuration
<del>//! locale : norwegian bokmål (nb)
<add>//! locale : Norwegian Bokmål [nb]
<ide> //! authors : Espen Hovlandsdal : https://github.com/rexxars
<ide> //! Sigurd Gartmann : https://github.com/sigurdga
<ide>
<ide><path>src/locale/ne.js
<ide> //! moment.js locale configuration
<del>//! locale : nepali/nepalese
<add>//! locale : Nepalese [ne]
<ide> //! author : suvash : https://github.com/suvash
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/nl.js
<ide> //! moment.js locale configuration
<del>//! locale : dutch (nl)
<add>//! locale : Dutch [nl]
<ide> //! author : Joris Röling : https://github.com/jjupiter
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/nn.js
<ide> //! moment.js locale configuration
<del>//! locale : norwegian nynorsk (nn)
<add>//! locale : Nynorsk [nn]
<ide> //! author : https://github.com/mechuwind
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/pa-in.js
<ide> //! moment.js locale configuration
<del>//! locale : punjabi india (pa-in)
<add>//! locale : Punjabi (India) [pa-in]
<ide> //! author : Harpreet Singh : https://github.com/harpreetkhalsagtbit
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/pl.js
<ide> //! moment.js locale configuration
<del>//! locale : polish (pl)
<add>//! locale : Polish [pl]
<ide> //! author : Rafal Hirsz : https://github.com/evoL
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/pt-br.js
<ide> //! moment.js locale configuration
<del>//! locale : brazilian portuguese (pt-br)
<add>//! locale : Portuguese (Brazil) [pt-br]
<ide> //! author : Caio Ribeiro Pereira : https://github.com/caio-ribeiro-pereira
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/pt.js
<ide> //! moment.js locale configuration
<del>//! locale : portuguese (pt)
<add>//! locale : Portuguese [pt]
<ide> //! author : Jefferson : https://github.com/jalex79
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ro.js
<ide> //! moment.js locale configuration
<del>//! locale : romanian (ro)
<add>//! locale : Romanian [ro]
<ide> //! author : Vlad Gurdiga : https://github.com/gurdiga
<ide> //! author : Valentin Agachi : https://github.com/avaly
<ide>
<ide><path>src/locale/ru.js
<ide> //! moment.js locale configuration
<del>//! locale : russian (ru)
<add>//! locale : Russian [ru]
<ide> //! author : Viktorminator : https://github.com/Viktorminator
<ide> //! Author : Menelion Elensúle : https://github.com/Oire
<ide> //! author : Коренберг Марк : https://github.com/socketpair
<ide><path>src/locale/se.js
<ide> //! moment.js locale configuration
<del>//! locale : Northern Sami (se)
<add>//! locale : Northern Sami [se]
<ide> //! authors : Bård Rolstad Henriksen : https://github.com/karamell
<ide>
<ide>
<ide><path>src/locale/si.js
<ide> //! moment.js locale configuration
<del>//! locale : Sinhalese (si)
<add>//! locale : Sinhalese [si]
<ide> //! author : Sampath Sitinamaluwa : https://github.com/sampathsris
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/sk.js
<ide> //! moment.js locale configuration
<del>//! locale : slovak (sk)
<add>//! locale : Slovak [sk]
<ide> //! author : Martin Minka : https://github.com/k2s
<ide> //! based on work of petrbela : https://github.com/petrbela
<ide>
<ide><path>src/locale/sl.js
<ide> //! moment.js locale configuration
<del>//! locale : slovenian (sl)
<add>//! locale : Slovenian [sl]
<ide> //! author : Robert Sedovšek : https://github.com/sedovsek
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/sq.js
<ide> //! moment.js locale configuration
<del>//! locale : Albanian (sq)
<add>//! locale : Albanian [sq]
<ide> //! author : Flakërim Ismani : https://github.com/flakerimi
<ide> //! author: Menelion Elensúle: https://github.com/Oire (tests)
<ide> //! author : Oerd Cukalla : https://github.com/oerd (fixes)
<ide><path>src/locale/sr-cyrl.js
<ide> //! moment.js locale configuration
<del>//! locale : Serbian-cyrillic (sr-cyrl)
<add>//! locale : Serbian Cyrillic [sr-cyrl]
<ide> //! author : Milan Janačković<milanjanackovic@gmail.com> : https://github.com/milan-j
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/sr.js
<ide> //! moment.js locale configuration
<del>//! locale : Serbian-latin (sr)
<add>//! locale : Serbian [sr]
<ide> //! author : Milan Janačković<milanjanackovic@gmail.com> : https://github.com/milan-j
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ss.js
<ide> //! moment.js locale configuration
<del>//! locale : siSwati (ss)
<add>//! locale : Swazi [ss]
<ide> //! author : Nicolai Davies<mail@nicolai.io> : https://github.com/nicolaidavies
<ide>
<ide>
<ide><path>src/locale/sv.js
<ide> //! moment.js locale configuration
<del>//! locale : swedish (sv)
<add>//! locale : Swedish [sv]
<ide> //! author : Jens Alm : https://github.com/ulmus
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/sw.js
<ide> //! moment.js locale configuration
<del>//! locale : swahili (sw)
<add>//! locale : Swahili [sw]
<ide> //! author : Fahad Kassim : https://github.com/fadsel
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/ta.js
<ide> //! moment.js locale configuration
<del>//! locale : tamil (ta)
<add>//! locale : Tamil [ta]
<ide> //! author : Arjunkumar Krishnamoorthy : https://github.com/tk120404
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/te.js
<ide> //! moment.js locale configuration
<del>//! locale : telugu (te)
<add>//! locale : Telugu [te]
<ide> //! author : Krishna Chaitanya Thota : https://github.com/kcthota
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/th.js
<ide> //! moment.js locale configuration
<del>//! locale : thai (th)
<add>//! locale : Thai [th]
<ide> //! author : Kridsada Thanabulpong : https://github.com/sirn
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/tl-ph.js
<ide> //! moment.js locale configuration
<del>//! locale : Tagalog/Filipino (tl-ph)
<add>//! locale : Tagalog (Philippines) [tl-ph]
<ide> //! author : Dan Hagman
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/tr.js
<ide> //! moment.js locale configuration
<del>//! locale : turkish (tr)
<add>//! locale : Turkish [tr]
<ide> //! authors : Erhan Gundogan : https://github.com/erhangundogan,
<ide> //! Burak Yiğit Kaya: https://github.com/BYK
<ide>
<ide><path>src/locale/tzl.js
<ide> //! moment.js locale configuration
<del>//! locale : talossan (tzl)
<add>//! locale : Talossan [tzl]
<ide> //! author : Robin van der Vliet : https://github.com/robin0van0der0v with the help of Iustì Canun
<ide>
<ide>
<ide><path>src/locale/tzm-latn.js
<ide> //! moment.js locale configuration
<del>//! locale : Morocco Central Atlas Tamaziɣt in Latin (tzm-latn)
<add>//! locale : Central Atlas Tamazight Latin [tzm-latn]
<ide> //! author : Abdel Said : https://github.com/abdelsaid
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/tzm.js
<ide> //! moment.js locale configuration
<del>//! locale : Morocco Central Atlas Tamaziɣt (tzm)
<add>//! locale : Central Atlas Tamazight [tzm]
<ide> //! author : Abdel Said : https://github.com/abdelsaid
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/uk.js
<ide> //! moment.js locale configuration
<del>//! locale : ukrainian (uk)
<add>//! locale : Ukrainian [uk]
<ide> //! author : zemlanin : https://github.com/zemlanin
<ide> //! Author : Menelion Elensúle : https://github.com/Oire
<ide>
<ide><path>src/locale/uz.js
<ide> //! moment.js locale configuration
<del>//! locale : uzbek (uz)
<add>//! locale : Uzbek [uz]
<ide> //! author : Sardor Muminov : https://github.com/muminoff
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/vi.js
<ide> //! moment.js locale configuration
<del>//! locale : vietnamese (vi)
<add>//! locale : Vietnamese [vi]
<ide> //! author : Bang Nguyen : https://github.com/bangnk
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/x-pseudo.js
<ide> //! moment.js locale configuration
<del>//! locale : pseudo (x-pseudo)
<add>//! locale : Pseudo [x-pseudo]
<ide> //! author : Andrew Hood : https://github.com/andrewhood125
<ide>
<ide> import moment from '../moment';
<ide><path>src/locale/zh-cn.js
<ide> //! moment.js locale configuration
<del>//! locale : chinese (zh-cn)
<add>//! locale : Chinese (China) [zh-cn]
<ide> //! author : suupic : https://github.com/suupic
<ide> //! author : Zeno Zeng : https://github.com/zenozeng
<ide>
<ide><path>src/locale/zh-tw.js
<ide> //! moment.js locale configuration
<del>//! locale : traditional chinese (zh-tw)
<add>//! locale : Chinese (Taiwan) [zh-tw]
<ide> //! author : Ben : https://github.com/ben-lin
<ide> //! author : Chris Lam : https://github.com/hehachris
<ide> | 100 |
Javascript | Javascript | use correct handle | d040f1d19d17e056223ab3de0e58ed470c8b9f52
<ide> exports.Client = Client;
<ide>
<ide> Client.prototype._addHandle = function(desc) {
<ide> if (typeof desc != 'object' || !desc.handle) throw new Error("bad type");
<del> this.handles[desc.id] = desc;
<add> this.handles[desc.handle] = desc;
<ide>
<ide> if (desc.type == 'script') {
<ide> this._addScript(desc);
<ide> Client.prototype.reqVersion = function(cb) {
<ide>
<ide>
<ide> Client.prototype.reqEval = function(expression, cb) {
<add> var self = this;
<ide> var req = {
<ide> command: 'evaluate',
<ide> arguments: { expression: expression }
<ide> };
<ide>
<add>
<ide> if (this.currentFrame == NO_FRAME) {
<ide> req.arguments.global = true;
<ide> } else {
<ide> req.arguments.frame = this.currentFrame;
<ide> }
<ide>
<ide> this.req(req, function (res) {
<add> console.error('reqEval res ', res.body);
<add> self._addHandle(res.body);
<ide> if (cb) cb(res.body);
<ide> });
<ide> }; | 1 |
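The one-line fix above changes the map key from `desc.id` to `desc.handle`, the field the V8 debug protocol responses actually carry. A minimal standalone sketch of the corrected bookkeeping (simplified `addHandle` helper, not the real `Client` class):

```javascript
// Descriptors from the debug protocol are keyed by their numeric `handle`.
const handles = {};

function addHandle(desc) {
  if (typeof desc !== 'object' || !desc.handle) throw new Error('bad type');
  // Keying by desc.id (the old code) stored everything under `undefined`,
  // because protocol responses carry `handle`, not `id`.
  handles[desc.handle] = desc;
}

addHandle({handle: 7, type: 'script', name: 'app.js'});
```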
Javascript | Javascript | remove unnecessary else statement | a6a15fefb6aea781d3e08f44633cd137ca2c844f | <ide><path>lib/internal/fs/promises.js
<ide> async function readFileHandle(filehandle, options) {
<ide> } while (!endOfFile);
<ide>
<ide> const result = Buffer.concat(chunks);
<del> if (options.encoding) {
<del> return result.toString(options.encoding);
<del> } else {
<del> return result;
<del> }
<add>
<add> return options.encoding ? result.toString(options.encoding) : result;
<ide> }
<ide>
<ide> // All of the functions are defined as async in order to ensure that errors | 1 |
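The simplified return path can be exercised in isolation. A hedged sketch of just the final step (hypothetical `finish` helper, not the actual `fs.promises` internals):

```javascript
// Concatenate the chunks read from the file handle, then honor options.encoding:
// a string comes back when an encoding is given, a Buffer otherwise.
function finish(chunks, options = {}) {
  const result = Buffer.concat(chunks);
  return options.encoding ? result.toString(options.encoding) : result;
}
```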
Javascript | Javascript | convert safeareaview to es module | d9a86aa651c7a8baa7082f91a6c02b94e5e76420 | <ide><path>Libraries/Components/SafeAreaView/SafeAreaView.js
<ide> if (Platform.OS === 'android') {
<ide> );
<ide> }
<ide>
<del>module.exports = exported;
<add>export default exported;
<ide><path>Libraries/Components/SafeAreaView/__tests__/SafeAreaView-test.js
<ide> 'use strict';
<ide>
<ide> const React = require('react');
<del>const SafeAreaView = require('../SafeAreaView');
<add>import SafeAreaView from '../SafeAreaView';
<ide> const ReactNativeTestTools = require('../../../Utilities/ReactNativeTestTools');
<ide> const View = require('../../View/View');
<ide> const Text = require('../../../Text/Text');
<ide><path>index.js
<ide> module.exports = {
<ide> return require('./Libraries/Components/RefreshControl/RefreshControl');
<ide> },
<ide> get SafeAreaView(): SafeAreaView {
<del> return require('./Libraries/Components/SafeAreaView/SafeAreaView');
<add> return require('./Libraries/Components/SafeAreaView/SafeAreaView').default;
<ide> },
<ide> get ScrollView(): ScrollView {
<ide> return require('./Libraries/Components/ScrollView/ScrollView'); | 3 |
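The extra `.default` at the `require()` call in `index.js` follows from how `export default` is represented in CommonJS output. A rough interop sketch (hand-built namespace object, assuming Babel-style compilation):

```javascript
// What `export default exported` roughly compiles to:
const compiled = {__esModule: true, default: 'SafeAreaViewComponent'};

// Before the patch, `module.exports = exported` let require() hand back the
// component directly; now require() yields the namespace object, so the
// consumer reads the `default` property.
const SafeAreaView = compiled.default;
```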
Javascript | Javascript | fix graphql issue with lambdas flag | 08e3396574cd39ea94aab10ac9b07272cd7b669a | <ide><path>packages/next/build/webpack.js
<ide> export default async function getBaseWebpackConfig (dir, {dev = false, isServer
<ide>
<ide> const resolveConfig = {
<ide> // Disable .mjs for node_modules bundling
<del> extensions: ['.wasm', !lambdas && '.mjs', '.js', '.jsx', '.json'].filter(Boolean),
<add> extensions: ['.wasm', '.mjs', '.js', '.jsx', '.json'].filter(Boolean),
<ide> modules: [
<ide> NEXT_PROJECT_ROOT_NODE_MODULES,
<ide> 'node_modules', | 1 |
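The root cause of the GraphQL breakage: with `!lambdas && '.mjs'`, the entry evaluates to `false` whenever the lambdas flag is on, and `.filter(Boolean)` then silently drops `.mjs`, so `.mjs` packages (common in the GraphQL ecosystem) stop resolving. A small sketch contrasting both forms (hypothetical `extensionsFor` helper):

```javascript
function extensionsFor(lambdas) {
  // Old form: '.mjs' collapses to `false` and is filtered out when lambdas is true.
  const before = ['.wasm', !lambdas && '.mjs', '.js', '.jsx', '.json'].filter(Boolean);
  // Fixed form: '.mjs' always survives the filter.
  const after = ['.wasm', '.mjs', '.js', '.jsx', '.json'].filter(Boolean);
  return {before, after};
}
```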
Go | Go | return correct status on invalid filters | 2d45b5ddbc203e424d07682799874ed2e79b85c5 | <ide><path>api/server/router/container/container_routes.go
<ide> func (s *containerRouter) postContainersPrune(ctx context.Context, w http.Respon
<ide>
<ide> pruneFilters, err := filters.FromJSON(r.Form.Get("filters"))
<ide> if err != nil {
<del> return errdefs.InvalidParameter(err)
<add> return err
<ide> }
<ide>
<ide> pruneReport, err := s.backend.ContainersPrune(ctx, pruneFilters)
<ide><path>api/server/router/network/network_routes.go
<ide> func (n *networkRouter) getNetworksList(ctx context.Context, w http.ResponseWrit
<ide> }
<ide>
<ide> if err := network.ValidateFilters(filter); err != nil {
<del> return errdefs.InvalidParameter(err)
<add> return err
<ide> }
<ide>
<ide> var list []types.NetworkResource
<ide><path>api/server/router/swarm/cluster_routes.go
<ide> func (sr *swarmRouter) getServices(ctx context.Context, w http.ResponseWriter, r
<ide> }
<ide> filter, err := filters.FromJSON(r.Form.Get("filters"))
<ide> if err != nil {
<del> return errdefs.InvalidParameter(err)
<add> return err
<ide> }
<ide>
<ide> // the status query parameter is only support in API versions >= 1.41. If
<ide><path>api/server/router/volume/volume_routes.go
<ide> func (v *volumeRouter) getVolumesList(ctx context.Context, w http.ResponseWriter
<ide>
<ide> filters, err := filters.FromJSON(r.Form.Get("filters"))
<ide> if err != nil {
<del> return errdefs.InvalidParameter(errors.Wrap(err, "error reading volume filters"))
<add> return errors.Wrap(err, "error reading volume filters")
<ide> }
<ide> volumes, warnings, err := v.backend.List(ctx, filters)
<ide> if err != nil {
<ide><path>api/types/filters/parse.go
<ide> import (
<ide> "strings"
<ide>
<ide> "github.com/docker/docker/api/types/versions"
<add> "github.com/pkg/errors"
<ide> )
<ide>
<ide> // Args stores a mapping of keys to a set of multiple values.
<ide> func FromJSON(p string) (Args, error) {
<ide> // Fallback to parsing arguments in the legacy slice format
<ide> deprecated := map[string][]string{}
<ide> if legacyErr := json.Unmarshal(raw, &deprecated); legacyErr != nil {
<del> return args, err
<add> return args, invalidFilter{errors.Wrap(err, "invalid filter")}
<ide> }
<ide>
<ide> args.fields = deprecatedArgs(deprecated)
<ide> func (args Args) Contains(field string) bool {
<ide> return ok
<ide> }
<ide>
<del>type invalidFilter string
<add>type invalidFilter struct{ error }
<ide>
<ide> func (e invalidFilter) Error() string {
<del> return "Invalid filter '" + string(e) + "'"
<add> return e.error.Error()
<ide> }
<ide>
<ide> func (invalidFilter) InvalidParameter() {}
<ide> func (invalidFilter) InvalidParameter() {}
<ide> func (args Args) Validate(accepted map[string]bool) error {
<ide> for name := range args.fields {
<ide> if !accepted[name] {
<del> return invalidFilter(name)
<add> return invalidFilter{errors.New("invalid filter '" + name + "'")}
<ide> }
<ide> }
<ide> return nil
<ide><path>api/types/filters/parse_test.go
<ide> func TestFromJSON(t *testing.T) {
<ide> }
<ide>
<ide> for _, invalid := range invalids {
<del> if _, err := FromJSON(invalid); err == nil {
<add> _, err := FromJSON(invalid)
<add> if err == nil {
<ide> t.Fatalf("Expected an error with %v, got nothing", invalid)
<ide> }
<add> var invalidFilterError invalidFilter
<add> if !errors.As(err, &invalidFilterError) {
<add> t.Fatalf("Expected an invalidFilter error, got %T", err)
<add> }
<ide> }
<ide>
<ide> for expectedArgs, matchers := range valid {
<ide> func TestValidate(t *testing.T) {
<ide> }
<ide>
<ide> f.Add("bogus", "running")
<del> if err := f.Validate(valid); err == nil {
<add> err := f.Validate(valid)
<add> if err == nil {
<ide> t.Fatal("Expected to return an error, got nil")
<ide> }
<add> var invalidFilterError invalidFilter
<add> if !errors.As(err, &invalidFilterError) {
<add> t.Fatalf("Expected an invalidFilter error, got %T", err)
<add> }
<ide> }
<ide>
<ide> func TestWalkValues(t *testing.T) { | 6 |
PHP | PHP | fix missing exception code in cli | e49d313ef7cc27a5e64eab5e5e78ff159cabfada | <ide><path>src/Error/ExceptionTrap.php
<ide> public function handleException(Throwable $exception): void
<ide> } catch (Throwable $exception) {
<ide> $this->logInternalError($exception);
<ide> }
<add> // Use this constant as a proxy for cakephp tests.
<add> if (PHP_SAPI == 'cli' && !env('FIXTURE_SCHEMA_METADATA')) {
<add> exit(1);
<add> }
<ide> }
<ide>
<ide> /** | 1 |
PHP | PHP | add integration test for cookies + psr7 | cadeb649ca5762235b202be06b07a838b7b93615 | <ide><path>tests/TestCase/TestSuite/IntegrationTestCaseTest.php
<ide> public function testFlashSessionAndCookieAsserts()
<ide> $this->assertCookieNotSet('user_id');
<ide> }
<ide>
<add> /**
<add> * Test flash and cookie assertions
<add> *
<add> * @return void
<add> */
<add> public function testFlashSessionAndCookieAssertsHttpServer()
<add> {
<add> $this->useHttpServer(true);
<add> $this->post('/posts/index');
<add>
<add> $this->assertSession('An error message', 'Flash.flash.0.message');
<add> $this->assertCookieNotSet('user_id');
<add> $this->assertCookie(1, 'remember_me');
<add> }
<add>
<ide> /**
<ide> * Tests the failure message for assertCookieNotSet
<ide> * | 1 |
Javascript | Javascript | resolve uncaught typeerror | 02fd34d2690f07ec84d61e21547e5452f4391be9 | <ide><path>src/project.js
<ide> module.exports = class Project extends Model {
<ide> // Public: Get an {Array} of {String}s containing the paths of the project's
<ide> // directories.
<ide> getPaths() {
<del> return this.rootDirectories.map(rootDirectory => rootDirectory.getPath());
<add> try {
<add> return this.rootDirectories.map(rootDirectory => rootDirectory.getPath());
<add> } catch (e) {
<add> atom.notifications.addError(
<add> "Please clear Atom's window state with: atom --clear-window-state"
<add> );
<add> }
<ide> }
<ide>
<ide> // Public: Set the paths of the project's directories. | 1 |
Python | Python | dump args and kwargs in error e-mail | ef0d40dbf5bb7bd2850b2f1178532c57d3aec201 | <ide><path>celery/worker/job.py
<ide> TASK_FAIL_EMAIL_BODY = """
<ide> Task %%(name)s with id %%(id)s raised exception: %%(exc)s
<ide>
<add>
<add>Task was called with args: %%(args)s kwargs: %%(kwargs)s.
<ide> The contents of the full traceback was:
<ide>
<ide> %%(traceback)s
<ide> def on_failure(self, exc_info, meta):
<ide> "name": task_name,
<ide> "exc": exc_info.exception,
<ide> "traceback": exc_info.traceback,
<add> "args": self.args,
<add> "kwargs": self.kwargs,
<ide> }
<ide> self.logger.error(self.fail_msg.strip() % context)
<ide> | 1 |
Text | Text | add russian translation | ac0567c6d9c76157a0bc15d0f5b18bd941ddff21 | <ide><path>curriculum/challenges/russian/03-front-end-libraries/react/give-sibling-elements-a-unique-key-attribute.russian.md
<ide> localeTitle: Дайте сиблинг-элементам уникальный
<ide> ---
<ide>
<ide> ## Description
<del>undefined
<add>Последняя задача показала, как используется метод `map` для динамичного рендеринга нескольких элементов, основываясь на вводе пользователя. Однако в том примере отсутсвовала важная часть. Когда вы создаете массив, атрибут `key` каждого из элементов должен иметь уникальное значение. React использует эти ключи для того, чтобы отслеживать какие элементы были добавлены, изменены или удалены. Это помогает сделать процесс повторного рендеринга более эффективным, когда мы изменяем список. Обратите внимание, что ключи должны быть уникальными только среди сиблинг-элементов, они не обязаны быть глобально уникальными в вашем приложении.
<ide>
<ide> ## Instructions
<ide> undefined
<ide> undefined
<ide>
<ide> ```yml
<ide> tests:
<del> - text: ''
<add>  - text: 'Компонент <code>Frameworks</code> должен присутствовать и отображаться на странице.'
<ide> testString: 'assert(Enzyme.mount(React.createElement(Frameworks)).find("Frameworks").length === 1, "The <code>Frameworks</code> component should exist and render to the page.");'
<del> - text: ''
<add> - text: '<code>Frameworks</code> должен отображать элемент <code>h1</code>.'
<ide> testString: 'assert(Enzyme.mount(React.createElement(Frameworks)).find("h1").length === 1, "<code>Frameworks</code> should render an <code>h1</code> element.");'
<ide> - text: <code>Frameworks</code> должны отображать элемент <code>ul</code> .
<ide> testString: 'assert(Enzyme.mount(React.createElement(Frameworks)).find("ul").length === 1, "<code>Frameworks</code> should render a <code>ul</code> element.");' | 1 |
Python | Python | add smoke test for 64-bit blas | d57739d3152c366a43f0d17694e2ea8d5db142d7 | <ide><path>numpy/linalg/tests/test_linalg.py
<ide> from numpy.testing import (
<ide> assert_, assert_equal, assert_raises, assert_array_equal,
<ide> assert_almost_equal, assert_allclose, suppress_warnings,
<del> assert_raises_regex,
<add> assert_raises_regex, HAS_LAPACK64,
<ide> )
<add>from numpy.testing._private.utils import requires_memory
<ide>
<ide>
<ide> def consistent_subclass(out, in_):
<ide> def test_unsupported_commontype():
<ide> arr = np.array([[1, -2], [2, 5]], dtype='float16')
<ide> with assert_raises_regex(TypeError, "unsupported in linalg"):
<ide> linalg.cholesky(arr)
<add>
<add>
<add>@pytest.mark.slow
<add>@pytest.mark.xfail(not HAS_LAPACK64, run=False,
<add> reason="Numpy not compiled with 64-bit BLAS/LAPACK")
<add>@requires_memory(16e9)
<add>def test_blas64_dot():
<add> n = 2**32
<add> a = np.zeros([1, n], dtype=np.float32)
<add> b = np.ones([1, 1], dtype=np.float32)
<add> a[0,-1] = 1
<add> c = np.dot(b, a)
<add> assert_equal(c[0,-1], 1)
<ide><path>numpy/testing/_private/utils.py
<ide>
<ide> from numpy.core import(
<ide> intp, float32, empty, arange, array_repr, ndarray, isnat, array)
<add>import numpy.__config__
<ide>
<ide> if sys.version_info[0] >= 3:
<ide> from io import StringIO
<ide> 'SkipTest', 'KnownFailureException', 'temppath', 'tempdir', 'IS_PYPY',
<ide> 'HAS_REFCOUNT', 'suppress_warnings', 'assert_array_compare',
<ide> '_assert_valid_refcount', '_gen_alignment_data', 'assert_no_gc_cycles',
<del> 'break_cycles',
<add> 'break_cycles', 'HAS_LAPACK64'
<ide> ]
<ide>
<ide>
<ide> class KnownFailureException(Exception):
<ide>
<ide> IS_PYPY = platform.python_implementation() == 'PyPy'
<ide> HAS_REFCOUNT = getattr(sys, 'getrefcount', None) is not None
<add>HAS_LAPACK64 = hasattr(numpy.__config__, 'lapack64__opt_info')
<ide>
<ide>
<ide> def import_nose(): | 2 |
Text | Text | clarify eol platform support | 8d2df41618266f6bcd5579d6de63a777149f4166 | <ide><path>BUILDING.md
<ide> There are three support tiers:
<ide>
<ide> ### Supported platforms
<ide>
<del>The community does not build or test against end-of-life distributions (EoL).
<ide> For production applications, run Node.js on supported platforms only.
<ide>
<add>Node.js does not support a platform version if a vendor has expired support
<add>for it. In other words, Node.js does not support running on End-of-life (EoL)
<add>platforms. This is true regardless of entries in the table below.
<add>
<ide> | System | Support type | Version | Architectures | Notes |
<ide> | ------------ | ------------ | ------------------------------- | ---------------- | ----------------------------- |
<ide> | GNU/Linux | Tier 1 | kernel >= 2.6.32, glibc >= 2.12 | x64, arm | | | 1 |
Javascript | Javascript | correct some test assertions | ef5a2914724c04420de3690be24bcb292f051c16 | <ide><path>test/cases/wasm/decoding/index.js
<ide> it("should export raw memory without data", function() {
<ide> return import("./memory2.wasm").then(function(wasm) {
<ide> expect(wasm.memory).toBeInstanceOf(WebAssembly.Memory);
<ide> expect(wasm.memory.buffer).toBeInstanceOf(ArrayBuffer);
<del> expect(wasm.memory.buffer.byLength).toBe(1 << 16);
<add> expect(wasm.memory.buffer.byteLength).toBe(1 << 16);
<ide> });
<ide> });
<ide><path>test/cases/wasm/table/index.js
<ide> it("should support tables", function() {
<ide> return import("./wasm-table.wasm").then(function(wasm) {
<ide> expect(wasm.callByIndex(0)).toEqual(42);
<ide> expect(wasm.callByIndex(1)).toEqual(13);
<del> expect(() => wasm.callByIndex(2)).toThrow("fefef");
<add> expect(() => wasm.callByIndex(2)).toThrow("invalid function");
<ide> });
<ide> });
<ide> | 2 |
Text | Text | add a link to the emoji cheat sheet | 029e4336bc90107b5ee19c115b1d7c18a3272337 | <ide><path>docs/contributing.md
<ide> dependencies up to date by running `apm update` after pulling upstream changes.
<ide> * :racehorse: when improving performance
<ide> * :non-potable_water: when plugging memory leaks
<ide> * :memo: when writing docs
<add> * :bulb: Check out the [Emoji Cheat Sheet](http://www.emoji-cheat-sheet.com)
<add> for more ideas.
<ide>
<ide> ## CoffeeScript Styleguide
<ide> | 1 |
Python | Python | update version number | 1a7ed296395a941c4db0d8ef02855ca079e7ff0b | <ide><path>rest_framework/__init__.py
<ide> """
<ide>
<ide> __title__ = 'Django REST framework'
<del>__version__ = '3.6.3'
<add>__version__ = '3.6.4'
<ide> __author__ = 'Tom Christie'
<ide> __license__ = 'BSD 2-Clause'
<ide> __copyright__ = 'Copyright 2011-2017 Tom Christie' | 1 |
Javascript | Javascript | choose parser based on file extension | 078f6310ba5008224e8683c1463b248f447fefa8 | <ide><path>packages/react-native-codegen/src/cli/combine/combine-js-to-schema-cli.js
<ide> const allFiles = [];
<ide> fileList.forEach(file => {
<ide> if (fs.lstatSync(file).isDirectory()) {
<ide> const dirFiles = glob
<del> .sync(`${file}/**/*.js`, {
<add> .sync(`${file}/**/*.{js,ts,tsx}`, {
<ide> nodir: true,
<ide> })
<ide> .filter(filterJSFile);
<ide><path>packages/react-native-codegen/src/cli/combine/combine-js-to-schema.js
<ide> import type {SchemaType} from '../../CodegenSchema.js';
<ide>
<ide> const FlowParser = require('../../parsers/flow');
<add>const TypeScriptParser = require('../../parsers/typescript');
<ide> const fs = require('fs');
<add>const path = require('path');
<ide>
<ide> function combineSchemas(files: Array<string>): SchemaType {
<ide> return files.reduce(
<ide> (merged, filename) => {
<ide> const contents = fs.readFileSync(filename, 'utf8');
<add>
<ide> if (
<ide> contents &&
<ide> (/export\s+default\s+\(?codegenNativeComponent</.test(contents) ||
<ide> /extends TurboModule/.test(contents))
<ide> ) {
<del> const schema = FlowParser.parseFile(filename);
<add> const isTypeScript =
<add> path.extname(filename) === '.ts' || path.extname(filename) === '.tsx';
<add>
<add> const schema = isTypeScript
<add> ? TypeScriptParser.parseFile(filename)
<add> : FlowParser.parseFile(filename);
<ide>
<ide> if (schema && schema.modules) {
<ide> merged.modules = {...merged.modules, ...schema.modules};
<ide><path>packages/react-native-codegen/src/cli/parser/parser.js
<ide>
<ide> 'use strict';
<ide>
<add>const path = require('path');
<ide> const FlowParser = require('../../parsers/flow');
<add>const TypeScriptParser = require('../../parsers/typescript');
<ide>
<ide> function parseFiles(files: Array<string>) {
<ide> files.forEach(filename => {
<add> const isTypeScript =
<add> path.extname(filename) === '.ts' || path.extname(filename) === '.tsx';
<add>
<ide> console.log(
<ide> filename,
<del> JSON.stringify(FlowParser.parseFile(filename), null, 2),
<add> JSON.stringify(
<add> isTypeScript
<add> ? TypeScriptParser.parseFile(filename)
<add> : FlowParser.parseFile(filename),
<add> null,
<add> 2,
<add> ),
<ide> );
<ide> });
<ide> }
<ide><path>packages/react-native-codegen/src/parsers/typescript/index.js
<add>/**
<add> * Copyright (c) Facebook, Inc. and its affiliates.
<add> *
<add> * This source code is licensed under the MIT license found in the
<add> * LICENSE file in the root directory of this source tree.
<add> *
<add> * @flow
<add> * @format
<add> */
<add>
<add>'use strict';
<add>
<add>import type {SchemaType} from '../../CodegenSchema.js';
<add>const babelParser = require('@babel/parser');
<add>const fs = require('fs');
<add>const path = require('path');
<add>const {buildComponentSchema} = require('./components');
<add>const {wrapComponentSchema} = require('./components/schema');
<add>const {buildModuleSchema} = require('./modules');
<add>const {wrapModuleSchema} = require('./modules/schema');
<add>
<add>const {
<add> createParserErrorCapturer,
<add> visit,
<add> isModuleRegistryCall,
<add>} = require('./utils');
<add>const invariant = require('invariant');
<add>
<add>function getConfigType(
<add> // TODO(T108222691): Use flow-types for @babel/parser
<add> ast: $FlowFixMe,
<add>): 'module' | 'component' | 'none' {
<add> let isComponent = false;
<add> let isModule = false;
<add>
<add> visit(ast, {
<add> CallExpression(node) {
<add> if (
<add> node.callee.type === 'Identifier' &&
<add> node.callee.name === 'codegenNativeComponent'
<add> ) {
<add> isComponent = true;
<add> }
<add>
<add> if (isModuleRegistryCall(node)) {
<add> isModule = true;
<add> }
<add> },
<add>
<add> TSInterfaceDeclaration(node) {
<add> if (
<add> Array.isArray(node.extends) &&
<add> node.extends.some(
<add> extension => extension.expression.name === 'TurboModule',
<add> )
<add> ) {
<add> isModule = true;
<add> }
<add> },
<add> });
<add>
<add> if (isModule && isComponent) {
<add> throw new Error(
<add> 'Found type extending "TurboModule" and exported "codegenNativeComponent" declaration in one file. Split them into separated files.',
<add> );
<add> }
<add>
<add> if (isModule) {
<add> return 'module';
<add> } else if (isComponent) {
<add> return 'component';
<add> } else {
<add> return 'none';
<add> }
<add>}
<add>
<add>function buildSchema(contents: string, filename: ?string): SchemaType {
<add> // Early return for non-Spec JavaScript files
<add> if (
<add> !contents.includes('codegenNativeComponent') &&
<add> !contents.includes('TurboModule')
<add> ) {
<add> return {modules: {}};
<add> }
<add>
<add> const ast = babelParser.parse(contents, {
<add> sourceType: 'module',
<add> plugins: ['typescript'],
<add> }).program;
<add>
<add> const configType = getConfigType(ast);
<add>
<add> switch (configType) {
<add> case 'component': {
<add> return wrapComponentSchema(buildComponentSchema(ast));
<add> }
<add> case 'module': {
<add> if (filename === undefined || filename === null) {
<add> throw new Error('Filepath expected while parasing a module');
<add> }
<add> const hasteModuleName = path.basename(filename).replace(/\.tsx?$/, '');
<add>
<add> const [parsingErrors, tryParse] = createParserErrorCapturer();
<add>
<add> const schema = tryParse(() =>
<add> buildModuleSchema(hasteModuleName, ast, tryParse),
<add> );
<add>
<add> if (parsingErrors.length > 0) {
<add> /**
<add> * TODO(T77968131): We have two options:
<add> * - Throw the first error, but indicate there are more then one errors.
<add> * - Display all errors, nicely formatted.
<add> *
<add> * For the time being, we're just throw the first error.
<add> **/
<add>
<add> throw parsingErrors[0];
<add> }
<add>
<add> invariant(
<add> schema != null,
<add> 'When there are no parsing errors, the schema should not be null',
<add> );
<add>
<add> return wrapModuleSchema(schema, hasteModuleName);
<add> }
<add> default:
<add> return {modules: {}};
<add> }
<add>}
<add>
<add>function parseFile(filename: string): SchemaType {
<add> const contents = fs.readFileSync(filename, 'utf8');
<add>
<add> return buildSchema(contents, filename);
<add>}
<add>
<add>function parseModuleFixture(filename: string): SchemaType {
<add> const contents = fs.readFileSync(filename, 'utf8');
<add>
<add> return buildSchema(contents, 'path/NativeSampleTurboModule.ts');
<add>}
<add>
<add>function parseString(contents: string, filename: ?string): SchemaType {
<add> return buildSchema(contents, filename);
<add>}
<add>
<add>module.exports = {
<add> parseFile,
<add> parseModuleFixture,
<add> parseString,
<add>}; | 4 |
PHP | PHP | fix coding standards | a8eca60fc52d73b114eacf08f9fa0be1c399df9d | <ide><path>lib/Cake/Model/Model.php
<ide> public function exists($id = null) {
<ide> if ($id === false) {
<ide> return false;
<ide> }
<del> return (bool)$this->find('count', array(
<add> return (bool)$this->find('count', array(
<ide> 'conditions' => array(
<ide> $this->alias . '.' . $this->primaryKey => $id
<ide> ),
<ide><path>lib/Cake/Test/Case/Routing/Route/CakeRouteTest.php
<ide> public function testParseTrailingUTF8() {
<ide> $this->assertEquals($expected, $result);
<ide> }
<ide>
<del>
<ide> /**
<ide> * test that utf-8 patterns work for :section
<ide> *
<ide> * @return void
<ide> */
<ide> public function testUTF8PatternOnSection() {
<ide> $route = new CakeRoute(
<del> '/:section',
<add> '/:section',
<ide> array('plugin' => 'blogs', 'controller' => 'posts' , 'action' => 'index' ),
<ide> array(
<ide> 'persist' => array('section'),
<ide> 'section' => 'آموزش|weblog'
<ide> )
<ide> );
<del>
<add>
<ide> $result = $route->parse('/%D8%A2%D9%85%D9%88%D8%B2%D8%B4');
<ide> $expected = array('section' => 'آموزش', 'plugin' => 'blogs', 'controller' => 'posts', 'action' => 'index', 'pass' => array(), 'named' => array());
<ide> $this->assertEquals($expected, $result);
<ide><path>lib/Cake/Test/Case/View/MediaViewTest.php
<ide> public function testRenderCachingAndName() {
<ide> * @return void
<ide> */
<ide> public function testRenderUpperExtension() {
<del> return;
<ide> $this->MediaView->viewVars = array(
<ide> 'path' => CAKE . 'Test' . DS . 'test_app' . DS . 'Vendor' . DS . 'img' . DS,
<ide> 'id' => 'test_2.JPG'
<ide><path>lib/Cake/basics.php
<ide> function env($key) {
<ide> $offset = 4;
<ide> }
<ide> return substr($filename, 0, -(strlen($name) + $offset));
<del> break;
<ide> case 'PHP_SELF':
<ide> return str_replace(env('DOCUMENT_ROOT'), '', env('SCRIPT_FILENAME'));
<del> break;
<ide> case 'CGI_MODE':
<ide> return (PHP_SAPI === 'cgi');
<del> break;
<ide> case 'HTTP_BASE':
<ide> $host = env('HTTP_HOST');
<ide> $parts = explode('.', $host);
<ide> function env($key) {
<ide> }
<ide> array_shift($parts);
<ide> return '.' . implode('.', $parts);
<del> break;
<ide> }
<ide> return null;
<ide> } | 4 |
Python | Python | fix memory issue in tensorflow backend's batch_dot | edd0c0a4bd4cea027842bc8925048976d65b1ae9 | <ide><path>keras/backend/tensorflow_backend.py
<ide> def batch_dot(x, y, axes=None):
<ide> ' with axes=' + str(axes) + '. x.shape[%d] != '
<ide> 'y.shape[%d] (%d != %d).' % (axes[0], axes[1], d1, d2))
<ide>
<del> # bring the dimensions to be reduced to axis 1
<del> if a0 != 1:
<del> pattern = list(range(x_ndim))
<del> for i in range(a0, 1, -1):
<del> pattern[i] = pattern[i - 1]
<del> pattern[1] = a0
<del> x = permute_dimensions(x, pattern)
<del> if a1 != 1:
<del> pattern = list(range(y_ndim))
<del> for i in range(a1, 1, -1):
<del> pattern[i] = pattern[i - 1]
<del> pattern[1] = a1
<del> y = permute_dimensions(y, pattern)
<del>
<del> # reshape to closest broadcastable shape
<del> x_shape = tf.shape(x)
<del> y_shape = tf.shape(y)
<add> # There are 2 ways to perform theano's batched_tensordot in tensorflow:
<add> # 1) Elementwise multiplication followed by tf.reduce_sum. This requires
<add> # more memory but works with partial shape information.
<add> # 2) Using tf.matmul. This is more efficient but all dimensions except
<add> # batch size should be known for input with rank > 3.
<add>
<add> if x_ndim > 3:
<add> if None in x_shape[1:]:
<add> x_matmullabe = False
<add> else:
<add> x_matmullabe = True
<add> else:
<add> x_matmullabe = True
<add>
<add> if y_ndim > 3:
<add> if None in y_shape[1:]:
<add> y_matmullabe = False
<add> else:
<add> y_matmullabe = True
<add> else:
<add> y_matmullabe = True
<ide>
<del> new_x_shape = tf.concat([x_shape, tf.ones_like(y_shape[2:])], 0)
<del> new_y_shape = tf.concat([y_shape[:2], tf.ones_like(x_shape[2:]), y_shape[2:]], 0)
<add> use_matmul = x_matmullabe and y_matmullabe
<ide>
<del> x = reshape(x, new_x_shape)
<del> y = reshape(y, new_y_shape)
<add> if use_matmul:
<add> # backup ndims. Need them later.
<add> orig_x_ndim = x_ndim
<add> orig_y_ndim = y_ndim
<ide>
<del> result = tf.reduce_sum(x * y, 1)
<add> # if rank is 2, expand to 3.
<add> if x_ndim == 2:
<add> x = tf.expand_dims(x, 1)
<add> a0 += 1
<add> x_ndim += 1
<add> if y_ndim == 2:
<add> y = tf.expand_dims(y, 2)
<add> y_ndim += 1
<add>
<add> # bring x's dimension to be reduced to last axis.
<add> if a0 != x_ndim - 1:
<add> pattern = list(range(x_ndim))
<add> for i in range(a0, x_ndim - 1):
<add> pattern[i] = pattern[i + 1]
<add> pattern[-1] = a0
<add> x = tf.transpose(x, pattern)
<add>
<add> # bring y's dimension to be reduced to axis 1.
<add> if a1 != 1:
<add> pattern = list(range(y_ndim))
<add> for i in range(a1, 1, -1):
<add> pattern[i] = pattern[i - 1]
<add> pattern[1] = a1
<add> y = tf.transpose(y, pattern)
<add>
<add> # normalize both inputs to rank 3.
<add> if x_ndim > 3:
<add> # squash middle dimensions of x.
<add> x_shape = list(int_shape(x))
<add> x_mid_dims = x_shape[1:-1]
<add> x_squashed_dim = np.prod(x_mid_dims)
<add> if x_batch_size is None:
<add> x_batch_size = -1
<add> x = tf.reshape(x, [x_batch_size, x_squashed_dim, x_shape[-1]])
<add> x_squashed = True
<add> else:
<add> x_squashed = False
<add> if y_ndim > 3:
<add> # squash trailing dimensions of y
<add> y_shape = list(int_shape(y))
<add> y_trail_dims = y_shape[2:]
<add> y_squashed_dim = np.prod(y_trail_dims)
<add> if y_batch_size is None:
<add> y_batch_size = -1
<add> y = tf.reshape(y, [y_batch_size, y_shape[1], y_squashed_dim])
<add> y_squashed = True
<add> else:
<add> y_squashed = False
<add>
<add> result = tf.matmul(x, y)
<add>
<add> # if inputs were squashed, we have to reshape the matmul output.
<add> output_shape = list(int_shape(result))
<add> do_reshape = False
<add> if x_squashed:
<add> output_shape = [output_shape[0]] + x_mid_dims + [output_shape[-1]]
<add> do_reshape = True
<add> if y_squashed:
<add> output_shape = output_shape[:-1] + y_trail_dims
<add> do_reshape = True
<add>
<add> if do_reshape:
<add> if output_shape[0] is None:
<add> output_shape[0] = -1
<add> result = tf.reshape(result, output_shape)
<add>
<add> # if the inputs were originally rank 2, we remove the added 1 dim.
<add> if orig_x_ndim == 2:
<add> result = tf.squeeze(result, 1)
<add> elif orig_y_ndim == 2:
<add> result = tf.squeeze(result, -1)
<add> else:
<ide>
<del> if ndim(result) == 1:
<del> result = tf.expand_dims(result, -1)
<add> # bring the dimension to be reduced to axis 1.
<add> if a0 != 1:
<add> pattern = list(range(x_ndim))
<add> for i in range(a0, 1, -1):
<add> pattern[i] = pattern[i - 1]
<add> pattern[1] = a0
<add> x = tf.transpose(x, pattern)
<add>
<add> if a1 != 1:
<add> pattern = list(range(y_ndim))
<add> for i in range(a1, 1, -1):
<add> pattern[i] = pattern[i - 1]
<add> pattern[1] = a1
<add> y = tf.transpose(y, pattern)
<add>
<add> # reshape to closest broadcastable shape.
<add> x_shape = tf.shape(x)
<add> y_shape = tf.shape(y)
<add>
<add> new_x_shape = tf.concat([x_shape, tf.ones_like(y_shape[2:])], 0)
<add> new_y_shape = tf.concat([y_shape[:2],
<add> tf.ones_like(x_shape[2:]),
<add> y_shape[2:]], 0)
<add>
<add> x = reshape(x, new_x_shape)
<add> y = reshape(y, new_y_shape)
<add>
<add> result = tf.reduce_sum(x * y, 1)
<add>
<add> if ndim(result) == 1:
<add> result = tf.expand_dims(result, -1)
<ide>
<ide> return result
<ide>
<ide><path>tests/keras/backend/backend_test.py
<ide> def random(shape):
<ide>
<ide> assert_allclose(f([x_np, y_np])[0], z_np, atol=1e-05)
<ide>
<add> # test with placeholders (no shape info)
<add> if K.backend() != 'cntk':
<add> x = K.placeholder(ndim=len(x_shape))
<add> y = K.placeholder(ndim=len(y_shape))
<add> z = K.batch_dot(x, y, axes)
<add>
<add> z_shape = K.int_shape(z)
<add> if z_shape is not None:
<add> assert len(z_shape) == z_np.ndim
<add> assert set(z_shape) <= set((None, 1))
<add>
<add> f = K.function([x, y], [z])
<add>
<add> assert_allclose(f([x_np, y_np])[0], z_np, atol=1e-05)
<add>
<ide> # test with variables
<ide> x = K.variable(x_np)
<ide> y = K.variable(y_np) | 2 |
Javascript | Javascript | remove duplicate comment line | bd03a556c0804c2d40f2fc61b910ca968b27c202 | <ide><path>src/ng/parse.js
<ide> var $parseMinErr = minErr('$parse');
<ide> // access to any member named "constructor".
<ide> //
<ide> // For reflective calls (a[b]) we check that the value of the lookup is not the Function constructor while evaluating
<del>// For reflective calls (a[b]) we check that the value of the lookup is not the Function constructor while evaluating
<ide> // the expression, which is a stronger but more expensive test. Since reflective calls are expensive anyway, this is not
<ide> // such a big deal compared to static dereferencing.
<ide> // | 1 |
Python | Python | add a test for japanese bert tokenizers | a09da4eeb0397dd66d61182177dd3b753d70e62a | <ide><path>transformers/tests/tokenization_bert_japanese_test.py
<add># coding=utf-8
<add># Copyright 2018 The Google AI Language Team Authors.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># http://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>from __future__ import absolute_import, division, print_function, unicode_literals
<add>
<add>import os
<add>import unittest
<add>import pytest
<add>from io import open
<add>
<add>from transformers.tokenization_bert import WordpieceTokenizer
<add>from transformers.tokenization_bert_japanese import (BertJapaneseTokenizer,
<add> MecabTokenizer, CharacterTokenizer,
<add> VOCAB_FILES_NAMES)
<add>
<add>from .tokenization_tests_commons import CommonTestCases
<add>
<add>
<add>class BertJapaneseTokenizationTest(CommonTestCases.CommonTokenizerTester):
<add>
<add> tokenizer_class = BertJapaneseTokenizer
<add>
<add> def setUp(self):
<add> super(BertJapaneseTokenizationTest, self).setUp()
<add>
<add> vocab_tokens = [u"[UNK]", u"[CLS]", u"[SEP]",
<add> u"こんにちは", u"こん", u"にちは", u"ばんは", u"##こん", u"##にちは", u"##ばんは",
<add> u"世界", u"##世界", u"、", u"##、", u"。", u"##。"]
<add>
<add> self.vocab_file = os.path.join(self.tmpdirname, VOCAB_FILES_NAMES["vocab_file"])
<add> with open(self.vocab_file, "w", encoding="utf-8") as vocab_writer:
<add> vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))
<add>
<add> def get_tokenizer(self, **kwargs):
<add> return BertJapaneseTokenizer.from_pretrained(self.tmpdirname, **kwargs)
<add>
<add> def get_input_output_texts(self):
<add> input_text = u"こんにちは、世界。 \nこんばんは、世界。"
<add> output_text = u"こんにちは 、 世界 。 こんばんは 、 世界 。"
<add> return input_text, output_text
<add>
<add> def test_full_tokenizer(self):
<add> tokenizer = self.tokenizer_class(self.vocab_file)
<add>
<add> tokens = tokenizer.tokenize(u"こんにちは、世界。\nこんばんは、世界。")
<add> self.assertListEqual(tokens,
<add> [u"こんにちは", u"、", u"世界", u"。",
<add> u"こん", u"##ばんは", u"、", u"世界", "。"])
<add> self.assertListEqual(tokenizer.convert_tokens_to_ids(tokens),
<add> [3, 12, 10, 14, 4, 9, 12, 10, 14])
<add>
<add> def test_mecab_tokenizer(self):
<add> tokenizer = MecabTokenizer()
<add>
<add> self.assertListEqual(
<add> tokenizer.tokenize(u" \tアップルストアでiPhone8 が \n 発売された 。 "),
<add> [u"アップルストア", u"で", u"iPhone", u"8", u"が",
<add> u"発売", u"さ", u"れ", u"た", u"。"])
<add>
<add> def test_mecab_tokenizer_lower(self):
<add> tokenizer = MecabTokenizer(do_lower_case=True)
<add>
<add> self.assertListEqual(
<add> tokenizer.tokenize(u" \tアップルストアでiPhone8 が \n 発売された 。 "),
<add> [u"アップルストア", u"で", u"iphone", u"8", u"が",
<add> u"発売", u"さ", u"れ", u"た", u"。"])
<add>
<add> def test_mecab_tokenizer_no_normalize(self):
<add> tokenizer = MecabTokenizer(normalize_text=False)
<add>
<add> self.assertListEqual(
<add> tokenizer.tokenize(u" \tアップルストアでiPhone8 が \n 発売された 。 "),
<add> [u"アップルストア", u"で", u"iPhone", u"8", u"が",
<add> u"発売", u"さ", u"れ", u"た", u" ", u"。"])
<add>
<add> def test_wordpiece_tokenizer(self):
<add> vocab_tokens = [u"[UNK]", u"[CLS]", u"[SEP]",
<add> u"こんにちは", u"こん", u"にちは" u"ばんは", u"##こん", u"##にちは", u"##ばんは"]
<add>
<add> vocab = {}
<add> for (i, token) in enumerate(vocab_tokens):
<add> vocab[token] = i
<add> tokenizer = WordpieceTokenizer(vocab=vocab, unk_token=u"[UNK]")
<add>
<add> self.assertListEqual(tokenizer.tokenize(u""), [])
<add>
<add> self.assertListEqual(tokenizer.tokenize(u"こんにちは"),
<add> [u"こんにちは"])
<add>
<add> self.assertListEqual(tokenizer.tokenize(u"こんばんは"),
<add> [u"こん", u"##ばんは"])
<add>
<add> self.assertListEqual(tokenizer.tokenize(u"こんばんは こんばんにちは こんにちは"),
<add> [u"こん", u"##ばんは", u"[UNK]", u"こんにちは"])
<add>
<add> @pytest.mark.slow
<add> def test_sequence_builders(self):
<add> tokenizer = self.tokenizer_class.from_pretrained("bert-base-japanese")
<add>
<add> text = tokenizer.encode(u"ありがとう。", add_special_tokens=False)
<add> text_2 = tokenizer.encode(u"どういたしまして。", add_special_tokens=False)
<add>
<add> encoded_sentence = tokenizer.build_inputs_with_special_tokens(text)
<add> encoded_pair = tokenizer.build_inputs_with_special_tokens(text, text_2)
<add>
<add> # 2 is for "[CLS]", 3 is for "[SEP]"
<add> assert encoded_sentence == [2] + text + [3]
<add> assert encoded_pair == [2] + text + [3] + text_2 + [3]
<add>
<add>
<add>class BertJapaneseCharacterTokenizationTest(CommonTestCases.CommonTokenizerTester):
<add>
<add> tokenizer_class = BertJapaneseTokenizer
<add>
<add> def setUp(self):
<add> super(BertJapaneseCharacterTokenizationTest, self).setUp()
<add>
<add> vocab_tokens = [u"[UNK]", u"[CLS]", u"[SEP]",
<add> u"こ", u"ん", u"に", u"ち", u"は", u"ば", u"世", u"界", u"、", u"。"]
<add>
<add> self.vocab_file = os.path.join(self.tmpdirname, VOCAB_FILES_NAMES["vocab_file"])
<add> with open(self.vocab_file, "w", encoding="utf-8") as vocab_writer:
<add> vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))
<add>
<add> def get_tokenizer(self, **kwargs):
<add> return BertJapaneseTokenizer.from_pretrained(self.tmpdirname,
<add> subword_tokenizer_type="character",
<add> **kwargs)
<add>
<add> def get_input_output_texts(self):
<add> input_text = u"こんにちは、世界。 \nこんばんは、世界。"
<add> output_text = u"こ ん に ち は 、 世 界 。 こ ん ば ん は 、 世 界 。"
<add> return input_text, output_text
<add>
<add> def test_full_tokenizer(self):
<add> tokenizer = self.tokenizer_class(self.vocab_file,
<add> subword_tokenizer_type="character")
<add>
<add> tokens = tokenizer.tokenize(u"こんにちは、世界。 \nこんばんは、世界。")
<add> self.assertListEqual(tokens,
<add> [u"こ", u"ん", u"に", u"ち", u"は", u"、", u"世", u"界", u"。",
<add> u"こ", u"ん", u"ば", u"ん", u"は", u"、", u"世", u"界", u"。"])
<add> self.assertListEqual(tokenizer.convert_tokens_to_ids(tokens),
<add> [3, 4, 5, 6, 7, 11, 9, 10, 12,
<add> 3, 4, 8, 4, 7, 11, 9, 10, 12])
<add>
<add> def test_character_tokenizer(self):
<add> vocab_tokens = [u"[UNK]", u"[CLS]", u"[SEP]",
<add> u"こ", u"ん", u"に", u"ち", u"は", u"ば", u"世", u"界"u"、", u"。"]
<add>
<add> vocab = {}
<add> for (i, token) in enumerate(vocab_tokens):
<add> vocab[token] = i
<add> tokenizer = CharacterTokenizer(vocab=vocab, unk_token=u"[UNK]")
<add>
<add> self.assertListEqual(tokenizer.tokenize(u""), [])
<add>
<add> self.assertListEqual(tokenizer.tokenize(u"こんにちは"),
<add> [u"こ", u"ん", u"に", u"ち", u"は"])
<add>
<add> self.assertListEqual(tokenizer.tokenize(u"こんにちほ"),
<add> [u"こ", u"ん", u"に", u"ち", u"[UNK]"])
<add>
<add> @pytest.mark.slow
<add> def test_sequence_builders(self):
<add> tokenizer = self.tokenizer_class.from_pretrained("bert-base-japanese-char")
<add>
<add> text = tokenizer.encode(u"ありがとう。", add_special_tokens=False)
<add> text_2 = tokenizer.encode(u"どういたしまして。", add_special_tokens=False)
<add>
<add> encoded_sentence = tokenizer.build_inputs_with_special_tokens(text)
<add> encoded_pair = tokenizer.build_inputs_with_special_tokens(text, text_2)
<add>
<add> # 2 is for "[CLS]", 3 is for "[SEP]"
<add> assert encoded_sentence == [2] + text + [3]
<add> assert encoded_pair == [2] + text + [3] + text_2 + [3]
<add>
<add>
<add>
<add>if __name__ == '__main__':
<add> unittest.main() | 1 |
Go | Go | update integration tests with --net flag | 581e8e891886e6db387ed27aabda7dd8f1d14174 | <ide><path>integration-cli/docker_cli_run_test.go
<ide> func TestDockerRunWorkingDirectory(t *testing.T) {
<ide>
<ide> // pinging Google's DNS resolver should fail when we disable the networking
<ide> func TestDockerRunWithoutNetworking(t *testing.T) {
<del> runCmd := exec.Command(dockerBinary, "run", "--networking=false", "busybox", "ping", "-c", "1", "8.8.8.8")
<add> runCmd := exec.Command(dockerBinary, "run", "--net=none", "busybox", "ping", "-c", "1", "8.8.8.8")
<ide> out, _, exitCode, err := runCommandWithStdoutStderr(runCmd)
<ide> if err != nil && exitCode != 1 {
<ide> t.Fatal(out, err)
<ide> }
<ide> if exitCode != 1 {
<del> t.Errorf("--networking=false should've disabled the network; the container shouldn't have been able to ping 8.8.8.8")
<add> t.Errorf("--net=none should've disabled the network; the container shouldn't have been able to ping 8.8.8.8")
<ide> }
<ide>
<ide> runCmd = exec.Command(dockerBinary, "run", "-n=false", "busybox", "ping", "-c", "1", "8.8.8.8")
<ide> func TestDockerRunWithoutNetworking(t *testing.T) {
<ide>
<ide> deleteAllContainers()
<ide>
<del> logDone("run - disable networking with --networking=false")
<add> logDone("run - disable networking with --net=none")
<ide> logDone("run - disable networking with -n=false")
<ide> }
<ide>
<ide> func TestContainerNetwork(t *testing.T) {
<ide>
<ide> // Issue #4681
<ide> func TestLoopbackWhenNetworkDisabled(t *testing.T) {
<del> cmd := exec.Command(dockerBinary, "run", "--networking=false", "busybox", "ping", "-c", "1", "127.0.0.1")
<add> cmd := exec.Command(dockerBinary, "run", "--net=none", "busybox", "ping", "-c", "1", "127.0.0.1")
<ide> if _, err := runCommand(cmd); err != nil {
<ide> t.Fatal(err)
<ide> }
<ide> func TestLoopbackWhenNetworkDisabled(t *testing.T) {
<ide> }
<ide>
<ide> func TestLoopbackOnlyExistsWhenNetworkingDisabled(t *testing.T) {
<del> cmd := exec.Command(dockerBinary, "run", "--networking=false", "busybox", "ip", "a", "show", "up")
<add> cmd := exec.Command(dockerBinary, "run", "--net=none", "busybox", "ip", "a", "show", "up")
<ide> out, _, err := runCommandWithOutput(cmd)
<ide> if err != nil {
<ide> t.Fatal(err, out) | 1 |
Ruby | Ruby | remove useless conditional | 109fb8e7a94e2bebb1f7b628c2fd1b377f5601e2 | <ide><path>actionpack/lib/action_controller/metal.rb
<ide> def response_body=(body)
<ide>
<ide> # Tests if render or redirect has already happened.
<ide> def performed?
<del> response_body || (response && response.committed?)
<add> response_body || response.committed?
<ide> end
<ide>
<ide> def dispatch(name, request, response) #:nodoc: | 1 |
Javascript | Javascript | fix title info for pdf document | dee158d80c926e02c7a7573bc3d87659b2e64408 | <ide><path>src/worker.js
<ide> var WorkerMessageHandler = {
<ide> fingerprint: pdfModel.getFingerprint(),
<ide> destinations: pdfModel.catalog.destinations,
<ide> outline: pdfModel.catalog.documentOutline,
<del> info: pdfModel.info,
<add> info: pdfModel.getDocumentInfo(),
<ide> metadata: pdfModel.catalog.metadata
<ide> };
<ide> handler.send('GetDoc', {pdfInfo: doc});
<ide><path>web/viewer.js
<ide> var PDFView = {
<ide> self.setInitialView(storedHash, scale);
<ide> });
<ide>
<del> pdfDocument.getMetadata().then(function(info, metadata) {
<add> pdfDocument.getMetadata().then(function(data) {
<add> var info = data.info, metadata = data.metadata;
<ide> self.documentInfo = info;
<ide> self.metadata = metadata;
<ide> | 2 |
Python | Python | add test for printing of scalar values | d55b5aa49d2ba1c98d568660b1a91b4b552872f0 | <ide><path>numpy/core/tests/test_scalarprint.py
<add># -*- coding: utf-8 -*-
<add>""" Test printing of scalar types.
<add>
<add>"""
<add>
<add>import numpy as np
<add>from numpy.testing import TestCase, assert_, run_module_suite
<add>
<add>
<add>class TestRealScalars(TestCase):
<add> def test_str(self):
<add> svals = [0.0, -0.0, 1, -1, np.inf, -np.inf, np.nan]
<add> styps = [np.float16, np.float32, np.float64, np.longdouble]
<add> actual = [str(f(c)) for c in svals for f in styps]
<add> wanted = [
<add> '0.0', '0.0', '0.0', '0.0',
<add> '-0.0', '-0.0', '-0.0', '-0.0',
<add> '1.0', '1.0', '1.0', '1.0',
<add> '-1.0', '-1.0', '-1.0', '-1.0',
<add> 'inf', 'inf', 'inf', 'inf',
<add> '-inf', '-inf', '-inf', '-inf',
<add> 'nan', 'nan', 'nan', 'nan']
<add>
<add> for res, val in zip(actual, wanted):
<add> assert_(res == val)
<add>
<add>
<add>if __name__ == "__main__":
<add> run_module_suite() | 1 |
Python | Python | remove comment in fabfile | 7c0823ad76e0b38dd31df59eccd0e3d7e0e3a4e0 | <ide><path>fabfile.py
<ide> def virtualenv(name, create=False, python='/usr/bin/python3.6'):
<ide> def wrapped_local(cmd, env_vars=[], capture=False, direct=False):
<ide> return local('source {}/bin/activate && {}'.format(env_path, cmd),
<ide> shell='/bin/bash', capture=False)
<del> #env_vars = ' '.join(env_vars)
<del> #if cmd.split()[0] == 'python':
<del> # cmd = cmd.replace('python', str(env_py))
<del> # return local(env_vars + ' ' + cmd, capture=capture)
<del> #elif direct:
<del> # cmd, args = cmd.split(' ', 1)
<del> # env_cmd = str(env_py).replace('python', cmd)
<del> # return local('{env_vars} {env_cmd} {args}'.format(
<del> # env_cmd=env_cmd, args=args, env_vars=env_vars),
<del> # capture=capture)
<del> #else:
<del> # return local('{env_vars} {env_py} -m {cmd}'.format(
<del> # env_py=env_py, cmd=cmd, env_vars=env_vars),
<del> # capture=capture)
<ide> yield wrapped_local
<ide>
<ide> | 1 |
Ruby | Ruby | combine github version regexes | 364c5c34733a3b941e2310f1d4b2899075ee303d | <ide><path>Library/Homebrew/version.rb
<ide> def self._parse spec
<ide> spec.stem
<ide> end
<ide>
<del> # GitHub tarballs, e.g. v1.2.3
<del> m = %r[github.com/.+/(?:zip|tar)ball/v?((\d+\.)+\d+)$].match(spec_s)
<del> return m.captures.first unless m.nil?
<del>
<add> # GitHub tarballs
<add> # e.g. https://github.com/foo/bar/tarball/v1.2.3
<ide> # e.g. https://github.com/sam-github/libnet/tarball/libnet-1.1.4
<del> m = %r[github.com/.+/(?:zip|tar)ball/.*-((\d+\.)+\d+)$].match(spec_s)
<del> return m.captures.first unless m.nil?
<del>
<ide> # e.g. https://github.com/isaacs/npm/tarball/v0.2.5-1
<del> m = %r[github.com/.+/(?:zip|tar)ball/v?((\d+\.)+\d+-(\d+))$].match(spec_s)
<del> return m.captures.first unless m.nil?
<del>
<ide> # e.g. https://github.com/petdance/ack/tarball/1.93_02
<del> m = %r[github.com/.+/(?:zip|tar)ball/v?((\d+\.)+\d+_(\d+))$].match(spec_s)
<add> m = %r[github.com/.+/(?:zip|tar)ball/(?:v|\w+-)?((?:\d+[-._])+\d*)$].match(spec_s)
<ide> return m.captures.first unless m.nil?
<ide>
<ide> # e.g. https://github.com/erlang/otp/tarball/OTP_R15B01 (erlang style) | 1 |
Javascript | Javascript | fix missing bidi for extension | d0143cc2894307a7d0e652e190f9d2a67874dfb2 | <ide><path>src/bidi.js
<ide>
<ide> 'use strict';
<ide>
<del>var bidi = (function bidiClosure() {
<add>var bidi = PDFJS.bidi = (function bidiClosure() {
<ide> // Character types for symbols from 0000 to 00FF.
<ide> var baseTypes = [
<ide> 'BN', 'BN', 'BN', 'BN', 'BN', 'BN', 'BN', 'BN', 'BN', 'S', 'B', 'S', 'WS',
<ide><path>web/viewer.js
<ide> var TextLayerBuilder = function textLayerBuilder(textLayerDiv) {
<ide> textDiv.style.fontSize = fontHeight + 'px';
<ide> textDiv.style.left = text.geom.x + 'px';
<ide> textDiv.style.top = (text.geom.y - fontHeight) + 'px';
<del> textDiv.textContent = bidi(text, -1);
<add> textDiv.textContent = PDFJS.bidi(text, -1);
<ide> textDiv.dir = text.direction;
<ide> textDiv.dataset.textLength = text.length;
<ide> this.textDivs.push(textDiv); | 2 |
Text | Text | fix typo in docs | f76480a127cd653d981e1463fa6bafdcea0021c6 | <ide><path>docs/api-guide/permissions.md
<ide> The following third party packages are also available.
<ide>
<ide> ## DRF - Access Policy
<ide>
<del>The [Django REST - Access Policy][drf-access-policy] package provides a way to define complex access rules in declaritive policy classes that are attached to view sets or function-based views. The policies are defined in JSON in a format similer to AWS' Identity & Access Management policies.
<add>The [Django REST - Access Policy][drf-access-policy] package provides a way to define complex access rules in declaritive policy classes that are attached to view sets or function-based views. The policies are defined in JSON in a format similar to AWS' Identity & Access Management policies.
<ide>
<ide> ## Composed Permissions
<ide>
<ide> The [Django Rest Framework Role Filters][django-rest-framework-role-filters] pac
<ide> [djangorestframework-api-key]: https://github.com/florimondmanca/djangorestframework-api-key
<ide> [django-rest-framework-role-filters]: https://github.com/allisson/django-rest-framework-role-filters
<ide> [django-rest-framework-guardian]: https://github.com/rpkilby/django-rest-framework-guardian
<del>[drf-access-policy]: https://github.com/rsinger86/drf-access-policy
<ide>\ No newline at end of file
<add>[drf-access-policy]: https://github.com/rsinger86/drf-access-policy | 1 |
Javascript | Javascript | hide object.assign polyfill behind a module | 8210beeef4219d11b0a997f5a5abe7c348c01448 | <ide><path>jest/environment.js
<ide> __DEV__ = true;
<del>
<del>require.requireActual('../src/vendor/polyfill/Object.es6.js');
<ide><path>src/addons/transitions/ReactCSSTransitionGroup.js
<ide>
<ide> var React = require('React');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> var ReactTransitionGroup = React.createFactory(
<ide> require('ReactTransitionGroup')
<ide> );
<ide> var ReactCSSTransitionGroup = React.createClass({
<ide> render: function() {
<ide> return (
<ide> ReactTransitionGroup(
<del> Object.assign({}, this.props, {childFactory: this._wrapChild})
<add> assign({}, this.props, {childFactory: this._wrapChild})
<ide> )
<ide> );
<ide> }
<ide><path>src/addons/transitions/ReactTransitionGroup.js
<ide> var React = require('React');
<ide> var ReactTransitionChildMapping = require('ReactTransitionChildMapping');
<ide>
<add>var assign = require('Object.assign');
<ide> var cloneWithProps = require('cloneWithProps');
<ide> var emptyFunction = require('emptyFunction');
<ide>
<ide> var ReactTransitionGroup = React.createClass({
<ide> // This entered again before it fully left. Add it again.
<ide> this.performEnter(key);
<ide> } else {
<del> var newChildren = Object.assign({}, this.state.children);
<add> var newChildren = assign({}, this.state.children);
<ide> delete newChildren[key];
<ide> this.setState({children: newChildren});
<ide> }
<ide><path>src/addons/update.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var keyOf = require('keyOf');
<ide> var invariant = require('invariant');
<ide>
<ide> function shallowCopy(x) {
<ide> if (Array.isArray(x)) {
<ide> return x.concat();
<ide> } else if (x && typeof x === 'object') {
<del> return Object.assign(new x.constructor(), x);
<add> return assign(new x.constructor(), x);
<ide> } else {
<ide> return x;
<ide> }
<ide> function update(value, spec) {
<ide> COMMAND_MERGE,
<ide> nextValue
<ide> );
<del> Object.assign(nextValue, spec[COMMAND_MERGE]);
<add> assign(nextValue, spec[COMMAND_MERGE]);
<ide> }
<ide>
<ide> if (spec.hasOwnProperty(COMMAND_PUSH)) {
<ide><path>src/browser/ReactBrowserEventEmitter.js
<ide> var EventPluginRegistry = require('EventPluginRegistry');
<ide> var ReactEventEmitterMixin = require('ReactEventEmitterMixin');
<ide> var ViewportMetrics = require('ViewportMetrics');
<ide>
<add>var assign = require('Object.assign');
<ide> var isEventSupported = require('isEventSupported');
<ide>
<ide> /**
<ide> function getListeningForDocument(mountAt) {
<ide> *
<ide> * @internal
<ide> */
<del>var ReactBrowserEventEmitter = Object.assign({}, ReactEventEmitterMixin, {
<add>var ReactBrowserEventEmitter = assign({}, ReactEventEmitterMixin, {
<ide>
<ide> /**
<ide> * Injectable event backend
<ide><path>src/browser/ReactPutListenerQueue.js
<ide> var PooledClass = require('PooledClass');
<ide> var ReactBrowserEventEmitter = require('ReactBrowserEventEmitter');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> function ReactPutListenerQueue() {
<ide> this.listenersToPut = [];
<ide> }
<ide>
<del>Object.assign(ReactPutListenerQueue.prototype, {
<add>assign(ReactPutListenerQueue.prototype, {
<ide> enqueuePutListener: function(rootNodeID, propKey, propValue) {
<ide> this.listenersToPut.push({
<ide> rootNodeID: rootNodeID,
<ide><path>src/browser/ReactReconcileTransaction.js
<ide> var ReactInputSelection = require('ReactInputSelection');
<ide> var ReactPutListenerQueue = require('ReactPutListenerQueue');
<ide> var Transaction = require('Transaction');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> /**
<ide> * Ensures that, when possible, the selection range (currently selected text
<ide> * input) is not disturbed by performing the transaction.
<ide> var Mixin = {
<ide> };
<ide>
<ide>
<del>Object.assign(ReactReconcileTransaction.prototype, Transaction.Mixin, Mixin);
<add>assign(ReactReconcileTransaction.prototype, Transaction.Mixin, Mixin);
<ide>
<ide> PooledClass.addPoolingTo(ReactReconcileTransaction);
<ide>
<ide><path>src/browser/ReactTextComponent.js
<ide> var DOMPropertyOperations = require('DOMPropertyOperations');
<ide> var ReactComponent = require('ReactComponent');
<ide> var ReactElement = require('ReactElement');
<ide>
<add>var assign = require('Object.assign');
<ide> var escapeTextForBrowser = require('escapeTextForBrowser');
<ide>
<ide> /**
<ide> var ReactTextComponent = function(props) {
<ide> // This constructor and it's argument is currently used by mocks.
<ide> };
<ide>
<del>Object.assign(ReactTextComponent.prototype, ReactComponent.Mixin, {
<add>assign(ReactTextComponent.prototype, ReactComponent.Mixin, {
<ide>
<ide> /**
<ide> * Creates the markup for this text node. This node is not intended to have
<ide><path>src/browser/server/ReactServerRenderingTransaction.js
<ide> var CallbackQueue = require('CallbackQueue');
<ide> var ReactPutListenerQueue = require('ReactPutListenerQueue');
<ide> var Transaction = require('Transaction');
<ide>
<add>var assign = require('Object.assign');
<ide> var emptyFunction = require('emptyFunction');
<ide>
<ide> /**
<ide> var Mixin = {
<ide> };
<ide>
<ide>
<del>Object.assign(
<add>assign(
<ide> ReactServerRenderingTransaction.prototype,
<ide> Transaction.Mixin,
<ide> Mixin
<ide><path>src/browser/syntheticEvents/SyntheticEvent.js
<ide>
<ide> var PooledClass = require('PooledClass');
<ide>
<add>var assign = require('Object.assign');
<ide> var emptyFunction = require('emptyFunction');
<ide> var getEventTarget = require('getEventTarget');
<ide>
<ide> function SyntheticEvent(dispatchConfig, dispatchMarker, nativeEvent) {
<ide> this.isPropagationStopped = emptyFunction.thatReturnsFalse;
<ide> }
<ide>
<del>Object.assign(SyntheticEvent.prototype, {
<add>assign(SyntheticEvent.prototype, {
<ide>
<ide> preventDefault: function() {
<ide> this.defaultPrevented = true;
<ide> SyntheticEvent.augmentClass = function(Class, Interface) {
<ide> var Super = this;
<ide>
<ide> var prototype = Object.create(Super.prototype);
<del> Object.assign(prototype, Class.prototype);
<add> assign(prototype, Class.prototype);
<ide> Class.prototype = prototype;
<ide> Class.prototype.constructor = Class;
<ide>
<del> Class.Interface = Object.assign({}, Super.Interface, Interface);
<add> Class.Interface = assign({}, Super.Interface, Interface);
<ide> Class.augmentClass = Super.augmentClass;
<ide>
<ide> PooledClass.addPoolingTo(Class, PooledClass.threeArgumentPooler);
<ide><path>src/browser/ui/React.js
<ide>
<ide> "use strict";
<ide>
<del>// TODO: Move this elsewhere - it only exists in open source until a better
<del>// solution is found.
<del>require('Object.es6');
<del>
<ide> var DOMPropertyOperations = require('DOMPropertyOperations');
<ide> var EventPluginUtils = require('EventPluginUtils');
<ide> var ReactChildren = require('ReactChildren');
<ide> var ReactPropTypes = require('ReactPropTypes');
<ide> var ReactServerRendering = require('ReactServerRendering');
<ide> var ReactTextComponent = require('ReactTextComponent');
<ide>
<add>var assign = require('Object.assign');
<ide> var deprecated = require('deprecated');
<ide> var onlyChild = require('onlyChild');
<ide>
<ide> var React = {
<ide> isValidElement: ReactElement.isValidElement,
<ide> withContext: ReactContext.withContext,
<ide>
<add> // Hook for JSX spread, don't use this for anything else.
<add> __spread: assign,
<add>
<ide> // Deprecations (remove for 0.13)
<ide> renderComponent: deprecated(
<ide> 'React',
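Note: the `__spread: assign` hook added above is what the JSX transform (patched later in this commit) targets for spread attributes, emitting `React.__spread(...)` instead of a bare `Object.assign(...)`. A minimal sketch of the merge semantics, using a stand-in for the `Object.assign` module this commit introduces (the stand-in itself is illustrative, not React's exact internals):

```javascript
// Minimal stand-in for the assign module: copy own enumerable keys,
// later sources win, null/undefined sources are skipped.
function assign(target) {
  for (var i = 1; i < arguments.length; i++) {
    var source = arguments[i];
    if (source == null) {
      continue;
    }
    for (var key in source) {
      if (Object.prototype.hasOwnProperty.call(source, key)) {
        target[key] = source[key];
      }
    }
  }
  return target;
}

// The hook added by this commit:
var React = { __spread: assign };

// What the transform emits for `<Component {...x} y={2} z />`:
var x = { a: 1, y: 9 };
var props = React.__spread({}, x, { y: 2, z: true });
// Later sources win, so props is { a: 1, y: 2, z: true }.
console.log(props);
```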
<ide><path>src/browser/ui/ReactDOMComponent.js
<ide> var ReactMount = require('ReactMount');
<ide> var ReactMultiChild = require('ReactMultiChild');
<ide> var ReactPerf = require('ReactPerf');
<ide>
<add>var assign = require('Object.assign');
<ide> var escapeTextForBrowser = require('escapeTextForBrowser');
<ide> var invariant = require('invariant');
<ide> var isEventSupported = require('isEventSupported');
<ide> ReactDOMComponent.Mixin = {
<ide> } else {
<ide> if (propKey === STYLE) {
<ide> if (propValue) {
<del> propValue = props.style = Object.assign({}, props.style);
<add> propValue = props.style = assign({}, props.style);
<ide> }
<ide> propValue = CSSPropertyOperations.createMarkupForStyles(propValue);
<ide> }
<ide> ReactDOMComponent.Mixin = {
<ide> }
<ide> if (propKey === STYLE) {
<ide> if (nextProp) {
<del> nextProp = nextProps.style = Object.assign({}, nextProp);
<add> nextProp = nextProps.style = assign({}, nextProp);
<ide> }
<ide> if (lastProp) {
<ide> // Unset styles on `lastProp` but not on `nextProp`.
<ide> ReactDOMComponent.Mixin = {
<ide>
<ide> };
<ide>
<del>Object.assign(
<add>assign(
<ide> ReactDOMComponent.prototype,
<ide> ReactComponent.Mixin,
<ide> ReactDOMComponent.Mixin,
<ide><path>src/browser/ui/ReactEventListener.js
<ide> var ReactInstanceHandles = require('ReactInstanceHandles');
<ide> var ReactMount = require('ReactMount');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<ide> var getEventTarget = require('getEventTarget');
<ide> var getUnboundedScrollPosition = require('getUnboundedScrollPosition');
<ide>
<ide> function TopLevelCallbackBookKeeping(topLevelType, nativeEvent) {
<ide> this.nativeEvent = nativeEvent;
<ide> this.ancestors = [];
<ide> }
<del>Object.assign(TopLevelCallbackBookKeeping.prototype, {
<add>assign(TopLevelCallbackBookKeeping.prototype, {
<ide> destructor: function() {
<ide> this.topLevelType = null;
<ide> this.nativeEvent = null;
<ide><path>src/browser/ui/__tests__/ReactDOMComponent-test.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var mocks = require('mocks');
<ide>
<ide> describe('ReactDOMComponent', function() {
<ide> describe('ReactDOMComponent', function() {
<ide> this.props = initialProps || {};
<ide> this._rootNodeID = 'test';
<ide> };
<del> Object.assign(NodeStub.prototype, ReactDOMComponent.Mixin);
<add> assign(NodeStub.prototype, ReactDOMComponent.Mixin);
<ide>
<ide> genMarkup = function(props) {
<ide> var transaction = new ReactReconcileTransaction();
<ide> describe('ReactDOMComponent', function() {
<ide> this.props = initialProps || {};
<ide> this._rootNodeID = 'test';
<ide> };
<del> Object.assign(NodeStub.prototype, ReactDOMComponent.Mixin);
<add> assign(NodeStub.prototype, ReactDOMComponent.Mixin);
<ide>
<ide> genMarkup = function(props) {
<ide> var transaction = new ReactReconcileTransaction();
<ide> describe('ReactDOMComponent', function() {
<ide> var StubNativeComponent = function(element) {
<ide> ReactComponent.Mixin.construct.call(this, element);
<ide> };
<del> Object.assign(StubNativeComponent.prototype, ReactComponent.Mixin);
<del> Object.assign(StubNativeComponent.prototype, ReactDOMComponent.Mixin);
<del> Object.assign(StubNativeComponent.prototype, ReactMultiChild.Mixin);
<add> assign(StubNativeComponent.prototype, ReactComponent.Mixin);
<add> assign(StubNativeComponent.prototype, ReactDOMComponent.Mixin);
<add> assign(StubNativeComponent.prototype, ReactMultiChild.Mixin);
<ide>
<ide> mountComponent = function(props) {
<ide> var transaction = new ReactReconcileTransaction();
<ide><path>src/browser/ui/dom/components/ReactDOMInput.js
<ide> var ReactDOM = require('ReactDOM');
<ide> var ReactMount = require('ReactMount');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide>
<ide> // Store a reference to the <input> `ReactDOMComponent`. TODO: use string
<ide> var ReactDOMInput = ReactCompositeComponent.createClass({
<ide>
<ide> render: function() {
<ide> // Clone `this.props` so we don't mutate the input.
<del> var props = Object.assign({}, this.props);
<add> var props = assign({}, this.props);
<ide>
<ide> props.defaultChecked = null;
<ide> props.defaultValue = null;
<ide><path>src/browser/ui/dom/components/ReactDOMSelect.js
<ide> var ReactElement = require('ReactElement');
<ide> var ReactDOM = require('ReactDOM');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> // Store a reference to the <select> `ReactDOMComponent`. TODO: use string
<ide> var select = ReactElement.createFactory(ReactDOM.select.type);
<ide>
<ide> var ReactDOMSelect = ReactCompositeComponent.createClass({
<ide>
<ide> render: function() {
<ide> // Clone `this.props` so we don't mutate the input.
<del> var props = Object.assign({}, this.props);
<add> var props = assign({}, this.props);
<ide>
<ide> props.onChange = this._handleChange;
<ide> props.value = null;
<ide><path>src/browser/ui/dom/components/ReactDOMTextarea.js
<ide> var ReactElement = require('ReactElement');
<ide> var ReactDOM = require('ReactDOM');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide>
<ide> var warning = require('warning');
<ide> var ReactDOMTextarea = ReactCompositeComponent.createClass({
<ide>
<ide> render: function() {
<ide> // Clone `this.props` so we don't mutate the input.
<del> var props = Object.assign({}, this.props);
<add> var props = assign({}, this.props);
<ide>
<ide> invariant(
<ide> props.dangerouslySetInnerHTML == null,
<ide><path>src/core/ReactComponent.js
<ide> var ReactElement = require('ReactElement');
<ide> var ReactOwner = require('ReactOwner');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide> var keyMirror = require('keyMirror');
<ide>
<ide> var ReactComponent = {
<ide> // element props.
<ide> var element = this._pendingElement || this._currentElement;
<ide> this.replaceProps(
<del> Object.assign({}, element.props, partialProps),
<add> assign({}, element.props, partialProps),
<ide> callback
<ide> );
<ide> },
<ide> var ReactComponent = {
<ide> var element = this._pendingElement || this._currentElement;
<ide> this._pendingElement = ReactElement.cloneAndReplaceProps(
<ide> element,
<del> Object.assign({}, element.props, partialProps)
<add> assign({}, element.props, partialProps)
<ide> );
<ide> ReactUpdates.enqueueUpdate(this, callback);
<ide> },
<ide><path>src/core/ReactCompositeComponent.js
<ide> var ReactPropTypeLocations = require('ReactPropTypeLocations');
<ide> var ReactPropTypeLocationNames = require('ReactPropTypeLocationNames');
<ide> var ReactUpdates = require('ReactUpdates');
<ide>
<add>var assign = require('Object.assign');
<ide> var instantiateReactComponent = require('instantiateReactComponent');
<ide> var invariant = require('invariant');
<ide> var keyMirror = require('keyMirror');
<ide> var RESERVED_SPEC_KEYS = {
<ide> childContextTypes,
<ide> ReactPropTypeLocations.childContext
<ide> );
<del> Constructor.childContextTypes = Object.assign(
<add> Constructor.childContextTypes = assign(
<ide> {},
<ide> Constructor.childContextTypes,
<ide> childContextTypes
<ide> var RESERVED_SPEC_KEYS = {
<ide> contextTypes,
<ide> ReactPropTypeLocations.context
<ide> );
<del> Constructor.contextTypes = Object.assign(
<add> Constructor.contextTypes = assign(
<ide> {},
<ide> Constructor.contextTypes,
<ide> contextTypes
<ide> var RESERVED_SPEC_KEYS = {
<ide> propTypes,
<ide> ReactPropTypeLocations.prop
<ide> );
<del> Constructor.propTypes = Object.assign(
<add> Constructor.propTypes = assign(
<ide> {},
<ide> Constructor.propTypes,
<ide> propTypes
<ide> var ReactCompositeComponentMixin = {
<ide> }
<ide> // Merge with `_pendingState` if it exists, otherwise with existing state.
<ide> this.replaceState(
<del> Object.assign({}, this._pendingState || this.state, partialState),
<add> assign({}, this._pendingState || this.state, partialState),
<ide> callback
<ide> );
<ide> },
<ide> var ReactCompositeComponentMixin = {
<ide> name
<ide> );
<ide> }
<del> return Object.assign({}, currentContext, childContext);
<add> return assign({}, currentContext, childContext);
<ide> }
<ide> return currentContext;
<ide> },
<ide> var ReactCompositeComponentMixin = {
<ide> };
<ide>
<ide> var ReactCompositeComponentBase = function() {};
<del>Object.assign(
<add>assign(
<ide> ReactCompositeComponentBase.prototype,
<ide> ReactComponent.Mixin,
<ide> ReactOwner.Mixin,
<ide><path>src/core/ReactContext.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> /**
<ide> * Keeps track of the current context.
<ide> *
<ide> var ReactContext = {
<ide> withContext: function(newContext, scopedCallback) {
<ide> var result;
<ide> var previousContext = ReactContext.current;
<del> ReactContext.current = Object.assign({}, previousContext, newContext);
<add> ReactContext.current = assign({}, previousContext, newContext);
<ide> try {
<ide> result = scopedCallback();
<ide> } finally {
<ide><path>src/core/ReactDefaultBatchingStrategy.js
<ide> var ReactUpdates = require('ReactUpdates');
<ide> var Transaction = require('Transaction');
<ide>
<add>var assign = require('Object.assign');
<ide> var emptyFunction = require('emptyFunction');
<ide>
<ide> var RESET_BATCHED_UPDATES = {
<ide> function ReactDefaultBatchingStrategyTransaction() {
<ide> this.reinitializeTransaction();
<ide> }
<ide>
<del>Object.assign(
<add>assign(
<ide> ReactDefaultBatchingStrategyTransaction.prototype,
<ide> Transaction.Mixin,
<ide> {
<ide><path>src/core/ReactNativeComponent.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide>
<ide> var genericComponentClass = null;
<ide> var ReactNativeComponentInjection = {
<ide> // This accepts a keyed object with classes as values. Each key represents a
<ide> // tag. That particular tag will use this class instead of the generic one.
<ide> injectComponentClasses: function(componentClasses) {
<del> Object.assign(tagToComponentClass, componentClasses);
<add> assign(tagToComponentClass, componentClasses);
<ide> }
<ide> };
<ide>
<ide><path>src/core/ReactPropTransferer.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var emptyFunction = require('emptyFunction');
<ide> var invariant = require('invariant');
<ide> var joinClasses = require('joinClasses');
<ide> var transferStrategyMerge = createTransferStrategy(function(a, b) {
<ide> // `merge` overrides the first object's (`props[key]` above) keys using the
<ide> // second object's (`value`) keys. An object's style's existing `propA` would
<ide> // get overridden. Flip the order here.
<del> return Object.assign({}, b, a);
<add> return assign({}, b, a);
<ide> });
<ide>
<ide> /**
<ide> var ReactPropTransferer = {
<ide> * @return {object} a new object containing both sets of props merged.
<ide> */
<ide> mergeProps: function(oldProps, newProps) {
<del> return transferInto(Object.assign({}, oldProps), newProps);
<add> return transferInto(assign({}, oldProps), newProps);
<ide> },
<ide>
<ide> /**
<ide><path>src/core/ReactUpdates.js
<ide> var ReactCurrentOwner = require('ReactCurrentOwner');
<ide> var ReactPerf = require('ReactPerf');
<ide> var Transaction = require('Transaction');
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide> var warning = require('warning');
<ide>
<ide> function ReactUpdatesFlushTransaction() {
<ide> ReactUpdates.ReactReconcileTransaction.getPooled();
<ide> }
<ide>
<del>Object.assign(
<add>assign(
<ide> ReactUpdatesFlushTransaction.prototype,
<ide> Transaction.Mixin, {
<ide> getTransactionWrappers: function() {
<ide><path>src/event/__tests__/EventPluginRegistry-test.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> describe('EventPluginRegistry', function() {
<ide> var EventPluginRegistry;
<ide> var createPlugin;
<ide> describe('EventPluginRegistry', function() {
<ide> EventPluginRegistry._resetEventPlugins();
<ide>
<ide> createPlugin = function(properties) {
<del> return Object.assign({extractEvents: function() {}}, properties);
<add> return assign({extractEvents: function() {}}, properties);
<ide> };
<ide> });
<ide>
<ide><path>src/test/ReactDefaultPerfAnalysis.js
<ide> * @providesModule ReactDefaultPerfAnalysis
<ide> */
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> // Don't try to save users less than 1.2ms (a number I made up)
<ide> var DONT_CARE_THRESHOLD = 1.2;
<ide> var DOM_OPERATION_TYPES = {
<ide> function getExclusiveSummary(measurements) {
<ide>
<ide> for (var i = 0; i < measurements.length; i++) {
<ide> var measurement = measurements[i];
<del> var allIDs = Object.assign(
<add> var allIDs = assign(
<ide> {},
<ide> measurement.exclusive,
<ide> measurement.inclusive
<ide> function getInclusiveSummary(measurements, onlyClean) {
<ide>
<ide> for (var i = 0; i < measurements.length; i++) {
<ide> var measurement = measurements[i];
<del> var allIDs = Object.assign(
<add> var allIDs = assign(
<ide> {},
<ide> measurement.exclusive,
<ide> measurement.inclusive
<ide> function getUnchangedComponents(measurement) {
<ide> // the amount of time it took to render the entire subtree.
<ide> var cleanComponents = {};
<ide> var dirtyLeafIDs = Object.keys(measurement.writes);
<del> var allIDs = Object.assign({}, measurement.exclusive, measurement.inclusive);
<add> var allIDs = assign({}, measurement.exclusive, measurement.inclusive);
<ide>
<ide> for (var id in allIDs) {
<ide> var isDirty = false;
<ide><path>src/test/ReactTestUtils.js
<ide> var ReactTextComponent = require('ReactTextComponent');
<ide> var ReactUpdates = require('ReactUpdates');
<ide> var SyntheticEvent = require('SyntheticEvent');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> var topLevelTypes = EventConstants.topLevelTypes;
<ide>
<ide> function Event(suffix) {}
<ide> function makeSimulator(eventType) {
<ide> ReactMount.getID(node),
<ide> fakeNativeEvent
<ide> );
<del> Object.assign(event, eventData);
<add> assign(event, eventData);
<ide> EventPropagators.accumulateTwoPhaseDispatches(event);
<ide>
<ide> ReactUpdates.batchedUpdates(function() {
<ide> buildSimulators();
<ide> function makeNativeSimulator(eventType) {
<ide> return function(domComponentOrNode, nativeEventData) {
<ide> var fakeNativeEvent = new Event(eventType);
<del> Object.assign(fakeNativeEvent, nativeEventData);
<add> assign(fakeNativeEvent, nativeEventData);
<ide> if (ReactTestUtils.isDOMComponent(domComponentOrNode)) {
<ide> ReactTestUtils.simulateNativeEventOnDOMComponent(
<ide> eventType,
<ide><path>src/test/reactComponentExpect.js
<ide>
<ide> var ReactTestUtils = require('ReactTestUtils');
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> function reactComponentExpect(instance) {
<ide> if (instance instanceof reactComponentExpect) {
<ide> return instance;
<ide> function reactComponentExpect(instance) {
<ide> expect(ReactTestUtils.isElement(instance)).toBe(false);
<ide> }
<ide>
<del>Object.assign(reactComponentExpect.prototype, {
<add>assign(reactComponentExpect.prototype, {
<ide> // Getters -------------------------------------------------------------------
<ide>
<ide> /**
<ide><path>src/utils/CallbackQueue.js
<ide>
<ide> var PooledClass = require('PooledClass');
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide>
<ide> /**
<ide> function CallbackQueue() {
<ide> this._contexts = null;
<ide> }
<ide>
<del>Object.assign(CallbackQueue.prototype, {
<add>assign(CallbackQueue.prototype, {
<ide>
<ide> /**
<ide> * Enqueues a callback to be invoked when `notifyAll` is invoked.
<ide><path>src/utils/LegacyImmutableObject.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide> var isNode = require('isNode');
<ide> var mergeHelpers = require('mergeHelpers');
<ide> if (__DEV__) {
<ide> * @constructor
<ide> */
<ide> LegacyImmutableObject = function LegacyImmutableObject(initialProperties) {
<del> Object.assign(this, initialProperties);
<add> assign(this, initialProperties);
<ide> deepFreeze(this);
<ide> };
<ide>
<ide> if (__DEV__) {
<ide> */
<ide> LegacyImmutableObject.set = function(immutableObject, put) {
<ide> assertLegacyImmutableObject(immutableObject);
<del> var totalNewFields = Object.assign({}, immutableObject, put);
<add> var totalNewFields = assign({}, immutableObject, put);
<ide> return new LegacyImmutableObject(totalNewFields);
<ide> };
<ide>
<ide> if (__DEV__) {
<ide> * @constructor
<ide> */
<ide> LegacyImmutableObject = function LegacyImmutableObject(initialProperties) {
<del> Object.assign(this, initialProperties);
<add> assign(this, initialProperties);
<ide> };
<ide>
<ide> /**
<ide> if (__DEV__) {
<ide> LegacyImmutableObject.set = function(immutableObject, put) {
<ide> assertLegacyImmutableObject(immutableObject);
<ide> var newMap = new LegacyImmutableObject(immutableObject);
<del> Object.assign(newMap, put);
<add> assign(newMap, put);
<ide> return newMap;
<ide> };
<ide> }
<ide><path>src/utils/OrderedMap.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide>
<ide> var PREFIX = 'key:';
<ide> var OrderedMapMethods = {
<ide> }
<ide> };
<ide>
<del>Object.assign(OrderedMapImpl.prototype, OrderedMapMethods);
<add>assign(OrderedMapImpl.prototype, OrderedMapMethods);
<ide>
<ide> var OrderedMap = {
<ide> from: function(orderedMap) {
<ide><path>src/utils/__tests__/Transaction-test.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> var Transaction;
<ide>
<ide> var INIT_ERRORED = 'initErrored'; // Just a dummy value to check for.
<ide> describe('Transaction', function() {
<ide> this.secondCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> this.lastCloseParam = INIT_ERRORED; // WON'T be set to something else
<ide> };
<del> Object.assign(TestTransaction.prototype, Transaction.Mixin);
<add> assign(TestTransaction.prototype, Transaction.Mixin);
<ide> TestTransaction.prototype.getTransactionWrappers = function() {
<ide> return [
<ide> {
<ide> describe('Transaction', function() {
<ide> this.secondCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> this.lastCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> };
<del> Object.assign(TestTransaction.prototype, Transaction.Mixin);
<add> assign(TestTransaction.prototype, Transaction.Mixin);
<ide> TestTransaction.prototype.getTransactionWrappers = function() {
<ide> return [
<ide> {
<ide> describe('Transaction', function() {
<ide> this.secondCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> this.lastCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> };
<del> Object.assign(TestTransaction.prototype, Transaction.Mixin);
<add> assign(TestTransaction.prototype, Transaction.Mixin);
<ide> // Now, none of the close/inits throw, but the operation we wrap will throw.
<ide> TestTransaction.prototype.getTransactionWrappers = function() {
<ide> return [
<ide> describe('Transaction', function() {
<ide> var TestTransaction = function() {
<ide> this.reinitializeTransaction();
<ide> };
<del> Object.assign(TestTransaction.prototype, Transaction.Mixin);
<add> assign(TestTransaction.prototype, Transaction.Mixin);
<ide> var exceptionMsg = 'This exception should throw.';
<ide> TestTransaction.prototype.getTransactionWrappers = function() {
<ide> return [
<ide> describe('Transaction', function() {
<ide> this.reinitializeTransaction();
<ide> this.firstCloseParam = INIT_ERRORED; // WILL be set to something else
<ide> };
<del> Object.assign(TestTransaction.prototype, Transaction.Mixin);
<add> assign(TestTransaction.prototype, Transaction.Mixin);
<ide> TestTransaction.prototype.getTransactionWrappers = function() {
<ide> return [
<ide> {
<ide> describe('Transaction', function() {
<ide> var NestedTransaction = function() {
<ide> this.reinitializeTransaction();
<ide> };
<del> Object.assign(NestedTransaction.prototype, Transaction.Mixin);
<add> assign(NestedTransaction.prototype, Transaction.Mixin);
<ide> NestedTransaction.prototype.getTransactionWrappers = function() {
<ide> return [{
<ide> initialize: function() {
<ide><path>src/utils/deprecated.js
<ide> * @providesModule deprecated
<ide> */
<ide>
<add>var assign = require('Object.assign');
<ide> var warning = require('warning');
<ide>
<ide> /**
<ide> function deprecated(namespace, oldName, newName, ctx, fn) {
<ide> newFn.displayName = `${namespace}_${oldName}`;
<ide> // We need to make sure all properties of the original fn are copied over.
<ide> // In particular, this is needed to support PropTypes
<del> return Object.assign(newFn, fn);
<add> return assign(newFn, fn);
<ide> }
<ide>
<ide> return fn;
<ide><path>src/vendor/core/emptyFunction.js
<ide> function makeEmptyFunction(arg) {
<ide> */
<ide> function emptyFunction() {}
<ide>
<del>Object.assign(emptyFunction, {
<del> thatReturns: makeEmptyFunction,
<del> thatReturnsFalse: makeEmptyFunction(false),
<del> thatReturnsTrue: makeEmptyFunction(true),
<del> thatReturnsNull: makeEmptyFunction(null),
<del> thatReturnsThis: function() { return this; },
<del> thatReturnsArgument: function(arg) { return arg; }
<del>});
<add>emptyFunction.thatReturns = makeEmptyFunction;
<add>emptyFunction.thatReturnsFalse = makeEmptyFunction(false);
<add>emptyFunction.thatReturnsTrue = makeEmptyFunction(true);
<add>emptyFunction.thatReturnsNull = makeEmptyFunction(null);
<add>emptyFunction.thatReturnsThis = function() { return this; };
<add>emptyFunction.thatReturnsArgument = function(arg) { return arg; };
<ide>
<ide> module.exports = emptyFunction;
<ide><path>src/vendor/immutable/Immutable.js
<ide> * @providesModule Immutable
<ide> */
<ide>
<add>var assign = require('Object.assign');
<ide> var invariant = require('invariant');
<ide> var isNode = require('isNode');
<ide> var keyOf = require('keyOf');
<ide> class Immutable {
<ide> static mergeAllPropertiesInto(destination, propertyObjects) {
<ide> var argLength = propertyObjects.length;
<ide> for (var i = 0; i < argLength; i++) {
<del> Object.assign(destination, propertyObjects[i]);
<add> assign(destination, propertyObjects[i]);
<ide> }
<ide> }
<ide>
<ide><path>src/vendor/polyfill/Object.es6.js
<del>/**
<del> * Copyright 2014, Facebook, Inc.
<del> * All rights reserved.
<del> *
<del> * This source code is licensed under the BSD-style license found in the
<del> * LICENSE file in the root directory of this source tree. An additional grant
<del> * of patent rights can be found in the PATENTS file in the same directory.
<del> *
<del> * @providesModule Object.es6
<del> * @polyfill
<del> */
<del>
<del>// https://people.mozilla.org/~jorendorff/es6-draft.html#sec-object.assign
<del>
<del>if (!Object.assign) {
<del> Object.assign = function(target, sources) {
<del> if (target == null) {
<del> throw new TypeError('Object.assign target cannot be null or undefined');
<del> }
<del>
<del> var to = Object(target);
<del> var hasOwnProperty = Object.prototype.hasOwnProperty;
<del>
<del> for (var nextIndex = 1; nextIndex < arguments.length; nextIndex++) {
<del> var nextSource = arguments[nextIndex];
<del> if (nextSource == null) {
<del> continue;
<del> }
<del>
<del> var from = Object(nextSource);
<del>
<del> // We don't currently support accessors nor proxies. Therefore this
<del> // copy cannot throw. If we ever supported this then we must handle
<del> // exceptions and side-effects.
<del>
<del> for (var key in from) {
<del> if (hasOwnProperty.call(from, key)) {
<del> to[key] = from[key];
<del> }
<del> }
<del> }
<del>
<del> return to;
<del> };
<del>}
<ide><path>src/vendor/stubs/Object.assign.js
<add>/**
<add> * Copyright 2014, Facebook, Inc.
<add> * All rights reserved.
<add> *
<add> * This source code is licensed under the BSD-style license found in the
<add> * LICENSE file in the root directory of this source tree. An additional grant
<add> * of patent rights can be found in the PATENTS file in the same directory.
<add> *
<add> * @providesModule Object.assign
<add> */
<add>
<add>// This is an optimized version that fails on hasOwnProperty checks
<add>// and non objects. It's not spec-compliant. It's a perf optimization.
<add>
<add>var hasOwnProperty = Object.prototype.hasOwnProperty;
<add>
<add>function assign(target, sources) {
<add> if (__DEV__) {
<add> if (target == null) {
<add> throw new TypeError('Object.assign target cannot be null or undefined');
<add> }
<add> if (typeof target !== 'object' && typeof target !== 'function') {
<add> throw new TypeError(
<add> 'In this environment the target of assign MUST be an object. ' +
<add> 'This error is a performance optimization and not spec compliant.'
<add> );
<add> }
<add> }
<add>
<add> for (var nextIndex = 1; nextIndex < arguments.length; nextIndex++) {
<add> var nextSource = arguments[nextIndex];
<add> if (nextSource == null) {
<add> continue;
<add> }
<add>
<add> // We don't currently support accessors nor proxies. Therefore this
<add> // copy cannot throw. If we ever supported this then we must handle
<add> // exceptions and side-effects.
<add>
<add> for (var key in nextSource) {
<add> if (__DEV__) {
<add> if (!hasOwnProperty.call(nextSource, key)) {
<add> throw new TypeError(
<add> 'One of the sources to assign has an enumerable key on the ' +
<add> 'prototype chain. This is an edge case that we do not support. ' +
<add> 'This error is a performance optimization and not spec compliant.'
<add> );
<add> }
<add> }
<add> target[key] = nextSource[key];
<add> }
<add> }
<add>
<add> return target;
<add>};
<add>
<add>module.exports = assign;
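Note: unlike the spec-style `Object.es6` polyfill deleted above, this stub deliberately throws in `__DEV__` on non-object targets and on inherited enumerable source keys, trading spec compliance for a cheaper hot path. A sketch of that contract, with `__DEV__` pinned to `true` (in the real codebase it is a build-time constant):

```javascript
var __DEV__ = true; // build-time flag in React's toolchain; pinned for this sketch

var hasOwnProperty = Object.prototype.hasOwnProperty;

// Same logic as the stub above, with the two __DEV__ guards inlined.
function assign(target, sources) {
  if (__DEV__) {
    if (target == null) {
      throw new TypeError('Object.assign target cannot be null or undefined');
    }
    if (typeof target !== 'object' && typeof target !== 'function') {
      throw new TypeError('In this environment the target of assign MUST be an object.');
    }
  }
  for (var nextIndex = 1; nextIndex < arguments.length; nextIndex++) {
    var nextSource = arguments[nextIndex];
    if (nextSource == null) {
      continue;
    }
    for (var key in nextSource) {
      if (__DEV__ && !hasOwnProperty.call(nextSource, key)) {
        throw new TypeError(
          'One of the sources to assign has an enumerable key on the prototype chain.'
        );
      }
      target[key] = nextSource[key];
    }
  }
  return target;
}

// A plain own-keys merge works as with the spec version:
var merged = assign({ a: 1 }, { b: 2 }); // → { a: 1, b: 2 }

// But a source with inherited enumerable keys is rejected in __DEV__:
var proto = { inherited: true };
var child = Object.create(proto);
// assign({}, child) would throw a TypeError here, where the spec
// Object.assign would silently copy nothing.
```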
<ide><path>src/vendor_deprecated/core/merge.js
<ide>
<ide> "use strict";
<ide>
<add>var assign = require('Object.assign');
<add>
<ide> /**
<ide> * Shallow merges two structures into a return value, without mutating either.
<ide> *
<ide> * @return {object} The shallow extension of one by two.
<ide> */
<ide> var merge = function(one, two) {
<del> return Object.assign({}, one, two);
<add> return assign({}, one, two);
<ide> };
<ide>
<ide> module.exports = merge;
<ide><path>src/vendor_deprecated/core/mergeInto.js
<ide>
<ide> "use strict";
<ide>
<del>module.exports = Object.assign;
<add>var assign = require('Object.assign');
<add>
<add>module.exports = assign;
<ide>
<ide> // deprecation notice
<ide> console.warn(
<ide><path>vendor/fbtransform/transforms/__tests__/react-test.js
<ide> describe('react jsx', function() {
<ide> var Component = jest.genMockFunction();
<ide> var Child = jest.genMockFunction();
<ide> var objectAssignMock = jest.genMockFunction();
<del> Object.assign = objectAssignMock;
<add> React.__spread = objectAssignMock;
<ide> eval(transform(code).code);
<ide> return expect(objectAssignMock);
<ide> }
<ide> describe('react jsx', function() {
<ide> expect(() => transform(code)).toThrow();
<ide> });
<ide>
<del> it('wraps props in Object.assign for spread attributes', function() {
<add> it('wraps props in React.__spread for spread attributes', function() {
<ide> var code =
<ide> '<Component { ... x } y\n' +
<ide> '={2 } z />';
<ide> var result =
<del> 'React.createElement(Component, Object.assign({}, x , {y: \n' +
<add> 'React.createElement(Component, React.__spread({}, x , {y: \n' +
<ide> '2, z: true}))';
<ide>
<ide> expect(transform(code).code).toBe(result);
<ide> describe('react jsx', function() {
<ide> ' {...this.props}\n' +
<ide> ' sound="moo" />';
<ide> var result =
<del> 'React.createElement(Component, Object.assign({}, \n' +
<add> 'React.createElement(Component, React.__spread({}, \n' +
<ide> ' this.props, \n' +
<ide> ' {sound: "moo"}))';
<ide>
<ide> describe('react jsx', function() {
<ide> expect(transform(code).code).toBe(result);
<ide> });
<ide>
<del> it('does not call Object.assign when there are no spreads', function() {
<add> it('does not call React.__spread when there are no spreads', function() {
<ide> expectObjectAssign(
<ide> '<Component x={y} />'
<ide> ).not.toBeCalled();
<ide> describe('react jsx', function() {
<ide> ).toBeCalledWith({x: 1}, y);
<ide> });
<ide>
<del> it('passes the same value multiple times to Object.assign', function() {
<add> it('passes the same value multiple times to React.__spread', function() {
<ide> expectObjectAssign(
<ide> '<Component x={1} y="2" {...z} {...z}><Child /></Component>'
<ide> ).toBeCalledWith({x: 1, y: "2"}, z, z);
<ide> });
<ide>
<del> it('evaluates sequences before passing them to Object.assign', function() {
<add> it('evaluates sequences before passing them to React.__spread', function() {
<ide> expectObjectAssign(
<ide> '<Component x="1" {...(z = { y: 2 }, z)} z={3}>Text</Component>'
<ide> ).toBeCalledWith({x: "1"}, { y: 2 }, {z: 3});
<ide><path>vendor/fbtransform/transforms/react.js
<ide> function visitReactTag(traverse, object, path, state) {
<ide>
<ide> // if we don't have any attributes, pass in null
<ide> if (hasAtLeastOneSpreadProperty) {
<del> utils.append('Object.assign({', state);
<add> utils.append('React.__spread({', state);
<ide> } else if (hasAttributes) {
<ide> utils.append('{', state);
<ide> } else { | 41 |
Javascript | Javascript | remove workaround for old ff bug | a03b75c6a812fcc2f616fc05c0f1710e03fca8e9 | <ide><path>src/ng/browser.js
<ide> function Browser(window, document, $log, $sniffer) {
<ide> // - pendingLocation is needed as browsers don't allow to read out
<ide> // the new location.href if a reload happened or if there is a bug like in iOS 9 (see
<ide> // https://openradar.appspot.com/22186109).
<del> // - the replacement is a workaround for https://bugzilla.mozilla.org/show_bug.cgi?id=407172
<del> return pendingLocation || location.href.replace(/%27/g,'\'');
<add> return pendingLocation || location.href;
<ide> }
<ide> };
<ide>
<ide><path>test/ng/browserSpecs.js
<ide> describe('browser', function() {
<ide> expect(browser.url('http://any.com', true, state).url('http://any.com', true, state)).toBe(browser);
<ide> });
<ide>
<del> it('should decode single quotes to work around FF bug 407273', function() {
<del> fakeWindow.location.href = 'http://ff-bug/?single%27quote';
<del> expect(browser.url()).toBe('http://ff-bug/?single\'quote');
<del> });
<del>
<ide> it('should not set URL when the URL is already set', function() {
<ide> var current = fakeWindow.location.href;
<ide> sniffer.history = false; | 2 |
Text | Text | fix linting of markdown documentation | 83b4fd150910917d4f91d8f743df0cc4d3ca5696 | <ide><path>docs/basic-features/data-fetching.md
<ide> export async function getStaticPaths() {
<ide>
<ide> // Get the paths we want to pre-render based on posts
<ide> const paths = posts.map(post => ({
<del> params: {id: post.id}
<add> params: { id: post.id },
<ide> }))
<ide>
<ide> // We'll pre-render only these paths at build time. | 1 |
Javascript | Javascript | add api to force scheduler to yield for macrotask | ca990e9a753597768765e71c98fe650789c9ee8b | <ide><path>packages/scheduler/npm/umd/scheduler.development.js
<ide> );
<ide> }
<ide>
<add> function unstable_requestYield() {
<add> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_requestYield.apply(
<add> this,
<add> arguments
<add> );
<add> }
<add>
<ide> function unstable_runWithPriority() {
<ide> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_runWithPriority.apply(
<ide> this,
<ide> unstable_cancelCallback: unstable_cancelCallback,
<ide> unstable_shouldYield: unstable_shouldYield,
<ide> unstable_requestPaint: unstable_requestPaint,
<add> unstable_requestYield: unstable_requestYield,
<ide> unstable_runWithPriority: unstable_runWithPriority,
<ide> unstable_next: unstable_next,
<ide> unstable_wrapCallback: unstable_wrapCallback,
<ide><path>packages/scheduler/npm/umd/scheduler.production.min.js
<ide> );
<ide> }
<ide>
<add> function unstable_requestYield() {
<add> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_requestYield.apply(
<add> this,
<add> arguments
<add> );
<add> }
<add>
<ide> function unstable_runWithPriority() {
<ide> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_runWithPriority.apply(
<ide> this,
<ide> unstable_cancelCallback: unstable_cancelCallback,
<ide> unstable_shouldYield: unstable_shouldYield,
<ide> unstable_requestPaint: unstable_requestPaint,
<add> unstable_requestYield: unstable_requestYield,
<ide> unstable_runWithPriority: unstable_runWithPriority,
<ide> unstable_next: unstable_next,
<ide> unstable_wrapCallback: unstable_wrapCallback,
<ide><path>packages/scheduler/npm/umd/scheduler.profiling.min.js
<ide> );
<ide> }
<ide>
<add> function unstable_requestYield() {
<add> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_requestYield.apply(
<add> this,
<add> arguments
<add> );
<add> }
<add>
<ide> function unstable_runWithPriority() {
<ide> return global.React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.Scheduler.unstable_runWithPriority.apply(
<ide> this,
<ide> unstable_cancelCallback: unstable_cancelCallback,
<ide> unstable_shouldYield: unstable_shouldYield,
<ide> unstable_requestPaint: unstable_requestPaint,
<add> unstable_requestYield: unstable_requestYield,
<ide> unstable_runWithPriority: unstable_runWithPriority,
<ide> unstable_next: unstable_next,
<ide> unstable_wrapCallback: unstable_wrapCallback,
<ide><path>packages/scheduler/src/__tests__/Scheduler-test.js
<ide> let performance;
<ide> let cancelCallback;
<ide> let scheduleCallback;
<ide> let requestPaint;
<add>let requestYield;
<add>let shouldYield;
<ide> let NormalPriority;
<ide>
<ide> // The Scheduler implementation uses browser APIs like `MessageChannel` and
<ide> describe('SchedulerBrowser', () => {
<ide> scheduleCallback = Scheduler.unstable_scheduleCallback;
<ide> NormalPriority = Scheduler.unstable_NormalPriority;
<ide> requestPaint = Scheduler.unstable_requestPaint;
<add> requestYield = Scheduler.unstable_requestYield;
<add> shouldYield = Scheduler.unstable_shouldYield;
<ide> });
<ide>
<ide> afterEach(() => {
<ide> describe('SchedulerBrowser', () => {
<ide> 'Yield at 5ms',
<ide> ]);
<ide> });
<add>
<add> it('requestYield forces a yield immediately', () => {
<add> scheduleCallback(NormalPriority, () => {
<add> runtime.log('Original Task');
<add> runtime.log('shouldYield: ' + shouldYield());
<add> runtime.log('requestYield');
<add> requestYield();
<add> runtime.log('shouldYield: ' + shouldYield());
<add> return () => {
<add> runtime.log('Continuation Task');
<add> runtime.log('shouldYield: ' + shouldYield());
<add> runtime.log('Advance time past frame deadline');
<add> runtime.advanceTime(10000);
<add> runtime.log('shouldYield: ' + shouldYield());
<add> };
<add> });
<add> runtime.assertLog(['Post Message']);
<add>
<add> runtime.fireMessageEvent();
<add> runtime.assertLog([
<add> 'Message Event',
<add> 'Original Task',
<add> 'shouldYield: false',
<add> 'requestYield',
<add> // Immediately after calling requestYield, shouldYield starts
<add> // returning true, even though no time has elapsed in the frame
<add> 'shouldYield: true',
<add>
<add> // The continuation should be scheduled in a separate macrotask.
<add> 'Post Message',
<add> ]);
<add>
<add> // No time has elapsed
<add> expect(performance.now()).toBe(0);
<add>
<add> // Subsequent tasks work as normal
<add> runtime.fireMessageEvent();
<add> runtime.assertLog([
<add> 'Message Event',
<add> 'Continuation Task',
<add> 'shouldYield: false',
<add> 'Advance time past frame deadline',
<add> 'shouldYield: true',
<add> ]);
<add> });
<ide> });
<ide><path>packages/scheduler/src/__tests__/SchedulerMock-test.js
<ide> describe('Scheduler', () => {
<ide> scheduleCallback(ImmediatePriority, 42);
<ide> expect(Scheduler).toFlushWithoutYielding();
<ide> });
<add>
<add> it('requestYield forces a yield immediately', () => {
<add> scheduleCallback(NormalPriority, () => {
<add> Scheduler.unstable_yieldValue('Original Task');
<add> Scheduler.unstable_yieldValue(
<add> 'shouldYield: ' + Scheduler.unstable_shouldYield(),
<add> );
<add> Scheduler.unstable_yieldValue('requestYield');
<add> Scheduler.unstable_requestYield();
<add> Scheduler.unstable_yieldValue(
<add> 'shouldYield: ' + Scheduler.unstable_shouldYield(),
<add> );
<add> return () => {
<add> Scheduler.unstable_yieldValue('Continuation Task');
<add> Scheduler.unstable_yieldValue(
<add> 'shouldYield: ' + Scheduler.unstable_shouldYield(),
<add> );
<add> Scheduler.unstable_yieldValue('Advance time past frame deadline');
<add> Scheduler.unstable_yieldValue(
<add> 'shouldYield: ' + Scheduler.unstable_shouldYield(),
<add> );
<add> };
<add> });
<add>
<add> // The continuation should be scheduled in a separate macrotask.
<add> expect(Scheduler).toFlushUntilNextPaint([
<add> 'Original Task',
<add> 'shouldYield: false',
<add> 'requestYield',
<add> // Immediately after calling requestYield, shouldYield starts
<add> // returning true
<add> 'shouldYield: true',
<add> ]);
<add> });
<ide> });
<ide> });
<ide><path>packages/scheduler/src/__tests__/SchedulerPostTask-test.js
<ide> let NormalPriority;
<ide> let UserBlockingPriority;
<ide> let LowPriority;
<ide> let IdlePriority;
<add>let shouldYield;
<add>let requestYield;
<ide>
<ide> // The Scheduler postTask implementation uses a new postTask browser API to
<ide> // schedule work on the main thread. This test suite mocks all browser methods
<ide> describe('SchedulerPostTask', () => {
<ide> NormalPriority = Scheduler.unstable_NormalPriority;
<ide> LowPriority = Scheduler.unstable_LowPriority;
<ide> IdlePriority = Scheduler.unstable_IdlePriority;
<add> shouldYield = Scheduler.unstable_shouldYield;
<add> requestYield = Scheduler.unstable_requestYield;
<ide> });
<ide>
<ide> afterEach(() => {
<ide> describe('SchedulerPostTask', () => {
<ide> 'E',
<ide> ]);
<ide> });
<add>
<add> it('requestYield forces a yield immediately', () => {
<add> scheduleCallback(NormalPriority, () => {
<add> runtime.log('Original Task');
<add> runtime.log('shouldYield: ' + shouldYield());
<add> runtime.log('requestYield');
<add> requestYield();
<add> runtime.log('shouldYield: ' + shouldYield());
<add> return () => {
<add> runtime.log('Continuation Task');
<add> runtime.log('shouldYield: ' + shouldYield());
<add> runtime.log('Advance time past frame deadline');
<add> runtime.advanceTime(10000);
<add> runtime.log('shouldYield: ' + shouldYield());
<add> };
<add> });
<add> runtime.assertLog(['Post Task 0 [user-visible]']);
<add>
<add> runtime.flushTasks();
<add> runtime.assertLog([
<add> 'Task 0 Fired',
<add> 'Original Task',
<add> 'shouldYield: false',
<add> 'requestYield',
<add> // Immediately after calling requestYield, shouldYield starts
<add> // returning true, even though no time has elapsed in the frame
<add> 'shouldYield: true',
<add>
<add> // The continuation should be scheduled in a separate macrotask.
<add> 'Post Task 1 [user-visible]',
<add> ]);
<add>
<add> // No time has elapsed
<add> expect(performance.now()).toBe(0);
<add>
<add> // Subsequent tasks work as normal
<add> runtime.flushTasks();
<add> runtime.assertLog([
<add> 'Task 1 Fired',
<add> 'Continuation Task',
<add> 'shouldYield: false',
<add> 'Advance time past frame deadline',
<add> 'shouldYield: true',
<add> ]);
<add> });
<ide> });
<ide><path>packages/scheduler/src/forks/Scheduler.js
<ide> function requestPaint() {
<ide> // Since we yield every frame regardless, `requestPaint` has no effect.
<ide> }
<ide>
<add>function requestYield() {
<add> // Force a yield at the next opportunity.
<add> startTime = -99999;
<add>}
<add>
<ide> function forceFrameRate(fps) {
<ide> if (fps < 0 || fps > 125) {
<ide> // Using console['error'] to evade Babel and ESLint
<ide> function cancelHostTimeout() {
<ide> taskTimeoutID = -1;
<ide> }
<ide>
<del>const unstable_requestPaint = requestPaint;
<del>
<ide> export {
<ide> ImmediatePriority as unstable_ImmediatePriority,
<ide> UserBlockingPriority as unstable_UserBlockingPriority,
<ide> export {
<ide> unstable_wrapCallback,
<ide> unstable_getCurrentPriorityLevel,
<ide> shouldYieldToHost as unstable_shouldYield,
<del> unstable_requestPaint,
<add> requestPaint as unstable_requestPaint,
<add> requestYield as unstable_requestYield,
<ide> unstable_continueExecution,
<ide> unstable_pauseExecution,
<ide> unstable_getFirstCallbackNode,
<ide><path>packages/scheduler/src/forks/SchedulerMock.js
<ide> function requestPaint() {
<ide> needsPaint = true;
<ide> }
<ide>
<add>function requestYield() {
<add> // Force a yield at the next opportunity.
<add> shouldYieldForPaint = needsPaint = true;
<add>}
<add>
<ide> export {
<ide> ImmediatePriority as unstable_ImmediatePriority,
<ide> UserBlockingPriority as unstable_UserBlockingPriority,
<ide> export {
<ide> unstable_getCurrentPriorityLevel,
<ide> shouldYieldToHost as unstable_shouldYield,
<ide> requestPaint as unstable_requestPaint,
<add> requestYield as unstable_requestYield,
<ide> unstable_continueExecution,
<ide> unstable_pauseExecution,
<ide> unstable_getFirstCallbackNode,
<ide><path>packages/scheduler/src/forks/SchedulerPostTask.js
<ide> export function unstable_requestPaint() {
<ide> // Since we yield every frame regardless, `requestPaint` has no effect.
<ide> }
<ide>
<add>export function unstable_requestYield() {
<add> // Force a yield at the next opportunity.
<add> deadline = -99999;
<add>}
<add>
<ide> type SchedulerCallback<T> = (
<ide> didTimeout_DEPRECATED: boolean,
<ide> ) => | 9 |
Javascript | Javascript | fix relative paths in scripts | 7fbbacc104dfb142f560bb1a162dac6d8a4952a9 | <ide><path>i18n/src/closureSlurper.js
<ide> var Q = require('q'),
<ide> localeInfo = {};
<ide>
<ide>
<del>var NG_LOCALE_DIR = '../src/ngLocale/';
<add>var NG_LOCALE_DIR = __dirname + '/../../src/ngLocale/';
<ide>
<ide>
<ide> function readSymbols() {
<ide><path>i18n/ucd/src/extract.js
<ide> var propertiesToExtract = {'IDS': 'Y', 'IDC': 'Y'};
<ide>
<ide> function main() {
<ide> extractValues(
<del> fs.createReadStream('./ucd/src/ucd.all.flat.xml.gz').pipe(zlib.createGunzip()),
<add> fs.createReadStream(__dirname + '/ucd.all.flat.xml.gz').pipe(zlib.createGunzip()),
<ide> propertiesToExtract,
<ide> writeFile);
<ide>
<ide> function writeFile(validRanges) {
<ide> var code = generateCode(validRanges);
<ide> try {
<del> fs.lstatSync('../src/ngParseExt');
<add> fs.lstatSync(__dirname + '/../../../src/ngParseExt');
<ide> } catch (e) {
<del> fs.mkdirSync('../src/ngParseExt');
<add> fs.mkdirSync(__dirname + '/../../../src/ngParseExt');
<ide> }
<del> fs.writeFile('../src/ngParseExt/ucd.js', code);
<add> fs.writeFile(__dirname + '/../../../src/ngParseExt/ucd.js', code);
<ide> }
<ide> }
<ide> | 2 |
Ruby | Ruby | add documentation explaining reorder behavior | 23f8c635c0c8c9f6fbe810766104a4fa9f6794d4 | <ide><path>activerecord/lib/active_record/relation/query_methods.rb
<ide> def order(*args)
<ide> relation
<ide> end
<ide>
<add> # Replaces any existing order defined on the relation with the specified order.
<add> #
<add> # User.order('email DESC').reorder('id ASC') # generated SQL has 'ORDER BY id ASC'
<add> #
<add> # Subsequent calls to order on the same relation will be appended. For example:
<add> #
<add> # User.order('email DESC').reorder('id ASC').order('name ASC')
<add> #
<add> # generates a query with 'ORDER BY id ASC, name ASC'.
<add> #
<ide> def reorder(*args)
<ide> return self if args.blank?
<ide> | 1 |
Javascript | Javascript | remove local camelcase method | ee42cfea04e619945b51e8cdcd9c9d5c6176cb77 | <ide><path>src/ngAria/aria.js
<ide> function $AriaProvider() {
<ide> config = angular.extend(config, newConfig);
<ide> };
<ide>
<del> function camelCase(input) {
<del> return input.replace(/-./g, function(letter, pos) {
<del> return letter[1].toUpperCase();
<del> });
<del> }
<del>
<del>
<ide> function watchExpr(attrName, ariaAttr, negate) {
<del> var ariaCamelName = camelCase(ariaAttr);
<ide> return function(scope, elem, attr) {
<add> var ariaCamelName = attr.$normalize(ariaAttr);
<ide> if (config[ariaCamelName] && !attr[ariaCamelName]) {
<ide> scope.$watch(attr[attrName], function(boolVal) {
<ide> if (negate) {
<ide> function $AriaProvider() {
<ide> this.$get = function() {
<ide> return {
<ide> config: function(key) {
<del> return config[camelCase(key)];
<add> return config[key];
<ide> },
<ide> $$watchExpr: watchExpr
<ide> };
<ide> ngAriaModule.directive('ngShow', ['$aria', function($aria) {
<ide> }])
<ide> .directive('ngModel', ['$aria', function($aria) {
<ide>
<del> function shouldAttachAttr(attr, elem) {
<del> return $aria.config(attr) && !elem.attr(attr);
<add> function shouldAttachAttr(attr, normalizedAttr, elem) {
<add> return $aria.config(normalizedAttr) && !elem.attr(attr);
<ide> }
<ide>
<ide> function getShape(attr, elem) {
<ide> ngAriaModule.directive('ngShow', ['$aria', function($aria) {
<ide> require: '?ngModel',
<ide> link: function(scope, elem, attr, ngModel) {
<ide> var shape = getShape(attr, elem);
<del> var needsTabIndex = shouldAttachAttr('tabindex', elem);
<add> var needsTabIndex = shouldAttachAttr('tabindex', 'tabindex', elem);
<ide>
<ide> function ngAriaWatchModelValue() {
<ide> return ngModel.$modelValue;
<ide> ngAriaModule.directive('ngShow', ['$aria', function($aria) {
<ide> switch (shape) {
<ide> case 'radio':
<ide> case 'checkbox':
<del> if (shouldAttachAttr('aria-checked', elem)) {
<add> if (shouldAttachAttr('aria-checked', 'ariaChecked', elem)) {
<ide> scope.$watch(ngAriaWatchModelValue, shape === 'radio' ?
<ide> getRadioReaction() : ngAriaCheckboxReaction);
<ide> }
<ide> ngAriaModule.directive('ngShow', ['$aria', function($aria) {
<ide> }
<ide> break;
<ide> case 'multiline':
<del> if (shouldAttachAttr('aria-multiline', elem)) {
<add> if (shouldAttachAttr('aria-multiline', 'ariaMultiline', elem)) {
<ide> elem.attr('aria-multiline', true);
<ide> }
<ide> break;
<ide> ngAriaModule.directive('ngShow', ['$aria', function($aria) {
<ide> elem.attr('tabindex', 0);
<ide> }
<ide>
<del> if (ngModel.$validators.required && shouldAttachAttr('aria-required', elem)) {
<add> if (ngModel.$validators.required && shouldAttachAttr('aria-required', 'ariaRequired', elem)) {
<ide> scope.$watch(function ngAriaRequiredWatch() {
<ide> return ngModel.$error.required;
<ide> }, function ngAriaRequiredReaction(newVal) {
<ide> elem.attr('aria-required', !!newVal);
<ide> });
<ide> }
<ide>
<del> if (shouldAttachAttr('aria-invalid', elem)) {
<add> if (shouldAttachAttr('aria-invalid', 'ariaInvalid', elem)) {
<ide> scope.$watch(function ngAriaInvalidWatch() {
<ide> return ngModel.$invalid;
<ide> }, function ngAriaInvalidReaction(newVal) { | 1 |
Python | Python | rationalize decimal logic. closes | 38a1b3ec6b62ea0e5bfcb0de2043067d8e333c95 | <ide><path>rest_framework/fields.py
<ide> def __init__(self, max_digits, decimal_places, coerce_to_string=None, max_value=
<ide> self.max_value = max_value
<ide> self.min_value = min_value
<ide>
<add> if self.max_digits is not None and self.decimal_places is not None:
<add> self.max_whole_digits = self.max_digits - self.decimal_places
<add> else:
<add> self.max_whole_digits = None
<add>
<ide> super(DecimalField, self).__init__(**kwargs)
<ide>
<ide> if self.max_value is not None:
<ide> def validate_precision(self, value):
<ide> values or to enhance it in any way you need to.
<ide> """
<ide> sign, digittuple, exponent = value.as_tuple()
<del> decimals = exponent * decimal.Decimal(-1) if exponent < 0 else 0
<del>
<del> # digittuple doesn't include any leading zeros.
<del> digits = len(digittuple)
<del> if decimals > digits:
<del> # We have leading zeros up to or past the decimal point. Count
<del> # everything past the decimal point as a digit. We do not count
<del> # 0 before the decimal point as a digit since that would mean
<del> # we would not allow max_digits = decimal_places.
<del> digits = decimals
<del> whole_digits = digits - decimals
<del>
<del> if self.max_digits is not None and digits > self.max_digits:
<add>
<add> if exponent >= 0:
<add> # 1234500.0
<add> total_digits = len(digittuple) + exponent
<add> whole_digits = total_digits
<add> decimal_places = 0
<add> elif len(digittuple) > abs(exponent):
<add> # 123.45
<add> total_digits = len(digittuple)
<add> whole_digits = total_digits - abs(exponent)
<add> decimal_places = abs(exponent)
<add> else:
<add> # 0.001234
<add> total_digits = abs(exponent)
<add> whole_digits = 0
<add> decimal_places = total_digits
<add>
<add> if self.max_digits is not None and total_digits > self.max_digits:
<ide> self.fail('max_digits', max_digits=self.max_digits)
<del> if self.decimal_places is not None and decimals > self.decimal_places:
<add> if self.decimal_places is not None and decimal_places > self.decimal_places:
<ide> self.fail('max_decimal_places', max_decimal_places=self.decimal_places)
<del> if self.max_digits is not None and self.decimal_places is not None and whole_digits > (self.max_digits - self.decimal_places):
<del> self.fail('max_whole_digits', max_whole_digits=self.max_digits - self.decimal_places)
<add> if self.max_whole_digits is not None and whole_digits > self.max_whole_digits:
<add> self.fail('max_whole_digits', max_whole_digits=self.max_whole_digits)
<ide>
<ide> return value
<ide>
<ide><path>tests/test_fields.py
<ide> class TestDecimalField(FieldValues):
<ide> 0: Decimal('0'),
<ide> 12.3: Decimal('12.3'),
<ide> 0.1: Decimal('0.1'),
<del> '2E+2': Decimal('200'),
<add> '2E+1': Decimal('20'),
<ide> }
<ide> invalid_inputs = (
<ide> ('abc', ["A valid number is required."]),
<ide> (Decimal('Nan'), ["A valid number is required."]),
<ide> (Decimal('Inf'), ["A valid number is required."]),
<ide> ('12.345', ["Ensure that there are no more than 3 digits in total."]),
<add> (200000000000.0, ["Ensure that there are no more than 3 digits in total."]),
<ide> ('0.01', ["Ensure that there are no more than 1 decimal places."]),
<del> (123, ["Ensure that there are no more than 2 digits before the decimal point."])
<add> (123, ["Ensure that there are no more than 2 digits before the decimal point."]),
<add> ('2E+2', ["Ensure that there are no more than 2 digits before the decimal point."])
<ide> )
<ide> outputs = {
<ide> '1': '1.0', | 2 |
PHP | PHP | fix more errors reported by phpstan | 578d2b0462007339805e6b46c7b7d36dabe6ef7f | <ide><path>src/Auth/Storage/SessionStorage.php
<ide> class SessionStorage implements StorageInterface
<ide> * Stores user record array if fetched from session or false if session
<ide> * does not have user record.
<ide> *
<del> * @var \ArrayAccess|array|bool
<add> * @var \ArrayAccess|array|false
<ide> */
<ide> protected $_user;
<ide>
<ide><path>src/Auth/Storage/StorageInterface.php
<ide> interface StorageInterface
<ide> /**
<ide> * Read user record.
<ide> *
<del> * @return array|null
<add> * @return \ArrayAccess|array|null
<ide> */
<ide> public function read();
<ide>
<ide><path>src/Database/Type/DateTimeType.php
<ide> public function toPHP($value, Driver $driver)
<ide> * Convert request data into a datetime object.
<ide> *
<ide> * @param mixed $value Request data
<del> * @return \DateTimeInterface
<add> * @return \DateTimeInterface|null
<ide> */
<ide> public function marshal($value)
<ide> {
<ide><path>src/Filesystem/Folder.php
<ide> public function pwd()
<ide> public function cd($path)
<ide> {
<ide> $path = $this->realpath($path);
<del> if (is_dir($path)) {
<add> if ($path !== false && is_dir($path)) {
<ide> return $this->path = $path;
<ide> }
<ide>
<ide><path>src/Http/ServerRequest.php
<ide> class ServerRequest implements ArrayAccess, ServerRequestInterface
<ide> * In PUT/PATCH/DELETE requests this property will contain the form-urlencoded
<ide> * data.
<ide> *
<del> * @var array
<add> * @var null|array|object
<ide> * @deprecated 3.4.0 This public property will be removed in 4.0.0. Use getData() instead.
<ide> */
<ide> public $data = [];
<ide><path>src/ORM/Association/Loader/SelectLoader.php
<ide> public function __construct(array $options)
<ide> * iterator. The options accepted by this method are the same as `Association::eagerLoader()`
<ide> *
<ide> * @param array $options Same options as `Association::eagerLoader()`
<del> * @return callable
<add> * @return \Closure
<ide> */
<ide> public function buildEagerLoader(array $options)
<ide> {
<ide> protected function _addFilteringJoin($query, $key, $subquery)
<ide> }
<ide> $subquery->select($filter, true);
<ide>
<add> $conditions = null;
<ide> if (is_array($key)) {
<ide> $conditions = $this->_createTupleCondition($query, $key, $filter, '=');
<ide> } else {
<ide> $filter = current($filter);
<ide> }
<ide>
<del> $conditions = isset($conditions) ? $conditions : $query->newExpr([$key => $filter]);
<add> $conditions = $conditions ?: $query->newExpr([$key => $filter]);
<ide>
<ide> return $query->innerJoin(
<ide> [$aliasedTable => $subquery],
<ide><path>src/ORM/Locator/TableLocator.php
<ide> public function get($alias, array $options = [])
<ide> *
<ide> * @param string $alias The alias name you want to get.
<ide> * @param array $options Table options array.
<del> * @return string
<add> * @return string|false
<ide> */
<ide> protected function _getClassName($alias, array $options = [])
<ide> {
<ide><path>src/Utility/Inflector.php
<ide> class Inflector
<ide> * @param string $type Inflection type
<ide> * @param string $key Original value
<ide> * @param string|bool $value Inflected value
<del> * @return string|bool Inflected value on cache hit or false on cache miss.
<add> * @return string|false Inflected value on cache hit or false on cache miss.
<ide> */
<ide> protected static function _cache($type, $key, $value = false)
<ide> {
<ide><path>src/View/Helper/TimeHelper.php
<ide> public function format($date, $format = null, $invalid = false, $timezone = null
<ide> * @param string|null $format Intl compatible format string.
<ide> * @param bool|string $invalid Default value to display on invalid dates
<ide> * @param string|\DateTimeZone|null $timezone User's timezone string or DateTimeZone object
<del> * @return string Formatted and translated date string
<add> * @return string|false Formatted and translated date string or value for `$invalid` on failure.
<ide> * @throws \Exception When the date cannot be parsed
<ide> * @see \Cake\I18n\Time::i18nFormat()
<ide> */
<ide><path>src/View/Helper/UrlHelper.php
<ide> public function build($url = null, $options = false)
<ide> }
<ide> $options += $defaults;
<ide>
<add> /** @var string $url */
<ide> $url = Router::url($url, $options['fullBase']);
<ide> if ($options['escape']) {
<ide> $url = h($url);
<ide><path>src/View/ViewBuilder.php
<ide> class ViewBuilder implements JsonSerializable, Serializable
<ide> /**
<ide> * The plugin name to use.
<ide> *
<del> * @var string
<add> * @var string|null|false
<ide> */
<ide> protected $_plugin;
<ide>
<ide> /**
<ide> * The theme name to use.
<ide> *
<del> * @var string
<add> * @var string|null|false
<ide> */
<ide> protected $_theme;
<ide> | 11 |