| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
Ruby
|
Ruby
|
add rules on field order
|
8604799f1a320850ac564935265f414f93598599
|
<ide><path>Library/Homebrew/cmd/audit.rb
<ide> def audit
<ide> class FormulaText
<ide> def initialize path
<ide> @text = path.open("rb", &:read)
<add> @lines = @text.lines
<ide> end
<ide>
<ide> def without_patch
<ide> def has_trailing_newline?
<ide> def =~ regex
<ide> regex =~ @text
<ide> end
<add>
<add> def line_number regex
<add> index = @lines.index { |line| line =~ regex }
<add> index ? index + 1 : nil
<add> end
<ide> end
<ide>
<ide> class FormulaAuditor
<ide> def audit_file
<ide> unless text.has_trailing_newline?
<ide> problem "File should end with a newline"
<ide> end
<add>
<add> return unless @strict
<add>
<add> component_list = [
<add> [/^ desc ["'][\S\ ]+["']/, "desc" ],
<add> [/^ homepage ["'][\S\ ]+["']/, "homepage" ],
<add> [/^ url ["'][\S\ ]+["']/, "url" ],
<add> [/^ mirror ["'][\S\ ]+["']/, "mirror" ],
<add> [/^ version ["'][\S\ ]+["']/, "version" ],
<add> [/^ (sha1|sha256) ["'][\S\ ]+["']/, "checksum" ],
<add> [/^ head ["'][\S\ ]+["']/, "head" ],
<add> [/^ stable do/, "stable block" ],
<add> [/^ bottle do/, "bottle block" ],
<add> [/^ devel do/, "devel block" ],
<add> [/^ head do/, "head block" ],
<add> [/^ option/, "option" ],
<add> [/^ depends_on/, "depends_on" ],
<add> [/^ def install/, "install method"],
<add> [/^ def caveats/, "caveats method"],
<add> [/^ test do/, "test block" ],
<add> ]
<add>
<add> component_list.map do |regex, name|
<add> lineno = text.line_number regex
<add> next unless lineno
<add> [lineno, name]
<add> end.compact.each_cons(2) do |c1, c2|
<add> unless c1[0] < c2[0]
<add> problem "`#{c1[1]}`(line #{c1[0]}) should be put before `#{c2[1]}`(line #{c2[0]})"
<add> end
<add> end
<ide> end
<ide>
<ide> def audit_class
| 1
|
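The audit change above resolves each formula component to the line of its first regex match, then walks consecutive pairs (`each_cons(2)`) to flag any pair that is out of order. A minimal Python sketch of the same algorithm, with illustrative regexes rather than Homebrew's full component list:

```python
import re

def order_problems(text, components):
    """components: list of (pattern, name) in the required order."""
    lines = text.splitlines()

    def line_number(pattern):
        for i, line in enumerate(lines, start=1):
            if re.search(pattern, line):
                return i
        return None  # component absent: skip it, as the patch does

    found = [(line_number(pat), name) for pat, name in components]
    found = [pair for pair in found if pair[0] is not None]
    return [
        f"`{n1}`(line {l1}) should be put before `{n2}`(line {l2})"
        for (l1, n1), (l2, n2) in zip(found, found[1:])
        if not l1 < l2
    ]

formula = 'class Foo\n  url "u"\n  desc "d"\nend\n'
print(order_problems(formula, [(r"^  desc ", "desc"), (r"^  url ", "url")]))
# ['`desc`(line 3) should be put before `url`(line 2)']
```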
Python
|
Python
|
fix default labels for parser and ner
|
0e2bedc373f16e45c955c9cf30a1219e884a3769
|
<ide><path>spacy/language.py
<ide> def Pipeline(self, nlp, **cfg):
<ide>
<ide> tokenizer_exceptions = {}
<ide>
<del> parser_labels = {0: {'ROOT': True}}
<add> parser_labels = {0: {'': True}, 1: {'': True}, 2: {'ROOT': True, 'nmod': True},
<add> 3: {'ROOT': True, 'nmod': True}, 4: {'ROOT': True}}
<add>
<ide>
<del> entity_labels = {0: {'PER': True, 'LOC': True, 'ORG': True, 'MISC': True}}
<add> entity_labels = {
<add> 0: {'': True},
<add> 1: {'PER': True, 'LOC': True, 'ORG': True, 'MISC': True},
<add> 2: {'PER': True, 'LOC': True, 'ORG': True, 'MISC': True},
<add> 3: {'PER': True, 'LOC': True, 'ORG': True, 'MISC': True},
<add> 4: {'PER': True, 'LOC': True, 'ORG': True, 'MISC': True},
<add> 5: {'': True}
<add> }
<ide>
<ide> parser_features = get_templates('parser')
<ide>
| 1
|
Text
|
Text
|
update doc for @zeit/next-stylus
|
b393833aeab99647fc9851981b2156c497927206
|
<ide><path>readme.md
<ide> export default () => <p style={{ color: 'red' }}>hi there</p>
<ide>
<ide> To use more sophisticated CSS-in-JS solutions, you typically have to implement style flushing for server-side rendering. We enable this by allowing you to define your own [custom `<Document>`](#user-content-custom-document) component that wraps each page.
<ide>
<del>#### Importing CSS / Sass / Less files
<add>#### Importing CSS / Sass / Less / Stylus files
<ide>
<del>To support importing `.css` `.scss` or `.less` files you can use these modules, which configure sensible defaults for server rendered applications.
<add>To support importing `.css`, `.scss`, `.less` or `.styl` files you can use these modules, which configure sensible defaults for server rendered applications.
<ide>
<ide> - [@zeit/next-css](https://github.com/zeit/next-plugins/tree/master/packages/next-css)
<ide> - [@zeit/next-sass](https://github.com/zeit/next-plugins/tree/master/packages/next-sass)
<ide> - [@zeit/next-less](https://github.com/zeit/next-plugins/tree/master/packages/next-less)
<add>- [@zeit/next-stylus](https://github.com/zeit/next-plugins/tree/master/packages/next-stylus)
<ide>
<ide> ### Static file serving (e.g.: images)
<ide>
| 1
|
Text
|
Text
|
update korean translation to 0185c68
|
57f14017fb7443b4c02c03a938837fb2634d1c98
|
<ide><path>docs/docs/01-why-react.ko-KR.md
<ide> React는 페이스북과 인스타그램에서 사용자 인터페이스를 구
<ide>
<ide> 우리는 단 하나의 문제를 해결하기 위해 React를 만들었습니다: **지속해서 데이터가 변화하는 대규모 애플리케이션을 구축하기.** 이 문제를 해결하기 위해, React는 두가지 컨셉을 도입했습니다.
<ide>
<del>### 단순함
<add>## 단순함
<ide>
<ide> 당신의 애플리케이션이 특정 시점에 어떻게 보여야 할지를 단순히 표현하는 것만으로, 데이터가 변할 때 React는 자동으로 모든 UI 업데이트를 관리해줍니다.
<ide>
<del>### 선언적 문법
<add>## 선언적 문법
<ide>
<ide> 데이터가 변할 때 React는 "새로 고침" 버튼을 누르듯이 작동하며, 데이터의 바뀐 부분만을 업데이트할 수 있습니다.
<ide>
<ide><path>docs/docs/05-reusable-components.ko-KR.md
<ide> var ComponentWithDefaultProps = React.createClass({
<ide>
<ide> ## Prop 전달하기: 단축
<ide>
<del>React 컴포넌트의 흔히 그냥 기본 HTML을 확장해서 씁니다. 타이핑을 아끼기 위해 기저의 HTML 엘리먼트에 HTML 속성들을 단순히 복사하는 컴포넌트가 필요할 수도 있습니다. JSX의 _spread_ 문법을 사용하면 이렇게 할 수 있습니다.
<add>React 컴포넌트의 흔히 그냥 기본 HTML 엘리먼트를 확장해서 씁니다. 타이핑을 아끼기 위해 기저의 HTML 엘리먼트에 HTML 속성들을 단순히 복사하는 컴포넌트가 필요할 수도 있습니다. JSX의 _spread_ 문법을 사용하면 이렇게 할 수 있습니다.
<ide>
<ide> ```javascript
<ide> var CheckLink = React.createClass({
<ide><path>docs/docs/11-advanced-performance.ko-KR.md
<ide> React를 도입하려 할 때 많은 사람이 묻는 첫 번째 질문은 React
<ide>
<ide> React는 브라우저에서 렌더된 DOM 하위 트리의 서술자 개념인 *가상의 DOM*을 사용합니다. 이 병렬적인 서술체는 React가 DOM 노드를 생성하거나 이미 존재하는 DOM 노드에 접근하는 것(JavaScript 객체를 조작하는 것보다 느리죠)을 피하게 해 줍니다. 컴포넌트의 props나 state가 변경되면 React는 새로운 가상의 DOM을 구성해 이전의 것과 비교해서 실제 DOM 업데이트가 필요한지 결정합니다. 가능한 적게 변화를 적용하기 위해, React는 둘이 다를 경우에만 DOM을 [조정](/react/docs/reconciliation-ko-KR.html)할 것입니다.
<ide>
<del>이에 더해, React는 컴포넌트 생명주기 함수인 `shouldComponentUpdate`를 제공합니다. 이는 다시 렌더링하는 프로세스가 일어나기 직전에 일어나며 개발자가 프로세스를 중단할 수 있게 합니다. 이 함수의 기본구현은 `true`를 반환해 React가 업데이트를 수행하도록 합니다.
<add>이에 더해, React는 컴포넌트 생명주기 함수인 `shouldComponentUpdate`를 제공합니다. 이는 다시 렌더링하는 프로세스(가상 DOM 비교와 어쩌면 일어날 DOM 조정)가 일어나기 직전에 일어나며 개발자가 프로세스를 중단할 수 있게 합니다. 이 함수의 기본구현은 `true`를 반환해 React가 업데이트를 수행하도록 합니다.
<ide>
<ide> ```javascript
<ide> shouldComponentUpdate: function(nextProps, nextState) {
<ide><path>docs/docs/ref-03-component-specs.ko-KR.md
<ide> next: tags-and-attributes-ko-KR.html
<ide> ### render
<ide>
<ide> ```javascript
<del>ReactComponent render()
<add>ReactElement render()
<ide> ```
<ide>
<ide> `render()` 메소드는 필수 항목입니다.
<ide>
<del>호출되면 `this.props`와 `this.state`를 토대로 하나의 자식 컴포넌트를 리턴합니다. 이 자식 컴포넌트는 네이티브 DOM 컴포넌트의 가상 표현 (`<div />`나 `React.DOM.div()` 등) 또는 직접 정의한 조합(composite) 컴포넌트가 될 수 있습니다.
<add>호출되면 `this.props`와 `this.state`를 토대로 하나의 자식 엘리먼트를 리턴합니다. 이 자식 엘리먼트는 네이티브 DOM 컴포넌트의 가상 표현 (`<div />`나 `React.DOM.div()` 등) 또는 직접 정의한 조합(composite) 컴포넌트가 될 수 있습니다.
<ide>
<ide> 아무 것도 렌더링되지 않도록 하려면 `null`이나 `false`를 리턴합니다. React는 지금의 차이 비교 알고리즘이 작동할 수 있도록 내부적으로는 `<noscript>` 태그를 렌더링합니다. `null`이나 `false`를 리턴한 경우, `React.findDOMNode(this)`는 `null`을 리턴합니다.
<ide>
<ide><path>docs/tips/05-maximum-number-of-jsx-root-nodes.ko-KR.md
<ide> next: style-props-value-px-ko-KR.html
<ide>
<ide> 현재 컴포넌트의 `render`는 한 노드만 리턴할 수 있습니다. 만약 `div` 배열을 리턴하려면, `div`, `span`과 같은 다른 컴포넌트로 한 번 더 싸주어야 합니다.
<ide>
<del>JSX는 일반 JS로 컴파일 함을 잊지말아야 합니다. 두개의 함수를 리턴하는 것은 문법적으로 맞지 않습니다. 이와 마찬가지로, 한 삼항 연산자 안에 한개 이상의 자식 컴포넌트를 넣으면 안됩니다.
<ide>\ No newline at end of file
<add>JSX는 일반 JS로 컴파일 함을 잊지말아야 합니다. 두개의 함수를 리턴하는 것은 문법적으로 맞지 않습니다. 이와 마찬가지로, 한 삼항 연산자 안에 한개 이상의 자식 컴포넌트를 넣으면 안됩니다.
<ide><path>docs/tips/14-communicate-between-components.ko-KR.md
<ide> var GroceryList = React.createClass({
<ide> });
<ide>
<ide> React.render(
<del> <GroceryList items={['사과', '바나나', '크랜베리']} />, mountNode );
<add> <GroceryList items={['사과', '바나나', '크랜베리']} />, mountNode
<add>);
<ide> ```
<ide>
<ide> `bind(this, arg1, arg2, ...)`의 사용을 확인하세요: 간단히 `handleClick`에 인자를 더 넘겼습니다. 이는 React의 새로운 컨셉이 아닙니다; 그냥 JavaScript죠.
<ide>
<del>부모-자식 관계가 없는 두 컴포넌트간의 통신을 위해, 별도로 전역(global) 이벤트 시스템을 사용할 수 있습니다.
<del>`componentDidMount()`에서 이벤트를 구독하고, `componentWillUnmount()`에서 해제합니다. 이벤트를 받으면 `setState()`를 호출합니다.
<add>부모-자식 관계가 없는 두 컴포넌트간의 통신을 위해, 별도로 전역(global) 이벤트 시스템을 사용할 수 있습니다. `componentDidMount()`에서 이벤트를 구독하고, `componentWillUnmount()`에서 해제합니다. 이벤트를 받으면 `setState()`를 호출합니다. [Flux](https://facebook.github.io/flux/) 패턴은 이를 정리하는 방법 중 하나입니다.
| 6
|
Javascript
|
Javascript
|
forget window when it gets closed
|
3b4c1015cc2674676dbc340cfbafc3a437614fc6
|
<add><path>spec/main-process/file-recovery-service.spec.js
<del><path>spec/browser/file-recovery-service.spec.js
<ide> 'use babel'
<ide>
<del>import FileRecoveryService from '../../src/browser/file-recovery-service'
<add>import {BrowserWindow} from 'electron'
<add>import FileRecoveryService from '../../src/main-process/file-recovery-service'
<ide> import temp from 'temp'
<ide> import fs from 'fs-plus'
<ide> import {Emitter} from 'event-kit'
<ide>
<ide> describe("FileRecoveryService", () => {
<del> let mockWindow, recoveryService, recoveryDirectory
<add> let recoveryService, recoveryDirectory, windows
<add>
<add> function createWindow () {
<add> const window = new BrowserWindow({show: false})
<add> windows.push(window)
<add> return window
<add> }
<ide>
<ide> beforeEach(() => {
<del> mockWindow = new Emitter()
<add> windows = []
<ide> recoveryDirectory = temp.mkdirSync()
<ide> recoveryService = new FileRecoveryService(recoveryDirectory)
<ide> })
<ide>
<add> afterEach(() => {
<add> for (let window of windows) {
<add> window.destroy()
<add> }
<add> })
<add>
<ide> describe("when no crash happens during a save", () => {
<ide> it("creates a recovery file and deletes it after saving", () => {
<del> let filePath = temp.path()
<add> const mockWindow = createWindow()
<add> const filePath = temp.path()
<ide>
<ide> fs.writeFileSync(filePath, "some content")
<del> recoveryService.willSavePath({sender: mockWindow}, filePath)
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 1)
<ide>
<ide> fs.writeFileSync(filePath, "changed")
<del> recoveryService.didSavePath({sender: mockWindow}, filePath)
<add> recoveryService.didSavePath({sender: mockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "changed")
<ide> })
<ide>
<ide> it("creates many recovery files and deletes them when many windows attempt to save the same file", () => {
<del> const anotherMockWindow = new Emitter()
<del> let filePath = temp.path()
<add> const mockWindow = createWindow()
<add> const anotherMockWindow = createWindow()
<add> const filePath = temp.path()
<ide>
<ide> fs.writeFileSync(filePath, "some content")
<del> recoveryService.willSavePath({sender: mockWindow}, filePath)
<del> recoveryService.willSavePath({sender: anotherMockWindow}, filePath)
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, filePath)
<add> recoveryService.willSavePath({sender: anotherMockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 2)
<ide>
<ide> fs.writeFileSync(filePath, "changed")
<del> recoveryService.didSavePath({sender: mockWindow}, filePath)
<add> recoveryService.didSavePath({sender: mockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 1)
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "changed")
<ide>
<del> recoveryService.didSavePath({sender: anotherMockWindow}, filePath)
<add> recoveryService.didSavePath({sender: anotherMockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "changed")
<ide> })
<ide> })
<ide>
<ide> describe("when a crash happens during a save", () => {
<ide> it("restores the created recovery file and deletes it", () => {
<del> let filePath = temp.path()
<add> const mockWindow = createWindow()
<add> const filePath = temp.path()
<ide>
<ide> fs.writeFileSync(filePath, "some content")
<del> recoveryService.willSavePath({sender: mockWindow}, filePath)
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 1)
<ide>
<ide> fs.writeFileSync(filePath, "changed")
<del> mockWindow.emit("crashed")
<add> mockWindow.webContents.emit("crashed")
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "some content")
<ide> })
<ide>
<ide> it("restores the created recovery files and deletes them in the order in which windows crash when they attempt to save the same file", () => {
<del> const anotherMockWindow = new Emitter
<del> let filePath = temp.path()
<add> const mockWindow = createWindow()
<add> const anotherMockWindow = createWindow()
<add> const filePath = temp.path()
<ide>
<ide> fs.writeFileSync(filePath, "window 1")
<del> recoveryService.willSavePath({sender: mockWindow}, filePath)
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, filePath)
<ide> fs.writeFileSync(filePath, "window 2")
<del> recoveryService.willSavePath({sender: anotherMockWindow}, filePath)
<add> recoveryService.willSavePath({sender: anotherMockWindow.webContents}, filePath)
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 2)
<ide>
<ide> fs.writeFileSync(filePath, "changed")
<ide>
<del> mockWindow.emit("crashed")
<add> mockWindow.webContents.emit("crashed")
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "window 1")
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 1)
<ide>
<del> anotherMockWindow.emit("crashed")
<add> anotherMockWindow.webContents.emit("crashed")
<ide> assert.equal(fs.readFileSync(filePath, 'utf8'), "window 2")
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide> })
<ide>
<ide> it("emits a warning when a file can't be recovered", () => {
<del> let filePath = temp.path()
<add> const mockWindow = createWindow()
<add> const filePath = temp.path()
<ide> fs.writeFileSync(filePath, "content")
<ide> fs.chmodSync(filePath, 0444)
<ide>
<ide> const previousConsoleLog = console.log
<ide> let logs = []
<ide> console.log = (message) => logs.push(message)
<ide>
<del> recoveryService.willSavePath({sender: mockWindow}, filePath)
<del> mockWindow.emit("crashed")
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, filePath)
<add> mockWindow.webContents.emit("crashed")
<ide> let recoveryFiles = fs.listTreeSync(recoveryDirectory)
<ide> assert.equal(recoveryFiles.length, 1)
<ide> assert.deepEqual(logs, [`Cannot recover ${filePath}. A recovery file has been saved here: ${recoveryFiles[0]}.`])
<ide> describe("FileRecoveryService", () => {
<ide> })
<ide>
<ide> it("doesn't create a recovery file when the file that's being saved doesn't exist yet", () => {
<del> recoveryService.willSavePath({sender: mockWindow}, "a-file-that-doesnt-exist")
<add> const mockWindow = createWindow()
<add>
<add> recoveryService.willSavePath({sender: mockWindow.webContents}, "a-file-that-doesnt-exist")
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide>
<del> recoveryService.didSavePath({sender: mockWindow}, "a-file-that-doesnt-exist")
<add> recoveryService.didSavePath({sender: mockWindow.webContents}, "a-file-that-doesnt-exist")
<ide> assert.equal(fs.listTreeSync(recoveryDirectory).length, 0)
<ide> })
<ide> })
<add><path>src/main-process/file-recovery-service.js
<del><path>src/browser/file-recovery-service.js
<ide> 'use babel'
<ide>
<del>import {ipcMain} from 'electron'
<add>import {BrowserWindow, ipcMain} from 'electron'
<ide> import crypto from 'crypto'
<ide> import Path from 'path'
<ide> import fs from 'fs-plus'
<ide> export default class FileRecoveryService {
<ide> constructor (recoveryDirectory) {
<ide> this.recoveryDirectory = recoveryDirectory
<ide> this.recoveryPathsByWindowAndFilePath = new WeakMap()
<del> this.crashListeners = new WeakSet()
<add> this.observedWindows = new WeakSet()
<ide> }
<ide>
<ide> start () {
<ide> export default class FileRecoveryService {
<ide> return
<ide> }
<ide>
<del> const window = event.sender
<add> const window = BrowserWindow.fromWebContents(event.sender)
<ide> const recoveryFileName = crypto.randomBytes(5).toString('hex')
<ide> const recoveryPath = Path.join(this.recoveryDirectory, recoveryFileName)
<ide> fs.writeFileSync(recoveryPath, fs.readFileSync(path))
<ide> export default class FileRecoveryService {
<ide> }
<ide> this.recoveryPathsByWindowAndFilePath.get(window).set(path, recoveryPath)
<ide>
<del> if (!this.crashListeners.has(window)) {
<del> window.on('crashed', () => this.recoverFilesForWindow(window))
<del> this.crashListeners.add(window)
<add> if (!this.observedWindows.has(window)) {
<add> window.webContents.on("crashed", () => this.recoverFilesForWindow(window))
<add> window.on("closed", () => {
<add> this.observedWindows.delete(window)
<add> this.recoveryPathsByWindowAndFilePath.delete(window)
<add> })
<add> this.observedWindows.add(window)
<ide> }
<ide>
<ide> event.returnValue = true
<ide> }
<ide>
<ide> didSavePath (event, path) {
<del> const window = event.sender
<add> const window = BrowserWindow.fromWebContents(event.sender)
<ide> const recoveryPathsByFilePath = this.recoveryPathsByWindowAndFilePath.get(window)
<ide> if (recoveryPathsByFilePath == null || !recoveryPathsByFilePath.has(path)) {
<ide> event.returnValue = false
| 2
|
Java
|
Java
|
fix typo in disposablehelper
|
b7f81d2a1fb37e7cdd7e72b34c032a186cf01dbc
|
<ide><path>src/main/java/io/reactivex/internal/disposables/DisposableHelper.java
<ide> public static boolean replace(AtomicReference<Disposable> field, Disposable d) {
<ide> /**
<ide> * Atomically disposes the Disposable in the field if not already disposed.
<ide> * @param field the target field
<del> * @return true if the curren thread managed to dispose the Disposable
<add> * @return true if the current thread managed to dispose the Disposable
<ide> */
<ide> public static boolean dispose(AtomicReference<Disposable> field) {
<ide> Disposable current = field.get();
| 1
|
Ruby
|
Ruby
|
convert uninstall test to spec
|
3d031420404506223d047fc509e53df4cc9185d9
|
<add><path>Library/Homebrew/cask/spec/cask/cli/uninstall_spec.rb
<del><path>Library/Homebrew/cask/test/cask/cli/uninstall_test.rb
<del>require "test_helper"
<add>require "spec_helper"
<ide>
<ide> describe Hbc::CLI::Uninstall do
<ide> it "shows an error when a bad Cask is provided" do
<del> lambda {
<add> expect {
<ide> Hbc::CLI::Uninstall.run("notacask")
<del> }.must_raise Hbc::CaskUnavailableError
<add> }.to raise_error(Hbc::CaskUnavailableError)
<ide> end
<ide>
<ide> it "shows an error when a Cask is provided that's not installed" do
<del> lambda {
<add> expect {
<ide> Hbc::CLI::Uninstall.run("anvil")
<del> }.must_raise Hbc::CaskNotInstalledError
<add> }.to raise_error(Hbc::CaskNotInstalledError)
<ide> end
<ide>
<ide> it "tries anyway on a non-present Cask when --force is given" do
<del> lambda do
<add> expect {
<ide> Hbc::CLI::Uninstall.run("anvil", "--force")
<del> end # wont_raise
<add> }.not_to raise_error
<ide> end
<ide>
<ide> it "can uninstall and unlink multiple Casks at once" do
<ide> Hbc::Installer.new(transmission).install
<ide> end
<ide>
<del> caffeine.must_be :installed?
<del> transmission.must_be :installed?
<add> expect(caffeine).to be_installed
<add> expect(transmission).to be_installed
<ide>
<ide> shutup do
<ide> Hbc::CLI::Uninstall.run("local-caffeine", "local-transmission")
<ide> end
<ide>
<del> caffeine.wont_be :installed?
<del> Hbc.appdir.join("Transmission.app").wont_be :exist?
<del> transmission.wont_be :installed?
<del> Hbc.appdir.join("Caffeine.app").wont_be :exist?
<add> expect(caffeine).not_to be_installed
<add> expect(Hbc.appdir.join("Transmission.app")).not_to exist
<add> expect(transmission).not_to be_installed
<add> expect(Hbc.appdir.join("Caffeine.app")).not_to exist
<ide> end
<ide>
<ide> describe "when multiple versions of a cask are installed" do
<ide> end
<ide> end
<ide>
<del> after(:each) do
<del> caskroom_path.rmtree if caskroom_path.exist?
<del> end
<del>
<ide> it "uninstalls one version at a time" do
<ide> shutup do
<ide> Hbc::CLI::Uninstall.run("versioned-cask")
<ide> end
<ide>
<del> caskroom_path.join(first_installed_version).must_be :exist?
<del> caskroom_path.join(last_installed_version).wont_be :exist?
<del> caskroom_path.must_be :exist?
<add> expect(caskroom_path.join(first_installed_version)).to exist
<add> expect(caskroom_path.join(last_installed_version)).not_to exist
<add> expect(caskroom_path).to exist
<ide>
<ide> shutup do
<ide> Hbc::CLI::Uninstall.run("versioned-cask")
<ide> end
<ide>
<del> caskroom_path.join(first_installed_version).wont_be :exist?
<del> caskroom_path.wont_be :exist?
<add> expect(caskroom_path.join(first_installed_version)).not_to exist
<add> expect(caskroom_path).not_to exist
<ide> end
<ide>
<ide> it "displays a message when versions remain installed" do
<del> out, err = capture_io do
<del> Hbc::CLI::Uninstall.run("versioned-cask")
<del> end
<del>
<del> out.must_match(/#{token} #{first_installed_version} is still installed./)
<del> err.must_be :empty?
<add> expect {
<add> expect {
<add> Hbc::CLI::Uninstall.run("versioned-cask")
<add> }.not_to output.to_stderr
<add> }.to output(/#{token} #{first_installed_version} is still installed./).to_stdout
<ide> end
<ide> end
<ide>
<ide> EOS
<ide> end
<ide>
<del> after do
<del> app.rmtree if app.exist?
<del> caskroom_path.rmtree if caskroom_path.exist?
<del> end
<del>
<ide> it "can still uninstall those Casks" do
<ide> shutup do
<ide> Hbc::CLI::Uninstall.run("ive-been-renamed")
<ide> end
<ide>
<del> app.wont_be :exist?
<del> caskroom_path.wont_be :exist?
<add> expect(app).not_to exist
<add> expect(caskroom_path).not_to exist
<ide> end
<ide> end
<ide>
<ide> describe "when no Cask is specified" do
<ide> it "raises an exception" do
<del> lambda {
<add> expect {
<ide> Hbc::CLI::Uninstall.run
<del> }.must_raise Hbc::CaskUnspecifiedError
<add> }.to raise_error(Hbc::CaskUnspecifiedError)
<ide> end
<ide> end
<ide>
<ide> describe "when no Cask is specified, but an invalid option" do
<ide> it "raises an exception" do
<del> lambda {
<add> expect {
<ide> Hbc::CLI::Uninstall.run("--notavalidoption")
<del> }.must_raise Hbc::CaskUnspecifiedError
<add> }.to raise_error(Hbc::CaskUnspecifiedError)
<ide> end
<ide> end
<ide> end
| 1
|
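The conversion above maps minitest expectations onto RSpec matchers one-for-one: `must_raise` becomes `expect { }.to raise_error`, `must_be :installed?` becomes `expect(...).to be_installed`, and `capture_io` becomes the `output` matcher. For comparison, the same style of expectation in pytest (an analogy with stand-in names, not the Homebrew-Cask suite):

```python
import pytest

class CaskUnavailableError(Exception):
    """Stand-in for Hbc::CaskUnavailableError."""

def run(name):
    raise CaskUnavailableError(name)

def test_bad_cask_raises():
    # minitest: lambda { ... }.must_raise Hbc::CaskUnavailableError
    # rspec:    expect { ... }.to raise_error(Hbc::CaskUnavailableError)
    with pytest.raises(CaskUnavailableError):
        run("notacask")
```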
Ruby
|
Ruby
|
add support for short flag options
|
f7ec07680ccf859fd331d3fe7195afc366fc3741
|
<ide><path>Library/Homebrew/cli_parser.rb
<ide> def comma_array(name, description: nil)
<ide> end
<ide> end
<ide>
<del> def flag(name, description: nil, required_for: nil, depends_on: nil)
<del> if name.end_with? "="
<add> def flag(*names, description: nil, required_for: nil, depends_on: nil)
<add> if names.any? { |name| name.end_with? "=" }
<ide> required = OptionParser::REQUIRED_ARGUMENT
<del> name.chomp! "="
<ide> else
<ide> required = OptionParser::OPTIONAL_ARGUMENT
<ide> end
<del> description = option_to_description(name) if description.nil?
<del> process_option(name, description)
<del> @parser.on(name, *wrap_option_desc(description), required) do |option_value|
<del> Homebrew.args[option_to_name(name)] = option_value
<add> names.map! { |name| name.chomp "=" }
<add> description = option_to_description(*names) if description.nil?
<add> process_option(*names, description)
<add> @parser.on(*names, *wrap_option_desc(description), required) do |option_value|
<add> names.each do |name|
<add> Homebrew.args[option_to_name(name)] = option_value
<add> end
<ide> end
<ide>
<del> set_constraints(name, required_for: required_for, depends_on: depends_on)
<add> names.each do |name|
<add> set_constraints(name, required_for: required_for, depends_on: depends_on)
<add> end
<ide> end
<ide>
<ide> def conflicts(*options)
<ide><path>Library/Homebrew/test/cli_parser_spec.rb
<ide> end
<ide> end
<ide>
<add> describe "test short flag options" do
<add> subject(:parser) {
<add> described_class.new do
<add> flag "-f", "--filename=", description: "Name of the file"
<add> end
<add> }
<add>
<add> it "parses a short flag option with its argument" do
<add> parser.parse(["--filename=random.txt"])
<add> expect(Homebrew.args.filename).to eq "random.txt"
<add> expect(Homebrew.args.f).to eq "random.txt"
<add> end
<add> end
<add>
<ide> describe "test constraints for flag options" do
<ide> subject(:parser) {
<ide> described_class.new do
| 2
|
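The change above lets one `flag` definition register several option names (a short `-f` alias alongside `--filename=`) that all store the same value. Python's argparse expresses the same idea natively; this is an analogy, not Homebrew's `CLI::Parser`:

```python
import argparse

parser = argparse.ArgumentParser()
# one definition, two spellings, one stored value
parser.add_argument("-f", "--filename", help="Name of the file")

args = parser.parse_args(["--filename=random.txt"])
print(args.filename)  # "random.txt", reachable via -f as well
```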
Python
|
Python
|
obtain size info defensively for packet
|
1c920a48f2912c2d0cb64bb3ed830ba232e9bb3f
|
<ide><path>libcloud/compute/drivers/packet.py
<ide> def _to_location(self, data):
<ide> driver=self, extra=extra)
<ide>
<ide> def _to_size(self, data):
<del> cpus = data['specs']['cpus'][0].get('count')
<add> try:
<add> cpus = data['specs']['cpus'][0].get('count')
<add> except KeyError:
<add> cpus = None
<ide> regions = [region.get('href').strip('/facilities/')
<ide> for region in data.get('available_in')]
<ide> extra = {'description': data['description'], 'line': data['line'],
<ide> 'cpus': cpus, 'regions': regions}
<del>
<del> ram = data['specs']['memory']['total']
<del> disk = 0
<del> for disks in data['specs']['drives']:
<del> disk_size = disks['size'].replace('GB', '')
<del> if 'TB' in disk_size:
<del> disk_size = float(disks['size'].replace('TB', '')) * 1000
<del> disk += disks['count'] * int(disk_size)
<add> try:
<add> ram = int(data['specs']['memory']['total'].replace('GB', '')) * 1024
<add> except KeyError:
<add> ram = None
<add> disk = None
<add> if data['specs'].get('drives', ''):
<add> disk = 0
<add> for disks in data['specs']['drives']:
<add> disk_size = disks['size'].replace('GB', '')
<add> if 'TB' in disk_size:
<add> disk_size = float(disks['size'].replace('TB', '')) * 1000
<add> disk += disks['count'] * int(disk_size)
<ide> name = "%s - %s RAM" % (data.get('name'), ram)
<ide> price = data['pricing'].get('hour')
<ide> return NodeSize(id=data['slug'], name=name,
<del> ram=int(ram.replace('GB', '')) * 1024, disk=disk,
<del> bandwidth=0, price=price, extra=extra, driver=self)
<add> ram=ram, disk=disk, bandwidth=0,
<add> price=price, extra=extra, driver=self)
<ide>
<ide> def _to_key_pairs(self, data):
<ide> extra = {'label': data['label'],
| 1
|
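A standalone sketch of the defensive pattern above: tolerate missing `specs` keys and fall back to `None` instead of raising. The dict shape mirrors the Packet API response in the patch; the function is illustrative, not the libcloud driver:

```python
def parse_size(specs):
    try:
        cpus = specs['cpus'][0].get('count')
    except (KeyError, IndexError):
        cpus = None
    try:
        # "8GB" -> 8192 MB
        ram = int(specs['memory']['total'].replace('GB', '')) * 1024
    except KeyError:
        ram = None
    disk = None
    if specs.get('drives'):
        disk = 0
        for drive in specs['drives']:
            size = drive['size'].replace('GB', '')
            if 'TB' in size:
                size = float(drive['size'].replace('TB', '')) * 1000
            disk += drive['count'] * int(size)
    return cpus, ram, disk

print(parse_size({'memory': {'total': '8GB'},
                  'drives': [{'count': 2, 'size': '1TB'}]}))
# (None, 8192, 2000)
```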
Go
|
Go
|
return more context on awslogs create failure
|
7a5c813d9ca913cd9a2f03a58902b2a9bf5c7f23
|
<ide><path>daemon/logger/awslogs/cloudwatchlogs.go
<ide> func (l *logStream) create() error {
<ide> if l.logCreateGroup {
<ide> if awsErr, ok := err.(awserr.Error); ok && awsErr.Code() == resourceNotFoundCode {
<ide> if err := l.createLogGroup(); err != nil {
<del> return err
<add> return errors.Wrap(err, "failed to create Cloudwatch log group")
<ide> }
<del> return l.createLogStream()
<add> err := l.createLogStream()
<add> if err != nil {
<add> return errors.Wrap(err, "failed to create Cloudwatch log stream")
<add> }
<add> return nil
<ide> }
<ide> }
<ide> if err != nil {
<del> return err
<add> return errors.Wrap(err, "failed to create Cloudwatch log stream")
<ide> }
<ide> }
<ide>
| 1
|
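`errors.Wrap` above attaches context to the failure while keeping the underlying cause available for inspection. A Python analogue of the same idea, using exception chaining and illustrative names:

```python
def create_log_stream():
    raise PermissionError('AccessDeniedException')

def create():
    try:
        create_log_stream()
    except Exception as err:
        # like errors.Wrap(err, "..."): add context, keep the cause chained
        raise RuntimeError('failed to create Cloudwatch log stream') from err

try:
    create()
except RuntimeError as err:
    print(f'{err}: caused by {err.__cause__!r}')
```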
PHP
|
PHP
|
fix error in old sqlserver
|
599b317d081789b72349664a51fa0467ec6a816a
|
<ide><path>tests/Fixture/DatatypeFixture.php
<ide> class DatatypeFixture extends TestFixture {
<ide> *
<ide> * @var array
<ide> */
<del> public $records = array(
<del> array('id' => 1, 'float_field' => 42.23, 'huge_int' => '1234567891234567891', 'bool' => 0),
<del> );
<add> public $records = [
<add> ['float_field' => 42.23, 'huge_int' => '1234567891234567891', 'bool' => 0],
<add> ];
<add>
<ide> }
| 1
|
Ruby
|
Ruby
|
fix the sorting
|
0f6d3b2923a41bd48f58cf202db96d19c8b98868
|
<ide><path>Library/Homebrew/tap_migrations.rb
<ide> "freerdp" => "homebrew/x11",
<ide> "fsv" => "homebrew/boneyard",
<ide> "fuse-zip" => "homebrew/fuse",
<del> "fuse4x-kext" => "homebrew/fuse",
<ide> "fuse4x" => "homebrew/fuse",
<add> "fuse4x-kext" => "homebrew/fuse",
<ide> "gant" => "homebrew/boneyard",
<ide> "gcsfuse" => "homebrew/fuse",
<ide> "geany" => "homebrew/x11",
<ide> "ggobi" => "homebrew/x11",
<ide> "giblib" => "homebrew/x11",
<ide> "git-flow-clone" => "homebrew/boneyard",
<del> "gitfs" => "homebrew/fuse",
<ide> "git-latexdiff" => "homebrew/tex",
<add> "gitfs" => "homebrew/fuse",
<ide> "gkrellm" => "homebrew/boneyard",
<ide> "glade" => "homebrew/x11",
<ide> "gle" => "homebrew/x11",
<ide> "grace" => "homebrew/x11",
<ide> "grads" => "homebrew/binary",
<ide> "graylog2-server" => "homebrew/boneyard",
<del> "guilt" => "homebrew/boneyard",
<ide> "gromacs" => "homebrew/science",
<ide> "gsmartcontrol" => "homebrew/x11",
<ide> "gtk-chtheme" => "homebrew/x11",
<ide> "gtkglarea" => "homebrew/boneyard",
<ide> "gtksourceviewmm" => "homebrew/x11",
<ide> "gtksourceviewmm3" => "homebrew/x11",
<ide> "gtkwave" => "homebrew/x11",
<add> "guilt" => "homebrew/boneyard",
<ide> "gv" => "homebrew/x11",
<ide> "hatari" => "homebrew/x11",
<ide> "helios" => "spotify/public",
<ide> "latex-mk" => "homebrew/tex",
<ide> "libdlna" => "homebrew/boneyard",
<ide> "libgtextutils" => "homebrew/science",
<del> "librets" => "homebrew/boneyard",
<ide> "libqxt" => "homebrew/boneyard",
<add> "librets" => "homebrew/boneyard",
<ide> "libspotify" => "homebrew/binary",
<ide> "lilypond" => "homebrew/tex",
<ide> "lmutil" => "homebrew/binary",
<ide> "ori" => "homebrew/fuse",
<ide> "owamp" => "homebrew/boneyard",
<ide> "pan" => "homebrew/boneyard",
<del> "pari" => "homebrew/x11",
<ide> "par2tbb" => "homebrew/boneyard",
<add> "pari" => "homebrew/x11",
<ide> "pathfinder" => "homebrew/boneyard",
<ide> "pcb" => "homebrew/x11",
<del> "pdfjam" => "homebrew/tex",
<ide> "pdf2image" => "homebrew/x11",
<add> "pdfjam" => "homebrew/tex",
<ide> "pdftoipe" => "homebrew/head-only",
<ide> "pebble-sdk" => "pebble/pebble-sdk",
<ide> "pgplot" => "homebrew/x11",
<ide> "qiv" => "homebrew/boneyard",
<ide> "qrupdate" => "homebrew/science",
<ide> "rdesktop" => "homebrew/x11",
<del> "rofs-filtered" => "homebrew/fuse",
<ide> "rocket" => "homebrew/boneyard",
<add> "rofs-filtered" => "homebrew/fuse",
<ide> "rxvt-unicode" => "homebrew/x11",
<ide> "s3-backer" => "homebrew/fuse",
<ide> "s3fs" => "homebrew/fuse",
<ide> "tetgen" => "homebrew/science",
<ide> "texmacs" => "homebrew/boneyard",
<ide> "texwrapper" => "homebrew/tex",
<del> "tiger-vnc" => "homebrew/x11",
<ide> "ticcutils" => "homebrew/science",
<add> "tiger-vnc" => "homebrew/x11",
<ide> "timbl" => "homebrew/science",
<ide> "tmap" => "homebrew/boneyard",
<ide> "transmission-remote-gtk" => "homebrew/x11",
<ide> "xclip" => "homebrew/x11",
<ide> "xdotool" => "homebrew/x11",
<ide> "xdu" => "homebrew/x11",
<del> "xournal" => "homebrew/x11",
<ide> "xmount" => "homebrew/fuse",
<add> "xournal" => "homebrew/x11",
<ide> "xpa" => "homebrew/x11",
<ide> "xpdf" => "homebrew/x11",
<ide> "xplot" => "homebrew/x11",
| 1
|
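The fix above only re-alphabetizes a handful of misplaced keys. The invariant being restored, expressed as a quick Python check over illustrative data:

```python
migrations = {
    "fuse4x": "homebrew/fuse",
    "fuse4x-kext": "homebrew/fuse",
    "gcsfuse": "homebrew/fuse",
}
keys = list(migrations)  # insertion order, i.e. source order
assert keys == sorted(keys), "tap_migrations keys must stay alphabetized"
```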
PHP
|
PHP
|
add another test
|
72ef859df410d685101c8389eebb0cb8940ebbcd
|
<ide><path>tests/Integration/Queue/JobChainingTest.php
<ide> public function test_second_job_is_not_fired_if_first_was_already_deleted()
<ide>
<ide> $this->assertFalse(JobChainingTestSecondJob::$ran);
<ide> }
<add>
<add> public function test_third_job_is_not_fired_if_second_fails()
<add> {
<add> Queue::connection('sync')->push((new JobChainingTestFirstJob)->chain([
<add> new JobChainingTestFailingJob,
<add> new JobChainingTestThirdJob,
<add> ]));
<add>
<add> $this->assertTrue(JobChainingTestFirstJob::$ran);
<add> $this->assertFalse(JobChainingTestThirdJob::$ran);
<add> }
<ide> }
<ide>
<ide> class JobChainingTestFirstJob implements ShouldQueue
| 1
|
Python
|
Python
|
restart consumer if connection drops
|
e726978a39a05838805d2b026c4f1c962cfb23b7
|
<ide><path>celery/worker/loops.py
<ide> def _quick_drain(connection, timeout=0.1):
<ide>
<ide>
<ide> def _enable_amqheartbeats(timer, connection, rate=2.0):
<del> if connection:
<del> tick = connection.heartbeat_check
<del> heartbeat = connection.get_heartbeat_interval() # negotiated
<del> if heartbeat and connection.supports_heartbeats:
<del> timer.call_repeatedly(heartbeat / rate, tick, (rate,))
<add> heartbeat_error = [None]
<add>
<add> if not connection:
<add> return heartbeat_error
<add>
<add> heartbeat = connection.get_heartbeat_interval() # negotiated
<add> if not (heartbeat and connection.supports_heartbeats):
<add> return heartbeat_error
<add>
<add> def tick(rate):
<add> try:
<add> connection.heartbeat_check(rate)
<add> except Exception as e:
<add> # heartbeat_error is passed by reference can be updated
<add> # no append here list should be fixed size=1
<add> heartbeat_error[0] = e
<add>
<add> timer.call_repeatedly(heartbeat / rate, tick, (rate,))
<add> return heartbeat_error
<ide>
<ide>
<ide> def asynloop(obj, connection, consumer, blueprint, hub, qos,
<ide> def asynloop(obj, connection, consumer, blueprint, hub, qos,
<ide>
<ide> on_task_received = obj.create_task_handler()
<ide>
<del> _enable_amqheartbeats(hub.timer, connection, rate=hbrate)
<add> heartbeat_error = _enable_amqheartbeats(hub.timer, connection, rate=hbrate)
<ide>
<ide> consumer.on_message = on_task_received
<ide> obj.controller.register_with_event_loop(hub)
<ide> def asynloop(obj, connection, consumer, blueprint, hub, qos,
<ide> try:
<ide> while blueprint.state == RUN and obj.connection:
<ide> state.maybe_shutdown()
<add> if heartbeat_error[0] is not None:
<add> raise heartbeat_error[0]
<ide>
<ide> # We only update QoS when there's no more messages to read.
<ide> # This groups together qos calls, and makes sure that remote
<ide> def synloop(obj, connection, consumer, blueprint, hub, qos,
<ide> RUN = bootsteps.RUN
<ide> on_task_received = obj.create_task_handler()
<ide> perform_pending_operations = obj.perform_pending_operations
<add> heartbeat_error = [None]
<ide> if getattr(obj.pool, 'is_green', False):
<del> _enable_amqheartbeats(obj.timer, connection, rate=hbrate)
<add> heartbeat_error = _enable_amqheartbeats(obj.timer, connection, rate=hbrate)
<ide> consumer.on_message = on_task_received
<ide> consumer.consume()
<ide>
<ide> obj.on_ready()
<ide>
<ide> while blueprint.state == RUN and obj.connection:
<ide> state.maybe_shutdown()
<add> if heartbeat_error[0] is not None:
<add> raise heartbeat_error[0]
<ide> if qos.prev != qos.value:
<ide> qos.update()
<ide> try:
<ide><path>t/unit/worker/test_loops.py
<ide> def test_setup_heartbeat(self):
<ide> asynloop(*x.args)
<ide> x.consumer.consume.assert_called_with()
<ide> x.obj.on_ready.assert_called_with()
<del> x.hub.timer.call_repeatedly.assert_called_with(
<del> 10 / 2.0, x.connection.heartbeat_check, (2.0,),
<del> )
<add> last_call_args, _ = x.hub.timer.call_repeatedly.call_args
<add>
<add> assert last_call_args[0] == 10 / 2.0
<add> assert last_call_args[2] == (2.0,)
<ide>
<ide> def task_context(self, sig, **kwargs):
<ide> x, on_task = get_task_callback(self.app, **kwargs)
<ide> def test_poll_raises_ValueError(self):
<ide> asynloop(*x.args)
<ide> poller.poll.assert_called()
<ide>
<add> def test_heartbeat_error(self):
<add> x = X(self.app, heartbeat=10)
<add> x.connection.heartbeat_check = Mock(
<add> side_effect=RuntimeError("Heartbeat error")
<add> )
<add>
<add> def call_repeatedly(rate, fn, args):
<add> fn(*args)
<add>
<add> x.hub.timer.call_repeatedly = call_repeatedly
<add> with pytest.raises(RuntimeError):
<add> asynloop(*x.args)
<add>
<add> def test_no_heartbeat_support(self):
<add> x = X(self.app)
<add> x.connection.supports_heartbeats = False
<add> x.hub.timer.call_repeatedly = Mock(
<add> name='x.hub.timer.call_repeatedly()'
<add> )
<add> x.hub.on_tick.add(x.closer(mod=2))
<add> asynloop(*x.args)
<add>
<add> x.hub.timer.call_repeatedly.assert_not_called()
<add>
<ide>
<ide> class test_synloop:
<ide>
<ide> def test_ignores_socket_errors_when_closed(self):
<ide> x.close_then_error(x.connection.drain_events)
<ide> assert synloop(*x.args) is None
<ide>
<add> def test_no_connection(self):
<add> x = X(self.app)
<add> x.connection = None
<add> x.hub.timer.call_repeatedly = Mock(
<add> name='x.hub.timer.call_repeatedly()'
<add> )
<add> x.blueprint.state = CLOSE
<add> synloop(*x.args)
<add>
<add> x.hub.timer.call_repeatedly.assert_not_called()
<add>
<add> def test_heartbeat_error(self):
<add> x = X(self.app, heartbeat=10)
<add> x.obj.pool.is_green = True
<add>
<add> def heartbeat_check(rate):
<add> raise RuntimeError('Heartbeat error')
<add>
<add> def call_repeatedly(rate, fn, args):
<add> fn(*args)
<add>
<add> x.connection.heartbeat_check = Mock(
<add> name='heartbeat_check', side_effect=heartbeat_check
<add> )
<add> x.obj.timer.call_repeatedly = call_repeatedly
<add> with pytest.raises(RuntimeError):
<add> synloop(*x.args)
<add>
<add> def test_no_heartbeat_support(self):
<add> x = X(self.app)
<add> x.connection.supports_heartbeats = False
<add> x.obj.pool.is_green = True
<add> x.obj.timer.call_repeatedly = Mock(
<add> name='x.obj.timer.call_repeatedly()'
<add> )
<add>
<add> def drain_events(timeout):
<add> x.blueprint.state = CLOSE
<add> x.connection.drain_events.side_effect = drain_events
<add> synloop(*x.args)
<add>
<add> x.obj.timer.call_repeatedly.assert_not_called()
<add>
<ide>
<ide> class test_quick_drain:
<ide>
| 2
|
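The core trick above is the one-element list: the heartbeat callback runs on a timer and cannot raise into the consumer loop directly, so it parks the exception in a mutable cell that the loop polls each iteration. A standalone sketch with stand-in names:

```python
def make_heartbeat(check):
    heartbeat_error = [None]  # fixed size 1, updated by reference

    def tick(rate):
        try:
            check(rate)
        except Exception as exc:
            heartbeat_error[0] = exc  # surfaced later by the loop

    return heartbeat_error, tick

def failing_check(rate):
    raise RuntimeError('Heartbeat error')

heartbeat_error, tick = make_heartbeat(failing_check)
tick(2.0)  # the hub timer would call this periodically
if heartbeat_error[0] is not None:
    print('loop would re-raise:', heartbeat_error[0])
```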
Javascript
|
Javascript
|
use number.isnan instead of global isnan
|
b681d8d1a1400847780a6d2b07bbb879fb72a812
|
<ide><path>test/parallel/test-os.js
<ide> const { inspect } = require('util');
<ide>
<ide> const is = {
<ide> number: (value, key) => {
<del> assert(!isNaN(value), `${key} should not be NaN`);
<add> assert(!Number.isNaN(value), `${key} should not be NaN`);
<ide> assert.strictEqual(typeof value, 'number');
<ide> },
<ide> string: (value) => { assert.strictEqual(typeof value, 'string'); },
<ide><path>test/parallel/test-readdouble.js
<ide> function test(clazz) {
<ide> buffer[5] = 0xff;
<ide> buffer[6] = 0x0f;
<ide> buffer[7] = 0x00;
<del> assert.ok(isNaN(buffer.readDoubleBE(0)));
<add> assert.ok(Number.isNaN(buffer.readDoubleBE(0)));
<ide> assert.strictEqual(2.225073858507201e-308, buffer.readDoubleLE(0));
<ide>
<ide> buffer[6] = 0xef;
<ide> buffer[7] = 0x7f;
<del> assert.ok(isNaN(buffer.readDoubleBE(0)));
<add> assert.ok(Number.isNaN(buffer.readDoubleBE(0)));
<ide> assert.strictEqual(1.7976931348623157e+308, buffer.readDoubleLE(0));
<ide>
<ide> buffer[0] = 0;
<ide><path>test/parallel/test-writefloat.js
<ide> function test(clazz) {
<ide> // Darwin ia32 does the other kind of NaN.
<ide> // Compiler bug. No one really cares.
<ide> assert(0x7F === buffer[7] || 0xFF === buffer[7]);
<del> assert.ok(isNaN(buffer.readFloatBE(0)));
<del> assert.ok(isNaN(buffer.readFloatLE(4)));
<add> assert.ok(Number.isNaN(buffer.readFloatBE(0)));
<add> assert.ok(Number.isNaN(buffer.readFloatLE(4)));
<ide> }
<ide>
<ide>
| 3
|
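The motivation for the change above: the global `isNaN` coerces its argument (`isNaN('foo')` is `true`), while `Number.isNaN` is `true` only for the NaN value itself. Python's `math.isnan` is strict in the same spirit:

```python
import math

print(math.isnan(float('nan')))  # True
print(math.isnan(0.0))           # False
try:
    math.isnan('foo')            # no coercion: non-numbers are rejected
except TypeError as err:
    print('TypeError:', err)
```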
Ruby
|
Ruby
|
pluralize message only when needed
|
420e8fdbe4742eb655a58c7faabfca6fe200d4a8
|
<ide><path>Library/Homebrew/dev-cmd/pr-automerge.rb
<ide> def pr_automerge
<ide> return
<ide> end
<ide>
<del> ohai "#{prs.size} matching pull requests:"
<add> ohai "#{prs.count} matching pull #{"request".pluralize(prs.count)}:"
<ide> pr_urls = []
<ide> prs.each do |pr|
<ide> puts "#{tap.full_name unless tap.core_tap?}##{pr["number"]}: #{pr["title"]}"
| 1
|
PHP
|
PHP
|
add ordoesnthave and orwheredoesnthave
|
ef11aee425a0973b2ccef485eb579802fd7d45f9
|
<ide><path>src/Illuminate/Database/Eloquent/Concerns/QueriesRelationships.php
<ide> public function doesntHave($relation, $boolean = 'and', Closure $callback = null
<ide> return $this->has($relation, '<', 1, $boolean, $callback);
<ide> }
<ide>
<add> /**
<add> * Add a relationship count / exists condition to the query with an "or".
<add> *
<add> * @param string $relation
<add> * @return \Illuminate\Database\Eloquent\Builder|static
<add> */
<add> public function orDoesntHave($relation)
<add> {
<add> return $this->doesntHave($relation, 'or');
<add> }
<add>
<ide> /**
<ide> * Add a relationship count / exists condition to the query with where clauses.
<ide> *
<ide> public function whereDoesntHave($relation, Closure $callback = null)
<ide> return $this->doesntHave($relation, 'and', $callback);
<ide> }
<ide>
<add> /**
<add> * Add a relationship count / exists condition to the query with where clauses and an "or".
<add> *
<add> * @param string $relation
<add> * @param \Closure $callback
<add> * @return \Illuminate\Database\Eloquent\Builder|static
<add> */
<add> public function orWhereDoesntHave($relation, Closure $callback = null)
<add> {
<add> return $this->doesntHave($relation, 'or', $callback);
<add> }
<add>
<ide> /**
<ide> * Add subselect queries to count the relations.
<ide> *
<ide><path>tests/Database/DatabaseEloquentBuilderTest.php
<ide> public function testSelfHasNestedUsesAlias()
<ide> $this->assertContains('"self_alias_hash"."id" = "self_related_stubs"."parent_id"', $sql);
<ide> }
<ide>
<add> public function testDoesntHave()
<add> {
<add> $model = new EloquentBuilderTestModelParentStub;
<add>
<add> $builder = $model->doesntHave('foo');
<add>
<add> $this->assertEquals('select * from "eloquent_builder_test_model_parent_stubs" where not exists (select * from "eloquent_builder_test_model_close_related_stubs" where "eloquent_builder_test_model_parent_stubs"."foo_id" = "eloquent_builder_test_model_close_related_stubs"."id")', $builder->toSql());
<add> }
<add>
<add> public function testOrDoesntHave()
<add> {
<add> $model = new EloquentBuilderTestModelParentStub;
<add>
<add> $builder = $model->where('bar', 'baz')->orDoesntHave('foo');
<add>
<add> $this->assertEquals('select * from "eloquent_builder_test_model_parent_stubs" where "bar" = ? or not exists (select * from "eloquent_builder_test_model_close_related_stubs" where "eloquent_builder_test_model_parent_stubs"."foo_id" = "eloquent_builder_test_model_close_related_stubs"."id")', $builder->toSql());
<add> $this->assertEquals(['baz'], $builder->getBindings());
<add> }
<add>
<add> public function testWhereDoesntHave()
<add> {
<add> $model = new EloquentBuilderTestModelParentStub;
<add>
<add> $builder = $model->whereDoesntHave('foo', function ($query) {
<add> $query->where('bar', 'baz');
<add> });
<add>
<add> $this->assertEquals('select * from "eloquent_builder_test_model_parent_stubs" where not exists (select * from "eloquent_builder_test_model_close_related_stubs" where "eloquent_builder_test_model_parent_stubs"."foo_id" = "eloquent_builder_test_model_close_related_stubs"."id" and "bar" = ?)', $builder->toSql());
<add> $this->assertEquals(['baz'], $builder->getBindings());
<add> }
<add>
<add> public function testOrWhereDoesntHave()
<add> {
<add> $model = new EloquentBuilderTestModelParentStub;
<add>
<add> $builder = $model->where('bar', 'baz')->orWhereDoesntHave('foo', function ($query) {
<add> $query->where('qux', 'quux');
<add> });
<add>
<add> $this->assertEquals('select * from "eloquent_builder_test_model_parent_stubs" where "bar" = ? or not exists (select * from "eloquent_builder_test_model_close_related_stubs" where "eloquent_builder_test_model_parent_stubs"."foo_id" = "eloquent_builder_test_model_close_related_stubs"."id" and "qux" = ?)', $builder->toSql());
<add> $this->assertEquals(['baz', 'quux'], $builder->getBindings());
<add> }
<add>
<ide> public function testWhereKeyMethodWithInt()
<ide> {
<ide> $model = $this->getMockModel();
| 2
|
Python
|
Python
|
fix type annotations in randomized_heap.py
|
7488c5070e3f5a29a08f584644666494c420f834
|
<ide><path>data_structures/heap/randomized_heap.py
<ide> from __future__ import annotations
<ide>
<ide> import random
<del>from typing import Generic, Iterable, TypeVar
<add>from typing import Any, Generic, Iterable, TypeVar
<ide>
<del>T = TypeVar("T")
<add>T = TypeVar("T", bound=bool)
<ide>
<ide>
<ide> class RandomizedHeapNode(Generic[T]):
<ide> def __init__(self, data: Iterable[T] | None = ()) -> None:
<ide> [1, 3, 3, 7]
<ide> """
<ide> self._root: RandomizedHeapNode[T] | None = None
<del> for item in data:
<del> self.insert(item)
<add>
<add> if data:
<add> for item in data:
<add> self.insert(item)
<ide>
<ide> def insert(self, value: T) -> None:
<ide> """
<ide> def insert(self, value: T) -> None:
<ide> """
<ide> self._root = RandomizedHeapNode.merge(self._root, RandomizedHeapNode(value))
<ide>
<del> def pop(self) -> T:
<add> def pop(self) -> T | None:
<ide> """
<ide> Pop the smallest value from the heap and return it.
<ide>
<ide> def pop(self) -> T:
<ide> ...
<ide> IndexError: Can't get top element for the empty heap.
<ide> """
<add>
<ide> result = self.top()
<add>
<add> if self._root is None:
<add> return None
<add>
<ide> self._root = RandomizedHeapNode.merge(self._root.left, self._root.right)
<ide>
<ide> return result
<ide> def top(self) -> T:
<ide> raise IndexError("Can't get top element for the empty heap.")
<ide> return self._root.value
<ide>
<del> def clear(self):
<add> def clear(self) -> None:
<ide> """
<ide> Clear the heap.
<ide>
<ide> def clear(self):
<ide> """
<ide> self._root = None
<ide>
<del> def to_sorted_list(self) -> list[T]:
<add> def to_sorted_list(self) -> list[Any]:
<ide> """
<ide> Returns sorted list containing all the values in the heap.
<ide>
| 1
|
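The annotation change above makes `pop()` honest about the empty case by declaring `T | None`. A minimal generic container with the same signature shape, unrelated to the heap implementation itself:

```python
from __future__ import annotations

from typing import Generic, TypeVar

T = TypeVar("T")

class Box(Generic[T]):
    def __init__(self) -> None:
        self._item: T | None = None

    def put(self, item: T) -> None:
        self._item = item

    def pop(self) -> T | None:  # None signals "empty", as in the patch
        item, self._item = self._item, None
        return item

box: Box[int] = Box()
box.put(7)
print(box.pop())  # 7
print(box.pop())  # None
```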
Go
|
Go
|
handle ebusy while removing
|
2c82fd93d8a01cc1f53fe861378e6d2dca0486c6
|
<ide><path>graphdriver/devmapper/deviceset.go
<ide> import (
<ide> "path"
<ide> "path/filepath"
<ide> "strconv"
<add> "strings"
<ide> "sync"
<ide> "time"
<ide> )
<ide> type DeviceSet struct {
<ide> TransactionId uint64
<ide> NewTransactionId uint64
<ide> nextFreeDevice int
<add> sawBusy bool
<ide> }
<ide>
<ide> type DiskUsage struct {
<ide> func (devices *DeviceSet) log(level int, file string, line int, dmError int, mes
<ide> return // Ignore _LOG_DEBUG
<ide> }
<ide>
<add> if strings.Contains(message, "busy") {
<add> devices.sawBusy = true
<add> }
<add>
<ide> utils.Debugf("libdevmapper(%d): %s:%d (%d) %s", level, file, line, dmError, message)
<ide> }
<ide>
<ide> func (devices *DeviceSet) deactivateDevice(hash string) error {
<ide> // Issues the underlying dm remove operation and then waits
<ide> // for it to finish.
<ide> func (devices *DeviceSet) removeDeviceAndWait(devname string) error {
<del> if err := removeDevice(devname); err != nil {
<add> var err error
<add>
<add> for i := 0; i < 10; i++ {
<add> devices.sawBusy = false
<add> err = removeDevice(devname)
<add> if err == nil {
<add> break
<add> }
<add> if !devices.sawBusy {
<add> return err
<add> }
<add>
<add> // If we see EBUSY it may be a transient error,
<add> // sleep a bit a retry a few times.
<add> time.Sleep(5 * time.Millisecond)
<add> }
<add> if err != nil {
<ide> return err
<ide> }
<add>
<ide> if err := devices.waitRemove(devname); err != nil {
<ide> return err
<ide> }
| 1
|
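A standalone sketch of the retry policy above: retry only when the failure looks transient (the driver reported "busy"), fail fast on anything else. `remove_device` here is a stand-in for the dm remove call:

```python
import time

def remove_with_retry(remove_device, attempts=10, delay=0.005):
    last_err = None
    for _ in range(attempts):
        try:
            return remove_device()
        except OSError as exc:
            if 'busy' not in str(exc).lower():
                raise          # not transient: surface immediately
            last_err = exc
            time.sleep(delay)  # EBUSY may be transient; back off, retry
    raise last_err

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise OSError('device is busy')

remove_with_retry(flaky)
print(len(calls))  # 3: succeeded on the third attempt
```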
Java
|
Java
|
fix clone issue in yoganodejnibase
|
2707c17b0727f241d404f4a21090021c27c66f2c
|
<ide><path>ReactAndroid/src/main/java/com/facebook/yoga/YogaNodeJNIBase.java
<ide> public void swapChildAt(YogaNode newChild, int position) {
<ide> public YogaNodeJNIBase cloneWithChildren() {
<ide> try {
<ide> YogaNodeJNIBase clonedYogaNode = (YogaNodeJNIBase) super.clone();
<add> if (clonedYogaNode.mChildren != null) {
<add> clonedYogaNode.mChildren = new ArrayList<>(clonedYogaNode.mChildren);
<add> }
<ide> long clonedNativePointer = YogaNative.jni_YGNodeCloneJNI(mNativePointer);
<ide> clonedYogaNode.mOwner = null;
<ide> clonedYogaNode.mNativePointer = clonedNativePointer;
| 1
|
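The fix above works because `Object.clone` copies only the reference to `mChildren`, so clone and original would share one mutable list. The same pitfall and fix in Python, via `copy.copy`:

```python
import copy

class Node:
    def __init__(self):
        self.children = []

    def clone_with_children(self):
        cloned = copy.copy(self)               # shallow: list is shared
        cloned.children = list(self.children)  # give the clone its own list
        return cloned

a = Node()
b = a.clone_with_children()
b.children.append('child')
print(len(a.children), len(b.children))  # 0 1
```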
Python
|
Python
|
add multiple gpu support
|
ceee992a6e0d114923a397020ac5f100269f3d2f
|
<ide><path>swivel/swivel.py
<ide> """
<ide>
<ide> from __future__ import print_function
<del>import argparse
<ide> import glob
<ide> import math
<ide> import os
<ide>
<ide> import numpy as np
<ide> import tensorflow as tf
<add>from tensorflow.python.client import device_lib
<ide>
<ide> flags = tf.app.flags
<ide>
<ide> flags.DEFINE_float('learning_rate', 1.0, 'Initial learning rate')
<ide> flags.DEFINE_integer('num_concurrent_steps', 2,
<ide> 'Number of threads to train with')
<add>flags.DEFINE_integer('num_readers', 4,
<add> 'Number of threads to read the input data and feed it')
<ide> flags.DEFINE_float('num_epochs', 40, 'Number epochs to train for')
<del>flags.DEFINE_float('per_process_gpu_memory_fraction', 0.25,
<del> 'Fraction of GPU memory to use')
<add>flags.DEFINE_float('per_process_gpu_memory_fraction', 0,
<add> 'Fraction of GPU memory to use, 0 means allow_growth')
<add>flags.DEFINE_integer('num_gpus', 0,
<add> 'Number of GPUs to use, 0 means all available')
<ide>
<ide> FLAGS = flags.FLAGS
<ide>
<ide>
<add>def log(message, *args, **kwargs):
<add> tf.logging.info(message, *args, **kwargs)
<add>
<add>
<add>def get_available_gpus():
<add> return [d.name for d in device_lib.list_local_devices()
<add> if d.device_type == 'GPU']
<add>
<add>
<ide> def embeddings_with_init(vocab_size, embedding_dim, name):
<ide> """Creates and initializes the embedding tensors."""
<ide> return tf.get_variable(name=name,
<ide> def count_matrix_input(filenames, submatrix_rows, submatrix_cols):
<ide> queued_global_row, queued_global_col, queued_count = tf.train.batch(
<ide> [global_row, global_col, count],
<ide> batch_size=1,
<del> num_threads=4,
<add> num_threads=FLAGS.num_readers,
<ide> capacity=32)
<ide>
<ide> queued_global_row = tf.reshape(queued_global_row, [submatrix_rows])
<ide> def write_embeddings_to_disk(config, model, sess):
<ide> # Row Embedding
<ide> row_vocab_path = config.input_base_path + '/row_vocab.txt'
<ide> row_embedding_output_path = config.output_base_path + '/row_embedding.tsv'
<del> print('Writing row embeddings to:', row_embedding_output_path)
<del> sys.stdout.flush()
<add> log('Writing row embeddings to: %s', row_embedding_output_path)
<ide> write_embedding_tensor_to_disk(row_vocab_path, row_embedding_output_path,
<ide> sess, model.row_embedding)
<ide>
<ide> # Column Embedding
<ide> col_vocab_path = config.input_base_path + '/col_vocab.txt'
<ide> col_embedding_output_path = config.output_base_path + '/col_embedding.tsv'
<del> print('Writing column embeddings to:', col_embedding_output_path)
<del> sys.stdout.flush()
<add> log('Writing column embeddings to: %s', col_embedding_output_path)
<ide> write_embedding_tensor_to_disk(col_vocab_path, col_embedding_output_path,
<ide> sess, model.col_embedding)
<ide>
<ide> def __init__(self, config):
<ide> self._config = config
<ide>
<ide> # Create paths to input data files
<del> print('Reading model from:', config.input_base_path)
<del> sys.stdout.flush()
<add> log('Reading model from: %s', config.input_base_path)
<ide> count_matrix_files = glob.glob(config.input_base_path + '/shard-*.pb')
<ide> row_sums_path = config.input_base_path + '/row_sums.txt'
<ide> col_sums_path = config.input_base_path + '/col_sums.txt'
<ide> def __init__(self, config):
<ide>
<ide> self.n_rows = len(row_sums)
<ide> self.n_cols = len(col_sums)
<del> print('Matrix dim: (%d,%d) SubMatrix dim: (%d,%d) ' % (
<del> self.n_rows, self.n_cols, config.submatrix_rows, config.submatrix_cols))
<del> sys.stdout.flush()
<add> log('Matrix dim: (%d,%d) SubMatrix dim: (%d,%d)',
<add> self.n_rows, self.n_cols, config.submatrix_rows, config.submatrix_cols)
<ide> self.n_submatrices = (self.n_rows * self.n_cols /
<ide> (config.submatrix_rows * config.submatrix_cols))
<del> print('n_submatrices: %d' % (self.n_submatrices))
<del> sys.stdout.flush()
<del>
<del> # ===== CREATE VARIABLES ======
<del> # embeddings
<del> self.row_embedding = embeddings_with_init(
<del> embedding_dim=config.embedding_size,
<del> vocab_size=self.n_rows,
<del> name='row_embedding')
<del> self.col_embedding = embeddings_with_init(
<del> embedding_dim=config.embedding_size,
<del> vocab_size=self.n_cols,
<del> name='col_embedding')
<del> tf.summary.histogram('row_emb', self.row_embedding)
<del> tf.summary.histogram('col_emb', self.col_embedding)
<del>
<del> matrix_log_sum = math.log(np.sum(row_sums) + 1)
<del> row_bias_init = [math.log(x + 1) for x in row_sums]
<del> col_bias_init = [math.log(x + 1) for x in col_sums]
<del> self.row_bias = tf.Variable(
<del> row_bias_init, trainable=config.trainable_bias)
<del> self.col_bias = tf.Variable(
<del> col_bias_init, trainable=config.trainable_bias)
<del> tf.summary.histogram('row_bias', self.row_bias)
<del> tf.summary.histogram('col_bias', self.col_bias)
<del>
<del> # ===== CREATE GRAPH =====
<del>
<del> # Get input
<del> global_row, global_col, count = count_matrix_input(
<del> count_matrix_files, config.submatrix_rows, config.submatrix_cols)
<del>
<del> # Fetch embeddings.
<del> selected_row_embedding = tf.nn.embedding_lookup(
<del> self.row_embedding, global_row)
<del> selected_col_embedding = tf.nn.embedding_lookup(
<del> self.col_embedding, global_col)
<del>
<del> # Fetch biases.
<del> selected_row_bias = tf.nn.embedding_lookup([self.row_bias], global_row)
<del> selected_col_bias = tf.nn.embedding_lookup([self.col_bias], global_col)
<del>
<del> # Multiply the row and column embeddings to generate predictions.
<del> predictions = tf.matmul(
<del> selected_row_embedding, selected_col_embedding, transpose_b=True)
<del>
<del> # These binary masks separate zero from non-zero values.
<del> count_is_nonzero = tf.to_float(tf.cast(count, tf.bool))
<del> count_is_zero = 1 - tf.to_float(tf.cast(count, tf.bool))
<del>
<del> objectives = count_is_nonzero * tf.log(count + 1e-30)
<del> objectives -= tf.reshape(selected_row_bias, [config.submatrix_rows, 1])
<del> objectives -= selected_col_bias
<del> objectives += matrix_log_sum
<del>
<del> err = predictions - objectives
<del>
<del> # The confidence function scales the L2 loss based on the raw co-occurrence
<del> # count.
<del> l2_confidence = (config.confidence_base + config.confidence_scale * tf.pow(
<del> count, config.confidence_exponent))
<del>
<del> l2_loss = config.loss_multiplier * tf.reduce_sum(
<del> 0.5 * l2_confidence * err * err * count_is_nonzero)
<del>
<del> sigmoid_loss = config.loss_multiplier * tf.reduce_sum(
<del> tf.nn.softplus(err) * count_is_zero)
<del>
<del> self.loss = l2_loss + sigmoid_loss
<del>
<del> tf.summary.scalar("l2_loss", l2_loss)
<del> tf.summary.scalar("sigmoid_loss", sigmoid_loss)
<del> tf.summary.scalar("loss", self.loss)
<del>
<del> # Add optimizer.
<del> self.global_step = tf.Variable(0, name='global_step')
<del> opt = tf.train.AdagradOptimizer(config.learning_rate)
<del> self.train_op = opt.minimize(self.loss, global_step=self.global_step)
<del> self.saver = tf.train.Saver(sharded=True)
<add> log('n_submatrices: %d', self.n_submatrices)
<add>
<add> with tf.device('/cpu:0'):
<add> # ===== CREATE VARIABLES ======
<add> # Get input
<add> global_row, global_col, count = count_matrix_input(
<add> count_matrix_files, config.submatrix_rows, config.submatrix_cols)
<add>
<add> # Embeddings
<add> self.row_embedding = embeddings_with_init(
<add> embedding_dim=config.embedding_size,
<add> vocab_size=self.n_rows,
<add> name='row_embedding')
<add> self.col_embedding = embeddings_with_init(
<add> embedding_dim=config.embedding_size,
<add> vocab_size=self.n_cols,
<add> name='col_embedding')
<add> tf.summary.histogram('row_emb', self.row_embedding)
<add> tf.summary.histogram('col_emb', self.col_embedding)
<add>
<add> matrix_log_sum = math.log(np.sum(row_sums) + 1)
<add> row_bias_init = [math.log(x + 1) for x in row_sums]
<add> col_bias_init = [math.log(x + 1) for x in col_sums]
<add> self.row_bias = tf.Variable(
<add> row_bias_init, trainable=config.trainable_bias)
<add> self.col_bias = tf.Variable(
<add> col_bias_init, trainable=config.trainable_bias)
<add> tf.summary.histogram('row_bias', self.row_bias)
<add> tf.summary.histogram('col_bias', self.col_bias)
<add>
<add> # Add optimizer
<add> l2_losses = []
<add> sigmoid_losses = []
<add> self.global_step = tf.Variable(0, name='global_step')
<add> opt = tf.train.AdagradOptimizer(config.learning_rate)
<add>
<add> all_grads = []
<add>
<add> devices = ['/gpu:%d' % i for i in range(FLAGS.num_gpus)] \
<add> if FLAGS.num_gpus > 0 else get_available_gpus()
<add> self.devices_number = len(devices)
<add> with tf.variable_scope(tf.get_variable_scope()):
<add> for dev in devices:
<add> with tf.device(dev):
<add> with tf.name_scope(dev[1:].replace(':', '_')):
<add> # ===== CREATE GRAPH =====
<add> # Fetch embeddings.
<add> selected_row_embedding = tf.nn.embedding_lookup(
<add> self.row_embedding, global_row)
<add> selected_col_embedding = tf.nn.embedding_lookup(
<add> self.col_embedding, global_col)
<add>
<add> # Fetch biases.
<add> selected_row_bias = tf.nn.embedding_lookup(
<add> [self.row_bias], global_row)
<add> selected_col_bias = tf.nn.embedding_lookup(
<add> [self.col_bias], global_col)
<add>
<add> # Multiply the row and column embeddings to generate predictions.
<add> predictions = tf.matmul(
<add> selected_row_embedding, selected_col_embedding,
<add> transpose_b=True)
<add>
<add> # These binary masks separate zero from non-zero values.
<add> count_is_nonzero = tf.to_float(tf.cast(count, tf.bool))
<add> count_is_zero = 1 - count_is_nonzero
<add>
<add> objectives = count_is_nonzero * tf.log(count + 1e-30)
<add> objectives -= tf.reshape(
<add> selected_row_bias, [config.submatrix_rows, 1])
<add> objectives -= selected_col_bias
<add> objectives += matrix_log_sum
<add>
<add> err = predictions - objectives
<add>
<add> # The confidence function scales the L2 loss based on the raw
<add> # co-occurrence count.
<add> l2_confidence = (config.confidence_base +
<add> config.confidence_scale * tf.pow(
<add> count, config.confidence_exponent))
<add>
<add> l2_loss = config.loss_multiplier * tf.reduce_sum(
<add> 0.5 * l2_confidence * err * err * count_is_nonzero)
<add> l2_losses.append(tf.expand_dims(l2_loss, 0))
<add>
<add> sigmoid_loss = config.loss_multiplier * tf.reduce_sum(
<add> tf.nn.softplus(err) * count_is_zero)
<add> sigmoid_losses.append(tf.expand_dims(sigmoid_loss, 0))
<add>
<add> loss = l2_loss + sigmoid_loss
<add> grads = opt.compute_gradients(loss)
<add> all_grads.append(grads)
<add>
<add> with tf.device('/cpu:0'):
<add> # ===== MERGE LOSSES =====
<add> l2_loss = tf.reduce_mean(tf.concat(l2_losses, 0), 0, name="l2_loss")
<add> sigmoid_loss = tf.reduce_mean(tf.concat(sigmoid_losses, 0), 0,
<add> name="sigmoid_loss")
<add> self.loss = l2_loss + sigmoid_loss
<add> average = tf.train.ExponentialMovingAverage(0.8, self.global_step)
<add> loss_average_op = average.apply((self.loss,))
<add> tf.summary.scalar("l2_loss", l2_loss)
<add> tf.summary.scalar("sigmoid_loss", sigmoid_loss)
<add> tf.summary.scalar("loss", self.loss)
<add>
<add> # Apply the gradients to adjust the shared variables.
<add> apply_gradient_ops = []
<add> for grads in all_grads:
<add> apply_gradient_ops.append(opt.apply_gradients(
<add> grads, global_step=self.global_step))
<add>
<add> self.train_op = tf.group(loss_average_op, *apply_gradient_ops)
<add> self.saver = tf.train.Saver(sharded=True)
<ide>
<ide>
<ide> def main(_):
<add> tf.logging.set_verbosity(tf.logging.INFO)
<add> start_time = time.time()
<add>
<ide> # Create the output path. If this fails, it really ought to fail
<ide> # now. :)
<ide> if not os.path.isdir(FLAGS.output_base_path):
<ide> def main(_):
<ide> model = SwivelModel(FLAGS)
<ide>
<ide> # Create a session for running Ops on the Graph.
<del> gpu_options = tf.GPUOptions(
<del> per_process_gpu_memory_fraction=FLAGS.per_process_gpu_memory_fraction)
<add> gpu_opts = {}
<add> if FLAGS.per_process_gpu_memory_fraction > 0:
<add> gpu_opts["per_process_gpu_memory_fraction"] = \
<add> FLAGS.per_process_gpu_memory_fraction
<add> else:
<add> gpu_opts["allow_growth"] = True
<add> gpu_options = tf.GPUOptions(**gpu_opts)
<ide> sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
<ide>
<ide> # Run the Op to initialize the variables.
<ide> def main(_):
<ide> # Calculate how many steps each thread should run
<ide> n_total_steps = int(FLAGS.num_epochs * model.n_rows * model.n_cols) / (
<ide> FLAGS.submatrix_rows * FLAGS.submatrix_cols)
<del> n_steps_per_thread = n_total_steps / FLAGS.num_concurrent_steps
<add> n_steps_per_thread = n_total_steps / (
<add> FLAGS.num_concurrent_steps * model.devices_number)
<ide> n_submatrices_to_train = model.n_submatrices * FLAGS.num_epochs
<ide> t0 = [time.time()]
<add> n_steps_between_status_updates = 100
<add> status_i = [0]
<add> status_lock = threading.Lock()
<add> msg = ('%%%dd/%%d submatrices trained (%%.1f%%%%), %%5.1f submatrices/sec |'
<add> ' loss %%f') % len(str(n_submatrices_to_train))
<ide>
<ide> def TrainingFn():
<ide> for _ in range(int(n_steps_per_thread)):
<del> _, global_step = sess.run([model.train_op, model.global_step])
<del> n_steps_between_status_updates = 100
<del> if (global_step % n_steps_between_status_updates) == 0:
<add> _, global_step, loss = sess.run((
<add> model.train_op, model.global_step, model.loss))
<add>
<add> show_status = False
<add> with status_lock:
<add> new_i = global_step // n_steps_between_status_updates
<add> if new_i > status_i[0]:
<add> status_i[0] = new_i
<add> show_status = True
<add> if show_status:
<ide> elapsed = float(time.time() - t0[0])
<del> print('%d/%d submatrices trained (%.1f%%), %.1f submatrices/sec' % (
<del> global_step, n_submatrices_to_train,
<add> log(msg, global_step, n_submatrices_to_train,
<ide> 100.0 * global_step / n_submatrices_to_train,
<del> n_steps_between_status_updates / elapsed))
<del> sys.stdout.flush()
<add> n_steps_between_status_updates / elapsed, loss)
<ide> t0[0] = time.time()
<ide>
<ide> # Start training threads
<ide> def TrainingFn():
<ide> # Write out vectors
<ide> write_embeddings_to_disk(FLAGS, model, sess)
<ide>
<del> #Shutdown
<add> # Shutdown
<ide> sess.close()
<add> log("Elapsed: %s", time.time() - start_time)
<ide>
<ide>
<ide> if __name__ == '__main__':
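The multi-GPU pattern above (one tower per device, per-tower `compute_gradients`, and a grouped `apply_gradients` on shared variables) can be sketched in isolation. A minimal sketch, assuming a hypothetical `loss_fn` that builds one tower's loss from its own input slice:

```python
# Minimal multi-tower sketch (TF1 graph API); `loss_fn` is a hypothetical
# callable that builds one tower's loss from its own input slice.
import tensorflow as tf

def build_multi_tower_train_op(loss_fn, devices, learning_rate=0.1):
    opt = tf.train.AdagradOptimizer(learning_rate)
    global_step = tf.Variable(0, name='global_step')
    losses, apply_ops = [], []
    for dev in devices:
        with tf.device(dev):
            loss = loss_fn()
            losses.append(tf.expand_dims(loss, 0))
            # Gradients are computed per tower but applied to shared variables.
            grads = opt.compute_gradients(loss)
            apply_ops.append(opt.apply_gradients(grads, global_step=global_step))
    mean_loss = tf.reduce_mean(tf.concat(losses, 0), 0)
    train_op = tf.group(*apply_ops)
    return train_op, mean_loss
```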
| 1
|
PHP
|
PHP
|
check mbstring.func_overload before using strlen()
|
f39c683130164e64214eb7e0958763a56eb71748
|
<ide><path>lib/Cake/Network/CakeResponse.php
<ide> public function send() {
<ide> $this->_sendHeader('Content-Type', "{$this->_contentType}; charset={$this->_charset}");
<ide> $shouldSetLength = empty($this->_headers['Content-Length']) && !in_array($this->_status, range(301, 307));
<ide> if ($shouldSetLength && !$this->outputCompressed()) {
<del> $this->_headers['Content-Length'] = strlen($this->_body);
<add> if (ini_get('mbstring.func_overload') & 2 && function_exists('mb_strlen')) {
<add> $this->_headers['Content-Length'] = mb_strlen($this->_body, '8bit');
<add> } else {
<add> $this->_headers['Content-Length'] = strlen($this->_body);
<add> }
<ide> }
<ide> foreach ($this->_headers as $header => $value) {
<ide> $this->_sendHeader($header, $value);
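The fix matters because `Content-Length` must count bytes, not characters: with bit 2 of `mbstring.func_overload` set, PHP's `strlen()` returns character counts. A quick illustration of the distinction (in Python for brevity; this is not the CakePHP code):

```python
# Character count vs. byte count: Content-Length needs the latter.
body = "héllo"                    # 5 characters
print(len(body))                  # 5
print(len(body.encode("utf-8")))  # 6, since "é" is two bytes in UTF-8
```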
| 1
|
Python
|
Python
|
fix issue with clip max_value
|
6fb7ba721cf215e88176c665d2d7f90027f4e26f
|
<ide><path>keras/backend/tensorflow_backend.py
<ide> def clip(x, min_value, max_value):
<ide> # Returns
<ide> A tensor.
<ide> """
<del> if max_value < min_value:
<add> if max_value is not None and max_value < min_value:
<ide> max_value = min_value
<ide> min_value = _to_tensor(min_value, x.dtype.base_dtype)
<ide> max_value = _to_tensor(max_value, x.dtype.base_dtype)
<ide><path>keras/backend/theano_backend.py
<ide> def pow(x, a):
<ide>
<ide>
<ide> def clip(x, min_value, max_value):
<del> if max_value < min_value:
<add> if max_value is not None and max_value < min_value:
<ide> max_value = min_value
<ide> return T.clip(x, min_value, max_value)
<ide>
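The guard lets `max_value=None` stand for an open upper bound: without it, `None < min_value` either evaluates True under Python 2 (silently collapsing `max_value` to `min_value`) or raises under Python 3. A plain-NumPy sketch of the intended semantics, assuming `None` means no upper bound:

```python
# NumPy sketch of the guarded clip (assumption: None means "no upper bound").
import numpy as np

def clip(x, min_value, max_value):
    if max_value is not None and max_value < min_value:
        max_value = min_value
    if max_value is None:
        max_value = np.inf
    return np.clip(x, min_value, max_value)

print(clip(np.array([-2.0, 0.5, 99.0]), 0.0, None))  # [ 0.   0.5 99. ]
```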
| 2
|
PHP
|
PHP
|
remove invalid annotation
|
5144f2555bf5a0139e2ad859111adf0f8a9693ac
|
<ide><path>tests/TestCase/I18n/PackageTest.php
<ide> class PackageTest extends TestCase
<ide> {
<ide> /**
<del> * @covers void
<add> * Test adding messages.
<add> *
<add> * @return void
<ide> */
<ide> public function testAddMessage()
<ide> {
| 1
|
Python
|
Python
|
add docstrings to tf backend
|
201352784088145cb1b320fc7762c81f493df19e
|
<ide><path>keras/backend/tensorflow_backend.py
<ide>
<ide>
<ide> def learning_phase():
<del> # False = test, True = train
<add> '''Returns the learning phase flag.
<add>
<add> The learning phase flag is an integer tensor (0 = test, 1 = train)
<add> to be passed as input to any Keras function
<add> that uses a different behavior at train time and test time.
<add> '''
<ide> return _LEARNING_PHASE
<ide>
<ide>
<ide> def get_session():
<add> '''Returns the TF session in use by the backend.
<add> '''
<ide> global _SESSION
<ide> if _SESSION is None:
<ide> if not os.environ.get('OMP_NUM_THREADS'):
<ide> def get_session():
<ide>
<ide>
<ide> def set_session(session):
<add> '''Sets the TF session.
<add> '''
<ide> global _SESSION
<ide> _SESSION = session
<ide>
<ide>
<ide> # VARIABLE MANIPULATION
<ide>
<ide> def variable(value, dtype=_FLOATX, name=None):
<add> '''Instantiates a tensor.
<add>
<add> # Arguments
<add> value: numpy array, initial value of the tensor.
<add> dtype: tensor type.
<add> name: optional name string for the tensor.
<add>
<add> # Returns
<add> Tensor variable instance.
<add> '''
<ide> v = tf.Variable(np.asarray(value, dtype=dtype), name=name)
<ide> get_session().run(v.initializer)
<ide> return v
<ide>
<ide>
<ide> def placeholder(shape=None, ndim=None, dtype=_FLOATX, name=None):
<add> '''Instantiates a placeholder.
<add>
<add>    # Arguments
<add> shape: shape of the placeholder
<add> (integer tuple, may include None entries).
<add> ndim: number of axes of the tensor.
<add> At least one of {`shape`, `ndim`} must be specified.
<add> If both are specified, `shape` is used.
<add> dtype: placeholder type.
<add> name: optional name string for the placeholder.
<add>
<add>    # Returns
<add> Placeholder tensor instance.
<add> '''
<ide> if not shape:
<ide> if ndim:
<ide> shape = tuple([None for _ in range(ndim)])
<ide> def placeholder(shape=None, ndim=None, dtype=_FLOATX, name=None):
<ide>
<ide>
<ide> def shape(x):
<del> # symbolic shape
<add> '''Returns the symbolic shape of a tensor.
<add> '''
<ide> return tf.shape(x)
<ide>
<ide>
<ide> def int_shape(x):
<add> '''Returns the shape of a tensor as a tuple of
<add> integers or None entries.
<add> '''
<ide> shape = x.get_shape()
<ide> return tuple([i.__int__() for i in shape])
<ide>
<ide>
<ide> def ndim(x):
<add> '''Returns the number of axes in a tensor, as an integer.
<add> '''
<ide> dims = x.get_shape()._dims
<ide> if dims is not None:
<ide> return len(dims)
<ide> return None
<ide>
<ide>
<ide> def dtype(x):
<add> '''Returns the dtype of a tensor, as a string.
<add> '''
<ide> return x.dtype.name
<ide>
<ide>
<ide> def eval(x):
<del> '''Run a graph.
<add> '''Evaluates the value of a tensor.
<add> Returns a Numpy array.
<ide> '''
<ide> return x.eval(session=get_session())
<ide>
<ide>
<ide> def zeros(shape, dtype=_FLOATX, name=None):
<add> '''Instantiates an all-zeros tensor variable.
<add> '''
<ide> return variable(np.zeros(shape), dtype, name)
<ide>
<ide>
<ide> def ones(shape, dtype=_FLOATX, name=None):
<add> '''Instantiates an all-ones tensor variable.
<add> '''
<ide> return variable(np.ones(shape), dtype, name)
<ide>
<ide>
<del>def ones_like(x, name=None):
<del> return tf.ones_like(x, name=name)
<del>
<del>
<ide> def zeros_like(x, name=None):
<add> '''Instantiates an all-zeros tensor
<add> of the same shape as another tensor.
<add> '''
<ide> return tf.zeros_like(x, name=name)
<ide>
<ide>
<add>def ones_like(x, name=None):
<add> '''Instantiates an all-ones tensor
<add> of the same shape as another tensor.
<add> '''
<add> return tf.ones_like(x, name=name)
<add>
<add>
<ide> def count_params(x):
<del> '''Return number of scalars in a tensor.
<add> '''Returns the number of scalars in a tensor.
<ide> '''
<ide> shape = x.get_shape()
<ide> return np.prod([shape[i]._value for i in range(len(shape))])
<ide>
<ide>
<ide> def cast(x, dtype):
<add> '''Casts a tensor to a different dtype.
<add> '''
<ide> return tf.cast(x, dtype)
<ide>
<ide>
<ide> def dot(x, y):
<ide>
<ide>
<ide> def batch_dot(x, y, axes=None):
<del> '''batchwise dot product
<add> '''Batchwise dot product.
<add>
<ide>    batch_dot results in a tensor with fewer dimensions than the input.
<ide> If the number of dimensions is reduced to 1, we use `expand_dims` to
<ide> make sure that ndim is at least 2.
<ide> def batch_dot(x, y, axes=None):
<ide>
<ide>
<ide> def transpose(x):
<add> '''Transposes a matrix.
<add> '''
<ide> return tf.transpose(x)
<ide>
<ide>
<ide> def gather(reference, indices):
<del> '''
<add> '''Retrieves the vectors of indices `indices`
<add> in the 2D tensor `reference`.
<add>
<ide> # Arguments
<del> reference: a tensor.
<add> reference: a 2D tensor.
<ide> indices: an int tensor of indices.
<ide>
<ide> # Returns
<del> a tensor of same type as `reference`.
<add>        A 3D tensor of the same type as `reference`.
<ide> '''
<ide> return tf.gather(reference, indices)
<ide>
<ide>
<ide> # ELEMENT-WISE OPERATIONS
<ide>
<del>def normalize_axis(axis, ndim):
<add>def _normalize_axis(axis, ndim):
<ide> if type(axis) is tuple:
<ide> axis = list(axis)
<ide> if type(axis) is list:
<ide> def normalize_axis(axis, ndim):
<ide>
<ide>
<ide> def max(x, axis=None, keepdims=False):
<del> axis = normalize_axis(axis, ndim(x))
<add> '''Maximum value in a tensor.
<add> '''
<add> axis = _normalize_axis(axis, ndim(x))
<ide> return tf.reduce_max(x, reduction_indices=axis, keep_dims=keepdims)
<ide>
<ide>
<ide> def min(x, axis=None, keepdims=False):
<del> axis = normalize_axis(axis, ndim(x))
<add> '''Minimum value in a tensor.
<add> '''
<add> axis = _normalize_axis(axis, ndim(x))
<ide> return tf.reduce_min(x, reduction_indices=axis, keep_dims=keepdims)
<ide>
<ide>
<ide> def sum(x, axis=None, keepdims=False):
<ide> '''Sum of the values in a tensor, alongside the specified axis.
<ide> '''
<del> axis = normalize_axis(axis, ndim(x))
<add> axis = _normalize_axis(axis, ndim(x))
<ide> return tf.reduce_sum(x, reduction_indices=axis, keep_dims=keepdims)
<ide>
<ide>
<ide> def prod(x, axis=None, keepdims=False):
<del> '''Multiply the values in a tensor, alongside the specified axis.
<add> '''Multiplies the values in a tensor, alongside the specified axis.
<ide> '''
<del> axis = normalize_axis(axis, ndim(x))
<add> axis = _normalize_axis(axis, ndim(x))
<ide> return tf.reduce_prod(x, reduction_indices=axis, keep_dims=keepdims)
<ide>
<ide>
<ide> def std(x, axis=None, keepdims=False):
<del> axis = normalize_axis(axis, ndim(x))
<add>    '''Standard deviation of a tensor, alongside the specified axis.
<add> '''
<add> axis = _normalize_axis(axis, ndim(x))
<ide> if x.dtype.base_dtype == tf.bool:
<ide> x = tf.cast(x, _FLOATX)
<ide> m = tf.reduce_mean(x, reduction_indices=axis, keep_dims=True)
<ide> def std(x, axis=None, keepdims=False):
<ide>
<ide>
<ide> def mean(x, axis=None, keepdims=False):
<del> axis = normalize_axis(axis, ndim(x))
<add>    '''Mean of a tensor, alongside the specified axis.
<add> '''
<add> axis = _normalize_axis(axis, ndim(x))
<ide> if x.dtype.base_dtype == tf.bool:
<ide> x = tf.cast(x, _FLOATX)
<ide> return tf.reduce_mean(x, reduction_indices=axis, keep_dims=keepdims)
<ide> def mean(x, axis=None, keepdims=False):
<ide> def any(x, axis=None, keepdims=False):
<ide> '''Bitwise reduction (logical OR).
<ide>
<del> Return array of uint8 (0s and 1s).
<add>    Returns a uint8 tensor (0s and 1s).
<ide> '''
<del> axis = normalize_axis(axis, ndim(x))
<add> axis = _normalize_axis(axis, ndim(x))
<ide> x = tf.cast(x, tf.bool)
<ide> x = tf.reduce_any(x, reduction_indices=axis, keep_dims=keepdims)
<ide> return tf.cast(x, tf.uint8)
<ide>
<ide>
<ide> def argmax(x, axis=-1):
<add> '''Returns the index of the maximum value
<add> along a tensor axis.
<add> '''
<ide> if axis < 0:
<ide> axis = axis % len(x.get_shape())
<ide> return tf.argmax(x, axis)
<ide>
<ide>
<ide> def argmin(x, axis=-1):
<add> '''Returns the index of the minimum value
<add> along a tensor axis.
<add> '''
<ide> if axis < 0:
<ide> axis = axis % len(x.get_shape())
<ide> return tf.argmin(x, axis)
<ide>
<ide>
<ide> def square(x):
<add> '''Element-wise square.
<add> '''
<ide> return tf.square(x)
<ide>
<ide>
<ide> def abs(x):
<add> '''Element-wise absolute value.
<add> '''
<ide> return tf.abs(x)
<ide>
<ide>
<ide> def sqrt(x):
<add> '''Element-wise square root.
<add> '''
<ide> x = tf.clip_by_value(x, tf.cast(0., dtype=_FLOATX),
<ide> tf.cast(np.inf, dtype=_FLOATX))
<ide> return tf.sqrt(x)
<ide>
<ide>
<ide> def exp(x):
<add> '''Element-wise exponential.
<add> '''
<ide> return tf.exp(x)
<ide>
<ide>
<ide> def log(x):
<add> '''Element-wise log.
<add> '''
<ide> return tf.log(x)
<ide>
<ide>
<ide> def round(x):
<add> '''Element-wise rounding to the closest integer.
<add> '''
<ide> return tf.round(x)
<ide>
<ide>
<ide> def sign(x):
<add> '''Element-wise sign.
<add> '''
<ide> return tf.sign(x)
<ide>
<ide>
<ide> def pow(x, a):
<add> '''Element-wise exponentiation.
<add> '''
<ide> return tf.pow(x, a)
<ide>
<ide>
<ide> def clip(x, min_value, max_value):
<add> '''Element-wise value clipping.
<add> '''
<ide> if max_value < min_value:
<ide> max_value = min_value
<ide> return tf.clip_by_value(x, tf.cast(min_value, dtype=_FLOATX),
<ide> tf.cast(max_value, dtype=_FLOATX))
<ide>
<ide>
<ide> def equal(x, y):
<add> '''Element-wise equality between two tensors.
<add> Returns a bool tensor.
<add> '''
<ide> return tf.equal(x, y)
<ide>
<ide>
<ide> def not_equal(x, y):
<add> '''Element-wise inequality between two tensors.
<add> Returns a bool tensor.
<add> '''
<ide> return tf.not_equal(x, y)
<ide>
<ide>
<ide> def maximum(x, y):
<add> '''Element-wise maximum of two tensors.
<add> '''
<ide> return tf.maximum(x, y)
<ide>
<ide>
<ide> def minimum(x, y):
<add> '''Element-wise minimum of two tensors.
<add> '''
<ide> return tf.minimum(x, y)
<ide>
<ide>
<ide> # SHAPE OPERATIONS
<ide>
<ide> def concatenate(tensors, axis=-1):
<add>    '''Concatenates a list of tensors alongside the specified axis.
<add> '''
<ide> if axis < 0:
<ide> if len(tensors[0].get_shape()):
<ide> axis = axis % len(tensors[0].get_shape())
<ide> def concatenate(tensors, axis=-1):
<ide>
<ide>
<ide> def reshape(x, shape):
<add> '''Reshapes a tensor to the specified shape.
<add> '''
<ide> return tf.reshape(x, shape)
<ide>
<ide>
<ide> def permute_dimensions(x, pattern):
<del> '''Transpose dimensions.
<add> '''Permutes axes in a tensor.
<ide>
<ide> # Arguments
<del> pattern: should be a tuple or list of
<del> dimension indices, e.g. [0, 2, 1].
<add> pattern: should be a tuple of
<add> dimension indices, e.g. (0, 2, 1).
<ide> '''
<ide> return tf.transpose(x, perm=pattern)
<ide>
<ide>
<ide> def resize_images(X, height_factor, width_factor, dim_ordering):
<del> '''Resize the images contained in a 4D tensor of shape
<add> '''Resizes the images contained in a 4D tensor of shape
<ide> - [batch, channels, height, width] (for 'th' dim_ordering)
<ide> - [batch, height, width, channels] (for 'tf' dim_ordering)
<ide> by a factor of (height_factor, width_factor). Both factors should be
<ide> def repeat_elements(x, rep, axis):
<ide>
<ide>
<ide> def repeat(x, n):
<del> '''Repeat a 2D tensor:
<add> '''Repeats a 2D tensor:
<ide>
<ide> if x has shape (samples, dim) and n=2,
<ide> the output will have shape (samples, 2, dim)
<ide> def batch_flatten(x):
<ide>
<ide>
<ide> def expand_dims(x, dim=-1):
<del> '''Add a 1-sized dimension at index "dim".
<add> '''Adds a 1-sized dimension at index "dim".
<ide> '''
<ide> return tf.expand_dims(x, dim)
<ide>
<ide>
<ide> def squeeze(x, axis):
<del> '''Remove a 1-dimension from the tensor at index "axis".
<add> '''Removes a 1-dimension from the tensor at index "axis".
<ide> '''
<ide> return tf.squeeze(x, [axis])
<ide>
<ide>
<ide> def temporal_padding(x, padding=1):
<del> '''Pad the middle dimension of a 3D tensor
<add> '''Pads the middle dimension of a 3D tensor
<ide> with "padding" zeros left and right.
<ide> '''
<ide> pattern = [[0, 0], [padding, padding], [0, 0]]
<ide> return tf.pad(x, pattern)
<ide>
<ide>
<ide> def spatial_2d_padding(x, padding=(1, 1), dim_ordering='th'):
<del> '''Pad the 2nd and 3rd dimensions of a 4D tensor
<add> '''Pads the 2nd and 3rd dimensions of a 4D tensor
<ide> with "padding[0]" and "padding[1]" (resp.) zeros left and right.
<ide> '''
<ide> if dim_ordering == 'th':
<ide> def pack(x):
<ide>
<ide>
<ide> def get_value(x):
<del> '''Technically the same as eval() for TF.
<add> '''Returns the value of a tensor variable,
<add> as a Numpy array.
<ide> '''
<ide> return x.eval(session=get_session())
<ide>
<ide>
<ide> def set_value(x, value):
<add> '''Sets the value of a tensor variable,
<add> from a Numpy array.
<add> '''
<ide> tf.assign(x, np.asarray(value)).op.run(session=get_session())
<ide>
<ide>
<ide> def __call__(self, inputs):
<ide>
<ide>
<ide> def function(inputs, outputs, updates=[], **kwargs):
<add> '''Instantiates a Keras function.
<add>
<add> # Arguments
<add> inputs: list of placeholder/variable tensors.
<add> outputs: list of output tensors.
<add> updates: list of update tuples (old_tensor, new_tensor).
<add> '''
<ide> if len(kwargs) > 0:
<ide> msg = [
<ide> "Expected no kwargs, you passed %s" % len(kwargs),
<ide> def function(inputs, outputs, updates=[], **kwargs):
<ide>
<ide>
<ide> def gradients(loss, variables):
<add> '''Returns the gradients of `variables` (list of tensor variables)
<add> with regard to `loss`.
<add> '''
<ide> return tf.gradients(loss, variables)
<ide>
<ide>
<ide> def rnn(step_function, inputs, initial_states,
<ide> mask: binary tensor with shape (samples, time, 1),
<ide> with a zero for every element that is masked.
<ide> constants: a list of constant values passed at each step.
<del> unroll: with TensorFlow the RNN is always unrolled, so this
<del> does do anything for the time being.
<add> unroll: with TensorFlow the RNN is always unrolled, but with Theano you
<add> can use this boolean flag to unroll the RNN.
<ide> input_length: not relevant in the TensorFlow implementation.
<add> Must be specified if using unrolling with Theano.
<ide>
<ide> # Returns
<ide> A tuple (last_output, outputs, new_states).
<ide> def rnn(step_function, inputs, initial_states,
<ide>
<ide>
<ide> def switch(condition, then_expression, else_expression):
<del> '''Switch between two operations depending on a scalar value (int or bool).
<add> '''Switches between two operations depending on a scalar value (int or bool).
<ide> Note that both `then_expression` and `else_expression`
<ide> should be symbolic tensors of the *same shape*.
<ide>
<ide> def in_test_phase(x, expression):
<ide> # NN OPERATIONS
<ide>
<ide> def relu(x, alpha=0., max_value=None):
<del> '''ReLU.
<add>    '''Rectified linear unit.
<ide>
<ide> # Arguments
<ide> alpha: slope of negative section.
<ide> def relu(x, alpha=0., max_value=None):
<ide>
<ide>
<ide> def softmax(x):
<add> '''Softmax of a tensor.
<add> '''
<ide> return tf.nn.softmax(x)
<ide>
<ide>
<ide> def softplus(x):
<add> '''Softplus of a tensor.
<add> '''
<ide> return tf.nn.softplus(x)
<ide>
<ide>
<ide> def categorical_crossentropy(output, target, from_logits=False):
<del> '''Note: tf.nn.softmax_cross_entropy_with_logits
<del> expects logits, Keras expects probabilities.
<add> '''Categorical crossentropy between an output tensor
<add> and a target tensor, where the target is a tensor of the same
<add> shape as the output.
<ide> '''
<add> # Note: tf.nn.softmax_cross_entropy_with_logits
<add> # expects logits, Keras expects probabilities.
<ide> if not from_logits:
<ide> # scale preds so that the class probas of each sample sum to 1
<ide> output /= tf.reduce_sum(output,
<ide> def categorical_crossentropy(output, target, from_logits=False):
<ide>
<ide>
<ide> def sparse_categorical_crossentropy(output, target, from_logits=False):
<del> '''Note: tf.nn.sparse_softmax_cross_entropy_with_logits
<del> expects logits, Keras expects probabilities.
<add> '''Categorical crossentropy between an output tensor
<add> and a target tensor, where the target is an integer tensor.
<ide> '''
<add>    # Note: tf.nn.sparse_softmax_cross_entropy_with_logits
<add> # expects logits, Keras expects probabilities.
<ide> if not from_logits:
<ide> output = tf.clip_by_value(output, tf.cast(_EPSILON, dtype=_FLOATX),
<ide> tf.cast(1.-_EPSILON, dtype=_FLOATX))
<ide> def sparse_categorical_crossentropy(output, target, from_logits=False):
<ide>
<ide>
<ide> def binary_crossentropy(output, target, from_logits=False):
<del> '''Note: tf.nn.sigmoid_cross_entropy_with_logits
<del> expects logits, Keras expects probabilities.
<add> '''Binary crossentropy between an output tensor and a target tensor.
<ide> '''
<add>    # Note: tf.nn.sigmoid_cross_entropy_with_logits
<add> # expects logits, Keras expects probabilities.
<ide> if not from_logits:
<ide> # transform back to logits
<ide> output = tf.clip_by_value(output, tf.cast(_EPSILON, dtype=_FLOATX),
<ide> def binary_crossentropy(output, target, from_logits=False):
<ide>
<ide>
<ide> def sigmoid(x):
<add> '''Element-wise sigmoid.
<add> '''
<ide> return tf.nn.sigmoid(x)
<ide>
<ide>
<ide> def hard_sigmoid(x):
<add> '''Segment-wise linear approximation of sigmoid.
<add> Faster than sigmoid.
<add> '''
<ide> x = (0.2 * x) + 0.5
<ide> x = tf.clip_by_value(x, tf.cast(0., dtype=_FLOATX),
<ide> tf.cast(1., dtype=_FLOATX))
<ide> return x
<ide>
<ide>
<ide> def tanh(x):
<add> '''Element-wise tanh.
<add> '''
<ide> return tf.nn.tanh(x)
<ide>
<ide>
<ide> def dropout(x, level, seed=None):
<add> '''Sets entries in `x` to zero at random,
<add> while scaling the entire tensor.
<add>
<add> # Arguments
<add> x: tensor
<add> level: fraction of the entries in the tensor
<add> that will be set to 0
<add> seed: random seed to ensure determinism.
<add> '''
<ide> retain_prob = 1. - level
<ide> if seed is None:
<ide> seed = np.random.randint(10e6)
<ide> def dropout(x, level, seed=None):
<ide>
<ide>
<ide> def l2_normalize(x, axis):
<add>    '''Normalizes a tensor wrt the L2 norm alongside the specified axis.
<add> '''
<ide> if axis < 0:
<ide> axis = axis % len(x.get_shape())
<ide> return tf.nn.l2_normalize(x, dim=axis)
<ide> def l2_normalize(x, axis):
<ide>
<ide> def conv2d(x, kernel, strides=(1, 1), border_mode='valid', dim_ordering='th',
<ide> image_shape=None, filter_shape=None):
<del> '''Runs on cuDNN if available.
<add> '''2D convolution.
<ide>
<ide> # Arguments
<add> kernel: kernel tensor.
<add> strides: strides tuple.
<ide> border_mode: string, "same" or "valid".
<del> dim_ordering: whether to use Theano or TensorFlow dimension ordering
<add> dim_ordering: "tf" or "th". Whether to use Theano or TensorFlow dimension ordering
<ide>            in inputs/kernels/outputs.
<ide> '''
<ide> if border_mode == 'same':
<ide> def conv2d(x, kernel, strides=(1, 1), border_mode='valid', dim_ordering='th',
<ide>
<ide> def pool2d(x, pool_size, strides=(1, 1),
<ide> border_mode='valid', dim_ordering='th', pool_mode='max'):
<del> '''
<add> '''2D Pooling.
<add>
<ide> # Arguments
<ide> pool_size: tuple of 2 integers.
<ide> strides: tuple of 2 integers.
<ide> border_mode: one of "valid", "same".
<ide> dim_ordering: one of "th", "tf".
<add> pool_mode: one of "max", "avg".
<ide> '''
<ide> if border_mode == 'same':
<ide> padding = 'SAME'
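A short usage sketch exercising a few of the functions documented above (assumes a Keras 1.x-era install with the TensorFlow backend; shapes and values are illustrative):

```python
# Illustrative only; assumes a Keras 1.x-era install with the TF backend.
import numpy as np
from keras import backend as K

x = K.placeholder(shape=(None, 3))   # symbolic input
w = K.variable(np.ones((3, 2)))      # shared weight tensor
y = K.dot(x, w)
f = K.function([x], [y])             # compiles inputs -> outputs
out = f([np.ones((2, 3), dtype='float32')])[0]
print(out.shape)                     # (2, 2); each entry is 3.0
```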
| 1
|
Go
|
Go
|
remove the lookup before pushing
|
89763bc8af81dcda3d9989cf6687c75101bbd9bc
|
<ide><path>commands.go
<ide> func (srv *Server) CmdPush(stdin io.ReadCloser, stdout io.Writer, args ...string
<ide> remote = local
<ide> }
<ide>
<add> Debugf("Pushing [%s] to [%s]\n", local, remote)
<add>
<ide> // Try to get the image
<ide> // FIXME: Handle lookup
<ide> // FIXME: Also push the tags in case of ./docker push myrepo:mytag
<ide> // img, err := srv.runtime.LookupImage(cmd.Arg(0))
<ide> img, err := srv.runtime.graph.Get(local)
<ide> if err != nil {
<add> Debugf("The push refers to a repository [%s] (len: %d)\n", local, len(srv.runtime.repositories.Repositories[local]))
<add>
<ide> // If it fails, try to get the repository
<ide> if localRepo, exists := srv.runtime.repositories.Repositories[local]; exists {
<del> fmt.Fprintf(stdout, "Pushing %s (%d images) on %s...\n", local, len(localRepo), remote)
<add> fmt.Fprintf(stdout, "Pushing %s (%d tags) on %s...\n", local, len(localRepo), remote)
<ide> if err := srv.runtime.graph.PushRepository(remote, localRepo, srv.runtime.authConfig); err != nil {
<ide> return err
<ide> }
<ide><path>registry.go
<ide> func (graph *Graph) pushPrimitive(remote, tag, imgId string, authConfig *auth.Au
<ide> // Remote has the format '<user>/<repo>
<ide> func (graph *Graph) PushRepository(remote string, localRepo Repository, authConfig *auth.AuthConfig) error {
<ide> // Check if the remote repository exists
<del> if !graph.LookupRemoteRepository(remote, authConfig) {
<del> return fmt.Errorf("The remote repository %s does not exist\n", remote)
<del> }
<add> // FIXME: @lopter How to handle this?
<add> // if !graph.LookupRemoteRepository(remote, authConfig) {
<add> // return fmt.Errorf("The remote repository %s does not exist\n", remote)
<add> // }
<ide>
<ide> // For each image within the repo, push them
<ide> for tag, imgId := range localRepo {
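The control flow (try to resolve the name as a single image, fall back to the repository map only when that lookup fails) reads roughly like this Python sketch; `graph`, `repositories`, and `push_image` are hypothetical stand-ins, not the Go code:

```python
# Hypothetical stand-ins for srv.runtime.graph / srv.runtime.repositories.
def cmd_push(name, graph, repositories):
    image = graph.get(name)
    if image is not None:
        return push_image(image)           # single image: push it directly
    tags = repositories.get(name)          # fall back to the repository map
    if tags is None:
        raise LookupError("No such image or repository: %s" % name)
    print("Pushing %s (%d tags)..." % (name, len(tags)))
    for tag, image_id in tags.items():
        push_image(graph.get(image_id), tag=tag)

def push_image(image, tag=None):           # stub for the sketch
    print("pushed", image, tag)
```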
| 2
|
Text
|
Text
|
fix typo in contributing.md
|
380a9b5b83a60aed93599dcd854b77990633bf38
|
<ide><path>CONTRIBUTING.md
<ide> This sets up the local webserver using connect and then watches source files, te
<ide>
<ide> #### Step 1: Verify
<ide>
<del>Whether you're adding something new, making something better, or fixing a bug, you'll first want to search the [GitHub issues](https://github.com/videojs/video.js/issues) and [plugins list](https://github.com/videojs/video.js/wiki/Plugins) to make sure you're aware of any previous discussion or work. If an unclaimed issue exists, claim it via a comment. If no issue exists for your change, submit one, follwing the [issue filing guidelines](#filing-issues).
<add>Whether you're adding something new, making something better, or fixing a bug, you'll first want to search the [GitHub issues](https://github.com/videojs/video.js/issues) and [plugins list](https://github.com/videojs/video.js/wiki/Plugins) to make sure you're aware of any previous discussion or work. If an unclaimed issue exists, claim it via a comment. If no issue exists for your change, submit one, following the [issue filing guidelines](#filing-issues).
<ide>
<ide> #### Step 2: Update remote
<ide>
| 1
|
Text
|
Text
|
update typescript instructions
|
3f2027d58376bb9bb0db069603b50fae2619a4c9
|
<ide><path>packages/next/README.md
<ide> As `href` is a filesystem path, it shouldn't change at runtime, instead, you wil
<ide> dynamically according to your needs. Here's an example to create a list of links:
<ide>
<ide> ```jsx
<del>const pids = ['id1', 'id2', 'id3'];
<del>{pids.map(pid => (
<del> <Link href="/post/[pid]" as={`/post/${pid}`}>
<del> <a>Post {pid}</a>
<del> </Link>
<del>))}
<add>const pids = ['id1', 'id2', 'id3']
<add>{
<add> pids.map(pid => (
<add> <Link href="/post/[pid]" as={`/post/${pid}`}>
<add> <a>Post {pid}</a>
<add> </Link>
<add> ))
<add>}
<ide> ```
<ide>
<ide> > You can [read more about `<Link>` here](#with-link).
<ide> export default withRouter(MyLink)
<ide> <li><a href="/examples/api-routes-micro">API routes with micro</a></li>
<ide> <li><a href="/examples/api-routes-middleware">API routes with middleware</a></li>
<ide> <li><a href="/examples/api-routes-graphql">API routes with GraphQL server</a></li>
<del> <li><a href="/examples/api-routes-rest">API routes with REST</a></li>
<add> <li><a href="/examples/api-routes-rest">API routes with REST</a></li>
<ide> </ul>
<ide> </details>
<ide>
<ide> export const config = {
<ide> api: {
<ide> bodyParser: {
<ide> sizeLimit: '1mb',
<del> }
<add> },
<ide> },
<ide> }
<ide> ```
<ide> The [polyfills](https://github.com/zeit/next.js/tree/canary/examples/with-polyfi
<ide>
<ide> ## TypeScript
<ide>
<del>TypeScript is supported out of the box in Next.js. To get started using it create a `tsconfig.json` in your project:
<add>Next.js provides an integrated TypeScript experience out of the box, similar to an IDE.
<ide>
<del>```js
<del>{
<del> "compilerOptions": {
<del> "allowJs": true, /* Allow JavaScript files to be type checked. */
<del> "alwaysStrict": true, /* Parse in strict mode. */
<del> "esModuleInterop": true, /* matches compilation setting */
<del> "isolatedModules": true, /* to match webpack loader */
<del> "jsx": "preserve", /* Preserves jsx outside of Next.js. */
<del> "lib": ["dom", "es2017"], /* List of library files to be included in the type checking. */
<del> "module": "esnext", /* Specifies the type of module to type check. */
<del> "moduleResolution": "node", /* Determine how modules get resolved. */
<del> "noEmit": true, /* Do not emit outputs. Makes sure tsc only does type checking. */
<del>
<del> /* Strict Type-Checking Options, optional, but recommended. */
<del> "noFallthroughCasesInSwitch": true, /* Report errors for fallthrough cases in switch statement. */
<del> "noUnusedLocals": true, /* Report errors on unused locals. */
<del> "noUnusedParameters": true, /* Report errors on unused parameters. */
<del> "strict": true, /* Enable all strict type-checking options. */
<del> "target": "esnext" /* The type checking input. */
<del> }
<del>}
<add>To get started, create an empty `tsconfig.json` file in the root of your project:
<add>
<add>```bash
<add>touch tsconfig.json
<ide> ```
<ide>
<del>After adding the `tsconfig.json` you need to install `@types` to get proper TypeScript typing.
<add>Next.js will automatically configure this file with default values for you (providing [your own `tsconfig.json`](https://www.typescriptlang.org/docs/handbook/compiler-options.html) is also supported).
<add>
<add>Then, run `next dev` (normally `npm run dev`) and Next.js will guide you through installing the necessary packages to complete setup.
<ide>
<ide> ```bash
<del>npm install --save-dev @types/react @types/react-dom @types/node
<add>npm run dev
<add>
<add># You'll see instructions like these:
<add>#
<add># Please install typescript, @types/react, and @types/node by running:
<add>#
<add># yarn add --dev typescript @types/react @types/node
<add>#
<add># ...
<ide> ```
<ide>
<del>Now can change any file from `.js` to `.ts` / `.tsx` (tsx is for files using JSX). To learn more about TypeScript checkout its [documentation](https://www.typescriptlang.org/).
<add>You're now ready to start converting files from `.js` to `.tsx` and leveraging the benefits TypeScript provides!
<add>
<add>To learn more about TypeScript, check out its [documentation](https://www.typescriptlang.org/).
<add>
<add>> **Note**: Next.js will create a file named `next-env.d.ts` in the root of your project.
<add>> This file ensures Next.js' types are picked up by the TypeScript compiler.
<add>>
<add>> **You cannot remove this file, however, you can edit it (but don't need to).**
<add>
<add>> **Note**: Next.js does not enable TypeScript's `strict` mode by default.
<add>> When you feel comfortable with TypeScript, you may turn this option on in your `tsconfig.json`.
<ide>
<ide> ### Exported types
<ide>
| 1
|
PHP
|
PHP
|
fix comment style
|
9899909363e80ad4c160d1e136dfa754d4f05401
|
<ide><path>tests/TestCase/Validation/ValidationTest.php
<ide> public function testLocalizedTime()
<ide> $this->assertFalse(Validation::localizedTime('', 'date'));
<ide> $this->assertFalse(Validation::localizedTime('invalid', 'date'));
<ide>
<del> /* English (US) */
<add> // English (US)
<ide> I18N::locale('en_US');
<ide> $this->assertTrue(Validation::localizedTime('12/31/2006', 'date'));
<ide> $this->assertTrue(Validation::localizedTime('6.40pm', 'time'));
<ide> public function testLocalizedTime()
<ide> $this->assertFalse(Validation::localizedTime('31. Dezember 2006', 'date')); // non-US format
<ide> $this->assertFalse(Validation::localizedTime('18:40', 'time')); // non-US format
<ide>
<del> /* German */
<add> // German
<ide> I18N::locale('de_DE');
<ide> $this->assertTrue(Validation::localizedTime('31.12.2006', 'date'));
<ide> $this->assertTrue(Validation::localizedTime('31. Dezember 2006', 'date'));
<ide> $this->assertTrue(Validation::localizedTime('18:40', 'time'));
<ide>
<ide> $this->assertFalse(Validation::localizedTime('December 31, 2006', 'date')); // non-German format
<ide>
<del> /* Russian */
<add> // Russian
<ide> I18N::locale('ru_RU');
<ide> $this->assertTrue(Validation::localizedTime('31 декабря 2006', 'date'));
<ide>
| 1
|
Ruby
|
Ruby
|
need a default application.css for propshaft
|
78446733905556cd51775d253e3d13e3e7a477e2
|
<ide><path>railties/lib/rails/generators/rails/app/app_generator.rb
<ide> def delete_assets_initializer_skipping_sprockets
<ide> if skip_sprockets?
<ide> remove_file "config/initializers/assets.rb"
<ide> remove_file "app/assets/config/manifest.js"
<add> remove_dir "app/assets/config"
<ide> remove_file "app/assets/stylesheets/application.css"
<add> create_file "app/assets/stylesheets/application.css", "/* Application styles */\n"
<ide> end
<ide> end
<ide>
| 1
|
Text
|
Text
|
add changelog for system tests
|
13151d511ec4dacfd750d0c630970a24c4f960cf
|
<ide><path>actionpack/CHANGELOG.md
<add>* Add `ActionDispatch::SystemTestCase` to Action Pack
<add>
<add> Adds Capybara integration directly into Rails through Action Pack!
<add>
<add> See PR [#26703](https://github.com/rails/rails/pull/26703)
<add>
<add> *Eileen M. Uchitelle*
<add>
<ide> * Remove deprecated `.to_prepare`, `.to_cleanup`, `.prepare!` and `.cleanup!` from `ActionDispatch::Reloader`.
<ide>
<ide> *Rafael Mendonça França*
| 1
|
Javascript
|
Javascript
|
replace const / var with let
|
9f271b8f38f7104e031938c17149a48273a435f3
|
<ide><path>test/parallel/test-vm-function-declaration.js
<ide> const o = vm.createContext({ console });
<ide>
<ide> // Function declaration and expression should both be copied to the
<ide> // sandboxed context.
<del>let code = 'var a = function() {};\n';
<add>let code = 'let a = function() {};\n';
<ide> code += 'function b(){}\n';
<add>code += 'var c = function() {};\n';
<add>code += 'var d = () => {};\n';
<add>code += 'let e = () => {};\n';
<ide>
<ide> // Grab the global b function as the completion value, to ensure that
<ide> // we are getting the global function, and not some other thing
<ide> code += '(function(){return this})().b;\n';
<ide>
<ide> const res = vm.runInContext(code, o, 'test');
<del>
<ide> assert.strictEqual(typeof res, 'function');
<ide> assert.strictEqual(res.name, 'b');
<del>assert.strictEqual(typeof o.a, 'function');
<add>assert.strictEqual(typeof o.a, 'undefined');
<ide> assert.strictEqual(typeof o.b, 'function');
<add>assert.strictEqual(typeof o.c, 'function');
<add>assert.strictEqual(typeof o.d, 'function');
<add>assert.strictEqual(typeof o.e, 'undefined');
<ide> assert.strictEqual(res, o.b);
| 1
|
Text
|
Text
|
fix typo in http2.md
|
469036add4e0e879de3e9307bca4d22a1d547793
|
<ide><path>doc/api/http2.md
<ide> changes:
<ide> Http2ServerRequest class to use.
<ide> Useful for extending the original `Http2ServerRequest`.
<ide> **Default:** `Http2ServerRequest`
<del> * `Http2ServerResponse` {htt2.Http2ServerResponse} Specifies the
<add> * `Http2ServerResponse` {http2.Http2ServerResponse} Specifies the
<ide> Http2ServerResponse class to use.
<ide> Useful for extending the original `Http2ServerResponse`.
<ide> **Default:** `Http2ServerResponse`
| 1
|
Text
|
Text
|
apply sentence case to headers in doc/guides
|
0486cb86cd7f5e99caf3e249fda7013420c2b32f
|
<ide><path>doc/guides/commit-queue.md
<del># Commit Queue
<add># Commit queue
<ide>
<ide> > Stability: 1 - Experimental
<ide>
<ide> From a high level, the Commit Queue works as follows:
<ide> 3. Close the PR
<ide> 4. Go to next PR in the queue
<ide>
<del>## Current Limitations
<add>## Current limitations
<ide>
<ide> The Commit Queue feature is still in early stages, and as such it might not
<ide> work for more complex Pull Requests. These are the currently known limitations
<ide> If no errors happen during `git node land`, the script will use the
<ide> `Landed in ...` comment in the PR, and then will close it. Iteration continues
<ide> until all PRs have done the steps above.
<ide>
<del>## Reverting Broken Commits
<add>## Reverting broken commits
<ide>
<ide> Reverting broken commits is done manually by Collaborators, just like when
<ide> commits are landed manually via `git node land`. An easy way to revert is a
<ide><path>doc/guides/diagnostic-tooling-support-tiers.md
<del># Diagnostic Tooling Support Tiers
<add># Diagnostic tooling support tiers
<ide>
<ide> Diagnostic tooling is important to the consumers of Node.js. It is used both
<ide> in development and in production in order to investigate problems. The failure
<ide><path>doc/guides/investigating_native_memory_leak.md
<del># Investigating Memory Leaks with valgrind
<add># Investigating memory leaks with valgrind
<ide>
<ide> A Node.js process may run out of memory due to excessive consumption of
<ide> native memory. Native Memory is memory which is not managed by the
<ide><path>doc/guides/maintaining-icu.md
<ide> $ node -p process.versions
<ide> }
<ide> ```
<ide>
<del>### Time Zone Data
<add>### Time zone data
<ide>
<ide> Time zone data files are updated independently of ICU CLDR data. ICU and its
<ide> main data files do not need to be upgraded in order to apply time zone data file
<ide> in the icu/icu-data repository.
<ide> All modern versions of Node.js use the version 44 ABI of the time zone data
<ide> files.
<ide>
<del>#### Example: Updating the ICU `.dat` File
<add>#### Example: updating the ICU `.dat` file
<ide>
<ide> * Decompress `deps/icu/source/data/in/icudt##l.dat.bz2`, where `##` is
<ide> the ICU major version number.
<ide> files.
<ide> * Build, test, verifying `process.versions.tz` matches the desired version.
<ide> * Create a new minor version release.
<ide>
<del>## Release Schedule
<add>## Release schedule
<ide>
<ide> ICU typically has >1 release a year, particularly coinciding with a major
<ide> release of [Unicode][]. The current release schedule is available on the [ICU][]
<ide> You should see a message such as:
<ide> INFO: Using floating patch "tools/icu/patches/63/source/tools/toolutil/pkg_genc.cpp" from "tools/icu"
<ide> ```
<ide>
<del>### Clean Up
<add>### Clean up
<ide>
<ide> Any patches older than the minimum version given in `tools/icu/icu_versions.json`
<ide> ought to be deleted, because they will never be used.
<ide><path>doc/guides/maintaining-npm.md
<ide> $ git commit -m "doc: update npm LICENSE using license-builder.sh"
<ide>
<ide> Note: please ensure you are only making the updates that are changed by npm.
<ide>
<del>## Step 4: Apply Whitespace fix
<add>## Step 4: Apply whitespace fix
<ide>
<ide> ```console
<ide> $ git rebase --whitespace=fix master
<ide><path>doc/guides/maintaining-openssl.md
<ide> currently need to generate three PRs as follows:
<ide> * `nasm` (<https://www.nasm.us/>) Version 2.11 or higher is needed.
<ide> * GNU `as` in binutils. Version 2.26 or higher is needed.
<ide>
<del>## 0. Check Requirements
<add>## 0. Check requirements
<ide>
<ide> ```console
<ide> % perl -v
<ide><path>doc/guides/maintaining-root-certs.md
<del># Maintaining the Root Certificates
<add># Maintaining the root certificates
<ide>
<ide> Node.js contains a compiled-in set of root certificates used as trust anchors
<ide> for TLS certificate validation.
<ide><path>doc/guides/maintaining-the-build-files.md
<del># Maintaining the Build files
<add># Maintaining the build files
<ide>
<ide> This document explains how to maintain the build files in the codebase.
<ide>
| 8
|
Ruby
|
Ruby
|
change core formula tap naming
|
2afa1c3b86356292a751468bf67bc615bb6594f5
|
<ide><path>Library/Homebrew/cmd/tap.rb
<ide> def link_tap_formula formulae
<ide> else
<ide> to = to.realpath if to.exist?
<ide> # Whitelist gcc42 temporarily until Mavericks/Xcode 5.0 issues are resolved.
<del> unless to.tap_ref == 'mxcl/master/apple-gcc42'
<add> unless to.tap_ref == 'Homebrew/homebrew/apple-gcc42'
<ide> opoo "Could not tap #{Tty.white}#{from.tap_ref}#{Tty.reset} over #{Tty.white}#{to.tap_ref}#{Tty.reset}"
<ide> end
<ide> end
<ide> def tap_ref
<ide> when %r{^#{HOMEBREW_LIBRARY}/Taps/([a-z\-_]+)-(\w+)/(.+)}
<ide> "#$1/#$2/#{File.basename($3, '.rb')}"
<ide> when %r{^#{HOMEBREW_LIBRARY}/Formula/(.+)}
<del> "mxcl/master/#{File.basename($1, '.rb')}"
<add> "Homebrew/homebrew/#{File.basename($1, '.rb')}"
<ide> else
<ide> nil
<ide> end
<ide><path>Library/Homebrew/formula.rb
<ide> def tap
<ide> if path.realpath.to_s =~ HOMEBREW_TAP_DIR_REGEX
<ide> "#$1/#$2"
<ide> elsif core_formula?
<del> "mxcl/master"
<add> "Homebrew/homebrew"
<ide> else
<ide> "path or URL"
<ide> end
| 2
|
Mixed
|
Python
|
add docs for 401 vs 403 responses
|
5ae49a4ec4ccfdab13bc848ecd175d44ecaf4ed1
|
<ide><path>docs/api-guide/authentication.md
<ide>
<ide> Authentication is the mechanism of associating an incoming request with a set of identifying credentials, such as the user the request came from, or the token that it was signed with. The [permission] and [throttling] policies can then use those credentials to determine if the request should be permitted.
<ide>
<del>REST framework provides a number of authentication policies out of the box, and also allows you to implement custom policies.
<add>REST framework provides a number of authentication schemes out of the box, and also allows you to implement custom schemes.
<ide>
<ide> Authentication will run the first time either the `request.user` or `request.auth` properties are accessed, and determines how those properties are initialized.
<ide>
<ide> The `request.auth` property is used for any additional authentication informatio
<ide>
<ide> ---
<ide>
<del>**Note:** Don't forget that authentication by itself wont allow or disallow an incoming request, it simply identifies the credentials that the request was made with. For information on how to setup the permission polices for your API please see the [permissions documentation][permission].
<add>**Note:** Don't forget that **authentication by itself won't allow or disallow an incoming request**; it simply identifies the credentials that the request was made with. For information on how to set up the permission policies for your API, please see the [permissions documentation][permission].
<ide>
<ide> ---
<ide>
<ide> ## How authentication is determined
<ide>
<del>The authentication policy is always defined as a list of classes. REST framework will attempt to authenticate with each class in the list, and will set `request.user` and `request.auth` using the return value of the first class that successfully authenticates.
<add>The authentication schemes are always defined as a list of classes. REST framework will attempt to authenticate with each class in the list, and will set `request.user` and `request.auth` using the return value of the first class that successfully authenticates.
<ide>
<ide> If no class authenticates, `request.user` will be set to an instance of `django.contrib.auth.models.AnonymousUser`, and `request.auth` will be set to `None`.
<ide>
<ide> The value of `request.user` and `request.auth` for unauthenticated requests can be modified using the `UNAUTHENTICATED_USER` and `UNAUTHENTICATED_TOKEN` settings.
<ide>
<del>## Setting the authentication policy
<add>## Setting the authentication scheme
<ide>
<del>The default authentication policy may be set globally, using the `DEFAULT_AUTHENTICATION` setting. For example.
<add>The default authentication schemes may be set globally, using the `DEFAULT_AUTHENTICATION` setting. For example.
<ide>
<ide> REST_FRAMEWORK = {
<ide> 'DEFAULT_AUTHENTICATION': (
<ide> The default authentication policy may be set globally, using the `DEFAULT_AUTHEN
<ide> )
<ide> }
<ide>
<del>You can also set the authentication policy on a per-view basis, using the `APIView` class based views.
<add>You can also set the authentication scheme on a per-view basis, using the `APIView` class based views.
<ide>
<ide> class ExampleView(APIView):
<ide> authentication_classes = (SessionAuthentication, UserBasicAuthentication)
<ide> Or, if you're using the `@api_view` decorator with function based views.
<ide> }
<ide> return Response(content)
<ide>
<add>## Unauthorized and Forbidden responses
<add>
<add>When an unauthenticated request is denied permission there are two different error codes that may be appropriate.
<add>
<add>* [HTTP 401 Unauthorized][http401]
<add>* [HTTP 403 Permission Denied][http403]
<add>
<add>The kind of response that will be used depends on the type of authentication scheme in use, and the ordering of the authentication classes.
<add>
<add>Although multiple authentication schemes may be in use, only one scheme may be used to determine the type of response. **The first authentication class set on the view is given priority when determining the type of response**.
<add>
<add>Note that when a *successfully authenticated* request is denied permission, a `403 Permission Denied` response will always be used, regardless of the authentication scheme.
<add>
<add>---
<add>
<ide> # API Reference
<ide>
<ide> ## BasicAuthentication
<ide>
<del>This policy uses [HTTP Basic Authentication][basicauth], signed against a user's username and password. Basic authentication is generally only appropriate for testing.
<add>This authentication scheme uses [HTTP Basic Authentication][basicauth], signed against a user's username and password. Basic authentication is generally only appropriate for testing.
<ide>
<ide> If successfully authenticated, `BasicAuthentication` provides the following credentials.
<ide>
<ide> * `request.user` will be a `django.contrib.auth.models.User` instance.
<ide> * `request.auth` will be `None`.
<ide>
<add>Unauthenticated responses that are denied permission will result in an `HTTP 401 Unauthorized` response with an appropriate WWW-Authenticate header. For example:
<add>
<add> WWW-Authenticate: Basic realm="api"
<add>
<ide> **Note:** If you use `BasicAuthentication` in production you must ensure that your API is only available over `https` only. You should also ensure that your API clients will always re-request the username and password at login, and will never store those details to persistent storage.
<ide>
<ide> ## TokenAuthentication
<ide>
<del>This policy uses a simple token-based HTTP Authentication scheme. Token authentication is appropriate for client-server setups, such as native desktop and mobile clients.
<add>This authentication scheme uses a simple token-based HTTP Authentication scheme. Token authentication is appropriate for client-server setups, such as native desktop and mobile clients.
<ide>
<del>To use the `TokenAuthentication` policy, include `rest_framework.authtoken` in your `INSTALLED_APPS` setting.
<add>To use the `TokenAuthentication` scheme, include `rest_framework.authtoken` in your `INSTALLED_APPS` setting.
<ide>
<ide> You'll also need to create tokens for your users.
<ide>
<ide> If successfully authenticated, `TokenAuthentication` provides the following cred
<ide> * `request.user` will be a `django.contrib.auth.models.User` instance.
<ide> * `request.auth` will be a `rest_framework.tokenauth.models.BasicToken` instance.
<ide>
<add>Unauthenticated responses that are denied permission will result in an `HTTP 401 Unauthorized` response with an appropriate WWW-Authenticate header. For example:
<add>
<add> WWW-Authenticate: Token
<add>
<ide> **Note:** If you use `TokenAuthentication` in production you must ensure that your API is only available over `https` only.
<ide>
<del>## OAuthAuthentication
<add>## OAuth2Authentication
<ide>
<del>This policy uses the [OAuth 2.0][oauth] protocol to authenticate requests. OAuth is appropriate for server-server setups, such as when you want to allow a third-party service to access your API on a user's behalf.
<add>This authentication scheme uses the [OAuth 2.0][oauth] protocol to authenticate requests. OAuth is appropriate for server-server setups, such as when you want to allow a third-party service to access your API on a user's behalf.
<ide>
<del>If successfully authenticated, `OAuthAuthentication` provides the following credentials.
<add>If successfully authenticated, `OAuth2Authentication` provides the following credentials.
<ide>
<ide> * `request.user` will be a `django.contrib.auth.models.User` instance.
<ide> * `request.auth` will be a `rest_framework.models.OAuthToken` instance.
<ide>
<add>**TODO**: Note type of response (401 vs 403)
<add>
<add>**TODO**: Implement OAuth2Authentication, using django-oauth2-provider.
<add>
<ide> ## SessionAuthentication
<ide>
<del>This policy uses Django's default session backend for authentication. Session authentication is appropriate for AJAX clients that are running in the same session context as your website.
<add>This authentication scheme uses Django's default session backend for authentication. Session authentication is appropriate for AJAX clients that are running in the same session context as your website.
<ide>
<ide> If successfully authenticated, `SessionAuthentication` provides the following credentials.
<ide>
<ide> * `request.user` will be a `django.contrib.auth.models.User` instance.
<ide> * `request.auth` will be `None`.
<ide>
<add>Unauthenticated responses that are denied permission will result in an `HTTP 403 Forbidden` response.
<add>
<add>---
<add>
<ide> # Custom authentication
<ide>
<del>To implement a custom authentication policy, subclass `BaseAuthentication` and override the `.authenticate(self, request)` method. The method should return a two-tuple of `(user, auth)` if authentication succeeds, or `None` otherwise.
<add>To implement a custom authentication scheme, subclass `BaseAuthentication` and override the `.authenticate(self, request)` method. The method should return a two-tuple of `(user, auth)` if authentication succeeds, or `None` otherwise.
<add>
<add>In some circumstances instead of returning `None`, you may want to raise an `Unauthenticated` exception from the `.authenticate()` method.
<add>
<add>Typically the approach you should take is:
<add>
<add>* If authentication is not attempted, return `None`. Any other authentication schemes also in use will still be checked.
<add>* If authentication is attempted but fails, raise an `Unauthenticated` exception. An error response will be returned immediately, without checking any other authentication schemes.
<add>
<add>You *may* also override the `.authenticate_header(self, request)` method. If implemented, it should return a string that will be used as the value of the `WWW-Authenticate` header in an `HTTP 401 Unauthorized` response.
<add>
<add>If the `.authenticate_header()` method is not overridden, the authentication scheme will return `HTTP 403 Forbidden` responses when an unauthenticated request is denied access.
<ide>
<ide> [cite]: http://jacobian.org/writing/rest-worst-practices/
<add>[http401]: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.4.2
<add>[http403]: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.4.4
<ide> [basicauth]: http://tools.ietf.org/html/rfc2617
<ide> [oauth]: http://oauth.net/2/
<ide> [permission]: permissions.md
<ide><path>rest_framework/authentication.py
<ide> def authenticate(self, request):
<ide> """
<ide> raise NotImplementedError(".authenticate() must be overridden.")
<ide>
<add> def authenticate_header(self, request):
<add> """
<add> Return a string to be used as the value of the `WWW-Authenticate`
<add>        header in a `401 Unauthorized` response, or `None` if the
<add> authentication scheme should return `403 Permission Denied` responses.
<add> """
<add> pass
<add>
<ide>
<ide> class BasicAuthentication(BaseAuthentication):
<ide> """
| 2
|
Python
|
Python
|
fix empty input into staticvectors layer
|
cfb9770a94980db9e385724568b434b7790e8bc2
|
<ide><path>spacy/ml/staticvectors.py
<ide> def StaticVectors(
<ide> def forward(
<ide> model: Model[List[Doc], Ragged], docs: List[Doc], is_train: bool
<ide> ) -> Tuple[Ragged, Callable]:
<del> if not len(docs):
<add> if not sum(len(doc) for doc in docs):
<ide> return _handle_empty(model.ops, model.get_dim("nO"))
<ide> key_attr = model.attrs["key_attr"]
<ide> W = cast(Floats2d, model.ops.as_contig(model.get_param("W")))
<ide><path>spacy/tests/test_models.py
<ide>
<ide> from spacy.ml.models import build_Tok2Vec_model, MultiHashEmbed, MaxoutWindowEncoder
<ide> from spacy.ml.models import build_text_classifier, build_simple_cnn_text_classifier
<add>from spacy.ml.staticvectors import StaticVectors
<ide> from spacy.lang.en import English
<ide> from spacy.lang.en.examples import sentences as EN_SENTENCES
<ide>
<ide> def get_updated_model():
<ide> model1 = get_updated_model()
<ide> model2 = get_updated_model()
<ide> assert_array_equal(get_all_params(model1), get_all_params(model2))
<add>
<add>
<add>@pytest.mark.parametrize(
<add> "model_func,kwargs",
<add> [
<add> (StaticVectors, {"nO": 128, "nM": 300}),
<add> ]
<add>)
<add>def test_empty_docs(model_func, kwargs):
<add> nlp = English()
<add> model = model_func(**kwargs).initialize()
<add> # Test the layer can be called successfully with 0, 1 and 2 empty docs.
<add> for n_docs in range(3):
<add> docs = [nlp("") for _ in range(n_docs)]
<add> # Test predict
<add> _ = model.predict(docs)
<add> # Test backprop
<add> output, backprop = model.begin_update(docs)
<add> _ = backprop(output)
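The subtlety the new guard catches is a non-empty list of empty `Doc`s. A quick illustration:

```python
from spacy.lang.en import English

nlp = English()
docs = [nlp(""), nlp("")]
print(len(docs))                  # 2 -> the old `if not len(docs)` guard passes
print(sum(len(d) for d in docs))  # 0 -> the new guard treats this as empty input
```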
| 2
|
Text
|
Text
|
provide mention of pgcrypto extension for uuids
|
54becc956132e08ed8ed74819d85fe1e0b64fa94
|
<ide><path>guides/source/active_record_postgresql.md
<ide> article.save!
<ide>
<ide> ### UUID
<ide>
<del>* [type definition](http://www.postgresql.org/docs/9.3/static/datatype-uuid.html)
<del>* [generator functions](http://www.postgresql.org/docs/9.3/static/uuid-ossp.html)
<add>* [type definition](http://www.postgresql.org/docs/9.4/static/datatype-uuid.html)
<add>* [pgcrypto generator function](http://www.postgresql.org/docs/9.4/static/pgcrypto.html#AEN159361)
<add>* [uuid-ossp generator functions](http://www.postgresql.org/docs/9.4/static/uuid-ossp.html)
<ide>
<del>NOTE: you need to enable the `uuid-ossp` extension to use uuid.
<add>NOTE: you need to enable the `pgcrypto` (PostgreSQL >= 9.4) or `uuid-ossp`
<add>extension to use uuid.
<ide>
<ide> ```ruby
<ide> # db/migrate/20131220144913_create_revisions.rb
<ide> A point is casted to an array containing `x` and `y` coordinates.
<ide> UUID Primary Keys
<ide> -----------------
<ide>
<del>NOTE: you need to enable the `uuid-ossp` extension to generate UUIDs.
<add>NOTE: you need to enable the `pgcrypto` (PostgreSQL >= 9.4) or `uuid-ossp`
<add>extension to generate random UUIDs.
<ide>
<ide> ```ruby
<ide> # db/migrate/20131220144913_create_devices.rb
<del>enable_extension 'uuid-ossp' unless extension_enabled?('uuid-ossp')
<del>create_table :devices, id: :uuid, default: 'uuid_generate_v4()' do |t|
<add>enable_extension 'pgcrypto' unless extension_enabled?('pgcrypto')
<add>create_table :devices, id: :uuid, default: 'gen_random_uuid()' do |t|
<ide> t.string :kind
<ide> end
<ide>
| 1
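The same setup as the migration above, sketched outside Rails with psycopg2; the connection string is a placeholder and a PostgreSQL server at 9.4 or newer is assumed:

```python
import psycopg2

conn = psycopg2.connect("dbname=example")  # placeholder DSN
with conn, conn.cursor() as cur:
    # pgcrypto provides gen_random_uuid() on PostgreSQL >= 9.4.
    cur.execute("CREATE EXTENSION IF NOT EXISTS pgcrypto")
    cur.execute(
        """
        CREATE TABLE IF NOT EXISTS devices (
            id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
            kind text
        )
        """
    )
    cur.execute("INSERT INTO devices (kind) VALUES ('monitor') RETURNING id")
    print(cur.fetchone()[0])  # server-generated random UUID
conn.close()
```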
|
PHP
|
PHP
|
fix failing tests
|
15c56d2e6de5aaa69a883b3d1d09a3704e73e834
|
<ide><path>tests/TestCase/View/Helper/FormHelperTest.php
<ide> public function testInputCustomization() {
<ide> );
<ide> $this->assertTags($result, $expected);
<ide>
<del> unset($this->Form->request->data);
<add> $this->Form->request->data = [];
<ide>
<ide> $entity->errors('field', 'Badness!');
<ide> $this->Form->create($entity, ['context' => ['table' => 'Contacts']]);
<ide><path>tests/TestCase/View/HelperRegistryTest.php
<ide> use Cake\Core\Plugin;
<ide> use Cake\TestSuite\TestCase;
<ide> use Cake\View\HelperRegistry;
<del>use Cake\View\Helper\HtmlHelper;
<add>use Cake\View\Helper;
<ide> use Cake\View\View;
<ide>
<ide> /**
<ide> * Extended HtmlHelper
<ide> */
<del>class HtmlAliasHelper extends HtmlHelper {
<add>class HtmlAliasHelper extends Helper {
<ide>
<ide> public function afterRender($viewFile) {
<ide> }
| 2
|
Javascript
|
Javascript
|
use common.skip for tap skip output
|
cd5a4c157c31ee7e0e9f9ce14c6aa75dbfbcd376
|
<ide><path>test/parallel/test-fs-readfile-pipe-large.js
<ide> var path = require('path');
<ide> // simulate `cat readfile.js | node readfile.js`
<ide>
<ide> if (common.isWindows || common.isAix) {
<del> console.log(`1..0 # Skipped: No /dev/stdin on ${process.platform}.`);
<add> common.skip(`No /dev/stdin on ${process.platform}.`);
<ide> return;
<ide> }
<ide>
<ide><path>test/parallel/test-fs-readfile-pipe.js
<ide> var assert = require('assert');
<ide> // simulate `cat readfile.js | node readfile.js`
<ide>
<ide> if (common.isWindows || common.isAix) {
<del> console.log(`1..0 # Skipped: No /dev/stdin on ${process.platform}.`);
<add> common.skip(`No /dev/stdin on ${process.platform}.`);
<ide> return;
<ide> }
<ide>
<ide><path>test/parallel/test-fs-readfilesync-pipe-large.js
<ide> var path = require('path');
<ide> // simulate `cat readfile.js | node readfile.js`
<ide>
<ide> if (common.isWindows || common.isAix) {
<del> console.log(`1..0 # Skipped: No /dev/stdin on ${process.platform}.`);
<add> common.skip(`No /dev/stdin on ${process.platform}.`);
<ide> return;
<ide> }
<ide>
| 3
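`common.skip` replaces the hand-written TAP skip line in the three tests above. Its internals are not shown in the patch, but the pattern it wraps is visible in the deleted lines; a rough Python rendering of that pattern:

```python
import sys


def skip(reason):
    # Emits the TAP "plan of zero tests" line the Node tests used to
    # print by hand, so skip formatting lives in one place.
    print(f"1..0 # Skipped: {reason}")


if sys.platform in ("win32", "aix"):  # stand-ins for isWindows / isAix
    skip(f"No /dev/stdin on {sys.platform}.")
    sys.exit(0)

print("1..1")
print("ok 1 - /dev/stdin is readable")
```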
|
Python
|
Python
|
fix training from scratch in new scripts
|
a0c62d249303a68f5336e3f9a96ecf9241d7abbe
|
<ide><path>examples/language-modeling/run_clm.py
<ide> def group_texts(examples):
<ide>
<ide> # Training
<ide> if training_args.do_train:
<del> trainer.train(
<del> model_path=model_args.model_name_or_path if os.path.isdir(model_args.model_name_or_path) else None
<add> model_path = (
<add> model_args.model_name_or_path
<add> if (model_args.model_name_or_path is not None and os.path.isdir(model_args.model_name_or_path))
<add> else None
<ide> )
<add> trainer.train(model_path=model_path)
<ide> trainer.save_model() # Saves the tokenizer too for easy upload
<ide>
<ide> # Evaluation
<ide><path>examples/language-modeling/run_mlm.py
<ide> def group_texts(examples):
<ide>
<ide> # Training
<ide> if training_args.do_train:
<del> trainer.train(
<del> model_path=model_args.model_name_or_path if os.path.isdir(model_args.model_name_or_path) else None
<add> model_path = (
<add> model_args.model_name_or_path
<add> if (model_args.model_name_or_path is not None and os.path.isdir(model_args.model_name_or_path))
<add> else None
<ide> )
<add> trainer.train(model_path=model_path)
<ide> trainer.save_model() # Saves the tokenizer too for easy upload
<ide>
<ide> # Evaluation
<ide><path>examples/language-modeling/run_mlm_wwm.py
<ide> def tokenize_function(examples):
<ide>
<ide> # Training
<ide> if training_args.do_train:
<del> trainer.train(
<del> model_path=model_args.model_name_or_path if os.path.isdir(model_args.model_name_or_path) else None
<add> model_path = (
<add> model_args.model_name_or_path
<add> if (model_args.model_name_or_path is not None and os.path.isdir(model_args.model_name_or_path))
<add> else None
<ide> )
<add> trainer.train(model_path=model_path)
<ide> trainer.save_model() # Saves the tokenizer too for easy upload
<ide>
<ide> # Evaluation
<ide><path>examples/language-modeling/run_plm.py
<ide> def group_texts(examples):
<ide>
<ide> # Training
<ide> if training_args.do_train:
<del> trainer.train(
<del> model_path=model_args.model_name_or_path if os.path.isdir(model_args.model_name_or_path) else None
<add> model_path = (
<add> model_args.model_name_or_path
<add> if (model_args.model_name_or_path is not None and os.path.isdir(model_args.model_name_or_path))
<add> else None
<ide> )
<add> trainer.train(model_path=model_path)
<ide> trainer.save_model() # Saves the tokenizer too for easy upload
<ide>
<ide> # Evaluation
<ide><path>templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py
<ide> def tokenize_function(examples):
<ide>
<ide> # Training
<ide> if training_args.do_train:
<add>{%- if cookiecutter.can_train_from_scratch == "False" %}
<ide> trainer.train(
<ide> model_path=model_args.model_name_or_path if os.path.isdir(model_args.model_name_or_path) else None
<ide> )
<add>{%- elif cookiecutter.can_train_from_scratch == "True" %}
<add> model_path = (
<add> model_args.model_name_or_path
<add> if (model_args.model_name_or_path is not None and os.path.isdir(model_args.model_name_or_path))
<add> else None
<add> )
<add> trainer.train(model_path=model_path)
<add>{% endif %}
<ide> trainer.save_model() # Saves the tokenizer too for easy upload
<ide>
<ide> # Evaluation
| 5
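The bug this patch fixes: when training from scratch, `model_name_or_path` is `None`, and `os.path.isdir(None)` raises `TypeError`, so the old one-liner crashed before `trainer.train` ran. The extracted guard, runnable on its own:

```python
import os


def resolve_model_path(model_name_or_path):
    # Check for None before os.path.isdir -- isdir(None) raises TypeError.
    return (
        model_name_or_path
        if (model_name_or_path is not None and os.path.isdir(model_name_or_path))
        else None
    )


print(resolve_model_path(None))         # None -- training from scratch, no crash
print(resolve_model_path("bert-base"))  # None -- a hub id, not a local directory
print(resolve_model_path(os.getcwd()))  # the local checkpoint directory itself
```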
|
Ruby
|
Ruby
|
make class name consistent with the filename
|
55621504b0d527393d2de86d15b5f40752825fde
|
<ide><path>activesupport/test/flush_cache_on_private_memoization_test.rb
<ide> require 'abstract_unit'
<ide> require 'test/unit'
<ide>
<del>class FlashCacheOnPrivateMemoizationTest < Test::Unit::TestCase
<add>class FlushCacheOnPrivateMemoizationTest < Test::Unit::TestCase
<ide> ActiveSupport::Deprecation.silence do
<ide> extend ActiveSupport::Memoizable
<ide> end
| 1
|
Java
|
Java
|
update javadoc of uricomponentsbuilder
|
2965df6bee12638e3e1a1208c382679fe3d46751
|
<ide><path>spring-web/src/main/java/org/springframework/web/util/UriComponentsBuilder.java
<ide> public static UriComponentsBuilder fromUri(URI uri) {
<ide> /**
<ide> * Returns a builder that is initialized with the given URI string.
<ide> *
<del> * @param uri the URI string to initialize with
<add> * <p><strong>Note:</strong> The presence of reserved characters can prevent
<add> * correct parsing of the URI string. For example if a query parameter
<add> * contains {@code '='} or {@code '&'} characters, the query string cannot
<add> * be parsed unambiguously. Such values should be substituted for URI
<add> * variables to enable correct parsing:
<add> *
<add> * <pre>
<add> * String uriString = "/hotels/42?filter={value}";
<add> * UriComponentsBuilder.fromUriString(uriString).buildAndExpand("hot&cold");
<add> * </pre>
<add> *
<add> * @param uri
<add> * the URI string to initialize with
<ide> * @return the new {@code UriComponentsBuilder}
<ide> */
<ide> public static UriComponentsBuilder fromUriString(String uri) {
<ide> public static UriComponentsBuilder fromUriString(String uri) {
<ide> /**
<ide> * Creates a new {@code UriComponents} object from the string HTTP URL.
<ide> *
<add> * <p><strong>Note:</strong> The presence of reserved characters can prevent
<add> * correct parsing of the URI string. For example if a query parameter
<add> * contains {@code '='} or {@code '&'} characters, the query string cannot
<add> * be parsed unambiguously. Such values should be substituted for URI
<add> * variables to enable correct parsing:
<add> *
<add> * <pre>
<add> * String uriString = "/hotels/42?filter={value}";
<add> * UriComponentsBuilder.fromUriString(uriString).buildAndExpand("hot&cold");
<add> * </pre>
<add> *
<ide> * @param httpUrl the source URI
<ide> * @return the URI components of the URI
<ide> */
<ide> public UriComponents build() {
<ide> }
<ide>
<ide> /**
<del> * Builds a {@code UriComponents} instance from the various components contained in this builder.
<add> * Builds a {@code UriComponents} instance from the various components
<add> * contained in this builder.
<ide> *
<del> * @param encoded whether all the components set in this builder are encoded ({@code true}) or not ({@code false}).
<add> * @param encoded whether all the components set in this builder are
<add> * encoded ({@code true}) or not ({@code false}).
<ide> * @return the URI components
<ide> */
<ide> public UriComponents build(boolean encoded) {
<ide> public UriComponentsBuilder uri(URI uri) {
<ide> }
<ide>
<ide> /**
<del> * Sets the URI scheme. The given scheme may contain URI template variables, and may also be {@code null} to clear the
<del> * scheme of this builder.
<add> * Sets the URI scheme. The given scheme may contain URI template variables,
<add> * and may also be {@code null} to clear the scheme of this builder.
<ide> *
<del> * @param scheme the URI scheme
<add> * @param scheme
<add> * the URI scheme
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder scheme(String scheme) {
<ide> public UriComponentsBuilder scheme(String scheme) {
<ide> }
<ide>
<ide> /**
<del> * Sets the URI user info. The given user info may contain URI template variables, and may also be {@code null} to
<del> * clear the user info of this builder.
<add> * Sets the URI user info. The given user info may contain URI template
<add> * variables, and may also be {@code null} to clear the user info of this
<add> * builder.
<ide> *
<del> * @param userInfo the URI user info
<add> * @param userInfo
<add> * the URI user info
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder userInfo(String userInfo) {
<ide> public UriComponentsBuilder userInfo(String userInfo) {
<ide> }
<ide>
<ide> /**
<del> * Sets the URI host. The given host may contain URI template variables, and may also be {@code null} to clear the host
<del> * of this builder.
<add> * Sets the URI host. The given host may contain URI template variables, and
<add> * may also be {@code null} to clear the host of this builder.
<ide> *
<del> * @param host the URI host
<add> * @param host
<add> * the URI host
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder host(String host) {
<ide> public UriComponentsBuilder port(int port) {
<ide> }
<ide>
<ide> /**
<del> * Appends the given path to the existing path of this builder. The given path may contain URI template variables.
<add> * Appends the given path to the existing path of this builder. The given
<add> * path may contain URI template variables.
<ide> *
<del> * @param path the URI path
<add> * @param path
<add> * the URI path
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder path(String path) {
<ide> public UriComponentsBuilder pathSegment(String... pathSegments) throws IllegalAr
<ide> }
<ide>
<ide> /**
<del> * Appends the given query to the existing query of this builder. The given query may contain URI template variables.
<add> * Appends the given query to the existing query of this builder.
<add> * The given query may contain URI template variables.
<add> *
<add> * <p><strong>Note:</strong> The presence of reserved characters can prevent
<add> * correct parsing of the URI string. For example if a query parameter
<add> * contains {@code '='} or {@code '&'} characters, the query string cannot
<add> * be parsed unambiguously. Such values should be substituted for URI
<add> * variables to enable correct parsing:
<add> *
<add> * <pre>
<add> * String uriString = "/hotels/42?filter={value}";
<add> * UriComponentsBuilder.fromUriString(uriString).buildAndExpand("hot&cold");
<add> * </pre>
<ide> *
<ide> * @param query the query string
<ide> * @return this UriComponentsBuilder
<ide> public UriComponentsBuilder replaceQuery(String query) {
<ide> }
<ide>
<ide> /**
<del> * Appends the given query parameter to the existing query parameters. The given name or any of the values may contain
<del> * URI template variables. If no values are given, the resulting URI will contain the query parameter name only (i.e.
<del> * {@code ?foo} instead of {@code ?foo=bar}.
<add> * Appends the given query parameter to the existing query parameters. The
<add> * given name or any of the values may contain URI template variables. If no
<add> * values are given, the resulting URI will contain the query parameter name
<add> * only (i.e. {@code ?foo} instead of {@code ?foo=bar}.
<ide> *
<del> * @param name the query parameter name
<del> * @param values the query parameter values
<add> * @param name
<add> * the query parameter name
<add> * @param values
<add> * the query parameter values
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder queryParam(String name, Object... values) {
<ide> public UriComponentsBuilder queryParam(String name, Object... values) {
<ide> }
<ide>
<ide> /**
<del> * Sets the query parameter values overriding all existing query values for the same parameter.
<del> * If no values are given, the query parameter is removed.
<add> * Sets the query parameter values overriding all existing query values for
<add> * the same parameter. If no values are given, the query parameter is
<add> * removed.
<ide> *
<del> * @param name the query parameter name
<del> * @param values the query parameter values
<add> * @param name
<add> * the query parameter name
<add> * @param values
<add> * the query parameter values
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder replaceQueryParam(String name, Object... values) {
<ide> public UriComponentsBuilder replaceQueryParam(String name, Object... values) {
<ide> }
<ide>
<ide> /**
<del> * Sets the URI fragment. The given fragment may contain URI template variables, and may also be {@code null} to clear
<del> * the fragment of this builder.
<add> * Sets the URI fragment. The given fragment may contain URI template
<add> * variables, and may also be {@code null} to clear the fragment of this
<add> * builder.
<ide> *
<del> * @param fragment the URI fragment
<add> * @param fragment
<add> * the URI fragment
<ide> * @return this UriComponentsBuilder
<ide> */
<ide> public UriComponentsBuilder fragment(String fragment) {
| 1
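The ambiguity the new javadoc warns about is not Java-specific; Python's stdlib query parser splits on the same reserved characters, which gives a quick demonstration of why the value must be expanded after encoding:

```python
from urllib.parse import parse_qsl, quote

# A raw '&' inside a value splits the query into two parameters:
print(parse_qsl("filter=hot&cold", keep_blank_values=True))
# [('filter', 'hot'), ('cold', '')]

# Expanding a URI-variable-style template with a percent-encoded value
# keeps it as a single parameter, as the javadoc recommends:
expanded = "filter={value}".replace("{value}", quote("hot&cold", safe=""))
print(expanded)             # filter=hot%26cold
print(parse_qsl(expanded))  # [('filter', 'hot&cold')]
```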
|
Text
|
Text
|
fix module.ispreloading documentation
|
9d7fb229db86b3fbe8835fe6411aa78d300c6e70
|
<ide><path>doc/api/module.md
<ide> const requireUtil = createRequireFromPath('../src/utils/');
<ide> requireUtil('./some-tool');
<ide> ```
<ide>
<del>### `module.isPreloading`
<del><!-- YAML
<del>added: v15.4.0
<del>-->
<del>
<del>* Type: {boolean} `true` if the module is running during the Node.js preload
<del> phase.
<del>
<ide> ### `module.syncBuiltinESMExports()`
<ide> <!-- YAML
<ide> added: v12.12.0
<ide><path>doc/api/modules.md
<ide> added: v0.1.16
<ide> The identifier for the module. Typically this is the fully resolved
<ide> filename.
<ide>
<add>### `module.isPreloading`
<add><!-- YAML
<add>added: v15.4.0
<add>-->
<add>
<add>* Type: {boolean} `true` if the module is running during the Node.js preload
<add> phase.
<add>
<ide> ### `module.loaded`
<ide> <!-- YAML
<ide> added: v0.1.16
| 2
|
Ruby
|
Ruby
|
add broadcast_to and stream_for methods as per #26
|
5954fd1e0aa907e07ffff932aedc51109d4ce56d
|
<ide><path>lib/action_cable/channel.rb
<ide> module ActionCable
<ide> module Channel
<ide> autoload :Base, 'action_cable/channel/base'
<add> autoload :Broadcasting, 'action_cable/channel/broadcasting'
<ide> autoload :Callbacks, 'action_cable/channel/callbacks'
<add> autoload :Naming, 'action_cable/channel/naming'
<ide> autoload :PeriodicTimers, 'action_cable/channel/periodic_timers'
<ide> autoload :Streams, 'action_cable/channel/streams'
<ide> end
<ide><path>lib/action_cable/channel/base.rb
<ide> class Base
<ide> include Callbacks
<ide> include PeriodicTimers
<ide> include Streams
<add> include Naming
<add> include Broadcasting
<ide>
<ide> on_subscribe :subscribed
<ide> on_unsubscribe :unsubscribed
<ide><path>lib/action_cable/channel/broadcasting.rb
<add>module ActionCable
<add> module Channel
<add> module Broadcasting
<add> extend ActiveSupport::Concern
<add>
<add> delegate :broadcasting_for, to: :class
<add>
<add> class_methods do
<add> # Broadcast a hash to a unique broadcasting for this <tt>model</tt> in this channel.
<add> def broadcast_to(model, message)
<add> ActionCable.server.broadcast(broadcasting_for([ channel_name, model ]), message)
<add> end
<add>
<add> def broadcasting_for(model) #:nodoc:
<add> case
<add> when model.is_a?(Array)
<add> model.map { |m| broadcasting_for(m) }.join(':')
<add> when model.respond_to?(:to_gid_param)
<add> model.to_gid_param
<add> else
<add> model.to_param
<add> end
<add> end
<add> end
<add> end
<add> end
<add>end
<ide><path>lib/action_cable/channel/naming.rb
<add>module ActionCable
<add> module Channel
<add> module Naming
<add> extend ActiveSupport::Concern
<add>
<add> class_methods do
<add> # Returns the name of the channel, underscored, without the <tt>Channel</tt> ending.
<add> # If the channel is in a namespace, then the namespaces are represented by single
<add> # colon separators in the channel name.
<add> #
<add> # ChatChannel.channel_name # => 'chat'
<add> # Chats::AppearancesChannel.channel_name # => 'chats:appearances'
<add> def channel_name
<add> @channel_name ||= name.sub(/Channel$/, '').gsub('::',':').underscore
<add> end
<add> end
<add>
<add> # Delegates to the class' <tt>channel_name</tt>
<add> delegate :channel_name, to: :class
<add> end
<add> end
<add>end
<ide><path>lib/action_cable/channel/streams.rb
<ide> module ActionCable
<ide> module Channel
<ide> # Streams allow channels to route broadcastings to the subscriber. A broadcasting is, as discussed elsewhere, a pub/sub queue where any data
<del> # put into it is automatically sent to the clients that are connected at that time. It's purely an online queue, though. If you're not
<add> # put into it is automatically sent to the clients that are connected at that time. It's purely an online queue, though. If you're not
<ide> # streaming a broadcasting at the very moment it sends out an update, you'll not get that update when connecting later.
<ide> #
<ide> # Most commonly, the streamed broadcast is sent straight to the subscriber on the client-side. The channel just acts as a connector between
<ide> module Channel
<ide> # def follow(data)
<ide> # stream_from "comments_for_#{data['recording_id']}"
<ide> # end
<del> #
<add> #
<ide> # def unfollow
<ide> # stop_all_streams
<ide> # end
<ide> module Channel
<ide> #
<ide> # ActionCable.server.broadcast "comments_for_45", author: 'DHH', content: 'Rails is just swell'
<ide> #
<del> # If you don't just want to parlay the broadcast unfiltered to the subscriber, you can supply a callback that let's you alter what goes out.
<add> # If you have a stream that is related to a model, then the broadcasting used can be generated from the model and channel.
<add> # The following example would subscribe to a broadcasting named something like `comments:Z2lkOi8vVGVzdEFwcC9Qb3N0LzE`
<add> #
<add> # class CommentsChannel < ApplicationCable::Channel
<add> # def subscribed
<add> # post = Post.find(params[:id])
<add> # stream_for post
<add> # end
<add> # end
<add> #
<add> # You can then broadcast to this channel using:
<add> #
<add> # CommentsChannel.broadcast_to(@post, @comment)
<add> #
<add> # If you don't just want to parlay the broadcast unfiltered to the subscriber, you can supply a callback that lets you alter what goes out.
<ide> # Example below shows how you can use this to provide performance introspection in the process:
<ide> #
<ide> # class ChatChannel < ApplicationCable::Channel
<ide> # def subscribed
<ide> # @room = Chat::Room[params[:room_number]]
<del> #
<del> # stream_from @room.channel, -> (message) do
<add> #
<add> # stream_for @room, -> (message) do
<ide> # message = ActiveSupport::JSON.decode(message)
<del> #
<add> #
<ide> # if message['originated_at'].present?
<ide> # elapsed_time = (Time.now.to_f - message['originated_at']).round(2)
<del> #
<add> #
<ide> # ActiveSupport::Notifications.instrument :performance, measurement: 'Chat.message_delay', value: elapsed_time, action: :timing
<ide> # logger.info "Message took #{elapsed_time}s to arrive"
<ide> # end
<del> #
<add> #
<ide> # transmit message
<ide> # end
<ide> # end
<ide> def stream_from(broadcasting, callback = nil)
<ide> logger.info "#{self.class.name} is streaming from #{broadcasting}"
<ide> end
<ide>
<add> # Start streaming the pubsub queue for the <tt>model</tt> in this channel. Optionally, you can pass a
<add> # <tt>callback</tt> that'll be used instead of the default of just transmitting the updates straight
<add> # to the subscriber.
<add> def stream_for(model, callback = nil)
<add> stream_from(broadcasting_for([ channel_name, model ]), callback)
<add> end
<add>
<ide> def stop_all_streams
<ide> streams.each do |broadcasting, callback|
<ide> pubsub.unsubscribe_proc broadcasting, callback
<ide><path>lib/action_cable/server/base.rb
<ide> class Base
<ide> include ActionCable::Server::Connections
<ide>
<ide> cattr_accessor(:config, instance_accessor: true) { ActionCable::Server::Configuration.new }
<del>
<add>
<ide> def self.logger; config.logger; end
<ide> delegate :logger, to: :config
<ide>
<ide> def redis
<ide> logger.info "[ActionCable] Redis reconnect failed."
<ide> # logger.info "[ActionCable] Redis reconnected. Closing all the open connections."
<ide> # @connections.map &:close
<del> end
<add> end
<ide> end
<ide> end
<ide>
<ide><path>lib/action_cable/server/broadcasting.rb
<ide> def broadcaster_for(broadcasting)
<ide> # The redis instance used for broadcasting. Not intended for direct user use.
<ide> def broadcasting_redis
<ide> @broadcasting_redis ||= Redis.new(config.redis)
<del> end
<add> end
<ide>
<ide> private
<ide> class Broadcaster
<ide><path>test/channel/broadcasting_test.rb
<add>require 'test_helper'
<add>require 'stubs/test_connection'
<add>require 'stubs/room'
<add>
<add>class ActionCable::Channel::BroadcastingTest < ActiveSupport::TestCase
<add> class ChatChannel < ActionCable::Channel::Base
<add> end
<add>
<add> setup do
<add> @connection = TestConnection.new
<add> end
<add>
<add> test "broadcasts_to" do
<add> ActionCable.stubs(:server).returns mock().tap { |m| m.expects(:broadcast).with('action_cable:channel:broadcasting_test:chat:Room#1-Campfire', "Hello World") }
<add> ChatChannel.broadcast_to(Room.new(1), "Hello World")
<add> end
<add>
<add> test "broadcasting_for with an object" do
<add> assert_equal "Room#1-Campfire", ChatChannel.broadcasting_for(Room.new(1))
<add> end
<add>
<add> test "broadcasting_for with an array" do
<add> assert_equal "Room#1-Campfire:Room#2-Campfire", ChatChannel.broadcasting_for([ Room.new(1), Room.new(2) ])
<add> end
<add>
<add> test "broadcasting_for with a string" do
<add> assert_equal "hello", ChatChannel.broadcasting_for("hello")
<add> end
<add>end
<ide><path>test/channel/naming_test.rb
<add>require 'test_helper'
<add>
<add>class ActionCable::Channel::NamingTest < ActiveSupport::TestCase
<add> class ChatChannel < ActionCable::Channel::Base
<add> end
<add>
<add> test "channel_name" do
<add> assert_equal "action_cable:channel:naming_test:chat", ChatChannel.channel_name
<add> end
<add>end
<ide><path>test/channel/stream_test.rb
<ide> class ActionCable::Channel::StreamTest < ActiveSupport::TestCase
<ide> class ChatChannel < ActionCable::Channel::Base
<ide> def subscribed
<del> @room = Room.new params[:id]
<del> stream_from "test_room_#{@room.id}"
<add> if params[:id]
<add> @room = Room.new params[:id]
<add> stream_from "test_room_#{@room.id}"
<add> end
<ide> end
<ide> end
<ide>
<ide> def subscribed
<ide> end
<ide>
<ide> test "streaming start and stop" do
<del> @connection.expects(:pubsub).returns mock().tap { |m| m.expects(:subscribe) }
<add> @connection.expects(:pubsub).returns mock().tap { |m| m.expects(:subscribe).with("test_room_1") }
<ide> channel = ChatChannel.new @connection, "{id: 1}", { id: 1 }
<ide>
<ide> @connection.expects(:pubsub).returns mock().tap { |m| m.expects(:unsubscribe_proc) }
<ide> channel.unsubscribe_from_channel
<ide> end
<add>
<add> test "stream_for" do
<add> @connection.expects(:pubsub).returns mock().tap { |m| m.expects(:subscribe).with("action_cable:channel:stream_test:chat:Room#1-Campfire") }
<add> channel = ChatChannel.new @connection, ""
<add> channel.stream_for Room.new(1)
<add> end
<ide> end
<ide><path>test/stubs/room.rb
<ide> def initialize(id, name='Campfire')
<ide> def to_global_id
<ide> "Room##{id}-#{name}"
<ide> end
<add>
<add> def to_gid_param
<add> to_global_id.to_param
<add> end
<ide> end
| 11
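For readers skimming the Ruby above: `broadcasting_for` turns a model (or an array mixing strings and models) into a single colon-joined stream name. A Python restatement of that dispatch, with a `Room` class mimicking the test stub's `to_gid_param`:

```python
class Room:
    def __init__(self, id, name="Campfire"):
        self.id, self.name = id, name

    def to_gid_param(self):
        # Mirrors test/stubs/room.rb: "Room#1-Campfire"
        return f"Room#{self.id}-{self.name}"


def broadcasting_for(model):
    if isinstance(model, (list, tuple)):
        return ":".join(broadcasting_for(m) for m in model)
    if hasattr(model, "to_gid_param"):
        return model.to_gid_param()
    return str(model)  # stands in for Ruby's to_param fallback


print(broadcasting_for(Room(1)))            # Room#1-Campfire
print(broadcasting_for(["chat", Room(1)]))  # chat:Room#1-Campfire
```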
|
Javascript
|
Javascript
|
fix previous optimisation and add test
|
833bc5dddff23b56ef6f5e820340a4eaf5867302
|
<ide><path>d3.layout.js
<ide> function d3_layout_packPlace(a, b, c) {
<ide> var db = a.r + c.r,
<ide> dx = b.x - a.x,
<ide> dy = b.y - a.y;
<del> if (db && dx && dy) {
<add> if (db && (dx || dy)) {
<ide> var da = b.r + c.r,
<ide> dc = Math.sqrt(dx * dx + dy * dy),
<del> cos = Math.min(1, (db * db + dc * dc - da * da) / (2 * db * dc)),
<add> cos = Math.max(-1, Math.min(1, (db * db + dc * dc - da * da) / (2 * db * dc))),
<ide> theta = Math.acos(cos),
<ide> x = cos * db,
<ide> h = Math.sin(theta) * db;
<ide><path>d3.layout.min.js
<del>(function(){function bc(a,b){var c=a.x+b[3],d=a.y+b[0],e=a.dx-b[1]-b[3],f=a.dy-b[0]-b[2];e<0&&(c+=e/2,e=0),f<0&&(d+=f/2,f=0);return{x:c,y:d,dx:e,dy:f}}function bb(a){return{x:a.x,y:a.y,dx:a.dx,dy:a.dy}}function ba(a,b,c){return a._tree.ancestor.parent==b.parent?a._tree.ancestor:c}function _(a,b,c){a=a._tree,b=b._tree;var d=c/(b.number-a.number);a.change+=d,b.change-=d,b.shift+=c,b.prelim+=c,b.mod+=c}function $(a){var b=0,c=0,d=a.children,e=d.length,f;while(--e>=0)f=d[e]._tree,f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function Z(a,b){function c(a,d){var e=a.children;if(e){var f,g=null,h=-1,i=e.length;while(++h<i)f=e[h],c(f,g),g=f}b(a,d)}c(a,null)}function Y(a,b){return a.depth-b.depth}function X(a,b){return b.x-a.x}function W(a,b){return a.x-b.x}function V(a,b){var c=a.children;if(c){var d,e=c.length,f=-1;while(++f<e)b(d=V(c[f],b),a)>0&&(a=d)}return a}function U(a){return a.children?a.children[a.children.length-1]:a._tree.thread}function T(a){return a.children?a.children[0]:a._tree.thread}function S(a,b){return a.parent==b.parent?1:2}function R(a){var b=a.children;return b?R(b[b.length-1]):a}function Q(a){var b=a.children;return b?Q(b[0]):a}function P(a){return a.reduce(function(a,b){return a+b.x},0)/a.length}function O(a){return 1+d3.max(a,function(a){return a.y})}function N(a,b,c){var d=a.r+c.r,e=b.x-a.x,f=b.y-a.y;if(d&&e&&f){var g=b.r+c.r,h=Math.sqrt(e*e+f*f),i=Math.min(1,(d*d+h*h-g*g)/(2*d*h)),j=Math.acos(i),k=i*d,l=Math.sin(j)*d;e/=h,f/=h,c.x=a.x+k*e+l*f,c.y=a.y+k*f-l*e}else c.x=a.x,c.y=a.y}function M(a,b,c,d){var e=a.children;a.x=b+=d*a.x,a.y=c+=d*a.y,a.r*=d;if(e){var f=-1,g=e.length;while(++f<g)M(e[f],b,c,d)}}function L(a){var b=a.children;b&&b.length?(b.forEach(L),a.r=I(b)):a.r=Math.sqrt(a.value)}function K(a){delete a._pack_next,delete a._pack_prev}function J(a){a._pack_next=a._pack_prev=a}function I(a){function l(a){b=Math.min(a.x-a.r,b),c=Math.max(a.x+a.r,c),d=Math.min(a.y-a.r,d),e=Math.max(a.y+a.r,e)}var b=Infinity,c=-Infinity,d=Infinity,e=-Infinity,f=a.length,g,h,i,j,k;a.forEach(J),g=a[0],g.x=-g.r,g.y=0,l(g);if(f>1){h=a[1],h.x=h.r,h.y=0,l(h);if(f>2){i=a[2],N(g,h,i),l(i),F(g,i),g._pack_prev=i,F(i,h),h=g._pack_next;for(var m=3;m<f;m++){N(g,h,i=a[m]);var n=0,o=1,p=1;for(j=h._pack_next;j!==h;j=j._pack_next,o++)if(H(j,i)){n=1;break}if(n==1)for(k=g._pack_prev;k!==j._pack_prev;k=k._pack_prev,p++)if(H(k,i)){p<o&&(n=-1,j=k);break}n==0?(F(g,i),h=i,l(i)):n>0?(G(g,j),h=j,m--):(G(j,h),g=j,m--)}}}var q=(b+c)/2,r=(d+e)/2,s=0;for(var m=0;m<f;m++){var t=a[m];t.x-=q,t.y-=r,s=Math.max(s,t.r+Math.sqrt(t.x*t.x+t.y*t.y))}a.forEach(K);return s}function H(a,b){var c=b.x-a.x,d=b.y-a.y,e=a.r+b.r;return e*e-c*c-d*d>.001}function G(a,b){a._pack_next=b,b._pack_prev=a}function F(a,b){var c=a._pack_next;a._pack_next=b,b._pack_prev=a,b._pack_next=c,c._pack_prev=b}function E(a,b){return a.value-b.value}function C(a){return d3.merge(a.map(function(a){return(a.children||[]).map(function(b){return{source:a,target:b}})}))}function B(a,b){return b.value-a.value}function A(a){return a.value}function z(a){return a.children}function y(a,b){a.sort=d3.rebind(a,b.sort),a.children=d3.rebind(a,b.children),a.links=C,a.value=d3.rebind(a,b.value),a.nodes=function(b){D=!0;return(a.nodes=a)(b)};return a}function x(a){return[d3.min(a),d3.max(a)]}function w(a,b){var c=-1,d=+a[0],e=(a[1]-d)/b,f=[];while(++c<=b)f[c]=e*c+d;return f}function v(a,b){return w(a,Math.ceil(Math.log(b.length)/Math.LN2+1))}function u(a,b){return a+b[1]}function t(a){return a.reduce(u,0)}function s(a){var 
b=1,c=0,d=a[0][1],e,f=a.length;for(;b<f;++b)(e=a[b][1])>d&&(c=b,d=e);return c}function p(a,b,c){a.y0=b,a.y=c}function o(a){return a.y}function n(a){return a.x}function m(a){return 1}function l(a){return 20}function k(a){var b=0,c=0;a.count=0;if(!a.leaf){var d=a.nodes,e=d.length,f=-1,g;while(++f<e){g=d[f];if(g==null)continue;k(g),a.count+=g.count,b+=g.count*g.cx,c+=g.count*g.cy}}a.point&&(a.leaf||(a.point.x+=Math.random()-.5,a.point.y+=Math.random()-.5),a.count++,b+=a.point.x,c+=a.point.y),a.cx=b/a.count,a.cy=c/a.count}function j(){f.px+=d3.event.dx,f.py+=d3.event.dy,e.resume()}function i(){j(),f.fixed&=1,e=f=null}function h(a){a!==f&&(a.fixed&=1)}function g(a){a.fixed|=2}function c(a,c){if(a===c)return a;var d=b(a),e=b(c),f=d.pop(),g=e.pop(),h=null;while(f===g)h=f,f=d.pop(),g=e.pop();return h}function b(a){var b=[],c=a.parent;while(c!=null)b.push(a),a=c,c=c.parent;b.push(a);return b}function a(a){var b=a.source,d=a.target,e=c(b,d),f=[b];while(b!==e)b=b.parent,f.push(b);var g=f.length;while(d!==e)f.splice(g,0,d),d=d.parent;return f}d3.layout={},d3.layout.bundle=function(){return function(b){var c=[],d=-1,e=b.length;while(++d<e)c.push(a(b[d]));return c}},d3.layout.chord=function(){function k(){b.sort(function(a,b){return i(a.target.value,b.target.value)})}function j(){var a={},j=[],l=d3.range(e),m=[],n,o,p,q,r;b=[],c=[],n=0,q=-1;while(++q<e){o=0,r=-1;while(++r<e)o+=d[q][r];j.push(o),m.push(d3.range(e)),n+=o}g&&l.sort(function(a,b){return g(j[a],j[b])}),h&&m.forEach(function(a,b){a.sort(function(a,c){return h(d[b][a],d[b][c])})}),n=(2*Math.PI-f*e)/n,o=0,q=-1;while(++q<e){p=o,r=-1;while(++r<e){var s=l[q],t=m[q][r],u=d[s][t];a[s+"-"+t]={index:s,subindex:t,startAngle:o,endAngle:o+=u*n,value:u}}c.push({index:s,startAngle:p,endAngle:o,value:(o-p)/n}),o+=f}q=-1;while(++q<e){r=q-1;while(++r<e){var v=a[q+"-"+r],w=a[r+"-"+q];(v.value||w.value)&&b.push(v.value<w.value?{source:w,target:v}:{source:v,target:w})}}i&&k()}var a={},b,c,d,e,f=0,g,h,i;a.matrix=function(f){if(!arguments.length)return d;e=(d=f)&&d.length,b=c=null;return a},a.padding=function(d){if(!arguments.length)return f;f=d,b=c=null;return a},a.sortGroups=function(d){if(!arguments.length)return g;g=d,b=c=null;return a},a.sortSubgroups=function(c){if(!arguments.length)return h;h=c,b=null;return a},a.sortChords=function(c){if(!arguments.length)return i;i=c,b&&k();return a},a.chords=function(){b||j();return b},a.groups=function(){c||j();return c};return a},d3.layout.force=function(){function B(b){g(f=b),e=a}function A(){var a=v.length,d=w.length,e,f,g,h,i,j,l,m,p;for(f=0;f<d;++f){g=w[f],h=g.source,i=g.target,m=i.x-h.x,p=i.y-h.y;if(j=m*m+p*p)j=n*y[f]*((j=Math.sqrt(j))-x[f])/j,m*=j,p*=j,i.x-=m*(l=h.weight/(i.weight+h.weight)),i.y-=p*l,h.x+=m*(l=1-l),h.y+=p*l}if(l=n*s){m=c[0]/2,p=c[1]/2,f=-1;if(l)while(++f<a)g=v[f],g.x+=(m-g.x)*l,g.y+=(p-g.y)*l}if(l=n*r){k(e=d3.geom.quadtree(v)),f=-1;while(++f<a)(g=v[f]).fixed||e.visit(z(g,l))}f=-1;while(++f<a)g=v[f],g.fixed?(g.x=g.px,g.y=g.py):(g.x-=(g.px-(g.px=g.x))*o,g.y-=(g.py-(g.py=g.y))*o);b.tick.dispatch({type:"tick",alpha:n});return(n*=.99)<.005}function z(a,b){return function(c,d,e,f,g){if(c.point!==a){var h=c.cx-a.x,i=c.cy-a.y,j=1/Math.sqrt(h*h+i*i);if((f-d)*j<t){var k=b*c.count*j*j;a.px-=h*k,a.py-=i*k;return!0}if(c.point&&isFinite(j)){var k=b*j*j;a.px-=h*k,a.py-=i*k}}}}var a={},b=d3.dispatch("tick"),c=[1,1],d,n,o=.9,p=l,q=m,r=-30,s=.1,t=.8,u,v=[],w=[],x,y;a.on=function(c,d){b[c].add(d);return a},a.nodes=function(b){if(!arguments.length)return v;v=b;return 
a},a.links=function(b){if(!arguments.length)return w;w=b;return a},a.size=function(b){if(!arguments.length)return c;c=b;return a},a.linkDistance=function(b){if(!arguments.length)return p;p=d3.functor(b);return a},a.distance=a.linkDistance,a.linkStrength=function(b){if(!arguments.length)return q;q=d3.functor(b);return a},a.friction=function(b){if(!arguments.length)return o;o=b;return a},a.charge=function(b){if(!arguments.length)return r;r=b;return a},a.gravity=function(b){if(!arguments.length)return s;s=b;return a},a.theta=function(b){if(!arguments.length)return t;t=b;return a},a.start=function(){function l(){if(!i){i=[];for(d=0;d<e;++d)i[d]=[];for(d=0;d<f;++d){var a=w[d];i[a.source.index].push(a.target),i[a.target.index].push(a.source)}}return i[b]}function k(a,c){var d=l(b),e=-1,f=d.length,g;while(++e<f)if(!isNaN(g=d[e][a]))return g;return Math.random()*c}var b,d,e=v.length,f=w.length,g=c[0],h=c[1],i,j;for(b=0;b<e;++b)(j=v[b]).index=b,j.weight=0;x=[],y=[];for(b=0;b<f;++b)j=w[b],typeof j.source=="number"&&(j.source=v[j.source]),typeof j.target=="number"&&(j.target=v[j.target]),x[b]=p.call(this,j,b),y[b]=q.call(this,j,b),++j.source.weight,++j.target.weight;for(b=0;b<e;++b)j=v[b],isNaN(j.x)&&(j.x=k("x",g)),isNaN(j.y)&&(j.y=k("y",h)),isNaN(j.px)&&(j.px=j.x),isNaN(j.py)&&(j.py=j.y);return a.resume()},a.resume=function(){n=.1,d3.timer(A);return a},a.stop=function(){n=0;return a},a.drag=function(){d||(d=d3.behavior.drag().on("dragstart",B).on("drag",j).on("dragend",i)),this.on("mouseover.force",g).on("mouseout.force",h).call(d)};return a};var e,f;d3.layout.partition=function(){function e(e,f){var g=a.call(this,e,f);c(g[0],0,b[0],b[1]/d(g[0]));return g}function d(a){var b=a.children,c=0;if(b){var e=-1,f=b.length;while(++e<f)c=Math.max(c,d(b[e]))}return 1+c}function c(a,b,d,e){var f=a.children;a.x=b,a.y=a.depth*e,a.dx=d,a.dy=e;if(f){var g=-1,h=f.length,i,j;d=a.value?d/a.value:0;while(++g<h)c(i=f[g],b,j=i.value*d,e),b+=j}}var a=d3.layout.hierarchy(),b=[1,1];e.size=function(a){if(!arguments.length)return b;b=a;return e};return y(e,a)},d3.layout.pie=function(){function f(f,g){var h=+(typeof c=="function"?c.apply(this,arguments):c),i=(typeof e=="function"?e.apply(this,arguments):e)-c,j=d3.range(f.length);b!=null&&j.sort(function(a,c){return b(f[a],f[c])});var k=f.map(a);i/=k.reduce(function(a,b){return a+b},0);var l=j.map(function(a){return{data:f[a],value:d=k[a],startAngle:h,endAngle:h+=d*i}});return f.map(function(a,b){return l[j[b]]})}var a=Number,b=null,c=0,e=2*Math.PI;f.value=function(b){if(!arguments.length)return a;a=b;return f},f.sort=function(a){if(!arguments.length)return b;b=a;return f},f.startAngle=function(a){if(!arguments.length)return c;c=a;return f},f.endAngle=function(a){if(!arguments.length)return e;e=a;return f};return f},d3.layout.stack=function(){function g(h,i){var j=h.map(function(b,c){return a.call(g,b,c)}),k=j.map(function(a,b){return a.map(function(a,b){return[e.call(g,a,b),f.call(g,a,b)]})}),l=b.call(g,k,i);j=d3.permute(j,l),k=d3.permute(k,l);var m=c.call(g,k,i),n=j.length,o=j[0].length,p,q,r;for(q=0;q<o;++q){d.call(g,j[0][q],r=m[q],k[0][q][1]);for(p=1;p<n;++p)d.call(g,j[p][q],r+=k[p-1][q][1],k[p][q][1])}return h}var a=Object,b=q["default"],c=r.zero,d=p,e=n,f=o;g.values=function(b){if(!arguments.length)return a;a=b;return g},g.order=function(a){if(!arguments.length)return b;b=typeof a=="function"?a:q[a];return g},g.offset=function(a){if(!arguments.length)return c;c=typeof a=="function"?a:r[a];return g},g.x=function(a){if(!arguments.length)return e;e=a;return 
g},g.y=function(a){if(!arguments.length)return f;f=a;return g},g.out=function(a){if(!arguments.length)return d;d=a;return g};return g};var q={"inside-out":function(a){var b=a.length,c,d,e=a.map(s),f=a.map(t),g=d3.range(b).sort(function(a,b){return e[a]-e[b]}),h=0,i=0,j=[],k=[];for(c=0;c<b;++c)d=g[c],h<i?(h+=f[d],j.push(d)):(i+=f[d],k.push(d));return k.reverse().concat(j)},reverse:function(a){return d3.range(a.length).reverse()},"default":function(a){return d3.range(a.length)}},r={silhouette:function(a){var b=a.length,c=a[0].length,d=[],e=0,f,g,h,i=[];for(g=0;g<c;++g){for(f=0,h=0;f<b;f++)h+=a[f][g][1];h>e&&(e=h),d.push(h)}for(g=0;g<c;++g)i[g]=(e-d[g])/2;return i},wiggle:function(a){var b=a.length,c=a[0],d=c.length,e=0,f,g,h,i,j,k,l,m,n,o=[];o[0]=m=n=0;for(g=1;g<d;++g){for(f=0,i=0;f<b;++f)i+=a[f][g][1];for(f=0,j=0,l=c[g][0]-c[g-1][0];f<b;++f){for(h=0,k=(a[f][g][1]-a[f][g-1][1])/(2*l);h<f;++h)k+=(a[h][g][1]-a[h][g-1][1])/l;j+=k*a[f][g][1]}o[g]=m-=i?j/i*l:0,m<n&&(n=m)}for(g=0;g<d;++g)o[g]-=n;return o},expand:function(a){var b=a.length,c=a[0].length,d=1/b,e,f,g,h=[];for(f=0;f<c;++f){for(e=0,g=0;e<b;e++)g+=a[e][f][1];if(g)for(e=0;e<b;e++)a[e][f][1]/=g;else for(e=0;e<b;e++)a[e][f][1]=d}for(f=0;f<c;++f)h[f]=0;return h},zero:function(a){var b=-1,c=a[0].length,d=[];while(++b<c)d[b]=0;return d}};d3.layout.histogram=function(){function e(e,f){var g=[],h=e.map(b,this),i=c.call(this,h,f),j=d.call(this,i,h,f),k,f=-1,l=h.length,m=j.length-1,n=a?1:1/l,o;while(++f<m)k=g[f]=[],k.dx=j[f+1]-(k.x=j[f]),k.y=0;f=-1;while(++f<l)o=h[f],o>=i[0]&&o<=i[1]&&(k=g[d3.bisect(j,o,1,m)-1],k.y+=n,k.push(e[f]));return g}var a=!0,b=Number,c=x,d=v;e.value=function(a){if(!arguments.length)return b;b=a;return e},e.range=function(a){if(!arguments.length)return c;c=d3.functor(a);return e},e.bins=function(a){if(!arguments.length)return d;d=typeof a=="number"?function(b){return w(b,a)}:d3.functor(a);return e},e.frequency=function(b){if(!arguments.length)return a;a=!!b;return e};return e},d3.layout.hierarchy=function(){function g(a){var b=[];e(a,0,b);return b}function f(a,b){var d=a.children,e=0;if(d){var h=-1,i=d.length,j=b+1;while(++h<i)e+=f(d[h],j)}else c&&(e=+c.call(g,D?a:a.data,b)||0);c&&(a.value=e);return e}function e(f,h,i){var j=b.call(g,f,h),k=D?f:{data:f};k.depth=h,i.push(k);if(j){var l=-1,m=j.length,n=k.children=[],o=0,p=h+1;while(++l<m)d=e(j[l],p,i),d.parent=k,n.push(d),o+=d.value;a&&n.sort(a),c&&(k.value=o)}else c&&(k.value=+c.call(g,f,h)||0);return k}var a=B,b=z,c=A;g.sort=function(b){if(!arguments.length)return a;a=b;return g},g.children=function(a){if(!arguments.length)return b;b=a;return g},g.value=function(a){if(!arguments.length)return c;c=a;return g},g.revalue=function(a){f(a,0);return a};return g};var D=!1;d3.layout.pack=function(){function c(c,d){var e=a.call(this,c,d),f=e[0];f.x=0,f.y=0,L(f);var g=b[0],h=b[1],i=1/Math.max(2*f.r/g,2*f.r/h);M(f,g/2,h/2,i);return e}var a=d3.layout.hierarchy().sort(E),b=[1,1];c.size=function(a){if(!arguments.length)return b;b=a;return c};return y(c,a)},d3.layout.cluster=function(){function d(d,e){var f=a.call(this,d,e),g=f[0],h,i=0,j,k;Z(g,function(a){a.children?(a.x=P(a.children),a.y=O(a.children)):(a.x=h?i+=b(a,h):0,a.y=0,h=a)});var l=Q(g),m=R(g),n=l.x-b(l,m)/2,o=m.x+b(m,l)/2;Z(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=(1-a.y/g.y)*c[1]});return f}var a=d3.layout.hierarchy().sort(null).value(null),b=S,c=[1,1];d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return 
y(d,a)},d3.layout.tree=function(){function d(d,e){function j(a,c,d){if(c){var e=a,f=a,g=c,h=a.parent.children[0],i=e._tree.mod,j=f._tree.mod,k=g._tree.mod,l=h._tree.mod,m;while(g=U(g),e=T(e),g&&e)h=T(h),f=U(f),f._tree.ancestor=a,m=g._tree.prelim+k-e._tree.prelim-i+b(g,e),m>0&&(_(ba(g,a,d),a,m),i+=m,j+=m),k+=g._tree.mod,i+=e._tree.mod,l+=h._tree.mod,j+=f._tree.mod;g&&!U(f)&&(f._tree.thread=g,f._tree.mod+=k-j),e&&!T(h)&&(h._tree.thread=e,h._tree.mod+=i-l,d=a)}return d}function i(a,b){a.x=a._tree.prelim+b;var c=a.children;if(c){var d=-1,e=c.length;b+=a._tree.mod;while(++d<e)i(c[d],b)}}function h(a,c){var d=a.children,e=a._tree;if(d&&(f=d.length)){var f,g=d[0],i,k=g,l,m=-1;while(++m<f)l=d[m],h(l,i),k=j(l,i,k),i=l;$(a);var n=.5*(g._tree.prelim+l._tree.prelim);c?(e.prelim=c._tree.prelim+b(a,c),e.mod=e.prelim-n):e.prelim=n}else c&&(e.prelim=c._tree.prelim+b(a,c))}var f=a.call(this,d,e),g=f[0];Z(g,function(a,b){a._tree={ancestor:a,prelim:0,mod:0,change:0,shift:0,number:b?b._tree.number+1:0}}),h(g),i(g,-g._tree.prelim);var k=V(g,X),l=V(g,W),m=V(g,Y),n=k.x-b(k,l)/2,o=l.x+b(l,k)/2,p=m.depth||1;Z(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=a.depth/p*c[1],delete a._tree});return f}var a=d3.layout.hierarchy().sort(null).value(null),b=S,c=[1,1];d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return y(d,a)},d3.layout.treemap=function(){function n(b){var d=g||a(b),e=d[0];e.x=0,e.y=0,e.dx=c[0],e.dy=c[1],g&&a.revalue(e),i([e],e.dx*e.dy/e.value),(g?k:j)(e),f&&(g=d);return d}function m(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=j?d.dy:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dy=j,h+=k.dx=j?b(k.area/j):0;k.z=!0,k.dx+=d.x+d.dx-h,d.y+=j,d.dy-=j}else{if(e||j>d.dx)j=j?d.dx:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dx=j,i+=k.dy=j?b(k.area/j):0;k.z=!1,k.dy+=d.y+d.dy-i,d.x+=j,d.dx-=j}}function l(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,i=a.length;while(++g<i){if(!(d=a[g].area))continue;d<f&&(f=d),d>e&&(e=d)}c*=c,b*=b;return c?Math.max(b*e*h/c,c/(b*f*h)):Infinity}function k(a){if(!!a.children){var b=e(a),c=a.children.slice(),d,f=[];i(c,b.dx*b.dy/a.value),f.area=0;while(d=c.pop())f.push(d),f.area+=d.area,d.z!=null&&(m(f,d.z?b.dx:b.dy,b,!c.length),f.length=f.area=0);a.children.forEach(k)}}function j(a){if(!!a.children){var b=e(a),c=[],d=a.children.slice(),f,g=Infinity,h,k=Math.min(b.dx,b.dy),n;i(d,b.dx*b.dy/a.value),c.area=0;while((n=d.length)>0)c.push(f=d[n-1]),c.area+=f.area,(h=l(c,k))<=g?(d.pop(),g=h):(c.area-=c.pop().area,m(c,k,b,!1),k=Math.min(b.dx,b.dy),c.length=c.area=0,g=Infinity);c.length&&(m(c,k,b,!0),c.length=c.area=0),a.children.forEach(j)}}function i(a,b){var c=-1,d=a.length,e,f;while(++c<d)f=(e=a[c]).value*(b<0?0:b),e.area=isNaN(f)||f<=0?0:f}var a=d3.layout.hierarchy(),b=Math.round,c=[1,1],d=null,e=bb,f=!1,g,h=.5*(1+Math.sqrt(5));n.size=function(a){if(!arguments.length)return c;c=a;return n},n.padding=function(a){function c(b){return bc(b,a)}function b(b){var c=a.call(n,b,b.depth);return c==null?bb(b):bc(b,typeof c=="number"?[c,c,c,c]:c)}if(!arguments.length)return d;var f;e=(d=a)==null?bb:(f=typeof a)==="function"?b:f==="number"?(a=[a,a,a,a],c):c;return n},n.round=function(a){if(!arguments.length)return b!=Number;b=a?Math.round:Number;return n},n.sticky=function(a){if(!arguments.length)return f;f=a,g=null;return n},n.ratio=function(a){if(!arguments.length)return h;h=a;return n};return y(n,a)}})()
<ide>\ No newline at end of file
<add>(function(){function bc(a,b){var c=a.x+b[3],d=a.y+b[0],e=a.dx-b[1]-b[3],f=a.dy-b[0]-b[2];e<0&&(c+=e/2,e=0),f<0&&(d+=f/2,f=0);return{x:c,y:d,dx:e,dy:f}}function bb(a){return{x:a.x,y:a.y,dx:a.dx,dy:a.dy}}function ba(a,b,c){return a._tree.ancestor.parent==b.parent?a._tree.ancestor:c}function _(a,b,c){a=a._tree,b=b._tree;var d=c/(b.number-a.number);a.change+=d,b.change-=d,b.shift+=c,b.prelim+=c,b.mod+=c}function $(a){var b=0,c=0,d=a.children,e=d.length,f;while(--e>=0)f=d[e]._tree,f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function Z(a,b){function c(a,d){var e=a.children;if(e){var f,g=null,h=-1,i=e.length;while(++h<i)f=e[h],c(f,g),g=f}b(a,d)}c(a,null)}function Y(a,b){return a.depth-b.depth}function X(a,b){return b.x-a.x}function W(a,b){return a.x-b.x}function V(a,b){var c=a.children;if(c){var d,e=c.length,f=-1;while(++f<e)b(d=V(c[f],b),a)>0&&(a=d)}return a}function U(a){return a.children?a.children[a.children.length-1]:a._tree.thread}function T(a){return a.children?a.children[0]:a._tree.thread}function S(a,b){return a.parent==b.parent?1:2}function R(a){var b=a.children;return b?R(b[b.length-1]):a}function Q(a){var b=a.children;return b?Q(b[0]):a}function P(a){return a.reduce(function(a,b){return a+b.x},0)/a.length}function O(a){return 1+d3.max(a,function(a){return a.y})}function N(a,b,c){var d=a.r+c.r,e=b.x-a.x,f=b.y-a.y;if(d&&(e||f)){var g=b.r+c.r,h=Math.sqrt(e*e+f*f),i=Math.max(-1,Math.min(1,(d*d+h*h-g*g)/(2*d*h))),j=Math.acos(i),k=i*d,l=Math.sin(j)*d;e/=h,f/=h,c.x=a.x+k*e+l*f,c.y=a.y+k*f-l*e}else c.x=a.x,c.y=a.y}function M(a,b,c,d){var e=a.children;a.x=b+=d*a.x,a.y=c+=d*a.y,a.r*=d;if(e){var f=-1,g=e.length;while(++f<g)M(e[f],b,c,d)}}function L(a){var b=a.children;b&&b.length?(b.forEach(L),a.r=I(b)):a.r=Math.sqrt(a.value)}function K(a){delete a._pack_next,delete a._pack_prev}function J(a){a._pack_next=a._pack_prev=a}function I(a){function l(a){b=Math.min(a.x-a.r,b),c=Math.max(a.x+a.r,c),d=Math.min(a.y-a.r,d),e=Math.max(a.y+a.r,e)}var b=Infinity,c=-Infinity,d=Infinity,e=-Infinity,f=a.length,g,h,i,j,k;a.forEach(J),g=a[0],g.x=-g.r,g.y=0,l(g);if(f>1){h=a[1],h.x=h.r,h.y=0,l(h);if(f>2){i=a[2],N(g,h,i),l(i),F(g,i),g._pack_prev=i,F(i,h),h=g._pack_next;for(var m=3;m<f;m++){N(g,h,i=a[m]);var n=0,o=1,p=1;for(j=h._pack_next;j!==h;j=j._pack_next,o++)if(H(j,i)){n=1;break}if(n==1)for(k=g._pack_prev;k!==j._pack_prev;k=k._pack_prev,p++)if(H(k,i)){p<o&&(n=-1,j=k);break}n==0?(F(g,i),h=i,l(i)):n>0?(G(g,j),h=j,m--):(G(j,h),g=j,m--)}}}var q=(b+c)/2,r=(d+e)/2,s=0;for(var m=0;m<f;m++){var t=a[m];t.x-=q,t.y-=r,s=Math.max(s,t.r+Math.sqrt(t.x*t.x+t.y*t.y))}a.forEach(K);return s}function H(a,b){var c=b.x-a.x,d=b.y-a.y,e=a.r+b.r;return e*e-c*c-d*d>.001}function G(a,b){a._pack_next=b,b._pack_prev=a}function F(a,b){var c=a._pack_next;a._pack_next=b,b._pack_prev=a,b._pack_next=c,c._pack_prev=b}function E(a,b){return a.value-b.value}function C(a){return d3.merge(a.map(function(a){return(a.children||[]).map(function(b){return{source:a,target:b}})}))}function B(a,b){return b.value-a.value}function A(a){return a.value}function z(a){return a.children}function y(a,b){a.sort=d3.rebind(a,b.sort),a.children=d3.rebind(a,b.children),a.links=C,a.value=d3.rebind(a,b.value),a.nodes=function(b){D=!0;return(a.nodes=a)(b)};return a}function x(a){return[d3.min(a),d3.max(a)]}function w(a,b){var c=-1,d=+a[0],e=(a[1]-d)/b,f=[];while(++c<=b)f[c]=e*c+d;return f}function v(a,b){return w(a,Math.ceil(Math.log(b.length)/Math.LN2+1))}function u(a,b){return a+b[1]}function t(a){return a.reduce(u,0)}function s(a){var 
b=1,c=0,d=a[0][1],e,f=a.length;for(;b<f;++b)(e=a[b][1])>d&&(c=b,d=e);return c}function p(a,b,c){a.y0=b,a.y=c}function o(a){return a.y}function n(a){return a.x}function m(a){return 1}function l(a){return 20}function k(a){var b=0,c=0;a.count=0;if(!a.leaf){var d=a.nodes,e=d.length,f=-1,g;while(++f<e){g=d[f];if(g==null)continue;k(g),a.count+=g.count,b+=g.count*g.cx,c+=g.count*g.cy}}a.point&&(a.leaf||(a.point.x+=Math.random()-.5,a.point.y+=Math.random()-.5),a.count++,b+=a.point.x,c+=a.point.y),a.cx=b/a.count,a.cy=c/a.count}function j(){f.px+=d3.event.dx,f.py+=d3.event.dy,e.resume()}function i(){j(),f.fixed&=1,e=f=null}function h(a){a!==f&&(a.fixed&=1)}function g(a){a.fixed|=2}function c(a,c){if(a===c)return a;var d=b(a),e=b(c),f=d.pop(),g=e.pop(),h=null;while(f===g)h=f,f=d.pop(),g=e.pop();return h}function b(a){var b=[],c=a.parent;while(c!=null)b.push(a),a=c,c=c.parent;b.push(a);return b}function a(a){var b=a.source,d=a.target,e=c(b,d),f=[b];while(b!==e)b=b.parent,f.push(b);var g=f.length;while(d!==e)f.splice(g,0,d),d=d.parent;return f}d3.layout={},d3.layout.bundle=function(){return function(b){var c=[],d=-1,e=b.length;while(++d<e)c.push(a(b[d]));return c}},d3.layout.chord=function(){function k(){b.sort(function(a,b){return i(a.target.value,b.target.value)})}function j(){var a={},j=[],l=d3.range(e),m=[],n,o,p,q,r;b=[],c=[],n=0,q=-1;while(++q<e){o=0,r=-1;while(++r<e)o+=d[q][r];j.push(o),m.push(d3.range(e)),n+=o}g&&l.sort(function(a,b){return g(j[a],j[b])}),h&&m.forEach(function(a,b){a.sort(function(a,c){return h(d[b][a],d[b][c])})}),n=(2*Math.PI-f*e)/n,o=0,q=-1;while(++q<e){p=o,r=-1;while(++r<e){var s=l[q],t=m[q][r],u=d[s][t];a[s+"-"+t]={index:s,subindex:t,startAngle:o,endAngle:o+=u*n,value:u}}c.push({index:s,startAngle:p,endAngle:o,value:(o-p)/n}),o+=f}q=-1;while(++q<e){r=q-1;while(++r<e){var v=a[q+"-"+r],w=a[r+"-"+q];(v.value||w.value)&&b.push(v.value<w.value?{source:w,target:v}:{source:v,target:w})}}i&&k()}var a={},b,c,d,e,f=0,g,h,i;a.matrix=function(f){if(!arguments.length)return d;e=(d=f)&&d.length,b=c=null;return a},a.padding=function(d){if(!arguments.length)return f;f=d,b=c=null;return a},a.sortGroups=function(d){if(!arguments.length)return g;g=d,b=c=null;return a},a.sortSubgroups=function(c){if(!arguments.length)return h;h=c,b=null;return a},a.sortChords=function(c){if(!arguments.length)return i;i=c,b&&k();return a},a.chords=function(){b||j();return b},a.groups=function(){c||j();return c};return a},d3.layout.force=function(){function B(b){g(f=b),e=a}function A(){var a=v.length,d=w.length,e,f,g,h,i,j,l,m,p;for(f=0;f<d;++f){g=w[f],h=g.source,i=g.target,m=i.x-h.x,p=i.y-h.y;if(j=m*m+p*p)j=n*y[f]*((j=Math.sqrt(j))-x[f])/j,m*=j,p*=j,i.x-=m*(l=h.weight/(i.weight+h.weight)),i.y-=p*l,h.x+=m*(l=1-l),h.y+=p*l}if(l=n*s){m=c[0]/2,p=c[1]/2,f=-1;if(l)while(++f<a)g=v[f],g.x+=(m-g.x)*l,g.y+=(p-g.y)*l}if(l=n*r){k(e=d3.geom.quadtree(v)),f=-1;while(++f<a)(g=v[f]).fixed||e.visit(z(g,l))}f=-1;while(++f<a)g=v[f],g.fixed?(g.x=g.px,g.y=g.py):(g.x-=(g.px-(g.px=g.x))*o,g.y-=(g.py-(g.py=g.y))*o);b.tick.dispatch({type:"tick",alpha:n});return(n*=.99)<.005}function z(a,b){return function(c,d,e,f,g){if(c.point!==a){var h=c.cx-a.x,i=c.cy-a.y,j=1/Math.sqrt(h*h+i*i);if((f-d)*j<t){var k=b*c.count*j*j;a.px-=h*k,a.py-=i*k;return!0}if(c.point&&isFinite(j)){var k=b*j*j;a.px-=h*k,a.py-=i*k}}}}var a={},b=d3.dispatch("tick"),c=[1,1],d,n,o=.9,p=l,q=m,r=-30,s=.1,t=.8,u,v=[],w=[],x,y;a.on=function(c,d){b[c].add(d);return a},a.nodes=function(b){if(!arguments.length)return v;v=b;return 
a},a.links=function(b){if(!arguments.length)return w;w=b;return a},a.size=function(b){if(!arguments.length)return c;c=b;return a},a.linkDistance=function(b){if(!arguments.length)return p;p=d3.functor(b);return a},a.distance=a.linkDistance,a.linkStrength=function(b){if(!arguments.length)return q;q=d3.functor(b);return a},a.friction=function(b){if(!arguments.length)return o;o=b;return a},a.charge=function(b){if(!arguments.length)return r;r=b;return a},a.gravity=function(b){if(!arguments.length)return s;s=b;return a},a.theta=function(b){if(!arguments.length)return t;t=b;return a},a.start=function(){function l(){if(!i){i=[];for(d=0;d<e;++d)i[d]=[];for(d=0;d<f;++d){var a=w[d];i[a.source.index].push(a.target),i[a.target.index].push(a.source)}}return i[b]}function k(a,c){var d=l(b),e=-1,f=d.length,g;while(++e<f)if(!isNaN(g=d[e][a]))return g;return Math.random()*c}var b,d,e=v.length,f=w.length,g=c[0],h=c[1],i,j;for(b=0;b<e;++b)(j=v[b]).index=b,j.weight=0;x=[],y=[];for(b=0;b<f;++b)j=w[b],typeof j.source=="number"&&(j.source=v[j.source]),typeof j.target=="number"&&(j.target=v[j.target]),x[b]=p.call(this,j,b),y[b]=q.call(this,j,b),++j.source.weight,++j.target.weight;for(b=0;b<e;++b)j=v[b],isNaN(j.x)&&(j.x=k("x",g)),isNaN(j.y)&&(j.y=k("y",h)),isNaN(j.px)&&(j.px=j.x),isNaN(j.py)&&(j.py=j.y);return a.resume()},a.resume=function(){n=.1,d3.timer(A);return a},a.stop=function(){n=0;return a},a.drag=function(){d||(d=d3.behavior.drag().on("dragstart",B).on("drag",j).on("dragend",i)),this.on("mouseover.force",g).on("mouseout.force",h).call(d)};return a};var e,f;d3.layout.partition=function(){function e(e,f){var g=a.call(this,e,f);c(g[0],0,b[0],b[1]/d(g[0]));return g}function d(a){var b=a.children,c=0;if(b){var e=-1,f=b.length;while(++e<f)c=Math.max(c,d(b[e]))}return 1+c}function c(a,b,d,e){var f=a.children;a.x=b,a.y=a.depth*e,a.dx=d,a.dy=e;if(f){var g=-1,h=f.length,i,j;d=a.value?d/a.value:0;while(++g<h)c(i=f[g],b,j=i.value*d,e),b+=j}}var a=d3.layout.hierarchy(),b=[1,1];e.size=function(a){if(!arguments.length)return b;b=a;return e};return y(e,a)},d3.layout.pie=function(){function f(f,g){var h=+(typeof c=="function"?c.apply(this,arguments):c),i=(typeof e=="function"?e.apply(this,arguments):e)-c,j=d3.range(f.length);b!=null&&j.sort(function(a,c){return b(f[a],f[c])});var k=f.map(a);i/=k.reduce(function(a,b){return a+b},0);var l=j.map(function(a){return{data:f[a],value:d=k[a],startAngle:h,endAngle:h+=d*i}});return f.map(function(a,b){return l[j[b]]})}var a=Number,b=null,c=0,e=2*Math.PI;f.value=function(b){if(!arguments.length)return a;a=b;return f},f.sort=function(a){if(!arguments.length)return b;b=a;return f},f.startAngle=function(a){if(!arguments.length)return c;c=a;return f},f.endAngle=function(a){if(!arguments.length)return e;e=a;return f};return f},d3.layout.stack=function(){function g(h,i){var j=h.map(function(b,c){return a.call(g,b,c)}),k=j.map(function(a,b){return a.map(function(a,b){return[e.call(g,a,b),f.call(g,a,b)]})}),l=b.call(g,k,i);j=d3.permute(j,l),k=d3.permute(k,l);var m=c.call(g,k,i),n=j.length,o=j[0].length,p,q,r;for(q=0;q<o;++q){d.call(g,j[0][q],r=m[q],k[0][q][1]);for(p=1;p<n;++p)d.call(g,j[p][q],r+=k[p-1][q][1],k[p][q][1])}return h}var a=Object,b=q["default"],c=r.zero,d=p,e=n,f=o;g.values=function(b){if(!arguments.length)return a;a=b;return g},g.order=function(a){if(!arguments.length)return b;b=typeof a=="function"?a:q[a];return g},g.offset=function(a){if(!arguments.length)return c;c=typeof a=="function"?a:r[a];return g},g.x=function(a){if(!arguments.length)return e;e=a;return 
g},g.y=function(a){if(!arguments.length)return f;f=a;return g},g.out=function(a){if(!arguments.length)return d;d=a;return g};return g};var q={"inside-out":function(a){var b=a.length,c,d,e=a.map(s),f=a.map(t),g=d3.range(b).sort(function(a,b){return e[a]-e[b]}),h=0,i=0,j=[],k=[];for(c=0;c<b;++c)d=g[c],h<i?(h+=f[d],j.push(d)):(i+=f[d],k.push(d));return k.reverse().concat(j)},reverse:function(a){return d3.range(a.length).reverse()},"default":function(a){return d3.range(a.length)}},r={silhouette:function(a){var b=a.length,c=a[0].length,d=[],e=0,f,g,h,i=[];for(g=0;g<c;++g){for(f=0,h=0;f<b;f++)h+=a[f][g][1];h>e&&(e=h),d.push(h)}for(g=0;g<c;++g)i[g]=(e-d[g])/2;return i},wiggle:function(a){var b=a.length,c=a[0],d=c.length,e=0,f,g,h,i,j,k,l,m,n,o=[];o[0]=m=n=0;for(g=1;g<d;++g){for(f=0,i=0;f<b;++f)i+=a[f][g][1];for(f=0,j=0,l=c[g][0]-c[g-1][0];f<b;++f){for(h=0,k=(a[f][g][1]-a[f][g-1][1])/(2*l);h<f;++h)k+=(a[h][g][1]-a[h][g-1][1])/l;j+=k*a[f][g][1]}o[g]=m-=i?j/i*l:0,m<n&&(n=m)}for(g=0;g<d;++g)o[g]-=n;return o},expand:function(a){var b=a.length,c=a[0].length,d=1/b,e,f,g,h=[];for(f=0;f<c;++f){for(e=0,g=0;e<b;e++)g+=a[e][f][1];if(g)for(e=0;e<b;e++)a[e][f][1]/=g;else for(e=0;e<b;e++)a[e][f][1]=d}for(f=0;f<c;++f)h[f]=0;return h},zero:function(a){var b=-1,c=a[0].length,d=[];while(++b<c)d[b]=0;return d}};d3.layout.histogram=function(){function e(e,f){var g=[],h=e.map(b,this),i=c.call(this,h,f),j=d.call(this,i,h,f),k,f=-1,l=h.length,m=j.length-1,n=a?1:1/l,o;while(++f<m)k=g[f]=[],k.dx=j[f+1]-(k.x=j[f]),k.y=0;f=-1;while(++f<l)o=h[f],o>=i[0]&&o<=i[1]&&(k=g[d3.bisect(j,o,1,m)-1],k.y+=n,k.push(e[f]));return g}var a=!0,b=Number,c=x,d=v;e.value=function(a){if(!arguments.length)return b;b=a;return e},e.range=function(a){if(!arguments.length)return c;c=d3.functor(a);return e},e.bins=function(a){if(!arguments.length)return d;d=typeof a=="number"?function(b){return w(b,a)}:d3.functor(a);return e},e.frequency=function(b){if(!arguments.length)return a;a=!!b;return e};return e},d3.layout.hierarchy=function(){function g(a){var b=[];e(a,0,b);return b}function f(a,b){var d=a.children,e=0;if(d){var h=-1,i=d.length,j=b+1;while(++h<i)e+=f(d[h],j)}else c&&(e=+c.call(g,D?a:a.data,b)||0);c&&(a.value=e);return e}function e(f,h,i){var j=b.call(g,f,h),k=D?f:{data:f};k.depth=h,i.push(k);if(j){var l=-1,m=j.length,n=k.children=[],o=0,p=h+1;while(++l<m)d=e(j[l],p,i),d.parent=k,n.push(d),o+=d.value;a&&n.sort(a),c&&(k.value=o)}else c&&(k.value=+c.call(g,f,h)||0);return k}var a=B,b=z,c=A;g.sort=function(b){if(!arguments.length)return a;a=b;return g},g.children=function(a){if(!arguments.length)return b;b=a;return g},g.value=function(a){if(!arguments.length)return c;c=a;return g},g.revalue=function(a){f(a,0);return a};return g};var D=!1;d3.layout.pack=function(){function c(c,d){var e=a.call(this,c,d),f=e[0];f.x=0,f.y=0,L(f);var g=b[0],h=b[1],i=1/Math.max(2*f.r/g,2*f.r/h);M(f,g/2,h/2,i);return e}var a=d3.layout.hierarchy().sort(E),b=[1,1];c.size=function(a){if(!arguments.length)return b;b=a;return c};return y(c,a)},d3.layout.cluster=function(){function d(d,e){var f=a.call(this,d,e),g=f[0],h,i=0,j,k;Z(g,function(a){a.children?(a.x=P(a.children),a.y=O(a.children)):(a.x=h?i+=b(a,h):0,a.y=0,h=a)});var l=Q(g),m=R(g),n=l.x-b(l,m)/2,o=m.x+b(m,l)/2;Z(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=(1-a.y/g.y)*c[1]});return f}var a=d3.layout.hierarchy().sort(null).value(null),b=S,c=[1,1];d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return 
y(d,a)},d3.layout.tree=function(){function d(d,e){function j(a,c,d){if(c){var e=a,f=a,g=c,h=a.parent.children[0],i=e._tree.mod,j=f._tree.mod,k=g._tree.mod,l=h._tree.mod,m;while(g=U(g),e=T(e),g&&e)h=T(h),f=U(f),f._tree.ancestor=a,m=g._tree.prelim+k-e._tree.prelim-i+b(g,e),m>0&&(_(ba(g,a,d),a,m),i+=m,j+=m),k+=g._tree.mod,i+=e._tree.mod,l+=h._tree.mod,j+=f._tree.mod;g&&!U(f)&&(f._tree.thread=g,f._tree.mod+=k-j),e&&!T(h)&&(h._tree.thread=e,h._tree.mod+=i-l,d=a)}return d}function i(a,b){a.x=a._tree.prelim+b;var c=a.children;if(c){var d=-1,e=c.length;b+=a._tree.mod;while(++d<e)i(c[d],b)}}function h(a,c){var d=a.children,e=a._tree;if(d&&(f=d.length)){var f,g=d[0],i,k=g,l,m=-1;while(++m<f)l=d[m],h(l,i),k=j(l,i,k),i=l;$(a);var n=.5*(g._tree.prelim+l._tree.prelim);c?(e.prelim=c._tree.prelim+b(a,c),e.mod=e.prelim-n):e.prelim=n}else c&&(e.prelim=c._tree.prelim+b(a,c))}var f=a.call(this,d,e),g=f[0];Z(g,function(a,b){a._tree={ancestor:a,prelim:0,mod:0,change:0,shift:0,number:b?b._tree.number+1:0}}),h(g),i(g,-g._tree.prelim);var k=V(g,X),l=V(g,W),m=V(g,Y),n=k.x-b(k,l)/2,o=l.x+b(l,k)/2,p=m.depth||1;Z(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=a.depth/p*c[1],delete a._tree});return f}var a=d3.layout.hierarchy().sort(null).value(null),b=S,c=[1,1];d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return y(d,a)},d3.layout.treemap=function(){function n(b){var d=g||a(b),e=d[0];e.x=0,e.y=0,e.dx=c[0],e.dy=c[1],g&&a.revalue(e),i([e],e.dx*e.dy/e.value),(g?k:j)(e),f&&(g=d);return d}function m(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=j?d.dy:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dy=j,h+=k.dx=j?b(k.area/j):0;k.z=!0,k.dx+=d.x+d.dx-h,d.y+=j,d.dy-=j}else{if(e||j>d.dx)j=j?d.dx:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dx=j,i+=k.dy=j?b(k.area/j):0;k.z=!1,k.dy+=d.y+d.dy-i,d.x+=j,d.dx-=j}}function l(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,i=a.length;while(++g<i){if(!(d=a[g].area))continue;d<f&&(f=d),d>e&&(e=d)}c*=c,b*=b;return c?Math.max(b*e*h/c,c/(b*f*h)):Infinity}function k(a){if(!!a.children){var b=e(a),c=a.children.slice(),d,f=[];i(c,b.dx*b.dy/a.value),f.area=0;while(d=c.pop())f.push(d),f.area+=d.area,d.z!=null&&(m(f,d.z?b.dx:b.dy,b,!c.length),f.length=f.area=0);a.children.forEach(k)}}function j(a){if(!!a.children){var b=e(a),c=[],d=a.children.slice(),f,g=Infinity,h,k=Math.min(b.dx,b.dy),n;i(d,b.dx*b.dy/a.value),c.area=0;while((n=d.length)>0)c.push(f=d[n-1]),c.area+=f.area,(h=l(c,k))<=g?(d.pop(),g=h):(c.area-=c.pop().area,m(c,k,b,!1),k=Math.min(b.dx,b.dy),c.length=c.area=0,g=Infinity);c.length&&(m(c,k,b,!0),c.length=c.area=0),a.children.forEach(j)}}function i(a,b){var c=-1,d=a.length,e,f;while(++c<d)f=(e=a[c]).value*(b<0?0:b),e.area=isNaN(f)||f<=0?0:f}var a=d3.layout.hierarchy(),b=Math.round,c=[1,1],d=null,e=bb,f=!1,g,h=.5*(1+Math.sqrt(5));n.size=function(a){if(!arguments.length)return c;c=a;return n},n.padding=function(a){function c(b){return bc(b,a)}function b(b){var c=a.call(n,b,b.depth);return c==null?bb(b):bc(b,typeof c=="number"?[c,c,c,c]:c)}if(!arguments.length)return d;var f;e=(d=a)==null?bb:(f=typeof a)==="function"?b:f==="number"?(a=[a,a,a,a],c):c;return n},n.round=function(a){if(!arguments.length)return b!=Number;b=a?Math.round:Number;return n},n.sticky=function(a){if(!arguments.length)return f;f=a,g=null;return n},n.ratio=function(a){if(!arguments.length)return h;h=a;return n};return y(n,a)}})()
<ide>\ No newline at end of file
<ide><path>src/layout/pack.js
<ide> function d3_layout_packPlace(a, b, c) {
<ide> var db = a.r + c.r,
<ide> dx = b.x - a.x,
<ide> dy = b.y - a.y;
<del> if (db && dx && dy) {
<add> if (db && (dx || dy)) {
<ide> var da = b.r + c.r,
<ide> dc = Math.sqrt(dx * dx + dy * dy),
<del> cos = Math.min(1, (db * db + dc * dc - da * da) / (2 * db * dc)),
<add> cos = Math.max(-1, Math.min(1, (db * db + dc * dc - da * da) / (2 * db * dc))),
<ide> theta = Math.acos(cos),
<ide> x = cos * db,
<ide> h = Math.sin(theta) * db;
<ide><path>test/layout/pack-test.js
<ide> suite.addBatch({
<ide> {value: 0, depth: 1, x: 0.0, y: 0.5, r: 0.0},
<ide> {value: 1, depth: 1, x: 0.5, y: 0.5, r: 0.5}
<ide> ]);
<add> },
<add> "can handle residual floating point error": function(pack) {
<add> var result = pack.nodes({children: [
<add> {value: 0.005348322447389364},
<add> {value: 0.8065882022492588},
<add> {value: 0}
<add> ]}).map(layout);
<add> assert.deepEqual(result.map(function(d) { return d.depth; }), [0, 1, 1, 1]);
<add> assert.inDelta(result.map(function(d) { return d.value; }), [.811936, .005348, .806588, .0], 1e-6);
<add> assert.inDelta(result.map(function(d) { return d.x; }), [.5, .962350, .462350, .924701], 1e-6);
<add> assert.inDelta(result.map(function(d) { return d.y; }), [.5, .5, .5, .5], 1e-6);
<add> assert.inDelta(result.map(function(d) { return d.r; }), [.5, .037649, .462350, .0], 1e-6);
<ide> }
<ide> }
<ide> });
| 4
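A note on the circle-packing fix in the record above: the patch relaxes the early-out from `db && dx && dy` to `db && (dx || dy)` (circles that differ in only one coordinate still need placement) and clamps the law-of-cosines ratio before `Math.acos`, since residual floating point drift can push the ratio just outside [-1, 1] and make `Math.acos` return NaN. Below is a minimal standalone sketch of the guarded step; the names are illustrative, not d3 internals.

```js
// Math.acos returns NaN for inputs outside [-1, 1], and the law-of-cosines
// ratio can drift past those bounds by a few ulps, so clamp it first.
function placementAngle(db, dc, da) {
  var cos = Math.max(-1, Math.min(1, (db * db + dc * dc - da * da) / (2 * db * dc)));
  return Math.acos(cos); // always a real angle, even if cos rounded past +/-1
}

console.log(placementAngle(1, 1, 2)); // Math.PI, not NaN
```

The new "can handle residual floating point error" test in the same record exercises exactly this path with values whose ratio lands a hair outside the valid range.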
|
Python
|
Python
|
remove irrelevant tests
|
8c02f4ec8c4dcb5c773896d7d43d1ca7da9fd41a
|
<ide><path>tests/keras/backend/backend_test.py
<ide> def test_print_tensor(self, capsys):
<ide> # Theano inserts "__str__ = " for no good reason
<ide> assert out.replace('__str__ = ', '') == 'msg [[1.]]\n'
<ide>
<del> check_single_tensor_operation('print_tensor', (), WITH_NP)
<del> check_single_tensor_operation('print_tensor', (2,), WITH_NP)
<del>
<ide> def test_elementwise_operations(self):
<ide> check_single_tensor_operation('max', (4, 2), WITH_NP)
<ide> check_single_tensor_operation('max', (4, 2), WITH_NP, axis=1, keepdims=True)
<ide> def test_function(self):
<ide> x_list.append(x)
<ide> y = k.placeholder(ndim=2)
<ide> exp = k.square(x) + y
<del> update = x * 2
<add> # Need to use `identity` to make this symbolic
<add> # (TODO: fix in tf.keras)
<add> update = k.identity(x) * 2
<ide> f = k.function([y], [exp], updates=[(x, update)])
<ide> f_list.append(f)
<ide>
<ide> def step_function(inputs, states):
<ide> expected_outputs = inputs_vals.copy()
<ide> # but for the second sample all outputs in masked region should be the same
<ide> # as last output before masked region
<del> expected_outputs[1, -mask_last_num_timesteps:] = \
<del> expected_outputs[1, -(mask_last_num_timesteps + 1)]
<add> expected_outputs[1, -mask_last_num_timesteps:] = expected_outputs[
<add> 1, -(mask_last_num_timesteps + 1)]
<ide>
<ide> expected_state = initial_state_vals.copy()
<ide> # first state should be incremented for every timestep (no masking)
<ide> def step_function(inputs, states):
<ide> expected_state[1] += (num_timesteps - mask_last_num_timesteps)
<ide>
<ide> # verify same expected output for `unroll=true/false`
<del> inputs = K.variable(inputs_vals)
<del> initial_states = [K.variable(initial_state_vals)]
<del> mask = K.variable(mask_vals)
<add> inputs = K.constant(inputs_vals)
<add> initial_states = [K.constant(initial_state_vals)]
<add> mask = K.constant(mask_vals)
<ide> for unroll in [True, False]:
<ide> last_output, outputs, last_states = K.rnn(
<ide> step_function,
<ide> def step_function(inputs, states):
<ide> # same as the second to final output (before masked region)
<ide> expected_outputs[-1, -1] = expected_outputs[-1, -2]
<ide>
<del> inputs = K.variable(inputs_vals)
<del> initial_states = [K.variable(initial_state_vals)]
<del> mask = K.variable(mask_vals)
<add> inputs = K.constant(inputs_vals)
<add> initial_states = [K.constant(initial_state_vals)]
<add> mask = K.constant(mask_vals)
<ide> for unroll in [True, False]:
<ide> last_output, outputs, last_states = K.rnn(
<ide> step_function,
<ide> def test_nn_operations(self):
<ide> def test_crossentropy(self):
<ide> # toy label matrix (4 samples, 2 classes)
<ide> label = np.array([[.4, .6], [.3, .7], [.1, .9], [.2, .8]], dtype=np.float32)
<del> check_two_tensor_operation('binary_crossentropy', label, (4, 2), WITH_NP)
<add> binary_targets = np.array([[.3, .7], [.2, .8], [.4, .6], [.1, .9]], dtype=np.float32)
<add> categorical_targets = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=np.float32)
<add> check_two_tensor_operation('binary_crossentropy', label, binary_targets, WITH_NP)
<ide> check_two_tensor_operation('binary_crossentropy', label, (4, 2),
<ide> WITH_NP, from_logits=True)
<del> check_two_tensor_operation('categorical_crossentropy', label, (4, 2),
<add> check_two_tensor_operation('categorical_crossentropy', label, categorical_targets,
<ide> WITH_NP, cntk_two_dynamicity=True)
<ide> check_two_tensor_operation('categorical_crossentropy', label, (4, 2),
<ide> WITH_NP, cntk_two_dynamicity=True,
<ide> from_logits=True)
<ide>
<ide> # toy label matrix (2 samples, 3 classes)
<ide> label = np.array([[.4, .1, .5], [.2, .6, .2]], dtype=np.float32)
<del> check_two_tensor_operation('categorical_crossentropy', label, (2, 3),
<add> categorical_targets = np.array([[0, 1, 0], [1, 0, 0]], dtype=np.float32)
<add> check_two_tensor_operation('categorical_crossentropy', label, categorical_targets,
<ide> WITH_NP, cntk_two_dynamicity=True)
<ide> check_two_tensor_operation('categorical_crossentropy', label, (2, 3),
<ide> WITH_NP, cntk_two_dynamicity=True,
<ide><path>tests/keras/callbacks/tensorboard_test.py
<ide> def callbacks_factory(histogram_freq=0,
<ide> assert not tmpdir.listdir()
<ide>
<ide>
<del>@pytest.mark.skipif((K.backend() != 'tensorflow'),
<del> reason='Requires TensorFlow backend')
<del>def test_TensorBoard_histogram_freq_must_have_validation_data(tmpdir):
<del> np.random.seed(np.random.randint(1, 1e7))
<del> filepath = str(tmpdir / 'logs')
<del>
<del> (X_train, y_train), (X_test, y_test) = get_data_callbacks()
<del> y_test = np_utils.to_categorical(y_test)
<del> y_train = np_utils.to_categorical(y_train)
<del>
<del> inp = layers.Input((input_dim,))
<del> hidden = layers.Dense(num_hidden, activation='relu')(inp)
<del> hidden = layers.Dropout(0.1)(hidden)
<del> output = layers.Dense(num_classes, activation='softmax')(hidden)
<del> model = Model(inputs=inp, outputs=output)
<del> model.compile(loss='categorical_crossentropy',
<del> optimizer='sgd',
<del> metrics=['accuracy'])
<del>
<del> # we must generate new callbacks for each test, as they aren't stateless
<del> def callbacks_factory(histogram_freq=0,
<del> embeddings_freq=0,
<del> write_images=False,
<del> write_grads=False):
<del> if embeddings_freq:
<del> embeddings_layer_names = ['dense_1']
<del> embeddings_data = X_test
<del> else:
<del> embeddings_layer_names = None
<del> embeddings_data = None
<del> return [callbacks.TensorBoard(log_dir=filepath,
<del> histogram_freq=histogram_freq,
<del> write_images=write_images,
<del> write_grads=write_grads,
<del> embeddings_freq=embeddings_freq,
<del> embeddings_layer_names=embeddings_layer_names,
<del> embeddings_data=embeddings_data)]
<del>
<del> # fit without validation data should raise ValueError if histogram_freq > 0
<del> with pytest.raises(ValueError) as raised_exception:
<del> model.fit(X_train, y_train, batch_size=batch_size,
<del> callbacks=callbacks_factory(histogram_freq=1), epochs=3)
<del> assert 'validation_data must be provided' in str(raised_exception.value)
<del>
<del> train_generator = data_generator(X_train, y_train, batch_size)
<del> validation_generator = data_generator(X_test, y_test, batch_size)
<del>
<del> # fit generator without validation data should raise ValueError if
<del> # histogram_freq > 0
<del> with pytest.raises(ValueError) as raised_exception:
<del> model.fit_generator(train_generator,
<del> len(X_train), epochs=2,
<del> callbacks=callbacks_factory(histogram_freq=1))
<del> assert 'validation_data must be provided' in str(raised_exception.value)
<del>
<del> # fit generator with validation data generator should raise ValueError if
<del> # histogram_freq > 0
<del> with pytest.raises(ValueError) as raised_exception:
<del> model.fit_generator(train_generator, len(X_train), epochs=2,
<del> validation_data=validation_generator,
<del> validation_steps=1,
<del> callbacks=callbacks_factory(histogram_freq=1))
<del> assert 'validation_data must be provided' in str(raised_exception.value)
<del>
<del>
<ide> def test_TensorBoard_multi_input_output(tmpdir):
<ide> np.random.seed(np.random.randint(1, 1e7))
<ide> filepath = str(tmpdir / 'logs')
<ide><path>tests/keras/legacy/interface_test.py
<ide> def test_spatialdropout3d_legacy_interface():
<ide> assert json.dumps(old_layer.get_config()) == json.dumps(new_layer_2.get_config())
<ide>
<ide>
<del>def test_optimizer_get_updates_legacy_interface():
<del> for optimizer_cls in [keras.optimizers.RMSprop,
<del> keras.optimizers.SGD,
<del> keras.optimizers.Adadelta,
<del> keras.optimizers.Adam,
<del> keras.optimizers.Adagrad,
<del> keras.optimizers.Nadam,
<del> keras.optimizers.Adamax]:
<del> optimizer = optimizer_cls()
<del> param = keras.backend.variable(0.)
<del> loss = keras.backend.mean(param)
<del> constraints = {param: lambda x: x}
<del> params = [param]
<del> optimizer.get_updates(params, constraints, loss)
<del> optimizer.get_updates(params, constraints, loss=loss)
<del> optimizer.get_updates(loss, params)
<del> optimizer.get_updates(loss, params=params)
<del> optimizer.get_updates(loss=loss, params=params)
<del>
<del>
<ide> if __name__ == '__main__':
<ide> pytest.main([__file__])
| 3
|
Javascript
|
Javascript
|
remove internal `util._extends()` usage
|
d4c91f28148af8a6c1a95392e5c88cb93d4b61c6
|
<ide><path>lib/_http_agent.js
<ide> function Agent(options) {
<ide> this.defaultPort = 80;
<ide> this.protocol = 'http:';
<ide>
<del> this.options = util._extend({}, options);
<add> this.options = { ...options };
<ide>
<ide> // Don't confuse net and make it think that we're connecting to a pipe
<ide> this.options.path = null;
<ide> Agent.prototype.addRequest = function addRequest(req, options, port/* legacy */,
<ide> };
<ide> }
<ide>
<del> options = util._extend({}, options);
<del> util._extend(options, this.options);
<add> options = { ...options, ...this.options };
<ide> if (options.socketPath)
<ide> options.path = options.socketPath;
<ide>
<ide> Agent.prototype.addRequest = function addRequest(req, options, port/* legacy */,
<ide> };
<ide>
<ide> Agent.prototype.createSocket = function createSocket(req, options, cb) {
<del> options = util._extend({}, options);
<del> util._extend(options, this.options);
<add> options = { ...options, ...this.options };
<ide> if (options.socketPath)
<ide> options.path = options.socketPath;
<ide>
<ide><path>lib/_http_client.js
<ide>
<ide> 'use strict';
<ide>
<del>const util = require('util');
<ide> const net = require('net');
<ide> const url = require('url');
<ide> const assert = require('assert').ok;
<ide> function ClientRequest(input, options, cb) {
<ide>
<ide> if (typeof options === 'function') {
<ide> cb = options;
<del> options = null;
<add> options = input || {};
<add> } else {
<add> options = Object.assign(input || {}, options);
<ide> }
<ide>
<del> options = util._extend(input || {}, options || {});
<del>
<ide> var agent = options.agent;
<ide> var defaultAgent = options._defaultAgent || Agent.globalAgent;
<ide> if (agent === false) {
<ide><path>lib/_http_server.js
<ide> function Server(options, requestListener) {
<ide> requestListener = options;
<ide> options = {};
<ide> } else if (options == null || typeof options === 'object') {
<del> options = util._extend({}, options);
<add> options = { ...options };
<ide> } else {
<ide> throw new ERR_INVALID_ARG_TYPE('options', 'object', options);
<ide> }
<ide><path>lib/_tls_wrap.js
<ide> function SNICallback(servername, callback) {
<ide> //
<ide> //
<ide> function normalizeConnectArgs(listArgs) {
<del> var args = net._normalizeArgs(listArgs);
<del> var options = args[0];
<del> var cb = args[1];
<add> const args = net._normalizeArgs(listArgs);
<add> const options = args[0];
<add> const cb = args[1];
<ide>
<ide> // If args[0] was options, then normalize dealt with it.
<ide> // If args[0] is port, or args[0], args[1] is host, port, we need to
<ide> // find the options and merge them in, normalize's options has only
<ide> // the host/port/path args that it knows about, not the tls options.
<ide> // This means that options.host overrides a host arg.
<ide> if (listArgs[1] !== null && typeof listArgs[1] === 'object') {
<del> util._extend(options, listArgs[1]);
<add> Object.assign(options, listArgs[1]);
<ide> } else if (listArgs[2] !== null && typeof listArgs[2] === 'object') {
<del> util._extend(options, listArgs[2]);
<add> Object.assign(options, listArgs[2]);
<ide> }
<ide>
<del> return (cb) ? [options, cb] : [options];
<add> return cb ? [options, cb] : [options];
<ide> }
<ide>
<ide> function onConnectSecure() {
<ide> exports.connect = function connect(...args) {
<ide> 'certificate verification.');
<ide> }
<ide>
<del> var defaults = {
<add> options = {
<ide> rejectUnauthorized: !allowUnauthorized,
<ide> ciphers: tls.DEFAULT_CIPHERS,
<ide> checkServerIdentity: tls.checkServerIdentity,
<del> minDHSize: 1024
<add> minDHSize: 1024,
<add> ...options
<ide> };
<ide>
<del> options = util._extend(defaults, options || {});
<ide> if (!options.keepAlive)
<ide> options.singleUse = true;
<ide>
<ide><path>lib/child_process.js
<ide> exports.fork = function fork(modulePath /* , args, options */) {
<ide> throw new ERR_INVALID_ARG_VALUE(`arguments[${pos}]`, arguments[pos]);
<ide> }
<ide>
<del> options = util._extend({}, arguments[pos++]);
<add> options = { ...arguments[pos++] };
<ide> }
<ide>
<ide> // Prepare arguments for fork:
<ide> Object.defineProperty(exports.exec, util.promisify.custom, {
<ide> });
<ide>
<ide> exports.execFile = function execFile(file /* , args, options, callback */) {
<del> var args = [];
<del> var callback;
<del> var options = {
<del> encoding: 'utf8',
<del> timeout: 0,
<del> maxBuffer: 200 * 1024,
<del> killSignal: 'SIGTERM',
<del> cwd: null,
<del> env: null,
<del> shell: false
<del> };
<add> let args = [];
<add> let callback;
<add> let options;
<ide>
<ide> // Parse the optional positional parameters.
<del> var pos = 1;
<add> let pos = 1;
<ide> if (pos < arguments.length && Array.isArray(arguments[pos])) {
<ide> args = arguments[pos++];
<ide> } else if (pos < arguments.length && arguments[pos] == null) {
<ide> pos++;
<ide> }
<ide>
<ide> if (pos < arguments.length && typeof arguments[pos] === 'object') {
<del> util._extend(options, arguments[pos++]);
<add> options = arguments[pos++];
<ide> } else if (pos < arguments.length && arguments[pos] == null) {
<ide> pos++;
<ide> }
<ide> exports.execFile = function execFile(file /* , args, options, callback */) {
<ide> throw new ERR_INVALID_ARG_VALUE('args', arguments[pos]);
<ide> }
<ide>
<add> options = {
<add> encoding: 'utf8',
<add> timeout: 0,
<add> maxBuffer: 200 * 1024,
<add> killSignal: 'SIGTERM',
<add> cwd: null,
<add> env: null,
<add> shell: false,
<add> ...options
<add> };
<add>
<ide> // Validate the timeout, if present.
<ide> validateTimeout(options.timeout);
<ide>
<ide> function spawnSync(/* file, args, options */) {
<ide> options.stdio = _validateStdio(options.stdio || 'pipe', true).stdio;
<ide>
<ide> if (options.input) {
<del> var stdin = options.stdio[0] = util._extend({}, options.stdio[0]);
<add> var stdin = options.stdio[0] = { ...options.stdio[0] };
<ide> stdin.input = options.input;
<ide> }
<ide>
<ide> // We may want to pass data in on any given fd, ensure it is a valid buffer
<ide> for (var i = 0; i < options.stdio.length; i++) {
<ide> var input = options.stdio[i] && options.stdio[i].input;
<ide> if (input != null) {
<del> var pipe = options.stdio[i] = util._extend({}, options.stdio[i]);
<add> var pipe = options.stdio[i] = { ...options.stdio[i] };
<ide> if (isArrayBufferView(input)) {
<ide> pipe.input = input;
<ide> } else if (typeof input === 'string') {
<ide><path>lib/domain.js
<ide> Domain.prototype.run = function(fn) {
<ide> function intercepted(_this, self, cb, fnargs) {
<ide> if (fnargs[0] && fnargs[0] instanceof Error) {
<ide> var er = fnargs[0];
<del> util._extend(er, {
<del> domainBound: cb,
<del> domainThrown: false,
<del> domain: self
<del> });
<add> er.domainBound = cb;
<add> er.domainThrown = false;
<add> er.domain = self;
<ide> self.emit('error', er);
<ide> return;
<ide> }
<ide><path>lib/fs.js
<ide> const {
<ide> O_SYMLINK
<ide> } = constants;
<ide>
<del>const { _extend } = require('util');
<ide> const pathModule = require('path');
<ide> const { isArrayBufferView } = require('internal/util/types');
<ide> const binding = internalBinding('fs');
<ide> function watchFile(filename, options, listener) {
<ide> filename = pathModule.resolve(filename);
<ide> let stat;
<ide>
<del> const defaults = {
<add> if (options === null || typeof options !== 'object') {
<add> listener = options;
<add> options = null;
<add> }
<add>
<add> options = {
<ide> // Poll interval in milliseconds. 5007 is what libev used to use. It's
<ide> // a little on the slow side but let's stick with it for now to keep
<ide> // behavioral changes to a minimum.
<ide> interval: 5007,
<del> persistent: true
<add> persistent: true,
<add> ...options
<ide> };
<ide>
<del> if (options !== null && typeof options === 'object') {
<del> options = _extend(defaults, options);
<del> } else {
<del> listener = options;
<del> options = defaults;
<del> }
<del>
<ide> if (typeof listener !== 'function') {
<ide> throw new ERR_INVALID_ARG_TYPE('listener', 'Function', listener);
<ide> }
<ide><path>lib/https.js
<ide> function Server(opts, requestListener) {
<ide> requestListener = opts;
<ide> opts = undefined;
<ide> }
<del> opts = util._extend({}, opts);
<add> opts = { ...opts };
<ide>
<ide> if (!opts.ALPNProtocols) {
<ide> // http/1.0 is not defined as Protocol IDs in IANA
<ide> function createConnection(port, host, options) {
<ide> const session = this._getSession(options._agentKey);
<ide> if (session) {
<ide> debug('reuse session for %j', options._agentKey);
<del> options = util._extend({
<del> session: session
<del> }, options);
<add> options = {
<add> session,
<add> ...options
<add> };
<ide> }
<ide> }
<ide>
<ide> function request(...args) {
<ide> }
<ide>
<ide> if (args[0] && typeof args[0] !== 'function') {
<del> options = util._extend(options, args.shift());
<add> Object.assign(options, args.shift());
<ide> }
<ide>
<ide> options._defaultAgent = globalAgent;
<ide><path>lib/internal/cluster/child.js
<ide> 'use strict';
<ide> const assert = require('assert');
<del>const util = require('util');
<ide> const path = require('path');
<ide> const EventEmitter = require('events');
<ide> const { owner_symbol } = require('internal/async_hooks').symbols;
<ide> cluster._getServer = function(obj, options, cb) {
<ide>
<ide> indexes.set(indexesKey, index);
<ide>
<del> const message = util._extend({
<add> const message = {
<ide> act: 'queryServer',
<ide> index,
<del> data: null
<del> }, options);
<add> data: null,
<add> ...options
<add> };
<ide>
<ide> message.address = address;
<ide>
<ide> function rr(message, indexesKey, cb) {
<ide>
<ide> function getsockname(out) {
<ide> if (key)
<del> util._extend(out, message.sockname);
<add> Object.assign(out, message.sockname);
<ide>
<ide> return 0;
<ide> }
<ide><path>lib/internal/cluster/master.js
<ide> 'use strict';
<ide> const assert = require('assert');
<ide> const { fork } = require('child_process');
<del>const util = require('util');
<ide> const path = require('path');
<ide> const EventEmitter = require('events');
<ide> const RoundRobinHandle = require('internal/cluster/round_robin_handle');
<ide> if (schedulingPolicy === undefined) {
<ide> cluster.schedulingPolicy = schedulingPolicy;
<ide>
<ide> cluster.setupMaster = function(options) {
<del> var settings = {
<add> const settings = {
<ide> args: process.argv.slice(2),
<ide> exec: process.argv[1],
<ide> execArgv: process.execArgv,
<del> silent: false
<add> silent: false,
<add> ...cluster.settings,
<add> ...options
<ide> };
<del> util._extend(settings, cluster.settings);
<del> util._extend(settings, options || {});
<ide>
<ide> // Tell V8 to write profile data for each process to a separate file.
<ide> // Without --logfile=v8-%p.log, everything ends up in a single, unusable
<ide> function setupSettingsNT(settings) {
<ide> }
<ide>
<ide> function createWorkerProcess(id, env) {
<del> const workerEnv = util._extend({}, process.env);
<add> const workerEnv = { ...process.env, ...env, NODE_UNIQUE_ID: `${id}` };
<ide> const execArgv = cluster.settings.execArgv.slice();
<ide> const debugArgRegex = /--inspect(?:-brk|-port)?|--debug-port/;
<ide> const nodeOptions = process.env.NODE_OPTIONS ?
<ide> process.env.NODE_OPTIONS : '';
<ide>
<del> util._extend(workerEnv, env);
<del> workerEnv.NODE_UNIQUE_ID = '' + id;
<del>
<ide> if (execArgv.some((arg) => arg.match(debugArgRegex)) ||
<ide> nodeOptions.match(debugArgRegex)) {
<ide> let inspectPort;
<ide> function queryServer(worker, message) {
<ide>
<ide> // Set custom server data
<ide> handle.add(worker, (errno, reply, handle) => {
<del> reply = util._extend({
<del> errno: errno,
<del> key: key,
<del> ack: message.seq,
<del> data: handles.get(key).data
<del> }, reply);
<add> const { data } = handles.get(key);
<ide>
<ide> if (errno)
<ide> handles.delete(key); // Gives other workers a chance to retry.
<ide>
<del> send(worker, reply, handle);
<add> send(worker, {
<add> errno,
<add> key,
<add> ack: message.seq,
<add> data,
<add> ...reply
<add> }, handle);
<ide> });
<ide> }
<ide>
<ide><path>lib/internal/cluster/utils.js
<ide> 'use strict';
<del>const util = require('util');
<ide>
<ide> module.exports = {
<ide> sendHelper,
<ide> function sendHelper(proc, message, handle, cb) {
<ide> return false;
<ide>
<ide> // Mark message as internal. See INTERNAL_PREFIX in lib/child_process.js
<del> message = util._extend({ cmd: 'NODE_CLUSTER' }, message);
<add> message = { cmd: 'NODE_CLUSTER', ...message, seq };
<ide>
<ide> if (typeof cb === 'function')
<ide> callbacks.set(seq, cb);
<ide>
<del> message.seq = seq;
<ide> seq += 1;
<ide> return proc.send(message, handle);
<ide> }
<ide><path>lib/internal/fs/utils.js
<ide> function getOptions(options, defaultOptions) {
<ide> }
<ide>
<ide> if (typeof options === 'string') {
<del> defaultOptions = util._extend({}, defaultOptions);
<add> defaultOptions = { ...defaultOptions };
<ide> defaultOptions.encoding = options;
<ide> options = defaultOptions;
<ide> } else if (typeof options !== 'object') {
<ide><path>lib/internal/repl.js
<ide> function createRepl(env, opts, cb) {
<ide> cb = opts;
<ide> opts = null;
<ide> }
<del> opts = util._extend({
<add> opts = {
<ide> ignoreUndefined: false,
<ide> terminal: process.stdout.isTTY,
<ide> useGlobal: true,
<del> breakEvalOnSigint: true
<del> }, opts);
<add> breakEvalOnSigint: true,
<add> ...opts
<add> };
<ide>
<ide> if (parseInt(env.NODE_NO_READLINE)) {
<ide> opts.terminal = false;
<ide><path>lib/internal/url.js
<ide> class URLSearchParams {
<ide> return ctx.stylize('[Object]', 'special');
<ide>
<ide> var separator = ', ';
<del> var innerOpts = util._extend({}, ctx);
<add> var innerOpts = { ...ctx };
<ide> if (recurseTimes !== null) {
<ide> innerOpts.depth = recurseTimes - 1;
<ide> }
<ide> Object.defineProperties(URL.prototype, {
<ide> value: function format(options) {
<ide> if (options && typeof options !== 'object')
<ide> throw new ERR_INVALID_ARG_TYPE('options', 'Object', options);
<del> options = util._extend({
<add> options = {
<ide> fragment: true,
<ide> unicode: false,
<ide> search: true,
<del> auth: true
<del> }, options);
<add> auth: true,
<add> ...options
<add> };
<ide> const ctx = this[context];
<ide> var ret = ctx.scheme;
<ide> if (ctx.host !== null) {
<ide> defineIDLClass(URLSearchParamsIteratorPrototype, 'URLSearchParams Iterator', {
<ide> if (typeof recurseTimes === 'number' && recurseTimes < 0)
<ide> return ctx.stylize('[Object]', 'special');
<ide>
<del> const innerOpts = util._extend({}, ctx);
<add> const innerOpts = { ...ctx };
<ide> if (recurseTimes !== null) {
<ide> innerOpts.depth = recurseTimes - 1;
<ide> }
<ide><path>lib/net.js
<ide> function Socket(options) {
<ide> if (typeof options === 'number')
<ide> options = { fd: options }; // Legacy interface.
<ide> else
<del> options = util._extend({}, options);
<add> options = { ...options };
<ide>
<ide> options.readable = options.readable || false;
<ide> options.writable = options.writable || false;
<ide><path>lib/tty.js
<ide>
<ide> 'use strict';
<ide>
<del>const { inherits, _extend } = require('util');
<add>const { inherits } = require('util');
<ide> const net = require('net');
<ide> const { TTY, isTTY } = internalBinding('tty_wrap');
<ide> const errors = require('internal/errors');
<ide> function ReadStream(fd, options) {
<ide> throw new ERR_TTY_INIT_FAILED(ctx);
<ide> }
<ide>
<del> options = _extend({
<add> net.Socket.call(this, {
<ide> highWaterMark: 0,
<ide> readable: true,
<ide> writable: false,
<del> handle: tty
<del> }, options);
<del>
<del> net.Socket.call(this, options);
<add> handle: tty,
<add> ...options
<add> });
<ide>
<ide> this.isRaw = false;
<ide> this.isTTY = true;
<ide><path>lib/zlib.js
<ide> const {
<ide> const Transform = require('_stream_transform');
<ide> const {
<ide> deprecate,
<del> _extend,
<ide> types: {
<ide> isAnyArrayBuffer,
<ide> isArrayBufferView
<ide> function ZlibBase(opts, mode, handle, { flush, finishFlush, fullFlush }) {
<ide> Z_NO_FLUSH, Z_BLOCK, finishFlush);
<ide>
<ide> if (opts.encoding || opts.objectMode || opts.writableObjectMode) {
<del> opts = _extend({}, opts);
<add> opts = { ...opts };
<ide> opts.encoding = null;
<ide> opts.objectMode = false;
<ide> opts.writableObjectMode = false;
| 17
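For context on the record above: `util._extend(target, source)` copies the own enumerable properties of `source` onto `target` and returns `target`, so the common clone-then-merge pair maps directly onto object spread, where properties later in the literal win. A minimal standalone sketch of the equivalence (this is not Node internals, just the pattern being swapped):

```js
'use strict';
const util = require('util');

const defaults = { port: 80, keepAlive: false };
const overrides = { keepAlive: true };

// Legacy pattern: clone, then merge (later writes win).
const a = util._extend({}, defaults);
util._extend(a, overrides);

// Modern equivalent: later spreads win.
const b = { ...defaults, ...overrides };

console.log(a); // { port: 80, keepAlive: true }
console.log(b); // { port: 80, keepAlive: true }
```

The one subtlety visible in the patch is precedence: `{ ...options, ...this.options }` lets `this.options` override the caller's `options`, which matches the old two-call form `util._extend(util._extend({}, options), this.options)`.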
|
Text
|
Text
|
improve pipe description
|
b481beecd803397940b2c4d0ed17d266ba5c1405
|
<ide><path>doc/api/child_process.md
<ide> pipes between the parent and child. The value is one of the following:
<ide> `child_process` object as [`subprocess.stdio[fd]`][`subprocess.stdio`]. Pipes
<ide> created for fds 0, 1, and 2 are also available as [`subprocess.stdin`][],
<ide> [`subprocess.stdout`][] and [`subprocess.stderr`][], respectively.
<add>   Currently, these are not actual Unix pipes, so the child process
<add>   cannot read from or write to them via file descriptor paths such as
<add>   `/dev/fd/2` or `/dev/stdout`.
<ide> 2. `'overlapped'`: Same as `'pipe'` except that the `FILE_FLAG_OVERLAPPED` flag
<ide> is set on the handle. This is necessary for overlapped I/O on the child
<ide> process's stdio handles. See the
| 1
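To make the documented behavior concrete, here is a minimal sketch of the `'pipe'` option the added caveat refers to (`ls` is just a convenient Unix command for the example). The pipe ends surface as streams on the parent's `ChildProcess` object; per the new note, the child cannot reopen them through paths such as `/dev/fd/1`.

```js
const { spawn } = require('child_process');

// stdio defaults to 'pipe' for all three fds; spelled out here for clarity.
const child = spawn('ls', ['-l'], { stdio: ['pipe', 'pipe', 'pipe'] });

child.stdout.on('data', (chunk) => process.stdout.write(chunk)); // subprocess.stdio[1]
child.stderr.on('data', (chunk) => process.stderr.write(chunk)); // subprocess.stdio[2]
child.on('close', (code) => console.log(`exited with ${code}`));
```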
|
Ruby
|
Ruby
|
move extract_scale to decimal type
|
e45e4f44e35d6ce9868542cc2a151b2a6c497e9b
|
<ide><path>activerecord/lib/active_record/connection_adapters/type.rb
<ide> module Type # :nodoc:
<ide> class << self
<ide> def extract_scale(sql_type)
<ide> case sql_type
<del> when /^(numeric|decimal|number)\((\d+)\)/i then 0
<del> when /^(numeric|decimal|number)\((\d+)(,(\d+))\)/i then $4.to_i
<add> when /\((\d+)\)/ then 0
<add> when /\((\d+)(,(\d+))\)/ then $3.to_i
<ide> end
<ide> end
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/type/decimal.rb
<ide> module Type
<ide> class Decimal < Value # :nodoc:
<ide> include Numeric
<ide>
<add> delegate :extract_scale, to: Type
<add>
<ide> def type
<ide> :decimal
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/type/value.rb
<ide> module ConnectionAdapters
<ide> module Type
<ide> class Value # :nodoc:
<ide> def type; end
<del>
<del> def extract_scale(sql_type)
<del> Type.extract_scale(sql_type)
<del> end
<add> def extract_scale(sql_type); end
<ide>
<ide> def type_cast(value)
<ide> cast_value(value) unless value.nil?
<ide><path>activerecord/test/cases/schema_dumper_test.rb
<ide> def test_schema_dump_keeps_large_precision_integer_columns_as_decimal
<ide> output = standard_dump
<ide> # Oracle supports precision up to 38 and it identifies decimals with scale 0 as integers
<ide> if current_adapter?(:OracleAdapter)
<del> assert_match %r{t.integer\s+"atoms_in_universe",\s+precision: 38,\s+scale: 0}, output
<add> assert_match %r{t.integer\s+"atoms_in_universe",\s+precision: 38}, output
<ide> else
<del> assert_match %r{t.decimal\s+"atoms_in_universe",\s+precision: 55,\s+scale: 0}, output
<add> assert_match %r{t.decimal\s+"atoms_in_universe",\s+precision: 55}, output
<ide> end
<ide> end
<ide>
| 4
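The simplified regexes in the record above read a scale out of a SQL type string; the ordering in the Ruby `case` works because `/\((\d+)\)/` cannot match `(10,2)` (the closing-paren check fails at the comma). Sketched in JavaScript purely for illustration (this is not the ActiveRecord code, and the check order is flipped so the two-number form is tried first):

```js
// Returns the scale for types like "decimal(10,2)" or "numeric(10)".
function extractScale(sqlType) {
  const withScale = /\((\d+),(\d+)\)/.exec(sqlType);
  if (withScale) return Number(withScale[2]); // "(precision,scale)"
  if (/\((\d+)\)/.test(sqlType)) return 0;    // "(precision)" implies scale 0
  return undefined;                            // no parenthesized size at all
}

console.log(extractScale('decimal(10,2)')); // 2
console.log(extractScale('numeric(10)'));   // 0
console.log(extractScale('varchar'));       // undefined
```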
|
Ruby
|
Ruby
|
update routes.rb template to use app name
|
426348b48403f664cc10e8ec545b640e56c1c090
|
<ide><path>railties/lib/rails/application.rb
<ide> def load_tasks
<ide> end
<ide> end
<ide>
<add> def routes
<add> ActionController::Routing::Routes
<add> end
<add>
<ide> def call(env)
<ide> new.call(env)
<ide> end
<ide><path>railties/lib/rails/generators/actions.rb
<ide> def freeze!(args={})
<ide> #
<ide> def route(routing_code)
<ide> log :route, routing_code
<del> sentinel = "ActionController::Routing::Routes.draw do |map|"
<add> sentinel = "routes.draw do |map|"
<ide>
<ide> in_root do
<ide> inject_into_file 'config/routes.rb', "\n #{routing_code}\n", { :after => sentinel, :verbose => false }
<ide><path>railties/lib/rails/generators/rails/app/app_generator.rb
<ide> def create_config_files
<ide> empty_directory "config"
<ide>
<ide> inside "config" do
<del> copy_file "routes.rb"
<del> template "application.rb"
<del> template "environment.rb"
<add> template "routes.rb"
<add> template "application.rb"
<add> template "environment.rb"
<ide>
<ide> directory "environments"
<ide> directory "initializers"
<ide><path>railties/lib/rails/generators/rails/app/templates/config/routes.rb
<del>ActionController::Routing::Routes.draw do |map|
<add><%= app_const %>.routes.draw do |map|
<ide> # The priority is based upon order of creation:
<ide> # first created -> highest priority.
<ide>
<ide><path>railties/test/application/initializer_test.rb
<ide> def setup
<ide> Rails::Initializer.run do |config|
<ide> config.root = app_path
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide>
<ide> Rails.initialize!
<ide> assert $:.include?("#{app_path}/app/models")
<ide> module Zoo::ReptileHouse ; end
<ide> config.root = app_path
<ide> config.eager_load_paths = "#{app_path}/lib"
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide>
<ide> Rails.initialize!
<ide>
<ide> module Zoo::ReptileHouse ; end
<ide> app_file "config/environments/development.rb", "$initialize_test_set_from_env = 'success'"
<ide> assert_nil $initialize_test_set_from_env
<ide> Rails::Initializer.run { |config| config.root = app_path }
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide> assert_equal "success", $initialize_test_set_from_env
<ide> end
<ide> module Zoo::ReptileHouse ; end
<ide> config.after_initialize { $test_after_initialize_block1 = "success" }
<ide> config.after_initialize { $test_after_initialize_block2 = "congratulations" }
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> assert_equal "success", $test_after_initialize_block1
<ide> module Zoo::ReptileHouse ; end
<ide> config.after_initialize # don't pass a block, this is what we're testing!
<ide> config.after_initialize { $test_after_initialize_block2 = "congratulations" }
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> assert_equal "success", $test_after_initialize_block1
<ide> module Zoo::ReptileHouse ; end
<ide> config.root = app_path
<ide> config.i18n.default_locale = :de
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> assert_equal :de, I18n.default_locale
<ide> module Zoo::ReptileHouse ; end
<ide> config.root = app_path
<ide> config.action_controller.session_store = :cookie_store
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> assert !Rails.application.config.middleware.include?(ActiveRecord::SessionStore)
<ide> module Zoo::ReptileHouse ; end
<ide> c.root = app_path
<ide> c.action_controller.session_store = :active_record_store
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> expects = [ActiveRecord::ConnectionAdapters::ConnectionManagement, ActiveRecord::QueryCache, ActiveRecord::SessionStore]
<ide> module Zoo::ReptileHouse ; end
<ide> c.root = app_path
<ide> c.frameworks -= [:action_view]
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> assert_equal nil, ActionMailer::Base.template_root
<ide> module Zoo::ReptileHouse ; end
<ide> Rails::Initializer.run do |c|
<ide> c.root = app_path
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide> assert_instance_of Pathname, Rails.root
<ide> end
<ide><path>railties/test/application/routing_test.rb
<ide> def index
<ide> RUBY
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match ':controller(/:action)'
<ide> end
<ide> RUBY
<ide> def index
<ide> RUBY
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match ':controller(/:action)'
<ide> end
<ide> RUBY
<ide> def index
<ide> RUBY
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match ':controller(/:action)'
<ide> end
<ide> RUBY
<ide> def index
<ide> RUBY
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match 'foo', :to => 'foo#index'
<ide> end
<ide> RUBY
<ide> def index
<ide> RUBY
<ide>
<ide> plugin.write 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match 'bar', :to => 'bar#index'
<ide> end
<ide> RUBY
<ide> def baz
<ide> RUBY
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match 'foo', :to => 'foo#bar'
<ide> end
<ide> RUBY
<ide> def baz
<ide> assert_equal 'bar', last_response.body
<ide>
<ide> app_file 'config/routes.rb', <<-RUBY
<del> ActionController::Routing::Routes.draw do |map|
<add> AppTemplate.routes.draw do |map|
<ide> match 'foo', :to => 'foo#baz'
<ide> end
<ide> RUBY
<ide><path>railties/test/initializer/initialize_i18n_test.rb
<ide> def setup
<ide> c.root = app_path
<ide> c.i18n.load_path << "my/other/locale.yml"
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide>
<ide> #{RAILS_FRAMEWORK_ROOT}/railties/test/fixtures/plugins/engines/engine/config/locales/en.yml
<ide><path>railties/test/initializer/path_test.rb
<ide> def setup
<ide> ActionController::Base.session_store = nil
<ide> end
<ide> end
<add> Object.const_set(:AppTemplate, Rails.application)
<ide> Rails.initialize!
<ide> @paths = Rails.application.config.paths
<ide> end
<ide><path>railties/test/paths_test.rb
<ide> require 'rails/paths'
<ide>
<ide> class PathsTest < ActiveSupport::TestCase
<del>
<ide> def setup
<ide> @root = Rails::Application::Root.new("/foo/bar")
<ide> end
<ide> def setup
<ide> @root.app.eager_load!
<ide> assert_equal ["/foo/bar/app"], @root.load_paths
<ide> end
<del>end
<ide>\ No newline at end of file
<add>end
| 9
|
Python
|
Python
|
fix success message [ci skip]
|
ff4267d1812d853c79b7e4e937dc29e0e849155c
|
<ide><path>spacy/about.py
<ide> # fmt: off
<ide> __title__ = "spacy-nightly"
<del>__version__ = "3.0.0rc0"
<add>__version__ = "3.0.0rc1"
<ide> __download_url__ = "https://github.com/explosion/spacy-models/releases/download"
<ide> __compatibility__ = "https://raw.githubusercontent.com/explosion/spacy-models/master/compatibility.json"
<ide> __projects__ = "https://github.com/explosion/projects"
<ide><path>spacy/training/loop.py
<ide> def train(
<ide> nlp.to_disk(final_model_path)
<ide> else:
<ide> nlp.to_disk(final_model_path)
<del> # This will only run if we don't hit an error
<del> stdout.write(
<del> msg.good("Saved pipeline to output directory", final_model_path) + "\n"
<del> )
<add> # This will only run if we don't hit an error
<add> stdout.write(
<add> msg.good("Saved pipeline to output directory", final_model_path) + "\n"
<add> )
<ide>
<ide>
<ide> def train_while_improving(
| 2
|
Javascript
|
Javascript
|
increase timeout and add a label
|
db464d3bccbae21a594fe3c9b378bd90cf50bfc7
|
<ide><path>test/jqLiteSpec.js
<ide> describe('jqLite', function() {
<ide>
<ide> // This test is potentially flaky on CI cloud instances, so there is a generous
<ide> // wait period...
<del> waitsFor(function() { return tested; }, 2000);
<add> waitsFor(function() { return tested; }, 'iframe to load', 5000);
<ide> });
<ide> }
<ide> });
| 1
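For reference, the three-argument form used above is the Jasmine 1.x signature `waitsFor(latchFunction, failureMessage, timeoutMs)`; the added label makes a timeout report mention "iframe to load" instead of the generic "something to happen". An illustrative fragment of the pattern (the `iframe` variable here is a placeholder, not part of the spec):

```js
var loaded = false;

runs(function() {
  iframe.onload = function() { loaded = true; }; // placeholder async trigger
});

// Poll the latch until it returns true, or fail with the labelled message
// after 5000 ms.
waitsFor(function() { return loaded; }, 'iframe to load', 5000);
```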
|
PHP
|
PHP
|
replace deprecated at() matcher in tests
|
4822b4fc6a0c1ff429f29c5b8f9e575225e79008
|
<ide><path>tests/Database/DatabaseConnectionTest.php
<ide> public function testTransactionLevelNotIncrementedOnTransactionException()
<ide> public function testBeginTransactionMethodRetriesOnFailure()
<ide> {
<ide> $pdo = $this->createMock(DatabaseConnectionTestMockPDO::class);
<del> $pdo->expects($this->at(0))
<del> ->method('beginTransaction')
<del> ->will($this->throwException(new ErrorException('server has gone away')));
<add> $pdo->method('beginTransaction')
<add> ->willReturnOnConsecutiveCalls($this->throwException(new ErrorException('server has gone away')));
<ide> $connection = $this->getMockConnection(['reconnect'], $pdo);
<ide> $connection->expects($this->once())->method('reconnect');
<ide> $connection->beginTransaction();
<ide><path>tests/Validation/ValidationValidatorTest.php
<ide> public function testValidateMax()
<ide> $this->assertFalse($v->passes());
<ide>
<ide> $file = $this->getMockBuilder(UploadedFile::class)->setMethods(['isValid', 'getSize'])->setConstructorArgs([__FILE__, basename(__FILE__)])->getMock();
<del> $file->expects($this->any())->method('isValid')->willReturn(true);
<del> $file->expects($this->at(1))->method('getSize')->willReturn(3072);
<add> $file->method('isValid')->willReturn(true);
<add> $file->method('getSize')->willReturn(3072);
<ide> $v = new Validator($trans, ['photo' => $file], ['photo' => 'Max:10']);
<ide> $this->assertTrue($v->passes());
<ide>
<ide> $file = $this->getMockBuilder(UploadedFile::class)->setMethods(['isValid', 'getSize'])->setConstructorArgs([__FILE__, basename(__FILE__)])->getMock();
<del> $file->expects($this->at(0))->method('isValid')->willReturn(true);
<del> $file->expects($this->at(1))->method('getSize')->willReturn(4072);
<add> $file->method('isValid')->willReturn(true);
<add> $file->method('getSize')->willReturn(4072);
<ide> $v = new Validator($trans, ['photo' => $file], ['photo' => 'Max:2']);
<ide> $this->assertFalse($v->passes());
<ide>
| 2
|
Javascript
|
Javascript
|
remove unused err argument
|
0229e6746cb8d99928df848b96875b46c1f00b31
|
<ide><path>lib/repl.js
<ide> function complete(line, callback) {
<ide>
<ide> // Will be called when all completionGroups are in place
<ide> // Useful for async autocompletion
<del> function completionGroupsLoaded(err) {
<del> if (err) throw err;
<del>
<add> function completionGroupsLoaded() {
<ide> // Filter, sort (within each group), uniq and merge the completion groups.
<ide> if (completionGroups.length && filter) {
<ide> var newCompletionGroups = [];
| 1
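The pattern in the record above is worth spelling out: once every caller of a continuation invokes it with no arguments, the conventional error-first parameter and its rethrow are dead code, so both can be dropped. An illustrative sketch under that assumption (not the actual repl internals):

```js
// The loader is synchronous and cannot fail, so `done` takes no arguments.
function loadCompletionGroups(groups, done) {
  groups.push(['foo', 'bar']);
  done();
}

loadCompletionGroups([], function completionGroupsLoaded() {
  console.log('groups ready'); // no `err` to check or rethrow
});
```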
|
Java
|
Java
|
fix typo in event class
|
fbb802ca63f2716d742f75fdddd9020c8efe6d92
|
<ide><path>ReactAndroid/src/main/java/com/facebook/react/uimanager/events/Event.java
<ide> public void dispatch(RCTEventEmitter rctEventEmitter) {
<ide> WritableMap eventData = getEventData();
<ide> if (eventData == null) {
<ide> throw new IllegalViewOperationException(
<del> "Event: you must return a valid, non-null value from `getEventData`, or override `dispatch` and `disatchModern`. Event: "
<add> "Event: you must return a valid, non-null value from `getEventData`, or override `dispatch` and `dispatchModern`. Event: "
<ide> + getEventName());
<ide> }
<ide> rctEventEmitter.receiveEvent(getViewTag(), getEventName(), eventData);
| 1
|
PHP
|
PHP
|
add fcn in the docblock
|
43c9990d3842e21068153af45d068c5a77f86c6b
|
<ide><path>src/Auth/BaseAuthorize.php
<ide> abstract class BaseAuthorize
<ide> /**
<ide> * Constructor
<ide> *
<del> * @param ComponentRegistry $registry The controller for this request.
<add> * @param \Cake\Controller\ComponentRegistry $registry The controller for this request.
<ide> * @param array $config An array of config. This class does not use any config.
<ide> */
<ide> public function __construct(ComponentRegistry $registry, array $config = [])
<ide><path>src/Auth/ControllerAuthorize.php
<ide> public function __construct(ComponentRegistry $registry, array $config = [])
<ide> * Get/set the controller this authorize object will be working with. Also
<ide> * checks that isAuthorized is implemented.
<ide> *
<del> * @param Controller|null $controller null to get, a controller to set.
<add> * @param \Cake\Controller\Controller|null $controller null to get, a controller to set.
<ide> * @return \Cake\Controller\Controller
<ide> * @throws \Cake\Core\Exception\Exception If controller does not have method `isAuthorized()`.
<ide> */
<ide><path>src/Collection/Iterator/FilterIterator.php
<ide> class FilterIterator extends Collection
<ide> * in the current iteration, the key of the element and the passed $items iterator
<ide> * as arguments, in that order.
<ide> *
<del> * @param Iterator $items The items to be filtered.
<add> * @param \Iterator $items The items to be filtered.
<ide> * @param callable $callback Callback.
<ide> */
<ide> public function __construct(Iterator $items, callable $callback)
<ide><path>src/Console/ConsoleOptionParser.php
<ide> public function epilog($text = null)
<ide> * - `choices` A list of valid choices for this option. If left empty all values are valid..
<ide> * An exception will be raised when parse() encounters an invalid value.
<ide> *
<del> * @param ConsoleInputOption|string $name The long name you want to the value to be parsed out as when options are parsed.
<add> * @param \Cake\Console\ConsoleInputOption|string $name The long name you want to the value to be parsed out as when options are parsed.
<ide> * Will also accept an instance of ConsoleInputOption
<ide> * @param array $options An array of parameters that define the behavior of the option
<ide> * @return $this
<ide> public function addOptions(array $options)
<ide> * specific option parsers. When help is generated for a subcommand, if a parser is present
<ide> * it will be used.
<ide> *
<del> * @param ConsoleInputSubcommand|string $name Name of the subcommand. Will also accept an instance of ConsoleInputSubcommand
<add> * @param \Cake\Console\ConsoleInputSubcommand|string $name Name of the subcommand. Will also accept an instance of ConsoleInputSubcommand
<ide> * @param array $options Array of params, see above.
<ide> * @return $this
<ide> */
<ide><path>src/Console/TaskRegistry.php
<ide> class TaskRegistry extends ObjectRegistry
<ide> /**
<ide> * Constructor
<ide> *
<del> * @param Shell $Shell Shell instance
<add> * @param \Cake\Console\Shell $Shell Shell instance
<ide> */
<ide> public function __construct(Shell $Shell)
<ide> {
<ide><path>src/Controller/Component.php
<ide> class Component implements EventListenerInterface
<ide> /**
<ide> * Constructor
<ide> *
<del> * @param ComponentRegistry $registry A ComponentRegistry this component can use to lazy load its components
<add> * @param \Cake\Controller\ComponentRegistry $registry A ComponentRegistry this component can use to lazy load its components
<ide> * @param array $config Array of configuration settings.
<ide> */
<ide> public function __construct(ComponentRegistry $registry, array $config = [])
<ide><path>src/Controller/Component/FlashComponent.php
<ide> class FlashComponent extends Component
<ide> /**
<ide> * Constructor
<ide> *
<del> * @param ComponentRegistry $registry A ComponentRegistry for this component
<add> * @param \Cake\Controller\ComponentRegistry $registry A ComponentRegistry for this component
<ide> * @param array $config Array of config.
<ide> */
<ide> public function __construct(ComponentRegistry $registry, array $config = [])
<ide><path>src/Controller/Component/RequestHandlerComponent.php
<ide> class RequestHandlerComponent extends Component
<ide> /**
<ide> * Constructor. Parses the accepted content types accepted by the client using HTTP_ACCEPT
<ide> *
<del> * @param ComponentRegistry $registry ComponentRegistry object.
<add> * @param \Cake\Controller\ComponentRegistry $registry ComponentRegistry object.
<ide> * @param array $config Array of config.
<ide> */
<ide> public function __construct(ComponentRegistry $registry, array $config = [])
<ide> protected function _setExtension($request, $response)
<ide> * If the XML data is POSTed, the data is parsed into an XML object, which is assigned
<ide> * to the $data property of the controller, which can then be saved to a model object.
<ide> *
<del> * @param Event $event The startup event that was fired.
<add> * @param \Cake\Event\Event $event The startup event that was fired.
<ide> * @return void
<ide> */
<ide> public function startup(Event $event)
<ide> public function convertXml($xml)
<ide> /**
<ide> * Handles (fakes) redirects for AJAX requests using requestAction()
<ide> *
<del> * @param Event $event The Controller.beforeRedirect event.
<add> * @param \Cake\Event\Event $event The Controller.beforeRedirect event.
<ide> * @param string|array $url A string or array containing the redirect location
<ide> * @param \Cake\Network\Response $response The response object.
<ide> * @return \Cake\Network\Response|null The response object if the redirect is caught.
<ide> public function beforeRedirect(Event $event, $url, Response $response)
<ide> * - If the extension is of a type that RequestHandler understands, it will
<ide> * set that Content-type in the response header.
<ide> *
<del> * @param Event $event The Controller.beforeRender event.
<add> * @param \Cake\Event\Event $event The Controller.beforeRender event.
<ide> * @return bool false if the render process should be aborted
<ide> */
<ide> public function beforeRender(Event $event)
<ide> public function prefers($type = null)
<ide> * $this->RequestHandler->renderAs($this, 'xml', ['attachment' => 'myfile.xml'];
<ide> * ```
<ide> *
<del> * @param Controller $controller A reference to a controller object
<add> * @param \Cake\Controller\Controller $controller A reference to a controller object
<ide> * @param string $type Type of response to send (e.g: 'ajax')
<ide> * @param array $options Array of options to use
<ide> * @return void
<ide><path>src/Controller/Component/SecurityComponent.php
<ide> class SecurityComponent extends Component
<ide> /**
<ide> * Component startup. All security checking happens here.
<ide> *
<del> * @param Event $event An Event instance
<add> * @param \Cake\Event\Event $event An Event instance
<ide> * @return mixed
<ide> */
<ide> public function startup(Event $event)
<ide> public function requireAuth($actions)
<ide> * Black-hole an invalid request with a 400 error or custom callback. If SecurityComponent::$blackHoleCallback
<ide> * is specified, it will use this callback by executing the method indicated in $error
<ide> *
<del> * @param Controller $controller Instantiating controller
<add> * @param \Cake\Controller\Controller $controller Instantiating controller
<ide> * @param string $error Error method
<ide> * @return mixed If specified, controller blackHoleCallback's response, or no return otherwise
<ide> * @see SecurityComponent::$blackHoleCallback
<ide> protected function _requireMethod($method, $actions = [])
<ide> /**
<ide> * Check if access requires secure connection
<ide> *
<del> * @param Controller $controller Instantiating controller
<add> * @param \Cake\Controller\Controller $controller Instantiating controller
<ide> * @return bool true if secure connection required
<ide> */
<ide> protected function _secureRequired(Controller $controller)
<ide> protected function _secureRequired(Controller $controller)
<ide> /**
<ide> * Check if authentication is required
<ide> *
<del> * @param Controller $controller Instantiating controller
<add> * @param \Cake\Controller\Controller $controller Instantiating controller
<ide> * @return bool true if authentication required
<ide> */
<ide> protected function _authRequired(Controller $controller)
<ide> protected function _authRequired(Controller $controller)
<ide> /**
<ide> * Validate submitted form
<ide> *
<del> * @param Controller $controller Instantiating controller
<add> * @param \Cake\Controller\Controller $controller Instantiating controller
<ide> * @return bool true if submitted form is valid
<ide> */
<ide> protected function _validatePost(Controller $controller)
<ide> public function generateToken(Request $request)
<ide> /**
<ide> * Calls a controller callback method
<ide> *
<del> * @param Controller $controller Controller to run callback on
<add> * @param \Cake\Controller\Controller $controller Controller to run callback on
<ide> * @param string $method Method to execute
<ide> * @param array $params Parameters to send to method
<ide> * @return mixed Controller callback method's response
<ide><path>src/Controller/Controller.php
<ide> public function isAction($action)
<ide> * Called before the controller action. You can use this method to configure and customize components
<ide> * or perform logic that needs to happen before each controller action.
<ide> *
<del> * @param Event $event An Event instance
<add> * @param \Cake\Event\Event $event An Event instance
<ide> * @return \Cake\Network\Response|null
<ide> * @link http://book.cakephp.org/3.0/en/controllers.html#request-life-cycle-callbacks
<ide> */
<ide> public function beforeFilter(Event $event)
<ide> * Called after the controller action is run, but before the view is rendered. You can use this method
<ide> * to perform logic or set view variables that are required on every request.
<ide> *
<del> * @param Event $event An Event instance
<add> * @param \Cake\Event\Event $event An Event instance
<ide> * @return \Cake\Network\Response|null
<ide> * @link http://book.cakephp.org/3.0/en/controllers.html#request-life-cycle-callbacks
<ide> */
<ide> public function beforeRender(Event $event)
<ide> * You can set the event result to response instance or modify the redirect location
<ide> * using controller's response instance.
<ide> *
<del> * @param Event $event An Event instance
<add> * @param \Cake\Event\Event $event An Event instance
<ide> * @param string|array $url A string or array-based URL pointing to another location within the app,
<ide> * or an absolute URL
<ide> * @param \Cake\Network\Response $response The response object.
<ide> public function beforeRedirect(Event $event, $url, Response $response)
<ide> /**
<ide> * Called after the controller action is run and rendered.
<ide> *
<del> * @param Event $event An Event instance
<add> * @param \Cake\Event\Event $event An Event instance
<ide> * @return \Cake\Network\Response|null
<ide> * @link http://book.cakephp.org/3.0/en/controllers.html#request-life-cycle-callbacks
<ide> */
<ide><path>src/Core/Configure.php
<ide> public static function consume($var)
<ide> *
<ide> * @param string $name The name of the engine being configured. This alias is used later to
<ide> * read values from a specific engine.
<del> * @param ConfigEngineInterface $engine The engine to append.
<add> * @param \Cake\Core\Configure\ConfigEngineInterface $engine The engine to append.
<ide> * @return void
<ide> */
<ide> public static function config($name, ConfigEngineInterface $engine)
<ide><path>src/Database/Expression/CaseExpression.php
<ide> public function elseValue($value = null, $type = null)
<ide> * Compiles the relevant parts into sql
<ide> *
<ide> * @param array|string|\Cake\Database\ExpressionInterface $part The part to compile
<del> * @param ValueBinder $generator Sql generator
<add> * @param \Cake\Database\ValueBinder $generator Sql generator
<ide> *
<ide> * @return string
<ide> */
<ide><path>src/Database/Query.php
<ide> public function execute()
<ide> * values when the query is executed, hence it is most suitable to use with
<ide> * prepared statements.
<ide> *
<del> * @param ValueBinder $generator A placeholder object that will hold
<add> * @param \Cake\Database\ValueBinder $generator A placeholder object that will hold
<ide> * associated values for expressions
<ide> * @return string
<ide> */
<ide><path>src/Database/SqlDialectTrait.php
<ide> protected function _expressionTranslators()
<ide> /**
<ide> * Apply translation steps to select queries.
<ide> *
<del> * @param Query $query The query to translate
<del> * @return Query The modified query
<add> * @param \Cake\Database\Query $query The query to translate
<add> * @return \Cake\Database\Query The modified query
<ide> */
<ide> protected function _selectQueryTranslator($query)
<ide> {
<ide> protected function _selectQueryTranslator($query)
<ide> * Returns the passed query after rewriting the DISTINCT clause, so that drivers
<ide> * that do not support the "ON" part can provide the actual way it should be done
<ide> *
<del> * @param Query $query The query to be transformed
<del> * @return Query
<add> * @param \Cake\Database\Query $query The query to be transformed
<add> * @return \Cake\Database\Query
<ide> */
<ide> protected function _transformDistinct($query)
<ide> {
<ide> protected function _transformDistinct($query)
<ide> *
<ide> * We are intentionally not supporting deletes with joins as they have even poorer support.
<ide> *
<del> * @param Query $query The query to translate
<del> * @return Query The modified query
<add> * @param \Cake\Database\Query $query The query to translate
<add> * @return \Cake\Database\Query The modified query
<ide> */
<ide> protected function _deleteQueryTranslator($query)
<ide> {
<ide> protected function _deleteQueryTranslator($query)
<ide> /**
<ide> * Apply translation steps to update queries.
<ide> *
<del> * @param Query $query The query to translate
<del> * @return Query The modified query
<add> * @param \Cake\Database\Query $query The query to translate
<add> * @return \Cake\Database\Query The modified query
<ide> */
<ide> protected function _updateQueryTranslator($query)
<ide> {
<ide> protected function _updateQueryTranslator($query)
<ide> /**
<ide> * Apply translation steps to insert queries.
<ide> *
<del> * @param Query $query The query to translate
<del> * @return Query The modified query
<add> * @param \Cake\Database\Query $query The query to translate
<add> * @return \Cake\Database\Query The modified query
<ide> */
<ide> protected function _insertQueryTranslator($query)
<ide> {
<ide><path>src/Database/Type.php
<ide> public function getBaseType()
<ide> * Casts given value from a PHP type to one acceptable by database
<ide> *
<ide> * @param mixed $value value to be converted to database equivalent
<del> * @param Driver $driver object from which database preferences and configuration will be extracted
<add> * @param \Cake\Database\Driver $driver object from which database preferences and configuration will be extracted
<ide> * @return mixed
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Casts given value from a database type to PHP equivalent
<ide> *
<ide> * @param mixed $value value to be converted to PHP equivalent
<del> * @param Driver $driver object from which database preferences and configuration will be extracted
<add> * @param \Cake\Database\Driver $driver object from which database preferences and configuration will be extracted
<ide> * @return mixed
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> protected function _basicTypeCast($value)
<ide> * Casts give value to Statement equivalent
<ide> *
<ide> * @param mixed $value value to be converted to PHP equivalent
<del> * @param Driver $driver object from which database preferences and configuration will be extracted
<add> * @param \Cake\Database\Driver $driver object from which database preferences and configuration will be extracted
<ide> * @return mixed
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/BinaryType.php
<ide> class BinaryType extends Type
<ide> * As PDO will handle reading file handles.
<ide> *
<ide> * @param string|resource $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return string|resource
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert binary into resource handles
<ide> *
<ide> * @param null|string|resource $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return resource|null
<ide> * @throws \Cake\Core\Exception\Exception
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> * Get the correct PDO binding type for Binary data.
<ide> *
<ide> * @param mixed $value The value being bound.
<del> * @param Driver $driver The driver.
<add> * @param \Cake\Database\Driver $driver The driver.
<ide> * @return int
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/BoolType.php
<ide> class BoolType extends Type
<ide> * Convert bool data into the database format.
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return bool|null
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert bool values to PHP booleans
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return bool|null
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> public function toPHP($value, Driver $driver)
<ide> * Get the correct PDO binding type for bool data.
<ide> *
<ide> * @param mixed $value The value being bound.
<del> * @param Driver $driver The driver.
<add> * @param \Cake\Database\Driver $driver The driver.
<ide> * @return int
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/DateTimeType.php
<ide> public function __construct($name = null)
<ide> * Convert DateTime instance into strings.
<ide> *
<ide> * @param string|int|\DateTime $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return string
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert strings into DateTime instances.
<ide> *
<ide> * @param string $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return \Cake\I18n\Time|\DateTime
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide><path>src/Database/Type/DateType.php
<ide> public function marshal($value)
<ide> * Convert strings into Date instances.
<ide> *
<ide> * @param string $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return \Carbon\Carbon
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide><path>src/Database/Type/FloatType.php
<ide> class FloatType extends Type
<ide> * Convert integer data into the database format.
<ide> *
<ide> * @param string|resource $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return string|resource
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert float values to PHP integers
<ide> *
<ide> * @param null|string|resource $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return resource
<ide> * @throws \Cake\Core\Exception\Exception
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> * Get the correct PDO binding type for integer data.
<ide> *
<ide> * @param mixed $value The value being bound.
<del> * @param Driver $driver The driver.
<add> * @param \Cake\Database\Driver $driver The driver.
<ide> * @return int
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/IntegerType.php
<ide> class IntegerType extends Type
<ide> * Convert integer data into the database format.
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return int
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert integer values to PHP integers
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return int
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> public function toPHP($value, Driver $driver)
<ide> * Get the correct PDO binding type for integer data.
<ide> *
<ide> * @param mixed $value The value being bound.
<del> * @param Driver $driver The driver.
<add> * @param \Cake\Database\Driver $driver The driver.
<ide> * @return int
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/StringType.php
<ide> class StringType extends Type
<ide> * Convert string data into the database format.
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return string|null
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide> public function toDatabase($value, Driver $driver)
<ide> * Convert string values to PHP strings.
<ide> *
<ide> * @param mixed $value The value to convert.
<del> * @param Driver $driver The driver instance to convert with.
<add> * @param \Cake\Database\Driver $driver The driver instance to convert with.
<ide> * @return string|null
<ide> */
<ide> public function toPHP($value, Driver $driver)
<ide> public function toPHP($value, Driver $driver)
<ide> * Get the correct PDO binding type for string data.
<ide> *
<ide> * @param mixed $value The value being bound.
<del> * @param Driver $driver The driver.
<add> * @param \Cake\Database\Driver $driver The driver.
<ide> * @return int
<ide> */
<ide> public function toStatement($value, Driver $driver)
<ide><path>src/Database/Type/UuidType.php
<ide> class UuidType extends StringType
<ide> * Casts given value from a PHP type to one acceptable by database
<ide> *
<ide> * @param mixed $value value to be converted to database equivalent
<del> * @param Driver $driver object from which database preferences and configuration will be extracted
<add> * @param \Cake\Database\Driver $driver object from which database preferences and configuration will be extracted
<ide> * @return string|null
<ide> */
<ide> public function toDatabase($value, Driver $driver)
<ide><path>src/Network/Response.php
<ide> public function file($path, array $options = [])
<ide> * If an invalid range is requested a 416 Status code will be used
<ide> * in the response.
<ide> *
<del> * @param File $file The file to set a range on.
<add> * @param \Cake\Filesystem\File $file The file to set a range on.
<ide> * @param string $httpRange The range to use.
<ide> * @return void
<ide> */
<ide> protected function _fileRange($file, $httpRange)
<ide> /**
<ide> * Reads out a file, and echos the content to the client.
<ide> *
<del> * @param File $file File object
<add> * @param \Cake\Filesystem\File $file File object
<ide> * @param array $range The range to read out of the file.
<ide> * @return bool True if whole file is echoed successfully or false if client connection is lost in between
<ide> */
<ide><path>src/ORM/Association/HasMany.php
<ide> public function replace(EntityInterface $sourceEntity, array $targetEntities, ar
<ide> * Skips deleting records present in $remainingEntities
<ide> *
<ide> * @param array $properties array of foreignKey properties
<del> * @param EntityInterface $entity the entity which should have its associated entities unassigned
<del> * @param Table $target The associated table
<add> * @param \Cake\Datasource\EntityInterface $entity the entity which should have its associated entities unassigned
<add> * @param \Cake\ORM\Table $target The associated table
<ide> * @param array $remainingEntities Entities that should not be deleted
<ide> * @param array $options list of options accepted by `Table::delete()`
<ide> * @return bool success
<ide> function ($v) {
<ide> * The action which is taken depends on the dependency between source and targets and also on foreign key nullability
<ide> *
<ide> * @param array $foreignKey array of foreign key properties
<del> * @param Table $target The associated table
<add> * @param \Cake\ORM\Table $target The associated table
<ide> * @param array $conditions The conditions that specifies what are the objects to be unlinked
<ide> * @param array $options list of options accepted by `Table::delete()`
<ide> * @return bool success
<ide> protected function _unlink(array $foreignKey, Table $target, array $conditions =
<ide> /**
<ide> * Checks the nullable flag of the foreign key
<ide> *
<del> * @param Table $table the table containing the foreign key
<add> * @param \Cake\ORM\Table $table the table containing the foreign key
<ide> * @param array $properties the list of fields that compose the foreign key
<ide> * @return bool
<ide> */
<ide><path>src/ORM/Behavior/CounterCacheBehavior.php
<ide> protected function _processAssociations(Event $event, EntityInterface $entity)
<ide> *
<ide> * @param \Cake\Event\Event $event Event instance.
<ide> * @param \Cake\Datasource\EntityInterface $entity Entity
<del> * @param Association $assoc The association object
<add> * @param \Cake\ORM\Association $assoc The association object
<ide> * @param array $settings The settings for the counter cache for this association
<ide> * @return void
<ide> */
<ide><path>src/ORM/EagerLoader.php
<ide> public function externalAssociations(Table $repository)
<ide> * Auxiliary function responsible for fully normalizing deep associations defined
<ide> * using `contain()`
<ide> *
<del> * @param Table $parent owning side of the association
<add> * @param \Cake\ORM\Table $parent owning side of the association
<ide> * @param string $alias name of the association to be loaded
<ide> * @param array $options list of extra options to use for this association
<ide> * @param array $paths An array with two values, the first one is a list of dot
<ide> public function addToJoinsMap($alias, Association $assoc, $asMatching = false)
<ide> *
<ide> * @param array $external the list of external associations to be loaded
<ide> * @param \Cake\ORM\Query $query The query from which the results were generated
<del> * @param BufferedStatement $statement The statement to work on
<add> * @param \Cake\Database\Statement\BufferedStatement $statement The statement to work on
<ide> * @return array
<ide> */
<ide> protected function _collectKeys($external, $query, $statement)
<ide><path>src/ORM/Marshaller.php
<ide> public function many(array $data, array $options = [])
<ide> * Builds the related entities and handles the special casing
<ide> * for junction table entities.
<ide> *
<del> * @param Association $assoc The association to marshal.
<add> * @param \Cake\ORM\Association $assoc The association to marshal.
<ide> * @param array $data The data to convert into entities.
<ide> * @param array $options List of options.
<ide> * @return array An array of built entities.
<ide> protected function _belongsToMany(Association $assoc, array $data, $options = []
<ide> /**
<ide> * Loads a list of belongs to many from ids.
<ide> *
<del> * @param Association $assoc The association class for the belongsToMany association.
<add> * @param \Cake\ORM\Association $assoc The association class for the belongsToMany association.
<ide> * @param array $ids The list of ids to load.
<ide> * @return array An array of entities.
<ide> */
<ide> protected function _loadAssociatedByIds($assoc, $ids)
<ide> /**
<ide> * Loads a list of belongs to many from ids.
<ide> *
<del> * @param Association $assoc The association class for the belongsToMany association.
<add> * @param \Cake\ORM\Association $assoc The association class for the belongsToMany association.
<ide> * @param array $ids The list of ids to load.
<ide> * @return array An array of entities.
<ide> * @deprecated Use _loadAssociatedByIds()
<ide><path>src/Routing/Dispatcher.php
<ide> public function dispatch(Request $request, Response $response)
<ide> * is true. If a response object is returned by the controller action, it is returned;
<ide> * otherwise the controller's $response property is returned.
<ide> *
<del> * @param Controller $controller Controller to invoke
<add> * @param \Cake\Controller\Controller $controller Controller to invoke
<ide> * @return \Cake\Network\Response The resulting response object
<ide> * @throws \LogicException If data returned by controller action is not an
<ide> * instance of Response
<ide><path>src/Validation/Validator.php
<ide> public function isPresenceRequired($field, $newRecord)
<ide> * Returns false if any validation for the passed rule set should be stopped
<ide> * due to the field missing in the data array
<ide> *
<del> * @param ValidationSet $field The set of rules for a field.
<add> * @param \Cake\Validation\ValidationSet $field The set of rules for a field.
<ide> * @param array $context A key value list of data containing the validation context.
<ide> * @return bool
<ide> */
<ide> protected function _checkPresence($field, $context)
<ide> /**
<ide> * Returns whether the field can be left blank according to `allowEmpty`
<ide> *
<del> * @param ValidationSet $field the set of rules for a field
<add> * @param \Cake\Validation\ValidationSet $field the set of rules for a field
<ide> * @param array $context a key value list of data containing the validation context.
<ide> * @return bool
<ide> */
<ide> protected function _fieldIsEmpty($data)
<ide> * from executing them
<ide> *
<ide> * @param string $field The name of the field that is being processed
<del> * @param ValidationSet $rules the list of rules for a field
<add> * @param \Cake\Validation\ValidationSet $rules the list of rules for a field
<ide> * @param array $data the full data passed to the validator
<ide> * @param bool $newRecord whether is it a new record or an existing one
<ide> * @return array
| 30
|
PHP
|
PHP
|
fix parsedsn() ignoring empty strings
|
91475ccfa58948b2561ffd9631664c1c3edaf300
|
<ide><path>src/Core/StaticConfigTrait.php
<ide> public static function parseDsn($dsn)
<ide> throw new InvalidArgumentException('Only strings can be passed to parseDsn');
<ide> }
<ide>
<del> $pattern = '/^(?P<scheme>[\w\\\\]+):\/\/((?P<username>.*?)(:(?P<password>.*?))?@)?' .
<del> '((?P<host>[^?#\/:@]+)(:(?P<port>\d+))?)?' .
<del> '(?P<path>\/[^?#]*)?(\?(?P<query>[^#]*))?(#(?P<fragment>.*))?$/';
<add> $pattern = <<<'REGEXP'
<add>{
<add> ^
<add> (?P<_scheme>
<add> (?P<scheme>[\w\\\\]+)://
<add> )
<add> (?P<_username>
<add> (?P<username>.*?)
<add> (?P<_password>
<add> :(?P<password>.*?)
<add> )?
<add> @
<add> )?
<add> (?P<_host>
<add> (?P<host>[^?#/:@]+)
<add> (?P<_port>
<add> :(?P<port>\d+)
<add> )?
<add> )?
<add> (?P<_path>
<add> (?P<path>/[^?#]*)
<add> )?
<add> (?P<_query>
<add> \?(?P<query>[^#]*)
<add> )?
<add> (?P<_fragment>
<add> \#(?P<fragment>.*)
<add> )?
<add> $
<add>}x
<add>REGEXP;
<add>
<ide> preg_match($pattern, $dsn, $parsed);
<ide>
<ide> if (!$parsed) {
<ide> throw new InvalidArgumentException("The DSN string '{$dsn}' could not be parsed.");
<ide> }
<add>
<add> $exists = [];
<ide> foreach ($parsed as $k => $v) {
<ide> if (is_int($k)) {
<ide> unset($parsed[$k]);
<del> }
<del> if ($v === '') {
<add> } elseif (strpos($k, '_') === 0) {
<add> $exists[substr($k, 1)] = ($v !== '');
<add> unset($parsed[$k]);
<add> } elseif ($v === '' && !$exists[$k]) {
<ide> unset($parsed[$k]);
<ide> }
<ide> }
<ide><path>tests/TestCase/Datasource/ConnectionManagerTest.php
<ide> public function dsnProvider()
<ide> 'log' => '1'
<ide> ]
<ide> ],
<add> 'no password' => [
<add> 'mysql://user@localhost:3306/database',
<add> [
<add> 'className' => 'Cake\Database\Connection',
<add> 'driver' => 'Cake\Database\Driver\Mysql',
<add> 'host' => 'localhost',
<add> 'database' => 'database',
<add> 'port' => 3306,
<add> 'scheme' => 'mysql',
<add> 'username' => 'user',
<add> ]
<add> ],
<add> 'empty password' => [
<add> 'mysql://user:@localhost:3306/database',
<add> [
<add> 'className' => 'Cake\Database\Connection',
<add> 'driver' => 'Cake\Database\Driver\Mysql',
<add> 'host' => 'localhost',
<add> 'database' => 'database',
<add> 'port' => 3306,
<add> 'scheme' => 'mysql',
<add> 'username' => 'user',
<add> 'password' => '',
<add> ]
<add> ],
<ide> 'sqlite memory' => [
<ide> 'sqlite:///:memory:',
<ide> [
<ide> public function dsnProvider()
<ide> ]
<ide> ],
<ide> 'complex password' => [
<del> 'mysql://user:pas#][{}$%20@!@localhost:3306/database?log=1&quoteIdentifiers=1',
<add> 'mysql://user:/?#][{}$%20@!@localhost:3306/database?log=1&quoteIdentifiers=1',
<ide> [
<ide> 'className' => 'Cake\Database\Connection',
<ide> 'database' => 'database',
<ide> 'driver' => 'Cake\Database\Driver\Mysql',
<ide> 'host' => 'localhost',
<del> 'password' => 'pas#][{}$%20@!',
<add> 'password' => '/?#][{}$%20@!',
<ide> 'port' => 3306,
<ide> 'scheme' => 'mysql',
<ide> 'username' => 'user',
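
The point of the `_`-prefixed wrapper groups above is that `preg_match()` alone cannot tell a component that matched an empty string apart from one that did not participate in the match, so `user:@host` (empty password) and `user@host` (no password) used to collapse into the same result. A minimal Python sketch of the same idea, for illustration only (simplified grammar, not the CakePHP code):

```python
import re

# Python's re returns None for groups that did not participate, which makes
# the present-but-empty vs. absent distinction direct; the PHP patch has to
# reconstruct it via the '_'-prefixed existence groups instead.
DSN = re.compile(
    r'^(?P<scheme>[\w\\]+)://'
    r'(?:(?P<username>[^:@]*)(?::(?P<password>[^@]*))?@)?'
    r'(?P<host>[^?#/:@]+)?(?::(?P<port>\d+))?'
    r'(?P<path>/[^?#]*)?$')

def parse_dsn(dsn):
    groups = DSN.match(dsn).groupdict()
    # Keep empty strings that actually matched; drop absent components.
    return {k: v for k, v in groups.items() if v is not None}

print(parse_dsn('mysql://user:@localhost:3306/db'))  # keeps password: ''
print(parse_dsn('mysql://user@localhost:3306/db'))   # no password key
```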
| 2
|
Text
|
Text
|
fix typo in doc/api/esm.md
|
442be20760ffe981baf57307175b618fa504e12b
|
<ide><path>doc/api/esm.md
<ide> import { readFile } from 'fs/promises';
<ide> const json = JSON.parse(await readFile(new URL('./dat.json', import.meta.url)));
<ide> ```
<ide>
<del>Alterantively `module.createRequire()` can be used.
<add>Alternatively `module.createRequire()` can be used.
<ide>
<ide> #### No Native Module Loading
<ide>
| 1
|
Javascript
|
Javascript
|
use process.mixin instead of sys.mixin
|
9415ca909ea0e5e5c0b390fa60759952143beed3
|
<ide><path>lib/fs.js
<del>var sys = require('sys');
<del>
<ide> exports.Stats = process.Stats;
<ide>
<ide> process.Stats.prototype._checkModeProperty = function (property) {
<ide> var FileWriteStream = exports.FileWriteStream = function(path, options) {
<ide> this.encoding = 'binary';
<ide> this.mode = 0666;
<ide>
<del> sys.mixin(this, options || {});
<add> process.mixin(this, options || {});
<ide>
<ide> var
<ide> self = this,
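
For readers unfamiliar with the early Node API: `process.mixin(this, options || {})` shallow-copies the caller's options onto the stream, overriding the defaults assigned just above it. A loose Python sketch of that pattern (the copy-like behavior of `process.mixin` is an assumption here, akin to `Object.assign`):

```python
def mixin(target, options):
    # Shallow-copy each option onto the target, overriding defaults.
    for key, value in (options or {}).items():
        setattr(target, key, value)

class FileWriteStream:  # hypothetical stand-in for the JS constructor above
    def __init__(self, options=None):
        self.encoding = 'binary'  # defaults first...
        self.mode = 0o666
        mixin(self, options)      # ...then caller options win

print(FileWriteStream({'encoding': 'utf8'}).encoding)  # utf8
```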
| 1
|
PHP
|
PHP
|
fix cs errors
|
e6c9dc74dec8349017b446f91a64c8fb351d5323
|
<ide><path>src/Database/Type/DateType.php
<ide> public function marshal($value) {
<ide> }
<ide> return $date;
<ide> }
<del>
<add>
<ide> /**
<ide> * Convert strings into Date instances.
<ide> *
<ide><path>tests/TestCase/Controller/Component/PaginatorComponentTest.php
<ide> public function testMergeOptionsMaxLimit() {
<ide> 'paramType' => 'named',
<ide> );
<ide> $result = $this->Paginator->mergeOptions('Post', $settings);
<del> $expected = array('page' => 1, 'limit' => 200, 'maxLimit' => 200, 'paramType' => 'named', 'whitelist' => ['limit', 'sort', 'page', 'direction'],);
<add> $expected = array(
<add> 'page' => 1,
<add> 'limit' => 200,
<add> 'maxLimit' => 200,
<add> 'paramType' => 'named',
<add> 'whitelist' => ['limit', 'sort', 'page', 'direction']
<add> );
<ide> $this->assertEquals($expected, $result);
<ide>
<ide> $settings = array(
<ide> 'maxLimit' => 10,
<ide> 'paramType' => 'named',
<ide> );
<ide> $result = $this->Paginator->mergeOptions('Post', $settings);
<del> $expected = array('page' => 1, 'limit' => 20, 'maxLimit' => 10, 'paramType' => 'named', 'whitelist' => ['limit', 'sort', 'page', 'direction'],);
<add> $expected = array(
<add> 'page' => 1,
<add> 'limit' => 20,
<add> 'maxLimit' => 10,
<add> 'paramType' => 'named',
<add> 'whitelist' => ['limit', 'sort', 'page', 'direction']
<add> );
<ide> $this->assertEquals($expected, $result);
<ide> }
<ide>
| 2
|
Python
|
Python
|
add triviaqa task to projects
|
0ab5dcbf76abe39068ce2679f20ae82bd7d46091
|
<ide><path>official/nlp/projects/triviaqa/dataset.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""TriviaQA: A Reading Comprehension Dataset."""
<add>import functools
<add>import json
<add>import os
<add>
<add>from absl import logging
<add>import apache_beam as beam
<add>import six
<add>import tensorflow as tf
<add>import tensorflow_datasets.public_api as tfds
<add>
<add>from official.nlp.projects.triviaqa import preprocess
<add>
<add>_CITATION = """
<add>@article{2017arXivtriviaqa,
<add> author = {{Joshi}, Mandar and {Choi}, Eunsol and {Weld},
<add> Daniel and {Zettlemoyer}, Luke},
<add> title = "{triviaqa: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension}",
<add> journal = {arXiv e-prints},
<add> year = 2017,
<add> eid = {arXiv:1705.03551},
<add> pages = {arXiv:1705.03551},
<add>archivePrefix = {arXiv},
<add> eprint = {1705.03551},
<add>}
<add>"""
<add>_DOWNLOAD_URL_TMPL = (
<add> "http://nlp.cs.washington.edu/triviaqa/data/triviaqa-{}.tar.gz")
<add>_TRAIN_FILE_FORMAT = "*-train.json"
<add>_VALIDATION_FILE_FORMAT = "*-dev.json"
<add>_TEST_FILE_FORMAT = "*test-without-answers.json"
<add>_WEB_EVIDENCE_DIR = "evidence/web"
<add>_WIKI_EVIDENCE_DIR = "evidence/wikipedia"
<add>
<add>_DESCRIPTION = """\
<add>TriviaQA is a reading comprehension dataset containing over 650K
<add>question-answer-evidence triples. TriviaQA includes 95K question-answer
<add>pairs authored by trivia enthusiasts and independently gathered evidence
<add>documents, six per question on average, that provide high quality distant
<add>supervision for answering the questions.
<add>"""
<add>
<add>_RC_DESCRIPTION = """\
<add>Question-answer pairs where all documents for a given question contain the
<add>answer string(s).
<add>"""
<add>
<add>_UNFILTERED_DESCRIPTION = """\
<add>110k question-answer pairs for open domain QA where not all documents for a
<add>given question contain the answer string(s). This makes the unfiltered dataset
<add>more appropriate for IR-style QA.
<add>"""
<add>
<add>_CONTEXT_ADDENDUM = "Includes context from Wikipedia and search results."
<add>
<add>
<add>def _web_evidence_dir(tmp_dir):
<add> return tf.io.gfile.glob(os.path.join(tmp_dir, _WEB_EVIDENCE_DIR))
<add>
<add>
<add>def _wiki_evidence_dir(tmp_dir):
<add> return tf.io.gfile.glob(os.path.join(tmp_dir, _WIKI_EVIDENCE_DIR))
<add>
<add>
<add>class TriviaQAConfig(tfds.core.BuilderConfig):
<add> """BuilderConfig for TriviaQA."""
<add>
<add> def __init__(self, *, unfiltered=False, exclude_context=False, **kwargs):
<add> """BuilderConfig for TriviaQA.
<add>
<add> Args:
<add> unfiltered: bool, whether to use the unfiltered version of the dataset,
<add> intended for open-domain QA.
<add> exclude_context: bool, whether to exclude Wikipedia and search context for
<add> reduced size.
<add> **kwargs: keyword arguments forwarded to super.
<add> """
<add> name = "unfiltered" if unfiltered else "rc"
<add> if exclude_context:
<add> name += ".nocontext"
<add> description = _UNFILTERED_DESCRIPTION if unfiltered else _RC_DESCRIPTION
<add> if not exclude_context:
<add> description += _CONTEXT_ADDENDUM
<add> super(TriviaQAConfig, self).__init__(
<add> name=name,
<add> description=description,
<add> version=tfds.core.Version("1.1.1"),
<add> **kwargs)
<add> self.unfiltered = unfiltered
<add> self.exclude_context = exclude_context
<add>
<add>
<add>class BigBirdTriviaQAConfig(tfds.core.BuilderConfig):
<add> """BuilderConfig for TriviaQA."""
<add>
<add> def __init__(self, **kwargs):
<add> """BuilderConfig for TriviaQA.
<add>
<add> Args:
<add> **kwargs: keyword arguments forwarded to super.
<add> """
<add> name = "rc_wiki.preprocessed"
<add> description = _RC_DESCRIPTION
<add> super(BigBirdTriviaQAConfig, self).__init__(
<add> name=name,
<add> description=description,
<add> version=tfds.core.Version("1.1.1"),
<add> **kwargs)
<add> self.unfiltered = False
<add> self.exclude_context = False
<add>
<add> def configure(self,
<add> sentencepiece_model_path,
<add> sequence_length,
<add> stride,
<add> global_sequence_length=None):
<add> """Configures additional user-specified arguments."""
<add> self.sentencepiece_model_path = sentencepiece_model_path
<add> self.sequence_length = sequence_length
<add> self.stride = stride
<add> if global_sequence_length is None and sequence_length is not None:
<add> self.global_sequence_length = sequence_length // 16 + 64
<add> else:
<add> self.global_sequence_length = global_sequence_length
<add> logging.info(
<add> """
<add> global_sequence_length: %s
<add> sequence_length: %s
<add> stride: %s
<add> sentencepiece_model_path: %s""",
<add> self.global_sequence_length, self.sequence_length,
<add> self.stride, self.sentencepiece_model_path)
<add>
<add> def validate(self):
<add> """Validates that user specifies valid arguments."""
<add> if self.sequence_length is None:
<add> raise ValueError("sequence_length must be specified for BigBird.")
<add> if self.stride is None:
<add> raise ValueError("stride must be specified for BigBird.")
<add> if self.sentencepiece_model_path is None:
<add> raise ValueError(
<add> "sentencepiece_model_path must be specified for BigBird.")
<add>
<add>
<add>def filter_files_for_big_bird(files):
<add> filtered_files = [f for f in files if os.path.basename(f).startswith("wiki")]
<add> assert len(filtered_files) == 1, "There should only be one wikipedia file."
<add> return filtered_files
<add>
<add>
<add>class TriviaQA(tfds.core.BeamBasedBuilder):
<add> """TriviaQA is a reading comprehension dataset.
<add>
<add> It contains over 650K question-answer-evidence triples.
<add> """
<add> name = "bigbird_trivia_qa"
<add> BUILDER_CONFIGS = [
<add> BigBirdTriviaQAConfig(),
<add> TriviaQAConfig(unfiltered=False, exclude_context=False), # rc
<add> TriviaQAConfig(unfiltered=False, exclude_context=True), # rc.nocontext
<add> TriviaQAConfig(unfiltered=True, exclude_context=False), # unfiltered
<add> TriviaQAConfig(unfiltered=True, exclude_context=True),
<add> # unfiltered.nocontext
<add> ]
<add>
<add> def __init__(self,
<add> *,
<add> sentencepiece_model_path=None,
<add> sequence_length=None,
<add> stride=None,
<add> global_sequence_length=None,
<add> **kwargs):
<add> super(TriviaQA, self).__init__(**kwargs)
<add> if isinstance(self.builder_config, BigBirdTriviaQAConfig):
<add> self.builder_config.configure(
<add> sentencepiece_model_path=sentencepiece_model_path,
<add> sequence_length=sequence_length,
<add> stride=stride,
<add> global_sequence_length=global_sequence_length)
<add>
<add> def _info(self):
<add> if isinstance(self.builder_config, BigBirdTriviaQAConfig):
<add> return tfds.core.DatasetInfo(
<add> builder=self,
<add> description=_DESCRIPTION,
<add> supervised_keys=None,
<add> homepage="http://nlp.cs.washington.edu/triviaqa/",
<add> citation=_CITATION,
<add> features=tfds.features.FeaturesDict({
<add> "id": tfds.features.Text(),
<add> "qid": tfds.features.Text(),
<add> "question": tfds.features.Text(),
<add> "context": tfds.features.Text(),
<add> # Sequence features.
<add> "token_ids": tfds.features.Tensor(shape=(None,), dtype=tf.int64),
<add> "token_offsets":
<add> tfds.features.Tensor(shape=(None,), dtype=tf.int64),
<add> "segment_ids":
<add> tfds.features.Tensor(shape=(None,), dtype=tf.int64),
<add> "global_token_ids":
<add> tfds.features.Tensor(shape=(None,), dtype=tf.int64),
<add> # Start and end indices (inclusive).
<add> "answers":
<add> tfds.features.Tensor(shape=(None, 2), dtype=tf.int64),
<add> }))
<add>
<add> return tfds.core.DatasetInfo(
<add> builder=self,
<add> description=_DESCRIPTION,
<add> features=tfds.features.FeaturesDict({
<add> "question":
<add> tfds.features.Text(),
<add> "question_id":
<add> tfds.features.Text(),
<add> "question_source":
<add> tfds.features.Text(),
<add> "entity_pages":
<add> tfds.features.Sequence({
<add> "doc_source":
<add> tfds.features.Text(),
<add> "filename":
<add> tfds.features.Text(),
<add> "title":
<add> tfds.features.Text(),
<add> "wiki_context":
<add> tfds.features.Text(),
<add> }),
<add> "search_results":
<add> tfds.features.Sequence({
<add> "description":
<add> tfds.features.Text(),
<add> "filename":
<add> tfds.features.Text(),
<add> "rank":
<add> tf.int32,
<add> "title":
<add> tfds.features.Text(),
<add> "url":
<add> tfds.features.Text(),
<add> "search_context":
<add> tfds.features.Text(),
<add> }),
<add> "answer":
<add> tfds.features.FeaturesDict({
<add> "aliases":
<add> tfds.features.Sequence(tfds.features.Text()),
<add> "normalized_aliases":
<add> tfds.features.Sequence(tfds.features.Text()),
<add> "matched_wiki_entity_name":
<add> tfds.features.Text(),
<add> "normalized_matched_wiki_entity_name":
<add> tfds.features.Text(),
<add> "normalized_value":
<add> tfds.features.Text(),
<add> "type":
<add> tfds.features.Text(),
<add> "value":
<add> tfds.features.Text(),
<add> }),
<add> }),
<add> supervised_keys=None,
<add> homepage="http://nlp.cs.washington.edu/triviaqa/",
<add> citation=_CITATION,
<add> )
<add>
<add> def _split_generators(self, dl_manager):
<add> """Returns SplitGenerators."""
<add> cfg = self.builder_config
<add> download_urls = dict()
<add> if not (cfg.unfiltered and cfg.exclude_context):
<add> download_urls["rc"] = _DOWNLOAD_URL_TMPL.format("rc")
<add> if cfg.unfiltered:
<add> download_urls["unfiltered"] = _DOWNLOAD_URL_TMPL.format("unfiltered")
<add> file_paths = dl_manager.download_and_extract(download_urls)
<add>
<add> qa_dir = (
<add> os.path.join(file_paths["unfiltered"], "triviaqa-unfiltered")
<add> if cfg.unfiltered else
<add> os.path.join(file_paths["rc"], "qa"))
<add> train_files = tf.io.gfile.glob(os.path.join(qa_dir, _TRAIN_FILE_FORMAT))
<add> valid_files = tf.io.gfile.glob(
<add> os.path.join(qa_dir, _VALIDATION_FILE_FORMAT))
<add> test_files = tf.io.gfile.glob(os.path.join(qa_dir, _TEST_FILE_FORMAT))
<add>
<add> if cfg.exclude_context:
<add> web_evidence_dir = None
<add> wiki_evidence_dir = None
<add> else:
<add> web_evidence_dir = os.path.join(file_paths["rc"], _WEB_EVIDENCE_DIR)
<add> wiki_evidence_dir = os.path.join(file_paths["rc"], _WIKI_EVIDENCE_DIR)
<add>
<add> if isinstance(cfg, BigBirdTriviaQAConfig):
<add> train_files = filter_files_for_big_bird(train_files)
<add> valid_files = filter_files_for_big_bird(valid_files)
<add> test_files = filter_files_for_big_bird(test_files)
<add>
<add> return [
<add> tfds.core.SplitGenerator(
<add> name=tfds.Split.TRAIN,
<add> gen_kwargs={"files": train_files,
<add> "web_dir": web_evidence_dir,
<add> "wiki_dir": wiki_evidence_dir,
<add> "answer": True}),
<add> tfds.core.SplitGenerator(
<add> name=tfds.Split.VALIDATION,
<add> gen_kwargs={"files": valid_files,
<add> "web_dir": web_evidence_dir,
<add> "wiki_dir": wiki_evidence_dir,
<add> "answer": True}),
<add> tfds.core.SplitGenerator(
<add> name=tfds.Split.TEST,
<add> gen_kwargs={"files": test_files,
<add> "web_dir": web_evidence_dir,
<add> "wiki_dir": wiki_evidence_dir,
<add> "answer": False}),
<add> ]
<add>
<add> def _build_pcollection(self, pipeline, files, web_dir, wiki_dir, answer):
<add> if isinstance(self.builder_config, BigBirdTriviaQAConfig):
<add> self.builder_config.validate()
<add> question_answers = preprocess.read_question_answers(files[0])
<add> return preprocess.make_pipeline(
<add> pipeline,
<add> question_answers=question_answers,
<add> answer=answer,
<add> max_num_tokens=self.builder_config.sequence_length,
<add> max_num_global_tokens=self.builder_config.global_sequence_length,
<add> stride=self.builder_config.stride,
<add> sentencepiece_model_path=self.builder_config.sentencepiece_model_path,
<add> wikipedia_dir=wiki_dir,
<add> web_dir=web_dir)
<add>
<add> parse_example_fn = functools.partial(parse_example,
<add> self.builder_config.exclude_context,
<add> web_dir, wiki_dir)
<add> return (pipeline
<add> | beam.Create(files)
<add> | beam.ParDo(ReadQuestions())
<add> | beam.Reshuffle()
<add> | beam.Map(parse_example_fn))
<add>
<add>
<add>class ReadQuestions(beam.DoFn):
<add> """Read questions from JSON."""
<add>
<add> def process(self, file):
<add> with tf.io.gfile.GFile(file) as f:
<add> data = json.load(f)
<add> for question in data["Data"]:
<add> example = {"SourceFile": os.path.basename(file)}
<add> example.update(question)
<add> yield example
<add>
<add>
<add>def parse_example(exclude_context, web_dir, wiki_dir, article):
<add> """Return a single example from an article JSON record."""
<add>
<add> def _strip(collection):
<add> return [item.strip() for item in collection]
<add>
<add> if "Answer" in article:
<add> answer = article["Answer"]
<add> answer_dict = {
<add> "aliases":
<add> _strip(answer["Aliases"]),
<add> "normalized_aliases":
<add> _strip(answer["NormalizedAliases"]),
<add> "matched_wiki_entity_name":
<add> answer.get("MatchedWikiEntryName", "").strip(),
<add> "normalized_matched_wiki_entity_name":
<add> answer.get("NormalizedMatchedWikiEntryName", "").strip(),
<add> "normalized_value":
<add> answer["NormalizedValue"].strip(),
<add> "type":
<add> answer["Type"].strip(),
<add> "value":
<add> answer["Value"].strip(),
<add> }
<add> else:
<add> answer_dict = {
<add> "aliases": [],
<add> "normalized_aliases": [],
<add> "matched_wiki_entity_name": "<unk>",
<add> "normalized_matched_wiki_entity_name": "<unk>",
<add> "normalized_value": "<unk>",
<add> "type": "",
<add> "value": "<unk>",
<add> }
<add>
<add> if exclude_context:
<add> article["SearchResults"] = []
<add> article["EntityPages"] = []
<add>
<add> def _add_context(collection, context_field, file_dir):
<add> """Adds context from file, or skips if file does not exist."""
<add> new_items = []
<add> for item in collection:
<add> if "Filename" not in item:
<add> logging.info("Missing context 'Filename', skipping.")
<add> continue
<add>
<add> new_item = item.copy()
<add> fname = item["Filename"]
<add> try:
<add> with tf.io.gfile.GFile(os.path.join(file_dir, fname)) as f:
<add> new_item[context_field] = f.read()
<add> except (IOError, tf.errors.NotFoundError):
<add> logging.info("File does not exist, skipping: %s", fname)
<add> continue
<add> new_items.append(new_item)
<add> return new_items
<add>
<add> def _strip_if_str(v):
<add> return v.strip() if isinstance(v, six.string_types) else v
<add>
<add> def _transpose_and_strip_dicts(dicts, field_names):
<add> return {
<add> tfds.core.naming.camelcase_to_snakecase(k):
<add> [_strip_if_str(d[k]) for d in dicts] for k in field_names
<add> }
<add>
<add> search_results = _transpose_and_strip_dicts(
<add> _add_context(article.get("SearchResults", []), "SearchContext", web_dir),
<add> ["Description", "Filename", "Rank", "Title", "Url", "SearchContext"])
<add>
<add> entity_pages = _transpose_and_strip_dicts(
<add> _add_context(article.get("EntityPages", []), "WikiContext", wiki_dir),
<add> ["DocSource", "Filename", "Title", "WikiContext"])
<add>
<add> question = article["Question"].strip()
<add> question_id = article["QuestionId"]
<add> question_source = article["QuestionSource"].strip()
<add>
<add> return f"{article['SourceFile']}_{question_id}", {
<add> "entity_pages": entity_pages,
<add> "search_results": search_results,
<add> "question": question,
<add> "question_id": question_id,
<add> "question_source": question_source,
<add> "answer": answer_dict,
<add> }
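
A minimal consumption sketch for the builder defined above. It uses a hypothetical `data_dir` and assumes the build can download and write data; note also that for the BigBird config, `global_sequence_length` defaults to `sequence_length // 16 + 64` (e.g. `4096 // 16 + 64 == 320`):

```python
import tensorflow_datasets as tfds

# Importing the module registers the 'bigbird_trivia_qa' builder with TFDS,
# the same trick download_and_prepare.py uses below.
from official.nlp.projects.triviaqa import dataset  # pylint: disable=unused-import

builder = tfds.builder('bigbird_trivia_qa/rc.nocontext',
                       data_dir='/tmp/tfds')  # hypothetical data_dir
builder.download_and_prepare()
for example in builder.as_dataset(split='validation').take(1):
    print(example['question'].numpy(), example['answer']['value'].numpy())
```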
<ide><path>official/nlp/projects/triviaqa/download_and_prepare.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Downloads and prepares TriviaQA dataset."""
<add>from unittest import mock
<add>
<add>from absl import app
<add>from absl import flags
<add>from absl import logging
<add>import apache_beam as beam
<add>import tensorflow_datasets as tfds
<add>
<add>from official.nlp.projects.triviaqa import dataset # pylint: disable=unused-import
<add>
<add>flags.DEFINE_integer('sequence_length', 4096, 'Max number of tokens.')
<add>
<add>flags.DEFINE_integer(
<add> 'global_sequence_length', None,
<add> 'Max number of question tokens plus sentences. If not set, defaults to '
<add> 'sequence_length // 16 + 64.')
<add>
<add>flags.DEFINE_integer(
<add> 'stride', 3072,
<add> 'For documents longer than `sequence_length`, where to split them.')
<add>
<add>flags.DEFINE_string(
<add> 'sentencepiece_model_path', None,
<add> 'SentencePiece model to use for tokenization.')
<add>
<add>flags.DEFINE_string('data_dir', None, 'Data directory for TFDS.')
<add>
<add>flags.DEFINE_string('runner', 'DirectRunner', 'Beam runner to use.')
<add>
<add>FLAGS = flags.FLAGS
<add>
<add>
<add>def main(argv):
<add> if len(argv) > 1:
<add> raise app.UsageError('Too many command-line arguments.')
<add> builder = tfds.builder(
<add> 'bigbird_trivia_qa/rc_wiki.preprocessed',
<add> data_dir=FLAGS.data_dir,
<add> sentencepiece_model_path=FLAGS.sentencepiece_model_path,
<add> sequence_length=FLAGS.sequence_length,
<add> global_sequence_length=FLAGS.global_sequence_length,
<add> stride=FLAGS.stride)
<add> download_config = tfds.download.DownloadConfig(
<add> beam_options=beam.options.pipeline_options.PipelineOptions(flags=[
<add> f'--runner={FLAGS.runner}',
<add> '--direct_num_workers=8',
<add> '--direct_running_mode=multi_processing',
<add> ]))
<add> with mock.patch('tensorflow_datasets.core.download.extractor._normpath',
<add> new=lambda x: x):
<add> builder.download_and_prepare(download_config=download_config)
<add> logging.info(builder.info.splits)
<add>
<add>
<add>if __name__ == '__main__':
<add> flags.mark_flag_as_required('sentencepiece_model_path')
<add> app.run(main)
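
The `stride` flag governs how documents longer than `sequence_length` are split into overlapping windows; the actual splitting lives in `preprocess.py` (not shown here), but the default flags imply a tiling roughly like this sketch:

```python
# Illustration only: with sequence_length=4096 and stride=3072, consecutive
# windows overlap by 1024 tokens (assumed semantics of the stride flag).
sequence_length, stride, num_tokens = 4096, 3072, 10000
starts = list(range(0, num_tokens, stride))
print(starts)  # [0, 3072, 6144, 9216]
```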
<ide><path>official/nlp/projects/triviaqa/evaluate.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Evalutes TriviaQA predictions."""
<add>import json
<add>
<add>from absl import app
<add>from absl import flags
<add>from absl import logging
<add>import tensorflow as tf
<add>
<add>from official.nlp.projects.triviaqa import evaluation
<add>
<add>flags.DEFINE_string('gold_path', None,
<add> 'Path to golden validation, i.e. wikipedia-dev.json.')
<add>
<add>flags.DEFINE_string('predictions_path', None,
<add> 'Path to predictions in JSON format')
<add>
<add>FLAGS = flags.FLAGS
<add>
<add>
<add>def main(argv):
<add> if len(argv) > 1:
<add> raise app.UsageError('Too many command-line arguments.')
<add> with tf.io.gfile.GFile(FLAGS.gold_path) as f:
<add> ground_truth = {
<add> datum['QuestionId']: datum['Answer'] for datum in json.load(f)['Data']
<add> }
<add> with tf.io.gfile.GFile(FLAGS.predictions_path) as f:
<add> predictions = json.load(f)
<add> logging.info(evaluation.evaluate_triviaqa(ground_truth, predictions))
<add>
<add>
<add>if __name__ == '__main__':
<add> flags.mark_flag_as_required('predictions_path')
<add> app.run(main)
<ide><path>official/nlp/projects/triviaqa/evaluation.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add># Copyright 2020 Google LLC
<add># Copyright 2017 Mandar Joshi (mandar90@cs.washington.edu)
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Official evaluation script for v1.0 of the TriviaQA dataset.
<add>
<add>Forked from
<add>https://github.com/mandarjoshi90/triviaqa/blob/master/evaluation/triviaqa_evaluation.py.
<add>Modifications are removal of main function.
<add>"""
<add>import collections
<add>import re
<add>import string
<add>import sys
<add>
<add>
<add>def normalize_answer(s):
<add> """Lower text and remove punctuation, articles and extra whitespace."""
<add>
<add> def remove_articles(text):
<add> return re.sub(r'\b(a|an|the)\b', ' ', text)
<add>
<add> def white_space_fix(text):
<add> return ' '.join(text.split())
<add>
<add> def handle_punc(text):
<add> exclude = set(string.punctuation + ''.join([u'‘', u'’', u'´', u'`']))
<add> return ''.join(ch if ch not in exclude else ' ' for ch in text)
<add>
<add> def lower(text):
<add> return text.lower()
<add>
<add> def replace_underscore(text):
<add> return text.replace('_', ' ')
<add>
<add> return white_space_fix(
<add> remove_articles(handle_punc(lower(replace_underscore(s))))).strip()
<add>
<add>
<add>def f1_score(prediction, ground_truth):
<add> prediction_tokens = normalize_answer(prediction).split()
<add> ground_truth_tokens = normalize_answer(ground_truth).split()
<add> common = (
<add> collections.Counter(prediction_tokens)
<add> & collections.Counter(ground_truth_tokens))
<add> num_same = sum(common.values())
<add> if num_same == 0:
<add> return 0
<add> precision = 1.0 * num_same / len(prediction_tokens)
<add> recall = 1.0 * num_same / len(ground_truth_tokens)
<add> f1 = (2 * precision * recall) / (precision + recall)
<add> return f1
<add>
<add>
<add>def exact_match_score(prediction, ground_truth):
<add> return normalize_answer(prediction) == normalize_answer(ground_truth)
<add>
<add>
<add>def metric_max_over_ground_truths(metric_fn, prediction, ground_truths):
<add> scores_for_ground_truths = []
<add> for ground_truth in ground_truths:
<add> score = metric_fn(prediction, ground_truth)
<add> scores_for_ground_truths.append(score)
<add> return max(scores_for_ground_truths)
<add>
<add>
<add>def is_exact_match(answer_object, prediction):
<add> ground_truths = get_ground_truths(answer_object)
<add> for ground_truth in ground_truths:
<add> if exact_match_score(prediction, ground_truth):
<add> return True
<add> return False
<add>
<add>
<add>def has_exact_match(ground_truths, candidates):
<add> for ground_truth in ground_truths:
<add> if ground_truth in candidates:
<add> return True
<add> return False
<add>
<add>
<add>def get_ground_truths(answer):
<add> return answer['NormalizedAliases'] + [
<add> normalize_answer(ans) for ans in answer.get('HumanAnswers', [])
<add> ]
<add>
<add>
<add>def get_oracle_score(ground_truth,
<add> predicted_answers,
<add> qid_list=None,
<add> mute=False):
<add> exact_match = common = 0
<add> if qid_list is None:
<add> qid_list = ground_truth.keys()
<add> for qid in qid_list:
<add> if qid not in predicted_answers:
<add> if not mute:
<add> message = 'Irrelavant question {} will receive score 0.'.format(qid)
<add> print(message, file=sys.stderr)
<add> continue
<add> common += 1
<add> prediction = normalize_answer(predicted_answers[qid])
<add> ground_truths = get_ground_truths(ground_truth[qid])
<add> em_for_this_question = has_exact_match(ground_truths, prediction)
<add> exact_match += int(em_for_this_question)
<add>
<add> exact_match = 100.0 * exact_match / len(qid_list)
<add>
<add> return {
<add> 'oracle_exact_match': exact_match,
<add> 'common': common,
<add> 'denominator': len(qid_list),
<add> 'pred_len': len(predicted_answers),
<add> 'gold_len': len(ground_truth)
<add> }
<add>
<add>
<add>def evaluate_triviaqa(ground_truth,
<add> predicted_answers,
<add> qid_list=None,
<add> mute=False):
<add> f1 = exact_match = common = 0
<add> if qid_list is None:
<add> qid_list = ground_truth.keys()
<add> for qid in qid_list:
<add> if qid not in predicted_answers:
<add> if not mute:
<add> message = 'Missed question {} will receive score 0.'.format(qid)
<add> print(message, file=sys.stderr)
<add> continue
<add> if qid not in ground_truth:
<add> if not mute:
<add> message = 'Irrelavant question {} will receive score 0.'.format(qid)
<add> print(message, file=sys.stderr)
<add> continue
<add> common += 1
<add> prediction = predicted_answers[qid]
<add> ground_truths = get_ground_truths(ground_truth[qid])
<add> em_for_this_question = metric_max_over_ground_truths(
<add> exact_match_score, prediction, ground_truths)
<add> if em_for_this_question == 0 and not mute:
<add> print('em=0:', prediction, ground_truths)
<add> exact_match += em_for_this_question
<add> f1_for_this_question = metric_max_over_ground_truths(
<add> f1_score, prediction, ground_truths)
<add> f1 += f1_for_this_question
<add>
<add> exact_match = 100.0 * exact_match / len(qid_list)
<add> f1 = 100.0 * f1 / len(qid_list)
<add>
<add> return {
<add> 'exact_match': exact_match,
<add> 'f1': f1,
<add> 'common': common,
<add> 'denominator': len(qid_list),
<add> 'pred_len': len(predicted_answers),
<add> 'gold_len': len(ground_truth)
<add> }
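
A toy sanity check of the metrics above, with the ground-truth dict shaped like a TriviaQA `Answer` record (values verified by hand; assumes `evaluate_triviaqa` from this module is in scope):

```python
# 'The Sun King' normalizes to 'sun king' (lowercase, article stripped),
# which matches the second alias exactly, so both metrics score 100.
ground_truth = {'q1': {'NormalizedAliases': ['louis xiv', 'sun king']}}
predictions = {'q1': 'The Sun King'}
print(evaluate_triviaqa(ground_truth, predictions))
# {'exact_match': 100.0, 'f1': 100.0, 'common': 1, 'denominator': 1,
#  'pred_len': 1, 'gold_len': 1}
```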
<ide><path>official/nlp/projects/triviaqa/inputs.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add># pytype: skip-file
<add>"""Input processing for TriviaQA."""
<add>import os
<add>from typing import Optional, Text, Union
<add>
<add>import tensorflow as tf
<add>import tensorflow_datasets as tfds
<add>
<add>from official.modeling import tf_utils
<add>from official.nlp.projects.triviaqa import dataset # pylint: disable=unused-import
<add>
<add>
<add>def _flatten_dims(tensor: tf.Tensor,
<add> first_dim: Optional[int] = 0,
<add> last_dim: Optional[int] = -1,
<add> name: Optional[Text] = None) -> tf.Tensor:
<add> """Flattens the given span of dimensions in `tensor`.
<add>
<add> Args:
<add> tensor: [..., first_dim_size, ...middle_dims..., last_dim_size, ...] shaped
<add> Tensor.
<add> first_dim: The first dimension to flatten (inclusive). Must be a valid index
<add> for the rank of `tensor`. Default is 0.
<add> last_dim: The last dimension to flatten (inclusive). Must be a valid index
<add> for the rank of `tensor`. Default is -1.
<add> name: A name for the operation (optional).
<add>
<add> Returns:
<add> Tensor of shape [..., flattened_dim_size, ...] where
<add> flattened_dim_size = first_dim_size * ...middle_dims... * last_dim_size.
<add> """
<add> with tf.name_scope(name or 'flatten_dims'):
<add> tensor = tf.convert_to_tensor(tensor)
<add>
<add> rank = tensor.shape.rank
<add> if rank is None:
<add> raise ValueError('Static rank of `tensor` must be known.')
<add> if first_dim < 0:
<add> first_dim += rank
<add> if first_dim < 0 or first_dim >= rank:
<add> raise ValueError('`first_dim` out of bounds for `tensor` rank.')
<add> if last_dim < 0:
<add> last_dim += rank
<add> if last_dim < 0 or last_dim >= rank:
<add> raise ValueError('`last_dim` out of bounds for `tensor` rank.')
<add> if first_dim > last_dim:
<add> raise ValueError('`first_dim` must not be larger than `last_dim`.')
<add>
<add> # Try to calculate static flattened dim size if all input sizes to flatten
<add> # are statically known. Otherwise, just use -1.
<add> flat_dims_shape = tensor.shape[first_dim:(last_dim + 1)].as_list()
<add> flattened_dim_size = 1
<add> for size in flat_dims_shape:
<add> if size is None:
<add> flattened_dim_size = -1
<add> break
<add> flattened_dim_size *= size
<add>
<add> old_shape = tf.shape(tensor)
<add> output_shape = tf.concat([
<add> old_shape[:first_dim], [flattened_dim_size], old_shape[(last_dim + 1):]
<add> ], 0)
<add> return tf.reshape(tensor, output_shape)
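
A quick shape check for `_flatten_dims` (illustration; assumes the function above and the module's `tf` import are in scope):

```python
x = tf.zeros([2, 3, 4, 5])
# Dims 1..2 (sizes 3 and 4) collapse into a single axis of size 12.
assert _flatten_dims(x, first_dim=1, last_dim=2).shape.as_list() == [2, 12, 5]
```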
<add>
<add>
<add>def _pad_to_multiple(tensor: tf.Tensor,
<add> factor: Union[int, tf.Tensor],
<add> axis: int,
<add> mode: Optional[Text] = 'CONSTANT',
<add> constant_values=0,
<add> name: Optional[Text] = None) -> tf.Tensor:
<add> """Pads `tensor` on a given `axis` to be a multiple of `factor`.
<add>
<add> Padding will be concatenated to the end of the axis only, not the beginning.
<add> If the length along `axis` is already a multiple of `factor`, this is
<add> effectively a no-op.
<add>
<add> Args:
<add> tensor: A Tensor with rank >= 1 to pad.
<add> factor: Positive integer factor to pad for. If a Tensor, must be a scalar
<add> int.
<add> axis: A valid axis in `tensor` to pad.
<add> mode: The padding mode to use according to `tf.pad`. Defaults to 'CONSTANT'.
<add> constant_values: For 'CONSTANT' mode, the scalar pad value to use within
<add> `tf.pad`. Defaults to 0. Must be same type as `tensor`.
<add> name: A name for the operation (optional).
<add>
<add> Returns:
<add> The padded Tensor result.
<add> """
<add> with tf.name_scope(name or 'pad_to_multiple'):
<add> tensor = tf.convert_to_tensor(tensor)
<add>
<add> if isinstance(factor, int) and factor < 1:
<add> raise ValueError('`factor` must be positive.')
<add> rank = tensor.shape.rank
<add> if rank is None:
<add> raise ValueError('Static rank of `tensor` must be known.')
<add> if axis < 0:
<add> axis += rank
<add> if axis < 0 or axis >= rank:
<add> raise ValueError('`axis` out of bounds for `tensor` rank.')
<add>
<add> axis_len = tf_utils.get_shape_list(tensor)[axis]
<add> pad_len = -axis_len % factor
<add> paddings = pad_len * tf.one_hot([-1, axis], rank, axis=0, dtype=tf.int32)
<add> return tf.pad(
<add> tensor=tensor,
<add> paddings=paddings,
<add> mode=mode,
<add> constant_values=constant_values)
<add>
<add>
<add>def _skew_elements_right(tensor: tf.Tensor,
<add> axis: int,
<add> pad_value=0,
<add> name: Optional[Text] = None) -> tf.Tensor:
<add> """Skews successive elements right along the given `axis`.
<add>
<add> This changes an input like
<add> [
<add> [1, 2, 3],
<add> [4, 5, 6],
<add> [7, 8, 9]
<add> ]
<add> into the following:
<add> [
<add> [1, 2, 3, 0, 0],
<add> [0, 4, 5, 6, 0],
<add> [0, 0, 7, 8, 9]
<add> ]
<add>
<add> Args:
<add> tensor: Tensor of shape [..., num_rows, axis_len, ...].
<add> axis: A valid axis in `tensor` to skew along. It must not be the first axis
<add> in `tensor`.
<add> pad_value: The scalar pad value to use. Defaults to 0. Must be the same type
<add> as `tensor`.
<add> name: A name for the operation (optional).
<add>
<add> Returns:
<add> Tensor of shape [..., num_rows, axis_len + num_rows - 1, ...].
<add> """
<add> with tf.name_scope(name or 'skew_elements_right'):
<add> tensor = tf.convert_to_tensor(tensor)
<add>
<add> rank = tensor.shape.rank
<add> num_rows = tf_utils.get_shape_list(tensor)[axis - 1]
<add> axis_len = tf_utils.get_shape_list(tensor)[axis]
<add>
<add> if rank is None:
<add> raise ValueError('Static rank of `tensor` must be known.')
<add> if axis < 0:
<add> axis += rank
<add> if axis <= 0 or axis >= rank:
<add> raise ValueError('`axis` out of bounds for `tensor` rank.')
<add>
<add> output_len = axis_len + num_rows - 1
<add>
<add> paddings = num_rows * tf.one_hot([-1, axis], rank, axis=0, dtype=tf.int32)
<add>
<add> # [..., num_rows, axis_len + num_rows, ...]
<add> padded_tensor = tf.pad(tensor, paddings, constant_values=pad_value)
<add>
<add> # [..., num_rows * (axis_len + num_rows), ...]
<add> flat_tensor = _flatten_dims(
<add> padded_tensor, first_dim=axis - 1, last_dim=axis)
<add>
<add> padded_tensor2 = _pad_to_multiple(
<add> flat_tensor,
<add> factor=output_len,
<add> axis=axis - 1,
<add> constant_values=pad_value)
<add>
<add> # [..., num_rows + 1, output_len, ...]
<add> new_shape = tf.concat([
<add> tf.shape(tensor)[:(axis - 1)], [num_rows + 1, output_len],
<add> tf.shape(tensor)[(axis + 1):]
<add> ], 0)
<add> reshaped_tensor = tf.reshape(padded_tensor2, new_shape)
<add>
<add> # [..., num_rows, output_len, ...]
<add> output_shape = new_shape - tf.one_hot(axis - 1, depth=rank, dtype=tf.int32)
<add> return tf.slice(
<add> reshaped_tensor, begin=tf.zeros_like(output_shape), size=output_shape)
<add>
<add>
<add>class RelativePositionGenerator(object):
<add> """Generates `relative_att_ids` for purely distance-based relative positions.
<add>
<add> This implements the clipped relative position representations originally
<add> described in https://arxiv.org/abs/1803.02155 .
<add>
<add> Attributes:
<add> max_distance: Integer passed from `__init__`.
<add> ignore_direction: Bool passed from `__init__`.
<add> relative_vocab_size: Integer representing the maximum number of unique ids
<add> output from this generator.
<add> left_pad_value: Integer id for all positions at or beyond max_distance to
<add> the left.
<add> right_pad_value: Integer id for all positions at or beyond max_distance to
<add> the right.
<add> """
<add>
<add> def __init__(self, max_distance: int, ignore_direction: bool = False):
<add> """Init.
<add>
<add> Args:
<add> max_distance: The maximum distance to represent. Must not be negative. All
<add> larger distances will be clipped to this value.
<add> ignore_direction: If True, both left and right position representations
<add> will have the same ids based on absolute distance (resulting in
<add> symmetric ids around the center token).
<add> """
<add> if max_distance < 0:
<add> raise ValueError('`max_distance` must not be negative.')
<add> self.max_distance = max_distance
<add> self.ignore_direction = ignore_direction
<add>
<add> self.right_pad_value = max_distance
<add> self.left_pad_value = max_distance if ignore_direction else 2 * max_distance
<add>
<add> # 0 is the first id, so vocab size is 1 + the largest id (left pad value).
<add> self.relative_vocab_size = self.left_pad_value + 1
<add>
<add> def make_relative_att_ids(self,
<add> seq_len: Union[int, tf.Tensor],
<add> batch_size: Optional[Union[int, tf.Tensor]] = 1,
<add> name: Optional[Text] = None) -> tf.Tensor:
<add> """Makes relative position ids for full self-attention.
<add>
<add> For example, if `max_distance` is 3, `ignore_direction` is False, `seq_len`
<add> is 6, and `batch_size` is 1, the result is the following:
<add> [[
<add> [0, 1, 2, 3, 3, 3],
<add> [4, 0, 1, 2, 3, 3],
<add> [5, 4, 0, 1, 2, 3],
<add> [6, 5, 4, 0, 1, 2],
<add> [6, 6, 5, 4, 0, 1],
<add> [6, 6, 6, 5, 4, 0],
<add> ]]
<add>
<add> Args:
<add> seq_len: The sequence length to create ids for. Must be positive. If a
<add> Tensor, must be a scalar int.
<add> batch_size: The batch size of the result (default 1). Must be positive. If
<add> a Tensor, must be a scalar int. All examples in the batch will have the
<add> same id pattern.
<add> name: A name for the operation (optional).
<add>
<add> Returns:
<add> <int32>[batch_size, seq_len, seq_len] Tensor of relative position ids.
<add> """
<add> with tf.name_scope(name or 'make_relative_att_ids'):
<add> if isinstance(seq_len, int) and seq_len < 1:
<add> raise ValueError('`seq_len` must be positive.')
<add> if isinstance(batch_size, int) and batch_size < 1:
<add> raise ValueError('`batch_size` must be positive.')
<add>
<add> # We need the id_pattern to cover all tokens to the left of the last token
<add> # and all tokens to the right of the first token at the same time.
<add> window_size = 2 * seq_len - 1
<add>
<add> # [window_size]
<add> id_pattern = self._make_relative_id_pattern(window_size)
<add>
<add> # [seq_len, window_size]
<add> id_tensor = tf.tile(id_pattern[tf.newaxis, :], [seq_len, 1])
<add>
<add> # [seq_len, window_size + seq_len - 1]
<add> id_tensor = _skew_elements_right(id_tensor, -1)
<add>
<add> # [seq_len, seq_len]
<add> id_tensor = tf.slice(id_tensor, [0, seq_len - 1], [seq_len, seq_len])
<add>
<add> return tf.tile(id_tensor[tf.newaxis, :, :], [batch_size, 1, 1])
<add>
<add> def make_local_relative_att_ids(self,
<add> seq_len: Union[int, tf.Tensor],
<add> local_radius: int,
<add> batch_size: Optional[Union[int,
<add> tf.Tensor]] = 1,
<add> name: Optional[Text] = None) -> tf.Tensor:
<add> """Makes relative position ids for local self-attention.
<add>
<add> The result can be used as `relative_att_ids` in
<add> `layers.RelativeLocalSelfAttention`.
<add>
<add> For example, if `max_distance` is 3, `ignore_direction` is False, `seq_len`
<add> is 4, `local_radius` is 5, and `batch_size` is 1, the result is the
<add> following:
<add> [[
<add> [6, 6, 6, 5, 4, 0, 1, 2, 3, 3, 3],
<add> [6, 6, 6, 5, 4, 0, 1, 2, 3, 3, 3],
<add> [6, 6, 6, 5, 4, 0, 1, 2, 3, 3, 3],
<add> [6, 6, 6, 5, 4, 0, 1, 2, 3, 3, 3],
<add> ]]
<add>
<add> Args:
<add> seq_len: The sequence length to create ids for. Must be positive. If a
<add> Tensor, must be a scalar int.
<add> local_radius: The local radius as expected by
<add> `layers.RelativeLocalSelfAttention`. Must be positive.
<add> batch_size: The batch size of the result (default 1). Must be positive. If
<add> a Tensor, must be a scalar int. All examples in the batch will have the
<add> same id pattern.
<add> name: A name for the operation (optional).
<add>
<add> Returns:
<add> <int32>[batch_size, seq_len, 2*local_radius + 1] Tensor of relative
<add> position ids.
<add> """
<add> with tf.name_scope(name or 'make_local_relative_att_ids'):
<add> if isinstance(seq_len, int) and seq_len < 1:
<add> raise ValueError('`seq_len` must be positive.')
<add> if local_radius < 1:
<add> raise ValueError('`local_radius` must be positive.')
<add> if isinstance(batch_size, int) and batch_size < 1:
<add> raise ValueError('`batch_size` must be positive.')
<add>
<add> window_size = 2 * local_radius + 1
<add>
<add> # [window_size]
<add> id_pattern = self._make_relative_id_pattern(window_size)
<add>
<add> return tf.tile(id_pattern[tf.newaxis, tf.newaxis, :],
<add> [batch_size, seq_len, 1])
<add>
<add> def _make_relative_id_pattern(
<add> self, window_size: Union[int, tf.Tensor]) -> tf.Tensor:
<add> """Helper for making the relative id pattern for a particular window size.
<add>
<add> For example, if `max_distance` is 3, `ignore_direction` is False, and
<add> `window_size` is 11, the result is the following:
<add> [6, 6, 6, 5, 4, 0, 1, 2, 3, 3, 3].
<add>
<add> Args:
<add> window_size: Window size to return relative ids for. Must be positive and
<add> odd since ids will be relative to the center of the window. If a Tensor,
<add> must be a scalar int.
<add>
<add> Returns:
<add> <int32>[window_size] Tensor of relative position ids.
<add> """
<add> if isinstance(window_size, int):
<add> if window_size < 1:
<add> raise ValueError('`window_size` must be positive.')
<add> if window_size % 2 != 1:
<add> raise ValueError('`window_size` must be odd.')
<add>
<add> x = tf.range(self.max_distance + 1, dtype=tf.int32)
<add> x = tf.pad(x, [[self.max_distance, 0]], mode='REFLECT')
<add> if not self.ignore_direction:
<add> direction_adder = tf.concat([
<add> tf.fill([self.max_distance], self.max_distance),
<add> tf.zeros([self.max_distance + 1], dtype=tf.int32)
<add> ], 0)
<add> x += direction_adder
<add>
<add> len_x = x.shape.as_list()[0]
<add> if len_x > window_size:
<add> trim_amount = (len_x - window_size) // 2
<add> return x[trim_amount:-trim_amount]
<add>
<add> pad_amount = (window_size - len_x) // 2
<add> result = tf.pad(x, [[pad_amount, 0]], constant_values=self.left_pad_value)
<add> result = tf.pad(
<add> result, [[0, pad_amount]], constant_values=self.right_pad_value)
<add> return result
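<add>
<add>
<add># Illustrative usage of RelativePositionGenerator (an eager-mode sketch):
<add>#   generator = RelativePositionGenerator(max_distance=3)
<add>#   generator.relative_vocab_size  # 7, ids 0..6
<add>#   generator.make_relative_att_ids(seq_len=6)  # shape [1, 6, 6]
<add>#   generator.make_local_relative_att_ids(seq_len=4, local_radius=5)  # [1, 4, 11]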
<add>
<add>
<add>def read_batches(data_dir,
<add> split,
<add> batch_size,
<add> include_answers=True,
<add> shuffle=False,
<add> drop_final_batch=False,
<add> compression_type=''):
<add> """Read TriviaQA batches."""
<add> features = {
<add> 'id': tf.io.FixedLenFeature([], tf.string),
<add> 'qid': tf.io.FixedLenFeature([], tf.string),
<add> 'context': tf.io.FixedLenFeature([], tf.string),
<add> 'question': tf.io.FixedLenFeature([], tf.string),
<add> 'global_token_ids': tf.io.RaggedFeature(tf.int64),
<add> 'token_ids': tf.io.RaggedFeature(tf.int64),
<add> 'segment_ids': tf.io.RaggedFeature(tf.int64),
<add> 'token_offsets': tf.io.RaggedFeature(tf.int64),
<add> }
<add> if include_answers:
<add> features['answers'] = tf.io.RaggedFeature(
<add> tf.int64, partitions=(tf.io.RaggedFeature.UniformRowLength(2),))
<add>
<add> dataset_builder = tfds.builder(
<add> 'bigbird_trivia_qa/rc_wiki.preprocessed', data_dir=data_dir)
<add> split_info = dataset_builder.info.splits[split]
<add> return tf.data.experimental.make_batched_features_dataset(
<add> [
<add> os.path.join(dataset_builder.data_dir, filename)
<add> for filename in split_info.filenames
<add> ],
<add> batch_size=batch_size,
<add> features=features,
<add> reader=lambda path: tf.data.TFRecordDataset(path, compression_type),
<add> label_key='answers' if include_answers else None,
<add> num_epochs=1,
<add> shuffle=shuffle,
<add> shuffle_buffer_size=split_info.num_examples,
<add> prefetch_buffer_size=tf.data.experimental.AUTOTUNE,
<add> sloppy_ordering=True,
<add> drop_final_batch=drop_final_batch,
<add> reader_num_threads=8,
<add> parser_num_threads=16)
<add>
<add>
<add>def scatter_labels(labels, batch_size, sequence_length):
<add> """Create one hot labels."""
<add> row_ids = labels.value_rowids()
<add> indices = tf.concat(
<add> (tf.stack((row_ids, tf.cast(labels.flat_values[:, 0],
<add> tf.int32), tf.zeros_like(row_ids)), -1),
<add> tf.stack((row_ids, tf.cast(labels.flat_values[:, 1],
<add> tf.int32), tf.ones_like(row_ids)), -1)), 0)
<add> one_hot_labels = tf.scatter_nd(indices,
<add> tf.ones(tf.shape(indices)[0], tf.float32),
<add> (batch_size, sequence_length, 2))
<add> return tf.minimum(one_hot_labels, 1.)
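<add>
<add># Sketch of scatter_labels: for ragged labels [[(1, 2)]] (one answer span
<add># covering tokens 1..2 in a batch of one), the output has shape
<add># [1, sequence_length, 2] with output[0, 1, 0] == 1.0 (begin) and
<add># output[0, 2, 1] == 1.0 (end); all other entries are zero.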
<add>
<add>
<add>def features_map_fn(features, local_radius, relative_pos_max_distance,
<add> use_hard_g2l_mask, padding_id, eos_id, null_id, cls_id,
<add> sep_id, sequence_length, global_sequence_length):
<add> """Builds dense model inputs and attention masks from ragged features."""
<add> batch_size = tf.get_static_value(features['token_ids'].shape[0])
<add> # sequence_lengths = features['token_ids'].row_lengths()
<add> question_lengths = tf.argmax(
<add> tf.equal(features['token_ids'].to_tensor(
<add> shape=(batch_size, global_sequence_length)), sep_id), -1) + 1
<add> mapped_features = dict(
<add> token_ids=tf.cast(
<add> features['token_ids'].to_tensor(shape=(batch_size, sequence_length)),
<add> tf.int32),
<add> global_token_ids=tf.cast(
<add> features['global_token_ids'].to_tensor(
<add> shape=(batch_size, global_sequence_length)), tf.int32),
<add> segment_ids=tf.cast(
<add> features['segment_ids'].to_tensor(
<add> shape=(batch_size, sequence_length)), tf.int32),
<add> )
<add> relative_pos_generator = RelativePositionGenerator(
<add> max_distance=relative_pos_max_distance)
<add> # Only do long-to-long attention for non-null tokens.
<add> # Let the null token attend to itself.
<add> l2l_att_mask = tf.ones((batch_size, sequence_length, 2 * local_radius + 1),
<add> tf.int32)
<add> l2l_att_mask *= 1 - tf.cast(
<add> tf.logical_or(
<add> tf.equal(mapped_features['token_ids'], padding_id),
<add> tf.equal(mapped_features['token_ids'], null_id)),
<add> tf.int32)[:, :, tf.newaxis]
<add> l2l_relative_att_ids = relative_pos_generator.make_local_relative_att_ids(
<add> seq_len=sequence_length, local_radius=local_radius, batch_size=batch_size)
<add> # Long-to-global attention mask and relative ids.
<add> l2g_att_mask = tf.ones((batch_size, sequence_length, global_sequence_length),
<add> tf.int32)
<add> l2g_att_mask *= tf.cast(
<add> tf.not_equal(mapped_features['token_ids'], padding_id),
<add> tf.int32)[:, :, tf.newaxis]
<add> l2g_att_mask *= tf.cast(
<add> tf.not_equal(mapped_features['global_token_ids'], padding_id),
<add> tf.int32)[:, tf.newaxis, :]
<add> l2g_relative_att_ids = tf.fill(
<add> (batch_size, sequence_length, global_sequence_length),
<add> relative_pos_generator.relative_vocab_size + 1)
<add> # Global-to-global attention mask and relative ids.
<add> g2g_att_mask = tf.ones(
<add> (batch_size, global_sequence_length, global_sequence_length), tf.int32)
<add> g2g_att_mask *= tf.cast(
<add> tf.not_equal(mapped_features['global_token_ids'], padding_id),
<add> tf.int32)[:, :, tf.newaxis]
<add> g2g_relative_att_ids = relative_pos_generator.make_relative_att_ids(
<add> seq_len=global_sequence_length, batch_size=batch_size)
<add> global_sentence_mask = tf.equal(mapped_features['global_token_ids'], eos_id)
<add> global_question_mask = tf.logical_not(
<add> tf.logical_or(
<add> tf.logical_or(
<add> tf.equal(mapped_features['global_token_ids'], cls_id),
<add> tf.equal(mapped_features['global_token_ids'], eos_id)),
<add> tf.equal(mapped_features['global_token_ids'], padding_id)))
<add> g2g_question_mask = tf.logical_and(global_question_mask[:, tf.newaxis, :],
<add> global_question_mask[:, :, tf.newaxis])
<add> g2g_sentence_mask = tf.logical_and(global_sentence_mask[:, tf.newaxis, :],
<add> global_sentence_mask[:, :, tf.newaxis])
<add> g2g_local_mask = tf.cast(
<add> tf.logical_or(g2g_question_mask, g2g_sentence_mask), tf.int32)
<add> g2g_relative_att_ids *= g2g_local_mask
<add> g2g_relative_att_ids += (1 - g2g_local_mask) * (
<add> relative_pos_generator.relative_vocab_size + 2)
<add> # Global-to-long attention mask and relative ids.
<add> g2l_att_mask = tf.transpose(l2g_att_mask, [0, 2, 1])
<add> if use_hard_g2l_mask:
<add> global_range = tf.range(
<add> global_sequence_length, dtype=mapped_features['global_token_ids'].dtype)
<add> g2l_att_mask *= tf.cast(
<add> tf.logical_or(
<add> tf.equal(
<add> mapped_features['global_token_ids'], cls_id)[:, :, tf.newaxis],
<add> tf.equal(global_range[tf.newaxis, :, tf.newaxis],
<add> mapped_features['segment_ids'][:, tf.newaxis, :])),
<add> tf.int32)
<add> g2l_relative_att_ids = tf.transpose(l2g_relative_att_ids, [0, 2, 1])
<add> mapped_features.update(
<add> dict(
<add> l2l_att_mask=l2l_att_mask,
<add> l2l_relative_att_ids=l2l_relative_att_ids,
<add> l2g_att_mask=l2g_att_mask,
<add> l2g_relative_att_ids=l2g_relative_att_ids,
<add> g2g_att_mask=g2g_att_mask,
<add> g2g_relative_att_ids=g2g_relative_att_ids,
<add> g2l_att_mask=g2l_att_mask,
<add> g2l_relative_att_ids=g2l_relative_att_ids,
<add> question_lengths=question_lengths,
<add> ))
<add> return mapped_features
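<add>
<add># Shape summary for the masks built above (B = batch size, L =
<add># sequence_length, G = global_sequence_length, r = local_radius):
<add>#   l2l_att_mask, l2l_relative_att_ids: [B, L, 2r + 1]
<add>#   l2g_att_mask, l2g_relative_att_ids: [B, L, G]
<add>#   g2g_att_mask, g2g_relative_att_ids: [B, G, G]
<add>#   g2l_att_mask, g2l_relative_att_ids: [B, G, L]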
<add>
<add>
<add>def labels_map_fn(token_ids, labels, sequence_length):
<add> batch_size = tf.get_static_value(labels.shape[0])
<add> row_lengths = labels.row_lengths()
<add> empty_token_index = token_ids.row_lengths() - 1
<add> one_hot_labels = scatter_labels(labels, batch_size, sequence_length)
<add> one_hot_labels += (tf.cast(row_lengths == 0, tf.float32)[:, tf.newaxis] *
<add> tf.one_hot(empty_token_index, sequence_length))[:, :,
<add> tf.newaxis]
<add> return one_hot_labels
<ide><path>official/nlp/projects/triviaqa/modeling.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Modeling for TriviaQA."""
<add>import tensorflow as tf
<add>
<add>from official.modeling import tf_utils
<add>from official.nlp.configs import encoders
<add>
<add>
<add>class TriviaQaHead(tf.keras.layers.Layer):
<add> """Computes logits given token and global embeddings."""
<add>
<add> def __init__(self,
<add> intermediate_size,
<add> intermediate_activation=tf_utils.get_activation('gelu'),
<add> dropout_rate=0.0,
<add> attention_dropout_rate=0.0,
<add> **kwargs):
<add> super(TriviaQaHead, self).__init__(**kwargs)
<add> self._attention_dropout = tf.keras.layers.Dropout(attention_dropout_rate)
<add> self._intermediate_dense = tf.keras.layers.Dense(intermediate_size)
<add> self._intermediate_activation = tf.keras.layers.Activation(
<add> intermediate_activation)
<add> self._output_dropout = tf.keras.layers.Dropout(dropout_rate)
<add> self._output_layer_norm = tf.keras.layers.LayerNormalization()
<add> self._logits_dense = tf.keras.layers.Dense(2)
<add>
<add> def build(self, input_shape):
<add> output_shape = input_shape['token_embeddings'][-1]
<add> self._output_dense = tf.keras.layers.Dense(output_shape)
<add> super(TriviaQaHead, self).build(input_shape)
<add>
<add> def call(self, inputs, training=None):
<add> token_embeddings = inputs['token_embeddings']
<add> token_ids = inputs['token_ids']
<add> question_lengths = inputs['question_lengths']
<add> x = self._attention_dropout(token_embeddings, training=training)
<add> intermediate_outputs = self._intermediate_dense(x)
<add> intermediate_outputs = self._intermediate_activation(intermediate_outputs)
<add> outputs = self._output_dense(intermediate_outputs)
<add> outputs = self._output_dropout(outputs, training=training)
<add> outputs = self._output_layer_norm(outputs + token_embeddings)
<add> logits = self._logits_dense(outputs)
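<add> # Mask padding and question positions with a large negative bias so they
<add> # are never selected as answer begin/end positions.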
<add> logits -= tf.expand_dims(
<add> tf.cast(tf.equal(token_ids, 0), tf.float32) + tf.sequence_mask(
<add> question_lengths, logits.shape[-2], dtype=tf.float32), -1) * 1e6
<add> return logits
<add>
<add>
<add>class TriviaQaModel(tf.keras.Model):
<add> """Model for TriviaQA."""
<add>
<add> def __init__(self, model_config: encoders.EncoderConfig, sequence_length: int,
<add> **kwargs):
<add> inputs = dict(
<add> token_ids=tf.keras.Input((sequence_length,), dtype=tf.int32),
<add> question_lengths=tf.keras.Input((), dtype=tf.int32))
<add> encoder = encoders.build_encoder(model_config)
<add> x = encoder(
<add> dict(
<add> input_word_ids=inputs['token_ids'],
<add> input_mask=tf.cast(inputs['token_ids'] > 0, tf.int32),
<add> input_type_ids=1 -
<add> tf.sequence_mask(inputs['question_lengths'], sequence_length,
<add> tf.int32)))['sequence_output']
<add> logits = TriviaQaHead(
<add> model_config.get().intermediate_size,
<add> dropout_rate=model_config.get().dropout_rate,
<add> attention_dropout_rate=model_config.get().attention_dropout_rate)(
<add> dict(
<add> token_embeddings=x,
<add> token_ids=inputs['token_ids'],
<add> question_lengths=inputs['question_lengths']))
<add> super(TriviaQaModel, self).__init__(inputs, logits, **kwargs)
<add> self._encoder = encoder
<add>
<add> @property
<add> def encoder(self):
<add> return self._encoder
<add>
<add>
<add>class SpanOrCrossEntropyLoss(tf.keras.losses.Loss):
<add> """Cross entropy loss for multiple correct answers.
<add>
<add> See https://arxiv.org/abs/1710.10723.
<add> """
<add>
<add> def call(self, y_true, y_pred):
<add> y_pred_masked = y_pred - tf.cast(y_true < 0.5, tf.float32) * 1e6
<add> or_cross_entropy = (
<add> tf.math.reduce_logsumexp(y_pred, axis=-2) -
<add> tf.math.reduce_logsumexp(y_pred_masked, axis=-2))
<add> return tf.math.reduce_sum(or_cross_entropy, -1)
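<add>
<add># The loss above computes, independently for the begin and end logits:
<add>#   -log(sum_{i in correct} exp(logit_i) / sum_i exp(logit_i)),
<add># so the model only needs to place probability mass on *some* correct
<add># answer span rather than on all of them.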
<add>
<add>
<add>def smooth_labels(label_smoothing, labels, question_lengths, token_ids):
<add> mask = 1. - (
<add> tf.cast(tf.equal(token_ids, 0), tf.float32) +
<add> tf.sequence_mask(question_lengths, labels.shape[-2], dtype=tf.float32))
<add> num_classes = tf.expand_dims(tf.math.reduce_sum(mask, -1, keepdims=True), -1)
<add> labels = (1. - label_smoothing) * labels + (label_smoothing / num_classes)
<add> return labels * tf.expand_dims(mask, -1)
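<add>
<add># Smoothing sketch: with smoothing factor eps and N valid (non-padding,
<add># non-question) positions, each correct position gets (1 - eps) + eps / N,
<add># every other valid position gets eps / N, and masked positions stay zero.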
<ide><path>official/nlp/projects/triviaqa/predict.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""TriviaQA script for inference."""
<add>import collections
<add>import contextlib
<add>import functools
<add>import json
<add>import operator
<add>
<add>from absl import app
<add>from absl import flags
<add>from absl import logging
<add>import tensorflow as tf
<add>import tensorflow_datasets as tfds
<add>
<add>import sentencepiece as spm
<add>from official.nlp.configs import encoders # pylint: disable=unused-import
<add>from official.nlp.projects.triviaqa import evaluation
<add>from official.nlp.projects.triviaqa import inputs
<add>from official.nlp.projects.triviaqa import prediction
<add>
<add>flags.DEFINE_string('data_dir', None, 'TensorFlow Datasets directory.')
<add>
<add>flags.DEFINE_enum('split', None,
<add> [tfds.Split.TRAIN, tfds.Split.VALIDATION, tfds.Split.TEST],
<add> 'For which split to generate predictions.')
<add>
<add>flags.DEFINE_string('predictions_path', None, 'Output for predictions.')
<add>
<add>flags.DEFINE_string('sentencepiece_model_path', None,
<add> 'Path to sentence piece model.')
<add>
<add>flags.DEFINE_integer('bigbird_block_size', 64,
<add> 'Size of blocks for sparse block attention.')
<add>
<add>flags.DEFINE_string('saved_model_dir', None,
<add> 'Path from which to initialize model and weights.')
<add>
<add>flags.DEFINE_integer('sequence_length', 4096, 'Maximum number of tokens.')
<add>
<add>flags.DEFINE_integer('global_sequence_length', 320,
<add> 'Maximum number of global tokens.')
<add>
<add>flags.DEFINE_integer('batch_size', 32, 'Size of batch.')
<add>
<add>flags.DEFINE_string('master', '', 'Address of the TPU master.')
<add>
<add>flags.DEFINE_integer('decode_top_k', 8,
<add> 'Maximum number of tokens to consider for begin/end.')
<add>
<add>flags.DEFINE_integer('decode_max_size', 16,
<add> 'Maximum number of sentence pieces in an answer.')
<add>
<add>FLAGS = flags.FLAGS
<add>
<add>
<add>@contextlib.contextmanager
<add>def worker_context():
<add> if FLAGS.master:
<add> with tf.device('/job:worker') as d:
<add> yield d
<add> else:
<add> yield
<add>
<add>
<add>def read_sentencepiece_model(path):
<add> with tf.io.gfile.GFile(path, 'rb') as file:
<add> processor = spm.SentencePieceProcessor()
<add> processor.LoadFromSerializedProto(file.read())
<add> return processor
<add>
<add>
<add>def predict(sp_processor, features_map_fn, logits_fn, decode_logits_fn,
<add> split_and_pad_fn, distribute_strategy, dataset):
<add> """Make predictions."""
<add> predictions = collections.defaultdict(list)
<add> for _, features in dataset.enumerate():
<add> token_ids = features['token_ids']
<add> x = split_and_pad_fn(features_map_fn(features))
<add> logits = tf.concat(
<add> distribute_strategy.experimental_local_results(logits_fn(x)), 0)
<add> logits = logits[:features['token_ids'].shape[0]]
<add> end_limit = token_ids.row_lengths() - 1 # inclusive
<add> begin, end, scores = decode_logits_fn(logits, end_limit)
<add> answers = prediction.decode_answer(features['context'], begin, end,
<add> features['token_offsets'],
<add> end_limit).numpy()
<add> for j, (qid, token_id, offset, score, answer) in enumerate(
<add> zip(features['qid'].numpy(),
<add> tf.gather(features['token_ids'], begin, batch_dims=1).numpy(),
<add> tf.gather(features['token_offsets'], begin, batch_dims=1).numpy(),
<add> scores, answers)):
<add> if not answer:
<add> logging.info('%s: %s | NO_ANSWER, %f',
<add> features['id'][j].numpy().decode('utf-8'),
<add> features['question'][j].numpy().decode('utf-8'), score)
<add> continue
<add> if sp_processor.IdToPiece(int(token_id)).startswith('▁') and offset > 0:
<add> answer = answer[1:]
<add> logging.info('%s: %s | %s, %f', features['id'][j].numpy().decode('utf-8'),
<add> features['question'][j].numpy().decode('utf-8'),
<add> answer.decode('utf-8'), score)
<add> predictions[qid.decode('utf-8')].append((score, answer.decode('utf-8')))
<add> predictions = {
<add> qid: evaluation.normalize_answer(
<add> sorted(answers, key=operator.itemgetter(0), reverse=True)[0][1])
<add> for qid, answers in predictions.items()
<add> }
<add> return predictions
<add>
<add>
<add>def main(argv):
<add> if len(argv) > 1:
<add> raise app.UsageError('Too many command-line arguments.')
<add> # Configure input processing.
<add> sp_processor = read_sentencepiece_model(FLAGS.sentencepiece_model_path)
<add> features_map_fn = tf.function(
<add> functools.partial(
<add> inputs.features_map_fn,
<add> local_radius=FLAGS.bigbird_block_size,
<add> relative_pos_max_distance=24,
<add> use_hard_g2l_mask=True,
<add> sequence_length=FLAGS.sequence_length,
<add> global_sequence_length=FLAGS.global_sequence_length,
<add> padding_id=sp_processor.PieceToId('<pad>'),
<add> eos_id=sp_processor.PieceToId('</s>'),
<add> null_id=sp_processor.PieceToId('<empty>'),
<add> cls_id=sp_processor.PieceToId('<ans>'),
<add> sep_id=sp_processor.PieceToId('<sep_0>')),
<add> autograph=False)
<add> # Connect to TPU cluster.
<add> if FLAGS.master:
<add> resolver = tf.distribute.cluster_resolver.TPUClusterResolver(FLAGS.master)
<add> tf.config.experimental_connect_to_cluster(resolver)
<add> tf.tpu.experimental.initialize_tpu_system(resolver)
<add> strategy = tf.distribute.TPUStrategy(resolver)
<add> else:
<add> strategy = tf.distribute.MirroredStrategy()
<add> # Initialize datasets.
<add> with worker_context():
<add> _ = tf.random.get_global_generator()
<add> dataset = inputs.read_batches(
<add> FLAGS.data_dir, FLAGS.split, FLAGS.batch_size, include_answers=False)
<add> # Initialize model and compile.
<add> with strategy.scope():
<add> model = tf.keras.models.load_model(FLAGS.saved_model_dir, compile=False)
<add> logging.info('Model initialized. Beginning prediction loop.')
<add> logits_fn = tf.function(
<add> functools.partial(prediction.distributed_logits_fn, model))
<add> decode_logits_fn = tf.function(
<add> functools.partial(prediction.decode_logits, FLAGS.decode_top_k,
<add> FLAGS.decode_max_size))
<add> split_and_pad_fn = tf.function(
<add> functools.partial(prediction.split_and_pad, strategy, FLAGS.batch_size))
<add> # Prediction strategy.
<add> predict_fn = functools.partial(
<add> predict,
<add> sp_processor=sp_processor,
<add> features_map_fn=features_map_fn,
<add> logits_fn=logits_fn,
<add> decode_logits_fn=decode_logits_fn,
<add> split_and_pad_fn=split_and_pad_fn,
<add> distribute_strategy=strategy,
<add> dataset=dataset)
<add> with worker_context():
<add> predictions = predict_fn()
<add> with tf.io.gfile.GFile(FLAGS.predictions_path, 'w') as f:
<add> json.dump(predictions, f)
<add>
<add>
<add>if __name__ == '__main__':
<add> flags.mark_flags_as_required(['split', 'predictions_path', 'saved_model_dir'])
<add> app.run(main)
<ide><path>official/nlp/projects/triviaqa/prediction.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Functions for inference."""
<add>import tensorflow as tf
<add>
<add>
<add>def split_and_pad(strategy, batch_size, x):
<add> """Splits and pads a batch for distributed inference."""
<add> per_replica_size = batch_size // strategy.num_replicas_in_sync
<add>
<add> def slice_fn(x, i):
<add> begin = min(x.shape[0], i * per_replica_size)
<add> end = min(x.shape[0], (i + 1) * per_replica_size)
<add> indices = tf.range(begin, end, dtype=tf.int32)
<add> return tf.gather(x, tf.pad(indices, [[0, per_replica_size - end + begin]]))
<add>
<add> # pylint: disable=g-long-lambda
<add> return tf.nest.map_structure(
<add> lambda x: strategy.experimental_distribute_values_from_function(
<add> lambda ctx: slice_fn(x, ctx.replica_id_in_sync_group)), x)
<add> # pylint: enable=g-long-lambda
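<add>
<add># Behavior sketch for split_and_pad: with batch_size=8 on 4 replicas
<add># (per_replica_size=2) and a final partial batch of 5 rows, replicas get
<add># rows [0, 1], [2, 3], [4, 0], [0, 0]; the zero-index padding duplicates
<add># row 0, and callers truncate gathered logits back to the true batch size.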
<add>
<add>
<add>def decode_logits(top_k, max_size, logits, default):
<add> """Get the span from logits."""
<add> logits = tf.transpose(logits, [0, 2, 1])
<add> values, indices = tf.math.top_k(logits, top_k)
<add> width = (
<add> tf.expand_dims(indices[:, 1, :], -2) -
<add> tf.expand_dims(indices[:, 0, :], -1))
<add> mask = tf.logical_and(width >= 0, width <= max_size)
<add> scores = (
<add> tf.expand_dims(values[:, 0, :], -1) + tf.expand_dims(values[:, 1, :], -2))
<add> scores = tf.where(mask, scores, -1e8)
<add> flat_indices = tf.argmax(tf.reshape(scores, (-1, top_k * top_k)), -1)
<add> begin = tf.gather(
<add> indices[:, 0, :], tf.math.floordiv(flat_indices, top_k), batch_dims=1)
<add> end = tf.gather(
<add> indices[:, 1, :], tf.math.mod(flat_indices, top_k), batch_dims=1)
<add> reduced_mask = tf.math.reduce_any(mask, [-1, -2])
<add> return (tf.where(reduced_mask, begin,
<add> default), tf.where(reduced_mask, end, default),
<add> tf.math.reduce_max(scores, [-1, -2]))
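<add>
<add># Decoding sketch: decode_logits takes the top_k begin and top_k end logit
<add># positions per example, scores each (begin, end) pair as the sum of the
<add># two logits, keeps only pairs with 0 <= end - begin <= max_size, and
<add># returns the best pair, falling back to `default` when no pair is valid.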
<add>
<add>
<add>@tf.function
<add>def decode_answer(context, begin, end, token_offsets, end_limit):
<add> i = tf.gather(token_offsets, begin, batch_dims=1)
<add> j = tf.gather(token_offsets, tf.minimum(end + 1, end_limit), batch_dims=1)
<add> j = tf.where(end == end_limit, tf.cast(tf.strings.length(context), tf.int64),
<add> j)
<add> return tf.strings.substr(context, i, j - i)
<add>
<add>
<add>def distributed_logits_fn(model, x):
<add> return model.distribute_strategy.run(
<add> lambda x: model(x, training=False), args=(x,))
<ide><path>official/nlp/projects/triviaqa/preprocess.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""Utilities for preprocessing TriviaQA data."""
<add>import bisect
<add>import json
<add>import operator
<add>import os
<add>import re
<add>import string
<add>from typing import Any, Dict, Generator, List, Optional, Set, Text, Tuple
<add>
<add>from absl import logging
<add>import apache_beam as beam
<add>from apache_beam import metrics
<add>import dataclasses
<add>import nltk
<add>import numpy as np
<add>import tensorflow.io.gfile as gfile
<add>
<add>import sentencepiece as spm
<add>from official.nlp.projects.triviaqa import evaluation
<add>from official.nlp.projects.triviaqa import sentencepiece_pb2
<add>
<add>
<add>@dataclasses.dataclass
<add>class Question(object):
<add> id: Text
<add> value: Text
<add>
<add>
<add>@dataclasses.dataclass
<add>class EvidenceInfo(object):
<add> id: Text
<add> source: Text
<add> title: Text
<add>
<add>
<add>@dataclasses.dataclass
<add>class Evidence(object):
<add> info: EvidenceInfo
<add> text: Text
<add>
<add>
<add>@dataclasses.dataclass
<add>class Answer(object):
<add> value: Text
<add> aliases: List[Text]
<add> normalized_aliases: List[Text]
<add>
<add>
<add>@dataclasses.dataclass
<add>class QuestionAnswer(object):
<add> question: Question
<add> evidence_info: List[EvidenceInfo]
<add> answer: Optional[Answer] = None
<add>
<add>
<add>@dataclasses.dataclass
<add>class QuestionAnswerEvidence(object):
<add> question: Question
<add> evidence: Evidence
<add> answer: Optional[Answer] = None
<add>
<add>
<add>@dataclasses.dataclass
<add>class Features(object):
<add> id: Text
<add> stride_index: int
<add> question_id: Text
<add> question: Text
<add> context: bytes
<add> token_ids: List[int]
<add> token_offsets: List[int]
<add> global_token_ids: List[int]
<add> segment_ids: List[int]
<add>
<add>
<add>@dataclasses.dataclass
<add>class Paragraph(object):
<add> sentences: List[sentencepiece_pb2.SentencePieceText]
<add> size: int
<add>
<add>
<add>@dataclasses.dataclass
<add>class AnswerSpan(object):
<add> begin: int # inclusive
<add> end: int # inclusive
<add> text: Text
<add>
<add>
<add>def make_paragraph(
<add> sentence_tokenizer: nltk.tokenize.api.TokenizerI,
<add> processor: spm.SentencePieceProcessor,
<add> text: Text,
<add> paragraph_metric: Optional[metrics.Metrics.DelegatingDistribution] = None,
<add> sentence_metric: Optional[metrics.Metrics.DelegatingDistribution] = None
<add>) -> Paragraph:
<add> """Tokenizes a paragraph into sentences and sentence pieces."""
<add> paragraph_size = 0
<add> sentences = []
<add> for sentence in sentence_tokenizer.tokenize(text):
<add> sentencepiece_text = sentencepiece_pb2.SentencePieceText.FromString(
<add> processor.EncodeAsSerializedProto(sentence))
<add> paragraph_size += len(sentencepiece_text.pieces)
<add> sentences.append(sentencepiece_text)
<add> if sentence_metric:
<add> sentence_metric.update(len(sentencepiece_text.pieces))
<add> if paragraph_metric:
<add> paragraph_metric.update(paragraph_size)
<add> return Paragraph(sentences=sentences, size=paragraph_size)
<add>
<add>
<add>def read_question_answers(json_path: Text) -> List[QuestionAnswer]:
<add> """Read question answers."""
<add> with gfile.GFile(json_path) as f:
<add> data = json.load(f)['Data']
<add> question_answers = []
<add> for datum in data:
<add> question = Question(id=datum['QuestionId'], value=datum['Question'])
<add> if 'Answer' in datum:
<add> answer = Answer(
<add> value=datum['Answer']['Value'],
<add> aliases=datum['Answer']['Aliases'],
<add> normalized_aliases=datum['Answer']['NormalizedAliases'])
<add> else:
<add> answer = None
<add> evidence_info = []
<add> for key in ['EntityPages', 'SearchResults']:
<add> for document in datum.get(key, []):
<add> evidence_info.append(
<add> EvidenceInfo(
<add> id=document['Filename'], title=document['Title'], source=key))
<add> question_answers.append(
<add> QuestionAnswer(
<add> question=question, evidence_info=evidence_info, answer=answer))
<add> return question_answers
<add>
<add>
<add>def alias_answer(answer: Text, include=None):
<add> alias = answer.replace('_', ' ').lower()
<add> exclude = set(string.punctuation + ''.join(['‘', '’', '´', '`']))
<add> include = include or []
<add> alias = ''.join(c if c not in exclude or c in include else ' ' for c in alias)
<add> return ' '.join(alias.split()).strip()
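<add>
<add># e.g. alias_answer('U.S._Army') -> 'u s army', while
<add># alias_answer('U.S._Army', include=['.']) -> 'u.s. army'.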
<add>
<add>
<add>def make_answer_set(answer: Answer) -> Set[Text]:
<add> """Apply less aggressive normalization to the answer aliases."""
<add> answers = []
<add> for alias in [answer.value] + answer.aliases:
<add> answers.append(alias_answer(alias))
<add> answers.append(alias_answer(alias, [',', '.']))
<add> answers.append(alias_answer(alias, ['-']))
<add> answers.append(alias_answer(alias, [',', '.', '-']))
<add> answers.append(alias_answer(alias, string.punctuation))
<add> return set(answers + answer.normalized_aliases)
<add>
<add>
<add>def find_answer_spans(text: bytes, answer_set: Set[Text]) -> List[AnswerSpan]:
<add> """Find answer spans."""
<add> spans = []
<add> for answer in answer_set:
<add> answer_regex = re.compile(
<add> re.escape(answer).encode('utf-8').replace(b'\\ ', b'[ -]'),
<add> flags=re.IGNORECASE)
<add> for match in re.finditer(answer_regex, text):
<add> spans.append(
<add> AnswerSpan(
<add> begin=match.start(),
<add> end=match.end(),
<add> text=match.group(0).decode('utf-8')))
<add> return sorted(spans, key=operator.attrgetter('begin'))
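<add>
<add># e.g. find_answer_spans(b'an ice-cream van', {'ice cream'}) matches the
<add># bytes b'ice-cream': escaped spaces in each alias are rewritten to the
<add># character class [ -], so aliases also match hyphenated surface forms.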
<add>
<add>
<add>def realign_answer_span(features: Features, answer_set: Optional[Set[Text]],
<add> processor: spm.SentencePieceProcessor,
<add> span: AnswerSpan) -> Optional[AnswerSpan]:
<add> """Align answer span to text with given tokens."""
<add> i = bisect.bisect_left(features.token_offsets, span.begin)
<add> if i == len(features.token_offsets) or span.begin < features.token_offsets[i]:
<add> i -= 1
<add> j = i + 1
<add> answer_end = span.begin + len(span.text.encode('utf-8'))
<add> while (j < len(features.token_offsets) and
<add> features.token_offsets[j] < answer_end):
<add> j += 1
<add> j -= 1
<add> sp_answer = (
<add> features.context[features.token_offsets[i]:features.token_offsets[j + 1]]
<add> if j + 1 < len(features.token_offsets) else
<add> features.context[features.token_offsets[i]:])
<add> if (processor.IdToPiece(features.token_ids[i]).startswith('▁') and
<add> features.token_offsets[i] > 0):
<add> sp_answer = sp_answer[1:]
<add> sp_answer = evaluation.normalize_answer(sp_answer.decode('utf-8'))
<add> if answer_set is not None and sp_answer not in answer_set:
<add> # No need to warn if the cause was breaking word boundaries.
<add> if sp_answer and len(sp_answer) <= len(
<add> evaluation.normalize_answer(span.text)):
<add> logging.warning('%s: "%s" not in %s.', features.question_id, sp_answer,
<add> answer_set)
<add> return None
<add> return AnswerSpan(begin=i, end=j, text=span.text)
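<add>
<add># Alignment sketch: with token_offsets [0, 4, 9, 15] and a byte-level span
<add># starting at 5 with text of length 4, bisect_left returns 2, which is
<add># stepped back to token 1; j also resolves to 1, so (assuming the
<add># normalized-answer check passes) the result is the token span (1, 1).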
<add>
<add>
<add>def read_sentencepiece_model(path):
<add> with gfile.GFile(path, 'rb') as file:
<add> processor = spm.SentencePieceProcessor()
<add> processor.LoadFromSerializedProto(file.read())
<add> return processor
<add>
<add>
<add>class ReadEvidence(beam.DoFn):
<add> """Function to read evidence."""
<add>
<add> def __init__(self, wikipedia_dir: Text, web_dir: Text):
<add> self._wikipedia_dir = wikipedia_dir
<add> self._web_dir = web_dir
<add>
<add> def process(
<add> self, question_answer: QuestionAnswer
<add> ) -> Generator[QuestionAnswerEvidence, None, None]:
<add> for info in question_answer.evidence_info:
<add> if info.source == 'EntityPages':
<add> evidence_path = os.path.join(self._wikipedia_dir, info.id)
<add> elif info.source == 'SearchResults':
<add> evidence_path = os.path.join(self._web_dir, info.id)
<add> else:
<add> raise ValueError(f'Unknown evidence source: {info.source}.')
<add> with gfile.GFile(evidence_path, 'rb') as f:
<add> text = f.read().decode('utf-8')
<add> metrics.Metrics.counter('_', 'documents').inc()
<add> yield QuestionAnswerEvidence(
<add> question=question_answer.question,
<add> evidence=Evidence(info=info, text=text),
<add> answer=question_answer.answer)
<add>
<add>
<add>_CLS_PIECE = '<ans>'
<add>_EOS_PIECE = '</s>'
<add>_SEP_PIECE = '<sep_0>'
<add># _PARAGRAPH_SEP_PIECE = '<sep_1>'
<add>_NULL_PIECE = '<empty>'
<add>_QUESTION_PIECE = '<unused_34>'
<add>
<add>
<add>class MakeFeatures(beam.DoFn):
<add> """Function to make features."""
<add>
<add> def __init__(self, sentencepiece_model_path: Text, max_num_tokens: int,
<add> max_num_global_tokens: int, stride: int):
<add> self._sentencepiece_model_path = sentencepiece_model_path
<add> self._max_num_tokens = max_num_tokens
<add> self._max_num_global_tokens = max_num_global_tokens
<add> self._stride = stride
<add>
<add> def setup(self):
<add> self._sentence_tokenizer = nltk.data.load('tokenizers/punkt/english.pickle')
<add> self._sentencepiece_processor = read_sentencepiece_model(
<add> self._sentencepiece_model_path)
<add>
<add> def _make_features(self, stride_index: int, paragraph_texts: List[Text],
<add> paragraphs: List[Paragraph],
<add> question_answer_evidence: QuestionAnswerEvidence,
<add> ids: List[int],
<add> paragraph_offset: int) -> Tuple[int, Features]:
<add> global_ids = (
<add> [self._sentencepiece_processor.PieceToId(_CLS_PIECE)] +
<add> [self._sentencepiece_processor.PieceToId(_QUESTION_PIECE)] * len(ids))
<add> segment_ids = [i + 1 for i in range(len(ids))] # offset for CLS token
<add> token_ids, sentences = [], []
<add> offsets, offset, full_text = [-1] * len(ids), 0, True
<add> for i in range(paragraph_offset, len(paragraph_texts)):
<add> if i < len(paragraphs):
<add> paragraph = paragraphs[i]
<add> else:
<add> paragraphs.append(
<add> make_paragraph(
<add> self._sentence_tokenizer,
<add> self._sentencepiece_processor,
<add> paragraph_texts[i],
<add> paragraph_metric=metrics.Metrics.distribution(
<add> '_', 'paragraphs'),
<add> sentence_metric=metrics.Metrics.distribution('_', 'sentences')))
<add> paragraph = paragraphs[-1]
<add> for sentence in paragraph.sentences:
<add> if (len(ids) + len(token_ids) + len(sentence.pieces) + 1 >=
<add> self._max_num_tokens or
<add> len(global_ids) >= self._max_num_global_tokens):
<add> full_text = False
<add> break
<add> for j, piece in enumerate(sentence.pieces):
<add> token_ids.append(piece.id)
<add> segment_ids.append(len(global_ids))
<add> offsets.append(offset + piece.begin)
<add> if j == 0 and sentences:
<add> offsets[-1] -= 1
<add> offset += len(sentence.text.encode('utf-8')) + 1
<add> global_ids.append(self._sentencepiece_processor.PieceToId(_EOS_PIECE))
<add> sentences.append(sentence.text)
<add> if not full_text:
<add> break
<add> context = ' '.join(sentences).encode('utf-8')
<add> token_ids.append(self._sentencepiece_processor.PieceToId(_NULL_PIECE))
<add> offsets.append(len(context))
<add> segment_ids.append(0)
<add> next_paragraph_index = len(paragraph_texts)
<add> if not full_text and self._stride > 0:
<add> shift = paragraphs[paragraph_offset].size
<add> next_paragraph_index = paragraph_offset + 1
<add> while (next_paragraph_index < len(paragraphs) and
<add> shift + paragraphs[next_paragraph_index].size <= self._stride):
<add> shift += paragraphs[next_paragraph_index].size
<add> next_paragraph_index += 1
<add> return next_paragraph_index, Features(
<add> id='{}--{}'.format(question_answer_evidence.question.id,
<add> question_answer_evidence.evidence.info.id),
<add> stride_index=stride_index,
<add> question_id=question_answer_evidence.question.id,
<add> question=question_answer_evidence.question.value,
<add> context=context,
<add> token_ids=ids + token_ids,
<add> global_token_ids=global_ids,
<add> segment_ids=segment_ids,
<add> token_offsets=offsets)
<add>
<add> def process(
<add> self, question_answer_evidence: QuestionAnswerEvidence
<add> ) -> Generator[Features, None, None]:
<add> # Tokenize question which is shared among all examples.
<add> ids = (
<add> self._sentencepiece_processor.EncodeAsIds(
<add> question_answer_evidence.question.value) +
<add> [self._sentencepiece_processor.PieceToId(_SEP_PIECE)])
<add> paragraph_texts = list(
<add> filter(
<add> lambda p: p,
<add> map(lambda p: p.strip(),
<add> question_answer_evidence.evidence.text.split('\n'))))
<add> stride_index, paragraphs, paragraph_index = 0, [], 0
<add> while paragraph_index < len(paragraph_texts):
<add> paragraph_index, features = self._make_features(stride_index,
<add> paragraph_texts,
<add> paragraphs,
<add> question_answer_evidence,
<add> ids, paragraph_index)
<add> stride_index += 1
<add> yield features
<add>
<add>
<add>def _handle_exceptional_examples(
<add> features: Features,
<add> processor: spm.SentencePieceProcessor) -> List[AnswerSpan]:
<add> """Returns hand-crafted answer spans for known special cases in the data."""
<add> if features.id == 'qw_6687--Viola.txt':
<add> pattern = 'three strings in common—G, D, and A'.encode('utf-8')
<add> i = features.context.find(pattern)
<add> if i != -1:
<add> span = AnswerSpan(i + len(pattern) - 1, i + len(pattern), 'A')
<add> span = realign_answer_span(features, None, processor, span)
<add> assert span is not None, 'Span should exist.'
<add> return [span]
<add> if features.id == 'sfq_26183--Vitamin_A.txt':
<add> pattern = ('Vitamin A is a group of unsaturated nutritional organic '
<add> 'compounds that includes retinol').encode('utf-8')
<add> i = features.context.find(pattern)
<add> if i != -1:
<add> span = AnswerSpan(i + pattern.find(b'A'), i + pattern.find(b'A') + 1, 'A')
<add> span = realign_answer_span(features, None, processor, span)
<add> assert span is not None, 'Span should exist.'
<add> spans = [span]
<add> span = AnswerSpan(i, i + pattern.find(b'A') + 1, 'Vitamin A')
<add> span = realign_answer_span(features, None, processor, span)
<add> return spans + [span]
<add> if features.id == 'odql_292--Colombia.txt':
<add> pattern = b'Colombia is the third-most populous country in Latin America'
<add> i = features.context.find(pattern)
<add> if i != -1:
<add> span = AnswerSpan(i, i + len(b'Colombia'), 'Colombia')
<add> span = realign_answer_span(features, None, processor, span)
<add> assert span is not None, 'Span should exist.'
<add> return [span]
<add> if features.id == 'tc_1648--Vietnam.txt':
<add> pattern = 'Bảo Đại'.encode('utf-8')
<add> i = features.context.find(pattern)
<add> if i != -1:
<add> span = AnswerSpan(i, i + len(pattern), 'Bảo Đại')
<add> span = realign_answer_span(features, None, processor, span)
<add> assert span is not None, 'Span should exist.'
<add> return [span]
<add> if features.id == 'sfq_22225--Irish_mythology.txt':
<add> pattern = 'Tír na nÓg'.encode('utf-8')
<add> spans = []
<add> i = 0
<add> while features.context.find(pattern, i) != -1:
<add> i = features.context.find(pattern, i)
<add> span = AnswerSpan(i, i + len(pattern), 'Tír na nÓg')
<add> span = realign_answer_span(features, None, processor, span)
<add> assert span is not None, 'Span should exist.'
<add> spans.append(span)
<add> i += len(pattern)
<add> return spans
<add> return []
<add>
<add>
<add>class FindAnswerSpans(beam.DoFn):
<add> """Find answer spans in document."""
<add>
<add> def __init__(self, sentencepiece_model_path: Text):
<add> self._sentencepiece_model_path = sentencepiece_model_path
<add>
<add> def setup(self):
<add> self._sentencepiece_processor = read_sentencepiece_model(
<add> self._sentencepiece_model_path)
<add>
<add> def process(
<add> self,
<add> element: Tuple[Text, List[Features]],
<add> answer_sets: Dict[Text, Set[Text]],
<add> ) -> Generator[Tuple[Features, List[AnswerSpan]], None, None]:
<add> question_id, features = element
<add> answer_set = answer_sets[question_id]
<add> has_answer = False
<add> for feature in features:
<add> answer_spans = []
<add> for answer_span in find_answer_spans(feature.context, answer_set):
<add> realigned_answer_span = realign_answer_span(
<add> feature, answer_set, self._sentencepiece_processor, answer_span)
<add> if realigned_answer_span:
<add> answer_spans.append(realigned_answer_span)
<add> if not answer_spans:
<add> answer_spans = _handle_exceptional_examples(
<add> feature, self._sentencepiece_processor)
<add> if answer_spans:
<add> has_answer = True
<add> else:
<add> metrics.Metrics.counter('_', 'answerless_examples').inc()
<add> yield feature, answer_spans
<add> if not has_answer:
<add> metrics.Metrics.counter('_', 'answerless_questions').inc()
<add> logging.error('Question %s has no answer.', question_id)
<add>
<add>
<add>def make_example(
<add> features: Features,
<add> labels: Optional[List[AnswerSpan]] = None) -> Tuple[Text, Dict[Text, Any]]:
<add> """Make an example."""
<add> feature = {
<add> 'id': features.id,
<add> 'qid': features.question_id,
<add> 'question': features.question,
<add> 'context': features.context,
<add> 'token_ids': features.token_ids,
<add> 'token_offsets': features.token_offsets,
<add> 'segment_ids': features.segment_ids,
<add> 'global_token_ids': features.global_token_ids,
<add> }
<add> if labels:
<add> answers = set((label.begin, label.end) for label in labels)
<add> feature['answers'] = np.array([list(answer) for answer in answers],
<add> np.int64)
<add> else:
<add> feature['answers'] = np.zeros([0, 2], np.int64)
<add> metrics.Metrics.counter('_', 'examples').inc()
<add> return f'{features.id}--{features.stride_index}', feature
<add>
<add>
<add>def make_pipeline(root: beam.Pipeline, question_answers: List[QuestionAnswer],
<add> answer: bool, max_num_tokens: int, max_num_global_tokens: int,
<add> stride: int, sentencepiece_model_path: Text,
<add> wikipedia_dir: Text, web_dir: Text):
<add> """Makes a Beam pipeline."""
<add> question_answers = (
<add> root | 'CreateQuestionAnswers' >> beam.Create(question_answers))
<add> features = (
<add> question_answers
<add> | 'ReadEvidence' >> beam.ParDo(
<add> ReadEvidence(wikipedia_dir=wikipedia_dir, web_dir=web_dir))
<add> | 'MakeFeatures' >> beam.ParDo(
<add> MakeFeatures(
<add> sentencepiece_model_path=sentencepiece_model_path,
<add> max_num_tokens=max_num_tokens,
<add> max_num_global_tokens=max_num_global_tokens,
<add> stride=stride)))
<add> if answer:
<add> features = features | 'KeyFeature' >> beam.Map(
<add> lambda feature: (feature.question_id, feature))
<add> # pylint: disable=g-long-lambda
<add> answer_sets = (
<add> question_answers
<add> | 'MakeAnswerSet' >>
<add> beam.Map(lambda qa: (qa.question.id, make_answer_set(qa.answer))))
<add> # pylint: enable=g-long-lambda
<add> examples = (
<add> features
<add> | beam.GroupByKey()
<add> | 'FindAnswerSpans' >> beam.ParDo(
<add> FindAnswerSpans(sentencepiece_model_path),
<add> answer_sets=beam.pvalue.AsDict(answer_sets))
<add> | 'MakeExamplesWithLabels' >> beam.MapTuple(make_example))
<add> else:
<add> examples = features | 'MakeExamples' >> beam.Map(make_example)
<add> return examples
<ide><path>official/nlp/projects/triviaqa/sentencepiece_pb2.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add># -*- coding: utf-8 -*-
<add># pylint: disable=bad-continuation
<add># pylint: disable=protected-access
<add># Generated by the protocol buffer compiler. DO NOT EDIT!
<add>"""Generated protocol buffer code."""
<add>from google.protobuf import descriptor as _descriptor
<add>from google.protobuf import message as _message
<add>from google.protobuf import reflection as _reflection
<add>from google.protobuf import symbol_database as _symbol_database
<add># @@protoc_insertion_point(imports)
<add>
<add>_sym_db = _symbol_database.Default()
<add>
<add>DESCRIPTOR = _descriptor.FileDescriptor(
<add> name='third_party/sentencepiece/src/sentencepiece.proto',
<add> package='sentencepiece',
<add> syntax='proto2',
<add> serialized_options=None,
<add> create_key=_descriptor._internal_create_key,
<add> serialized_pb=b'\n1third_party/sentencepiece/src/sentencepiece.proto\x12\rsentencepiece\"\xdf\x01\n\x11SentencePieceText\x12\x0c\n\x04text\x18\x01 \x01(\t\x12>\n\x06pieces\x18\x02 \x03(\x0b\x32..sentencepiece.SentencePieceText.SentencePiece\x12\r\n\x05score\x18\x03 \x01(\x02\x1a\x62\n\rSentencePiece\x12\r\n\x05piece\x18\x01 \x01(\t\x12\n\n\x02id\x18\x02 \x01(\r\x12\x0f\n\x07surface\x18\x03 \x01(\t\x12\r\n\x05\x62\x65gin\x18\x04 \x01(\r\x12\x0b\n\x03\x65nd\x18\x05 \x01(\r*\t\x08\xc8\x01\x10\x80\x80\x80\x80\x02*\t\x08\xc8\x01\x10\x80\x80\x80\x80\x02\"J\n\x16NBestSentencePieceText\x12\x30\n\x06nbests\x18\x01 \x03(\x0b\x32 .sentencepiece.SentencePieceText'
<add>)
<add>
<add>_SENTENCEPIECETEXT_SENTENCEPIECE = _descriptor.Descriptor(
<add> name='SentencePiece',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece',
<add> filename=None,
<add> file=DESCRIPTOR,
<add> containing_type=None,
<add> create_key=_descriptor._internal_create_key,
<add> fields=[
<add> _descriptor.FieldDescriptor(
<add> name='piece',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece.piece',
<add> index=0,
<add> number=1,
<add> type=9,
<add> cpp_type=9,
<add> label=1,
<add> has_default_value=False,
<add> default_value=b''.decode('utf-8'),
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='id',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece.id',
<add> index=1,
<add> number=2,
<add> type=13,
<add> cpp_type=3,
<add> label=1,
<add> has_default_value=False,
<add> default_value=0,
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='surface',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece.surface',
<add> index=2,
<add> number=3,
<add> type=9,
<add> cpp_type=9,
<add> label=1,
<add> has_default_value=False,
<add> default_value=b''.decode('utf-8'),
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='begin',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece.begin',
<add> index=3,
<add> number=4,
<add> type=13,
<add> cpp_type=3,
<add> label=1,
<add> has_default_value=False,
<add> default_value=0,
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='end',
<add> full_name='sentencepiece.SentencePieceText.SentencePiece.end',
<add> index=4,
<add> number=5,
<add> type=13,
<add> cpp_type=3,
<add> label=1,
<add> has_default_value=False,
<add> default_value=0,
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> ],
<add> extensions=[],
<add> nested_types=[],
<add> enum_types=[],
<add> serialized_options=None,
<add> is_extendable=True,
<add> syntax='proto2',
<add> extension_ranges=[
<add> (200, 536870912),
<add> ],
<add> oneofs=[],
<add> serialized_start=183,
<add> serialized_end=281,
<add>)
<add>
<add>_SENTENCEPIECETEXT = _descriptor.Descriptor(
<add> name='SentencePieceText',
<add> full_name='sentencepiece.SentencePieceText',
<add> filename=None,
<add> file=DESCRIPTOR,
<add> containing_type=None,
<add> create_key=_descriptor._internal_create_key,
<add> fields=[
<add> _descriptor.FieldDescriptor(
<add> name='text',
<add> full_name='sentencepiece.SentencePieceText.text',
<add> index=0,
<add> number=1,
<add> type=9,
<add> cpp_type=9,
<add> label=1,
<add> has_default_value=False,
<add> default_value=b''.decode('utf-8'),
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='pieces',
<add> full_name='sentencepiece.SentencePieceText.pieces',
<add> index=1,
<add> number=2,
<add> type=11,
<add> cpp_type=10,
<add> label=3,
<add> has_default_value=False,
<add> default_value=[],
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> _descriptor.FieldDescriptor(
<add> name='score',
<add> full_name='sentencepiece.SentencePieceText.score',
<add> index=2,
<add> number=3,
<add> type=2,
<add> cpp_type=6,
<add> label=1,
<add> has_default_value=False,
<add> default_value=float(0),
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> ],
<add> extensions=[],
<add> nested_types=[
<add> _SENTENCEPIECETEXT_SENTENCEPIECE,
<add> ],
<add> enum_types=[],
<add> serialized_options=None,
<add> is_extendable=True,
<add> syntax='proto2',
<add> extension_ranges=[
<add> (200, 536870912),
<add> ],
<add> oneofs=[],
<add> serialized_start=69,
<add> serialized_end=292,
<add>)
<add>
<add>_NBESTSENTENCEPIECETEXT = _descriptor.Descriptor(
<add> name='NBestSentencePieceText',
<add> full_name='sentencepiece.NBestSentencePieceText',
<add> filename=None,
<add> file=DESCRIPTOR,
<add> containing_type=None,
<add> create_key=_descriptor._internal_create_key,
<add> fields=[
<add> _descriptor.FieldDescriptor(
<add> name='nbests',
<add> full_name='sentencepiece.NBestSentencePieceText.nbests',
<add> index=0,
<add> number=1,
<add> type=11,
<add> cpp_type=10,
<add> label=3,
<add> has_default_value=False,
<add> default_value=[],
<add> message_type=None,
<add> enum_type=None,
<add> containing_type=None,
<add> is_extension=False,
<add> extension_scope=None,
<add> serialized_options=None,
<add> file=DESCRIPTOR,
<add> create_key=_descriptor._internal_create_key),
<add> ],
<add> extensions=[],
<add> nested_types=[],
<add> enum_types=[],
<add> serialized_options=None,
<add> is_extendable=False,
<add> syntax='proto2',
<add> extension_ranges=[],
<add> oneofs=[],
<add> serialized_start=294,
<add> serialized_end=368,
<add>)
<add>
<add>_SENTENCEPIECETEXT_SENTENCEPIECE.containing_type = _SENTENCEPIECETEXT
<add>_SENTENCEPIECETEXT.fields_by_name[
<add> 'pieces'].message_type = _SENTENCEPIECETEXT_SENTENCEPIECE
<add>_NBESTSENTENCEPIECETEXT.fields_by_name[
<add> 'nbests'].message_type = _SENTENCEPIECETEXT
<add>DESCRIPTOR.message_types_by_name['SentencePieceText'] = _SENTENCEPIECETEXT
<add>DESCRIPTOR.message_types_by_name[
<add> 'NBestSentencePieceText'] = _NBESTSENTENCEPIECETEXT
<add>_sym_db.RegisterFileDescriptor(DESCRIPTOR)
<add>
<add>SentencePieceText = _reflection.GeneratedProtocolMessageType(
<add> 'SentencePieceText',
<add> (_message.Message,),
<add> {
<add> 'SentencePiece':
<add> _reflection.GeneratedProtocolMessageType(
<add> 'SentencePiece',
<add> (_message.Message,),
<add> {
<add> 'DESCRIPTOR':
<add> _SENTENCEPIECETEXT_SENTENCEPIECE,
<add> '__module__':
<add> 'official.nlp.projects.triviaqa.sentencepiece_pb2'
<add> # @@protoc_insertion_point(class_scope:sentencepiece.SentencePieceText.SentencePiece)
<add> }),
<add> 'DESCRIPTOR':
<add> _SENTENCEPIECETEXT,
<add> '__module__':
<add> 'official.nlp.projects.triviaqa.sentencepiece_pb2'
<add> # @@protoc_insertion_point(class_scope:sentencepiece.SentencePieceText)
<add> })
<add>_sym_db.RegisterMessage(SentencePieceText)
<add>_sym_db.RegisterMessage(SentencePieceText.SentencePiece)
<add>
<add>NBestSentencePieceText = _reflection.GeneratedProtocolMessageType(
<add> 'NBestSentencePieceText',
<add> (_message.Message,),
<add> {
<add> 'DESCRIPTOR': _NBESTSENTENCEPIECETEXT,
<add> '__module__': 'official.nlp.projects.triviaqa.sentencepiece_pb2'
<add> # @@protoc_insertion_point(class_scope:sentencepiece.NBestSentencePieceText)
<add> })
<add>_sym_db.RegisterMessage(NBestSentencePieceText)
<add>
<add># @@protoc_insertion_point(module_scope)
<ide><path>official/nlp/projects/triviaqa/train.py
<add># Copyright 2020 The TensorFlow Authors. All Rights Reserved.
<add>#
<add># Licensed under the Apache License, Version 2.0 (the "License");
<add># you may not use this file except in compliance with the License.
<add># You may obtain a copy of the License at
<add>#
<add># https://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing, software
<add># distributed under the License is distributed on an "AS IS" BASIS,
<add># WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add># See the License for the specific language governing permissions and
<add># limitations under the License.
<add>"""TriviaQA training script."""
<add>import collections
<add>import contextlib
<add>import functools
<add>import json
<add>import operator
<add>import os
<add>
<add>from absl import app
<add>from absl import flags
<add>from absl import logging
<add>import gin
<add>import tensorflow as tf
<add>import tensorflow_datasets as tfds
<add>
<add>import sentencepiece as spm
<add>from official.nlp import optimization as nlp_optimization
<add>from official.nlp.configs import encoders
<add>from official.nlp.projects.triviaqa import evaluation
<add>from official.nlp.projects.triviaqa import inputs
<add>from official.nlp.projects.triviaqa import modeling
<add>from official.nlp.projects.triviaqa import prediction
<add>
<add>flags.DEFINE_string('data_dir', None, 'Data directory for TensorFlow Datasets.')
<add>
<add>flags.DEFINE_string(
<add> 'validation_gold_path', None,
<add> 'Path to golden validation. Usually, the wikipedia-dev.json file.')
<add>
<add>flags.DEFINE_string('model_dir', None,
<add> 'Directory for checkpoints and summaries.')
<add>
<add>flags.DEFINE_string('model_config_path', None,
<add>                    'JSON file containing model configuration.')
<add>
<add>flags.DEFINE_string('sentencepiece_model_path', None,
<add> 'Path to sentence piece model.')
<add>
<add>flags.DEFINE_enum('encoder', 'bigbird',
<add> ['bert', 'bigbird', 'albert', 'mobilebert'],
<add> 'Which transformer encoder model to use.')
<add>
<add>flags.DEFINE_integer('bigbird_block_size', 64,
<add> 'Size of blocks for sparse block attention.')
<add>
<add>flags.DEFINE_string('init_checkpoint_path', None,
<add> 'Path from which to initialize weights.')
<add>
<add>flags.DEFINE_integer('train_sequence_length', 4096,
<add> 'Maximum number of tokens for training.')
<add>
<add>flags.DEFINE_integer('train_global_sequence_length', 320,
<add> 'Maximum number of global tokens for training.')
<add>
<add>flags.DEFINE_integer('validation_sequence_length', 4096,
<add> 'Maximum number of tokens for validation.')
<add>
<add>flags.DEFINE_integer('validation_global_sequence_length', 320,
<add> 'Maximum number of global tokens for validation.')
<add>
<add>flags.DEFINE_integer('batch_size', 32, 'Size of batch.')
<add>
<add>flags.DEFINE_string('master', '', 'Address of the TPU master.')
<add>
<add>flags.DEFINE_integer('decode_top_k', 8,
<add> 'Maximum number of tokens to consider for begin/end.')
<add>
<add>flags.DEFINE_integer('decode_max_size', 16,
<add> 'Maximum number of sentence pieces in an answer.')
<add>
<add>flags.DEFINE_float('dropout_rate', 0.1, 'Dropout rate for hidden layers.')
<add>
<add>flags.DEFINE_float('attention_dropout_rate', 0.3,
<add> 'Dropout rate for attention layers.')
<add>
<add>flags.DEFINE_float('label_smoothing', 1e-1, 'Degree of label smoothing.')
<add>
<add>flags.DEFINE_multi_string(
<add> 'gin_bindings', [],
<add> 'Gin bindings to override the values set in the config files')
<add>
<add>FLAGS = flags.FLAGS
<add>
<add>
<add>@contextlib.contextmanager
<add>def worker_context():
<add> if FLAGS.master:
<add> with tf.device('/job:worker') as d:
<add> yield d
<add> else:
<add> yield
<add>
<add>
<add>def read_sentencepiece_model(path):
<add> with tf.io.gfile.GFile(path, 'rb') as file:
<add> processor = spm.SentencePieceProcessor()
<add> processor.LoadFromSerializedProto(file.read())
<add> return processor
<add>
<add>
<add># Rename old BERT v1 configuration parameters.
<add>_MODEL_CONFIG_REPLACEMENTS = {
<add> 'num_hidden_layers': 'num_layers',
<add> 'attention_probs_dropout_prob': 'attention_dropout_rate',
<add> 'hidden_dropout_prob': 'dropout_rate',
<add> 'hidden_act': 'hidden_activation',
<add> 'window_size': 'block_size',
<add>}
<add>
<add>
<add>def read_model_config(encoder,
<add> path,
<add> bigbird_block_size=None) -> encoders.EncoderConfig:
<add> """Merges the JSON configuration into the encoder configuration."""
<add> with tf.io.gfile.GFile(path) as f:
<add> model_config = json.load(f)
<add> for key, value in _MODEL_CONFIG_REPLACEMENTS.items():
<add> if key in model_config:
<add> model_config[value] = model_config.pop(key)
<add> model_config['attention_dropout_rate'] = FLAGS.attention_dropout_rate
<add> model_config['dropout_rate'] = FLAGS.dropout_rate
<add> model_config['block_size'] = bigbird_block_size
<add> encoder_config = encoders.EncoderConfig(type=encoder)
<add> # Override the default config with those loaded from the JSON file.
<add> encoder_config_keys = encoder_config.get().as_dict().keys()
<add> overrides = {}
<add> for key, value in model_config.items():
<add> if key in encoder_config_keys:
<add> overrides[key] = value
<add> else:
<add> logging.warning('Ignoring config parameter %s=%s', key, value)
<add> encoder_config.get().override(overrides)
<add> return encoder_config
<add>
<add>
<add>@gin.configurable(blacklist=[
<add> 'model',
<add> 'strategy',
<add> 'train_dataset',
<add> 'model_dir',
<add> 'init_checkpoint_path',
<add> 'evaluate_fn',
<add>])
<add>def fit(model,
<add> strategy,
<add> train_dataset,
<add> model_dir,
<add> init_checkpoint_path=None,
<add> evaluate_fn=None,
<add> learning_rate=1e-5,
<add> learning_rate_polynomial_decay_rate=1.,
<add> weight_decay_rate=1e-1,
<add> num_warmup_steps=5000,
<add> num_decay_steps=51000,
<add> num_epochs=6):
<add> """Train and evaluate."""
<add> hparams = dict(
<add> learning_rate=learning_rate,
<add> num_decay_steps=num_decay_steps,
<add> num_warmup_steps=num_warmup_steps,
<add> num_epochs=num_epochs,
<add> weight_decay_rate=weight_decay_rate,
<add> dropout_rate=FLAGS.dropout_rate,
<add> attention_dropout_rate=FLAGS.attention_dropout_rate,
<add> label_smoothing=FLAGS.label_smoothing)
<add> logging.info(hparams)
<add> learning_rate_schedule = nlp_optimization.WarmUp(
<add> learning_rate,
<add> tf.keras.optimizers.schedules.PolynomialDecay(
<add> learning_rate,
<add> num_decay_steps,
<add> end_learning_rate=0.,
<add> power=learning_rate_polynomial_decay_rate), num_warmup_steps)
<add> with strategy.scope():
<add> optimizer = nlp_optimization.AdamWeightDecay(
<add> learning_rate_schedule,
<add> weight_decay_rate=weight_decay_rate,
<add> epsilon=1e-6,
<add> exclude_from_weight_decay=['LayerNorm', 'layer_norm', 'bias'])
<add> model.compile(optimizer, loss=modeling.SpanOrCrossEntropyLoss())
<add>
<add> def init_fn(init_checkpoint_path):
<add> ckpt = tf.train.Checkpoint(encoder=model.encoder)
<add> ckpt.restore(init_checkpoint_path).assert_existing_objects_matched()
<add>
<add> with worker_context():
<add> ckpt_manager = tf.train.CheckpointManager(
<add> tf.train.Checkpoint(model=model, optimizer=optimizer),
<add> model_dir,
<add> max_to_keep=None,
<add> init_fn=(functools.partial(init_fn, init_checkpoint_path)
<add> if init_checkpoint_path else None))
<add> with strategy.scope():
<add> ckpt_manager.restore_or_initialize()
<add> val_summary_writer = tf.summary.create_file_writer(
<add> os.path.join(model_dir, 'val'))
<add> best_exact_match = 0.
<add> for epoch in range(len(ckpt_manager.checkpoints), num_epochs):
<add> model.fit(
<add> train_dataset,
<add> callbacks=[
<add> tf.keras.callbacks.TensorBoard(model_dir, write_graph=False),
<add> ])
<add> ckpt_path = ckpt_manager.save()
<add> if evaluate_fn is None:
<add> continue
<add> metrics = evaluate_fn()
<add> logging.info('Epoch %d: %s', epoch + 1, metrics)
<add> if best_exact_match < metrics['exact_match']:
<add> best_exact_match = metrics['exact_match']
<add> model.save(os.path.join(model_dir, 'export'), include_optimizer=False)
<add> logging.info('Exporting %s as SavedModel.', ckpt_path)
<add> with val_summary_writer.as_default():
<add> for name, data in metrics.items():
<add> tf.summary.scalar(name, data, epoch + 1)
<add>
<add>
<add>def evaluate(sp_processor, features_map_fn, labels_map_fn, logits_fn,
<add> decode_logits_fn, split_and_pad_fn, distribute_strategy,
<add> validation_dataset, ground_truth):
<add> """Run evaluation."""
<add> loss_metric = tf.keras.metrics.Mean()
<add>
<add> @tf.function
<add> def update_loss(y, logits):
<add> loss_fn = modeling.SpanOrCrossEntropyLoss(
<add> reduction=tf.keras.losses.Reduction.NONE)
<add> return loss_metric(loss_fn(y, logits))
<add>
<add> predictions = collections.defaultdict(list)
<add> for _, (features, labels) in validation_dataset.enumerate():
<add> token_ids = features['token_ids']
<add> y = labels_map_fn(token_ids, labels)
<add> x = split_and_pad_fn(features_map_fn(features))
<add> logits = tf.concat(
<add> distribute_strategy.experimental_local_results(logits_fn(x)), 0)
<add> logits = logits[:features['token_ids'].shape[0]]
<add> update_loss(y, logits)
<add> end_limit = token_ids.row_lengths() - 1 # inclusive
<add> begin, end, scores = decode_logits_fn(logits, end_limit)
<add> answers = prediction.decode_answer(features['context'], begin, end,
<add> features['token_offsets'],
<add> end_limit).numpy()
<add> for _, (qid, token_id, offset, score, answer) in enumerate(
<add> zip(features['qid'].numpy(),
<add> tf.gather(features['token_ids'], begin, batch_dims=1).numpy(),
<add> tf.gather(features['token_offsets'], begin, batch_dims=1).numpy(),
<add> scores, answers)):
<add> if not answer:
<add> continue
<add> if sp_processor.IdToPiece(int(token_id)).startswith('▁') and offset > 0:
<add> answer = answer[1:]
<add> predictions[qid.decode('utf-8')].append((score, answer.decode('utf-8')))
<add> predictions = {
<add> qid: evaluation.normalize_answer(
<add> sorted(answers, key=operator.itemgetter(0), reverse=True)[0][1])
<add> for qid, answers in predictions.items()
<add> }
<add> metrics = evaluation.evaluate_triviaqa(ground_truth, predictions, mute=True)
<add> metrics['loss'] = loss_metric.result().numpy()
<add> return metrics
<add>
<add>
<add>def main(argv):
<add> if len(argv) > 1:
<add> raise app.UsageError('Too many command-line arguments.')
<add> gin.parse_config(FLAGS.gin_bindings)
<add> model_config = read_model_config(
<add> FLAGS.encoder,
<add> FLAGS.model_config_path,
<add> bigbird_block_size=FLAGS.bigbird_block_size)
<add> logging.info(model_config.get().as_dict())
<add> # Configure input processing.
<add> sp_processor = read_sentencepiece_model(FLAGS.sentencepiece_model_path)
<add> features_map_fn = functools.partial(
<add> inputs.features_map_fn,
<add> local_radius=FLAGS.bigbird_block_size,
<add> relative_pos_max_distance=24,
<add> use_hard_g2l_mask=True,
<add> padding_id=sp_processor.PieceToId('<pad>'),
<add> eos_id=sp_processor.PieceToId('</s>'),
<add> null_id=sp_processor.PieceToId('<empty>'),
<add> cls_id=sp_processor.PieceToId('<ans>'),
<add> sep_id=sp_processor.PieceToId('<sep_0>'))
<add> train_features_map_fn = tf.function(
<add> functools.partial(
<add> features_map_fn,
<add> sequence_length=FLAGS.train_sequence_length,
<add> global_sequence_length=FLAGS.train_global_sequence_length),
<add> autograph=False)
<add> train_labels_map_fn = tf.function(
<add> functools.partial(
<add> inputs.labels_map_fn, sequence_length=FLAGS.train_sequence_length))
<add> # Connect to TPU cluster.
<add> if FLAGS.master:
<add> resolver = tf.distribute.cluster_resolver.TPUClusterResolver(FLAGS.master)
<add> tf.config.experimental_connect_to_cluster(resolver)
<add> tf.tpu.experimental.initialize_tpu_system(resolver)
<add> strategy = tf.distribute.TPUStrategy(resolver)
<add> else:
<add> strategy = tf.distribute.MirroredStrategy()
<add> # Initialize datasets.
<add> with worker_context():
<add> _ = tf.random.get_global_generator()
<add> train_dataset = inputs.read_batches(
<add> FLAGS.data_dir,
<add> tfds.Split.TRAIN,
<add> FLAGS.batch_size,
<add> shuffle=True,
<add> drop_final_batch=True)
<add> validation_dataset = inputs.read_batches(FLAGS.data_dir,
<add> tfds.Split.VALIDATION,
<add> FLAGS.batch_size)
<add>
<add> def train_map_fn(x, y):
<add> features = train_features_map_fn(x)
<add> labels = modeling.smooth_labels(FLAGS.label_smoothing,
<add> train_labels_map_fn(x['token_ids'], y),
<add> features['question_lengths'],
<add> features['token_ids'])
<add> return features, labels
<add>
<add> train_dataset = train_dataset.map(train_map_fn, 16).prefetch(16)
<add> # Initialize model and compile.
<add> with strategy.scope():
<add> model = modeling.TriviaQaModel(model_config, FLAGS.train_sequence_length)
<add> logits_fn = tf.function(
<add> functools.partial(prediction.distributed_logits_fn, model))
<add> decode_logits_fn = tf.function(
<add> functools.partial(prediction.decode_logits, FLAGS.decode_top_k,
<add> FLAGS.decode_max_size))
<add> split_and_pad_fn = tf.function(
<add> functools.partial(prediction.split_and_pad, strategy, FLAGS.batch_size))
<add> # Evaluation strategy.
<add> with tf.io.gfile.GFile(FLAGS.validation_gold_path) as f:
<add> ground_truth = {
<add> datum['QuestionId']: datum['Answer'] for datum in json.load(f)['Data']
<add> }
<add> validation_features_map_fn = tf.function(
<add> functools.partial(
<add> features_map_fn,
<add> sequence_length=FLAGS.validation_sequence_length,
<add> global_sequence_length=FLAGS.validation_global_sequence_length),
<add> autograph=False)
<add> validation_labels_map_fn = tf.function(
<add> functools.partial(
<add> inputs.labels_map_fn,
<add> sequence_length=FLAGS.validation_sequence_length))
<add> evaluate_fn = functools.partial(
<add> evaluate,
<add> sp_processor=sp_processor,
<add> features_map_fn=validation_features_map_fn,
<add> labels_map_fn=validation_labels_map_fn,
<add> logits_fn=logits_fn,
<add> decode_logits_fn=decode_logits_fn,
<add> split_and_pad_fn=split_and_pad_fn,
<add> distribute_strategy=strategy,
<add> validation_dataset=validation_dataset,
<add> ground_truth=ground_truth)
<add> logging.info('Model initialized. Beginning training fit loop.')
<add> fit(model, strategy, train_dataset, FLAGS.model_dir,
<add> FLAGS.init_checkpoint_path, evaluate_fn)
<add>
<add>
<add>if __name__ == '__main__':
<add> flags.mark_flags_as_required([
<add> 'model_config_path', 'model_dir', 'sentencepiece_model_path',
<add> 'validation_gold_path'
<add> ])
<add> app.run(main)
| 11
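
Note on the schedule built in the `fit` function above: it wraps a polynomial decay inside a linear warmup instead of mutating the learning rate by hand. A minimal sketch of that composition, assuming the TensorFlow Model Garden's `official.nlp.optimization` package is importable; the rates and step counts here are illustrative, not the script's tuned defaults:

import tensorflow as tf
from official.nlp import optimization as nlp_optimization

# Polynomial decay from the peak rate down to zero over the decay horizon.
decay = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=1e-5,
    decay_steps=51000,
    end_learning_rate=0.,
    power=1.)
# WarmUp ramps linearly from 0 to the peak rate, then defers to `decay`.
schedule = nlp_optimization.WarmUp(1e-5, decay, 5000)
print(float(schedule(0)), float(schedule(2500)), float(schedule(10000)))

Because the schedule object is handed to the optimizer directly, no per-step learning-rate loop is needed in the training code.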
|
Python
|
Python
|
add python3 support for some examples
|
d811048887134013d3a0bcd9898f02ac3b4688b8
|
<ide><path>examples/mnist_acgan.py
<ide> from __future__ import print_function
<ide>
<ide> from collections import defaultdict
<del>import cPickle as pickle
<add>try:
<add> import cPickle as pickle
<add>except ImportError:
<add> import pickle
<ide> from PIL import Image
<ide>
<ide> from six.moves import range
<ide><path>examples/mnist_net2net.py
<ide> '''
<ide>
<ide> from __future__ import print_function
<add>from six.moves import xrange
<ide> import numpy as np
<ide> np.random.seed(1337)
<ide>
| 2
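
The try/except import fallback above is one way to stay Python 2/3 compatible; since the second hunk already pulls `xrange` from `six.moves`, the same module could cover pickle as well. A minimal sketch, assuming `six` is installed:

# Alternative to the try/except import dance: six.moves resolves the name
# to cPickle on Python 2 and pickle on Python 3.
from six.moves import cPickle as pickle
from six.moves import range  # xrange on Py2, range on Py3

data = pickle.dumps([i for i in range(3)])
assert pickle.loads(data) == [0, 1, 2]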
|
PHP
|
PHP
|
fix broken tests
|
431e610317513d5d20bd91cf2c8198be9a197edc
|
<ide><path>lib/Cake/Test/Case/Console/Command/Task/ExtractTaskTest.php
<ide> public function testExecute() {
<ide> $this->assertRegExp($pattern, $result);
<ide>
<ide> $pattern = '/\#: (\\\\|\/)extract\.ctp:14\n';
<del> $pattern .= '\#: (\\\\|\/)home\.ctp:99\n';
<add> $pattern .= '\#: (\\\\|\/)home\.ctp:100\n';
<ide> $pattern .= 'msgid "Editing this Page"\nmsgstr ""/';
<ide> $this->assertRegExp($pattern, $result);
<ide>
<ide><path>lib/Cake/Test/Case/Utility/DebuggerTest.php
<ide> public function testExcerpt() {
<ide> $pattern = '/<code>.*?<span style\="color\: \#\d+">.*?<\?php/';
<ide> $this->assertRegExp($pattern, $result[0]);
<ide>
<del> $result = Debugger::excerpt(__FILE__, 10, 2);
<add> $result = Debugger::excerpt(__FILE__, 11, 2);
<ide> $this->assertEquals(5, count($result));
<ide>
<ide> $pattern = '/<span style\="color\: \#\d{6}">\*<\/span>/';
| 2
|
PHP
|
PHP
|
improve french translation for validation
|
5a082257b72bc5ab5cfbd0fcc815e94a1f29b230
|
<ide><path>application/language/fr/validation.php
<ide> "alpha" => "Le champ :attribute ne doit contenir que des lettres.",
<ide> "alpha_dash" => "Le champ :attribute ne doit contenir que des lettres, nombres et des tirets.",
<ide> "alpha_num" => "Le champ :attribute ne doit contenir que des lettres et nombres.",
<del> "array" => "The :attribute must have selected elements.",
<add>    "array" => "Le champ :attribute doit avoir des éléments sélectionnés.",
<ide> "before" => "Le champ :attribute doit être une date avant :date.",
<ide> "between" => array(
<ide> "numeric" => "Le champ :attribute doit être entre :min - :max.",
<ide> "file" => "Le champ :attribute doit être entre :min - :max kilo-octets.",
<ide> "string" => "Le champ :attribute doit être entre :min - :max caractères.",
<ide> ),
<ide> "confirmed" => "Le champ :attribute confirmation est différent.",
<del> "count" => "The :attribute must have exactly :count selected elements.",
<del> "countbetween" => "The :attribute must have between :min and :max selected elements.",
<del> "countmax" => "The :attribute must have less than :max selected elements.",
<del> "countmin" => "The :attribute must have at least :min selected elements.",
<del> "different" => "Les champ :attribute et :other doivent être différents.",
<add> "count" => "Le champ :attribute doit avoir exactement :count éléments sélectionnés.",
<add> "countbetween" => "Le champ :attribute doit avoir entre :min et :max éléments sélectionnés.",
<add> "countmax" => "Le champ :attribute doit avoir moins de :max éléments sélectionnés.",
<add> "countmin" => "Le champ :attribute doit avoir au moins :min éléments sélectionnés.",
<add> "different" => "Les champs :attribute et :other doivent être différents.",
<ide> "email" => "Le format du champ :attribute est invalide.",
<ide> "exists" => "Le champ sélectionné :attribute est invalide.",
<ide> "image" => "Le champ :attribute doit être une image.",
<ide> "ip" => "Le champ :attribute doit être une adresse IP valide.",
<ide> "match" => "Le format du champ :attribute est invalide.",
<ide> "max" => array(
<del> "numeric" => "Le :attribute doit être plus petit que :max.",
<del> "file" => "Le :attribute doit être plus petit que :max kilo-octets.",
<del> "string" => "Le :attribute doit être plus petit que :max caractères.",
<add> "numeric" => "Le champ :attribute doit être plus petit que :max.",
<add> "file" => "Le champ :attribute doit être plus petit que :max kilo-octets.",
<add> "string" => "Le champ :attribute doit être plus petit que :max caractères.",
<ide> ),
<ide> "mimes" => "Le champ :attribute doit être un fichier de type: :values.",
<ide> "min" => array(
<ide> "numeric" => "Le champ :attribute doit être au moins :min.",
<del> "file" => "Le champ :attribute doit être au moins :min kilo-octets.",
<del> "string" => "Le champ :attribute doit être au moins :min caractères.",
<add> "file" => "Le champ :attribute doit être au moins de :min kilo-octets.",
<add> "string" => "Le champ :attribute doit avoir au moins :min caractères.",
<ide> ),
<ide> "not_in" => "Le champ sélectionné :attribute est invalide.",
<ide> "numeric" => "Le champ :attribute doit être un nombre.",
<ide> "required" => "Le champ :attribute est requis",
<del> "same" => "Le champ :attribute et :other doivent être identique.",
<add>    "same" => "Les champs :attribute et :other doivent être identiques.",
<ide> "size" => array(
<ide> "numeric" => "Le champ :attribute doit être :size.",
<ide> "file" => "Le champ :attribute doit être de :size kilo-octets.",
<ide> "string" => "Le champ :attribute doit être de :size caractères.",
<ide> ),
<ide> "unique" => "Le champ :attribute est déjà utilisé.",
<del> "url" => "Le champ :attribute à un format invalide.",
<add> "url" => "Le champ :attribute a un format invalide.",
<ide>
<ide> /*
<ide> |--------------------------------------------------------------------------
| 1
|
Python
|
Python
|
skip one more possibly failing c99 test
|
e9ae63eda3f9ac5076ba2d216c96e6e137270009
|
<ide><path>numpy/core/tests/test_umath.py
<ide> class TestC99(object):
<ide> def test_clog(self):
<ide> for p, v, e in [
<ide> ((-0., 0.), (-inf, pi), 'divide'),
<del> ((+0., 0.), (-inf, 0.), 'divide'),
<add> ((+0., 0.), (-inf, 0.), 'XXX divide'), # fails on OSX?
<ide> ((1., inf), (inf, pi/2), ''),
<ide> ((1., nan), (nan, nan), 'invalid-optional'),
<ide> ((-inf, 1.), (inf, pi), ''),
| 1
|
Go
|
Go
|
fix flaky test teststatsallnewcontainersadded
|
71d6e71cff4122a987a4655723d50a296916f523
|
<ide><path>integration-cli/docker_cli_stats_test.go
<ide> func (s *DockerSuite) TestStatsAllNewContainersAdded(c *check.C) {
<ide> }()
<ide>
<ide> out, _ := dockerCmd(c, "run", "-d", "busybox", "top")
<add> c.Assert(waitRun(strings.TrimSpace(out)), check.IsNil)
<ide> id <- strings.TrimSpace(out)[:12]
<ide>
<ide> select {
| 1
|
Javascript
|
Javascript
|
add dohop to showcase
|
46c3af994121be073cb38c191bfae3bb1f6d10be
|
<ide><path>website/src/react-native/showcase.js
<ide> var apps = [
<ide> ],
<ide> author: 'Genki Takiuchi (s21g Inc.)',
<ide> },
<add> {
<add> name: 'Dohop Flights',
<add> icon: 'http://a5.mzstatic.com/us/r30/Purple60/v4/3e/94/e9/3e94e9b3-f9a0-7b27-1824-b3da732ec967/icon175x175.jpeg',
<add> linkAppStore: 'https://itunes.apple.com/us/app/dohop-flights-your-new-flight/id964170399',
<add> linkPlayStore: 'https://play.google.com/store/apps/details?id=com.dohop',
<add> author: 'Dohop',
<add> },
<ide> {
<ide> name: 'DONUT chatrooms for communities',
<ide> icon: 'http://a2.mzstatic.com/eu/r30/Purple49/v4/d4/2d/e5/d42de510-6802-2694-1b60-ca80ffa1e2cb/icon175x175.png',
| 1
|
PHP
|
PHP
|
add unit tests to illustrate json assertion issues
|
4a9ccc9943e016e324b1df5e74dd244cd8cd5420
|
<ide><path>tests/Foundation/FoundationTestResponseTest.php
<ide> public function testAssertJsonFragment()
<ide> $response->assertJsonFragment(['foobar' => ['foobar_foo' => 'foo', 'foobar_bar' => 'bar']]);
<ide>
<ide> $response->assertJsonFragment(['foo' => 'bar 0', 'bar' => ['foo' => 'bar 0', 'bar' => 'foo 0']]);
<add>
<add> $response = TestResponse::fromBaseResponse(new Response(new JsonSerializableSingleResourceWithIntegersStub()));
<add>
<add> $response->assertJsonFragment(['id' => 10]);
<add>
<add> try {
<add> $response->assertJsonFragment(['id' => 1]);
<add>            $this->fail('Asserting id => 1, existing in JsonSerializableSingleResourceWithIntegersStub should fail');
<add> } catch (\PHPUnit\Framework\ExpectationFailedException $e) {
<add> }
<ide> }
<ide>
<ide> public function testAssertJsonStructure()
<ide> public function testAssertJsonCount()
<ide> $response->assertJsonCount(4);
<ide> }
<ide>
<add> public function testAssertJsonMissing()
<add> {
<add> $response = TestResponse::fromBaseResponse(new Response(new JsonSerializableSingleResourceWithIntegersStub));
<add>
<add> $response->assertJsonMissing(['id' => 2]);
<add> }
<add>
<ide> public function testAssertJsonMissingValidationErrors()
<ide> {
<ide> $baseResponse = tap(new Response, function ($response) {
<ide> public function jsonSerialize()
<ide> ];
<ide> }
<ide> }
<add>
<add>class JsonSerializableSingleResourceWithIntegersStub implements JsonSerializable
<add>{
<add> public function jsonSerialize()
<add> {
<add> return [
<add> ['id' => 10, 'foo' => 'bar'],
<add> ['id' => 20, 'foo' => 'bar'],
<add> ['id' => 30, 'foo' => 'bar'],
<add> ];
<add> }
<add>}
| 1
|
Javascript
|
Javascript
|
fix extra closing tag in renderbuffer
|
cc75d4e4e493d6322000d86f7939500fb46e7ec4
|
<ide><path>packages/ember-views/lib/system/render_buffer.js
<ide> Ember._RenderBuffer.prototype =
<ide> },
<ide>
<ide> generateElement: function() {
<del> var element = document.createElement(this.currentTagName()),
<add> var tagName = this.tagNames.pop(), // pop since we don't need to close
<add> element = document.createElement(tagName),
<ide> $element = Ember.$(element),
<ide> id = this.elementId,
<ide> classes = this.classes,
| 1
|
Text
|
Text
|
fix function name in process.md
|
1f6adff12cdf1d1ca6d8def2ef0b5abdff8c45b0
|
<ide><path>doc/api/process.md
<ide> added: v9.3.0
<ide>
<ide> * `fn` {Function|null}
<ide>
<del>The `process.setUncaughtExceptionCapture` function sets a function that will
<del>be invoked when an uncaught exception occurs, which will receive the exception
<del>value itself as its first argument.
<add>The `process.setUncaughtExceptionCaptureCallback()` function sets a function
<add>that will be invoked when an uncaught exception occurs, which will receive the
<add>exception value itself as its first argument.
<ide>
<ide> If such a function is set, the [`'uncaughtException'`][] event will
<ide> not be emitted. If `--abort-on-uncaught-exception` was passed from the
<ide> command line or set through [`v8.setFlagsFromString()`][], the process will
<ide> not abort.
<ide>
<del>To unset the capture function, `process.setUncaughtExceptionCapture(null)`
<del>may be used. Calling this method with a non-`null` argument while another
<del>capture function is set will throw an error.
<add>To unset the capture function,
<add>`process.setUncaughtExceptionCaptureCallback(null)` may be used. Calling this
<add>method with a non-`null` argument while another capture function is set will
<add>throw an error.
<ide>
<ide> Using this function is mutually exclusive with using the deprecated
<ide> [`domain`][] built-in module.
| 1
|
Python
|
Python
|
fix other open calls without context managers
|
d54631f68b2dc739bb6dd215ddc3ac14ee2465c6
|
<ide><path>spacy/cli/convert.py
<ide> def convert(
<ide> ner_map = srsly.read_json(ner_map) if ner_map is not None else None
<ide> doc_files = []
<ide> for input_loc in walk_directory(Path(input_path), converter):
<del> input_data = input_loc.open("r", encoding="utf-8").read()
<add> with input_loc.open("r", encoding="utf-8") as infile:
<add> input_data = infile.read()
<ide> # Use converter function to convert data
<ide> func = CONVERTERS[converter]
<ide> docs = func(
<ide><path>spacy/pipeline/entity_linker.py
<ide> def from_disk(
<ide>
<ide> def load_model(p):
<ide> try:
<del> self.model.from_bytes(p.open("rb").read())
<add> with p.open("rb") as infile:
<add> self.model.from_bytes(infile.read())
<ide> except AttributeError:
<ide> raise ValueError(Errors.E149) from None
<ide>
<ide><path>spacy/tests/tokenizer/test_tokenizer.py
<ide> def test_tokenizer_handles_long_text(tokenizer):
<ide> @pytest.mark.parametrize("file_name", ["sun.txt"])
<ide> def test_tokenizer_handle_text_from_file(tokenizer, file_name):
<ide> loc = ensure_path(__file__).parent / file_name
<del> text = loc.open("r", encoding="utf8").read()
<add> with loc.open("r", encoding="utf8") as infile:
<add> text = infile.read()
<ide> assert len(text) != 0
<ide> tokens = tokenizer(text)
<ide> assert len(tokens) > 100
| 3
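
The pattern fixed above is worth spelling out: a bare `path.open(...).read()` leaves closing the handle to the garbage collector, which raises ResourceWarning under warnings-as-errors on CPython and can leak file descriptors on interpreters without reference counting. A minimal before/after sketch; the file path is hypothetical:

from pathlib import Path

loc = Path("sun.txt")  # hypothetical path, for illustration only

# Before: the handle stays open until the object happens to be collected.
# text = loc.open("r", encoding="utf8").read()

# After: the context manager closes the handle when the block exits,
# even if read() raises.
with loc.open("r", encoding="utf8") as infile:
    text = infile.read()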
|
Python
|
Python
|
change loss and optimizer to new api
|
d6522e28732fd14a926440ef5f315e6a8e13792c
|
<ide><path>examples/lm_finetuning/finetune_on_pregenerated.py
<ide> def main():
<ide> help="Loss scaling to improve fp16 numeric stability. Only used when fp16 set to True.\n"
<ide> "0 (default value): dynamic loss scaling.\n"
<ide> "Positive power of 2: static loss scaling value.\n")
<del> parser.add_argument("--warmup_proportion",
<del> default=0.1,
<del> type=float,
<del> help="Proportion of training to perform linear learning rate warmup for. "
<del> "E.g., 0.1 = 10%% of training.")
<add> parser.add_argument("--warmup_steps",
<add> default=0,
<add> type=int,
<add> help="Linear warmup over warmup_steps.")
<ide> parser.add_argument("--learning_rate",
<ide> default=3e-5,
<ide> type=float,
<ide> def main():
<ide> optimizer = FP16_Optimizer(optimizer, dynamic_loss_scale=True)
<ide> else:
<ide> optimizer = FP16_Optimizer(optimizer, static_loss_scale=args.loss_scale)
<del> warmup_linear = WarmupLinearSchedule(warmup=args.warmup_proportion,
<del> t_total=num_train_optimization_steps)
<ide> else:
<del> optimizer = AdamW(optimizer_grouped_parameters,
<del> lr=args.learning_rate,
<del> warmup=args.warmup_proportion,
<del> t_total=num_train_optimization_steps)
<add> optimizer = AdamW(optimizer_grouped_parameters, lr=args.learning_rate, eps=args.adam_epsilon)
<add> scheduler = WarmupLinearSchedule(optimizer, warmup_steps=args.warmup_steps, t_total=num_train_optimization_steps)
<ide>
<ide> global_step = 0
<ide> logging.info("***** Running training *****")
<ide> def main():
<ide> for step, batch in enumerate(train_dataloader):
<ide> batch = tuple(t.to(device) for t in batch)
<ide> input_ids, input_mask, segment_ids, lm_label_ids, is_next = batch
<del> loss = model(input_ids, segment_ids, input_mask, lm_label_ids, is_next)
<add> outputs = model(input_ids, segment_ids, input_mask, lm_label_ids, is_next)
<add> loss = outputs[0]
<ide> if n_gpu > 1:
<ide> loss = loss.mean() # mean() to average on multi-gpu.
<ide> if args.gradient_accumulation_steps > 1:
<ide> def main():
<ide> mean_loss = tr_loss * args.gradient_accumulation_steps / nb_tr_steps
<ide> pbar.set_postfix_str(f"Loss: {mean_loss:.5f}")
<ide> if (step + 1) % args.gradient_accumulation_steps == 0:
<del> if args.fp16:
<del> # modify learning rate with special warm up BERT uses
<del> # if args.fp16 is False, BertAdam is used that handles this automatically
<del> lr_this_step = args.learning_rate * warmup_linear.get_lr(global_step, args.warmup_proportion)
<del> for param_group in optimizer.param_groups:
<del> param_group['lr'] = lr_this_step
<add> scheduler.step() # Update learning rate schedule
<ide> optimizer.step()
<ide> optimizer.zero_grad()
<ide> global_step += 1
<ide>
<ide> # Save a trained model
<del> if torch.distributed.get_rank() == 0:
<add>    if n_gpu > 1 and torch.distributed.get_rank() == 0 or n_gpu <= 1:
<ide> logging.info("** ** * Saving fine-tuned model ** ** * ")
<ide> model_to_save = model.module if hasattr(model, 'module') else model # Only save the model it-self
<ide>
<ide><path>examples/lm_finetuning/simple_lm_finetuning.py
<ide> def main():
<ide> default=3.0,
<ide> type=float,
<ide> help="Total number of training epochs to perform.")
<del> parser.add_argument("--warmup_proportion",
<del> default=0.1,
<del> type=float,
<del> help="Proportion of training to perform linear learning rate warmup for. "
<del> "E.g., 0.1 = 10%% of training.")
<add> parser.add_argument("--warmup_steps",
<add> default=0,
<add> type=int,
<add> help="Linear warmup over warmup_steps.")
<ide> parser.add_argument("--no_cuda",
<ide> action='store_true',
<ide> help="Whether not to use CUDA when available")
<ide> def main():
<ide>
<ide> if os.path.exists(args.output_dir) and os.listdir(args.output_dir):
<ide> raise ValueError("Output directory ({}) already exists and is not empty.".format(args.output_dir))
<del> if not os.path.exists(args.output_dir) and torch.distributed.get_rank() == 0:
<add>    if not os.path.exists(args.output_dir) and (n_gpu > 1 and torch.distributed.get_rank() == 0 or n_gpu <= 1):
<ide> os.makedirs(args.output_dir)
<ide>
<ide> tokenizer = BertTokenizer.from_pretrained(args.bert_model, do_lower_case=args.do_lower_case)
<ide> def main():
<ide> optimizer = FP16_Optimizer(optimizer, dynamic_loss_scale=True)
<ide> else:
<ide> optimizer = FP16_Optimizer(optimizer, static_loss_scale=args.loss_scale)
<del> warmup_linear = WarmupLinearSchedule(warmup=args.warmup_proportion,
<del> t_total=num_train_optimization_steps)
<ide>
<ide> else:
<del> optimizer = BertAdam(optimizer_grouped_parameters,
<del> lr=args.learning_rate,
<del> warmup=args.warmup_proportion,
<del> t_total=num_train_optimization_steps)
<add> optimizer = AdamW(optimizer_grouped_parameters, lr=args.learning_rate, eps=args.adam_epsilon)
<add> scheduler = WarmupLinearSchedule(optimizer, warmup_steps=args.warmup_steps, t_total=num_train_optimization_steps)
<ide>
<ide> global_step = 0
<ide> if args.do_train:
<ide> def main():
<ide> for step, batch in enumerate(tqdm(train_dataloader, desc="Iteration")):
<ide> batch = tuple(t.to(device) for t in batch)
<ide> input_ids, input_mask, segment_ids, lm_label_ids, is_next = batch
<del> loss = model(input_ids, segment_ids, input_mask, lm_label_ids, is_next)
<add> outputs = model(input_ids, segment_ids, input_mask, lm_label_ids, is_next)
<add> loss = outputs[0]
<ide> if n_gpu > 1:
<ide> loss = loss.mean() # mean() to average on multi-gpu.
<ide> if args.gradient_accumulation_steps > 1:
<ide> def main():
<ide> nb_tr_examples += input_ids.size(0)
<ide> nb_tr_steps += 1
<ide> if (step + 1) % args.gradient_accumulation_steps == 0:
<del> if args.fp16:
<del> # modify learning rate with special warm up BERT uses
<del> # if args.fp16 is False, BertAdam is used that handles this automatically
<del> lr_this_step = args.learning_rate * warmup_linear.get_lr(global_step, args.warmup_proportion)
<del> for param_group in optimizer.param_groups:
<del> param_group['lr'] = lr_this_step
<add> scheduler.step() # Update learning rate schedule
<ide> optimizer.step()
<ide> optimizer.zero_grad()
<ide> global_step += 1
<ide>
<ide> # Save a trained model
<del> model_to_save = model.module if hasattr(model, 'module') else model # Only save the model it-self
<del> output_model_file = os.path.join(args.output_dir, WEIGHTS_NAME)
<del> output_config_file = os.path.join(args.output_dir, CONFIG_NAME)
<del> if args.do_train and torch.distributed.get_rank() == 0:
<add>    if args.do_train and (n_gpu > 1 and torch.distributed.get_rank() == 0 or n_gpu <= 1):
<ide> logger.info("** ** * Saving fine - tuned model ** ** * ")
<add> model_to_save = model.module if hasattr(model, 'module') else model # Only save the model it-self
<add> output_model_file = os.path.join(args.output_dir, WEIGHTS_NAME)
<add> output_config_file = os.path.join(args.output_dir, CONFIG_NAME)
<ide> torch.save(model_to_save.state_dict(), output_model_file)
<ide> model_to_save.config.to_json_file(output_config_file)
<ide> tokenizer.save_vocabulary(args.output_dir)
| 2
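
The migration above splits what `BertAdam` used to do internally into two objects: an `AdamW` optimizer and a `WarmupLinearSchedule` that must be stepped once per update. A minimal training-loop sketch, assuming the pytorch-transformers-era API that the patch targets; the model and step counts are stand-ins:

import torch
from pytorch_transformers import AdamW, WarmupLinearSchedule

model = torch.nn.Linear(10, 2)  # stand-in for the BERT model
optimizer = AdamW(model.parameters(), lr=3e-5, eps=1e-8)
scheduler = WarmupLinearSchedule(optimizer, warmup_steps=100, t_total=1000)

for step in range(1000):
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    scheduler.step()   # update the learning rate, matching the patch's ordering
    optimizer.step()
    optimizer.zero_grad()

Unlike `BertAdam`, `AdamW` knows nothing about warmup or `t_total`, so forgetting `scheduler.step()` silently trains at a constant learning rate.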
|
Javascript
|
Javascript
|
fix lint errors
|
230185fb7405f5aa8820a3c501567970e7477518
|
<ide><path>src/main-process/atom-application.js
<ide> module.exports = class AtomApplication extends EventEmitter {
<ide> this.promptForRestart()
<ide> );
<ide> });
<del>
<ide> await this.configFilePromise;
<ide> }
<ide>
| 1
|
Ruby
|
Ruby
|
create venvs with access to system site packages
|
3d3b9874f93e30b75d2d1595cb207cd7f03169b9
|
<ide><path>Library/Homebrew/language/python.rb
<ide> def initialize(formula, venv_root, python)
<ide> def create
<ide> return if (@venv_root/"bin/python").exist?
<ide>
<del> @formula.system @python, "-m", "venv", @venv_root
<add> @formula.system @python, "-m", "venv", "--system-site-packages", @venv_root
<ide>
<ide> # Robustify symlinks to survive python patch upgrades
<ide> @venv_root.find do |f|
<ide><path>Library/Homebrew/test/language/python/virtualenv_spec.rb
<ide>
<ide> describe "#create" do
<ide> it "creates a venv" do
<del> expect(formula).to receive(:system).with("python", "-m", "venv", dir)
<add> expect(formula).to receive(:system).with("python", "-m", "venv", "--system-site-packages", dir)
<ide> virtualenv.create
<ide> end
<ide> end
| 2
|
Java
|
Java
|
remove unnecessary calls to disablecontentcaching
|
c7e037da39d2c0d57e0b895504c7a296fe77e43a
|
<ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/HttpEntityMethodProcessor.java
<ide> import org.springframework.web.bind.support.WebDataBinderFactory;
<ide> import org.springframework.web.context.request.NativeWebRequest;
<ide> import org.springframework.web.context.request.ServletWebRequest;
<del>import org.springframework.web.filter.ShallowEtagHeaderFilter;
<ide> import org.springframework.web.method.support.ModelAndViewContainer;
<ide> import org.springframework.web.servlet.mvc.support.RedirectAttributes;
<ide> import org.springframework.web.servlet.support.RequestContextUtils;
<ide> public void handleReturnValue(@Nullable Object returnValue, MethodParameter retu
<ide> if ((HttpMethod.GET.equals(method) || HttpMethod.HEAD.equals(method))
<ide> && isResourceNotModified(inputMessage, outputMessage)) {
<ide> outputMessage.flush();
<del> ShallowEtagHeaderFilter.disableContentCaching(inputMessage.getServletRequest());
<ide> return;
<ide> }
<ide> }
<ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/ServletInvocableHandlerMethod.java
<ide> import org.springframework.web.bind.annotation.ResponseBody;
<ide> import org.springframework.web.bind.annotation.ResponseStatus;
<ide> import org.springframework.web.context.request.ServletWebRequest;
<del>import org.springframework.web.filter.ShallowEtagHeaderFilter;
<ide> import org.springframework.web.method.HandlerMethod;
<ide> import org.springframework.web.method.support.HandlerMethodReturnValueHandler;
<ide> import org.springframework.web.method.support.HandlerMethodReturnValueHandlerComposite;
<ide> private void disableContentCachingIfNecessary(ServletWebRequest webRequest) {
<ide> if (StringUtils.hasText(response.getHeader(HttpHeaders.ETAG))) {
<ide> HttpServletRequest request = webRequest.getNativeRequest(HttpServletRequest.class);
<ide> Assert.notNull(request, "Expected HttpServletRequest");
<del> ShallowEtagHeaderFilter.disableContentCaching(request);
<ide> }
<ide> }
<ide> }
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/mvc/method/annotation/HttpEntityMethodProcessorMockTests.java
<ide> import java.util.Date;
<ide> import java.util.Set;
<ide>
<add>import javax.servlet.FilterChain;
<add>
<ide> import org.junit.jupiter.api.BeforeEach;
<ide> import org.junit.jupiter.api.Test;
<ide> import org.mockito.ArgumentCaptor;
<ide> import org.springframework.web.HttpMediaTypeNotSupportedException;
<ide> import org.springframework.web.bind.annotation.RequestMapping;
<ide> import org.springframework.web.context.request.ServletWebRequest;
<add>import org.springframework.web.filter.ShallowEtagHeaderFilter;
<ide> import org.springframework.web.method.support.ModelAndViewContainer;
<ide> import org.springframework.web.testfixture.servlet.MockHttpServletRequest;
<ide> import org.springframework.web.testfixture.servlet.MockHttpServletResponse;
<ide> public void handleEtagWithHttp304() throws Exception {
<ide> assertConditionalResponse(HttpStatus.NOT_MODIFIED, null, etagValue, -1);
<ide> }
<ide>
<add> @Test
<add> public void handleEtagWithHttp304AndEtagFilterHasNoImpact() throws Exception {
<add>
<add> String eTagValue = "\"deadb33f8badf00d\"";
<add>
<add> FilterChain chain = (req, res) -> {
<add> servletRequest.addHeader(HttpHeaders.IF_NONE_MATCH, eTagValue);
<add> ResponseEntity<String> returnValue = ResponseEntity.ok().eTag(eTagValue).body("body");
<add> initStringMessageConversion(TEXT_PLAIN);
<add> try {
<add> processor.handleReturnValue(returnValue, returnTypeResponseEntity, mavContainer, webRequest);
<add> }
<add> catch (Exception ex) {
<add> throw new IllegalStateException(ex);
<add> }
<add> };
<add>
<add> new ShallowEtagHeaderFilter().doFilter(this.servletRequest, this.servletResponse, chain);
<add>
<add> assertConditionalResponse(HttpStatus.NOT_MODIFIED, null, eTagValue, -1);
<add> }
<add>
<ide> @Test // SPR-14559
<ide> public void shouldHandleInvalidIfNoneMatchWithHttp200() throws Exception {
<ide> String etagValue = "\"deadb33f8badf00d\"";
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/mvc/method/annotation/ServletInvocableHandlerMethodTests.java
<ide> import java.util.Collections;
<ide> import java.util.List;
<ide>
<add>import javax.servlet.FilterChain;
<ide> import javax.servlet.http.HttpServletResponse;
<ide>
<ide> import org.junit.jupiter.api.Test;
<ide> public void invokeAndHandle_VoidRequestNotModified() throws Exception {
<ide> .isTrue();
<ide> }
<ide>
<del> @Test // gh-23775
<add> @Test
<ide> public void invokeAndHandle_VoidNotModifiedWithEtag() throws Exception {
<del> String etag = "\"deadb33f8badf00d\"";
<del> this.request.addHeader(HttpHeaders.IF_NONE_MATCH, etag);
<del> this.webRequest.checkNotModified(etag);
<ide>
<del> ServletInvocableHandlerMethod handlerMethod = getHandlerMethod(new Handler(), "notModified");
<del> handlerMethod.invokeAndHandle(this.webRequest, this.mavContainer);
<add> String eTagValue = "\"deadb33f8badf00d\"";
<ide>
<del> assertThat(this.mavContainer.isRequestHandled())
<del> .as("Null return value + 'not modified' request should result in 'request handled'")
<del> .isTrue();
<add> FilterChain chain = (req, res) -> {
<add> request.addHeader(HttpHeaders.IF_NONE_MATCH, eTagValue);
<add> webRequest.checkNotModified(eTagValue);
<add>
<add> try {
<add> ServletInvocableHandlerMethod handlerMethod = getHandlerMethod(new Handler(), "notModified");
<add> handlerMethod.invokeAndHandle(webRequest, mavContainer);
<add> }
<add> catch (Exception ex) {
<add> throw new IllegalStateException(ex);
<add> }
<add> };
<add>
<add> new ShallowEtagHeaderFilter().doFilter(this.request, this.response, chain);
<ide>
<del> assertThat(this.request.getAttribute(ShallowEtagHeaderFilter.class.getName() + ".STREAMING"))
<del> .isEqualTo(true);
<add> assertThat(response.getStatus()).isEqualTo(304);
<add> assertThat(response.getHeader(HttpHeaders.ETAG)).isEqualTo(eTagValue);
<add> assertThat(response.getContentAsString()).isEmpty();
<ide> }
<ide>
<ide> @Test // SPR-9159
<ide> public void invokeAndHandle_NotVoidWithResponseStatusAndReason() throws Exceptio
<ide> .as("When a status reason w/ used, the request is handled").isTrue();
<ide> }
<ide>
<add> @Test // gh-23775, gh-24635
<add> public void invokeAndHandle_ETagFilterHasNoImpactWhenETagPresent() throws Exception {
<add>
<add> String eTagValue = "\"deadb33f8badf00d\"";
<add>
<add> FilterChain chain = (req, res) -> {
<add> request.addHeader(HttpHeaders.IF_NONE_MATCH, eTagValue);
<add> webRequest.checkNotModified(eTagValue);
<add>
<add> try {
<add> ServletInvocableHandlerMethod handlerMethod = getHandlerMethod(new Handler(), "notModified");
<add> handlerMethod.invokeAndHandle(webRequest, mavContainer);
<add> }
<add> catch (Exception ex) {
<add> throw new IllegalStateException(ex);
<add> }
<add> };
<add>
<add> new ShallowEtagHeaderFilter().doFilter(this.request, this.response, chain);
<add>
<add> assertThat(this.response.getStatus()).isEqualTo(304);
<add> assertThat(this.response.getHeader(HttpHeaders.ETAG)).isEqualTo(eTagValue);
<add> assertThat(this.response.getContentAsString()).isEmpty();
<add> }
<add>
<ide> @Test
<ide> public void invokeAndHandle_Exception() throws Exception {
<ide> this.returnValueHandlers.addHandler(new ExceptionRaisingReturnValueHandler());
| 4
|
Go
|
Go
|
add error checking for hostport range
|
135f8f967488f171fcda6c66d7fa35726edaf19c
|
<ide><path>cli/compose/convert/service.go
<ide> func convertEndpointSpec(source []string) (*swarm.EndpointSpec, error) {
<ide> }
<ide>
<ide> for port := range ports {
<del> portConfigs = append(
<del> portConfigs,
<del> opts.ConvertPortToPortConfig(port, portBindings)...)
<add> portConfig, err := opts.ConvertPortToPortConfig(port, portBindings)
<add> if err != nil {
<add> return nil, err
<add> }
<add> portConfigs = append(portConfigs, portConfig...)
<ide> }
<ide>
<ide> return &swarm.EndpointSpec{Ports: portConfigs}, nil
<ide><path>opts/port.go
<ide> func (p *PortOpt) Set(value string) error {
<ide> ports, portBindings, _ := nat.ParsePortSpecs([]string{value})
<ide>
<ide> for port := range ports {
<del> portConfigs = append(portConfigs, ConvertPortToPortConfig(port, portBindings)...)
<add> portConfig, err := ConvertPortToPortConfig(port, portBindings)
<add> if err != nil {
<add> return err
<add> }
<add> portConfigs = append(portConfigs, portConfig...)
<ide> }
<ide> p.ports = append(p.ports, portConfigs...)
<ide> }
<ide> func (p *PortOpt) Value() []swarm.PortConfig {
<ide> func ConvertPortToPortConfig(
<ide> port nat.Port,
<ide> portBindings map[nat.Port][]nat.PortBinding,
<del>) []swarm.PortConfig {
<add>) ([]swarm.PortConfig, error) {
<ide> ports := []swarm.PortConfig{}
<ide>
<ide> for _, binding := range portBindings[port] {
<del> hostPort, _ := strconv.ParseUint(binding.HostPort, 10, 16)
<add> hostPort, err := strconv.ParseUint(binding.HostPort, 10, 16)
<add> if err != nil && binding.HostPort != "" {
<add> return nil, fmt.Errorf("invalid hostport binding (%s) for port (%s)", binding.HostPort, port.Port())
<add> }
<ide> ports = append(ports, swarm.PortConfig{
<ide> //TODO Name: ?
<ide> Protocol: swarm.PortConfigProtocol(strings.ToLower(port.Proto())),
<ide> func ConvertPortToPortConfig(
<ide> PublishMode: swarm.PortConfigPublishModeIngress,
<ide> })
<ide> }
<del> return ports
<add> return ports, nil
<ide> }
| 2
|
PHP
|
PHP
|
add test for booting callbacks
|
35df766a03233f41dfd9293c39a92618170f5dda
|
<ide><path>tests/Foundation/FoundationApplicationTest.php
<ide> public function testAfterBootstrappingAddsClosure()
<ide>
<ide> public function testBootingCallbacks()
<ide> {
<del> $app = new Application;
<add> $application = new Application;
<ide>
<ide> $counter = 0;
<del> $closure = function ($app) use (&$counter) {
<add> $closure = function ($app) use (&$counter, $application) {
<ide> $counter++;
<del> $this->assertInstanceOf(Application::class, $app);
<add> $this->assertSame($application, $app);
<ide> };
<ide>
<del> $closure2 = function ($app) use (&$counter) {
<add> $closure2 = function ($app) use (&$counter, $application) {
<ide> $counter++;
<del> $this->assertInstanceOf(Application::class, $app);
<add> $this->assertSame($application, $app);
<ide> };
<ide>
<del> $app->booting($closure);
<del> $app->booting($closure2);
<del> $app->boot();
<add> $application->booting($closure);
<add> $application->booting($closure2);
<add>
<add> $application->boot();
<add>
<ide> $this->assertEquals(2, $counter);
<ide> }
<add>
<add> public function testBootedCallbacks()
<add> {
<add> $application = new Application;
<add>
<add> $counter = 0;
<add> $closure = function ($app) use (&$counter, $application) {
<add> $counter++;
<add> $this->assertSame($application, $app);
<add> };
<add>
<add> $closure2 = function ($app) use (&$counter, $application) {
<add> $counter++;
<add> $this->assertSame($application, $app);
<add> };
<add>
<add> $closure3 = function ($app) use (&$counter, $application) {
<add> $counter++;
<add> $this->assertSame($application, $app);
<add> };
<add>
<add> $application->booting($closure);
<add> $application->booted($closure);
<add> $application->booted($closure2);
<add> $application->boot();
<add>
<add> $this->assertEquals(3, $counter);
<add>
<add> $application->booted($closure3);
<add>
<add> $this->assertEquals(4, $counter);
<add> }
<ide> }
<ide>
<ide> class ApplicationBasicServiceProviderStub extends ServiceProvider
| 1
|
Javascript
|
Javascript
|
use single quotes in js string
|
fa70560eba330e2b8f31add9d93d8f4a2173607a
|
<ide><path>src/state-store.js
<ide> class StateStore {
<ide> resolve(dbOpenRequest.result)
<ide> }
<ide> dbOpenRequest.onerror = (error) => {
<del> console.error("Could not connect to indexedDB", error)
<add> console.error('Could not connect to indexedDB', error)
<ide> resolve(null)
<ide> }
<ide> })
| 1
|
PHP
|
PHP
|
use text class
|
b0fcaae134abeec996433b5a7362967805af5abd
|
<ide><path>src/Utility/Text.php
<ide> public static function wordWrap($text, $width = 72, $break = "\n", $cut = false)
<ide> {
<ide> $paragraphs = explode($break, $text);
<ide> foreach ($paragraphs as &$paragraph) {
<del> $paragraph = String::_wordWrap($paragraph, $width, $break, $cut);
<add> $paragraph = static::_wordWrap($paragraph, $width, $break, $cut);
<ide> }
<ide> return implode($break, $paragraphs);
<ide> }
<ide><path>tests/TestCase/Utility/TextTest.php
<ide> public function testWordWrapUnicodeAware()
<ide> public function testWordWrapNewlineAware() {
<ide> $text = 'This is a line that is almost the 55 chars long.
<ide> This is a new sentence which is manually newlined, but is so long it needs two lines.';
<del> $result = String::wordWrap($text, 55);
<add> $result = Text::wordWrap($text, 55);
<ide> $expected = <<<TEXT
<ide> This is a line that is almost the 55 chars long.
<ide> This is a new sentence which is manually newlined, but
<ide> public function testWrap()
<ide> This is the song th
<ide> at never ends. This
<ide> is the song that n
<del>ever ends. This is
<add>ever ends. This is
<ide> the song that never
<ide> ends.
<ide> TEXT;
| 2
|
Javascript
|
Javascript
|
remove words "as well" to prevent text wrapping
|
09cb38aa211ce20cbb49b916043f3319f2325313
|
<ide><path>client/src/components/Supporters.js
<ide> function Supporters({ isDonating, activeDonations }) {
<ide> <br />
<ide> <br />
<ide> Do you know anyone who's interested in technology? Encourage
<del> them to join the community as well.
<add> them to join the community.
<ide> </Fragment>
<ide> ) : (
<ide> `Join ${commaNumber(
| 1
|
PHP
|
PHP
|
add space before catch in connection
|
3bbac011c3e604a5ff16776b9a393038378c2d9f
|
<ide><path>src/Database/Connection.php
<ide> public function connect() {
<ide> try {
<ide> $this->_driver->connect();
<ide> return true;
<del> } catch(\Exception $e) {
<add> } catch (\Exception $e) {
<ide> throw new MissingConnectionException(['reason' => $e->getMessage()]);
<ide> }
<ide> }
| 1
|
Ruby
|
Ruby
|
fix some warnings in i18n lib
|
8c2e839e5a0fb1662ae867c70114c3fc91850a55
|
<ide><path>activesupport/lib/active_support/vendor/i18n-0.0.1/lib/i18n.rb
<ide> def exception_handler=(exception_handler)
<ide> # storage. Decoupled for backends like a db backend that persist their
<ide> # translations, so the backend can decide whether/when to yield or not.
<ide> def populate(&block)
<del> backend.populate &block
<add> backend.populate(&block)
<ide> end
<ide>
<ide> # Stores translations for the given locale in the backend.
<ide> def default_exception_handler(exception, locale, key, options)
<ide> # keys are Symbols.
<ide> def normalize_translation_keys(locale, key, scope)
<ide> keys = [locale] + Array(scope) + [key]
<del> keys = keys.map{|key| key.to_s.split(/\./) }
<del> keys.flatten.map{|key| key.to_sym}
<add> keys = keys.map{|k| k.to_s.split(/\./) }
<add> keys.flatten.map{|k| k.to_sym}
<ide> end
<ide> end
<ide> end
<ide><path>activesupport/lib/active_support/vendor/i18n-0.0.1/lib/i18n/backend/simple.rb
<ide> def store_translations(locale, data)
<ide>
<ide> def translate(locale, key, options = {})
<ide> raise InvalidLocale.new(locale) if locale.nil?
<del> return key.map{|key| translate locale, key, options } if key.is_a? Array
<add> return key.map{|k| translate locale, k, options } if key.is_a? Array
<ide>
<ide> reserved = :scope, :default
<ide> count, scope, default = options.values_at(:count, *reserved)
<ide> def localize(locale, object, format = :default)
<ide> def lookup(locale, key, scope = [])
<ide> return unless key
<ide> keys = I18n.send :normalize_translation_keys, locale, key, scope
<del> keys.inject(@@translations){|result, key| result[key.to_sym] or return nil }
<add> keys.inject(@@translations){|result, k| result[k.to_sym] or return nil }
<ide> end
<ide>
<ide> # Evaluates a default translation.
| 2
|
Javascript
|
Javascript
|
fix bug in scrollintoview
|
65c8549759de630ae3c1cf687035a973f45e7afc
|
<ide><path>web/ui_utils.js
<ide> function scrollIntoView(element, spot, skipOverflowHiddenElements = false) {
<ide> }
<ide> let offsetY = element.offsetTop + element.clientTop;
<ide> let offsetX = element.offsetLeft + element.clientLeft;
<del> while (parent.clientHeight === parent.scrollHeight ||
<add> while ((parent.clientHeight === parent.scrollHeight &&
<add> parent.clientWidth === parent.scrollWidth) ||
<ide> (skipOverflowHiddenElements &&
<ide> getComputedStyle(parent).overflow === 'hidden')) {
<ide> if (parent.dataset._scaleY) {
| 1
|
Python
|
Python
|
add tests for check_fit with deg specified as list
|
4dd71a3ab7bd019e36998e7c5f98ec2345539f18
|
<ide><path>numpy/polynomial/tests/test_classes.py
<ide> def f(x):
<ide> assert_almost_equal(p(x), y)
<ide> assert_almost_equal(p.domain, d)
<ide> assert_almost_equal(p.window, w)
<add> p = Poly.fit(x, y, [0, 1, 2, 3], domain=d, window=w)
<add> assert_almost_equal(p(x), y)
<add> assert_almost_equal(p.domain, d)
<add> assert_almost_equal(p.window, w)
<ide>
<ide> # check with class domain default
<ide> p = Poly.fit(x, y, 3, [])
<ide> assert_equal(p.domain, Poly.domain)
<ide> assert_equal(p.window, Poly.window)
<add> p = Poly.fit(x, y, [0, 1, 2, 3], [])
<add> assert_equal(p.domain, Poly.domain)
<add> assert_equal(p.window, Poly.window)
<ide>
<ide> # check that fit accepts weights.
<ide> w = np.zeros_like(x)
<ide> z = y + random(y.shape)*.25
<ide> w[::2] = 1
<ide> p1 = Poly.fit(x[::2], z[::2], 3)
<ide> p2 = Poly.fit(x, z, 3, w=w)
<add> p3 = Poly.fit(x, z, [0, 1, 2, 3], w=w)
<ide> assert_almost_equal(p1(x), p2(x))
<add> assert_almost_equal(p2(x), p3(x))
<ide>
<ide>
<ide> def check_equal(Poly):
| 1
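
What the new tests exercise: `fit` accepts either a single integer degree or a list of term degrees, where `[0, 1, 2, 3]` is equivalent to `3` and a sparse list pins the omitted coefficients to zero. A minimal sketch, assuming a numpy version that supports list-valued `deg` (the feature these tests cover):

import numpy as np
from numpy.polynomial import Polynomial

x = np.linspace(-1, 1, 50)
y = 1 + 2 * x + 3 * x ** 3  # no quadratic term

p_full = Polynomial.fit(x, y, 3)          # fits all terms up to degree 3
p_list = Polynomial.fit(x, y, [0, 1, 3])  # quadratic coefficient forced to zero
np.testing.assert_allclose(p_list(x), y, atol=1e-8)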
|
Text
|
Text
|
add devnexen to collaborators
|
ecd98428b0e9734986f519135a9fe8d597b52d6e
|
<ide><path>README.md
<ide> For information about the governance of the Node.js project, see
<ide> **David Cai** <davidcai1993@yahoo.com> (he/him)
<ide> * [davisjam](https://github.com/davisjam) -
<ide> **Jamie Davis** <davisjam@vt.edu> (he/him)
<add>* [devnexen](https://github.com/devnexen) -
<add>**David Carlier** <devnexen@gmail.com>
<ide> * [devsnek](https://github.com/devsnek) -
<ide> **Gus Caplan** <me@gus.host> (he/him)
<ide> * [digitalinfinity](https://github.com/digitalinfinity) -
| 1
|
Text
|
Text
|
add link i removed from challenge
|
32b53ba0e0e6d4a2116de4949d95c36c4be47140
|
<ide><path>guide/english/certifications/apis-and-microservices/basic-node-and-express/implement-a-root-level-request-logger-middleware/index.md
<ide> If you have trouble formatting the string correctly, one way to do it looks like
<ide>
<ide>
<ide> <a href='https://github.com/freecodecamp/guides/tree/master/src/pages/certifications/apis-and-microservices/basic-node-and-express/implement-a-root-level-request-logger-middleware/index.md' target='_blank' rel='nofollow'>Help our community expand these hints and guides</a>.
<add>
<add>### Resources
<add>- [Express Middleware](https://expressjs.com/en/guide/using-middleware.html)
| 1
|
Java
|
Java
|
expose aligncontent to java
|
c2a41e42f20e4453cd4dad55dbc1f1cf0ac086fe
|
<ide><path>ReactAndroid/src/main/java/com/facebook/csslayout/CSSNode.java
<ide> public void setAlignSelf(CSSAlign alignSelf) {
<ide> }
<ide> }
<ide>
<add> @Override
<add> public CSSAlign getAlignContent() {
<add> return style.alignContent;
<add> }
<add>
<add> @Override
<add> public void setAlignContent(CSSAlign alignContent) {
<add> if (style.alignContent != alignContent) {
<add> style.alignContent = alignContent;
<add> dirty();
<add> }
<add> }
<add>
<ide> /**
<ide> * Get this node's position type, as defined by style.
<ide> */
<ide><path>ReactAndroid/src/main/java/com/facebook/csslayout/CSSNodeAPI.java
<ide> void measure(
<ide> void setAlignItems(CSSAlign alignItems);
<ide> CSSAlign getAlignSelf();
<ide> void setAlignSelf(CSSAlign alignSelf);
<add> CSSAlign getAlignContent();
<add> void setAlignContent(CSSAlign alignContent);
<ide> CSSPositionType getPositionType();
<ide> void setPositionType(CSSPositionType positionType);
<ide> void setWrap(CSSWrap flexWrap);
<ide><path>ReactAndroid/src/main/java/com/facebook/csslayout/CSSNodeJNI.java
<ide> public void setAlignSelf(CSSAlign alignSelf) {
<ide> jni_CSSNodeStyleSetAlignSelf(mNativePointer, alignSelf.ordinal());
<ide> }
<ide>
<add> private native int jni_CSSNodeStyleGetAlignContent(int nativePointer);
<add> @Override
<add> public CSSAlign getAlignContent() {
<add> assertNativeInstance();
<add> return CSSAlign.values()[jni_CSSNodeStyleGetAlignContent(mNativePointer)];
<add> }
<add>
<add> private native void jni_CSSNodeStyleSetAlignContent(int nativePointer, int alignContent);
<add> @Override
<add> public void setAlignContent(CSSAlign alignContent) {
<add> assertNativeInstance();
<add> jni_CSSNodeStyleSetAlignContent(mNativePointer, alignContent.ordinal());
<add> }
<add>
<ide> private native int jni_CSSNodeStyleGetPositionType(int nativePointer);
<ide> @Override
<ide> public CSSPositionType getPositionType() {
| 3
|
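Note on the patch above: the new getter/setter pair follows the file's existing idiom, where the setter calls `dirty()` only when the value actually changes, so unchanged styles never invalidate layout. The dirty-flag idiom in a Python sketch (names illustrative, not the React Native API):

```python
class LayoutNode:
    def __init__(self):
        self.align_content = "flex-start"
        self._dirty = False

    def set_align_content(self, value: str) -> None:
        # Only invalidate cached layout when the style actually changes,
        # mirroring the `if (style.alignContent != alignContent)` guard.
        if self.align_content != value:
            self.align_content = value
            self._dirty = True
```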
Javascript
|
Javascript
|
combine constructor, tag, object into a function
|
04ccc98de65b0473d7e86a632cfe5188f31a84a8
|
<ide><path>lib/internal/util/inspect.js
<ide> function getKeys(value, showHidden) {
<ide> return keys;
<ide> }
<ide>
<add>function getCtxStyle(constructor, tag) {
<add> return constructor || tag || 'Object';
<add>}
<add>
<ide> function formatProxy(ctx, proxy, recurseTimes) {
<ide> if (recurseTimes != null) {
<ide> if (recurseTimes < 0)
<ide> function formatRaw(ctx, value, recurseTimes) {
<ide>
<ide> if (recurseTimes != null) {
<ide> if (recurseTimes < 0)
<del> return ctx.stylize(`[${constructor || tag || 'Object'}]`, 'special');
<add> return ctx.stylize(`[${getCtxStyle(constructor, tag)}]`, 'special');
<ide> recurseTimes -= 1;
<ide> }
<ide>
<ide> function handleMaxCallStackSize(ctx, err, constructor, tag, indentationLvl) {
<ide> ctx.seen.pop();
<ide> ctx.indentationLvl = indentationLvl;
<ide> return ctx.stylize(
<del> `[${constructor || tag || 'Object'}: Inspection interrupted ` +
<add> `[${getCtxStyle(constructor, tag)}: Inspection interrupted ` +
<ide> 'prematurely. Maximum call stack size exceeded.]',
<ide> 'special'
<ide> );
| 1
|
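Note on the patch above: the refactor replaces two copies of the fallback chain `constructor || tag || 'Object'` with a single helper, so the default label lives in one place. The same consolidation in Python terms (sketch):

```python
from typing import Optional

def get_ctx_style(constructor: Optional[str], tag: Optional[str]) -> str:
    # First truthy value wins, with 'Object' as the final default --
    # the Python analogue of `constructor || tag || 'Object'`.
    return constructor or tag or "Object"

assert get_ctx_style(None, "Map") == "Map"
assert get_ctx_style(None, None) == "Object"
```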
Python
|
Python
|
update code comment
|
34a1a01091e5411bd7f4379a0d9c33d2b03c7be1
|
<ide><path>modeling.py
<ide> def forward(self, input_ids, token_type_ids=None, attention_mask=None):
<ide> token_type_ids = torch.zeros_like(input_ids)
<ide>
<ide> # We create a 3D attention mask from a 2D tensor mask.
<del> # Sizes are [batch_size, 1, 1, from_seq_length]
<del> # So we can broadcast to [batch_size, num_heads, to_seq_length, from_seq_length]
<add> # Sizes are [batch_size, 1, 1, to_seq_length]
<add> # So we can broadcast to [batch_size, num_heads, from_seq_length, to_seq_length]
<ide> # this attention mask is more simple than the triangular masking of causal attention
<ide> # used in OpenAI GPT, we just need to prepare the broadcast dimension here.
<ide> extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)
| 1
|
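Note on the patch above: the corrected comment is about broadcasting. The padding mask marks which key (`to`) positions are real, so it is reshaped to `[batch_size, 1, 1, to_seq_length]` and broadcast against attention scores of shape `[batch_size, num_heads, from_seq_length, to_seq_length]`. A quick shape check in PyTorch:

```python
import torch

batch, heads, from_len, to_len = 2, 4, 7, 7
attention_mask = torch.ones(batch, to_len)          # 1 = real token, 0 = pad
scores = torch.randn(batch, heads, from_len, to_len)

# [batch, to_len] -> [batch, 1, 1, to_len]; broadcasting then applies the
# same key-side mask to every head and every query (from) position.
extended = attention_mask.unsqueeze(1).unsqueeze(2)
masked = scores + (1.0 - extended) * -10000.0
assert masked.shape == (batch, heads, from_len, to_len)
```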
Python
|
Python
|
fix pendingdeprecation warnings in tests
|
b6fb377c2b4b747597bc3291dadd52b633b135b4
|
<ide><path>rest_framework/tests/filters.py
<ide> class FilterableItem(models.Model):
<ide> class FilterFieldsRootView(generics.ListCreateAPIView):
<ide> model = FilterableItem
<ide> filter_fields = ['decimal', 'date']
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> # These class are used to test a filter class.
<ide> class SeveralFieldsFilter(django_filters.FilterSet):
<ide> class Meta:
<ide> class FilterClassRootView(generics.ListCreateAPIView):
<ide> model = FilterableItem
<ide> filter_class = SeveralFieldsFilter
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> # These classes are used to test a misconfigured filter class.
<ide> class MisconfiguredFilter(django_filters.FilterSet):
<ide> class Meta:
<ide> class IncorrectlyConfiguredRootView(generics.ListCreateAPIView):
<ide> model = FilterableItem
<ide> filter_class = MisconfiguredFilter
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> class FilterClassDetailView(generics.RetrieveAPIView):
<ide> model = FilterableItem
<ide> filter_class = SeveralFieldsFilter
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> # Regression test for #814
<ide> class FilterableItemSerializer(serializers.ModelSerializer):
<ide> class FilterFieldsQuerysetView(generics.ListCreateAPIView):
<ide> queryset = FilterableItem.objects.all()
<ide> serializer_class = FilterableItemSerializer
<ide> filter_fields = ['decimal', 'date']
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> class GetQuerysetView(generics.ListCreateAPIView):
<ide> serializer_class = FilterableItemSerializer
<ide> filter_class = SeveralFieldsFilter
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> def get_queryset(self):
<ide> return FilterableItem.objects.all()
<ide><path>rest_framework/tests/generics.py
<ide> class SlugBasedInstanceView(InstanceView):
<ide> """
<ide> model = SlugBasedModel
<ide> serializer_class = SlugSerializer
<add> lookup_field = 'slug'
<ide>
<ide>
<ide> class TestRootView(TestCase):
<ide> def setUp(self):
<ide> {'id': obj.id, 'text': obj.text}
<ide> for obj in self.objects.all()
<ide> ]
<del> self.root_view = RootView.as_view()
<del> self.instance_view = InstanceView.as_view()
<del> self.original_root_backend = getattr(RootView, 'filter_backend')
<del> self.original_instance_backend = getattr(InstanceView, 'filter_backend')
<del>
<del> def tearDown(self):
<del> setattr(RootView, 'filter_backend', self.original_root_backend)
<del> setattr(InstanceView, 'filter_backend', self.original_instance_backend)
<ide>
<ide> def test_get_root_view_filters_by_name_with_filter_backend(self):
<ide> """
<ide> GET requests to ListCreateAPIView should return filtered list.
<ide> """
<del> setattr(RootView, 'filter_backend', InclusiveFilterBackend)
<add> root_view = RootView.as_view(filter_backends=(InclusiveFilterBackend,))
<ide> request = factory.get('/')
<del> response = self.root_view(request).render()
<add> response = root_view(request).render()
<ide> self.assertEqual(response.status_code, status.HTTP_200_OK)
<ide> self.assertEqual(len(response.data), 1)
<ide> self.assertEqual(response.data, [{'id': 1, 'text': 'foo'}])
<ide> def test_get_root_view_filters_out_all_models_with_exclusive_filter_backend(self
<ide> """
<ide> GET requests to ListCreateAPIView should return empty list when all models are filtered out.
<ide> """
<del> setattr(RootView, 'filter_backend', ExclusiveFilterBackend)
<add> root_view = RootView.as_view(filter_backends=(ExclusiveFilterBackend,))
<ide> request = factory.get('/')
<del> response = self.root_view(request).render()
<add> response = root_view(request).render()
<ide> self.assertEqual(response.status_code, status.HTTP_200_OK)
<ide> self.assertEqual(response.data, [])
<ide>
<ide> def test_get_instance_view_filters_out_name_with_filter_backend(self):
<ide> """
<ide> GET requests to RetrieveUpdateDestroyAPIView should raise 404 when model filtered out.
<ide> """
<del> setattr(InstanceView, 'filter_backend', ExclusiveFilterBackend)
<add> instance_view = InstanceView.as_view(filter_backends=(ExclusiveFilterBackend,))
<ide> request = factory.get('/1')
<del> response = self.instance_view(request, pk=1).render()
<add> response = instance_view(request, pk=1).render()
<ide> self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
<ide> self.assertEqual(response.data, {'detail': 'Not found'})
<ide>
<ide> def test_get_instance_view_will_return_single_object_when_filter_does_not_exclude_it(self):
<ide> """
<ide> GET requests to RetrieveUpdateDestroyAPIView should return a single object when not excluded
<ide> """
<del> setattr(InstanceView, 'filter_backend', InclusiveFilterBackend)
<add> instance_view = InstanceView.as_view(filter_backends=(InclusiveFilterBackend,))
<ide> request = factory.get('/1')
<del> response = self.instance_view(request, pk=1).render()
<add> response = instance_view(request, pk=1).render()
<ide> self.assertEqual(response.status_code, status.HTTP_200_OK)
<ide> self.assertEqual(response.data, {'id': 1, 'text': 'foo'})
<ide><path>rest_framework/tests/pagination.py
<ide> class FilterFieldsRootView(generics.ListCreateAPIView):
<ide> model = FilterableItem
<ide> paginate_by = 10
<ide> filter_class = DecimalFilter
<del> filter_backend = filters.DjangoFilterBackend
<add> filter_backends = (filters.DjangoFilterBackend,)
<ide>
<ide> view = FilterFieldsRootView.as_view()
<ide>
<ide> def filter_queryset(self, request, queryset, view):
<ide> class BasicFilterFieldsRootView(generics.ListCreateAPIView):
<ide> model = FilterableItem
<ide> paginate_by = 10
<del> filter_backend = DecimalFilterBackend
<add> filter_backends = (DecimalFilterBackend,)
<ide>
<ide> view = BasicFilterFieldsRootView.as_view()
<ide>
| 3
|
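Note on the patch above: the warnings being fixed come from the rename of the singular `filter_backend` attribute to the plural `filter_backends` iterable, which also lets a view chain several backends. A hedged sketch of the newer spelling (the `Item` model and serializer are assumed to exist):

```python
from rest_framework import filters, generics


class ItemListView(generics.ListCreateAPIView):
    queryset = Item.objects.all()        # assumes an Item model exists
    serializer_class = ItemSerializer    # assumes a serializer exists
    # Plural attribute, iterable value: backends are applied in order.
    filter_backends = (filters.SearchFilter, filters.OrderingFilter)
    search_fields = ("name",)
    ordering_fields = ("created",)
```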
PHP
|
PHP
|
consolidate return early statements
|
0c6833783b5cfda4fa669521dbb798c1531b0f47
|
<ide><path>Cake/Cache/Cache.php
<ide> public static function gc($config = 'default', $expires = null) {
<ide> */
<ide> public static function write($key, $value, $config = 'default') {
<ide> $engine = static::engine($config);
<del> if (!$engine) {
<del> return false;
<del> }
<del>
<del> if (is_resource($value)) {
<add> if (!$engine || is_resource($value)) {
<ide> return false;
<ide> }
<ide>
<ide> public static function read($key, $config = 'default') {
<ide> */
<ide> public static function increment($key, $offset = 1, $config = 'default') {
<ide> $engine = static::engine($config);
<del> if (!$engine) {
<del> return false;
<del> }
<del>
<del> if (!is_int($offset) || $offset < 0) {
<add> if (!$engine || !is_int($offset) || $offset < 0) {
<ide> return false;
<ide> }
<ide>
<ide> public static function increment($key, $offset = 1, $config = 'default') {
<ide> */
<ide> public static function decrement($key, $offset = 1, $config = 'default') {
<ide> $engine = static::engine($config);
<del> if (!$engine) {
<del> return false;
<del> }
<del>
<del> if (!is_int($offset) || $offset < 0) {
<add> if (!$engine || !is_int($offset) || $offset < 0) {
<ide> return false;
<ide> }
<ide>
<ide> public static function clearGroup($group, $config = 'default') {
<ide> return false;
<ide> }
<ide>
<del> $success = $engine->clearGroup($group);
<del> return $success;
<add> return $engine->clearGroup($group);
<ide> }
<ide>
<ide> /**
| 1
|
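Note on the patch above: each hunk folds two early returns that share a result into one guard, and the last hunk returns the engine call directly instead of via a temporary. The same cleanups sketched in Python (`is_resource` is a stub standing in for PHP's check):

```python
def is_resource(value) -> bool:
    """Stand-in for PHP's is_resource() check."""
    return False

def write(engine, key, value):
    # Before: two separate `if ...: return False` guards.
    # After: one guard, since both conditions share the same early exit.
    if engine is None or is_resource(value):
        return False
    return engine.write(key, value)

def clear_group(engine, group):
    # Returning the call directly replaces the needless temporary:
    #   success = engine.clear_group(group); return success
    return engine.clear_group(group)
```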
Python
|
Python
|
use float32 metrics in mnist_eager
|
dfafba4a017c21c19dfdb60e1580f0b2ff5d361f
|
<ide><path>official/mnist/mnist_eager.py
<ide> def train(model, optimizer, dataset, step_counter, log_interval=None):
<ide>
<ide> def test(model, dataset):
<ide> """Perform an evaluation of `model` on the examples from `dataset`."""
<del> avg_loss = tfe.metrics.Mean('loss')
<del> accuracy = tfe.metrics.Accuracy('accuracy')
<add> avg_loss = tfe.metrics.Mean('loss', dtype=tf.float32)
<add> accuracy = tfe.metrics.Accuracy('accuracy', dtype=tf.float32)
<ide>
<ide> for (images, labels) in dataset:
<ide> logits = model(images, training=False)
| 1
|
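Note on the patch above: passing an explicit `dtype` pins the metric accumulators to float32; the old `tfe.metrics` classes appear to have defaulted to float64, which some accelerators handle poorly. A rough modern `tf.keras` analogue:

```python
import tensorflow as tf

# Pass an explicit dtype so the metric's accumulator variables are float32.
avg_loss = tf.keras.metrics.Mean(name="loss", dtype=tf.float32)
accuracy = tf.keras.metrics.Accuracy(name="accuracy", dtype=tf.float32)

avg_loss.update_state([0.3, 0.5])
print(float(avg_loss.result()))  # 0.4
```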
Go
|
Go
|
remove extra wait after device removal
|
dbf04ec4e2a6b4fe73f7f300918a906c0ff1a37b
|
<ide><path>daemon/graphdriver/devmapper/deviceset.go
<ide> func (devices *DeviceSet) deleteDevice(info *DevInfo) error {
<ide>
<ide> devinfo, _ := devicemapper.GetInfo(info.Name())
<ide> if devinfo != nil && devinfo.Exists != 0 {
<del> if err := devices.removeDeviceAndWait(info.Name()); err != nil {
<add> if err := devices.removeDevice(info.Name()); err != nil {
<ide> logrus.Debugf("Error removing device: %s", err)
<ide> return err
<ide> }
<ide> func (devices *DeviceSet) deactivateDevice(info *DevInfo) error {
<ide> return err
<ide> }
<ide> if devinfo.Exists != 0 {
<del> if err := devices.removeDeviceAndWait(info.Name()); err != nil {
<add> if err := devices.removeDevice(info.Name()); err != nil {
<ide> return err
<ide> }
<ide> }
<ide>
<ide> return nil
<ide> }
<ide>
<del>// Issues the underlying dm remove operation and then waits
<del>// for it to finish.
<del>func (devices *DeviceSet) removeDeviceAndWait(devname string) error {
<add>// Issues the underlying dm remove operation.
<add>func (devices *DeviceSet) removeDevice(devname string) error {
<ide> var err error
<ide>
<del> logrus.Debugf("[devmapper] removeDeviceAndWait START(%s)", devname)
<del> defer logrus.Debugf("[devmapper] removeDeviceAndWait END(%s)", devname)
<add> logrus.Debugf("[devmapper] removeDevice START(%s)", devname)
<add> defer logrus.Debugf("[devmapper] removeDevice END(%s)", devname)
<ide>
<ide> for i := 0; i < 1000; i++ {
<ide> err = devicemapper.RemoveDevice(devname)
<ide> func (devices *DeviceSet) removeDeviceAndWait(devname string) error {
<ide> time.Sleep(10 * time.Millisecond)
<ide> devices.Lock()
<ide> }
<del> if err != nil {
<del> return err
<del> }
<del>
<del> if err := devices.waitRemove(devname); err != nil {
<del> return err
<del> }
<del> return nil
<del>}
<del>
<del>// waitRemove blocks until either:
<del>// a) the device registered at <device_set_prefix>-<hash> is removed,
<del>// or b) the 10 second timeout expires.
<del>func (devices *DeviceSet) waitRemove(devname string) error {
<del> logrus.Debugf("[deviceset %s] waitRemove(%s)", devices.devicePrefix, devname)
<del> defer logrus.Debugf("[deviceset %s] waitRemove(%s) END", devices.devicePrefix, devname)
<del> i := 0
<del> for ; i < 1000; i++ {
<del> devinfo, err := devicemapper.GetInfo(devname)
<del> if err != nil {
<del> // If there is an error we assume the device doesn't exist.
<del> // The error might actually be something else, but we can't differentiate.
<del> return nil
<del> }
<del> if i%100 == 0 {
<del> logrus.Debugf("Waiting for removal of %s: exists=%d", devname, devinfo.Exists)
<del> }
<del> if devinfo.Exists == 0 {
<del> break
<del> }
<ide>
<del> devices.Unlock()
<del> time.Sleep(10 * time.Millisecond)
<del> devices.Lock()
<del> }
<del> if i == 1000 {
<del> return fmt.Errorf("Timeout while waiting for device %s to be removed", devname)
<del> }
<del> return nil
<add> return err
<ide> }
<ide>
<ide> // waitClose blocks until either:
| 1
|
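Note on the patch above: with the wait helper gone, `removeDevice` relies on its bounded retry loop alone (up to 1000 attempts, 10 ms apart) and surfaces the last error. A generic Python sketch of that idiom (the device-mapper calls are placeholder stubs):

```python
import time

class BusyError(Exception):
    """Stand-in for the 'device busy' error from device-mapper."""

def dm_remove(devname: str) -> None:
    """Placeholder for the real device-mapper remove call."""

def remove_device(devname: str, attempts: int = 1000, delay: float = 0.01) -> None:
    # Bounded retry, mirroring the Go loop: try, sleep 10 ms, try again,
    # and surface the last error if the device never frees up.
    last_err = None
    for _ in range(attempts):
        try:
            dm_remove(devname)
            return
        except BusyError as err:
            last_err = err
            time.sleep(delay)
    raise last_err
```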
Ruby
|
Ruby
|
restore original constant
|
a933b5850754931538f40af6e2767eb27fb6e904
|
<ide><path>Library/Homebrew/cmd/versions.rb
<ide> def formula_for_sha sha, &block
<ide>
<ide> # Unload the class so Formula#version returns the correct value
<ide> begin
<del> Formulary.unload_formula name
<add> old_const = Formulary.unload_formula name
<ide> nostdout { yield Formula.factory(path.to_s) }
<ide> rescue *IGNORED_EXCEPTIONS => e
<ide> # We rescue these so that we can skip bad versions and
<ide> # continue walking the history
<ide> ohai "#{e} in #{name} at revision #{sha}", e.backtrace if ARGV.debug?
<ide> rescue FormulaUnavailableError
<ide> # Suppress this error
<add> ensure
<add> Formulary.restore_formula name, old_const
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/formulary.rb
<ide> def self.get_formula_class formula_name
<ide> Object.const_get(Formula.class_s(formula_name))
<ide> end
<ide>
<add> def self.restore_formula formula_name, value
<add> old_verbose, $VERBOSE = $VERBOSE, nil
<add> Object.const_set(Formula.class_s(formula_name), value)
<add> ensure
<add> $VERBOSE = old_verbose
<add> end
<add>
<ide> # A FormulaLoader returns instances of formulae.
<ide> # Subclasses implement loaders for particular sources of formulae.
<ide> class FormulaLoader
| 2
|
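Note on the patch above: the fix captures the constant that `unload_formula` returns and restores it in an `ensure` block (silencing the redefinition warning via `$VERBOSE`). The same save-and-restore-in-finally pattern as a Python context manager (illustrative, not Homebrew's API):

```python
import contextlib

@contextlib.contextmanager
def swapped_attr(obj, name, new_value):
    # Save the old value, install the new one, and restore the original
    # even if the body raises -- the Python analogue of begin/ensure.
    old_value = getattr(obj, name)
    setattr(obj, name, new_value)
    try:
        yield old_value
    finally:
        setattr(obj, name, old_value)

# Example: temporarily replace math.pi (illustrative only).
import math
with swapped_attr(math, "pi", 3.0):
    assert math.pi == 3.0
assert math.pi != 3.0
```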
PHP
|
PHP
|
fix formhelper test
|
4c0d0e171ed1729830cdd6957416db433bbca4c0
|
<ide><path>tests/TestCase/View/Helper/FormHelperTest.php
<ide> public function testFormSecuredFileControl()
<ide> $this->Form->create();
<ide>
<ide> $this->Form->file('Attachment.file');
<del> $expected = [
<del> 'Attachment.file.name', 'Attachment.file.type',
<del> 'Attachment.file.tmp_name', 'Attachment.file.error',
<del> 'Attachment.file.size',
<del> ];
<add> $expected = ['Attachment.file'];
<ide> $result = $this->Form->getFormProtector()->__debugInfo()['fields'];
<ide> $this->assertEquals($expected, $result);
<ide> }
| 1
|
Mixed
|
Python
|
add test_connection method for snowflake hook
|
acfb7b5acf887d38aa8751c18d17dbfe85e78b7c
|
<ide><path>airflow/providers/snowflake/hooks/snowflake.py
<ide> def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optio
<ide> conn.commit()
<ide>
<ide> return execution_info
<add>
<add> def test_connection(self):
<add> """Test the Snowflake connection by running a simple query."""
<add> try:
<add> self.run(sql="select 1")
<add> except Exception as e:
<add> return False, str(e)
<add> return True, "Connection successfully tested"
<ide><path>airflow/www/static/js/connection_form.js
<ide> $(document).ready(() => {
<ide> outObj.connection_id = this.value;
<ide> } else if (this.value !== '' && this.name === 'port') {
<ide> outObj[this.name] = Number(this.value);
<del> } else if (this.value !== '' && this.name !== 'csrf_token') {
<add> } else if (this.value !== '' && this.name !== 'csrf_token' && !this.name.match('extra__')) {
<ide> outObj[this.name] = this.value;
<ide> }
<ide> });
<ide><path>tests/providers/snowflake/hooks/test_snowflake.py
<ide> def test_key_pair_auth_not_encrypted(self):
<ide> params = self.db_hook._get_conn_params()
<ide> assert 'private_key' in params
<ide>
<add> @mock.patch('airflow.providers.snowflake.hooks.snowflake.SnowflakeHook.run')
<add> def test_connection_success(self, mock_run):
<add> mock_run.return_value = [{'1': 1}]
<add> status, msg = self.db_hook.test_connection()
<add> assert status is True
<add> assert msg == 'Connection successfully tested'
<add>
<add> @mock.patch(
<add> 'airflow.providers.snowflake.hooks.snowflake.SnowflakeHook.run',
<add> side_effect=Exception('Connection Errors'),
<add> )
<add> def test_connection_failure(self, mock_run):
<add> status, msg = self.db_hook.test_connection()
<add> assert status is False
<add> assert msg == 'Connection Errors'
<add>
<ide>
<ide> """
<ide> Testing hooks with assigning`extra_` parameters
| 3
|
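Note on the patch above: `test_connection` returns a `(status, message)` tuple instead of raising, and the tests mock out `run` so no Snowflake account is needed. A self-contained sketch of the same convention (`DummyHook` is invented for illustration):

```python
class DummyHook:
    """Minimal stand-in showing the (status, message) convention."""

    def run(self, sql: str):
        raise ConnectionError("Connection Errors")

    def test_connection(self):
        try:
            self.run(sql="select 1")
        except Exception as e:  # report any failure as text, not a traceback
            return False, str(e)
        return True, "Connection successfully tested"

status, msg = DummyHook().test_connection()
assert status is False and msg == "Connection Errors"
```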
Javascript
|
Javascript
|
remove dead code and polyfill for containselement
|
d0052ac58ea7045c1bc5b1190bd8225c42deb2c7
|
<ide><path>packages/ember-views/lib/views/states/pre_render.js
<del>/* global Node */
<del>
<ide> import _default from "ember-views/views/states/default";
<ide> import { create } from "ember-metal/platform";
<ide> import merge from "ember-metal/merge";
<ide> import jQuery from "ember-views/system/jquery";
<ide> */
<ide> var preRender = create(_default);
<ide>
<del>var containsElement;
<del>if (typeof Node === 'object') {
<del> containsElement = Node.prototype.contains;
<del>
<del> if (!containsElement && Node.prototype.compareDocumentPosition) {
<del> // polyfill for older Firefox.
<del> // http://compatibility.shwups-cms.ch/en/polyfills/?&id=52
<del> containsElement = function(node){
<del> return !!(this.compareDocumentPosition(node) & 16);
<del> };
<del> }
<del>} else {
<del> containsElement = function(element) {
<del> return this.contains(element);
<del> };
<del>}
<del>
<ide> export default preRender;
| 1
|
Javascript
|
Javascript
|
remove incorrect comment
|
40ea053bac8c5ecfd51deef592a397004c4569c0
|
<ide><path>packages/react-native-renderer/src/ReactFabricEventEmitter.js
<ide> import type {TopLevelType} from 'events/TopLevelEventTypes';
<ide>
<ide> export {getListener, registrationNameModules as registrationNames};
<ide>
<del>/**
<del> * Publicly exposed method on module for native objc to invoke when a top
<del> * level event is extracted.
<del> * @param {rootNodeID} rootNodeID React root node ID that event occurred on.
<del> * @param {TopLevelType} topLevelType Top level type of event.
<del> * @param {object} nativeEventParam Object passed from native.
<del> */
<ide> export function dispatchEvent(
<ide> target: Object,
<ide> topLevelType: TopLevelType,
| 1
|
Text
|
Text
|
improve issue templates
|
7193f49eaf2737c42c54ff685892964d7bbe216b
|
<ide><path>.github/ISSUE_TEMPLATE/1.Bug_report.md
<ide> ---
<ide> name: Bug report
<ide> about: Create a bug report for the Next.js core
<del>
<ide> ---
<ide>
<del>**Describe the bug**
<add># Bug report
<add>
<add>## Describe the bug
<add>
<ide> A clear and concise description of what the bug is.
<ide>
<del>**To Reproduce**
<add>## To Reproduce
<add>
<ide> Steps to reproduce the behavior, please provide code snippets or a repository:
<ide> 1. Go to '...'
<ide> 2. Click on '....'
<ide> 3. Scroll down to '....'
<ide> 4. See error
<ide>
<del>**Expected behavior**
<add>## Expected behavior
<ide> A clear and concise description of what you expected to happen.
<ide>
<del>**Screenshots**
<add>## Screenshots
<ide> If applicable, add screenshots to help explain your problem.
<ide>
<del>**System information**
<add>## System information
<ide> - OS: [e.g. macOS, Windows]
<ide> - Browser (if applies) [e.g. chrome, safari]
<ide> - Version of Next.js: [e.g. 6.0.2]
<ide>
<del>**Additional context**
<add>## Additional context
<add>
<ide> Add any other context about the problem here.
<add><path>.github/ISSUE_TEMPLATE/10.Nextjs.org_showcase.md
<del><path>.github/ISSUE_TEMPLATE/4.Nextjs.org_showcase.md
<ide> about: Apply for your project to be added to nextjs.org
<ide>
<ide> ---
<ide>
<add># Request to be added to the showcase on nextjs.org
<add>
<ide> - Name of company:
<ide> - Url:
<ide> - Testimonial about Next.js (optional):
<ide><path>.github/ISSUE_TEMPLATE/2.Feature_request.md
<ide> ---
<ide> name: Feature request
<ide> about: Create a feature request for the Next.js core
<del>
<ide> ---
<ide>
<del>**Is your feature request related to a problem? Please describe.**
<add># Feature request
<add>
<add>## Is your feature request related to a problem? Please describe.
<ide> A clear and concise description of what you want and what your use case is.
<ide>
<del>**Describe the solution you'd like**
<add>## Describe the solution you'd like
<ide> A clear and concise description of what you want to happen.
<ide>
<del>**Describe alternatives you've considered**
<add>## Describe alternatives you've considered
<ide> A clear and concise description of any alternative solutions or features you've considered.
<ide>
<del>**Additional context**
<add>## Additional context
<ide> Add any other context or screenshots about the feature request here.
<ide><path>.github/ISSUE_TEMPLATE/3.Example_Bug_report.md
<add>---
<add>name: Bug report for examples
<add>about: Create a bug report for one of the Next.js examples
<add>---
<add>
<add># Examples bug report
<add>
<add>## Example name
<add>Provide the example name
<add>
<add>## Describe the bug
<add>A clear and concise description of what the bug is.
<add>
<add>## To Reproduce
<add>Steps to reproduce the behavior, please provide code snippets or a repository:
<add>1. Go to '...'
<add>2. Click on '....'
<add>3. Scroll down to '....'
<add>4. See error
<add>
<add>## Expected behavior
<add>A clear and concise description of what you expected to happen.
<add>
<add>## Screenshots
<add>If applicable, add screenshots to help explain your problem.
<add>
<add>## System information
<add> - OS: [e.g. macOS, Windows]
<add> - Browser (if applies) [e.g. chrome, safari]
<add> - Version of Next.js: [e.g. 6.0.2]
<add>
<add>## Additional context
<add>
<add>Add any other context about the problem here.
<ide><path>.github/ISSUE_TEMPLATE/3.Question_about_next.md
<del>---
<del>name: Question about Next.js
<del>about: If you have a question unrelated to the Next.js core or examples. Reach out to the community on https://spectrum.chat/next-js
<del>
<del>---
<del>
<del>Questions should be posted on https://spectrum.chat/next-js
<ide><path>.github/ISSUE_TEMPLATE/9.Question_about_next.md
<add>---
<add>name: Question about Next.js
<add>about: If you have a question related to Next.js or the examples. Reach out to the community on https://spectrum.chat/next-js
<add>---
<add>
<add># Question about Next.js
<add>
<add>Questions should be posted on https://spectrum.chat/next-js
| 6
|