| hash | diff | message | project | split | diff_languages |
|---|---|---|---|---|---|
8b416558945350363664a029154a8c284b72ff6a | diff --git a/dvc/path_info.py b/dvc/path_info.py
index <HASH>..<HASH> 100644
--- a/dvc/path_info.py
+++ b/dvc/path_info.py
@@ -317,15 +317,9 @@ class HTTPURLInfo(URLInfo):
)
-# See https://github.com/shizacat/dvc/blob/remote-webdav/dvc/path_info.py
-class WebDAVURLInfo(HTTPURLInfo):
+class WebDAVURLInfo(URLInfo):
@cached_property
def url(self):
- return "{}://{}{}{}{}{}".format(
- self.scheme.replace("webdav", "http"),
- self.netloc,
- self._spath,
- (";" + self.params) if self.params else "",
- ("?" + self.query) if self.query else "",
- ("#" + self.fragment) if self.fragment else "",
+ return "{}://{}{}".format(
+ self.scheme.replace("webdav", "http"), self.netloc, self._spath
) | Derive WebDAVURLInfo from URLInfo instead of HTTPURLInfo (#<I>)
Unused, empty http extra parts (params,query,fragment) cause problems
with pushing/pulling - lead to violation of the assumption where to look
for hash values in file/path names.
Context:
<URL> | iterative_dvc | train | py |
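The DVC record above rebuilds WebDAV URLs using only the scheme, netloc, and path, dropping the empty params/query/fragment parts that broke hash lookup. A minimal standalone sketch of that URL construction (the function name is illustrative, not DVC's API):

```python
def webdav_to_http_url(scheme, netloc, spath):
    """Rebuild a WebDAV URL as its HTTP equivalent using only the
    scheme, netloc and path -- the empty params/query/fragment parts
    removed by the commit above are never emitted."""
    return "{}://{}{}".format(scheme.replace("webdav", "http"), netloc, spath)
```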
77942328a0a68aec6b9422e58dd6771f3a17fbcb | diff --git a/lib/collection/request-auth/awsv4.js b/lib/collection/request-auth/awsv4.js
index <HASH>..<HASH> 100644
--- a/lib/collection/request-auth/awsv4.js
+++ b/lib/collection/request-auth/awsv4.js
@@ -67,7 +67,7 @@ module.exports = {
_.map(signedData.headers, function (value, key) {
// TODO: figure out a better way of handling errors.
- if (!_.contains(['content-length', 'content-type'], key.toLowerCase())) {
+ if (!_.contains(['content-length', 'content-type', 'host'], key.toLowerCase())) {
request.addHeader({
key: key,
value: value, | ensure that aws auth does not add duplicate host header | postmanlabs_postman-collection | train | js |
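The AWS v4 record adds `host` to the header names skipped when copying signed headers back onto the request. A hedged Python sketch of the same filtering idea (the set and function names are assumptions):

```python
SKIPPED = {'content-length', 'content-type', 'host'}

def headers_to_copy(signed_headers):
    """Drop headers the request already carries; skipping 'host'
    prevents a duplicate Host header from being added."""
    return {k: v for k, v in signed_headers.items()
            if k.lower() not in SKIPPED}
```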
10590517e04ee4162a997a2b4864a985aac189cc | diff --git a/value/src/it/functional/src/main/java/com/google/auto/value/SimpleValueType.java b/value/src/it/functional/src/main/java/com/google/auto/value/SimpleValueType.java
index <HASH>..<HASH> 100644
--- a/value/src/it/functional/src/main/java/com/google/auto/value/SimpleValueType.java
+++ b/value/src/it/functional/src/main/java/com/google/auto/value/SimpleValueType.java
@@ -28,14 +28,14 @@ public abstract class SimpleValueType {
// The getters here are formatted as an illustration of what getters typically look in real
// classes. In particular they have doc comments.
- /** @return A string that is a nullable string. */
+ /** Returns a string that is a nullable string. */
@Nullable
public abstract String string();
- /** @return An integer that is an integer. */
+ /** Returns an integer that is an integer. */
public abstract int integer();
- /** @return A non-null map where the keys are strings and the values are longs. */
+ /** Returns a non-null map where the keys are strings and the values are longs. */
public abstract Map<String, Long> map();
public static SimpleValueType create( | Fix 3 ErrorProneStyle findings:
* A summary fragment is required; consider using the value of the @return block as a summary fragment instead.
-------------
Created by MOE: <URL> | google_auto | train | java |
6ef251d9e9ca538f68bf1899b12c9d96453f2429 | diff --git a/lib/sup/util.rb b/lib/sup/util.rb
index <HASH>..<HASH> 100644
--- a/lib/sup/util.rb
+++ b/lib/sup/util.rb
@@ -1,3 +1,5 @@
+# encoding: utf-8
+
require 'thread'
require 'lockfile'
require 'mime/types'
@@ -113,6 +115,25 @@ module RMail
end
class Header
+
+ # Convert to ASCII before trying to match with regexp
+ class Field
+
+ EXTRACT_FIELD_NAME_RE = /\A([^\x00-\x1f\x7f-\xff :]+):\s*/no
+
+ class << self
+ def parse(field)
+ field = field.dup.to_s
+ field = field.fix_encoding.ascii
+ if field =~ EXTRACT_FIELD_NAME_RE
+ [ $1, $'.chomp ]
+ else
+ [ "", Field.value_strip(field) ]
+ end
+ end
+ end
+ end
+
## Be more cautious about invalid content-type headers
## the original RMail code calls
## value.strip.split(/\s*;\s*/)[0].downcase | patch RMail field parser to use UTF-8 regexps and fix_encoding before parsing field | sup-heliotrope_sup | train | rb |
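The Ruby field parser above can be mirrored in Python to show what the regexp does: match an ASCII-safe field name up to the colon, or fall back to an empty name with the whole line as the value. A sketch under that assumption:

```python
import re

# Field name: no control characters, no high bytes, no space or colon.
FIELD_NAME_RE = re.compile(r'\A([^\x00-\x1f\x7f-\xff :]+):\s*')

def parse_field(field):
    """Split a header line into (name, value), like Field.parse above."""
    m = FIELD_NAME_RE.match(field)
    if m:
        return m.group(1), field[m.end():].rstrip('\n')
    return "", field.strip()
```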
d3bd3e0e70fc8c66881b94041e40cde807cf3502 | diff --git a/src/components/link.js b/src/components/link.js
index <HASH>..<HASH> 100644
--- a/src/components/link.js
+++ b/src/components/link.js
@@ -339,7 +339,7 @@ registerShader('portal', {
'vec2 sampleUV;',
'float borderThickness = clamp(exp(-vDistance / 50.0), 0.6, 0.95);',
'sampleUV.y = saturate(direction.y * 0.5 + 0.5);',
- 'sampleUV.x = atan(direction.z, direction.x) * -RECIPROCAL_PI2 + 0.5;',
+ 'sampleUV.x = atan(direction.z, -direction.x) * -RECIPROCAL_PI2 + 0.5;',
'if (vDistanceToCenter > borderThickness && borderEnabled == 1.0) {',
'gl_FragColor = vec4(strokeColor, 1.0);',
'} else {', | flipped the sign on sampleUV.x in the Portal shader used by the link component (#<I>) | aframevr_aframe | train | js |
24a851fc451307b161ef7afc92f059a076b2e5a7 | diff --git a/spec/overcommit/hook/pre_commit/line_endings_spec.rb b/spec/overcommit/hook/pre_commit/line_endings_spec.rb
index <HASH>..<HASH> 100644
--- a/spec/overcommit/hook/pre_commit/line_endings_spec.rb
+++ b/spec/overcommit/hook/pre_commit/line_endings_spec.rb
@@ -24,7 +24,7 @@ describe Overcommit::Hook::PreCommit::LineEndings do
around do |example|
repo do
File.open(staged_file, 'w') { |f| f.write(contents) }
- `git add #{staged_file}`
+ `git add #{staged_file} > #{File::NULL} 2>&1`
example.run
end
end | Quiet LineEndings spec | sds_overcommit | train | rb |
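The overcommit change silences `git add` by redirecting both streams to the null device. The Python analogue of `> /dev/null 2>&1` uses `subprocess.DEVNULL`:

```python
import subprocess
import sys

def quiet_run(cmd):
    """Run a command, discarding stdout and stderr -- the same effect
    as appending `> /dev/null 2>&1` in a shell."""
    return subprocess.run(cmd, stdout=subprocess.DEVNULL,
                          stderr=subprocess.DEVNULL).returncode
```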
aff2a5a22a1ecbf381184a5e81741365a661a533 | diff --git a/src/RequestBuilder.php b/src/RequestBuilder.php
index <HASH>..<HASH> 100644
--- a/src/RequestBuilder.php
+++ b/src/RequestBuilder.php
@@ -117,7 +117,7 @@ class RequestBuilder
$action['httpMethod'],
$uri,
['Content-Type' => 'application/json'],
- $body ? json_encode($body) : []
+ $body ? json_encode($body) : null
);
} | pass null instead of empty array to Request (#<I>) | googleapis_google-cloud-php | train | php |
8e992733426540e940415a63e9dae6f36611312d | diff --git a/eZ/Bundle/EzPublishLegacyBundle/EzPublishLegacyBundle.php b/eZ/Bundle/EzPublishLegacyBundle/EzPublishLegacyBundle.php
index <HASH>..<HASH> 100644
--- a/eZ/Bundle/EzPublishLegacyBundle/EzPublishLegacyBundle.php
+++ b/eZ/Bundle/EzPublishLegacyBundle/EzPublishLegacyBundle.php
@@ -10,8 +10,6 @@
namespace eZ\Bundle\EzPublishLegacyBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
-use eZ\Bundle\EzPublishLegacyBundle\Event\Listener\Fallback as FallbackListener;
-use eZ\Bundle\EzPublishLegacyBundle\Routing\Matcher\FallbackMatcher;
class EzPublishLegacyBundle extends Bundle
{ | Removed unused "use" statements | ezsystems_ezpublish-kernel | train | php |
306a7a1b52860f699d98f2a3b3439424f7bfe820 | diff --git a/classes/Brave/Auth/CoreAuthHandler.php b/classes/Brave/Auth/CoreAuthHandler.php
index <HASH>..<HASH> 100644
--- a/classes/Brave/Auth/CoreAuthHandler.php
+++ b/classes/Brave/Auth/CoreAuthHandler.php
@@ -9,7 +9,7 @@ namespace Brave\Auth;
class CoreAuthHandler implements \Requests_Auth
{
- private $debug = true;
+ private $debug = false;
/**
* Constructor | API should have debug disabled by default so people checking it out for production use can use it right away | bravecollective_php-api | train | php |
161f352838e5761ebaeb3ce3ed57048681ca7352 | diff --git a/GPy/core/parameterization/index_operations.py b/GPy/core/parameterization/index_operations.py
index <HASH>..<HASH> 100644
--- a/GPy/core/parameterization/index_operations.py
+++ b/GPy/core/parameterization/index_operations.py
@@ -95,6 +95,9 @@ class ParameterIndexOperations(object):
def __getitem__(self, prop):
return self._properties[prop]
+ def __delitem__(self, prop):
+ del self._properties[prop]
+
def __str__(self, *args, **kwargs):
import pprint
return pprint.pformat(dict(self._properties)) | delete dangling fixed attribute in constraints | SheffieldML_GPy | train | py |
9dda59d3931bddc41d5880658004edd048a56f3f | diff --git a/twtxt/__init__.py b/twtxt/__init__.py
index <HASH>..<HASH> 100644
--- a/twtxt/__init__.py
+++ b/twtxt/__init__.py
@@ -9,4 +9,4 @@
"""
-__version__ = '1.2.2'
+__version__ = '1.3.0-dev' | Get back to track with <I>-0-dev | buckket_twtxt | train | py |
7b50a13982eff60c945eb1dd69c0f5e7372d27ce | diff --git a/pulsar/apps/__init__.py b/pulsar/apps/__init__.py
index <HASH>..<HASH> 100644
--- a/pulsar/apps/__init__.py
+++ b/pulsar/apps/__init__.py
@@ -622,12 +622,9 @@ The application is now in the arbiter but has not yet started.'''
def configure_logging(self, config = None):
"""Set the logging configuration as specified by the
:ref:`logconfig <setting-logconfig>` setting."""
- if self.cfg.debug:
- self.loglevel = logging.DEBUG
- else:
- self.loglevel = self.cfg.loglevel
+ self.loglevel = self.cfg.loglevel
config = config or self.cfg.logconfig
- super(Application,self).configure_logging(config = config)
+ super(Application,self).configure_logging(config=config)
def actorlinks(self, links):
if not links: | debug flag does not set the logging level | quantmind_pulsar | train | py |
372bd2d63451a1b23a0a850e7af3cf65d389d124 | diff --git a/models/deployment_manifest.go b/models/deployment_manifest.go
index <HASH>..<HASH> 100644
--- a/models/deployment_manifest.go
+++ b/models/deployment_manifest.go
@@ -39,8 +39,8 @@ type manifestNetwork struct {
type manifestResourcePool struct {
Name string
NetworkName string `yaml:"network"`
- Stemcell string
- CloudProperties string `yaml:"cloud_properties"`
+ Stemcell *manifestStemcell
+ CloudProperties *map[string]interface{} `yaml:"cloud_properties"`
}
// ManifestJob describes a cluster of VMs each running the same set of job templates
@@ -68,6 +68,11 @@ type manifestJobNetwork struct {
StaticIPs *[]string `yaml:"static_ips"`
}
+type manifestStemcell struct {
+ Name string
+ Version string
+}
+
// FindByJobTemplates returns the subnet of ManifestJobs that include a specific job template
func (manifest *DeploymentManifest) FindByJobTemplates(jobTemplateName string) (jobs []*ManifestJob) {
jobs = []*ManifestJob{} | resource_pools[].stemcell has .name/.version | cloudfoundry-community_gogobosh | train | go |
836379157b5164bd0c2d6b06d3e92ab69b62d83b | diff --git a/src/server/public/scripts/modules/FormCreate.js b/src/server/public/scripts/modules/FormCreate.js
index <HASH>..<HASH> 100755
--- a/src/server/public/scripts/modules/FormCreate.js
+++ b/src/server/public/scripts/modules/FormCreate.js
@@ -176,7 +176,7 @@ export default class FormCreate {
}.bind(this))
}
- var slugPaths = document.querySelectorAll('[data-slug-type=path]')
+ var slugPaths = this._form.querySelectorAll('[data-slug-type=path]')
Array.prototype.forEach.call(slugPaths, function(slugPath) {
var isStructureFolder = (slugPath.parentNode.getAttribute('data-shown') != null)
if (slugPath.value != null && slugPath.value != '' && (isStructureFolder && !slugPath.parentNode.classList.contains('hidden'))) { | fix: bug abejs form multiple precontrib was selected on queryselector for structure path url | abecms_abecms | train | js |
a1ba524b85bf5473e6c61dd006e6262853cdba65 | diff --git a/test/test_helper.rb b/test/test_helper.rb
index <HASH>..<HASH> 100644
--- a/test/test_helper.rb
+++ b/test/test_helper.rb
@@ -1,7 +1,10 @@
require 'simplecov'
require 'simplecov-console'
SimpleCov.formatter = SimpleCov::Formatter::Console
-SimpleCov.start
+SimpleCov.start do
+ add_filter '.bundle/'
+ add_filter 'test/'
+end
# Configure Rails Environment
ENV["RAILS_ENV"] = "test" | Do not analyze .bundle/ and test/ files during simplecov analysis | andersondias_attributes_sanitizer | train | rb |
007cccd93214f86abad3047b2b062e1eb89ab452 | diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100644
--- a/setup.py
+++ b/setup.py
@@ -45,7 +45,7 @@ setup(
#test_suite = 'setuptest.setuptest.SetupTestSuite',
#cmdclass = {'test': test},
url = __github_url__,
- download_url = "%s/archive/v%s.tar.gz" % (__github_url__, version),
+ download_url = "%s/tarball/%s" % (__github_url__, version),
keywords = ["ai", "ml", "artificial intelligence", "machine intelligence", "norvig", "russell", "agent", "bot", "book", "textbook", "algorithm", "machine-learning", "search"],
classifiers = [
"Programming Language :: Python", | update github url to use tarball which is autogenerated with git tag on github | hobson_aima | train | py |
359e53f53fac49691a71ad3e20dea32c33c485a0 | diff --git a/lib/dbf/table.rb b/lib/dbf/table.rb
index <HASH>..<HASH> 100644
--- a/lib/dbf/table.rb
+++ b/lib/dbf/table.rb
@@ -134,7 +134,7 @@ module DBF
# @param [optional String] path Defaults to basename of dbf file
def to_csv(path = nil)
path = File.basename(@data.path, '.dbf') + '.csv' if path.nil?
- FCSV.open(path, 'w', :force_quotes => true) do |csv|
+ FCSV.open(path, 'w') do |csv|
each do |record|
csv << record.to_a
end | removed force quotes as csv export option | infused_dbf | train | rb |
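The dbf commit drops `:force_quotes => true` from the CSV writer. Python's `csv` module exposes the same choice through its `quoting` option; a small sketch of the difference:

```python
import csv
import io

def write_csv(rows, force_quotes=False):
    """Render rows as CSV, quoting every field only when asked --
    the option the commit above stops forcing."""
    buf = io.StringIO()
    quoting = csv.QUOTE_ALL if force_quotes else csv.QUOTE_MINIMAL
    csv.writer(buf, quoting=quoting).writerows(rows)
    return buf.getvalue()
```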
ecb69a230065abe2fe5f91799a72928579ee27d8 | diff --git a/memprof/__init__.py b/memprof/__init__.py
index <HASH>..<HASH> 100644
--- a/memprof/__init__.py
+++ b/memprof/__init__.py
@@ -17,4 +17,4 @@
from .memprof import *
import pkg_resources
-__version__ = pkg_resources.require("memprof")[0].version
+__version__ = pkg_resources.get_distribution("memprof").version | use recommended way of getting the version number
pkg_resources.require() should not be called directly, it raises
DistributionNotFound() for tornado and nose in Debian (and probably
others). get_distribution() gives you the version as well and is not
recommended against in the documentation, so use it. | jmdana_memprof | train | py |
353fa86d2091aba73a8544ea442f6e3f394ccb2d | diff --git a/src/Fraction.php b/src/Fraction.php
index <HASH>..<HASH> 100644
--- a/src/Fraction.php
+++ b/src/Fraction.php
@@ -178,13 +178,6 @@ class Fraction
$a = $this->numerator;
$b = $this->denominator;
- // if the gmp_gcd function is available, use it, on the assumption
- // that it will perform better
- // http://php.net/manual/en/function.gmp-gcd.php
- if (function_exists('gmp_gcd')) {
- return gmp_gcd($a, $b);
- }
-
// ensure no negative values
$a = abs($a);
$b = abs($b); | Commenting then removing call to gmp_gcd
Commenting out call to gmp_gcd
Removing call to gmp_gcd | phospr_fraction | train | php |
fd92ff7dc3975223c6af024511400f84365730db | diff --git a/lib/shikashi/sandbox.rb b/lib/shikashi/sandbox.rb
index <HASH>..<HASH> 100644
--- a/lib/shikashi/sandbox.rb
+++ b/lib/shikashi/sandbox.rb
@@ -40,6 +40,7 @@ module Shikashi
@privileges.allow_method :eval
@privileges.allow_exceptions
+ @privileges.object(SecurityError).allow :new
end
def self.generate_id | added default permissions to call SecurityError.new for internal security errors | tario_shikashi | train | rb |
c1be150dbf6bf1844f6de59caef2c89cac5993e0 | diff --git a/ns/namespace.go b/ns/namespace.go
index <HASH>..<HASH> 100644
--- a/ns/namespace.go
+++ b/ns/namespace.go
@@ -112,8 +112,10 @@ func GetValueByNamespace(object interface{}, ns []string) interface{} {
fmt.Printf("Could not find tag for field{%s}\n", field)
return nil
}
+
// remove omitempty from tag
- tag = strings.Replace(tag, ",omitempty", "", -1)
+ tag = strings.Split(tag, ",")[0]
+
if tag == current {
val, err := reflections.GetField(object, field)
if err != nil { | Remove everything after the first coma from the tag
Currently we it's removing ",omitempty" only. I found out that some people use ", omitempty" so the tag wouldn't match the `current` element. | intelsdi-x_snap-plugin-utilities | train | go |
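The Go fix keeps only the text before the first comma of a struct tag, so both `,omitempty` and `, omitempty` are stripped. The same idea in Python:

```python
def tag_name(tag):
    """Everything after the first comma is an option flag
    (e.g. ",omitempty" or ", omitempty"), so keep only the name."""
    return tag.split(",")[0]
```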
dd372beafac75fd92d1d601bad3a6a5b98bff61c | diff --git a/src/Config.php b/src/Config.php
index <HASH>..<HASH> 100644
--- a/src/Config.php
+++ b/src/Config.php
@@ -442,7 +442,7 @@ class Config
// when adding relations, make sure they're added by their slug. Not their 'name' or 'singular name'.
if (!empty($contentType['relations']) && is_array($contentType['relations'])) {
- foreach ($contentType['relations'] as $relkey => $relation) {
+ foreach (array_keys($contentType['relations']) as $relkey) {
if ($relkey != Slugify::create()->slugify($relkey)) {
$contentType['relations'][Slugify::create()->slugify($relkey)] = $contentType['relations'][$relkey];
unset($contentType['relations'][$relkey]); | Only use the array keys if that's all we need | bolt_bolt | train | php |
e419cc0852dd46704326b281c01788b568861fac | diff --git a/worker/uniter/jujuc/action-get.go b/worker/uniter/jujuc/action-get.go
index <HASH>..<HASH> 100644
--- a/worker/uniter/jujuc/action-get.go
+++ b/worker/uniter/jujuc/action-get.go
@@ -1,4 +1,4 @@
-// Copyright 2012, 2013 Canonical Ltd.
+// Copyright 2014 Canonical Ltd.
// Licensed under the AGPLv3, see LICENCE file for details.
package jujuc
diff --git a/worker/uniter/jujuc/action-get_test.go b/worker/uniter/jujuc/action-get_test.go
index <HASH>..<HASH> 100644
--- a/worker/uniter/jujuc/action-get_test.go
+++ b/worker/uniter/jujuc/action-get_test.go
@@ -1,4 +1,4 @@
-// Copyright 2012, 2013 Canonical Ltd.
+// Copyright 2014 Canonical Ltd.
// Licensed under the AGPLv3, see LICENCE file for details.
package jujuc_test | Corrected copyright date on action-get files | juju_juju | train | go,go |
61658d1ad33736c972529c74a3bb53f5ed4ef02a | diff --git a/pgjdbc/src/main/java/org/postgresql/jdbc/PgConnection.java b/pgjdbc/src/main/java/org/postgresql/jdbc/PgConnection.java
index <HASH>..<HASH> 100644
--- a/pgjdbc/src/main/java/org/postgresql/jdbc/PgConnection.java
+++ b/pgjdbc/src/main/java/org/postgresql/jdbc/PgConnection.java
@@ -733,6 +733,7 @@ public class PgConnection implements BaseConnection {
/**
* <B>Note:</B> even though {@code Statement} is automatically closed when it is garbage
* collected, it is better to close it explicitly to lower resource consumption.
+ * The spec says that calling close on a closed connection is a no-op.
*
* {@inheritDoc}
*/
@@ -743,6 +744,9 @@ public class PgConnection implements BaseConnection {
// When that happens the connection is still registered in the finalizer queue, so it gets finalized
return;
}
+ if (queryExecutor.isClosed()) {
+ return;
+ }
releaseTimer();
queryExecutor.close();
openStackTrace = null; | fix Issue #<I>. The spec says that calling close() on a closed connection is a noop. (#<I>)
* fix Issue #<I>. The spec says that calling close() on a closed connection is a noop. | pgjdbc_pgjdbc | train | java |
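The JDBC spec behaviour the pgjdbc commit implements — `close()` on an already-closed connection is a no-op — is a general idempotent-close pattern. A minimal Python sketch:

```python
class Connection:
    """Toy connection whose close() is safe to call repeatedly."""

    def __init__(self):
        self.closed = False
        self.close_count = 0   # how many times resources were released

    def close(self):
        if self.closed:        # already closed: do nothing
            return
        self.closed = True
        self.close_count += 1  # release resources exactly once
```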
f1de196740d00803594a51f1bfac9ec6a91e73c5 | diff --git a/salt/states/cron.py b/salt/states/cron.py
index <HASH>..<HASH> 100644
--- a/salt/states/cron.py
+++ b/salt/states/cron.py
@@ -150,6 +150,13 @@ from salt.modules.cron import (
)
+def __virtual__():
+ if 'cron.list_tab' in __salt__:
+ return True
+ else:
+ return (False, 'cron module could not be loaded')
+
+
def _check_cron(user,
cmd,
minute=None, | Add virtual func for cron state module
This prevents a traceback when you attempt to use a cron state without
cron being installed. | saltstack_salt | train | py |
a8949ed70875a7e38ea570308adf9b29ea132660 | diff --git a/spec/spec_helper.rb b/spec/spec_helper.rb
index <HASH>..<HASH> 100644
--- a/spec/spec_helper.rb
+++ b/spec/spec_helper.rb
@@ -1,6 +1,6 @@
-# encoding: utf-8
+# frozen_string_literal: true
-if RUBY_VERSION > '1.9' and (ENV['COVERAGE'] || ENV['TRAVIS'])
+if ENV['COVERAGE'] || ENV['TRAVIS']
require 'simplecov'
require 'coveralls' | Change to stop detecting ruby version | piotrmurach_tty-platform | train | rb |
86fa8572ce47e5aad595a8e0e883316421b28930 | diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100644
--- a/setup.py
+++ b/setup.py
@@ -15,8 +15,7 @@ setup(name='panphon',
'numpy',
'editdistance',
'munkres'],
- scripts=['panphon/bin/apply_diacritics.py',
- 'panphon/bin/validate_ipa.py',
+ scripts=['panphon/bin/validate_ipa.py',
'panphon/bin/align_wordlists.py',
'panphon/bin/generate_ipa_all.py'],
packages=['panphon'], | Removed superfluous file names from script section of setup.py | dmort27_panphon | train | py |
637e6dcb2e3c85b1bf91f9367ae98e20e4dd4c26 | diff --git a/bitcoin/core.py b/bitcoin/core.py
index <HASH>..<HASH> 100644
--- a/bitcoin/core.py
+++ b/bitcoin/core.py
@@ -41,7 +41,7 @@ class Output(SerializableMixin):
def get_script_class(cls):
return getattr(cls, 'script_class', Script)
- def serialize(self, type_, version, chain, height):
+ def serialize(self):
parts = list()
parts.append(pack('<Q', self.amount))
script = self.contract.serialize() | Remove extra parameters for serialization which are not currently being used. | maaku_python-bitcoin | train | py |
bdda82edae69268670e2bd1e7ab6c7584e9da0eb | diff --git a/ykman/otp.py b/ykman/otp.py
index <HASH>..<HASH> 100644
--- a/ykman/otp.py
+++ b/ykman/otp.py
@@ -75,6 +75,7 @@ class PrepareUploadError(Enum):
# Defined here
CONNECTION_FAILED = 'Failed to open HTTPS connection'
NOT_FOUND = 'Upload request not recognized by server'
+ SERVICE_UNAVAILABLE = 'Service temporarily unavailable, please try again later' # noqa: E501
# Defined in upload project
PRIVATE_ID_INVALID_LENGTH = 'Private ID must be 12 characters long.'
@@ -206,6 +207,10 @@ class OtpController(object):
if resp.status == 404:
raise PrepareUploadFailed(
resp.status, resp_body, [PrepareUploadError.NOT_FOUND])
+ elif resp.status == 503:
+ raise PrepareUploadFailed(
+ resp.status, resp_body,
+ [PrepareUploadError.SERVICE_UNAVAILABLE])
else:
try:
errors = json.loads(resp_body.decode('utf-8')).get('errors') | Handle status <I> from upload | Yubico_yubikey-manager | train | py |
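The yubikey-manager change maps HTTP 503 to a dedicated error before falling back to parsing the response body. A condensed sketch of that status dispatch (messages copied from the diff):

```python
def classify_status(status):
    """Map well-known upload failure statuses to their messages;
    other statuses fall through to body parsing (returns None)."""
    if status == 404:
        return 'Upload request not recognized by server'
    if status == 503:
        return 'Service temporarily unavailable, please try again later'
    return None
```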
5528a2672c8332966b22c24f06b5a5c5c66423f5 | diff --git a/lib/configParser.js b/lib/configParser.js
index <HASH>..<HASH> 100644
--- a/lib/configParser.js
+++ b/lib/configParser.js
@@ -32,7 +32,7 @@ var ConfigParser = {
}).sort(function(a, b) {
return parseFloat(a) - parseFloat(b);
});
- if (verStr == 'current') {
+ if (verStr == 'current' || verStr == 'latest') {
return filteredBrowsers[filteredBrowsers.length - 1];
}
else if (verStr == 'previous') {
@@ -82,7 +82,7 @@ var ConfigParser = {
browserObject.browser = browserData[0];
}
if (browserData[lindex] && browserData[lindex].indexOf("+") == -1) {
- if (["current", "previous"].indexOf(browserData[lindex]) != -1) {
+ if (["current", "previous", "latest"].indexOf(browserData[lindex]) != -1) {
version = ConfigParser.setBrowserVersion(browserObject, browserData[lindex]);
}
else { | support 'latest' in short config | browserstack_browserstack-runner | train | js |
137b02b27f3998111240ec3abb79c4c83b4e0f4d | diff --git a/nosqlite.py b/nosqlite.py
index <HASH>..<HASH> 100644
--- a/nosqlite.py
+++ b/nosqlite.py
@@ -240,7 +240,6 @@ class Collection(object):
to 'baz' and either the 'foo' key is an even number between 0 and 10 or is an odd number
greater than 10.
"""
-
matches = [] # A list of booleans
reapply = lambda q: self._apply_query(q, document)
@@ -270,8 +269,12 @@ class Collection(object):
if '.' in field:
nodes = field.split('.')
document_section = document
- for path in nodes[:-1]:
- document_section = document_section.get(path, None)
+
+ try:
+ for path in nodes[:-1]:
+ document_section = document_section.get(path, None)
+ except AttributeError:
+ document_section = None
if document_section is None:
matches.append(False) | handle error if requested path does not exist | shaunduncan_nosqlite | train | py |
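The nosqlite fix guards the dotted-path walk so a non-dict intermediate value no longer raises. Extracted as a standalone helper:

```python
def walk_path(document, field):
    """Follow a dotted path through nested dicts; return None when a
    segment is missing or the current value is not a dict (the
    AttributeError case the commit above catches)."""
    section = document
    for part in field.split('.'):
        try:
            section = section.get(part)
        except AttributeError:   # e.g. reached an int or a string
            return None
        if section is None:
            return None
    return section
```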
2475c6944f9820cb1c9d2149ba06b2d7be7c6c45 | diff --git a/engine/node/oo-node-impl/src/main/java/com/hp/oo/engine/node/services/WorkerNodeServiceImpl.java b/engine/node/oo-node-impl/src/main/java/com/hp/oo/engine/node/services/WorkerNodeServiceImpl.java
index <HASH>..<HASH> 100644
--- a/engine/node/oo-node-impl/src/main/java/com/hp/oo/engine/node/services/WorkerNodeServiceImpl.java
+++ b/engine/node/oo-node-impl/src/main/java/com/hp/oo/engine/node/services/WorkerNodeServiceImpl.java
@@ -297,6 +297,7 @@ public final class WorkerNodeServiceImpl implements WorkerNodeService, UserDetai
@Override
@Transactional
+ @Secured("topologyManage")
public void addGroupToWorker(String workerUuid, String group) {
WorkerNode worker = readByUUID(workerUuid);
List<String> groups = new ArrayList<>(worker.getGroups()); | Make addGroupToWorker require topologyManage permission. Fixes bug #<I> | CloudSlang_score | train | java |
b2d2ca69063507b82b8cff13b5bd62593c16f9c9 | diff --git a/opal/browser/dom/node.rb b/opal/browser/dom/node.rb
index <HASH>..<HASH> 100644
--- a/opal/browser/dom/node.rb
+++ b/opal/browser/dom/node.rb
@@ -59,7 +59,7 @@ class Node
if native?(node)
`#@native.appendChild(node)`
elsif node.respond_to? :each
- node.each { |n| add_child(n) }
+ node.each { |n| self << n }
elsif String === node
`#@native.appendChild(#@native.ownerDocument.createTextNode(node))`
else
@@ -110,7 +110,7 @@ class Node
#
# @param node [Node] the node to append to
def append_to(node)
- node.add_child(self)
+ node << self
end
# Get an array of ancestors. | dom/node: use #<< internally instead of #add_child | opal_opal-browser | train | rb |
3bc49c53d01e337df1e9d70d53816480ce806300 | diff --git a/pycoin/ecdsa/Curve.py b/pycoin/ecdsa/Curve.py
index <HASH>..<HASH> 100644
--- a/pycoin/ecdsa/Curve.py
+++ b/pycoin/ecdsa/Curve.py
@@ -81,10 +81,11 @@ class Curve(object):
result = p
while i > 1:
result += result
- if (e3 & i) != 0 and (e & i) == 0:
- result = result + p
- if (e3 & i) == 0 and (e & i) != 0:
- result = result - p
+ if (e3 & i):
+ v = [result, result+p]
+ else:
+ v = [result-p, result]
+ result = v[0 if (e & i) else 1]
i >>= 1
return result | Reduce timing attack surface on point multiplication. | richardkiss_pycoin | train | py |
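The pycoin change replaces two data-dependent `if` bodies with code that always builds both candidate results and selects one by index, so both paths do similar work. The selection pattern in isolation — a sketch of the shape of the loop body, not a constant-time guarantee (Python itself offers none):

```python
def balanced_select(take_second, first, second):
    """Compute both candidates up front and pick by index, rather than
    executing different amounts of work per branch; this reduces, but
    does not eliminate, timing differences."""
    v = [first, second]
    return v[1 if take_second else 0]
```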
9dedd55048eaac1af357e8c21ec0ad5f30a90c90 | diff --git a/src/ServiceLocator.php b/src/ServiceLocator.php
index <HASH>..<HASH> 100644
--- a/src/ServiceLocator.php
+++ b/src/ServiceLocator.php
@@ -173,6 +173,11 @@ class ServiceLocator implements ServiceLocatorInterface
return $this->getService('honeybee.infrastructure.event_bus');
}
+ public function getProcessManager()
+ {
+ $this->getService('honeybee.infrastructure.process_manager');
+ }
+
public function getJobService()
{
return $this->getService('honeybee.infrastructure.job_service');
diff --git a/src/ServiceLocatorInterface.php b/src/ServiceLocatorInterface.php
index <HASH>..<HASH> 100644
--- a/src/ServiceLocatorInterface.php
+++ b/src/ServiceLocatorInterface.php
@@ -23,6 +23,7 @@ interface ServiceLocatorInterface
public function getCommandBus();
public function getEventBus();
+ public function getProcessManager();
public function getConnectorService();
public function getDataAccessService(); | added getProcessManager to ServiceLocatorInterface | honeybee_honeybee | train | php,php |
aa640fc9df9f92a6bb2fc73d9937f1935388d639 | diff --git a/juju/bundle.py b/juju/bundle.py
index <HASH>..<HASH> 100644
--- a/juju/bundle.py
+++ b/juju/bundle.py
@@ -584,11 +584,26 @@ class AddApplicationChange(ChangeInfo):
options["trust"] = "true"
url = URL.parse(str(charm))
+
+ # set the channel to the default value if not specified
+ if not self.channel:
+ if Schema.CHARM_STORE.matches(url.schema):
+ self.channel = "stable"
+ elif Schema.CHARM_HUB.matches(url.schema):
+ self.channel = "latest/stable"
+ else: # for local charms
+ self.channel = ""
+
channel = None
+ non_normalized_channel = None
if self.channel is not None and self.channel != "":
- channel = Channel.parse(self.channel).normalize()
+ non_normalized_channel = Channel.parse(self.channel)
+ channel = non_normalized_channel.normalize()
- origin = context.origins.get(str(url), {}).get(str(channel), None)
+ origin = context.origins.get(str(url), {}).get(
+ str(channel),
+ context.origins.get(str(url), {}).get(str(non_normalized_channel), None),
+ )
if origin is None:
raise JujuError("expected origin to be valid for application {} and charm {} with channel {}".format(self.application, str(url), str(channel))) | Fixed the bundle run when the channel is None;
Also added a second retrial on the obtention of the bundle's origin (in
order to use both the normalized and non normalized channel -> this
because some charms cause a failure on obtaining the origin with one
type of channel, and others with the other one). | juju_python-libjuju | train | py |
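The python-libjuju fix does two things: pick a per-schema default channel when none is given, and look the origin up under both the normalized and non-normalized channel strings. The default-channel half as a sketch (the schema names here are assumptions for illustration):

```python
def default_channel(schema, channel=None):
    """Fall back to a schema-specific default when no channel is set."""
    if channel:
        return channel
    if schema == 'charm-store':
        return 'stable'
    if schema == 'charm-hub':
        return 'latest/stable'
    return ''   # local charms carry no channel
```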
5e4e900d41852d0d6e62d4747b45d6cbd09b6abf | diff --git a/salt/pillar.py b/salt/pillar.py
index <HASH>..<HASH> 100644
--- a/salt/pillar.py
+++ b/salt/pillar.py
@@ -16,12 +16,13 @@ class Pillar(object):
'''
Read over the pillar top files and render the pillar data
'''
- def __init__(self, opts, grains):
+ def __init__(self, opts, grains, id_):
# use the local file client
self.opts = copy.deepcopy(opts)
self.opts['file_roots'] = self.opts['pillar_roots']
self.opts['file_client'] = 'local'
self.opts['grains'] = grains
+ self.opts['id'] = id_
self.client = salt.fileclient.get_file_client(self.opts)
self.matcher = salt.minion.Matcher(self.opts)
self.rend = salt.loader.render(opts, {}) | Add id to pillar object | saltstack_salt | train | py |
f31bf39932e2ccb630cd2e0323170fddbbb167e8 | diff --git a/libraries/lithium/tests/cases/util/reflection/InspectorTest.php b/libraries/lithium/tests/cases/util/reflection/InspectorTest.php
index <HASH>..<HASH> 100644
--- a/libraries/lithium/tests/cases/util/reflection/InspectorTest.php
+++ b/libraries/lithium/tests/cases/util/reflection/InspectorTest.php
@@ -150,7 +150,7 @@ class InspectorTest extends \lithium\test\Unit {
$this->assertNull(Inspector::info('\lithium\util'));
$result = Inspector::info('\lithium\util\reflection\Inspector');
- $this->assertTrue(strpos($result['file'], 'lithium/util/reflection/Inspector.php'));
+ $this->assertTrue(strpos(str_replace('\\', '/', $result['file']), 'lithium/util/reflection/Inspector.php'));
$this->assertEqual('lithium\util\reflection', $result['namespace']);
$this->assertEqual('Inspector', $result['shortName']); | Fixing one test in testIdentifierIntrospection() so it will work on Windows. | UnionOfRAD_framework | train | php |
bc7c6ff8e4fbd56b203eb6e5b6747ced5e9febd1 | diff --git a/src/sap.ui.documentation/src/sap/ui/documentation/sdk/controller/Sitemap.controller.js b/src/sap.ui.documentation/src/sap/ui/documentation/sdk/controller/Sitemap.controller.js
index <HASH>..<HASH> 100644
--- a/src/sap.ui.documentation/src/sap/ui/documentation/sdk/controller/Sitemap.controller.js
+++ b/src/sap.ui.documentation/src/sap/ui/documentation/sdk/controller/Sitemap.controller.js
@@ -101,7 +101,7 @@ sap.ui.define([
oFormattedNode = parsers[sType].formatNode(oNode);
if (oFormattedNode.hidden !== true) {
- newNodes.push();
+ newNodes.push(oFormattedNode);
}
} | [INTERNAL] DemoKit (Sitemap): Push formatted node to the array if not hidden
Problem: A node passed through the hidden condition was not pushed to the result array.
Caused by: I4d<I>cf<I>d<I>ba4e3e8a<I>dcdf<I>d8e2f
BCP: n/a
Change-Id: I<I>d9d<I>afd2b8f3ea<I>fe<I>de | SAP_openui5 | train | js |
8e0b6bd893240444c1c23d36af1c238330fff3c4 | diff --git a/tornado_pyuv/__init__.py b/tornado_pyuv/__init__.py
index <HASH>..<HASH> 100644
--- a/tornado_pyuv/__init__.py
+++ b/tornado_pyuv/__init__.py
@@ -64,7 +64,6 @@ class UVLoop(IOLoop):
if fd in self._handlers:
raise IOError("fd %d already registered" % fd)
poll = pyuv.Poll(self._loop, fd)
- poll.fd = fd
poll.handler = stack_context.wrap(handler)
self._handlers[fd] = poll
poll_events = 0
@@ -200,7 +199,7 @@ class UVLoop(IOLoop):
events |= IOLoop.READ
if poll_events & pyuv.UV_WRITABLE:
events |= IOLoop.WRITE
- fd = handle.fd
+ fd = handle.fileno()
try:
self._handlers[fd].handler(fd, events)
except (OSError, IOError) as e: | USe Poll.fileno() since it's now available | saghul_tornaduv | train | py |
36fec694b55cd3d2869f88de0deb83c656ab8202 | diff --git a/helpers/FileHelper.php b/helpers/FileHelper.php
index <HASH>..<HASH> 100644
--- a/helpers/FileHelper.php
+++ b/helpers/FileHelper.php
@@ -22,6 +22,7 @@ class FileHelper extends Helper{
} elseif (function_exists('mime_content_type')){
}
+ return true;
}
public function upload($name, $location){ | Check if is image returns true for now until is written | mpf-soft_mpf | train | php |
8f49fab86b48c73719b1eaeececdebc8d727a4af | diff --git a/salt/states/host.py b/salt/states/host.py
index <HASH>..<HASH> 100644
--- a/salt/states/host.py
+++ b/salt/states/host.py
@@ -2,13 +2,25 @@
Management of addresses and names in hosts file.
================================================
-The hosts file can be managed to contain definitions for specific hosts:
+The /etc/hosts file can be managed to contain definitions for specific hosts:
.. code-block:: yaml
salt-master:
host.present:
- ip: 192.168.0.42
+
+Or using the "names:" directive, you can put several names for the same IP. (Do not try one name with space-seperated values).
+.. code-block:: yaml
+
+ server1:
+ host.present:
+ - ip: 192.168.0.42
+ - names:
+ - server1
+ - florida
+
+NOTE: changing the IP or name(s) in the present() function does not cause an update to remove the old entry.
''' | Add example and warning for multiple names for host. Also specify path. And re: changing entries. | saltstack_salt | train | py
1f9b135f75bb6325a48df94b9763f7ba0322b778 | diff --git a/src/PhpImap/Imap.php b/src/PhpImap/Imap.php
index <HASH>..<HASH> 100644
--- a/src/PhpImap/Imap.php
+++ b/src/PhpImap/Imap.php
@@ -581,7 +581,7 @@ final class Imap
return [];
}
- throw new UnexpectedValueException('Call to imap_getmailboxes() with supplied arguments returned false, not array!', 0, self::HandleErrors(imap_errors(), 'imap_headers'));
+ throw new UnexpectedValueException('Call to imap_getmailboxes() with supplied arguments returned false, not array!', 0, self::HandleErrors(imap_errors(), 'imap_getmailboxes'));
}
/** @psalm-var list<object> */
@@ -618,7 +618,7 @@ final class Imap
);
if (false === $result) {
- throw new UnexpectedValueException('Call to imap_getsubscribed() with supplied arguments returned false, not array!', 0, self::HandleErrors(imap_errors(), 'imap_headers'));
+ throw new UnexpectedValueException('Call to imap_getsubscribed() with supplied arguments returned false, not array!', 0, self::HandleErrors(imap_errors(), 'imap_getsubscribed'));
}
/** @psalm-var list<object> */ | correcting function names
incorrect function names were copy-pasted in when fleshing out
implementation. | barbushin_php-imap | train | php |
54b45a31d1d75831f9e4a947df2aa4ec37d46efe | diff --git a/runcommands/runners/remote.py b/runcommands/runners/remote.py
index <HASH>..<HASH> 100644
--- a/runcommands/runners/remote.py
+++ b/runcommands/runners/remote.py
@@ -1,5 +1,4 @@
import atexit
-import getpass
import sys
import time
from functools import partial
@@ -29,7 +28,6 @@ class RemoteRunner(Runner):
raise RunValueError('Only one of `sudo` or `run_as` may be specified')
use_pty = self.use_pty(use_pty)
- user = user or getpass.getuser()
path = self.munge_path(path, prepend_path, append_path, '$PATH')
remote_command = self.get_command(cmd, user, cd, path, sudo, run_as) | In RemoteRunner.run() don't set user to getpass.getuser()
Paramiko's `SSHClient.connect()` will do this if no user is passed in. | wylee_runcommands | train | py |
48c2319328e00cb57e0b02660a8cd3a008493f11 | diff --git a/src/main/java/org/primefaces/extensions/util/ExtLangUtils.java b/src/main/java/org/primefaces/extensions/util/ExtLangUtils.java
index <HASH>..<HASH> 100644
--- a/src/main/java/org/primefaces/extensions/util/ExtLangUtils.java
+++ b/src/main/java/org/primefaces/extensions/util/ExtLangUtils.java
@@ -124,7 +124,7 @@ public class ExtLangUtils {
}
public static String replace(final String text, final String searchString, final String replacement) {
- if (LangUtils.isValueEmpty(text) || LangUtils.isValueEmpty(searchString) || replacement == null) {
+ if (LangUtils.isValueEmpty(text) || searchString == null || replacement == null) {
return text;
}
int max = INDEX_NOT_FOUND; | Fixed Sonar issue around null check | primefaces-extensions_core | train | java |
bc0f65e6667bb6181d0a0dc682c3dde6ad641282 | diff --git a/src/widget.treeselector.js b/src/widget.treeselector.js
index <HASH>..<HASH> 100644
--- a/src/widget.treeselector.js
+++ b/src/widget.treeselector.js
@@ -14,7 +14,7 @@ RDFauthor.registerWidget({
RDFauthor.loadStylesheet(RDFAUTHOR_BASE + 'src/widget.treeselector.css');
- RDFauthor.loadScript(RDFAUTHOR_BASE + 'libraries/jquery.jstree.js', function(){
+ RDFauthor.loadScript(RDFAUTHOR_BASE + 'libraries/jstree/jquery.jstree.js', function(){
self._jsTreeLoaded = true;
self._init();
});
@@ -45,7 +45,7 @@ RDFauthor.registerWidget({
markup: function () {
var markup =
'<div class="container" style="width:100%">\
- <input type="text" style="width:100%;" class="text image-icon treeselector" name="treeselector" id="treeselector-edit-' + this.ID + '" value="'
+ <input type="text" style="width:100%;" class="text treeselector" name="treeselector" id="treeselector-edit-' + this.ID + '" value="'
+ (this.statement.hasObject() ? this.statement.objectValue() : '') + '" />\
</div>'; | remove class attribute, update library path of jstree | AKSW_RDFauthor | train | js |
8563c7a25f5e322f1d098244e2d9e5b0eebf9a67 | diff --git a/cloudflare.go b/cloudflare.go
index <HASH>..<HASH> 100644
--- a/cloudflare.go
+++ b/cloudflare.go
@@ -251,6 +251,10 @@ func (api *API) makeRequestWithAuthTypeAndHeaders(ctx context.Context, method, u
// retry if the server is rate limiting us or if it failed
// assumes server operations are rolled back on failure
if respErr != nil || resp.StatusCode == http.StatusTooManyRequests || resp.StatusCode >= 500 {
+ if resp.StatusCode == http.StatusTooManyRequests {
+ respErr = errors.New("exceeded available rate limit retries")
+ }
+
// if we got a valid http response, try to read body so we can reuse the connection
// see https://golang.org/pkg/net/http/#Client.Do
if respErr == nil { | cloudflare: update rate limit exceeded error message
Updates the messaging when the rate limit retry policy has been
exhausted to convey the actual failure instead of the generic `could not
read response body` failure. | cloudflare_cloudflare-go | train | go |
bffc6362ee9ccdf51c0bfbe876cef5715b013da2 | diff --git a/src/animanager/commands/anime/bump.py b/src/animanager/commands/anime/bump.py
index <HASH>..<HASH> 100644
--- a/src/animanager/commands/anime/bump.py
+++ b/src/animanager/commands/anime/bump.py
@@ -89,16 +89,18 @@ def bump(db, id):
# Calculate what needs updating, putting it into a dictionary that we will
# update at the end all at once.
update_map = dict()
- # First we update the episode count.
+
+ # We set the starting date if we haven't watched anything yet.
+ if watched == 0:
+ update_map['date_started'] = date.today().isoformat()
+
+ # We update the episode count.
watched += 1
update_map['ep_watched'] = watched
# If the status wasn't watching, we set it so.
- # Additionally if it was plan to watch, we set the starting date.
if status != 'watching':
update_map['status'] = 'watching'
- if status == 'plan to watch':
- update_map['date_started'] = date.today().isoformat()
# Finally, if the series is now complete, we set the status and finish date
# accordingly. | Set start date on first episode
We set the start date when we watch the first episode. Doing it on the
transition away from 'plan to watch' skips the case when adding a show
as watching. I don't think there's a case where watching the first
episode shouldn't set the start date, and the same for the inverse. | darkfeline_animanager | train | py |
fc9cb126e5ec434ad5e9536aa75d86eed845e676 | diff --git a/cmd/helm/install.go b/cmd/helm/install.go
index <HASH>..<HASH> 100644
--- a/cmd/helm/install.go
+++ b/cmd/helm/install.go
@@ -346,7 +346,9 @@ func locateChartPath(name, version string, verify bool, keyring string) (string,
if err != nil {
return filename, err
}
- fmt.Printf("Fetched %s to %s\n", name, filename)
+ if flagDebug {
+ fmt.Printf("Fetched %s to %s\n", name, filename)
+ }
return lname, nil
} else if flagDebug {
return filename, err | fix(helm): suprress info message for 'helm inspect'
There was an informational message being printed that is unnecessary,
but prevented shell scripting the results of inspect calls.
Closes #<I> | helm_helm | train | go |
f4ec173864b148ea35b5f2443a8cea02294c9363 | diff --git a/anndata/compat.py b/anndata/compat.py
index <HASH>..<HASH> 100644
--- a/anndata/compat.py
+++ b/anndata/compat.py
@@ -89,9 +89,11 @@ def _to_fixed_length_strings(value: np.ndarray) -> np.ndarray:
https://github.com/zarr-developers/zarr-python/pull/422
"""
new_dtype = []
- for dt_name, (dt_type, _) in value.dtype.fields.items():
- if dt_type.str[1] in ("U", "O"):
- new_dtype.append((dt_name, "U200"))
+ for dt_name, (dt_type, dt_offset) in value.dtype.fields.items():
+ if dt_type.kind == "O":
+ # Asumming the objects are str
+ size = max(len(x.encode()) for x in value.getfield("O", dt_offset))
+ new_dtype.append((dt_name, f"U{size}"))
else:
new_dtype.append((dt_name, dt_type))
return value.astype(new_dtype) | Allow any length of string to be written in a structured array to zarr | theislab_anndata | train | py |
e4426c7946189aa41f0c99d37bf843799fb00c33 | diff --git a/batchpath.py b/batchpath.py
index <HASH>..<HASH> 100644
--- a/batchpath.py
+++ b/batchpath.py
@@ -38,8 +38,7 @@ class GeneratePaths:
minsize=self.minsize):
yield os.path.abspath(path)
elif os.path.isdir(path):
- # pylint: disable=W0612
- for root, dnames, fnames in self._walker(path):
+ for root, _, fnames in self._walker(path):
yield from self._generator_rebase(fnames, root)
def _generator_other(self): | Unpack to throwaway var and remove unnecessary pylint override | brbsix_python-batchpath | train | py |
045a4c5ceecd6efc726094f6856b65f840b97eb7 | diff --git a/greenlight.js b/greenlight.js
index <HASH>..<HASH> 100644
--- a/greenlight.js
+++ b/greenlight.js
@@ -11,16 +11,16 @@ module.exports = function(fn) {
var fiber = Fiber(function() {
try {
fn(flow);
-
- //Prevent memory leak
- fn = null;
- fiber = null;
} catch(e) {
// throw in next tick so the context matches again if yielded.
process.nextTick(function() {
throw e;
});
- }
+ } finally {
+ //Prevent memory leak
+ fn = null;
+ fiber = null;
+ }
});
var parallelCount = 0;
@@ -112,10 +112,8 @@ module.exports = function(fn) {
returnValue = {};
return toReturn;
- } else {
- return;
}
};
fiber.run();
-}
+}; | Moving memory leak prevention into finally block to make sure it gets executed, even if an exception is thrown. | scriby_asyncblock-generators | train | js |
bcc56025be85e06c4b7ccb44d37132b4d185d52f | diff --git a/lib/capybara.rb b/lib/capybara.rb
index <HASH>..<HASH> 100644
--- a/lib/capybara.rb
+++ b/lib/capybara.rb
@@ -113,7 +113,7 @@ module Capybara
##
#
- # Modify a selector previously craeated by {Capybara.add_selector}.
+ # Modify a selector previously created by {Capybara.add_selector}.
# For example modifying the :button selector to also find divs styled
# to look like buttons might look like this
# | lib/capybara: Fix spelling of *created* | teamcapybara_capybara | train | rb |
5ac40841ac93c3d85813964830b7adeba7c0423d | diff --git a/src/Saft/Store/Adapter/Http.php b/src/Saft/Store/Adapter/Http.php
index <HASH>..<HASH> 100644
--- a/src/Saft/Store/Adapter/Http.php
+++ b/src/Saft/Store/Adapter/Http.php
@@ -341,7 +341,7 @@ class Http extends AbstractAdapter
*/
public function getTripleCount($graphUri)
{
- if (true === \Saft\Uri::check($graphUri)) {
+ if (true === \Saft\Rdf\NamedNode::check($graphUri)) {
// TODO simplify that mess!
$client = new \Saft\Sparql\Client(); | Replace Saft\Uri::check with Saft\Rdf\NamedNode::check | SaftIng_Saft | train | php |
f3dce0522f5bc89bfa96b9fb84526dc045b16ecf | diff --git a/glue/ligolw/ligolw.py b/glue/ligolw/ligolw.py
index <HASH>..<HASH> 100644
--- a/glue/ligolw/ligolw.py
+++ b/glue/ligolw/ligolw.py
@@ -84,7 +84,7 @@ class Element(object):
if not result:
a = self.childNodes[:]
a.sort()
- b = self.childNodes[:]
+ b = other.childNodes[:]
b.sort()
result = cmp(a, b)
return result | Fix compare() method: child nodes not being tested correctly. | gwastro_pycbc-glue | train | py |
4689dbd99b40176ffca2b6308d687204f1a3b556 | diff --git a/lib/validator.js b/lib/validator.js
index <HASH>..<HASH> 100644
--- a/lib/validator.js
+++ b/lib/validator.js
@@ -21,6 +21,9 @@ var validators = {
},
isAlpha: function(str) {
return str.match(/^[a-zA-Z]+$/);
+ },
+ isAlphanumeric: function(str) {
+ return str.match(/^[a-zA-Z0-9]+$/);
}
};
@@ -103,7 +106,7 @@ Validator.prototype.isAlpha = function() {
}
Validator.prototype.isAlphanumeric = function() {
- if (!this.str.match(/^[a-zA-Z0-9]+$/)) {
+ if (!validators.isAlphanumeric(this.str)) {
return this.error(this.msg || 'Invalid characters');
}
return this; | refactored isAlphanumeric | chriso_validator.js | train | js |
44c240ff00252e436246984152db0647d7ac087e | diff --git a/casper.js b/casper.js
index <HASH>..<HASH> 100644
--- a/casper.js
+++ b/casper.js
@@ -818,17 +818,26 @@
};
/**
- * Retrieves string contents from a binary file behind an url.
+ * Retrieves string contents from a binary file behind an url. Silently
+ * fails but log errors.
*
* @param String url
* @return string
*/
this.getBinary = function(url) {
- var xhr = new XMLHttpRequest();
- xhr.open("GET", url, false);
- xhr.overrideMimeType("text/plain; charset=x-user-defined");
- xhr.send(null);
- return xhr.responseText;
+ try {
+ var xhr = new XMLHttpRequest();
+ xhr.open("GET", url, false);
+ xhr.overrideMimeType("text/plain; charset=x-user-defined");
+ xhr.send(null);
+ return xhr.responseText;
+ } catch (e) {
+ if (e.name === "NETWORK_ERR" && e.code === 101) {
+ console.log('unfortunately, casperjs cannot make cross domain ajax requests');
+ }
+ console.log('error while fetching ' + url + ': ' + e);
+ return "";
+ }
};
/** | better handling of ajax retrieval failures (more informative) | casperjs_casperjs | train | js |
b0410f51335e3edd967836fcaff9ca6c33c10c5a | diff --git a/shared/actions/unlock-folders.js b/shared/actions/unlock-folders.js
index <HASH>..<HASH> 100644
--- a/shared/actions/unlock-folders.js
+++ b/shared/actions/unlock-folders.js
@@ -50,6 +50,7 @@ function waiting (currentlyWaiting: boolean): Waiting {
export function checkPaperKey (paperKey: HiddenString): TypedAsyncAction<CheckPaperKey | Waiting> {
return dispatch => {
// TODO figure out what service request to ask for
+ // TODO Use the waiting ability of the engine instead of manually dispatching
dispatch(waiting(true))
setTimeout(() => { | Leave todo for doing dispatching when switching to engine | keybase_client | train | js |
5c8cd07f88b1e6825dd88f5191f9dfee22ce51e7 | diff --git a/cf/help/help.go b/cf/help/help.go
index <HASH>..<HASH> 100644
--- a/cf/help/help.go
+++ b/cf/help/help.go
@@ -155,6 +155,7 @@ func newAppPresenter() (presenter appPresenter) {
presentNonCodegangstaCommand("enable-ssh"),
presentNonCodegangstaCommand("disable-ssh"),
presentNonCodegangstaCommand("ssh-enabled"),
+ presentNonCodegangstaCommand("ssh"),
},
},
}, { | add command ssh to cf help | cloudfoundry_cli | train | go |
9e767474c2fe04c8b9e54fc8dbca8a40434ccf6b | diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100644
--- a/setup.py
+++ b/setup.py
@@ -6,7 +6,7 @@ README = open(os.path.join(here, 'README.rst')).read()
NEWS = open(os.path.join(here, 'NEWS.txt')).read()
-version = '0.3.1'
+version = '0.4'
install_requires = [
"Sphinx", | Beginning work on Hieroglyph <I>. | nyergler_hieroglyph | train | py |
6ccdfda2253d4cf55dad1ce2c6d4ee4d557ed625 | diff --git a/doc/conf.py b/doc/conf.py
index <HASH>..<HASH> 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -13,6 +13,9 @@
from __future__ import absolute_import, division, print_function
+from future import standard_library
+standard_library.install_aliases() # noqa: E402
+
import os
import sys
diff --git a/doc/exts/docscrape.py b/doc/exts/docscrape.py
index <HASH>..<HASH> 100644
--- a/doc/exts/docscrape.py
+++ b/doc/exts/docscrape.py
@@ -2,12 +2,15 @@
"""
from __future__ import absolute_import, division, print_function
+from future import standard_library
+standard_library.install_aliases() # noqa: E402
import inspect
import pydoc
import re
import textwrap
-from StringIO import StringIO
+
+from io import StringIO
from warnings import warn
class Reader(object):
diff --git a/doc/exts/docscrape_sphinx.py b/doc/exts/docscrape_sphinx.py
index <HASH>..<HASH> 100644
--- a/doc/exts/docscrape_sphinx.py
+++ b/doc/exts/docscrape_sphinx.py
@@ -1,4 +1,6 @@
from __future__ import absolute_import, division, print_function
+from future import standard_library
+standard_library.install_aliases() # noqa: E402
import inspect
import pydoc | Added support for docs generation on PY2 and PY3 | ska-sa_katcp-python | train | py,py,py |
e24d67884a6fc495806d55fa7e5942e5eb34b92f | diff --git a/lib/discordrb/gateway.rb b/lib/discordrb/gateway.rb
index <HASH>..<HASH> 100644
--- a/lib/discordrb/gateway.rb
+++ b/lib/discordrb/gateway.rb
@@ -182,7 +182,7 @@ module Discordrb
# outside of testing and implementing highly custom reconnect logic.
# @param url [String, nil] the URL to connect to or nil if one should be obtained from Discord.
def inject_reconnect(url)
- websocket_message({
+ handle_message({
op: Opcodes::RECONNECT,
d: {
url: url | Change inject_reconnect to use handle_message
websocket_message doesn't exist in Gateway. | meew0_discordrb | train | rb |
0b63a88fc9f0b5d6ff79b8d784dd2837f2fb5f50 | diff --git a/webpack.config.js b/webpack.config.js
index <HASH>..<HASH> 100644
--- a/webpack.config.js
+++ b/webpack.config.js
@@ -91,6 +91,11 @@ if ( CALYPSO_ENV === 'desktop' || CALYPSO_ENV === 'desktop-mac-app-store' ) {
} else {
webpackConfig.entry.vendor = [ 'react', 'store', 'page', 'wpcom', 'jed', 'debug' ];
webpackConfig.plugins.push( new webpack.optimize.CommonsChunkPlugin( 'vendor', '[name].[hash].js' ) );
+ webpackConfig.plugins.push( new webpack.optimize.CommonsChunkPlugin( {
+ children: true,
+ minChunks: 5,
+ name: 'build-' + CALYPSO_ENV
+ } ) );
webpackConfig.plugins.push( new ChunkFileNamePlugin() );
// jquery is only needed in the build for the desktop app
// see electron bug: https://github.com/atom/electron/issues/254 | Webpack: Move common modules in children into build | Automattic_wp-calypso | train | js |
98c1a6f37335a37760fe6850c2056552c082c547 | diff --git a/updateengine/client.go b/updateengine/client.go
index <HASH>..<HASH> 100644
--- a/updateengine/client.go
+++ b/updateengine/client.go
@@ -72,7 +72,8 @@ func (c *Client) RebootNeededSignal(rcvr chan Status) {
}
func (c *Client) GetStatus() (result Status, err error) {
- call := c.object.Call(dbusInterface + ".GetStatus", 0)
+ call := c.object.Call(dbusInterface+".GetStatus", 0)
+ err = call.Err
if err != nil {
return
} | fix(updateengine): set error appropriately from dbus call | coreos_locksmith | train | go |
8020c651600dd40677198519ae29946d540ea15f | diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100755
--- a/setup.py
+++ b/setup.py
@@ -48,7 +48,7 @@ except (IOError, ImportError):
long_description = f.read()
-version = '0.1.5'
+version = '0.1.6'
class TestCommand(Command): | Update version number to <I> | chaoss_grimoirelab-perceval-puppet | train | py |
615792d182089fbb4784e65286552004c3d4f3be | diff --git a/encoding/codec.go b/encoding/codec.go
index <HASH>..<HASH> 100644
--- a/encoding/codec.go
+++ b/encoding/codec.go
@@ -10,10 +10,10 @@ type Codec interface {
Marshal(interface{}) ([]byte, error)
Decode(io.Reader, interface{}) error
Unmarshal([]byte, interface{}) error
- Pool([]byte)
+ Pool(...[]byte)
}
-var JSON = Json{}
+var JSON Codec = Json{}
type Json struct{}
@@ -33,4 +33,4 @@ func (Json) Unmarshal(data []byte, v interface{}) error {
return json.Unmarshal(data, v)
}
-func (Json) Pool([]byte) {}
+func (Json) Pool(...[]byte) {}
diff --git a/time2/time.go b/time2/time.go
index <HASH>..<HASH> 100644
--- a/time2/time.go
+++ b/time2/time.go
@@ -28,6 +28,14 @@ func Now() time.Time {
return now.In(Location)
}
+func After(d time.Duration) time.Time {
+ return Now().Add(d)
+}
+
+func Since(t time.Time) time.Duration {
+ return Now().Sub(t)
+}
+
func CurrDate() string {
return Date(Now())
} | encoding: fixed json codec; time2: add After, Since | cosiner_gohper | train | go,go |
d63f6ccd1abf8354577c110af2b1dd56b2dcef33 | diff --git a/index.js b/index.js
index <HASH>..<HASH> 100644
--- a/index.js
+++ b/index.js
@@ -73,7 +73,6 @@ function showError(errorCode, res) {
fs.createReadStream(__dirname + '/errors/' + errorCode + '.html').pipe(res);
}
-
module.exports.createGlance = function (options) {
return new Glance(options);
};
@@ -87,10 +86,9 @@ c
.option('-v, --verbose', 'log connections to console | default off')
.parse(process.argv);
-var glance = new Glance({
+new Glance({
port: c.port,
dir: c.dir,
verbose: c.verbose
-});
-glance.start();
+}).start(); | a bit more succinct | jarofghosts_glance | train | js |
a93ada0a1e3d6dfd4dcedacf5b7f3d6276deb41d | diff --git a/go/bind/keybase.go b/go/bind/keybase.go
index <HASH>..<HASH> 100644
--- a/go/bind/keybase.go
+++ b/go/bind/keybase.go
@@ -160,9 +160,11 @@ type serviceCn struct {
}
func (s serviceCn) NewKeybaseService(config libkbfs.Config, params libkbfs.InitParams, ctx libkbfs.Context, log logger.Logger) (libkbfs.KeybaseService, error) {
- keybaseService := libkbfs.NewKeybaseDaemonRPC(config, ctx, log, true, nil)
+ keybaseService := libkbfs.NewKeybaseDaemonRPC(
+ config, ctx, log, true, nil, nil)
keybaseService.AddProtocols([]rpc.Protocol{
keybase1.FsProtocol(fsrpc.NewFS(config, log)),
+ // TODO: add git protocol if mobile ever needs it.
})
return keybaseService, nil
} | bind: update `NewKeybaseDaemonRPC` call
Changed by keybase/kbfs#<I>
Issue: KBFS-<I> | keybase_client | train | go |
d35de2272d31f0a8dcfb49c2ef3d54dc95b38a0b | diff --git a/python/ray/tests/test_ray_init.py b/python/ray/tests/test_ray_init.py
index <HASH>..<HASH> 100644
--- a/python/ray/tests/test_ray_init.py
+++ b/python/ray/tests/test_ray_init.py
@@ -41,15 +41,20 @@ class TestRedisPassword:
# We catch a generic Exception here in case someone later changes the
# type of the exception.
except Exception as ex:
- if not isinstance(ex.__cause__, redis.AuthenticationError):
+ if not (isinstance(ex.__cause__, redis.AuthenticationError)
+ and "invalid password" in str(ex.__cause__)) and not (
+ isinstance(ex, redis.ResponseError) and
+ "WRONGPASS invalid username-password pair" in str(ex)):
raise
# By contrast, we may be fairly confident the exact string
# 'invalid password' won't go away, because redis-py simply wraps
# the exact error from the Redis library.
# https://github.com/andymccurdy/redis-py/blob/master/
# redis/connection.py#L132
- if "invalid password" not in str(ex.__cause__):
- raise
+ # Except, apparently sometimes redis-py raises a completely
+ # different *type* of error for a bad password,
+ # redis.ResponseError, which is not even derived from
+ # redis.ConnectionError as redis.AuthenticationError is.
# Check that we can connect to Redis using the provided password
redis_client = redis.StrictRedis( | [Core] Allow redis.ResponseError instead of redis.AuthenticationError (#<I>)
* redis.ResponseError
* there really is no way to make this look good, is there | ray-project_ray | train | py |
48f55b4f3138b570e45aa5f7874df26de32aa658 | diff --git a/lib/stores/ldap/save.js b/lib/stores/ldap/save.js
index <HASH>..<HASH> 100644
--- a/lib/stores/ldap/save.js
+++ b/lib/stores/ldap/save.js
@@ -87,6 +87,9 @@ exports.record = {
move_from = self.getChanges()[name][0];
}else{
if((value === null || value === '') && self.changes[name]){ //remove only if the previous value was valid
+
+ tmp[name] = self.changes[name][0]; //delete the old value...
+
values.push(new ldap.Change({
operation: 'delete',
modification: tmp | ldap delete bugfix | PhilWaldmann_openrecord | train | js |
589d3b2449d62b247dc1d3f870a253e933febf5a | diff --git a/pypeerassets/provider/blockbook.py b/pypeerassets/provider/blockbook.py
index <HASH>..<HASH> 100644
--- a/pypeerassets/provider/blockbook.py
+++ b/pypeerassets/provider/blockbook.py
@@ -9,7 +9,11 @@ from btcpy.structs.transaction import ScriptSig, Sequence, TxIn
from pypeerassets.exceptions import InsufficientFunds, UnsupportedNetwork
from pypeerassets.provider.common import Provider
+'''
+TODO:
+Everything except getrawtransaction, getblockhash, getblock
+'''
class Blockbook(Provider):
'''API wrapper for https://blockbook.peercoin.net blockexplorer.'''
@@ -73,12 +77,12 @@ class Blockbook(Provider):
def getblockhash(self, index: int) -> str:
'''Returns the hash of the block at ; index 0 is the genesis block.'''
- return cast(str, self.api_fetch('getblockhash?index=' + str(index)))
+ return cast(str, self.api_fetch('/block-index/' + str(index)))
def getblock(self, hash: str) -> dict:
'''Returns information about the block with the given hash.'''
- return cast(dict, self.api_fetch('getblock?hash=' + hash))
+ return cast(dict, self.api_fetch('/block/' + hash))
def getrawtransaction(self, txid: str, decrypt: int=0) -> dict:
'''Returns raw transaction representation for given transaction id. | added getblockhash, getblock | PeerAssets_pypeerassets | train | py |
7938e590b2e65593a616dafa838032b0e759a05f | diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100644
--- a/setup.py
+++ b/setup.py
@@ -14,7 +14,7 @@ setup(
author = u'Juan Pedro Fisanotti',
author_email = 'fisadev@gmail.com',
url='',
- packages=['simeple_ai'],
+ packages=['simple_ai'],
classifiers = [
'Intended Audience :: Developers',
        'License :: OSI Approved :: MIT License', | Fixed second typo on the same word -_- | simpleai-team_simpleai | train | py
615b46d70cce267ce024cec00d23911dfcde428e | diff --git a/lib/util/player_configuration.js b/lib/util/player_configuration.js
index <HASH>..<HASH> 100644
--- a/lib/util/player_configuration.js
+++ b/lib/util/player_configuration.js
@@ -44,11 +44,7 @@ shaka.util.PlayerConfiguration = class {
// Some browsers implement the Network Information API, which allows
// retrieving information about a user's network connection.
- //
- // We are excluding connection.type == undefined to avoid getting bogus data
- // on platforms where the implementation is incomplete. Currently, desktop
- // Chrome 64 returns connection type undefined and a bogus downlink value.
- if (navigator.connection && navigator.connection.type) {
+ if (navigator.connection) {
// If it's available, get the bandwidth estimate from the browser (in
// megabits per second) and use it as defaultBandwidthEstimate.
bandwidthEstimate = navigator.connection.downlink * 1e6; | Fix initial bandwidth estimate on Chrome
The Network Information API gives us an initial bandwidth estimate on
Chrome and Opera (and others, some day). But the API changed slightly
in a way that broke our detection of it. This restores the
functionality so that our initial bandwidth estimate is closer to what
we will converge on during playback.
Change-Id: Iaff<I>c<I>e6fcad1f<I>d<I>d<I>a<I>f<I>cd7 | google_shaka-player | train | js |
d2bda0757e28ff30c98602ddfc1e7617a08e39cd | diff --git a/samples/es6sample.js b/samples/es6sample.js
index <HASH>..<HASH> 100644
--- a/samples/es6sample.js
+++ b/samples/es6sample.js
@@ -4,6 +4,9 @@ import http from "speedboat/http";
export let options = {
vus: 5,
+ thresholds: {
+ my_rate: ["avg>=0.4"],
+ }
};
let mCounter = new Counter("my_counter"); | [docs] Thresholds in es6sample.js | loadimpact_k6 | train | js |
12e76af7c0ca85f3dcdaf2e79e5f08c2e8d4daf9 | diff --git a/lib/puppet/pops/types/type_calculator.rb b/lib/puppet/pops/types/type_calculator.rb
index <HASH>..<HASH> 100644
--- a/lib/puppet/pops/types/type_calculator.rb
+++ b/lib/puppet/pops/types/type_calculator.rb
@@ -168,7 +168,15 @@ class Puppet::Pops::Types::TypeCalculator
type
end
+ # @api private
+ def infer_Class(o)
+ type = Types::PRubyType.new()
+ type.ruby_class = o.name
+ type
+ end
+
# The type of all types is PType
+ # @api private
#
def infer_PObjectType(o)
Types::PType.new() | (#<I>) Add infer_Class to remove requirement to pass an instance.
Without this it is more difficult to create a type for a class. | puppetlabs_puppet | train | rb |
a23590b2555d77f4e46b33f3b1427103b1c12dd4 | diff --git a/src/engine/processors/consult.js b/src/engine/processors/consult.js
index <HASH>..<HASH> 100644
--- a/src/engine/processors/consult.js
+++ b/src/engine/processors/consult.js
@@ -86,10 +86,10 @@ const handleConsultEntry = function handleConsultEntry(
// work path from the current working directory given
filepath = path.resolve(workingDirectory, filepath);
}
- if (theta.Id === undefined || !(theta.Id instanceof Functor)) {
+ if (theta.Id === undefined || !(theta.Id instanceof Functor || theta.Id instanceof Value)) {
promise = consultFile(filepath);
} else {
- promise = consultFile(filepath, theta.Id.evaluate());
+ promise = consultFile(filepath, theta.Id);
}
return promise
@@ -123,7 +123,7 @@ function consultProcessor(engine, targetProgram) {
let processProgramWithId = function processProgramWithId(program, id) {
let treeMap = new LiteralTreeMap();
let theta = {
- Id: new Value(id)
+ Id: id
};
treeMap.add(processIdLiteral.substitute(theta)); | fix agent consult ID value being converted to Value | lps-js_lps.js | train | js |
e21ffa2e53728dcac86d9f37245ee0b1c29359b4 | diff --git a/betfairlightweight/resources/streamingresources.py b/betfairlightweight/resources/streamingresources.py
index <HASH>..<HASH> 100644
--- a/betfairlightweight/resources/streamingresources.py
+++ b/betfairlightweight/resources/streamingresources.py
@@ -335,9 +335,9 @@ class CricketInningsStats:
self.innings_num = kwargs.get("inningsNum")
self.batting_team = kwargs.get("battingTeam")
self.bowling_team = kwargs.get("bowlingTeam")
- self.inningsRuns = kwargs.get("inningsRuns")
- self.inningsOvers = kwargs.get("inningsOvers")
- self.inningsWickets = kwargs.get("inningsWickets")
+ self.innings_runs = kwargs.get("inningsRuns")
+ self.innings_overs = kwargs.get("inningsOvers")
+ self.innings_wickets = kwargs.get("inningsWickets")
class CricketBattingTeamStats: | Fix CricketInningsStats field names | liampauling_betfair | train | py |
d1060593dff89ad72abbdddf409f4c4d94252a0b | diff --git a/cocaine/detail/api.py b/cocaine/detail/api.py
index <HASH>..<HASH> 100644
--- a/cocaine/detail/api.py
+++ b/cocaine/detail/api.py
@@ -26,7 +26,7 @@ class API(object):
2: ['refresh', {}, {0: ['value', {}], 1: ['error', {}]}],
3: ['cluster', {}, {0: ['value', {}], 1: ['error', {}]}],
4: ['publish', {0: ['discard', {}]}, {0: ['value', {}], 1: ['error', {}]}],
- 5: ['routing', {}, {0: ['write', None], 1: ['error', {}], 2: ['close', {}]}]}
+ 5: ['routing', {0: ['discard', {}]}, {0: ['write', None], 1: ['error', {}], 2: ['close', {}]}]}
Logger = {0: ['emit', {}, {}],
1: ['verbosity', {}, {0: ['value', {}], 1: ['error', {}]}], | [Locator] Sync API description with new routing dispatch | cocaine_cocaine-framework-python | train | py |
5db47d88276dff163d6c1ac03835d4ead7d4e32b | diff --git a/sphinxcontrib/needs/utils.py b/sphinxcontrib/needs/utils.py
index <HASH>..<HASH> 100644
--- a/sphinxcontrib/needs/utils.py
+++ b/sphinxcontrib/needs/utils.py
@@ -108,7 +108,7 @@ def process_dynamic_values(app, doctree, fromdocname):
class NeedsList:
JSON_KEY_EXCLUSIONS = {'links_back', 'type_color', 'hide_status',
'target_node', 'hide', 'type_prefix', 'lineno',
- 'docname', 'type', 'collapse',
+ 'type', 'collapse',
'type_style', 'hide_tags', 'content'}
def __init__(self, config, outdir, confdir): | Activated docname in need-builder output. fixes #<I> | useblocks_sphinxcontrib-needs | train | py |
186d1e452ac0f79afd88a425554a0f503df574a7 | diff --git a/test/integration.js b/test/integration.js
index <HASH>..<HASH> 100644
--- a/test/integration.js
+++ b/test/integration.js
@@ -223,9 +223,7 @@ test('output logs of deployment', async t => {
reject: false
})
- t.true(stdout.includes('npm install'))
- t.true(stdout.includes('Installing'))
- t.true(stdout.includes('Installed'))
+ t.true(stdout.includes('yarn install'))
t.true(stdout.includes('Snapshotting deployment'))
t.true(stdout.includes('Saving deployment image'))
t.true(stdout.includes('npm start')) | Fixed integration tests (#<I>) | zeit_now-cli | train | js |
16435134497ef0cb2298546e2f8389fad2e2f9ee | diff --git a/source/rafcon/statemachine/states/hierarchy_state.py b/source/rafcon/statemachine/states/hierarchy_state.py
index <HASH>..<HASH> 100644
--- a/source/rafcon/statemachine/states/hierarchy_state.py
+++ b/source/rafcon/statemachine/states/hierarchy_state.py
@@ -216,7 +216,6 @@ class HierarchyState(ContainerState):
if transition is None:
transition = self.handle_no_transition(self.child_state)
- self.child_state.state_execution_status = StateExecutionState.WAIT_FOR_NEXT_STATE
# if the transition is still None, then the child_state was preempted or aborted, in this case
# return
if transition is None: | resolves #<I>
remove duplicate execution status update | DLR-RM_RAFCON | train | py |
ad2c24682aea3d53fab26e387dda2f97a747d6b2 | diff --git a/src/Tappable.js b/src/Tappable.js
index <HASH>..<HASH> 100644
--- a/src/Tappable.js
+++ b/src/Tappable.js
@@ -292,9 +292,6 @@ var Mixin = {
var movement = this.calculateMovement(this._lastTouch);
if (movement.x <= this.props.moveThreshold && movement.y <= this.props.moveThreshold && this.props.onTap) {
event.preventDefault();
- event.preventDefault = function(){};
- // calling preventDefault twice throws an error. This will fix that.
- event.persist();
afterEndTouch = () => {
var finalParentScrollPos = this._scrollParents.map(node => node.scrollTop + node.scrollLeft);
var stoppedMomentumScroll = this._scrollParentPos.some((end, i) => { | Remove bad workarounds to solve the React event pooling problem on tap
The solution is to fire the tap event always synchronously (already fixed).
So the React event pooling can behave as usual.
Removed the attempt to assign an empty function to preventDefault, because it does not make sense to assign a function to a SyntheticEvent that has been put back in the pool. It should be the role of React to be defensive against using preventDefault on events that are already put back in the pool. (See <URL>) | JedWatson_react-tappable | train | js
65d519551c7c94a91c7256f3fd484e006b9a455e | diff --git a/lib/hanami/utils/inflector.rb b/lib/hanami/utils/inflector.rb
index <HASH>..<HASH> 100644
--- a/lib/hanami/utils/inflector.rb
+++ b/lib/hanami/utils/inflector.rb
@@ -271,7 +271,8 @@ module Hanami
'areas' => 'area',
'hives' => 'hive',
'phases' => 'phase',
- 'exercises' => 'exercise'
+ 'exercises' => 'exercise',
+ 'releases' => 'release'
)
# Block for custom inflection rules.
diff --git a/spec/support/fixtures/fixtures.rb b/spec/support/fixtures/fixtures.rb
index <HASH>..<HASH> 100644
--- a/spec/support/fixtures/fixtures.rb
+++ b/spec/support/fixtures/fixtures.rb
@@ -610,7 +610,8 @@ TEST_SINGULARS = {
'album' => 'albums',
# https://github.com/hanami/utils/issues/217
'phase' => 'phases',
- 'exercise' => 'exercises'
+ 'exercise' => 'exercises',
+ 'release' => 'releases'
}.merge(TEST_PLURALS)
require 'hanami/utils/inflector' | inflector: Fix singular of "releases" (#<I>) | hanami_utils | train | rb,rb |
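The diff above adds "releases" to an irregular-singulars lookup table so the general pluralization rules never see it. A minimal Python sketch of the same table-with-fallback approach — the fallback rule here is illustrative, not Hanami's actual rule set:

```python
# Irregular forms that a generic suffix rule would get wrong.
IRREGULAR_SINGULARS = {
    "phases": "phase",
    "exercises": "exercise",
    "releases": "release",  # the entry added by this commit
}

def singularize(word):
    """Return the singular form, consulting the exception table first."""
    if word in IRREGULAR_SINGULARS:
        return IRREGULAR_SINGULARS[word]
    # Illustrative fallback: a naive "-es" rule would turn "releases"
    # into "releas" -- which is why the word needs a table entry.
    if word.endswith("es"):
        return word[:-2]
    if word.endswith("s"):
        return word[:-1]
    return word
```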
e3335574a43f853520ccd980706df0e4c9907c74 | diff --git a/lib/project/pro_motion/fragments/pm_screen_module.rb b/lib/project/pro_motion/fragments/pm_screen_module.rb
index <HASH>..<HASH> 100644
--- a/lib/project/pro_motion/fragments/pm_screen_module.rb
+++ b/lib/project/pro_motion/fragments/pm_screen_module.rb
@@ -237,6 +237,8 @@
case mode
when :adjust_resize
Android::View::WindowManager::LayoutParams::SOFT_INPUT_ADJUST_RESIZE
+ when :adjust_pan
+ Android::View::WindowManager::LayoutParams::SOFT_INPUT_ADJUST_PAN
end
activity.getWindow().setSoftInputMode(mode_const)
end | Keyboard pan mode to prevent ui control squishing. | infinitered_bluepotion | train | rb |
e18695de17186c8a25f80c16700c8e226ed4830a | diff --git a/src/simg.js b/src/simg.js
index <HASH>..<HASH> 100644
--- a/src/simg.js
+++ b/src/simg.js
@@ -63,7 +63,7 @@
}
// Make the new img's source an SVG image.
- img.setAttribute('src', 'data:image/svg+xml;base64,'+ btoa(str));
+ img.setAttribute('src', 'data:image/svg+xml;base64,'+ btoa(unescape(encodeURIComponent(str))));
},
// Returns callback to new img from SVG.
@@ -99,7 +99,7 @@
toBinaryBlob: function(cb){
this.toCanvas(function(canvas){
var dataUrl = canvas.toDataURL().replace(/^data:image\/(png|jpg);base64,/, "");
- var byteString = atob(dataUrl);
+ var byteString = atob(escape(dataUrl));
// write the bytes of the string to an ArrayBuffer
var ab = new ArrayBuffer(byteString.length);
var ia = new Uint8Array(ab); | Support for foreign characters from @clemsos (<URL>) | krunkosaurus_simg | train | js |
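The simg.js fix routes the SVG string through `encodeURIComponent`/`unescape` so that `btoa` (which only accepts Latin-1) sees the raw bytes of the UTF-8 encoding. The same idea in Python — purely illustrative, the actual fix is the JavaScript above — is to encode to UTF-8 bytes before base64:

```python
import base64

PREFIX = "data:image/svg+xml;base64,"

def svg_to_data_url(svg_markup):
    """Build a base64 data URL; encoding to UTF-8 first keeps non-ASCII safe."""
    raw = svg_markup.encode("utf-8")            # bytes, any character survives
    return PREFIX + base64.b64encode(raw).decode("ascii")

def data_url_to_svg(url):
    """Reverse the transform, for a round-trip check."""
    assert url.startswith(PREFIX)
    return base64.b64decode(url[len(PREFIX):]).decode("utf-8")
```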
6916bb24a24547b6cb6bd46d072f0eee22f019b3 | diff --git a/cmd/kubeadm/app/constants/constants.go b/cmd/kubeadm/app/constants/constants.go
index <HASH>..<HASH> 100644
--- a/cmd/kubeadm/app/constants/constants.go
+++ b/cmd/kubeadm/app/constants/constants.go
@@ -136,7 +136,7 @@ const (
// LabelNodeRoleExcludeBalancer specifies that the node should be
// exclude from load balancers created by a cloud provider.
- LabelNodeRoleExcludeBalancer = "node-role.kubernetes.io/exclude-balancer"
+ LabelNodeRoleExcludeBalancer = "alpha.node-role.kubernetes.io/exclude-balancer"
// MasterConfigurationConfigMap specifies in what ConfigMap in the kube-system namespace the `kubeadm init` configuration should be stored
MasterConfigurationConfigMap = "kubeadm-config" | Append an alpha label to the exclude load balancer annotation. | kubernetes_kubernetes | train | go |
dab5e21b10ee1b6768f93c56e8e58063679343d9 | diff --git a/commands/story/start/command.go b/commands/story/start/command.go
index <HASH>..<HASH> 100644
--- a/commands/story/start/command.go
+++ b/commands/story/start/command.go
@@ -125,7 +125,9 @@ StoryLoop:
fmt.Println()
// Create the story branch, optionally.
- if !flagNoBranch {
+ if flagNoBranch {
+ log.Log("Not creating any feature branch")
+ } else {
var action common.Action
action, err = createBranch()
if err != nil {
@@ -206,14 +208,11 @@ func createBranch() (common.Action, error) {
return nil, errs.NewError(task, err, nil)
}
- exitFunc := func() {
- fmt.Println("\nI will exit now. Just run me again later if you want.")
- os.Exit(0)
- }
-
sluggedLine := slug.Slug(line)
if sluggedLine == "" {
- exitFunc()
+ fmt.Println()
+ log.Log("Not creating any feature branch")
+ return nil, nil
}
branchName := "story/" + sluggedLine
@@ -223,7 +222,7 @@ func createBranch() (common.Action, error) {
return nil, errs.NewError(task, err, nil)
}
if !ok {
- exitFunc()
+ panic(prompt.ErrCanceled)
}
fmt.Println() | story start: Skip branch creation when name empty
Skip the branch creation step when the user inserts an empty branch
name. This is basically equivalent to running story start with
-no_branch.
Change-Id: f<I>
Story-Id: SF-<I> | salsaflow_salsaflow | train | go |
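The control flow in this commit hinges on the slug of the entered story title being empty: an empty slug now means "skip branch creation" instead of exiting. A rough Python sketch of that decision — the slug rule here is a guess at typical slug behavior, not SalsaFlow's exact implementation:

```python
import re

def slugify(text):
    """Lowercase, collapse runs of non-alphanumerics to '-', trim dashes."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def branch_name_for(title):
    """Return 'story/<slug>', or None to signal that no branch is wanted."""
    slug = slugify(title)
    if not slug:  # empty or punctuation-only input => skip branch creation
        return None
    return "story/" + slug
```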
8526cc7947e1caa81ff5f8d5bf788adf563c0cbf | diff --git a/encoder.go b/encoder.go
index <HASH>..<HASH> 100644
--- a/encoder.go
+++ b/encoder.go
@@ -25,6 +25,7 @@ import (
"encoding/base64"
"encoding/json"
"math"
+ "strings"
"sync"
"time"
"unicode/utf8"
@@ -351,6 +352,9 @@ func (enc *ltsvEncoder) closeOpenNamespaces() {
func (enc *ltsvEncoder) addKey(key string) {
enc.addElementSeparator()
if enc.nestedLevel == 0 {
+ if strings.ContainsRune(key, ':') {
+ panic("LTSV keys must not contain colon ':'")
+ }
enc.safeAddString(key)
enc.buf.AppendByte(':')
enc.justAfterKey = true | Panic if LTSV key contains a colon character | hnakamur_zap-ltsv | train | go |
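LTSV (Labeled Tab-Separated Values) separates each label from its value with ':' and each field with a tab, so a colon inside a key makes the record unparseable — hence the panic added above. A small Python sketch of the same guard; values may still contain colons, since parsers split on the first one:

```python
def ltsv_line(fields):
    """Serialize a dict of fields as one LTSV line."""
    parts = []
    for key, value in fields.items():
        if ":" in key:
            # Mirror the Go encoder's panic: a colon in a key would be
            # indistinguishable from the label/value boundary.
            raise ValueError("LTSV keys must not contain colon ':': %r" % key)
        parts.append("%s:%s" % (key, value))
    return "\t".join(parts)
```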
7990f40d665604b1bfc13cf84bc177aa2a36fc7d | diff --git a/platform/html5/html.js b/platform/html5/html.js
index <HASH>..<HASH> 100644
--- a/platform/html5/html.js
+++ b/platform/html5/html.js
@@ -517,6 +517,11 @@ exports.init = function(ctx) {
if (ctx.processKey(event))
event.preventDefault()
}) //fixme: add html.Document instead
+ win.on('orientationchange', function(event) {
+ log('orientation changed event')
+ ctx.system.screenWidth = window.screen.width
+ ctx.system.screenHeight = window.screen.height
+ })
ctx._styleClassifier = $manifest$cssAutoClassificator? new StyleClassifier(ctx._prefix): null; //broken beyond repair
} | update screenwidth/height on orientation changed event | pureqml_qmlcore | train | js |
053ab688f3008a39c88a6a449f27294683d233da | diff --git a/core/src/main/java/com/google/errorprone/bugpatterns/threadsafety/WellKnownMutability.java b/core/src/main/java/com/google/errorprone/bugpatterns/threadsafety/WellKnownMutability.java
index <HASH>..<HASH> 100644
--- a/core/src/main/java/com/google/errorprone/bugpatterns/threadsafety/WellKnownMutability.java
+++ b/core/src/main/java/com/google/errorprone/bugpatterns/threadsafety/WellKnownMutability.java
@@ -92,6 +92,7 @@ final class WellKnownMutability {
.add(java.math.BigDecimal.class)
.add(java.net.InetAddress.class)
.add(java.util.Locale.class)
+ .add("java.util.Optional", "T")
.add("org.joda.time.DateTime")
.add("org.joda.time.DateTimeZone")
.add("org.joda.time.Duration") | Add java.util.Optional as a well-known immutable container
Fixes #<I>
MOE_MIGRATED_REVID=<I> | google_error-prone | train | java |
d02a79bbbea9f8312624298c3a90df84e5e943a9 | diff --git a/modules/phenyl-interfaces/index.js b/modules/phenyl-interfaces/index.js
index <HASH>..<HASH> 100644
--- a/modules/phenyl-interfaces/index.js
+++ b/modules/phenyl-interfaces/index.js
@@ -155,11 +155,18 @@ import type {
UserDefinition,
UserDefinitions,
} from './decls/user-definition.js.flow'
-import type { WhereConditions } from './decls/where-conditions.js.flow'
+import type {
+ AndWhereConditions,
+ NorWhereConditions,
+ OrWhereConditions,
+ SimpleWhereConditions,
+ WhereConditions,
+} from './decls/where-conditions.js.flow'
export type {
AclHandler,
AddToSetOperator,
+ AndWhereConditions,
AuthenticationHandler,
AuthenticationResult,
AuthClient,
@@ -242,7 +249,9 @@ export type {
MultiInsertCommand,
MultiUpdateCommand,
NGAuthenticationResult,
+ NorWhereConditions,
OKAuthenticationResult,
+ OrWhereConditions,
PreSession,
PhenylRunner,
PopOperator,
@@ -265,6 +274,7 @@ export type {
SetOperator,
Session,
SessionClient,
+ SimpleWhereConditions,
SingleInsertCommand,
SingleQueryResult,
SingleQueryResultOrError, | [phenyl-interfaces] export WhereConditions subtypes | phenyl-js_phenyl | train | js |
c1cc93e33d9626c7415ea7fc0428682ee30fffc7 | diff --git a/libraries/joomla/form/form.php b/libraries/joomla/form/form.php
index <HASH>..<HASH> 100644
--- a/libraries/joomla/form/form.php
+++ b/libraries/joomla/form/form.php
@@ -1833,26 +1833,18 @@ class JForm
if ($required)
{
- // If the field is required and the value is empty return an error message.
+ // If the field is required and the value is empty return an error message.
if (($value === '') || ($value === null))
{
- // Does the field have a defined error message?
- if ($element['message'])
+ if ($element['label'])
{
- $message = JText::_($element['message']);
+ $message = JText::_($element['label']);
}
else
{
- if ($element['label'])
- {
- $message = JText::_($element['label']);
- }
- else
- {
- $message = JText::_($element['name']);
- }
- $message = JText::sprintf('JLIB_FORM_VALIDATE_FIELD_REQUIRED', $message);
+ $message = JText::_($element['name']);
}
+ $message = JText::sprintf('JLIB_FORM_VALIDATE_FIELD_REQUIRED', $message);
return new RuntimeException($message);
} | Correct confusion between required and validate messages in field | joomla_joomla-framework | train | php |
2060fa06983acbe6452656f268fd3389c215ec9f | diff --git a/src/Scale.js b/src/Scale.js
index <HASH>..<HASH> 100644
--- a/src/Scale.js
+++ b/src/Scale.js
@@ -68,6 +68,9 @@ function createScale(type, scheme, reverse) {
}
function configureDomain(scale, _) {
+ var raw = rawDomain(scale, _.domainRaw);
+ if (raw > -1) return raw;
+
var domain = _.domain,
zero = _.zero || (_.zero === undefined && INCLUDE_ZERO[scale.type]),
n;
@@ -89,6 +92,10 @@ function configureDomain(scale, _) {
return domain.length;
}
+function rawDomain(scale, raw) {
+ return raw ? (scale.domain(raw), raw.length) : -1;
+}
+
function configureRange(scale, _, count) {
var type = scale.type,
round = _.round || false, | Add domainRaw to Scale operator. | vega_vega-encode | train | js |
7f593b4fa3267be56cabe2e66bba71ace9e2a7a3 | diff --git a/AegeanTools/regions.py b/AegeanTools/regions.py
index <HASH>..<HASH> 100644
--- a/AegeanTools/regions.py
+++ b/AegeanTools/regions.py
@@ -178,7 +178,8 @@ class Region():
for d in xrange(1,self.maxdepth+1):
for p in self.pixeldict[d]:
line="fk5; polygon("
- vectors = zip(*hp.boundaries(2**d,p,step=1,nest=True))
+ #the following int() gets around some problems with np.int64 that exist prior to numpy v 1.8.1
+ vectors = zip(*hp.boundaries(2**d,int(p),step=1,nest=True))
positions=[]
for sky in self.vec2sky(np.array(vectors),degrees=True):
ra, dec = sky | fixed a bug for numpy v<<I> related to the continuing broken-ness of np.int<I> | PaulHancock_Aegean | train | py |
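The one-line fix wraps the pixel index in `int()` because some C-backed APIs (here healpy, on older numpy) reject numpy integer scalars even though they behave like ints. The defensive pattern, sketched without the numpy/healpy dependency using a stand-in strict callee and a hypothetical int subclass:

```python
def strict_boundaries(nside, pix):
    """Stand-in for a C-backed call that only accepts built-in ints."""
    if type(pix) is not int:
        raise TypeError("expected a plain int, got %s" % type(pix).__name__)
    return (nside, pix)

class FakeInt64(int):
    """Mimics np.int64 for this sketch: int-like, but not exactly `int`."""

def safe_boundaries(nside, pix):
    # Coerce up front so numpy-style integer scalars are accepted.
    return strict_boundaries(nside, int(pix))
```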
bed853f4c4b5795da01c5d7f57cf5a8a3f379ae9 | diff --git a/tower_cli/resources/setting.py b/tower_cli/resources/setting.py
index <HASH>..<HASH> 100644
--- a/tower_cli/resources/setting.py
+++ b/tower_cli/resources/setting.py
@@ -52,8 +52,8 @@ class Resource(models.Resource):
'results': [{'id': k, 'value': v} for k, v in result.items()]
}
- @resources.command(ignore_defaults=True)
- def get(self, pk, **kwargs):
+ @resources.command(use_fields_as_options=False)
+ def get(self, pk):
"""Return one and exactly one object"""
# The Tower API doesn't provide a mechanism for retrieving a single
# setting value at a time, so fetch them all and filter | correct help text for `tower-cli setting get` | ansible_tower-cli | train | py |
4525df759d248affea0e1bc0b6dc6b9684eb8491 | diff --git a/mesh_tensorflow/transformer/utils.py b/mesh_tensorflow/transformer/utils.py
index <HASH>..<HASH> 100644
--- a/mesh_tensorflow/transformer/utils.py
+++ b/mesh_tensorflow/transformer/utils.py
@@ -1184,8 +1184,9 @@ def eval_model(estimator, vocabulary, sequence_length, batch_size,
# Create list of postprocessed text targets
examples = [ex for ex in tfds.as_numpy(ds)]
targets = [
- eval_dataset.postprocess_fn(
- ex["targets_plaintext"], example=ex, is_target=True)
+ eval_dataset.postprocess_fn( # pylint:disable=g-complex-comprehension
+ tf.compat.as_string(ex["targets_plaintext"]),
+ example=ex, is_target=True)
for ex in examples
]
targets_filename = os.path.join(
@@ -1228,7 +1229,7 @@ def eval_model(estimator, vocabulary, sequence_length, batch_size,
examples = cached_examples[eval_dataset.name]
dataset_size = len(examples)
predictions = [
- eval_dataset.postprocess_fn(d, example=ex)
+ eval_dataset.postprocess_fn(tf.compat.as_string(d), example=ex)
for d, ex in zip(decodes[:dataset_size], examples)
]
# Remove the used decodes. | Convert Transformer targets to text before passing into postprocess function during evaluation.
PiperOrigin-RevId: <I> | tensorflow_mesh | train | py |
b12b1fc79d9dca7ab1c9c9412651562e1f4477aa | diff --git a/payu/models/accessom2.py b/payu/models/accessom2.py
index <HASH>..<HASH> 100644
--- a/payu/models/accessom2.py
+++ b/payu/models/accessom2.py
@@ -69,6 +69,20 @@ class AccessOm2(Model):
shutil.move(os.path.join(self.work_restart_path, f),
self.restart_path)
os.rmdir(self.work_restart_path)
+ else:
+ cice5 = None
+ mom = None
+ for model in self.expt.models:
+ if model.model_type == 'cice5':
+ cice5 = model
+ elif model.model_type == 'mom':
+ mom = model
+
+ # Copy restart from ocean into ice area.
+ if cice5 is not None and mom is not None:
+ o2i_src = os.path.join(mom.work_path, 'o2i.nc')
+ o2i_dst = os.path.join(cice5.restart_path, 'o2i.nc')
+ shutil.copy2(o2i_src, o2i_dst)
def collate(self):
pass | Copy o2i.nc from MOM work to CICE restart when doing archive. Closes #<I> | payu-org_payu | train | py |
6408ba426c60b84c8d60de428577e761589d2efb | diff --git a/src/core/core.scale.js b/src/core/core.scale.js
index <HASH>..<HASH> 100644
--- a/src/core/core.scale.js
+++ b/src/core/core.scale.js
@@ -692,7 +692,7 @@ module.exports = function(Chart) {
var label = itemToDraw.label;
if (helpers.isArray(label)) {
- for (var i = 0, y = 0; i < label.length; ++i) {
+ for (var i = 0, y = -(label.length - 1)*tickFontSize*0.75; i < label.length; ++i) {
// We just make sure the multiline element is a string here..
context.fillText('' + label[i], 0, y);
// apply same lineSpacing as calculated @ L#320 | Fix #<I> Multi-lines label alignment (#<I>) | chartjs_Chart.js | train | js |
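The fix starts multi-line tick labels at y = -(n-1)·fontSize·0.75 instead of 0, so the block of lines is vertically centered on the tick. With the 1.5·fontSize line spacing the scale code applies between lines, the offsets come out symmetric around zero — a quick check in Python (the spacing constant is taken from the diff's comment, not independently verified):

```python
def label_line_offsets(num_lines, font_size):
    """Y offset of each line of a multi-line tick label, centered on the tick."""
    start = -(num_lines - 1) * font_size * 0.75   # the starting offset from the fix
    spacing = font_size * 1.5                     # lineSpacing used by the scale
    return [start + i * spacing for i in range(num_lines)]
```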
4abfa7be167521f630ad50f45e4f23340a251020 | diff --git a/lib/assetScanner.js b/lib/assetScanner.js
index <HASH>..<HASH> 100644
--- a/lib/assetScanner.js
+++ b/lib/assetScanner.js
@@ -5,6 +5,7 @@ var EventEmitter = require("events").EventEmitter;
var AssetScanner = module.exports = function (options) {
this.directory = options.src;
+ this.detectChanges = options.detectChanges;
this.initialized = false;
};
@@ -45,5 +46,12 @@ AssetScanner.prototype.scanDirectoryItem = function (item, callback) {
};
AssetScanner.prototype.scanFile = function (item, callback) {
+ if (this.detectChanges) {
+ fs.watch(item, { persistent: false }, function (e, filename) {
+ this.onFile(item, function (err) {
+ if (err) throw err;
+ });
+ }.bind(this));
+ }
return this.onFile(item, callback);
};
\ No newline at end of file | Added file watching when detectChanges: true | js-kyle_connect-assets | train | js |
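`fs.watch` hooks OS-level change notifications to re-emit a changed asset. Python's standard library has no direct equivalent, but the same behavior can be approximated by polling modification times — a minimal sketch (polling rather than inotify, an assumption made for portability):

```python
import os

class MTimeWatcher:
    """Re-runs a callback for files whose mtime changed since the last check."""

    def __init__(self, on_change):
        self.on_change = on_change
        self._mtimes = {}

    def watch(self, path):
        """Start tracking a file, recording its current mtime."""
        self._mtimes[path] = os.path.getmtime(path)

    def poll(self):
        """Call once per tick (e.g. from a timer loop) to fire callbacks."""
        for path, last in list(self._mtimes.items()):
            current = os.path.getmtime(path)
            if current != last:
                self._mtimes[path] = current
                self.on_change(path)
```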
69b8c3d27438b8ee1b46db4e701f72e9275674b1 | diff --git a/bananas/admin/api/views.py b/bananas/admin/api/views.py
index <HASH>..<HASH> 100644
--- a/bananas/admin/api/views.py
+++ b/bananas/admin/api/views.py
@@ -6,6 +6,7 @@ from django.contrib.auth import (
from django.contrib.auth.forms import AuthenticationForm, PasswordChangeForm
from django.utils.translation import ugettext_lazy as _
from rest_framework import serializers, status, viewsets
+from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from .i18n import RawTranslationCatalog
@@ -120,6 +121,7 @@ class TranslationAPI(BananasAdminAPI):
name = _("Translation catalog")
basename = "i18n"
+ permission_classes = (AllowAny,)
class Admin:
exclude_tags = ["navigation"] | Allow anonymous access to the i<I>n endpoint | 5monkeys_django-bananas | train | py |