repo_name | path | language | license | size | score | prefix | middle | suffix
|---|---|---|---|---|---|---|---|---|
sandino/python-markdown-video | mdx_video.py | Python | lgpl-3.0 | 15,167 | 0.004417 | #!/usr/bin/env python
"""
Embeds web videos using URLs. For instance, if a URL to a YouTube video is
found in the text submitted to markdown and it isn't enclosed in parentheses
like a normal link in markdown, then the URL will be swapped with an embedded
YouTube video.
All resulting HTML is XHTML Strict compatible.
>>> import markdown
Test Metacafe
>>> s = "http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/"
>>> markdown.markdown(s, ['video'])
u'<p>\\n<iframe allowfullscreen="true" frameborder="0" height="423" src="http://www.metacafe.com/fplayer/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room.swf" width="498"></iframe>\\n</p>'
Test Metacafe with arguments
>>> markdown.markdown(s, ['video(metacafe_width=500,metacafe_height=425)'])
u'<p>\\n<iframe allowfullscreen="true" frameborder="0" height="425" src="http://www.metacafe.com/fplayer/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room.swf" width="500"></iframe>\\n</p>'
Test Link To Metacafe
>>> s = "[Metacafe link](http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/)"
>>> markdown.markdown(s, ['video'])
u'<p><a href="http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/">Metacafe link</a></p>'
Test Markdown Escaping
>>> s = "\\http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/"
>>> markdown.markdown(s, ['video'])
u'<p>http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/</p>'
>>> s = "`http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/`"
>>> markdown.markdown(s, ['video'])
u'<p><code>http://www.metacafe.com/watch/yt-tZMsrrQCnx8/pycon_2008_django_sprint_room/</code></p>'
Test Youtube
>>> s = "http://www.youtube.com/watch?v=u1mA-0w8XPo&hd=1&fs=1&feature=PlayList&p=34C6046F7FEACFD3&playnext=1&playnext_from=PL&index=1"
>>> markdown.markdown(s, ['video'])
u'<p>\\n<div class="embed-responsive embed-responsive-16by9">\\n<iframe allowfullscreen="true" class="embed-responsive-item" frameborder="0" height="480" src="http://www.youtube.com/embed/u1mA-0w8XPo&hd=1&fs=1&feature=PlayList&p=34C6046F7FEACFD3&playnext=1&playnext_from=PL&index=1" width="853"></iframe>\\n</div>\\n</p>'
Test Youtube with argument
>>> markdown.markdown(s, ['video(youtube_width=200,youtube_height=100)'])
u'<p>\\n<iframe allowfullscreen="true" frameborder="0" height="100" src="http://www.youtube.com/embed/u1mA-0w8XPo&hd=1&fs=1&feature=PlayList&p=34C6046F7FEACFD3&playnext=1&playnext_from=PL&index=1" width="200"></iframe>\\n</p>'
Test Youtube Link
>>> s = "[Youtube link](http://www.youtube.com/watch?v=u1mA-0w8XPo&feature=PlayList&p=34C6046F7FEACFD3&playnext=1&playnext_from=PL&index=1)"
>>> markdown.markdown(s, ['video'])
u'<p><a href="http://www.youtube.com/watch?v=u1mA-0w8XPo&feature=PlayList&p=34C6046F7FEACFD3&playnext=1&playnext_from=PL&index=1">Youtube link</a></p>'
Test Dailymotion
>>> s = "http://www.dailymotion.com/relevance/search/ut2004/video/x3kv65_ut2004-ownage_videogames"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.dailymotion.com/swf/x3kv65_ut2004-ownage_videogames" height="405" type="application/x-shockwave-flash" width="480"><param name="movie" value="http://www.dailymotion.com/swf/x3kv65_ut2004-ownage_videogames"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Dailymotion again (Dailymotion and their crazy URLs)
>>> s = "http://www.dailymotion.com/us/video/x8qak3_iron-man-vs-bruce-lee_fun"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.dailymotion.com/swf/x8qak3_iron-man-vs-bruce-lee_fun" height="405" type="application/x-shockwave-flash" width="480"><param name="movie" value="http://www.dailymotion.com/swf/x8qak3_iron-man-vs-bruce-lee_fun"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Yahoo! Video
>>> s = "http://video.yahoo.com/watch/1981791/4769603"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://d.yimg.com/static.video.yahoo.com/yep/YV_YEP.swf?ver=2.2.40" height="322" type="application/x-shockwave-flash" width="512"><param name="movie" value="http://d.yimg.com/static.video.yahoo.com/yep/YV_YEP.swf?ver=2.2.40"></param><param name="allowFullScreen" value="true"></param><param name="flashVars" value="id=4769603&vid=1981791"></param></object></p>'
Test Veoh Video
>>> s = "http://www.veoh.com/search/videos/q/mario#watch%3De129555XxCZanYD"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.veoh.com/videodetails2.swf?permalinkId=e129555XxCZanYD" height="341" type="application/x-shockwave-flash" width="410"><param name="movie" value="http://www.veoh.com/videodetails2.swf?permalinkId=e129555XxCZanYD"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Veoh Video Again (More fun URLs)
>>> s = "http://www.veoh.com/group/BigCatRescuers#watch%3Dv16771056hFtSBYEr"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.veoh.com/videodetails2.swf?permalinkId=v16771056hFtSBYEr" height="341" type="application/x-shockwave-flash" width="410"><param name="movie" value="http://www.veoh.com/videodetails2.swf?permalinkId=v16771056hFtSBYEr"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Veoh Video Yet Again (Even more fun URLs)
>>> s = "http://www.veoh.com/browse/videos/category/anime/watch/v181645607JyXPWcQ"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.veoh.com/videodetails2.swf?permalinkId=v181645607JyXPWcQ" height="341" type="application/x-shockwave-flash" width="410"><param name="movie" value="http://www.veoh.com/videodetails2.swf?permalinkId=v181645607JyXPWcQ"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Vimeo Video
>>> s = "http://www.vimeo.com/1496152"
>>> markdown.markdown(s, ['video'])
u'<p>\\n<iframe allowfullscreen="true" frameborder="0" height="480" src="http://vimeo.com/moogaloop.swf?clip_id=1496152&amp;server=vimeo.com" width="850"></iframe>\\n</p>'
Test Vimeo Video with some GET values
>>> s = "http://vimeo.com/1496152?test=test"
>>> markdown.markdown(s, ['video'])
u'<p>\\n<iframe allowfullscreen="true" frameborder="0" height="480" src="http://vimeo.com/moogaloop.swf?clip_id=1496152&amp;server=vimeo.com" width="850"></iframe>\\n</p>'
Test Blip.tv
>>> s = "http://blip.tv/file/get/Pycon-PlenarySprintIntro563.flv"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://blip.tv/scripts/flash/showplayer.swf?file=http://blip.tv/file/get/Pycon-PlenarySprintIntro563.flv" height="300" type="application/x-shockwave-flash" width="480"><param name="movie" value="http://blip.tv/scripts/flash/showplayer.swf?file=http://blip.tv/file/get/Pycon-PlenarySprintIntro563.flv"></param><param name="allowFullScreen" value="true"></param></object></p>'
Test Gametrailers
>>> s = "http://www.gametrailers.com/video/console-comparison-borderlands/58079"
>>> markdown.markdown(s, ['video'])
u'<p><object data="http://www.gametrailers.com/remote_wrap.php?mid=58079" height="392" type="application/x-shockwave-flash" width="480"><param name="movie" value="http://www.gametrailers.com/remote_wrap.php?mid=58079"></param><param name="allowFullScreen" value="true"></param></object></p>'
"""
import markdown
version = "0.1.6"
class VideoExtension(markdown.Extension):
def __init__(self, configs):
self.config = {
'bliptv_width': ['480', 'Width for Blip.tv videos'],
'bliptv_height': ['300', 'Height for Blip.tv videos'],
'dailymotion_width': ['480', 'Width for Dailymotion videos'],
'dailymotion_height': ['405', 'Height for Dailymotion videos'],
'gametrailers_width': ['480', 'Width for Gametrailers videos'],
'gametrailers_height': ['392', 'Height for Gametrailers videos'],
'metacafe_width': ['498', 'Width for Metacafe videos'],
'metacafe_height': ['423', 'Height for Metacafe videos'],
'veoh_width': ['410', 'Width for Veoh videos'],
'veoh_height': ['341', 'Height for Veoh videos'],
'vim
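The entry above is cut off before the extension class body, but the mechanism its doctests exercise — swapping a bare video URL for embed markup — can be sketched in isolation. This is an illustrative stand-in, not the extension's actual code: the function name and iframe attributes here are invented for the sketch.

```python
import re

# Match a bare YouTube watch URL and capture the video id; trailing query
# parameters are swallowed by \S* so they do not leak into the output.
YOUTUBE_RE = re.compile(r'https?://www\.youtube\.com/watch\?v=([\w-]+)\S*')

def embed_youtube(text, width=853, height=480):
    """Replace bare YouTube URLs in `text` with an embed iframe."""
    def repl(match):
        return ('<iframe allowfullscreen="true" frameborder="0" '
                'height="%d" src="https://www.youtube.com/embed/%s" '
                'width="%d"></iframe>' % (height, match.group(1), width))
    return YOUTUBE_RE.sub(repl, text)

print(embed_youtube('http://www.youtube.com/watch?v=u1mA-0w8XPo&hd=1'))
```

A real Markdown extension registers such a substitution as a preprocessor or inline pattern rather than calling `re.sub` directly, which is how escaped URLs and code spans (as in the "Test Markdown Escaping" doctests) are left untouched.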
evoskuil/czmq | bindings/python_cffi/czmq_cffi/Zfile.py | Python | mpl-2.0 | 6,377 | 0.001568 | ################################################################################
# THIS FILE IS 100% GENERATED BY ZPROJECT; DO NOT EDIT EXCEPT EXPERIMENTALLY #
# Read the zproject/README.md for information about making permanent changes. #
################################################################################
from . import utils
from . import destructors
libczmq_destructors = destructors.lib
class Zfile(object):
"""
helper functions for working with files.
"""
def __init__(self, path, name):
"""
If file exists, populates properties. CZMQ supports portable symbolic
links, which are files with the extension ".ln". A symbolic link is a
text file containing one line, the filename of a target file. Reading
data from the symbolic link actually reads from the target file. Path
        may be NULL, in which case it is not used.
"""
p = utils.lib.zfile_new(utils.to_bytes(path), utils.to_bytes(name))
if p == utils.ffi.NULL:
            raise MemoryError("Could not allocate zfile")
        # ffi.gc returns a copy of the cdata object which will have the
# destructor called when the Python object is GC'd:
# https://cffi.readthedocs.org/en/latest/using.html#ffi-interface
self._p = utils.ffi.gc(p, libczmq_destructors.zfile_destroy_py)
def dup(self):
"""
Duplicate a file item, returns a newly constructed item. If the file
is null, or memory was exhausted, returns null.
"""
return utils.lib.zfile_dup(self._p)
def filename(self, path):
"""
Return file name, remove path if provided
"""
return utils.lib.zfile_filename(self._p, utils.to_bytes(path))
def restat(self):
"""
Refresh file properties from disk; this is not done automatically
on access methods, otherwise it is not possible to compare directory
snapshots.
"""
utils.lib.zfile_restat(self._p)
def modified(self):
"""
Return when the file was last modified. If you want this to reflect the
current situation, call zfile_restat before checking this property.
"""
return utils.lib.zfile_modified(self._p)
def cursize(self):
"""
Return the last-known size of the file. If you want this to reflect the
current situation, call zfile_restat before checking this property.
"""
return utils.lib.zfile_cursize(self._p)
def is_directory(self):
"""
Return true if the file is a directory. If you want this to reflect
any external changes, call zfile_restat before checking this property.
"""
return utils.lib.zfile_is_directory(self._p)
def is_regular(self):
"""
Return true if the file is a regular file. If you want this to reflect
any external changes, call zfile_restat before checking this property.
"""
return utils.lib.zfile_is_regular(self._p)
def is_readable(self):
"""
Return true if the file is readable by this process. If you want this to
reflect any external changes, call zfile_restat before checking this
property.
"""
return utils.lib.zfile_is_readable(self._p)
def is_writeable(self):
"""
Return true if the file is writeable by this process. If you want this
to reflect any external changes, call zfile_restat before checking this
property.
"""
return utils.lib.zfile_is_writeable(self._p)
def is_stable(self):
"""
Check if file has stopped changing and can be safely processed.
Updates the file statistics from disk at every call.
"""
return utils.lib.zfile_is_stable(self._p)
def has_changed(self):
"""
Return true if the file was changed on disk since the zfile_t object
was created, or the last zfile_restat() call made on it.
"""
return utils.lib.zfile_has_changed(self._p)
def remove(self):
"""
Remove the file from disk
"""
utils.lib.zfile_remove(self._p)
def input(self):
"""
Open file for reading
Returns 0 if OK, -1 if not found or not accessible
"""
return utils.lib.zfile_input(self._p)
def output(self):
"""
Open file for writing, creating directory if needed
File is created if necessary; chunks can be written to file at any
location. Returns 0 if OK, -1 if error.
"""
return utils.lib.zfile_output(self._p)
def read(self, bytes, offset):
"""
Read chunk from file at specified position. If this was the last chunk,
sets the eof property. Returns a null chunk in case of error.
"""
return utils.lib.zfile_read(self._p, bytes, offset)
def eof(self):
"""
Returns true if zfile_read() just read the last chunk in the file.
"""
return utils.lib.zfile_eof(self._p)
def write(self, chunk, offset):
"""
Write chunk to file at specified position
Return 0 if OK, else -1
"""
return utils.lib.zfile_write(self._p, chunk._p, offset)
def readln(self):
"""
Read next line of text from file. Returns a pointer to the text line,
or NULL if there was nothing more to read from the file.
"""
return utils.lib.zfile_readln(self._p)
def close(self):
"""
Close file, if open
"""
utils.lib.zfile_close(self._p)
def handle(self):
"""
Return file handle, if opened
"""
return utils.lib.zfile_handle(self._p)
def digest(self):
"""
Calculate SHA1 digest for file, using zdigest class.
"""
return utils.lib.zfile_digest(self._p)
def test(verbose):
"""
Self test of this class.
"""
utils.lib.zfile_test(verbose)
################################################################################
# THIS FILE IS 100% GENERATED BY ZPROJECT; DO NOT EDIT EXCEPT EXPERIMENTALLY #
# Read the zproject/README.md for information about making permanent changes. #
################################################################################
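The class docstring above describes CZMQ's portable symbolic links: a ".ln" text file whose single line names the target file, with reads transparently redirected to that target. The redirection can be modelled in a few lines of plain Python — `read_portable` is an invented name for this sketch, not part of the czmq bindings:

```python
import os
import tempfile

def read_portable(path):
    # A ".ln" file holds one line naming the real target; reading the
    # link reads the target file instead, as the zfile docstring describes.
    if path.endswith('.ln'):
        with open(path) as ln:
            target = ln.readline().strip()
        path = os.path.join(os.path.dirname(path), target)
    with open(path) as f:
        return f.read()

tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, 'data.txt'), 'w') as f:
    f.write('payload')
with open(os.path.join(tmp, 'data.ln'), 'w') as f:
    f.write('data.txt\n')
print(read_portable(os.path.join(tmp, 'data.ln')))  # payload
```

The point of the convention is portability: it works on filesystems with no native symlink support, at the cost of every reader having to honour the ".ln" lookup.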
catapult-project/catapult | third_party/gsutil/third_party/apitools/samples/storage_sample/uploads_test.py | Python | bsd-3-clause | 6,523 | 0.000153 | #
# Copyright 2015 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Integration tests for uploading and downloading to GCS.
These tests exercise most of the corner cases for upload/download of
files in apitools, via GCS. There are no performance tests here yet.
"""
import json
import os
import random
import string
import unittest
import six
from apitools.base.py import transfer
import storage
_CLIENT = None
def _GetClient():
global _CLIENT # pylint: disable=global-statement
if _CLIENT is None:
_CLIENT = storage.StorageV1()
return _CLIENT
class UploadsTest(unittest.TestCase):
_DEFAULT_BUCKET = 'apitools'
_TESTDATA_PREFIX = 'uploads'
def setUp(self):
self.__client = _GetClient()
self.__files = []
self.__content = ''
self.__buffer = None
self.__upload = None
def tearDown(self):
self.__DeleteFiles()
def __ResetUpload(self, size, auto_transfer=True):
self.__content = ''.join(
random.choice(string.ascii_letters) for _ in range(size))
self.__buffer = six.StringIO(self.__content)
self.__upload = storage.Upload.FromStream(
self.__buffer, 'text/plain', auto_transfer=auto_transfer)
def __DeleteFiles(self):
for filename in self.__files:
self.__DeleteFile(filename)
def __DeleteFile(self, filename):
object_name = os.path.join(self._TESTDATA_PREFIX, filename)
req = storage.StorageObjectsDeleteRequest(
bucket=self._DEFAULT_BUCKET, object=object_name)
self.__client.objects.Delete(req)
def __InsertRequest(self, filename):
object_name = os.path.join(self._TESTDATA_PREFIX, filename)
return storage.StorageObjectsInsertRequest(
name=object_name, bucket=self._DEFAULT_BUCKET)
def __GetRequest(self, filename):
object_name = os.path.join(self._TESTDATA_PREFIX, filename)
return storage.StorageObjectsGetRequest(
object=object_name, bucket=self._DEFAULT_BUCKET)
def __InsertFile(self, filename, request=None):
if request is None:
request = self.__InsertRequest(filename)
response = self.__client.objects.Insert(request, upload=self.__upload)
self.assertIsNotNone(response)
self.__files.append(filename)
return response
def testZeroBytes(self):
filename = 'zero_byte_file'
self.__ResetUpload(0)
response = self.__InsertFile(filename)
self.assertEqual(0, response.size)
def testSimpleUpload(self):
filename = 'fifteen_byte_file'
self.__ResetUpload(15)
response = self.__InsertFile(filename)
self.assertEqual(15, response.size)
def testMultipartUpload(self):
filename = 'fifteen_byte_file'
self.__ResetUpload(15)
        request = self.__InsertRequest(filename)
request.object = storage.Object(contentLanguage='en')
response = self.__InsertFile(filename, request=request)
self.assertEqual(15, response.size)
self.assertEqual('en', response.contentLanguage)
def testAutoUpload(self):
filename = 'ten_meg_file'
size = 10 << 20
self.__ResetUpload(size)
request = self.__InsertRequest(filename)
response = self.__InsertFile(filename, request=request)
self.assertEqual(size, response.size)
def testStreamMedia(self):
filename = 'ten_meg_file'
size = 10 << 20
self.__ResetUpload(size, auto_transfer=False)
self.__upload.strategy = 'resumable'
self.__upload.total_size = size
request = self.__InsertRequest(filename)
initial_response = self.__client.objects.Insert(
request, upload=self.__upload)
self.assertIsNotNone(initial_response)
self.assertEqual(0, self.__buffer.tell())
self.__upload.StreamMedia()
self.assertEqual(size, self.__buffer.tell())
def testBreakAndResumeUpload(self):
filename = ('ten_meg_file_' +
''.join(random.sample(string.ascii_letters, 5)))
size = 10 << 20
self.__ResetUpload(size, auto_transfer=False)
self.__upload.strategy = 'resumable'
self.__upload.total_size = size
# Start the upload
request = self.__InsertRequest(filename)
initial_response = self.__client.objects.Insert(
request, upload=self.__upload)
self.assertIsNotNone(initial_response)
self.assertEqual(0, self.__buffer.tell())
# Pretend the process died, and resume with a new attempt at the
# same upload.
upload_data = json.dumps(self.__upload.serialization_data)
second_upload_attempt = transfer.Upload.FromData(
self.__buffer, upload_data, self.__upload.http)
second_upload_attempt._Upload__SendChunk(0)
self.assertEqual(second_upload_attempt.chunksize, self.__buffer.tell())
# Simulate a third try, and stream from there.
final_upload_attempt = transfer.Upload.FromData(
self.__buffer, upload_data, self.__upload.http)
final_upload_attempt.StreamInChunks()
self.assertEqual(size, self.__buffer.tell())
# Verify the upload
object_info = self.__client.objects.Get(self.__GetRequest(filename))
self.assertEqual(size, object_info.size)
# Confirm that a new attempt successfully does nothing.
completed_upload_attempt = transfer.Upload.FromData(
self.__buffer, upload_data, self.__upload.http)
self.assertTrue(completed_upload_attempt.complete)
completed_upload_attempt.StreamInChunks()
# Verify the upload didn't pick up extra bytes.
object_info = self.__client.objects.Get(self.__GetRequest(filename))
self.assertEqual(size, object_info.size)
# TODO(craigcitro): Add tests for callbacks (especially around
# finish callback).
if __name__ == '__main__':
unittest.main()
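testBreakAndResumeUpload above depends on the resumable-upload state (`serialization_data`) surviving a process death, so that a fresh attempt continues where the last one stopped. That bookkeeping can be modelled without GCS at all — `ToyResumableUpload` is a deliberately simplified stand-in for apitools' `transfer.Upload`, not its real interface:

```python
import io

class ToyResumableUpload:
    """Track how many bytes the 'server' has acknowledged."""
    def __init__(self, stream, total_size, chunksize=4):
        self.stream = stream
        self.total_size = total_size
        self.chunksize = chunksize
        self.acked = 0          # bytes the server has confirmed so far

    def send_chunk(self, server):
        # Always seek to the acknowledged offset first, so a resumed
        # attempt re-reads nothing and skips nothing.
        self.stream.seek(self.acked)
        chunk = self.stream.read(self.chunksize)
        server.extend(chunk)
        self.acked += len(chunk)

    def stream_remaining(self, server):
        while self.acked < self.total_size:
            self.send_chunk(server)

data = b'0123456789'
server = bytearray()
first = ToyResumableUpload(io.BytesIO(data), len(data))
first.send_chunk(server)              # process "dies" after one chunk
resumed = ToyResumableUpload(io.BytesIO(data), len(data))
resumed.acked = first.acked           # state recovered from serialization
resumed.stream_remaining(server)
print(bytes(server))                  # b'0123456789'
```

The real protocol learns `acked` by asking the server for the committed byte range rather than trusting local state, which is why a completed upload attempt in the test above "successfully does nothing".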
kaffeel/oppia | extensions/rich_text_components/Image/Image.py | Python | apache-2.0 | 2,465 | 0 | # coding: utf-8
#
# Copyright 2014 The Oppia Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS-IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from extensions.rich_text_components import base
class Image(base.BaseRichTextComponent):
"""A rich-text component representing an inline image."""
name = 'Image'
category = 'Basic Input'
description = 'An image.'
frontend_name = 'image'
tooltip = 'Insert image'
_customization_arg_specs = [{
'name': 'filepath',
'description': (
'The name of the image file. (Allowed extensions: gif, jpeg, jpg, '
'png.)'),
'schema': {
'type': 'custom',
'obj_type': 'Filepath',
},
'default_value': '',
}, {
'name': 'alt',
'description': 'Alt text (for screen readers)',
'schema': {
'type': 'unicode',
},
'default_value': '',
}]
    icon_data_url = (
'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAA'
'ABGdBTUEAAK/INwWK6QAAABl0RVh0%0AU29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZ'
        'TwAAAHwSUkDjLpZM9a1RBFIafM/fevfcmC7uQ%0AjWEjUZKAYBHEVEb/gIWFjVVSWEj'
'6gI0/wt8gprPQykIsTP5BQLAIhBVBzRf52Gw22bk7c8YiZslu%0AgggZppuZ55z3nfdIC'
'IHrrBhg%2BePaa1WZPyk0s%2B6KWwM1khiyhDcvns4uxQAaZOHJo4nRLMtEJPpn%0AxY6'
'Cd10%2BfNl4DpwBTqymaZrJ8uoBHfZoyTqTYzvkSRMXlP2jnG8bFYbCXWJGePlsEq8iPQ'
'mFA2Mi%0AjEBhtpis7ZCWftC0LZx3xGnK1ESd741hqqUaqgMeAChgjGDDLqXkgMPTJtZ3'
'KJzDhTZpmtK2OSO5%0AIRB6xvQDRAhOsb5Lx1lOu5ZCHV4B6RLUExvh4s%2BZntHhDJAx'
'Sqs9TCDBqsc6j0iJdqtMuTROFBkI%0AcllCCGcSytFNfm1tU8k2GRo2pOI43h9ie6tOvT'
'JFbORyDsJFQHKD8fw%2BP9dWqJZ/I96TdEa5Nb1A%0AOavjVfti0dfB%2Bt4iXhWvyh27'
'y9zEbRRobG7z6fgVeqSoKvB5oIMQEODx7FLvIJo55KS9R7b5ldrD%0AReajpC%2BZ5z7G'
'AHJFXn1exedVbG36ijwOmJgl0kS7lXtjD0DkLyqc70uPnSuIIwk9QCmWd%2B9XGnOF%0A'
'DzP/M5xxBInhLYBcd5z/AAZv2pOvFcS/AAAAAElFTkSuQmCC%0A'
)
dieseldev/diesel | examples/forker.py | Python | bsd-3-clause | 255 | 0.007843 | from diesel import Loop, fork, Application, sleep
def sleep_and_print(num):
sleep(1)
print num
sleep(1)
a.halt()
def forker():
for x in xrange(5):
fork(sleep_and_print, x)
a = Application()
a.add_loop(Loop(forker))
a.run()
jonfoster/pyxb1 | pyxb/bundles/opengis/iso19139/20070417/gmx.py | Python | apache-2.0 | 61 | 0 | from pyxb.bundles.opengis.iso19139.20070417.raw.gmx import *
albiremo/aerotools-hm | geometry.py | Python | gpl-3.0 | 5,740 | 0.036411 |
#import
#-----------------------------------------------------
import numpy as np
from scipy.linalg import solve,solve_banded
import matplotlib as mp
mp.use("Qt4Agg")
import scipy as sp
#import matplotlib.pyplot as plt
from numpy import pi
#-----------------------------------------------------
#-----------------------------------------------------
#dictionary definition for NACA 5 digit
def zero():
q=0.0580
k=361.4
return q,k
def one():
q=0.1260
k=361.4
return q,k
def two():
q=0.2025
k=51.64
return q,k
def three():
q=0.2900
k=15.957
return(q,k)
def four():
q=0.3910
k=3.230
return q,k
#----------------------------------------------------
#-------------------------------------------------
def NACA4camberline(xc,mc,pos_mc):
    m=mc/100
p=pos_mc/10
yc = ((m/(p**2))*(2*p-xc)*xc)*(xc<p)+((m/(1-p)**2)*(1- 2*p + (2*p-xc)*xc))*(xc>=p)
Dyc = ((m/p**2)*2*(p-xc))*(xc<p)+((m/(1-p)**2) * 2*(p-xc))*(xc>=p)
return yc,Dyc
def NACA5camberline(xc,tipologia):
options={210: zero,
220: one,
230: two,
240: three,
250: four,}
try:
q,k=options[tipologia]()
except KeyError:
print('code not supported')
yc = np.zeros([len(xc)])
Dyc = np.zeros([len(xc)])
yc = ((k/6)*(xc**3-3*q*xc**2+q**2*(3-q)*xc))*(xc<q) +((k/6)*q**3*(1-xc))*(xc>=q)
Dyc = ((k/6)*(3*xc**2 -6*q*xc+ q**2*(3-q)*np.ones(len(xc))))*(xc<q)-((k/6)*np.ones(len(xc))*q**3)*(xc>=q)
return yc,Dyc
def NACAthick(xc,SS,xtec):
s=0.01*SS
#classical NACA thickness
tk = 5*s*(0.29690*np.sqrt(xc) -0.12600*xc -0.35160*(xc**2) + 0.28430*(xc**3) -0.10150*xc**4)
# is possible to evaluate correction of this coefficients, due to the fact
# that we need 0 thickness exit
if xtec<1:
tkte=tk[-1]
A1=np.zeros([4,4],np.float64)
A1 = 5*s*np.matrix([[1.0, 1.0, 1.0, 1.0],[xtec, xtec**2, xtec**3, xtec**4],[1, 2*xtec, 3*xtec**2, 4*xtec**3],[0, 2, 6*xtec, 12*xtec**2]])
    # note the round brackets
rhs=np.zeros([4],np.float64)
rhs=np.array([-tkte,0,0,0])
b=solve(A1,rhs)
tk = 5*s*(0.29690*np.sqrt(xc) -0.12600*xc -0.35160*xc**2 +\
0.28430*xc**3 -0.10150*xc**4)*(xc<xtec) +\
5*s*(0.29690*np.sqrt(xc)+(-0.12600+b[0])*xc +(-0.35160+b[1])*xc**2 \
+ (0.28430+b[2])*xc**3 +(-0.10150+b[3])*xc**4)*(xc>=xtec)
tk[-1]=0
return tk
def NACA(code,nc):
nc=nc+1
    #one extra chord point is needed because the endpoint
    #of np.arange is not included
# xc = 0.5*(1-np.cos(np.arange(0,pi,pi/(nc-1))))
xc = np.linspace(0,1,num = nc-1,endpoint = True)
nv = 2*nc-1 #number of panel vertices, must be double of number of nodes but
#minus 1, because we have 1 more node on chord than number of vertices
xv = np.zeros([nv],dtype=np.float64)
yv = np.zeros([nv],dtype=np.float64)
    nn = len(code)
if nn>5 or nn<4:
print('error enter a NACA 4 or 5 digit code')
return
else:
if nn==4:
#4 digit case
A=list(code)
mc=np.int(A[0]) #maximum camber
pos_mc=np.int(A[1]) #position of maximum camber
SS=np.int(A[2])*10+np.int(A[3])
print('max_camber:',mc,'pos_max_camber:',pos_mc,'max_thick:',SS)
xtec=np.float64(1)
#maximum thickness
# camber line construction
yc,Dyc=NACA4camberline(xc,mc,pos_mc)
#thickness construction
tk=NACAthick(xc,SS,xtec)
#print(xv[0:nc-1].shape,xv[nc:nv-1].shape)
theta=np.arctan(Dyc)
#print(tk.shape, theta.shape,xc.shape,nv)
#xv=np.zeros([nv],np.float64)
#yv=np.zeros([nv],np.float64)
xv[0:nc-1]=xc[0:nc-1]-tk*np.sin(theta[0:nc-1])
yv[0:nc-1]=yc[0:nc-1]+tk*np.cos(theta[0:nc-1])
xv[0:nc-1]=xv[nc-1:0:-1]
yv[0:nc-1]=yv[nc-1:0:-1]
xv[nc:nv-1]=xc[1:nc-1]+tk[1:nc-1]*np.sin(theta[1:nc-1])
yv[nc:nv-1]=yc[1:nc-1]-tk[1:nc-1]*np.cos(theta[1:nc-1])
xvnew=np.zeros([nv-3],np.float64)
yvnew=np.zeros([nv-3],np.float64)
xvnew=xv[1:nv-1]
yvnew=yv[1:nv-1]
with open("naca_airfoil.txt","w") as air:
for i in range(0,nv-2):
print(xvnew[i],yvnew[i], file=air, sep=' ')
return(xvnew,yvnew)
else:
#5 digit case
A=list(code)
tipologia=np.int(A[0])*100+np.int(A[1])*10+np.int(A[2])
SS=np.int(A[3])*10+np.int(A[4])
#camberline construction
yc,Dyc=NACA5camberline(xc,tipologia)
#thickness construction
xtec=1.0
tk=NACAthick(xc,SS,xtec)
#print(xv[0:nc-1].shape,xv[nc:nv-1].shape)
theta=np.arctan(Dyc)
#print(tk.shape, theta.shape,xc.shape,nv)
#xv=np.zeros([nv],np.float64)
#yv=np.zeros([nv],np.float64)
xv[0:nc-1]=xc[0:nc-1]-tk*np.sin(theta[0:nc-1])
yv[0:nc-1]=yc[0:nc-1]+tk*np.cos(theta[0:nc-1])
xv[0:nc-1]=xv[nc-1:0:-1]
yv[0:nc-1]=yv[nc-1:0:-1]
xv[nc:nv-1]=xc[1:nc-1]+tk[1:nc-1]*np.sin(theta[1:nc-1])
yv[nc:nv-1]=yc[1:nc-1]-tk[1:nc-1]*np.cos(theta[1:nc-1])
xvnew=np.zeros([nv-3],np.float64)
yvnew=np.zeros([nv-3],np.float64)
xvnew=xv[1:nv-1]
yvnew=yv[1:nv-1]
with open("naca_airfoil.txt","w") as air:
for i in range(0,nv-2):
print(xvnew[i],yvnew[i], file=air, sep=' ')
return(xvnew,yvnew)
return(yv)
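The piecewise camber line evaluated in `NACA4camberline` can be sanity-checked with a scalar version of the same standard NACA 4-digit formula: for a NACA 2412 section, m = 0.02 and p = 0.4, and the ordinate must equal m exactly at x = p. This is a standalone check of the published formula, not a call into the module above:

```python
# Scalar form of the NACA 4-digit mean camber line (the module applies the
# same piecewise expressions to numpy arrays).
def camber(x, mc, pos_mc):
    m = mc / 100.0       # maximum camber as a fraction of chord
    p = pos_mc / 10.0    # chordwise position of maximum camber
    if x < p:
        return (m / p**2) * (2*p - x) * x
    return (m / (1 - p)**2) * (1 - 2*p + (2*p - x)*x)

# NACA 2412: 2% camber at 40% chord. The camber is zero at both the
# leading and trailing edge and peaks with value m at x = p.
print(camber(0.0, 2, 4))   # leading edge: 0
print(camber(0.4, 2, 4))   # maximum: m (0.02, up to float rounding)
print(camber(1.0, 2, 4))   # trailing edge: 0 (up to float rounding)
```

Checking a few such points against the formula is a cheap guard when translating piecewise airfoil expressions into vectorized numpy code.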
BtpPrograms/MHacks8 | Python/emg_test.py | Python | gpl-3.0 | 1,120 | 0.004464 | # This file is from us, not the library developer
from __future__ import print_function
from collections import Counter
import struct
import sys
import time
import numpy as np
try:
from sklearn import neighbors, svm
HAVE_SK = True
except ImportError:
HAVE_SK = False
try:
import pygame
from pygame.locals import *
HAVE_PYGAME = True
except ImportError:
HAVE_PYGAME = False
from common import *
import myo
class EMGHandler(object):
def __init__(self, m):
self.recording = -1
self.m = m
self.emg = (0,) * 8
def __call__(self, emg, moving):
self.emg = emg
if self.recording >= 0:
self.m.cls.store_data(self.recording, emg)
if __name__ == '__main__':
m = myo.Myo(myo.NNClassifier(), sys.argv[1] if len(sys.argv) >= 2 else None)
hnd = EMGHandler(m)
m.add_emg_handler(hnd)
m.connect()
try:
        while True:
m.run()
print(hnd.emg)
except KeyboardInterrupt:
pass
finally:
m.disconnect()
print()
if HAVE_PYGAME:
pygame.quit()
|
dstansby/heliopy | heliopy/data/test/test_helios.py | Python | gpl-3.0 | 2,429 | 0 | from datetime import datetime
import pathlib
import shutil
import urllib
import pytest
from .util import check_data_output, website_working
helios = pytest.importorskip('heliopy.data.helios')
pytestmark = [
    pytest.mark.data,
    pytest.mark.skipif(
        not website_working('https://helios-data.ssl.berkeley.edu/data/'),
        reason='Helios data server not reachable')]
probe = '1'
def test_merged():
starttime = datetime(1976, 1, 10, 0, 0, 0)
endtime = datetime(1976, 1, 10, 23, 59, 59)
df = helios.merged(probe, starttime, endtime)
check_data_output(df)
starttime = datetime(2000, 1, 1, 0, 0, 0)
endtime = datetime(2000, 1, 2, 0, 0, 0)
with pytest.raises(RuntimeError):
helios.merged(probe, starttime, endtime)
def test_corefit():
starttime = datetime(1976, 1, 10, 0, 0, 0)
endtime = datetime(1976, 1, 10, 23, 59, 59)
df = helios.corefit(probe, starttime, endtime)
check_data_output(df)
starttime = datetime(2000, 1, 1, 0, 0, 0)
endtime = datetime(2000, 1, 2, 0, 0, 0)
with pytest.raises(RuntimeError):
helios.corefit(probe, starttime, endtime)
def test_6sec_ness():
starttime = datetime(1976, 1, 16)
endtime = datetime(1976, 1, 18)
probe = '2'
df = helios.mag_ness(probe, starttime, endtime)
check_data_output(df)
@pytest.mark.xfail()
def test_distribution_funcs():
local_dir = pathlib.Path(helios.helios_dir)
local_dir = local_dir / 'helios1' / 'dist' / '1974' / '346'
local_dir.mkdir(parents=True, exist_ok=True)
remote_file = ('http://helios-data.ssl.berkeley.edu/data/E1_experiment'
'/E1_original_data/helios_1/1974/346/'
'h1y74d346h03m27s21_hdm.1')
local_fname, _ = urllib.request.urlretrieve(
remote_file, './h1y74d346h03m27s21_hdm.1')
local_fname = pathlib.Path(local_fname)
new_path = local_dir / local_fname.name
shutil.copyfile(local_fname, new_path)
helios.integrated_dists(
'1', datetime(1974, 12, 12), datetime(1974, 12, 13))
helios.distparams(
'1', datetime(1974, 12, 12), datetime(1974, 12, 13))
helios.electron_dists(
'1', datetime(1974, 12, 12), datetime(1974, 12, 13))
    helios.ion_dists(
        '1', datetime(1974, 12, 12), datetime(1974, 12, 13))
def test_mag_4hz():
starttime = datetime(1976, 1, 16)
endtime = datetime(1976, 1, 18)
probe = '2'
df = helios.mag_4hz(probe, starttime, endtime)
check_data_output(df)
|
madcowfred/evething | thing/tasks/reftypes.py | Python | bsd-2-clause | 2,946 | 0.004073 | # ------------------------------------------------------------------------------
# Copyright (c) 2010-2013, EVEthing team
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
# Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
# IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
# INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
# NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
# OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
from .apitask import APITask
from thing.models import RefType
class RefTypes(APITask):
name = 'thing.ref_types'
def run(self, url, taskstate_id, apikey_id, zero):
if self.init(taskstate_id, apikey_id) is False:
return
# Fetch the API data
if self.fetch_api(url, {}, use_auth=False) is False or self.root is None:
return
# Build a refTypeID:row dictionary
bulk_data = {}
for row in self.root.findall('result/rowset/row'):
bulk_data[int(row.attrib['refTypeID'])] = row
        # Bulk retrieve all of those ref types that exist
        rt_map = RefType.objects.in_bulk(bulk_data.keys())
new = []
for refTypeID, row in bulk_data.items():
reftype = rt_map.get(refTypeID)
# RefType does not exist, make a new one
if reftype is None:
new.append(RefType(
id=refTypeID,
name=row.attrib['refTypeName'],
))
# RefType exists and name has changed, update it
            elif reftype.name != row.attrib['refTypeName']:
reftype.name = row.attrib['refTypeName']
reftype.save()
        # Create any new ref types
if new:
RefType.objects.bulk_create(new)
return True
# ---------------------------------------------------------------------------
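The run() method above is a common bulk-upsert shape: fetch every existing row keyed by id with in_bulk(), queue missing ids for a single bulk_create(), and save() only rows whose name changed. The diffing step can be sketched with the ORM stubbed out by plain dicts (the helper name `diff_ref_types` and the sample names are illustrative, not part of EVEthing):

```python
def diff_ref_types(existing, incoming):
    """Split incoming {id: name} data into rows to create and rows to rename.

    existing -- {id: name} mapping for rows already stored (cf. in_bulk())
    incoming -- {id: name} mapping freshly parsed from the API
    """
    to_create = {}
    to_rename = {}
    for ref_id, name in incoming.items():
        if ref_id not in existing:
            to_create[ref_id] = name    # would feed bulk_create()
        elif existing[ref_id] != name:
            to_rename[ref_id] = name    # would become a per-row save()
    return to_create, to_rename

# id 2 is new, id 1 was renamed, id 3 is unchanged.
created, renamed = diff_ref_types(
    {1: 'Player Trading', 3: 'Bounty Prize'},
    {1: 'Player Trading (CONCORD)', 2: 'Market Escrow', 3: 'Bounty Prize'})
```

Rows that exist with an unchanged name fall through both branches, mirroring the no-op case in run().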
HiSPARC/station-software | user/python/Lib/test/test_unicode_file.py | Python | gpl-3.0 | 8,417 | 0.001782 | # Test some Unicode file name semantics
# We don't test many operations on files other than
# that their names can be used with Unicode characters.
import os, glob, time, shutil
import unicodedata
import unittest
from test.test_support import run_unittest, change_cwd, TESTFN_UNICODE
from test.test_support import TESTFN_ENCODING, TESTFN_UNENCODABLE
try:
TESTFN_ENCODED = TESTFN_UNICODE.encode(TESTFN_ENCODING)
except (UnicodeError, TypeError):
# Either the file system encoding is None, or the file name
# cannot be encoded in the file system encoding.
raise unittest.SkipTest("No Unicode filesystem semantics on this platform.")
if TESTFN_ENCODED.decode(TESTFN_ENCODING) != TESTFN_UNICODE:
# The file system encoding does not support Latin-1
# (which test_support assumes), so try the file system
# encoding instead.
import sys
try:
TESTFN_UNICODE = unicode("@test-\xe0\xf2", sys.getfilesystemencoding())
TESTFN_ENCODED = TESTFN_UNICODE.encode(TESTFN_ENCODING)
if '?' in TESTFN_ENCODED:
# MBCS will not report the error properly
raise UnicodeError, "mbcs encoding problem"
except (UnicodeError, TypeError):
        raise unittest.SkipTest("Cannot find a suitable filename.")
if TESTFN_ENCODED.decode(TESTFN_ENCODING) != TESTFN_UNICODE:
raise unittest.SkipTest("Cannot find a suitable filename.")
def remove_if_exists(filename):
if os.path.exists(filename):
os.unlink(filename)
class TestUnicodeFiles(unittest.TestCase):
# The 'do_' functions are the actual tests. They generally assume the
# file already exists etc.
# Do all the tests we can given only a single filename. The file should
# exist.
def _do_single(self, filename):
self.assertTrue(os.path.exists(filename))
self.assertTrue(os.path.isfile(filename))
self.assertTrue(os.access(filename, os.R_OK))
self.assertTrue(os.path.exists(os.path.abspath(filename)))
self.assertTrue(os.path.isfile(os.path.abspath(filename)))
self.assertTrue(os.access(os.path.abspath(filename), os.R_OK))
os.chmod(filename, 0777)
os.utime(filename, None)
os.utime(filename, (time.time(), time.time()))
# Copy/rename etc tests using the same filename
self._do_copyish(filename, filename)
# Filename should appear in glob output
self.assertTrue(
os.path.abspath(filename)==os.path.abspath(glob.glob(filename)[0]))
# basename should appear in listdir.
path, base = os.path.split(os.path.abspath(filename))
if isinstance(base, str):
base = base.decode(TESTFN_ENCODING)
file_list = os.listdir(path)
# listdir() with a unicode arg may or may not return Unicode
# objects, depending on the platform.
if file_list and isinstance(file_list[0], str):
file_list = [f.decode(TESTFN_ENCODING) for f in file_list]
# Normalize the unicode strings, as round-tripping the name via the OS
# may return a different (but equivalent) value.
base = unicodedata.normalize("NFD", base)
file_list = [unicodedata.normalize("NFD", f) for f in file_list]
self.assertIn(base, file_list)
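The NFD normalization above exists because some filesystems (notably HFS+ on macOS) store names decomposed, so a name can round-trip through the OS as an equivalent but differently-composed string. A small illustration of the two forms:

```python
import unicodedata

composed = u'\u00e9'                               # e-acute as one code point (NFC)
decomposed = unicodedata.normalize('NFD', composed)

# NFD splits it into 'e' plus U+0301 COMBINING ACUTE ACCENT.
assert decomposed == u'e\u0301'
assert composed != decomposed                      # unequal as raw strings...
assert (unicodedata.normalize('NFD', composed) ==
        unicodedata.normalize('NFD', decomposed))  # ...equal once normalized
```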
    # Do as many "equivalency" tests as we can - ie, check that although we
# have different types for the filename, they refer to the same file.
def _do_equivalent(self, filename1, filename2):
# Note we only check "filename1 against filename2" - we don't bother
# checking "filename2 against 1", as we assume we are called again with
# the args reversed.
self.assertTrue(type(filename1)!=type(filename2),
"No point checking equivalent filenames of the same type")
# stat and lstat should return the same results.
self.assertEqual(os.stat(filename1),
os.stat(filename2))
        self.assertEqual(os.lstat(filename1),
                         os.lstat(filename2))
# Copy/rename etc tests using equivalent filename
self._do_copyish(filename1, filename2)
# Tests that copy, move, etc one file to another.
    def _do_copyish(self, filename1, filename2):
# Should be able to rename the file using either name.
self.assertTrue(os.path.isfile(filename1)) # must exist.
os.rename(filename1, filename2 + ".new")
self.assertTrue(os.path.isfile(filename1+".new"))
os.rename(filename1 + ".new", filename2)
self.assertTrue(os.path.isfile(filename2))
shutil.copy(filename1, filename2 + ".new")
os.unlink(filename1 + ".new") # remove using equiv name.
# And a couple of moves, one using each name.
shutil.move(filename1, filename2 + ".new")
self.assertTrue(not os.path.exists(filename2))
shutil.move(filename1 + ".new", filename2)
self.assertTrue(os.path.exists(filename1))
# Note - due to the implementation of shutil.move,
# it tries a rename first. This only fails on Windows when on
# different file systems - and this test can't ensure that.
# So we test the shutil.copy2 function, which is the thing most
# likely to fail.
shutil.copy2(filename1, filename2 + ".new")
os.unlink(filename1 + ".new")
def _do_directory(self, make_name, chdir_name, encoded):
if os.path.isdir(make_name):
os.rmdir(make_name)
os.mkdir(make_name)
try:
with change_cwd(chdir_name):
if not encoded:
cwd_result = os.getcwdu()
name_result = make_name
else:
cwd_result = os.getcwd().decode(TESTFN_ENCODING)
name_result = make_name.decode(TESTFN_ENCODING)
cwd_result = unicodedata.normalize("NFD", cwd_result)
name_result = unicodedata.normalize("NFD", name_result)
self.assertEqual(os.path.basename(cwd_result),name_result)
finally:
os.rmdir(make_name)
# The '_test' functions 'entry points with params' - ie, what the
# top-level 'test' functions would be if they could take params
def _test_single(self, filename):
remove_if_exists(filename)
f = file(filename, "w")
f.close()
try:
self._do_single(filename)
finally:
os.unlink(filename)
self.assertTrue(not os.path.exists(filename))
# and again with os.open.
f = os.open(filename, os.O_CREAT)
os.close(f)
try:
self._do_single(filename)
finally:
os.unlink(filename)
def _test_equivalent(self, filename1, filename2):
remove_if_exists(filename1)
self.assertTrue(not os.path.exists(filename2))
f = file(filename1, "w")
f.close()
try:
self._do_equivalent(filename1, filename2)
finally:
os.unlink(filename1)
# The 'test' functions are unittest entry points, and simply call our
# _test functions with each of the filename combinations we wish to test
def test_single_files(self):
self._test_single(TESTFN_ENCODED)
self._test_single(TESTFN_UNICODE)
if TESTFN_UNENCODABLE is not None:
self._test_single(TESTFN_UNENCODABLE)
def test_equivalent_files(self):
self._test_equivalent(TESTFN_ENCODED, TESTFN_UNICODE)
self._test_equivalent(TESTFN_UNICODE, TESTFN_ENCODED)
def test_directories(self):
# For all 'equivalent' combinations:
# Make dir with encoded, chdir with unicode, checkdir with encoded
# (or unicode/encoded/unicode, etc
ext = ".dir"
self._do_directory(TESTFN_ENCODED+ext, TESTFN_ENCODED+ext, True)
self._do_directory(TESTFN_ENCODED+ext, TESTFN_UNICODE+ext, True)
self._do_directory(TESTFN_UNICODE+ext, TESTFN_ENCODED+ext, False)
self._do_directory(TESTFN_UNICODE+ext, TESTFN_UNICODE+ext, False)
# Our directory name that can't use a non-unicode name.
if TESTFN_UNENCODABLE is not None:
t-umeno/dpkt_merge_pcap | dpkt_merge_pcap.py | Python | bsd-3-clause | 1,539 | 0.014945 | #!/usr/bin/python
import getopt, sys
import dpkt,socket
def usage():
print "dpkt_merge_pcap [-l input_pcap_file_list] [-o output_pcap_file]"
def main():
input_pcap_file_list="/dev/stdin"
output_pcap_file="/dev/stdout"
try:
opts, args = getopt.getopt(sys.argv[1:], "hl:o:", ["help", "input_pcap_file_list=", "output_pcap_file="])
except getopt.GetoptError as err:
        # print the error and help information, then exit:
        print str(err)
        usage()
        sys.exit(2)
for o, a in opts:
if o in ("-h", "--help"):
usage()
sys.exit(0)
elif o in ("-l", "--input_pcap_file_list"):
input_pcap_file_list = a
        elif o in ("-o", "--output_pcap_file"):
output_pcap_file = a
else:
assert False, "unhandled option"
    output_pcap = open(output_pcap_file, 'wb')
pcw = dpkt.pcap.Writer(output_pcap)
f = open(input_pcap_file_list)
line = f.readline()
line = line.rstrip()
if len(line) > 0:
input_pcap = open(line,'rb')
pcr = dpkt.pcap.Reader(input_pcap)
for ts,buf in pcr:
pcw.writepkt(buf,ts)
        input_pcap.close()
while line:
line = f.readline()
line = line.rstrip()
if len(line) == 0:
continue
input_pcap = open(line,'rb')
pcr = dpkt.pcap.Reader(input_pcap)
for ts,buf in pcr:
pcw.writepkt(buf,ts)
        input_pcap.close()
    f.close()
    output_pcap.close()
if __name__ == '__main__':
main()
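main() above repeats the read-one-pcap block once before the while loop and again inside it; a line generator would collapse that into a single pass. A sketch of the refactor (the `iter_pcap_names` helper is illustrative, not part of this script):

```python
def iter_pcap_names(lines):
    # Yield each non-blank entry from an iterable of lines; a file object
    # qualifies, since iterating one yields its lines.
    for line in lines:
        line = line.rstrip()
        if line:
            yield line

# The merge then needs no duplicated first read:
#
#   with open(input_pcap_file_list) as f:
#       for name in iter_pcap_names(f):
#           with open(name, 'rb') as input_pcap:
#               for ts, buf in dpkt.pcap.Reader(input_pcap):
#                   pcw.writepkt(buf, ts)
```

The with blocks also guarantee each reader is closed once its packets are copied.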
Mozpacers/Moz-Connect | moz_connect/wsgi.py | Python | mit | 399 | 0 | """
WSGI config for moz_connect project.
It exposes the WSGI callable as a module-level variable named ``application``.
For more information on this file, see
https://docs.djangoproject.com/en/1.9/howto/deployment/wsgi/
"""
import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "moz_connect.settings")
application = get_wsgi_application()
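get_wsgi_application() returns a callable implementing the WSGI protocol (PEP 3333): it takes an environ dict plus a start_response callback and returns an iterable of byte strings. A minimal stand-in showing the contract a server expects from ``application`` (`demo_app` and `call_app` are illustrative, not Django's objects):

```python
def demo_app(environ, start_response):
    # The simplest conforming app: always answer 200 with a fixed body.
    body = b'Hello from WSGI\n'
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]

def call_app(app, path='/'):
    # Drive a WSGI app the way a server would, capturing the response.
    captured = {}
    def start_response(status, headers):
        captured['status'] = status
        captured['headers'] = headers
    chunks = app({'REQUEST_METHOD': 'GET', 'PATH_INFO': path}, start_response)
    return captured['status'], b''.join(chunks)
```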
gautamMalu/rootfs_xen_arndale | usr/lib/python3.4/pathlib.py | Python | gpl-2.0 | 41,820 | 0.000598 | import fnmatch
import functools
import io
import ntpath
import os
import posixpath
import re
import sys
from collections import Sequence
from contextlib import contextmanager
from errno import EINVAL, ENOENT
from operator import attrgetter
from stat import S_ISDIR, S_ISLNK, S_ISREG, S_ISSOCK, S_ISBLK, S_ISCHR, S_ISFIFO
from urllib.parse import quote_from_bytes as urlquote_from_bytes
supports_symlinks = True
try:
import nt
except ImportError:
nt = None
else:
if sys.getwindowsversion()[:2] >= (6, 0):
from nt import _getfinalpathname
else:
supports_symlinks = False
_getfinalpathname = None
__all__ = [
"PurePath", "PurePosixPath", "PureWindowsPath",
"Path", "PosixPath", "WindowsPath",
]
#
# Internals
#
def _is_wildcard_pattern(pat):
# Whether this pattern needs actual matching using fnmatch, or can
# be looked up directly as a file.
return "*" in pat or "?" in pat or "[" in pat
class _Flavour(object):
"""A flavour implements a particular (platform-specific) set of path
semantics."""
def __init__(self):
self.join = self.sep.join
def parse_parts(self, parts):
parsed = []
sep = self.sep
altsep = self.altsep
drv = root = ''
it = reversed(parts)
for part in it:
if not part:
continue
if altsep:
part = part.replace(altsep, sep)
drv, root, rel = self.splitroot(part)
if sep in rel:
for x in reversed(rel.split(sep)):
if x and x != '.':
parsed.append(sys.intern(x))
else:
if rel and rel != '.':
parsed.append(sys.intern(rel))
if drv or root:
if not drv:
# If no drive is present, try to find one in the previous
# parts. This makes the result of parsing e.g.
# ("C:", "/", "a") reasonably intuitive.
for part in it:
drv = self.splitroot(part)[0]
if drv:
break
break
if drv or root:
parsed.append(drv + root)
parsed.reverse()
return drv, root, parsed
def join_parsed_parts(self, drv, root, parts, drv2, root2, parts2):
"""
Join the two paths represented by the respective
(drive, root, parts) tuples. Return a new (drive, root, parts) tuple.
"""
if root2:
if not drv2 and drv:
return drv, root2, [drv + root2] + parts2[1:]
elif drv2:
if drv2 == drv or self.casefold(drv2) == self.casefold(drv):
# Same drive => second path is relative to the first
return drv, root, parts + parts2[1:]
else:
# Second path is non-anchored (common case)
return drv, root, parts + parts2
return drv2, root2, parts2
class _WindowsFlavour(_Flavour):
# Reference for Windows paths can be found at
# http://msdn.microsoft.com/en-us/library/aa365247%28v=vs.85%29.aspx
sep = '\\'
altsep = '/'
has_drv = True
pathmod = ntpath
is_supported = (nt is not None)
drive_letters = (
set(chr(x) for x in range(ord('a'), ord('z') + 1)) |
set(chr(x) for x in range(ord('A'), ord('Z') + 1))
)
ext_namespace_prefix = '\\\\?\\'
reserved_names = (
{'CON', 'PRN', 'AUX', 'NUL'} |
{'COM%d' % i for i in range(1, 10)} |
{'LPT%d' % i for i in range(1, 10)}
)
# Interesting findings about extended paths:
# - '\\?\c:\a', '//?/c:\a' and '//?/c:/a' are all supported
# but '\\?\c:/a' is not
# - extended paths are always absolute; "relative" extended paths will
# fail.
def splitroot(self, part, sep=sep):
first = part[0:1]
second = part[1:2]
if (second == sep and first == sep):
# XXX extended paths should also disable the collapsing of "."
# components (according to MSDN docs).
prefix, part = self._split_extended_path(part)
first = part[0:1]
second = part[1:2]
else:
prefix = ''
third = part[2:3]
if (second == sep and first == sep and third != sep):
# is a UNC path:
# vvvvvvvvvvvvvvvvvvvvv root
# \\machine\mountpoint\directory\etc\...
# directory ^^^^^^^^^^^^^^
index = part.find(sep, 2)
if index != -1:
index2 = part.find(sep, index + 1)
# a UNC path can't have two slashes in a row
# (after the initial two)
if index2 != index + 1:
if index2 == -1:
index2 = len(part)
if prefix:
return prefix + part[1:index2], sep, part[index2+1:]
else:
return part[:index2], sep, part[index2+1:]
drv = root = ''
if second == ':' and first in self.drive_letters:
drv = part[:2]
part = part[2:]
first = third
if first == sep:
root = first
part = part.lstrip(sep)
return prefix + drv, root, part
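The UNC branch above turns \\machine\mountpoint into the drive and the following separator into the root. That behavior is observable through the public PureWindowsPath API on any platform (a sketch against the installed pathlib, not this flavour object directly):

```python
from pathlib import PureWindowsPath

p = PureWindowsPath('//machine/mountpoint/directory/etc')
# The two-component UNC prefix is the drive; the next separator is the root.
assert p.drive == '\\\\machine\\mountpoint'
assert p.root == '\\'

# An ordinary drive letter splits the same way, with a shorter drive.
q = PureWindowsPath('c:/directory/etc')
assert (q.drive, q.root) == ('c:', '\\')
```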
def casefold(self, s):
return s.lower()
def casefold_parts(self, parts):
return [p.lower() for p in parts]
def resolve(self, path):
s = str(path)
if not s:
return os.getcwd()
if _getfinalpathname is not None:
return self._ext_to_normal(_getfinalpathname(s))
# Means fallback on absolute
return None
def _split_extended_path(self, s, ext_prefix=ext_namespace_prefix):
prefix = ''
if s.startswith(ext_prefix):
prefix = s[:4]
s = s[4:]
if s.startswith('UNC\\'):
prefix += s[:3]
s = '\\' + s[3:]
return prefix, s
def _ext_to_normal(self, s):
# Turn back an extended path into a normal DOS-like path
return self._split_extended_path(s)[1]
def is_reserved(self, parts):
# NOTE: the rules for reserved names seem somewhat complicated
# (e.g. r"..\NUL" is reserved but not r"foo\NUL").
# We err on the side of caution and return True for paths which are
# not considered reserved by Windows.
if not parts:
return False
if parts[0].startswith('\\\\'):
# UNC paths are never reserved
return False
return parts[-1].partition('.')[0].upper() in self.reserved_names
def make_uri(self, path):
# Under Windows, file URIs use the UTF-8 encoding.
drive = path.drive
if len(drive) == 2 and drive[1] == ':':
# It's a path on a local drive => 'file:///c:/a/b'
rest = path.as_posix()[2:].lstrip('/')
return 'file:///%s/%s' % (
drive, urlquote_from_bytes(rest.encode('utf-8')))
else:
# It's a path on a network drive => 'file://host/share/a/b'
return 'file:' + urlquote_from_bytes(path.as_posix().encode('utf-8'))
class _PosixFlavour(_Flavour):
sep = '/'
altsep = ''
has_drv = False
pathmod = posixpath
is_supported = (os.name != 'nt')
def splitroot(self, part, sep=sep):
if part and part[0] == sep:
stripped_part = part.lstrip(sep)
# According to POSIX path resolution:
# http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap04.html#tag_04_11
            # "A pathname that begins with two successive slashes may be
# interpreted in an implementation-defined manner, although more
# than two leading slashes shall be treated as a single slash".
            if len(part) - len(stripped_part) == 2:
return '', sep * 2, stripped_part
else:
return '', sep, stripped_part
else:
return '', '', part
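The POSIX rule quoted in the comment above is visible through the public API: exactly two leading slashes survive as a distinct root, while one, or three and more, collapse to a single slash (a sketch against the installed pathlib):

```python
from pathlib import PurePosixPath

assert str(PurePosixPath('/a/b')) == '/a/b'     # one slash: ordinary root
assert str(PurePosixPath('//a/b')) == '//a/b'   # exactly two: kept distinct
assert str(PurePosixPath('///a/b')) == '/a/b'   # three or more: single root
assert PurePosixPath('//a/b').root == '//'
```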
def case |
jasonmccampbell/numpy-refactor-sprint | numpy/core/tests/test_unicode.py | Python | bsd-3-clause | 11,858 | 0.002952 | import sys
from numpy.testing import *
from numpy.core import *
from numpy.compat import asbytes
# Guess the UCS length for this python interpreter
if sys.version_info[0] >= 3:
import array as _array
ucs4 = (_array.array('u').itemsize == 4)
def buffer_length(arr):
if isinstance(arr, unicode):
return _array.array('u').itemsize * len(arr)
v = memoryview(arr)
if v.shape is None:
return len(v) * v.itemsize
else:
return prod(v.shape) * v.itemsize
else:
if len(buffer(u'u')) == 4:
ucs4 = True
else:
ucs4 = False
def buffer_length(arr):
if isinstance(arr, ndarray):
return len(arr.data)
return len(buffer(arr))
# Value that can be represented in UCS2 interpreters
ucs2_value = u'\uFFFF'
# Value that cannot be represented in UCS2 interpreters (but can in UCS4)
ucs4_value = u'\U0010FFFF'
############################################################
# Creation tests
############################################################
class create_zeros(object):
"""Check the creation of zero-valued arrays"""
def content_check(self, ua, ua_scalar, nbytes):
# Check the length of the unicode base type
self.assert_(int(ua.dtype.str[2:]) == self.ulen)
# Check the length of the data buffer
self.assert_(buffer_length(ua) == nbytes)
# Small check that data in array element is ok
self.assert_(ua_scalar == u'')
# Encode to ascii and double check
self.assert_(ua_scalar.encode('ascii') == asbytes(''))
# Check buffer lengths for scalars
if ucs4:
self.assert_(buffer_length(ua_scalar) == 0)
else:
self.assert_(buffer_length(ua_scalar) == 0)
def test_zeros0D(self):
"""Check creation of 0-dimensional objects"""
ua = zeros((), dtype='U%s' % self.ulen)
self.content_check(ua, ua[()], 4*self.ulen)
def test_zerosSD(self):
"""Check creation of single-dimensional objects"""
ua = zeros((2,), dtype='U%s' % self.ulen)
self.content_check(ua, ua[0], 4*self.ulen*2)
self.content_check(ua, ua[1], 4*self.ulen*2)
def test_zerosMD(self):
"""Check creation of multi-dimensional objects"""
ua = zeros((2,3,4), dtype='U%s' % self.ulen)
self.content_check(ua, ua[0,0,0], 4*self.ulen*2*3*4)
self.content_check(ua, ua[-1,-1,-1], 4*self.ulen*2*3*4)
class test_create_zeros_1(create_zeros, TestCase):
"""Check the creation of zero-valued arrays (size 1)"""
ulen = 1
class test_create_zeros_2(create_zeros, TestCase):
"""Check the creation of zero-valued arrays (size 2)"""
ulen = 2
class test_create_zeros_1009(create_zeros, TestCase):
"""Check the creation of zero-valued arrays (size 1009)"""
ulen = 1009
class create_values(object):
"""Check the creation of unicode arrays with values"""
def content_check(self, ua, ua_scalar, nbytes):
# Check the length of the unicode base type
self.assert_(int(ua.dtype.str[2:]) == self.ulen)
# Check the length of the data buffer
self.assert_(buffer_length(ua) == nbytes)
# Small check that data in array element is ok
self.assert_(ua_scalar == self.ucs_value*self.ulen)
# Encode to UTF-8 and double check
self.assert_(ua_scalar.encode('utf-8') == \
(self.ucs_value*self.ulen).encode('utf-8'))
# Check buffer lengths for scalars
if ucs4:
self.assert_(buffer_length(ua_scalar) == 4*self.ulen)
else:
if self.ucs_value == ucs4_value:
# In UCS2, the \U0010FFFF will be represented using a
# surrogate *pair*
self.assert_(buffer_length(ua_scalar) == 2*2*self.ulen)
else:
# In UCS2, the \uFFFF will be represented using a
# regular 2-byte word
self.assert_(buffer_length(ua_scalar) == 2*self.ulen)
def test_values0D(self):
"""Check creation of 0-dimensional objects with values"""
ua = array(self.ucs_value*self.ulen, dtype='U%s' % self.ulen)
self.content_check(ua, ua[()], 4*self.ulen)
def test_valuesSD(self):
"""Check creation of single-dimensional objects with values"""
ua = array([self.ucs_value*self.ulen]*2, dtype='U%s' % self.ulen)
self.content_check(ua, ua[0], 4*self.ulen*2)
self.content_check(ua, ua[1], 4*self.ulen*2)
def test_valuesMD(self):
"""Check creation of multi-dimensional objects with values"""
ua = array([[[self.ucs_value*self.ulen]*2]*3]*4, dtype='U%s' % self.ulen)
self.content_check(ua, ua[0,0,0], 4*self.ulen*2*3*4)
self.content_check(ua, ua[-1,-1,-1], 4*self.ulen*2*3*4)
class test_create_values_1_ucs2(create_values, TestCase):
"""Check the creation of valued arrays (size 1, UCS2 values)"""
ulen = 1
ucs_value = ucs2_value
class test_create_values_1_ucs4(create_values, TestCase):
"""Check the creation of valued arrays (size 1, UCS4 values)"""
ulen = 1
ucs_value = ucs4_value
class test_create_values_2_ucs2(create_values, TestCase):
"""Check the creation of valued arrays (size 2, UCS2 values)"""
ulen = 2
ucs_value = ucs2_value
class test_create_values_2_ucs4(create_values, TestCase):
"""Check the creation of valued arrays (size 2, UCS4 values)"""
ulen = 2
ucs_value = ucs4_value
class test_create_values_1009_ucs2(create_values, TestCase):
"""Check the creation of valued arrays (size 1009, UCS2 values)"""
ulen = 1009
ucs_value = ucs2_value
class test_create_values_1009_ucs4(create_values, TestCase):
"""Check the creation of valued arrays (size 1009, UCS4 values)"""
ulen = 1009
ucs_value = ucs4_value
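The buffer-length arithmetic in content_check — two bytes per character on UCS2 builds, doubled for the maximal code point — follows from UTF-16, where code points beyond the Basic Multilingual Plane need a surrogate pair. The same sizes fall out of encoding directly:

```python
ucs2_value = u'\uFFFF'       # highest BMP code point
ucs4_value = u'\U0010FFFF'   # highest Unicode code point

# One 2-byte UTF-16 unit for a BMP character, a 4-byte surrogate pair above it.
assert len(ucs2_value.encode('utf-16-le')) == 2
assert len(ucs4_value.encode('utf-16-le')) == 4
```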
############################################################
# Assignment tests
############################################################
class assign_values(object):
"""Check the assignment of unicode arrays with values"""
    def content_check(self, ua, ua_scalar, nbytes):
# Check the length of the unicode base type
self.assert_(int(ua.dtype.str[2:]) == self.ulen)
# Check the length of the data buffer
self.assert_(buffer_length(ua) == nbytes)
# Small check that data in array element is ok
self.assert_(ua_scalar == self.ucs_value*self.ulen)
# Encode to UTF-8 and double check
self.assert_(ua_scalar.encode('utf-8') == \
(self.ucs_value*self.ulen).encode('utf-8'))
# Check buffer lengths for scalars
if ucs4:
self.assert_(buffer_length(ua_scalar) == 4*self.ulen)
else:
if self.ucs_value == ucs4_value:
# In UCS2, the \U0010FFFF will be represented using a
# surrogate *pair*
self.assert_(buffer_length(ua_scalar) == 2*2*self.ulen)
else:
# In UCS2, the \uFFFF will be represented using a
# regular 2-byte word
self.assert_(buffer_length(ua_scalar) == 2*self.ulen)
def test_values0D(self):
"""Check assignment of 0-dimensional objects with values"""
ua = zeros((), dtype='U%s' % self.ulen)
ua[()] = self.ucs_value*self.ulen
self.content_check(ua, ua[()], 4*self.ulen)
def test_valuesSD(self):
"""Check assignment of single-dimensional objects with values"""
ua = zeros((2,), dtype='U%s' % self.ulen)
ua[0] = self.ucs_value*self.ulen
self.content_check(ua, ua[0], 4*self.ulen*2)
ua[1] = self.ucs_value*self.ulen
self.content_check(ua, ua[1], 4*self.ulen*2)
def test_valuesMD(self):
"""Check assignment of multi-dimensional objects with values"""
ua = zeros((2,3,4), dtype='U%s' % self.ulen)
ua[0,0,0] = self.ucs_value*self.ulen
self.content_check(ua, ua[0,0,0], 4*self.ulen*2*3*4)
ua[-1,-1,-1] = self.ucs_value*self.ulen
self.content_check(ua, ua[-1,-1,-1], 4*self.ulen*2*3*4)
class test_assign |
Fat-Zer/FreeCAD_sf_master | src/Mod/Plot/plotSave/TaskPanel.py | Python | lgpl-2.1 | 8,708 | 0.002526 | #***************************************************************************
#* *
#* Copyright (c) 2011, 2012 *
#* Jose Luis Cercos Pita <jlcercos@gmail.com> *
#* *
#* This program is free software; you can redistribute it and/or modify *
#* it under the terms of the GNU Lesser General Public License (LGPL) *
#* as published by the Free Software Foundation; either version 2 of *
#* the License, or (at your option) any later version. *
#* for detail see the LICENCE text file. *
#* *
#* This program is distributed in the hope that it will be useful, *
#* but WITHOUT ANY WARRANTY; without even the implied warranty of *
#* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the *
#* GNU Library General Public License for more details. *
#* *
#* You should have received a copy of the GNU Library General Public *
#* License along with this program; if not, write to the Free Software *
#* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 *
#* USA *
#* *
#***************************************************************************
import os
import six
import FreeCAD as App
import FreeCADGui as Gui
from PySide import QtGui, QtCore
import Plot
from plotUtils import Paths
class TaskPanel:
def __init__(self):
self.ui = Paths.modulePath() + "/plotSave/TaskPanel.ui"
def accept(self):
plt = Plot.getPlot()
if not plt:
msg = QtGui.QApplication.translate(
"plot_console",
"Plot document must be selected in order to save it",
None)
App.Console.PrintError(msg + "\n")
return False
mw = self.getMainWindow()
form = mw.findChild(QtGui.QWidget, "TaskPanel")
form.path = self.widget(QtGui.QLineEdit, "path")
form.sizeX = self.widget(QtGui.QDoubleSpinBox, "sizeX")
form.sizeY = self.widget(QtGui.QDoubleSpinBox, "sizeY")
form.dpi = self.widget(QtGui.QSpinBox, "dpi")
path = six.text_type(form.path.text())
size = (form.sizeX.value(), form.sizeY.value())
dpi = form.dpi.value()
Plot.save(path, size, dpi)
return True
def reject(self):
return True
def clicked(self, index):
pass
def open(self):
pass
def needsFullSpace(self):
return True
def isAllowedAlterSelection(self):
return False
def isAllowedAlterView(self):
return True
def isAllowedAlterDocument(self):
return False
def helpRequested(self):
pass
def setupUi(self):
mw = self.getMainWindow()
form = mw.findChild(QtGui.QWidget, "TaskPanel")
form.path = self.widget(QtGui.QLineEdit, "path")
form.pathButton = self.widget(QtGui.QPushButton, "pathButton")
form.sizeX = self.widget(QtGui.QDoubleSpinBox, "sizeX")
form.sizeY = self.widget(QtGui.QDoubleSpinBox, "sizeY")
form.dpi = self.widget(QtGui.QSpinBox, "dpi")
self.form = form
self.retranslateUi()
QtCore.QObject.connect(
form.pathButton,
QtCore.SIGNAL("pressed()"),
self.onPathButton)
QtCore.QObject.connect(
Plot.getMdiArea(),
QtCore.SIGNAL("subWindowActivated(QMdiSubWindow*)"),
self.onMdiArea)
home = os.getenv('USERPROFILE') or os.getenv('HOME')
form.path.setText(os.path.join(home, "plot.png"))
self.updateUI()
return False
def getMainWindow(self):
toplevel = QtGui.QApplication.topLevelWidgets()
for i in toplevel:
if i.metaObject().className() == "Gui::MainWindow":
return i
raise RuntimeError("No main window found")
def widget(self, class_id, name):
"""Return the selected widget.
Keyword arguments:
class_id -- Class identifier
name -- Name of the widget
"""
mw = self.getMainWindow()
form = mw.findChild(QtGui.QWidget, "TaskPanel")
return form.findChild(class_id, name)
def retranslateUi(self):
"""Set the user interface locale strings."""
self.form.setWindowTitle(QtGui.QApplication.translate(
"plot_save",
"Save figure",
None))
self.widget(QtGui.QLabel, "sizeLabel").setText(
QtGui.QApplication.translate(
"plot_save",
"Inches",
None))
self.widget(QtGui.QLabel, "dpiLabel").setText(
QtGui.QApplication.translate(
"plot_save",
"Dots per Inch",
None))
self.widget(QtGui.QLineEdit, "path").setToolTip(
QtGui.QApplication.translate(
"plot_save",
"Output image file path",
None))
self.widget(QtGui.QPushButton, "pathButton").setToolTip(
QtGui.QApplication.translate(
"plot_save",
"Show a file selection dialog",
None))
self.widget(QtGui.QDoubleSpinBox, "sizeX").setToolTip(
QtGui.QApplication.translate(
"plot_save",
"X image size",
None))
self.widget(QtGui.QDoubleSpinBox, "sizeY").setToolTip(
QtGui.QApplication.translate(
"plot_save",
"Y image size",
None))
self.widget(QtGui.QSpinBox, "dpi").setToolTip(
QtGui.QApplication.translate(
"plot_save",
"Dots per point, with size will define output image"
" resolution",
None))
def updateUI(self):
""" Setup UI controls values if possible """
mw = self.getMainWindow()
form = mw.findChild(QtGui.QWidget, "TaskPanel")
form.path = self.widget(QtGui.QLineEdit, "path")
form.pathButton = self.widget(QtGui.QPushButton, "pathButton")
form.sizeX = self.widget(QtGui.QDoubleSpinBox, "sizeX")
form.sizeY = self.widget(QtGui.QDoubleSpinBox, "sizeY")
form.dpi = self.widget(QtGui.QSpinBox, "dpi")
plt = Plot.getPlot()
form.path.setEnabled(bool(plt))
form.pathButton.setEnabled(bool(plt))
form.sizeX.setEnabled(bool(plt))
form.sizeY.setEnabled(bool(plt))
form.dpi.setEnabled(bool(plt))
if not plt:
return
fig = plt.fig
size = fig.get_size_inches()
dpi = fig.get_dpi()
form.sizeX.setValue(size[0])
form.sizeY.setValue(size[1])
form.dpi.setValue(dpi)
def onPathButton(self):
"""Executed when the path selection button is pressed."""
mw = self.getMainWindow()
form = mw.findChild(QtGui.QWidget, "TaskPanel")
form.path = self.widget(QtGui.QLineEdit, "path")
path = form.path.text()
file_choices = ("Portable Network Graphics (*.png)|*.png;;"
"Portable Document Format (*.pdf)|*.pdf;;"
"PostScript (*.ps)|*.ps;;"
"Encapsulated PostScript (*.eps)|*.eps")
path = QtGui.QFileDialog.getSaveFileName(None,
'Save figure',
path,
|
niqdev/packtpub-crawler | script/utils.py | Python | mit | 2,506 | 0.00399 | import requests
import ConfigParser
from bs4 import BeautifulSoup
from time import sleep
from clint.textui import progress
import os, sys, itertools
from threading import Thread
from logs import *
def ip_address():
"""
Gets current IP address
"""
response = requests.get('http://www.ip-addr.es')
print '[-] GET {0} | {1}'.format(response.status_code, response.url)
log_info('[+] ip address is: {0}'.format(response.text.strip()))
def config_file(path):
"""
Reads configuration file
"""
if not os.path.exists(path):
raise IOError('file not found!')
log_info('[*] configuration file: {0}'.format(path))
config = ConfigParser.ConfigParser()
config.read(path)
return config
def make_soup(response, debug=False):
"""
Makes soup from response
"""
print '[*] fetching url... {0} | {1}'.format(response.status_code, response.url)
soup = BeautifulSoup(response.text, 'html5lib')
if debug:
print soup.prettify().encode('utf-8')
return soup
def wait(delay, isDev):
if delay > 0:
if isDev:
print '[-] going to sleep {0} seconds'.format(delay)
sleep(delay)
def download_file(r, url, directory, filename, headers):
"""
Downloads file with progress bar
"""
if not os.path.exists(directory):
# creates directories recursively
os.makedirs(directory)
log_info('[+] created new directory: ' + directory)
filename = filename.replace(':', '-')
path = os.path.join(directory, filename)
print '[-] downloading file from url: {0}'.format(url)
response = r.get(url, headers=headers, stream=True)
#log_dict(response.headers)
total_length = 0
    test_length = response.headers.get('content-length')
if test_length is not None:
total_length = int(test_length)
with open(path, 'wb') as f:
for chunk in progress.bar(response.iter_content(chunk_size=1024), expected_size=(total_length/1024) + 1):
if chunk:
f.write(chunk)
f.flush()
log_success('[+] new download: {0}'.format(path))
return path
def thread_loader(function):
"""
    Starts a thread with loading bar
"""
thread = Thread(target=function)
thread.start()
spinner = itertools.cycle(['-', '/', '|', '\\'])
while thread.is_alive():
sys.stdout.write(spinner.next())
sys.stdout.flush()
# erase the last written char
sys.stdout.write('\b')
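thread_loader's animation is just itertools.cycle over four frames, redrawn into the same terminal cell via the backspace write. The frame sequence itself is deterministic (shown with the builtin next() so the sketch also runs on Python 3):

```python
import itertools

spinner = itertools.cycle(['-', '/', '|', '\\'])
frames = [next(spinner) for _ in range(6)]
# cycle() repeats its input forever, so frame 5 wraps back to '-'.
assert frames == ['-', '/', '|', '\\', '-', '/']
```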
nirs/vdsm | tests/storage/securable_test.py | Python | gpl-2.0 | 3,360 | 0 | #
# Copyright 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
#
# Refer to the README and COPYING files for full details of the license
#
from __future__ import absolute_import
from __future__ import division
from vdsm.storage.securable import secured, SecureError, unsecured
from testlib import VdsmTestCase
@secured
class SecureClass(object):
class InnerClass(object):
pass
classVariable = 42
def __init__(self):
self.secured = False
def __is_secure__(self):
return self.secured
@staticmethod
def staticMethod():
pass
@classmethod
def classMethod(cls):
pass
def securedMethod(self):
"securedMethod docstring"
pass
@unsecured
def unsecuredMethod(self):
"unsecuredMethod docstring"
pass
class ClassWithoutIsSecureMethod(object):
pass
class ClassIsSecureClassMethod(object):
@classmethod
def __is_secure__(cls):
        return True
class TestSecurable(VdsmTestCase):
def assertUnsecured(self, secureObject):
self.assertRaises(SecureError, secureObject.securedMethod)
secureObject.unsecuredMethod()
def assertSecured(self, secureObject):
secureObject.securedMethod()
secureObject.unsecuredMethod()
def testIsSecureMethodCheck(self):
self.assertRaises(NotImplementedError, secured,
                          ClassWithoutIsSecureMethod)
self.assertRaises(NotImplementedError, secured,
ClassIsSecureClassMethod)
def testSecurable(self):
secureObject = SecureClass()
self.assertUnsecured(secureObject)
secureObject.secured = True
self.assertSecured(secureObject)
secureObject.secured = False
self.assertUnsecured(secureObject)
def testSecurityOverride(self):
secureObject = SecureClass()
secureObject.securedMethod(__securityOverride=True)
def testDocstringWrapping(self):
secureObject = SecureClass()
self.assertEqual(secureObject.securedMethod.__doc__,
"securedMethod docstring")
self.assertEqual(secureObject.unsecuredMethod.__doc__,
"unsecuredMethod docstring")
def testInnerClass(self):
obj = SecureClass.InnerClass()
self.assertEqual(type(obj), SecureClass.InnerClass)
def testClassVariable(self):
self.assertEqual(SecureClass.classVariable, 42)
def testStaticMethod(self):
SecureClass.staticMethod()
def testClassMethod(self):
SecureClass.classMethod()
secureObject = SecureClass()
secureObject.classMethod()
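The contract these tests exercise can be approximated without vdsm. The decorator below is a hypothetical re-implementation of the per-method check (not vdsm's actual code, which also validates the class and wraps every method), shown only to make the `__is_secure__` / `__securityOverride` behaviour concrete:

```python
class SecureError(RuntimeError):
    pass

def secured_method(method):
    # Hypothetical wrapper: consult __is_secure__ before every call
    # unless the caller passes __securityOverride=True.
    def wrapper(self, *args, **kwargs):
        override = kwargs.pop('__securityOverride', False)
        if not override and not self.__is_secure__():
            raise SecureError("secured object is not in a safe state")
        return method(self, *args, **kwargs)
    wrapper.__doc__ = method.__doc__  # preserve docstrings, as tested above
    return wrapper

class Demo(object):
    def __init__(self):
        self.secured = False
    def __is_secure__(self):
        return self.secured
    @secured_method
    def op(self):
        "op docstring"
        return "ok"

demo = Demo()
try:
    demo.op()
    blocked = False
except SecureError:
    blocked = True
demo.secured = True
result = demo.op()
```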
NorthIsUp/sf-lindy | src/sflindy/manage.py | Python | bsd-3-clause | 277 | 0.00361 | #!/usr/bin/env python
import os
import sys
def main():
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "sflindy.settings")
from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)
if __name__ == "__main__":
main()
mseclab/AHE17 | YouCanHideButYouCannotRun/multithreads.py | Python | mit | 2,670 | 0.004494 | """
Allows tracing of called methods and packages.
"""
import frida
import re
syms = []
def on_message(message, data):
global syms
global index, filename
if message['type'] == 'send':
if "SYM" in message["payload"]:
c = message["payload"].split(":")[1]
print c
syms.append(c)
else:
print("[*] {0}".format(message["payload"]))
else:
print(message)
def overload2params(x):
start = 97
params = []
count_re = re.compile('\((.*)\)')
arguments = count_re.findall(x)
if arguments[0]:
arguments = arguments[0]
arguments = arguments.replace(" ", "")
arguments = arguments.split(",")
for _ in arguments:
params.append(chr(start))
start += 1
return ",".join(params)
else:
return ""
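As a standalone illustration of the mapping above (same logic, copied out so it can run on its own under Python 3 as well), each declared argument in an overload signature becomes a one-letter placeholder name:

```python
import re

def overload2params(x):
    # Mirrors overload2params above: one letter per declared argument,
    # starting at 'a' (ord 97).
    start = 97
    params = []
    arguments = re.findall(r'\((.*)\)', x)
    if arguments[0]:
        for _ in arguments[0].replace(" ", "").split(","):
            params.append(chr(start))
            start += 1
        return ",".join(params)
    return ""

mapped = overload2params("void seek(long pos, boolean flush)")
```

A two-argument overload thus maps to `"a,b"`, and a no-argument overload to the empty string.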
def get_script():
jscode = """
Java.perform(function() {
var flagArray = [];
var randomfile = Java.use('java.io.RandomAccessFile');
var skip = true;
randomfile.seek.implementation = function(pos)
{
if (pos == 0){
skip = false;
}
return randomfile.seek.call(this, pos);
}
randomfile.writeChar.implementation = function(c)
{
if(skip || c == 10)
{
send("PARTIAL:"+flagArray.join(""));
}else{
send("index: "+c);
flagArray.push(String.fromCharCode(c))
send("SYM:"+String.fromCharCode(c));
}
return randomfile.writeChar.call(this, c);
}
});
"""
return jscode
def attach_to_process(proc_name):
done = False
process = None
while not done:
try:
process = frida.get_usb_device().attach(proc_name)
done = True
except Exception:
pass
return process
if __name__ == "__main__":
print "[+] Waiting for app called {0}".format("hackchallenge.ahe17.teamsik.org.romanempire")
    process = attach_to_process("hackchallenge.ahe17.teamsik.org.romanempire")
script = get_script()
try:
script = process.create_script(script)
except frida.InvalidArgumentError as e:
message = e.args[0]
line = re.compile('Script\(line (\d+)\)')
line = int(line.findall(message)[0])
script = script.split("\n")
        print "[-] Error on line {0}:\n{1}: {2}".format(line, line, script[line])
exit(0)
script.on('message', on_message)
print('[*] Attached on process')
print('[*] Press enter to exit...')
script.load()
try:
raw_input()
except KeyboardInterrupt:
pass
    print "FLAG: " + "".join(syms)
karelvysinka/connector-woocommerce | woocommerceerpconnect/unit/binder.py | Python | agpl-3.0 | 6,183 | 0 | # -*- coding: utf-8 -*-
#
#
# Tech-Receptives Solutions Pvt. Ltd.
# Copyright (C) 2009-TODAY Tech-Receptives(<http://www.techreceptives.com>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
#
import openerp
from openerp.addons.connector.connector import Binder
from ..backend import woo
class WooBinder(Binder):
""" Generic Binder for WooCommerce """
@woo
class WooModelBinder(WooBinder):
"""
    Bindings are done directly on the binding model.
Binding models are models called ``woo.{normal_model}``,
like ``woo.res.partner`` or ``woo.product.product``.
They are ``_inherits`` of the normal models and contains
the Woo ID, the ID of the Woo Backend and the additional
fields belonging to the Woo instance.
"""
_model_name = [
'woo.res.partner',
'woo.product.category',
'woo.product.product',
'woo.sale.order',
'woo.sale.order.line',
]
def to_openerp(self, external_id, unwrap=False, browse=False):
""" Give the OpenERP ID for an external ID
:param external_id: external ID for which we want the OpenERP ID
:param unwrap: if True, returns the normal record (the one
inherits'ed), else return the binding record
:param browse: if True, returns a recordset
:return: a recordset of one record, depending on the value of unwrap,
or an empty recordset if no binding is found
:rtype: recordset
"""
bindings = self.model.with_context(active_test=False).search(
[('woo_id', '=', str(external_id)),
('backend_id', '=', self.backend_record.id)]
)
if not bindings:
return self.model.browse() if browse else None
assert len(bindings) == 1, "Several records found: %s" % (bindings,)
if unwrap:
return bindings.openerp_id if browse else bindings.openerp_id.id
else:
return bindings if browse else bindings.id
def to_backend(self, record_id, wrap=False):
""" Give the external ID for an OpenERP ID
:param record_id: OpenERP ID for which we want the external id
or a recordset with one record
:param wrap: if False, record_id is the ID of the binding,
if True, record_id is the ID of the normal record, the
method will search the corresponding binding and returns
the backend id of the binding
:return: backend identifier of the record
"""
        record = self.model.browse()
        if isinstance(record_id, openerp.models.BaseModel):
record_id.ensure_one()
record = record_id
record_id = record_id.id
if wrap:
binding = self.model.with_context(active_test=False).search(
[('openerp_id', '=', record_id),
('backend_id', '=', self.backend_record.id),
]
)
if binding:
binding.ensure_one()
return binding.woo_id
else:
return None
if not record:
record = self.model.browse(record_id)
assert record
return record.woo_id
def bind(self, external_id, binding_id):
""" Create the link between an external ID and an OpenERP ID and
update the last synchronization date.
:param external_id: External ID to bind
:param binding_id: OpenERP ID to bind
:type binding_id: int
"""
# the external ID can be 0 on Woo! Prevent False values
# like False, None, or "", but not 0.
assert (external_id or external_id == 0) and binding_id, (
"external_id or binding_id missing, "
"got: %s, %s" % (external_id, binding_id)
)
# avoid to trigger the export when we modify the `woo_id`
now_fmt = openerp.fields.Datetime.now()
if not isinstance(binding_id, openerp.models.BaseModel):
binding_id = self.model.browse(binding_id)
binding_id.with_context(connector_no_export=True).write(
{'woo_id': str(external_id),
'sync_date': now_fmt,
})
def unwrap_binding(self, binding_id, browse=False):
""" For a binding record, gives the normal record.
Example: when called with a ``woo.product.product`` id,
it will return the corresponding ``product.product`` id.
:param browse: when True, returns a browse_record instance
rather than an ID
"""
if isinstance(binding_id, openerp.models.BaseModel):
binding = binding_id
else:
binding = self.model.browse(binding_id)
openerp_record = binding.openerp_id
if browse:
return openerp_record
return openerp_record.id
def unwrap_model(self):
""" For a binding model, gives the name of the normal model.
Example: when called on a binder for ``woo.product.product``,
it will return ``product.product``.
This binder assumes that the normal model lays in ``openerp_id`` since
this is the field we use in the ``_inherits`` bindings.
"""
try:
column = self.model._fields['openerp_id']
except KeyError:
raise ValueError('Cannot unwrap model %s, because it has '
'no openerp_id field' % self.model._name)
return column.comodel_name
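As a framework-free illustration of the contract documented above (a toy stand-in, not the OpenERP/connector implementation), a binder is essentially a two-way map between external IDs and local record IDs:

```python
class ToyBinder(object):
    """Toy two-way map illustrating bind / to_openerp / to_backend."""
    def __init__(self):
        self._by_external = {}

    def bind(self, external_id, binding_id):
        # External IDs can legitimately be 0, so test for it explicitly,
        # exactly as the assertion in bind() above does.
        assert (external_id or external_id == 0) and binding_id
        self._by_external[str(external_id)] = binding_id

    def to_openerp(self, external_id):
        return self._by_external.get(str(external_id))

    def to_backend(self, record_id):
        for ext, local in self._by_external.items():
            if local == record_id:
                return ext
        return None

binder = ToyBinder()
binder.bind(0, 42)
```

Note that external IDs are stored as strings, mirroring the `str(external_id)` coercion in the real binder.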
gnowgi/gnowsys-studio | objectapp/urls/add.py | Python | agpl-3.0 | 3,741 | 0.000802 | # Copyright (c) 2011, 2012 Free Software Foundation
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# This project incorporates work covered by the following copyright and permission notice:
# Copyright (c) 2009, Julien Fache
# All rights reserved.
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in
# the documentation and/or other materials provided with the
# distribution.
# * Neither the name of the author nor the names of other
# contributors may be used to endorse or promote products derived
# from this software without specific prior written permission.
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
# STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
# OF THE POSSIBILITY OF SUCH DAMAGE.
# Copyright (c) 2011, 2012 Free Software Foundation
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Urls for Objectapp forms"""
from django.conf.urls.defaults import url
from django.conf.urls.defaults import patterns
urlpatterns = patterns('objectapp.views.add',
url(r'^gbobject/$', 'addgbobject',
name='objectapp_add_gbobject'),
url(r'^process/$', 'addprocess',
name='objectapp_add_gbobject'),
url(r'^system/$', 'addsystem',
name='objectapp_add_system'),
)
NekBox/nekpy | nekpy/dask/runner.py | Python | mit | 1,462 | 0.004788 | from dask.callbacks import Callback
from os import getcwd, remove
from os.path import join, exists
from dask.diagnostics import ProgressBar
from dask.multiprocessing import get as get_proc
import toolz
import json
class NekCallback(Callback):
def __init__(self, case):
self.case = case
self.cwd = getcwd()
self.cache = {}
        if exists(join(self.cwd, "HALT")):
remove(join(self.cwd, "HALT"))
def _posttask(self, key, result, dsk, state, id):
self.cache.update(state['cache'])
        with open(join(self.cwd, "{}.cache".format(self.case["prefix"])), "w") as f:
json.dump(self.cache, f)
if exists(join(self.cwd, "HALT")):
for k in state['ready']:
state['cache'][k] = None
for k in state['waiting']:
state['cache'][k] = None
state['ready'] = []
state['waiting'] = []
return
def run_all(values, base, get=get_proc, num_workers = 4):
full_dask = toolz.merge(val.dask for val in values)
full_keys = [val._key for val in values]
cache = {}
if exists("{}.cache".format(base["prefix"])):
with open("{}.cache".format(base["prefix"]), "r") as f:
cache = json.load(f)
full_dask.update(cache)
with ProgressBar(), NekCallback(base) as rprof:
res = get(full_dask, full_keys, cache=cache, num_workers=num_workers, optimize_graph=False)
return res
mdbartos/RIPS | temperature/elec_temp_join.py | Python | mit | 2,831 | 0.011657 | import datetime
import os
import numpy as np
import pandas as pd
from shapely import geometry
import geopandas as gpd
from geopandas import tools
import sys
sys.path.append('/home/kircheis/github/RIPS')
from rect_grid import rect_grid
#### DECLARE FILE PATHS
utility = '/home/kircheis/data/shp/Electric_Retail_Service_Ter.shp'
util = gpd.read_file(utility)
urbarea = '/home/kircheis/data/shp/census/cb_2013_us_ua10_500k/cb_2013_us_ua10_500k.shp'
ua = gpd.read_file(urbarea)
ua = ua.to_crs(util.crs)
urbpop = '/home/kircheis/data/census/ua/ua_list_all.txt'
uapop = pd.read_fwf(urbpop, colspecs=[(0,5), (10,70), (75,84), (89,98), (103,117), (122,131), (136,150), (155,164), (169,178), (183,185)], names=['UACE', 'NAME', 'POP', 'HU', 'AREALAND', 'AREALANDSQMI', 'AREAWATER', 'AREAWATERSQMI', 'POPDEN', 'LSADC'], skiprows=1)
uapop['UACE'] = uapop['UACE'].astype(str).str.pad(5, side='left', fillchar='0')
uapop = uapop.set_index('UACE')
#### FIND WHICH URBAN AREAS ARE IN WHICH UTILITY SERVICE AREAS
j = tools.sjoin(util, ua)
#### ALLOCATE GRID FOR TEMPERATURE FORCINGS
g = rect_grid((-130, 24, -65, 50), 0.125)
coords = g.centroid.apply(lambda x: x.coords[0])
coordstr = coords.apply(lambda x: 'data_%s_%s' % (x[1], x[0]))
g = gpd.GeoDataFrame(geometry=g.geometry, index=g.index)
g.crs = util.crs
g['coordstr'] = coordstr
#### JOIN UTILITY SERVICE AREAS WITH TEMPERATURE FORCINGS
ua_g = tools.sjoin(ua, g)
ua_g['grid_geom'] = ua_g['index_right'].map(g['geometry'])
ua_g['dist'] = ua_g.apply(lambda x: (x['geometry'].centroid).distance(x['grid_geom'].centroid), axis=1)
ua_g_out = ua_g.reset_index().loc[ua_g.reset_index().groupby('index').idxmin('dist')['dist'].values].set_index('index')
#### MAP COORDINATE STRING TO ORIGINAL JOIN
j['grid_cell'] = j['index_right'].map(ua_g_out['coordstr'])
j['POP'] = j['UACE10'].map(uapop['POP'])
eia_to_util = pd.read_csv('/home/kircheis/github/RIPS/crosswalk/util_eia_id.csv', index_col=0)
j['eia_code'] = j['UNIQUE_ID'].map(eia_to_util['company_id'])
#### WRITE TO CSV
#j[['UNIQUE_ID', 'NAME', 'CITY', 'index_right', 'AFFGEOID10', 'UACE10', 'NAME10', 'grid_cell', 'POP', 'eia_code']].to_csv('util_demand_to_met_ua')
#### FOR UTILITIES WITH NO URBAN AREA
non_ua = util[~np.in1d(util['UNIQUE_ID'].values, j['UNIQUE_ID'].unique())]
non_ua_c = non_ua.centroid
dirlist = pd.Series(os.listdir('/home/kircheis/data/source_hist_forcings/'))
distlist = gpd.GeoSeries(dirlist.str.split('_').str[::-1].str[:-1].apply(lambda x: geometry.Point(float(x[0]), float(x[1]))))
non_ua['grid_cell'] = non_ua.centroid.apply(lambda x: dirlist[list(distlist.sindex.nearest(x.coords[0], objects='raw'))[0]])
non_ua['eia_code'] = non_ua['UNIQUE_ID'].map(eia_to_util['company_id'])
#### WRITE TO CSV
#non_ua[['UNIQUE_ID', 'NAME', 'grid_cell', 'eia_code']].to_csv('util_demand_to_met_nonua')
Irishsmurf/A-Dinosaur-Tale | gamelib/ezmenu.py | Python | gpl-3.0 | 1,510 | 0.045695 | import pygame
class EzMenu:
def __init__(self, *options):
self.options = options
self.x = 0
self.y = 0
self.font = pygame.font.Font(None, 32)
self.option = 0
self.width = 1
self.color = [0, 0, 0]
self.hcolor = [255, 0, 0]
self.height = len(self.options)*self.font.get_height()
for o in self.options:
text = o[0]
ren = self.font.render(text, 1, (0, 0, 0))
if ren.get_width() > self.width:
self.width = ren.get_width()
def draw(self, surface):
i=0
for o in self.options:
if i==self.option:
clr = self.hcolor
else:
clr = self.color
text = o[0]
ren = self.font.render(text, 1, clr)
if ren.get_width() > self.width:
self.width = ren.get_width()
            surface.blit(ren, ((self.x+self.width/2) - ren.get_width()/2, self.y + i*(self.font.get_height()+4)))
i+=1
def update(self, events):
for e in events:
if e.type == pygame.KEYDOWN:
if e.key == pygame.K_DOWN:
self.option += 1
if e.key == pygame.K_UP:
self.option -= 1
if e.key == pygame.K_RETURN:
self.options[self.option][1]()
if self.option > len(self.options)-1:
self.option = 0
if self.option < 0:
self.option = len(self.options)-1
def set_pos(self, x, y):
self.x = x
self.y = y
def set_font(self, font):
self.font = font
def set_highlight_color(self, color):
self.hcolor = color
def set_normal_color(self, color):
self.color = color
def center_at(self, x, y):
self.x = x-(self.width/2)
self.y = y-(self.height/2)
micolous/ledsign | cpower1200_rss.py | Python | lgpl-3.0 | 1,615 | 0.005573 | #!/usr/bin/env python
"""
RSS Reader for C-Power 1200
Copyright 2010-2012 Michael Farrell <http://micolous.id.au/>
This library is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU Lesser General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
from cpower1200 import *
import feedparser
from sys import argv
FEED = 'http://news.google.com.au/news?pz=1&cf=all&ned=au&hl=en&output=rss'
d = feedparser.parse(FEED)
s = CPower1200(argv[1])
# Define one window at the top of the screen, and one in the lower part of the screen
s.send_window(dict(x=0, y=0, h=8, w=64), dict(x=0, y=8, h=8, w=64))
header = s.format_text(d.feed.title, RED, 0)
articles = ''
for i, article in enumerate(d.entries[:4]):
print "entry %d: %s" % (i, article.title)
colour = YELLOW if i % 2 == 0 else GREEN
articles += s.format_text(article.title + ' ', colour)
# send to sign
#s.send_text(0, header, effect=EFFECT_NONE)
s.send_clock(0, display_year=False, display_month=False, display_day=False, display_hour=True, display_minute=True, display_second=True, multiline=False, red=255,green=0,blue=0)
s.send_text(1, articles, speed=10)
BSGOxford/BrowseVCF | web/scripts/script07_use_gene_list.py | Python | gpl-3.0 | 8,403 | 0.022373 | #!/usr/bin/env python
import os
import sys
import glob
import argparse
from datetime import datetime
import platform
if platform.system().lower() == 'darwin':
os.environ['PYTHONPATH'] = '%s/osx_libs:$PYTHONPATH' % os.getcwd()
import wormtable as wt
################################################################################
# This script allows the user to filter variants in a vcf file based on one or
# more genes of interest. Genes can be provided as a comma-separated string or
# as a text file, with one gene per line. The query can be either positive (keep
# variants annotated to any of the input genes) or negative (keep variants not
# annotated to any of the input genes).
################################################################################
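The positive/negative distinction described above reduces to set arithmetic over row IDs, as this small sketch (with invented IDs) shows:

```python
all_ids = {1, 2, 3, 4, 5}   # every variant row in the table
pos_ids = {2, 4}            # rows whose field matches an input gene

def select_rows(negative_query):
    # A negative query keeps the complement of the matching rows;
    # a positive query keeps the matches themselves.
    return all_ids - pos_ids if negative_query else pos_ids
```

This is exactly what the query functions below compute with `neg_ids = all_ids - pos_ids`.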
def parse_args():
"""
Parse the input arguments.
"""
parser = argparse.ArgumentParser()
parser.add_argument('-i', dest = 'inp_folder', required = True,
help = 'input folder containing the several wormtables')
parser.add_argument('-o', dest = 'out_file', required = True,
help = 'output file [.txt]')
parser.add_argument('-g', dest = 'genes_to_query', required = True,
help = 'genes of interest [comma-sep. string or file ' +
'path]')
parser.add_argument('-f', dest = 'field_name', required = True,
help = 'field where gene names have to be searched')
parser.add_argument('-n', dest = 'negative_query', required = True,
help = 'is this a negative query? [True or False]')
parser.add_argument('-p', dest = 'previous_results', required = False,
help = 'previously saved results from another query ' +
'[.txt]')
args = parser.parse_args()
return args
def check_input_file(folder_name):
"""
Make sure that the input file's path is properly defined.
"""
if not os.path.exists(folder_name):
sys.stderr.write("\nFolder named '" + folder_name + "' does not exist.\n")
sys.exit()
return folder_name
def check_output_file(file_name):
"""
Make sure that the input file's path does not already exist.
"""
if os.path.exists(file_name):
sys.stderr.write("\nFile named '" + file_name + "' already exists.\n")
sys.exit()
return file_name
def store_genes(genes_to_query):
"""
Store all input gene names in a set. If the path of genes_to_query does not
exist, it will treat genes_to_query as a string.
"""
genes = set()
# genes_to_query is a text file
if os.path.exists(genes_to_query):
f = open(genes_to_query)
for line in f:
genes.add(line.strip('\n'))
f.close()
# genes_to_query is a comma-separated string
else:
genes = set(genes_to_query.split(','))
return genes
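Both branches of the gene-list parsing can be checked quickly; the sketch below copies the logic (a `with` block replaces the explicit open/close, behaviour unchanged):

```python
import os
import tempfile

def store_genes_sketch(genes_to_query):
    # Same branching as store_genes above: a real path is read line by
    # line, anything else is treated as a comma-separated string.
    if os.path.exists(genes_to_query):
        genes = set()
        with open(genes_to_query) as f:
            for line in f:
                genes.add(line.strip('\n'))
        return genes
    return set(genes_to_query.split(','))

from_string = store_genes_sketch("BRCA1,TP53,EGFR")

tmp = tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False)
tmp.write("BRCA1\nTP53\nEGFR\n")
tmp.close()
from_file = store_genes_sketch(tmp.name)
os.unlink(tmp.name)
```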
def get_variants_assoc_to_gene_set_from_previous_results(inp_folder, genes,
field_name, negative_query, previous_results):
"""
Open the field_name wormtable (assumed to be named 'inp_folder/field_name.wt')
within inp_folder and return a set of all row IDs where at least one gene from
genes is found. Use ids from previous_results as starting point to further
filter the data and to make it faster.
If negative_query is True, only variants NOT containing any of the input genes
in field_name will be returned; if False, viceversa (positive query is run).
"""
# extract row IDs to check from previous_results (which is a file path) and
# store them in a set; NOTE: it assumes previous_results has a 1-line header,
# is tab-separated and row_id is the left-most field!
ids_to_check = set()
f = open(previous_results)
header = True
for line in f:
if header:
header = False
else:
ids_to_check.add(int(line.split('\t')[0]))
f.close()
# open wormtable for the field of interest
table = wt.open_table(inp_folder + '/' + field_name + '.wt',
db_cache_size='4G')
index = table.open_index('row_id')
all_ids = set()
pos_ids = set()
# NOTE: it assumes the wormtable has only two columns: 'row_id' and field_name
row_id_idx = 0
field_name_idx = 1
for row in index.cursor(['row_id', field_name]):
if row[row_id_idx] in ids_to_check:
all_ids.add(row[row_id_idx])
for value in row[field_name_idx].split(','):
for gene in genes:
if value.find(gene) != -1:
pos_ids.add(row[row_id_idx])
break
# close table and index
table.close()
index.close()
# if "negative_query" is True, return all row IDs which are not in "pos_ids"
if negative_query == 'True':
neg_ids = all_ids - pos_ids
return neg_ids
elif negative_query == 'False':
return pos_ids
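The previous-results parsing assumed above (tab-separated, one header line, `row_id` in the left-most column) can be isolated into a tiny helper; this is an illustrative sketch, not part of the script:

```python
import io

def read_previous_row_ids(fileobj):
    # Same assumptions the function above documents: skip one header
    # line, then take the integer in the left-most tab-separated field.
    ids = set()
    header = True
    for line in fileobj:
        if header:
            header = False
        else:
            ids.add(int(line.split('\t')[0]))
    return ids

sample = "row_id\tCHROM\tPOS\n3\t1\t12345\n9\t2\t67890\n"
previous_ids = read_previous_row_ids(io.StringIO(sample))
```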
def get_variants_assoc_to_gene_set(inp_folder, genes, field_name,
negative_query):
"""
Open the field_name wormtable (assumed to be named 'inp_folder/field_name.wt')
within inp_folder and return a set of all row IDs where at least one gene from
genes is found.
If negative_query is True, only variants NOT containing any of the input genes
in field_name will be returned; if False, viceversa (positive query is run).
"""
# open wormtable for the field of interest
table = wt.open_table(inp_folder + '/' + field_name + '.wt',
db_cache_size='4G')
all_ids = set()
pos_ids = set()
# NOTE: it assumes the wormtable has only two columns: 'row_id' and field_name
row_id_idx = 0
field_name_idx = 1
for row in table.cursor(['row_id', field_name]):
all_ids.add(row[row_id_idx])
for value in row[field_name_idx].split(','):
for gene in genes:
if value.find(gene) != -1:
pos_ids.add(row[row_id_idx])
break
# close table
table.close()
  # if "negative_query" is True, return all row IDs which are not in "pos_ids"
if negative_query == 'True':
neg_ids = all_ids - pos_ids
return neg_ids
elif negative_query == 'False':
return pos_ids
def retrieve_variants_by_rowid(inp_folder, ids, out_file):
"""
Use the row IDs in ids to query the complete wormtable (containing all variant
  fields) and return all the information about the filtered variants.
"""
# open table and load indices
table = wt.open_table(inp_folder + '/schema.wt', db_cache_size='4G')
index = table.open_index('row_id')
# retrieve the rows using the 'row_id' field and write the results in out_file
col_names = [col.get_name() for col in table.columns()]
row_id_idx = col_names.index('row_id')
out = open(out_file, 'w')
out.write('\t'.join(col_names) + '\n')
for row in index.cursor(col_names):
if row[row_id_idx] in ids:
to_write = list()
for value in row:
try: # value is a number (int or float)
to_write.append(int(value))
except TypeError, e: # value is a tuple
if value is not None:
to_write.append(','.join([str(x) for x in value]))
else:
to_write.append(None)
except ValueError, e: # value is a string
to_write.append(value)
except:
to_write.append(None)
out.write('\t'.join([str(x) for x in to_write]) + '\n')
out.close()
# close table and index
table.close()
index.close()
return
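The per-cell fallback chain in the writer above (int stays int, tuple joins on commas, string passes through, `None` otherwise) can be isolated; `serialize_value` is a hypothetical helper name introduced here for illustration:

```python
def serialize_value(value):
    # Mirrors the try/except chain in retrieve_variants_by_rowid above.
    try:
        return int(value)              # value is a number (int or float)
    except TypeError:                  # value is a tuple (or None)
        if value is not None:
            return ','.join(str(x) for x in value)
        return None
    except ValueError:                 # value is a non-numeric string
        return value
    except Exception:                  # anything else becomes None
        return None
```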
def script07_api_call(i_folder, o_file, genes_to_query, field_name,
negative_query, previous_results = None):
"""
API call for web-based and other front-end services, to avoid a system call
and a new Python process.
"""
t1 = datetime.now()
inp_folder = check_input_file(i_folder)
out_file = check_output_file(o_file)
negative_query = str(negative_query).lower()
if negative_query.startswith('t'):
negative_query = 'True'
else:
negative_query = 'False'
genes = store_genes(genes_to_query)
if previous_results != None:
ids = get_variants_assoc_to_gene_set_from_previous_results(inp_folder,
genes, field_name, negative_query, previous_results)
else:
ids = get_variants_assoc_to_gene_set(inp_folder, genes, field_name,
negative_query)
retrieve_variants_by_rowid(inp_folder, ids, out_file)
t2 = datetime.now()
sys.stderr.write('%s\n' % str(t2 - t1))
return
def main():
"""
Main functio |
hehongliang/tensorflow | tensorflow/python/kernel_tests/signal/mfcc_ops_test.py | Python | apache-2.0 | 2,543 | 0.005505 | # Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for mfcc_ops."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from tensorflow.python.framework import dtypes
from tensorflow.python.framework import tensor_shape
from tensorflow.python.ops import array_ops
from tensorflow.python.ops import random_ops
from tensorflow.python.ops import spectral_ops_test_util
from tensorflow.python.ops.signal import mfcc_ops
from tensorflow.python.platform import test
# TODO(rjryan): We have no open source tests for MFCCs at the moment. Internally
# at Google, this code is tested against a reference implementation that follows
# HTK conventions.
class MFCCTest(test.TestCase):
  def test_error(self):
# num_mel_bins must be positive.
with self.assertRaises(ValueError):
signal = array_ops.zeros((2, 3, 0))
mfcc_ops.mfccs_from_log_mel_spectrograms(signal)
# signal must be float32
with self.assertRaises(ValueError):
      signal = array_ops.zeros((2, 3, 5), dtype=dtypes.float64)
mfcc_ops.mfccs_from_log_mel_spectrograms(signal)
def test_basic(self):
"""A basic test that the op runs on random input."""
with spectral_ops_test_util.fft_kernel_label_map():
with self.session(use_gpu=True):
signal = random_ops.random_normal((2, 3, 5))
mfcc_ops.mfccs_from_log_mel_spectrograms(signal).eval()
def test_unknown_shape(self):
"""A test that the op runs when shape and rank are unknown."""
with spectral_ops_test_util.fft_kernel_label_map():
with self.session(use_gpu=True):
signal = array_ops.placeholder_with_default(
random_ops.random_normal((2, 3, 5)), tensor_shape.TensorShape(None))
self.assertIsNone(signal.shape.ndims)
mfcc_ops.mfccs_from_log_mel_spectrograms(signal).eval()
if __name__ == "__main__":
test.main()
wang-g/wang-g.github.io | support_data.py | Python | mit | 4,150 | 0.006265 | from __future__ import division
from bs4 import BeautifulSoup
import urllib2
import re
def url_request(url):
hdr = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.152 Safari/537.36',
'Accept':'*/*'}
request = urllib2.Request(url, headers=hdr)
try:
response = urllib2.urlopen(request)
except urllib2.HTTPError, e:
print e.fp.read()
quit()
return response
def support_skill_row(skill_name, skill_tag_and_costs):
row = skill_name + "|"
attr_tag = skill_tag_and_costs[0]
mana_mults = skill_tag_and_costs[1]
row += attr_tag + "|"
if type(mana_mults) is list:
for t in mana_mults:
row += str(t[0]) + "-" + str(t[1]) + ","
row = row[:-1]
else:
row += str(mana_mults)
return row+"\n"
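The pipe-delimited row format produced above can be seen in isolation; this sketch builds the same output with a `join` instead of appending a trailing comma and stripping it:

```python
def support_skill_row(skill_name, skill_tag_and_costs):
    # Same output format as support_skill_row above:
    #   name|attr|level-mult,level-mult,...\n   (or a single scalar mult)
    attr_tag, mana_mults = skill_tag_and_costs
    row = skill_name + "|" + attr_tag + "|"
    if isinstance(mana_mults, list):
        row += ",".join("%s-%s" % (lvl, mult) for lvl, mult in mana_mults)
    else:
        row += str(mana_mults)
    return row + "\n"

row = support_skill_row("Melee Splash", ("str", [(1, 1.5), (20, 1.49)]))
```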
def get_support_info(html):
bs = BeautifulSoup(html)
# First, tries finding mana cost information in the gem progression table
gem_prog_table = bs.find('table', {'class': 'wikitable GemLevelTable'})
if gem_prog_table == None: #returns if gem progression table not found
return None
first_row = gem_prog_table.find('tr') #finds first row, the header row
prog_headers = first_row.find_all('th') #gets a list of the column labels
mana_mult_col = -1
for i in range(len(prog_headers)):
header_text = prog_headers[i].text
if "Mult." in header_text:
mana_mult_col = i
break
if mana_mult_col != -1: #if no column labeled "Mana Cost" was found, return None
mana_mults = []
for row in first_row.find_next_siblings('tr'):
## print row
row_entries = row.find_all(['th','td'])
            level_entry = row.find('th').text.strip()
mana_entry = row_entries[mana_mult_col].text.strip().strip('%')
try:
mana_mults.append((int(level_entry),int(mana_entry)/100))
except ValueError:
continue
return mana_mults
else:
return None
def find_supports(bs, search_phrase, attr_tag, support_dict):
header = bs.find('th', text = re.compile(search_phrase))
    support_headers = header.find_next_siblings('th')
mcm_col = -1
for i in range(len(support_headers)):
if support_headers[i].text == 'MCM':
## print "header_location: " + str(i + 1)
mcm_col = i + 1
break
table = header.find_parent('table')
rows = table.find_all('tr')
base_url = 'http://pathofexile.gamepedia.com/'
support_list = []
for r in rows:
link = r.find('a')
if link == None:
continue
skill_name = link.text.strip()
print skill_name
row_entries = r.find_all('td')
mcm_entry = row_entries[mcm_col]
mcm = None
if 'N/A' in mcm_entry.text:
mcm = 1
elif '*' in mcm_entry.text:
url = base_url + link['href']
response = url_request(url)
support_html = response.read()
mcm = get_support_info(support_html)
else:
try:
mcm = int(mcm_entry.text.strip().strip('%'))/100
except ValueError:
mcm = None
        if mcm is not None:
support_dict[skill_name] = (attr_tag, mcm)
else:
print "OH NO: " + skill_name
url = 'http://pathofexile.gamepedia.com/Skills'
base_url = 'http://pathofexile.gamepedia.com'
response = url_request(url)
sk_html = response.read()
bs = BeautifulSoup(sk_html)
support_dict = {}
find_supports(bs, 'Strength Support Gems', 'str', support_dict)
find_supports(bs, 'Dexterity Support Gems', 'dex', support_dict)
find_supports(bs, 'Intelligence Support Gems', 'int', support_dict)
support_file = open('support_file.txt', 'w')
for support in support_dict:
if type(support_dict[support][1]) is list and len(support_dict[support][1]) < 20:
support_file.write("--")
support_file.write(support_skill_row(support, support_dict[support]))
support_file.close()
|
jiajiax/crosswalk-test-suite | apptools/apptools-android-tests/apptools/comm.py | Python | bsd-3-clause | 7,398 | 0.002298 | #!/usr/bin/env python
#
# Copyright (c) 2015 Intel Corporation.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of works must retain the original copyright notice, this
# list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the original copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of Intel Corporation nor the names of its contributors
# may be used to endorse or promote products derived from this work without
# specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY INTEL CORPORATION "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL INTEL CORPORATION BE LIABLE FOR ANY DIRECT,
# INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# Authors:
# Hongjuan, Wang <hongjuanx.wang@intel.com>
# Yun, Liu<yunx.liu@intel.com>
import os
import sys
import commands
import shutil
import urllib2
SCRIPT_PATH = os.path.realpath(__file__)
ConstPath = os.path.dirname(SCRIPT_PATH)
def setUp():
global device, XwalkPath, crosswalkVersion, PackTools, ARCH, cachedir
#device = "E6OKCY411012"
device = os.environ.get('DEVICE_ID')
cachedir = os.environ.get('CROSSWALK_APP_TOOLS_CACHE_DIR')
if not device:
print ("Get env error\n")
sys.exit(1)
fp = open(ConstPath + "/../arch.txt", 'r')
if fp.read().strip("\n\t") != "x86":
ARCH = "arm"
else:
ARCH = "x86"
fp.close()
vp = open(ConstPath + "/../version.txt", 'r')
crosswalkVersion = vp.read().strip("\n\t")
vp.close()
PackTools = ConstPath + "/../tools/crosswalk-app-tools/src/"
XwalkPath = ConstPath + "/../tools/"
if "crosswalk-app-tools" not in os.listdir(XwalkPath):
print "Please check if the crosswalk-app-tools exists in " + ConstPath + "/../tools/"
sys.exit(1)
elif "crosswalk-app-tools" in os.listdir(XwalkPath) and len(os.listdir(XwalkPath)) < 2:
print "Please check if the Crosswalk Binary exists in " + ConstPath + "/../tools/"
sys.exit(1)
def clear(pkg):
os.chdir(XwalkPath)
if os.path.exists(ConstPath + "/../tools/" + pkg):
try:
shutil.rmtree(XwalkPath + pkg)
except Exception as e:
os.system("rm -rf " + XwalkPath + pkg + " &>/dev/null")
def create(self):
clear("org.xwalk.test")
setUp()
os.chdir(XwalkPath)
cmd = PackTools + \
"crosswalk-app create org.xwalk.test --android-crosswalk=" + \
crosswalkVersion
packstatus = commands.getstatusoutput(cmd)
self.assertEquals(packstatus[0], 0)
self.assertIn("org.xwalk.test", os.listdir(os.getcwd()))
def build(self, cmd):
buildstatus = commands.getstatusoutput(cmd)
self.assertEquals(buildstatus[0], 0)
self.assertIn("pkg", os.listdir(XwalkPath + "org.xwalk.test"))
os.chdir('pkg')
apks = os.listdir(os.getcwd())
self.assertNotEquals(len(apks), 0)
for i in range(len(apks)):
self.assertTrue(apks[i].endswith(".apk"))
if "x86" in apks[i]:
self.assertIn("x86", apks[i])
if i < len(os.listdir(os.getcwd())):
self.assertIn("arm", apks[i - 1])
else:
self.assertIn("arm", apks[i + 1])
elif "arm" in apks[i]:
self.assertIn("arm", apks[i])
if i < len(os.listdir(os.getcwd())):
self.assertIn("x86", apks[i - 1])
else:
self.assertIn("x86", apks[i + 1])
def update(self, cmd):
updatestatus = commands.getstatusoutput(cmd)
self.assertEquals(updatestatus[0], 0)
self.assertNotIn("ERROR:", updatestatus[1])
version = updatestatus[1].split('\n')[-1].split(' ')[-1][1:-1]
if not cachedir:
namelist = os.listdir(os.getcwd())
else:
newcachedir = os.environ.get('CROSSWALK_APP_TOOLS_CACHE_DIR')
os.chdir(newcachedir)
namelist = os.listdir(os.getcwd())
os.chdir(XwalkPath + 'org.xwalk.test')
crosswalk = 'crosswalk-{}.zip'.format(version)
self.assertIn(crosswalk, namelist)
return version
def run(self):
setUp()
apks = os.listdir(os.getcwd())
for apk in apks:
if ARCH in apk:
inststatus = commands.getstatusoutput(
'adb -s ' +
device +
' install -r ' +
os.getcwd() +
'/' +
apk)
# print inststatus
self.assertEquals(inststatus[0], 0)
self.assertIn("Success", inststatus[1])
pmstatus = commands.getstatusoutput(
'adb -s ' +
device +
' shell pm list package |grep org.xwalk.test')
self.assertEquals(pmstatus[0], 0)
launstatus = commands.getstatusoutput(
'adb -s ' +
device +
' shell am start -n org.xwalk.test/.TestActivity')
self.assertEquals(launstatus[0], 0)
stopstatus = commands.getstatusoutput(
'adb -s ' +
device +
' shell am force-stop org.xwalk.test')
self.assertEquals(stopstatus[0], 0)
uninstatus = commands.getstatusoutput(
'adb -s ' +
device +
' uninstall org.xwalk.test')
self.assertEquals(uninstatus[0], 0)
def channel(self, channel):
createcmd = PackTools + \
"crosswalk-app create org.xwalk.test --android-crosswalk=" + channel
packstatus = commands.getstatusoutput(createcmd)
self.assertEquals(packstatus[0], 0)
self.assertIn(channel, packstatus[1])
crosswalklist = urllib2.urlopen(
'https://download.01.org/crosswalk/releases/crosswalk/android/' +
channel +
'/').read()
fp = open('test', 'w')
fp.write(crosswalklist)
fp.close()
line = commands.getstatusoutput(
"cat test|sed -n '/src\=\"\/icons\/folder.gif\"/=' |sed -n '$p'")[1].strip()
cmd = "cat test |sed -n '%dp' |awk -F 'href=' '{print $2}' |awk -F '\"|/' '{print $2}'" % int(
line)
version = commands.getstatusoutput(cmd)[1]
if not '.' in version:
line = commands.getstatusoutput(
"tac test|sed -n '/src\=\"\/icons\/folder.gif\"/=' |sed -n '2p'")[1].strip()
cmd = "tac test |sed -n '%dp' |awk -F 'href=' '{print $2}' |awk -F '\"|/' '{print $2}'" % int(
line)
version = commands.getstatusoutput(cmd)[1]
commands.getstatusoutput("rm -rf test")
crosswalk = 'crosswalk-{}.zip'.format(version)
namelist = os.listdir(os.getcwd())
self.assertIn(crosswalk, namelist)
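The `channel` helper above shells out to `sed`/`awk` to pull the newest folder link out of an Apache-style directory index; the same extraction can be done in pure Python. A sketch, under the assumption that version folders contain a dot — not the suite's actual code:

```python
import re

def latest_version(index_html):
    """Return the last version-looking directory link in an index page, or None."""
    links = re.findall(r'href="([^"/]+)/"', index_html)
    versions = [name for name in links if '.' in name]
    return versions[-1] if versions else None

page = '<a href="icons/">i</a><a href="14.43.343.17/">v</a><a href="16.45.421.19/">v</a>'
print(latest_version(page))  # -> 16.45.421.19
```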
|
ianare/exif-py | exifread/tags/makernote/fujifilm.py | Python | bsd-3-clause | 3,010 | 0 | """
Makernote (proprietary) tag definitions for FujiFilm.
http://www.sno.phy.queensu.ca/~phil/exiftool/TagNames/FujiFilm.html
"""
from ...utils import make_string
TAGS = {
0x0000: ('NoteVersion', make_string),
0x0010: ('InternalSerialNumber', ),
0x1000: ('Quality', ),
0x1001: ('Sharpness', {
0x1: 'Soft',
0x2: 'Soft',
0x3: 'Normal',
0x4: 'Hard',
0x5: 'Hard2',
0x82: 'Medium Soft',
0x84: 'Medium Hard',
0x8000: 'Film Simulation'
}),
0x1002: ('WhiteBalance', {
        0x0: 'Auto',
0x100: 'Daylight',
0x200: 'Cloudy',
0x300: 'Daylight Fluorescent',
0x301: 'Day White Fluorescent',
0x302: 'White Fluorescent',
        0x303: 'Warm White Fluorescent',
0x304: 'Living Room Warm White Fluorescent',
0x400: 'Incandescent',
0x500: 'Flash',
0x600: 'Underwater',
0xf00: 'Custom',
0xf01: 'Custom2',
0xf02: 'Custom3',
0xf03: 'Custom4',
0xf04: 'Custom5',
0xff0: 'Kelvin'
}),
0x1003: ('Saturation', {
0x0: 'Normal',
0x80: 'Medium High',
0x100: 'High',
0x180: 'Medium Low',
0x200: 'Low',
0x300: 'None (B&W)',
0x301: 'B&W Red Filter',
0x302: 'B&W Yellow Filter',
0x303: 'B&W Green Filter',
0x310: 'B&W Sepia',
0x400: 'Low 2',
0x8000: 'Film Simulation'
}),
0x1004: ('Contrast', {
0x0: 'Normal',
0x80: 'Medium High',
0x100: 'High',
0x180: 'Medium Low',
0x200: 'Low',
0x8000: 'Film Simulation'
}),
0x1005: ('ColorTemperature', ),
0x1006: ('Contrast', {
0x0: 'Normal',
0x100: 'High',
0x300: 'Low'
}),
0x100a: ('WhiteBalanceFineTune', ),
0x1010: ('FlashMode', {
0: 'Auto',
1: 'On',
2: 'Off',
3: 'Red Eye Reduction'
}),
0x1011: ('FlashStrength', ),
0x1020: ('Macro', {
0: 'Off',
1: 'On'
}),
0x1021: ('FocusMode', {
0: 'Auto',
1: 'Manual'
}),
0x1022: ('AFPointSet', {
0: 'Yes',
1: 'No'
}),
0x1023: ('FocusPixel', ),
0x1030: ('SlowSync', {
0: 'Off',
1: 'On'
}),
0x1031: ('PictureMode', {
0: 'Auto',
1: 'Portrait',
2: 'Landscape',
4: 'Sports',
5: 'Night',
6: 'Program AE',
256: 'Aperture Priority AE',
512: 'Shutter Priority AE',
768: 'Manual Exposure'
}),
0x1032: ('ExposureCount', ),
0x1100: ('MotorOrBracket', {
0: 'Off',
1: 'On'
}),
0x1210: ('ColorMode', {
0x0: 'Standard',
0x10: 'Chrome',
0x30: 'B & W'
}),
0x1300: ('BlurWarning', {
0: 'Off',
1: 'On'
}),
0x1301: ('FocusWarning', {
0: 'Off',
1: 'On'
}),
0x1302: ('ExposureWarning', {
0: 'Off',
1: 'On'
}),
}
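Each `TAGS` entry pairs a tag name with an optional value-to-label map (or a formatting callable such as `make_string`). A minimal sketch of how such a table can be resolved — illustrative only, exifread's real lookup code differs:

```python
TAG_TABLE = {
    0x1010: ('FlashMode', {0: 'Auto', 1: 'On', 2: 'Off', 3: 'Red Eye Reduction'}),
    0x1005: ('ColorTemperature', ),
}

def describe(tag_id, raw_value):
    """Map a raw maker-note value to a (tag name, printable label) pair."""
    entry = TAG_TABLE.get(tag_id)
    if entry is None:
        return 'UnknownTag0x%04x' % tag_id, raw_value
    name = entry[0]
    if len(entry) > 1 and isinstance(entry[1], dict):
        return name, entry[1].get(raw_value, raw_value)  # fall back to the raw value
    return name, raw_value

print(describe(0x1010, 3))  # -> ('FlashMode', 'Red Eye Reduction')
```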
|
Southpaw-TACTIC/Team | src/python/Lib/site-packages/PySide/examples/network/http.py | Python | epl-1.0 | 5,973 | 0.001842 | #!/usr/bin/env python
"""PySide port of the network/http example from Qt v4.x"""
import sys
from PySide import QtCore, QtGui, QtNetwork
class HttpWindow(QtGui.QDialog):
def __init__(self, parent=None):
QtGui.QDialog.__init__(self, parent)
self.urlLineEdit = QtGui.QLineEdit("http://www.ietf.org/iesg/1rfc_index.txt")
self.urlLabel = QtGui.QLabel(self.tr("&URL:"))
self.urlLabel.setBuddy(self.urlLineEdit)
self.statusLabel = QtGui.QLabel(self.tr("Please enter the URL of a file "
"you want to download."))
self.quitButton = QtGui.QPushButton(self.tr("Quit"))
self.downloadButton = QtGui.QPushButton(self.tr("Download"))
self.downloadButton.setDefault(True)
self.progressDialog = QtGui.QProgressDialog(self)
self.http = QtNetwork.QHttp(self)
        self.outFile = None
self.httpGetId = 0
self.httpRequestAborted = False
        self.connect(self.urlLineEdit, QtCore.SIGNAL("textChanged(const QString &)"),
self.enableDownloadButton)
self.connect(self.http, QtCore.SIGNAL("requestFinished(int, bool)"),
self.httpRequestFinished)
self.connect(self.http, QtCore.SIGNAL("dataReadProgress(int, int)"),
self.updateDataReadProgress)
self.connect(self.http, QtCore.SIGNAL("responseHeaderReceived(QHttpResponseHeader &)"),
self.readResponseHeader)
self.connect(self.progressDialog, QtCore.SIGNAL("canceled()"),
self.cancelDownload)
self.connect(self.downloadButton, QtCore.SIGNAL("clicked()"),
self.downloadFile)
self.connect(self.quitButton, QtCore.SIGNAL("clicked()"),
self, QtCore.SLOT("close()"))
topLayout = QtGui.QHBoxLayout()
topLayout.addWidget(self.urlLabel)
topLayout.addWidget(self.urlLineEdit)
buttonLayout = QtGui.QHBoxLayout()
buttonLayout.addStretch(1)
buttonLayout.addWidget(self.downloadButton)
buttonLayout.addWidget(self.quitButton)
mainLayout = QtGui.QVBoxLayout()
mainLayout.addLayout(topLayout)
mainLayout.addWidget(self.statusLabel)
mainLayout.addLayout(buttonLayout)
self.setLayout(mainLayout)
self.setWindowTitle(self.tr("HTTP"))
self.urlLineEdit.setFocus()
def downloadFile(self):
url = QtCore.QUrl(self.urlLineEdit.text())
fileInfo = QtCore.QFileInfo(url.path())
fileName = fileInfo.fileName()
if QtCore.QFile.exists(fileName):
QtGui.QMessageBox.information(self, self.tr("HTTP"), self.tr(
"There already exists a file called %s "
"in the current directory.") % (fileName))
return
self.outFile = QtCore.QFile(fileName)
if not self.outFile.open(QtCore.QIODevice.WriteOnly):
QtGui.QMessageBox.information(self, self.tr("HTTP"),
self.tr("Unable to save the file %(name)s: %(error)s.")
% {'name': fileName,
'error': self.outFile.errorString()})
self.outFile = None
return
if url.port() != -1:
self.http.setHost(url.host(), url.port())
else:
self.http.setHost(url.host(), 80)
if url.userName():
self.http.setUser(url.userName(), url.password())
self.httpRequestAborted = False
self.httpGetId = self.http.get(url.path(), self.outFile)
self.progressDialog.setWindowTitle(self.tr("HTTP"))
self.progressDialog.setLabelText(self.tr("Downloading %s.") % (fileName))
self.downloadButton.setEnabled(False)
def cancelDownload(self):
self.statusLabel.setText(self.tr("Download canceled."))
self.httpRequestAborted = True
self.http.abort()
self.downloadButton.setEnabled(True)
def httpRequestFinished(self, requestId, error):
if self.httpRequestAborted:
if self.outFile is not None:
self.outFile.close()
self.outFile.remove()
self.outFile = None
self.progressDialog.hide()
return
if requestId != self.httpGetId:
return
self.progressDialog.hide()
self.outFile.close()
if error:
self.outFile.remove()
QtGui.QMessageBox.information(self, self.tr("HTTP"),
self.tr("Download failed: %s.")
% (self.http.errorString()))
else:
fileName = QtCore.QFileInfo(QtCore.QUrl(self.urlLineEdit.text()).path()).fileName()
self.statusLabel.setText(self.tr("Downloaded %s to current directory.") % (fileName))
self.downloadButton.setEnabled(True)
self.outFile = None
def readResponseHeader(self, responseHeader):
if responseHeader.statusCode() != 200:
QtGui.QMessageBox.information(self, self.tr("HTTP"),
self.tr("Download failed: %s.")
% (responseHeader.reasonPhrase()))
self.httpRequestAborted = True
self.progressDialog.hide()
self.http.abort()
return
def updateDataReadProgress(self, bytesRead, totalBytes):
if self.httpRequestAborted:
return
self.progressDialog.setMaximum(totalBytes)
self.progressDialog.setValue(bytesRead)
def enableDownloadButton(self):
        self.downloadButton.setEnabled(bool(self.urlLineEdit.text()))
if __name__ == '__main__':
app = QtGui.QApplication(sys.argv)
httpWin = HttpWindow()
sys.exit(httpWin.exec_())
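`downloadFile` above derives the target file name from the URL path via `QFileInfo`; the equivalent derivation using only the standard library (Python 3 module names) looks like this:

```python
import posixpath
from urllib.parse import urlparse

def filename_from_url(url, default='index.html'):
    """Last component of the URL path, like QFileInfo(url.path()).fileName()."""
    name = posixpath.basename(urlparse(url).path)
    return name or default

print(filename_from_url('http://www.ietf.org/iesg/1rfc_index.txt'))  # -> 1rfc_index.txt
```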
|
easytaxibr/mockrista | simpleserver.py | Python | mit | 6,740 | 0.04273 | # Modified
from BaseHTTPServer import BaseHTTPRequestHandler,HTTPServer
from SocketServer import ThreadingMixIn
from urlparse import urlparse
import json
import threading
import argparse
import re
import cgi
import random
import sys
import math
positions = [
{"lat": -23.542887 , "lng": -46.73158},
{"lat": -23.542179 , "lng": -46.730915},
{"lat": -23.541411 , "lng": -46.729907},
{"lat": -23.541333 , "lng": -46.729113},
{"lat": -23.541215 , "lng": -46.728255},
{"lat": -23.541569 , "lng": -46.727933},
{"lat": -23.541824 , "lng": -46.727697},
{"lat": -23.542454 , "lng": -46.727203},
{"lat": -23.542965 , "lng": -46.726795},
{"lat": -23.543320 , "lng": -46.726152},
{"lat": -23.543595 , "lng": -46.725658},
{"lat": -23.544205 , "lng": -46.724006},
{"lat": -23.544303 , "lng": -46.723105},
{"lat": -23.544382 , "lng": -46.722032},
{"lat": -23.544598 , "lng": -46.721216},
{"lat": -23.544775 , "lng": -46.720251},
{"lat": -23.544940 , "lng": -46.719849},
{"lat": -23.545149 , "lng": -46.719221},
{"lat": -23.545444 , "lng": -46.71862},
{"lat": -23.545896 , "lng": -46.717869},
{"lat": -23.546585 , "lng": -46.717032},
{"lat": -23.547155 , "lng": -46.716324},
{"lat": -23.547805 , "lng": -46.715659},
{"lat": -23.548257 , "lng": -46.71523},
{"lat": -23.548650 , "lng": -46.714844},
{"lat": -23.548864 , "lng": -46.714516},
{"lat": -23.549218 , "lng": -46.714162},
{"lat": -23.549454 , "lng": -46.714312},
{"lat": -23.549621 , "lng": -46.714527},
{"lat": -23.549956 , "lng": -46.714838},
{"lat": -23.550113 , "lng": -46.715117},
{"lat": -23.550349 , "lng": -46.715418},
{"lat": -23.550516 , "lng": -46.715686},
{"lat": -23.550831 , "lng": -46.715997},
{"lat": -23.551146 , "lng": -46.71619},
{"lat": -23.552483 , "lng": -46.716952},
{"lat": -23.552926 , "lng": -46.717209},
{"lat": -23.553388 , "lng": -46.717424},
{"lat": -23.553811 , "lng": -46.717671},
{"lat": -23.554086 , "lng": -46.717992},
{"lat": -23.552444 , "lng": -46.72134},
{"lat": -23.551116 , "lng": -46.724065},
{"lat": -23.549828 , "lng": -46.726704},
{"lat": -23.549297 , "lng": -46.727348},
{"lat": -23.548185 , "lng": -46.729333},
{"lat": -23.547153 , "lng": -46.731114},
{"lat": -23.546208 , "lng": -46.732391},
{"lat": -23.545943 , "lng": -46.732702},
{"lat": -23.545490 , "lng": -46.733195},
{"lat": -23.544104 , "lng": -46.734311},
{"lat": -23.542953 , "lng": -46.735438},
{"lat": -23.542412 , "lng": -46.735223},
{"lat": -23.541005 , "lng": -46.733807},
{"lat": -23.540602 , "lng": -46.733378},
{"lat": -23.540150 , "lng": -46.732991},
{"lat": -23.540386 , "lng": -46.73268},
{"lat": -23.540986 , "lng": -46.731994},
{"lat": -23.541487 , "lng": -46.731393},
{"lat": -23.541822 , "lng": -46.731039},
{"lat": -23.542018 , "lng": -46.730781},
{"lat": -23.542451 , "lng": -46.731135},
{"lat": -23.543169 , "lng": -46.731886}
]
class HTTPRequestHandler(BaseHTTPRequestHandler):
sessionCounter = {}
arrived = False
def getTheCurrentTaxiPosition(self, session):
if session in self.sessionCounter:
print "There is Session"
tick = self.sessionCounter[session]
tick = tick + 1
self.sessionCounter[session] = tick
else:
print "There NO is Session"
self.sessionCounter[session] = 0
pos = self.sessionCounter[session]
if len(positions) > pos:
self.arrived = False
return positions[pos]
else:
self.arrived = True
return positions[len(positions) - 1]
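The per-session bookkeeping in `getTheCurrentTaxiPosition` can be condensed with `dict.get`; a Python 3 sketch with the same behaviour (the first call for a session yields the first waypoint, later calls advance and then clamp at the last one):

```python
def next_position(counters, session, waypoints):
    """Advance this session's waypoint index; return (position, arrived)."""
    idx = counters.get(session, -1) + 1   # first call for a session gives index 0
    counters[session] = idx
    if idx < len(waypoints):
        return waypoints[idx], False
    return waypoints[-1], True            # clamp at the last waypoint once exhausted
```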
def randomTaxiPositionJson(self, lat, lng, n):
taxis = []
radius = 500
radiusInDegrees=float(radius)/float(111300)
r = radiusInDegrees
x0 = float(lat)
y0 = float(lng)
        for i in range(n):  # one iteration per requested taxi
u = float(random.random())
v = float(random.random())
w = r * math.sqrt(u)
t = 2 * math.pi * v
x = w * math.cos(t)
y = w * math.sin(t)
print "x %f y %f" % (x, y)
xLat = x + x0
yLong = y + y0
taxis.append({"lat": xLat , "lng": yLong, "driver-name" : "Driver Test", "driver-car" : "Driver Car"})
return taxis
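`randomTaxiPositionJson` uses the standard trick for sampling uniformly inside a disc — radius scaled by the square root of one uniform variate, angle by 2π times another — so points do not cluster at the centre. Isolated as a helper (Python 3; the 1° ≈ 111300 m conversion is the handler's own crude approximation):

```python
import math
import random

def random_point_in_disc(lat0, lng0, radius_m):
    """Return a (lat, lng) uniformly distributed in a disc around (lat0, lng0)."""
    r = radius_m / 111300.0               # crude metres -> degrees, as above
    w = r * math.sqrt(random.random())    # sqrt keeps the density uniform over the area
    t = 2 * math.pi * random.random()
    return lat0 + w * math.cos(t), lng0 + w * math.sin(t)
```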
def do_GET(self):
try:
if None != re.search('/api/taxi-position/the-taxi/*', self.path):
query = urlparse(self.path).query
query_components = dict(qc.split("=") for qc in query.split("&"))
session = query_components["session"]
response = {"driver_name": "Foo Bar",
"car_model" : "Tesla Model",
"license_plate" : "FOO-4242",
"position" : self.getTheCurrentTaxiPosition(session),
"is_arravied" : self.arrived}
self.send_response(200)
self.send_header('Content-Type', 'application/json')
self.end_headers()
self.wfile.write(json.dumps(response))
elif None != re.search('/api/gettaxis/*', self.path):
query = urlparse(self.path).query
query_components = dict(qc.split("=") for qc in query.split("&"))
lat = query_components["lat"]
lng = query_components["lng"]
response = {"taxis": self.randomTaxiPositionJson(lat, lng, 100)}
self.send_response(200)
self.send_header('Content-Type', 'application/json')
self.end_headers()
self.wfile.write(json.dumps(response))
elif None != re.search('/api/taxi-avarage-eta/*', self.path):
query = urlparse(self.path).query
query_components = dict(qc.split("=") for qc in query.split("&"))
lat = query_components["lat"]
lng = query_components["lng"]
                response = {"taxis": self.randomTaxiPositionJson(lat, lng, 10),
                            "agarage-eta": 5}
self.send_response(200)
self.send_header('Content-Type', 'application/json')
self.end_headers()
self.wfile.write(json.dumps(response))
else:
self.send_response(403)
                self.send_header('Content-Type', 'application/json')
self.end_headers()
except ValueError:
print ValueError
self.send_response(400)
self.send_header('Content-Type', 'application/json')
self.end_headers()
return
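`do_GET` above splits the query string by hand, which raises `ValueError` on an empty or valueless parameter; the standard library's parser already covers those cases (Python 3 names shown — Python 2 has the same functions in the `urlparse` module):

```python
from urllib.parse import parse_qs, urlparse

params = parse_qs(urlparse('/api/gettaxis?lat=-23.54&lng=-46.73').query)
lat = params['lat'][0]   # parse_qs maps each key to a *list* of values
lng = params['lng'][0]
print(lat, lng)  # -> -23.54 -46.73
```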
class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
allow_reuse_address = True
def shutdown(self):
self.socket.close()
HTTPServer.shutdown(self)
class SimpleHttpServer():
def __init__(self, ip, port):
self.server = ThreadedHTTPServer((ip,port), HTTPRequestHandler)
def start(self):
self.server_thread = threading.Thread(target=self.server.serve_forever)
self.server_thread.daemon = True
self.server_thread.start()
def waitForThread(self):
self.server_thread.join()
def stop(self):
self.server.shutdown()
self.waitForThread()
if __name__=='__main__':
parser = argparse.ArgumentParser(description='HTTP Server')
parser.add_argument('port', type=int, help='Listening port for HTTP Server')
parser.add_argument('ip', help='HTTP Server IP')
args = parser.parse_args()
server = SimpleHttpServer(args.ip, args.port)
print 'HTTP Server Running...........'
server.start()
server.waitForThread()
|
tmetsch/suricate | suricate/ui/api.py | Python | mit | 13,237 | 0 | # coding=utf-8
"""
An API used by the UI and RESTful API.
"""
from bson import ObjectId
__author__ = 'tmetsch'
import collections
import json
import pika
import pika.exceptions as pikaex
import uuid
from suricate.data import object_store
from suricate.data import streaming
TEMPLATE = '''
% if len(error.strip()) > 0:
<div class="error">{{!error}}</div>
% end
% for item in output:
% if item[:6] == 'image:':
<div><img src="data:{{item[6:]}}"/></div>
% elif item[:6] == 'embed:':
<div>{{!item[6:]}}</div>
% elif item[:1] == '#':
<div>{{item.rstrip()}}</div>
% elif item.strip() != '':
<div class="code">{{item.rstrip()}}</div>
% end
% end
'''
class API(object):
"""
Little helper class to abstract the REST and UI from.
"""
def __init__(self, amqp_uri, mongo_uri):
self.clients = {}
self.amqp_uri = amqp_uri
# get obj/streaming client up!
self.obj_str = object_store.MongoStore(mongo_uri)
self.stream = streaming.AMQPClient(mongo_uri)
# Data sources...
def info_data(self, uid, token):
"""
Return dictionary with key/values about data objects and streams.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
data_info = self.obj_str.info(uid, token)
data_info.update(self.stream.info(uid, token))
return data_info
def list_data_sources(self, uid, token):
"""
List available data sources. Return list of ids for objects & streams.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
tmp = self.obj_str.list_objects(uid, token)
tmp2 = self.stream.list_streams(uid, token)
return tmp, tmp2
# Objects
def create_object(self, content, uid, token, meta_dat):
"""
Create a data object.
:param content: Object content.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
self.obj_str.create_object(uid, token, content, meta=meta_dat)
def retrieve_object(self, iden, uid, token):
"""
Retrieve a data object.
:param iden: Id of the object.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
tmp = self.obj_str.retrieve_object(uid, token, iden)
return tmp
def delete_object(self, iden, uid, token):
"""
Delete a data object.
:param iden: Id of the object.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
self.obj_str.delete_object(uid, token, iden)
# Streams
def create_stream(self, uri, queue, uid, token):
"""
Create a data stream.
:param uri: RabbitMQ Broker URI.
:param queue: Queue.
:param uid: Identifier for the user.
        :param token: The token of the user.
"""
self.stream.create(uid, token, uri, queue)
def retrieve_stream(self, iden, uid, token):
"""
Retrieve a data stream.
:param iden: Id of the stream.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
uri, queue, msgs = self.stream.retrieve(uid, token, iden)
        return uri, queue, msgs
def delete_stream(self, iden, uid, token):
"""
Delete a data stream.
:param iden: Id of the stream.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
self.stream.delete(uid, token, iden)
def set_meta(self, data_src, iden, tags, uid, token):
"""
Set meta information.
:param data_src: Reflects to db name.
:param iden: Id of the object/stream.
:param tags: Metadata dict.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
database = self.obj_str.client[uid]
database.authenticate(uid, token)
collection = database[data_src]
collection.update({'_id': ObjectId(iden)},
{"$set": {'meta.tags': tags}})
####
# Everything below this is RPC!
####
# TODO: check if can work with function name and kwargs to construct
# payload.
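The TODO above suggests deriving each payload from the call name plus keyword arguments instead of repeating the dict literal in every wrapper; a minimal sketch of that factoring (not suricate's actual API):

```python
def rpc_payload(uid, token, call, **kwargs):
    """Build the common RPC payload, folding call-specific arguments in as kwargs."""
    payload = {'uid': uid, 'token': token, 'call': call}
    payload.update(kwargs)
    return payload

payload = rpc_payload('user1', 'tok', 'retrieve_project', project_id='demo')
```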
# Projects
def list_projects(self, uid, token):
"""
RPC call to list notebooks.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'call': 'list_projects'}
tmp = self._call_rpc(uid, payload)
return tmp['projects']
def retrieve_project(self, proj_name, uid, token):
"""
RPC call to retrieve projects.
:param proj_name: Name of the project.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'project_id': proj_name,
'call': 'retrieve_project'}
tmp = self._call_rpc(uid, payload)
return tmp['project']
def delete_project(self, proj_name, uid, token):
"""
RPC call to delete a project.
:param proj_name: Name of the project.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'project_id': proj_name,
'call': 'delete_project'}
self._call_rpc(uid, payload)
# Notebooks
def create_notebook(self, proj_name, uid, token, ntb_name='start.py',
src='\n'):
"""
RPC call to create a notebook (or creating an empty project).
:param proj_name: Name of the project.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'project_id': proj_name,
'notebook_id': None,
'notebook': {'meta': {'tags': [],
'name': ntb_name,
'mime-type': 'text/x-script.phyton'},
'src': src,
'dashboard_template': TEMPLATE,
'out': [],
'err': ''},
'call': 'create_notebook'}
self._call_rpc(uid, payload)
def retrieve_notebook(self, proj_name, ntb_id, uid, token):
"""
RPC call to retrieve a notebook.
:param proj_name: Name of the project.
:param ntb_id: Id of the notebook.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'project_id': proj_name,
'notebook_id': ntb_id,
'call': 'retrieve_notebook'}
tmp = self._call_rpc(uid, payload)
return tmp['notebook']
def update_notebook(self, proj_name, ntb_id, ntb, uid, token):
"""
RPC call to update a notebook.
:param proj_name: Name of the project.
:param ntb_id: Id of the notebook.
:param ntb: New notebook structure.
:param uid: Identifier for the user.
:param token: The token of the user.
"""
payload = {'uid': uid,
'token': token,
'project_id': proj_name,
'notebook_id': ntb_id,
'src': ntb,
'call': 'update_notebook'}
self._call_rpc(uid, payload)
def delete_notebook(self, proj_name, ntb_id, uid, token):
"""
RPC call to delete a notebook.
:param proj_name: Name of the project.
:param ntb_id: Id of the notebook.
:param uid: Identifier for the user.
:param token: The token of the u |
psykzz/flask-rollbar | tests/test_flake8.py | Python | mit | 1,281 | 0.003903 | # -*- coding: utf8 -*-
from __future__ import unicode_literals
import unittest
import os
import sys
from flake8.api import legacy as engine
if sys.version_info[0] == 3:
unicode = str
if sys.version_info[:2] == (2, 6):
# Monkeypatch to make tests work on 2.6
def assert_less(first, second, msg=None):
        assert first < second
unittest.TestCase.assertLess = assert_less
class TestCodeComplexity(unittest.TestCase):
def test_flake8_conformance(self):
flake8style = engine.get_style_guide(
ignore=['E501'],
max_complexity=6
)
directory = 'flask_rollbar'
self.assertEqual(os.path.isdir(directory), True,
"Invalid test directory '%s'. You need to update test_flake8.py" % directory)
# Get all the files to check
files = []
for dirpath, dirnames, filenames in os.walk(directory):
for filename in [f for f in filenames if f.endswith(".py")]:
files += [os.path.join(dirpath, filename)]
result = flake8style.check_files(files)
self.assertEqual(result.total_errors, 0,
                         "Code found to be too complex or failing PEP8")
if __name__ == '__main__':
unittest.main()
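The `os.walk` loop that gathers the files to check is a reusable pattern; written as a generator (a Python 3 sketch):

```python
import os

def python_files(root):
    """Yield the path of every .py file below root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith('.py'):
                yield os.path.join(dirpath, name)
```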
|
improlabs/Banglish-Sentiment-Analysis | python3/pyavrophonetic/config.py | Python | gpl-3.0 | 1,660 | 0 | #!/usr/bin/env python
"""Provides configurations for pyAvroPhonetic
-------------------------------------------------------------------------------
Copyright (C) 2013 Kaustav Das Modak <kaustav.dasmodak@yahoo.co.in.
This file is part of pyAvroPhonetic.
pyAvroPhonetic is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
pyAvroPhonetic is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with pyAvroPhonetic. If not, see <http://www.gnu.org/licenses/>.
"""
# Imports
import os
import simplejson as json
import codecs
# Constants
# -- Path to current directory
BASE_PATH = os.path.dirname(__file__)
# -- path to avrodict.json
AVRO_DICT_FILE = os.path.abspath(os.path.join(BASE_PATH,
"resources/avrodict.json"))
# -- Loads json data from avrodict.json
AVRO_DICT = json.load(codecs.open(AVRO_DICT_FILE, encoding='utf-8'))
# -- Shortcut to vowels
AVRO_VOWELS = set(AVRO_DICT['data']['vowel'])
# -- Shortcut to consonants
AVRO_CONSONANTS = set(AVRO_DICT['data']['consonant'])
# -- Shortcut to case-sensitives
AVRO_CASESENSITIVES = set(AVRO_DICT['data']['casesensitive'])
# -- Shortcut to number
AVRO_NUMBERS = set(AVRO_DICT['data']['number'])
|
sandow-digital/django-seo | rollyourown/seo/default.py | Python | bsd-3-clause | 1,038 | 0.010597 | #!/usr/bin/env python
# -*- coding: UTF-8 -*-
from rollyourown import seo
from django.conf import settings
class DefaultMetadata(seo.Metadata):
""" A very basic default class for those who do not wish to write their own.
"""
title = seo.Tag(head=True, max_length=68)
keywords = seo.MetaTag()
description = seo.MetaTag(max_length=155)
heading = seo.Tag(name="h1")
class Meta:
verbose_name = "Metadata"
verbose_name_plural = "Metadata"
use_sites = False
# This default class is automatically created when SEO_MODELS is
# defined, so we'll take our model list from there.
seo_models = getattr(settings, 'SEO_MODELS', [])
class HelpText:
title = "This is the page title, that appears in the title bar."
        keywords = "Comma-separated keywords for search engines."
description = "A short description, displayed in search results."
        heading = "This is the page heading, appearing in the <h1> tag."
|
isaacdixon274/httpy | server.py | Python | gpl-3.0 | 3,664 | 0.043122 | #httpy server
#A simple HTTP | server written in Python
import socket #Import sockets
import os #Import os
import mimetypes #Import mimetypes
import subprocess
import datetime
conf = {
'port': '80',
'max_request_size': '2048',
'server_dir': '/etc/httpy',
'www_dir': '/var/www',
'log_dir': '/var/log/httpy',
'main_log': 'main.log',
'error_log': 'error.log',
'default_pages': 'index.html,index.htm,index.php',
    'error_page_404': '404.html',
    'default_mime': 'text/plain'  # fallback; referenced below when guess_type() returns None
}
confFile = open('config')
lines = confFile.readlines()
confFile.close()
for line in lines:
line = line.rstrip('\n')
if line.replace(' ', '') != '':
if line.lstrip(' ')[0] != '#':
key, value = line.split('=')
conf[key] = value
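The loop above implements httpy's entire config syntax: `key=value` lines, with blank lines and `#` comments skipped. A self-contained sketch of the same parsing over a hypothetical config (the sample values are illustrative, not httpy defaults):

```python
# Standalone illustration of the key=value parsing used above.
# The sample config and its values are hypothetical.
sample = """# httpy-style config
port=8080

www_dir=/srv/www
"""

conf = {}
for line in sample.splitlines():
    if line.replace(' ', '') != '':        # skip blank lines
        if line.lstrip(' ')[0] != '#':     # skip comment lines
            key, value = line.split('=')
            conf[key] = value

print(conf)  # {'port': '8080', 'www_dir': '/srv/www'}
```

Note that `line.split('=')` raises `ValueError` if a value itself contains `=`; the original loop shares this limitation.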
def log(ip = '127.0.0.1', event = 'error', msg = ''):
msg = str(datetime.datetime.now()) + '\t' + ip + '\t' + event + '\t' + msg
print(msg)
if event == 'error':
f = open(conf['log_dir'] + '/' + conf['error_log'], 'a')
f.write(msg + '\n')
f.close()
else:
f = open(conf['log_dir'] + '/' + conf['main_log'], 'a')
f.write(msg + '\n')
f.close()
s = socket.socket() #Create socket
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(('', int(conf['port']))) #Bind socket to address and port
log(event = 'start')
s.listen(10) #Listen for up to ten connections
while True: #Restart server if an error occurs
try:
#while True:
c, addr = s.accept() #Accept a connection
ip = addr[0]
log(ip = ip, event = 'connect')
data = c.recv(int(conf['max_request_size'])).decode() #Receive data
log(ip = ip, event = 'request', msg = data.replace('\r\n', ' '))
method = data.split(' ')[0]
if not data == '': #Check for data
requestPathParts = data[4:data.index(' HTTP')].replace(' ', '').split('?')
requestURI = requestPathParts[0]
docPath = conf['www_dir'] + requestURI
queries = ''
if method == 'GET':
if len(requestPathParts) > 1:
queries = requestPathParts[1]
if method == 'POST':
if '\r\n\r\n' in data:
queries = data.split('\r\n\r\n')[1]
if os.path.isdir(docPath):
if docPath[-1] != '/':
docPath += '/'
for page in conf['default_pages'].replace(' ', '').split(','):
if os.path.exists(docPath + page):
docPath += page
code = '200'
status = 'OK'
if not os.path.exists(docPath):
docPath = conf['error_page_404']
code = '404'
status = 'Not Found'
filename, ext = os.path.splitext(docPath)
print(conf['www_dir'])
if ext == '.php':
os.chdir(os.path.dirname(docPath))
phpSuperVariables = '$_SERVER[\'REQUEST_URI\'] = "' + requestURI + '"; $_SERVER[\'DOCUMENT_ROOT\'] = "' + conf['www_dir'] + '";'
subprocess.check_output('echo "" | php -R \'include("' + docPath + '");\' -B \'' + phpSuperVariables + ' parse_str($argv[1], $_' + method + ');\' \'' + queries + '\' > ' + conf['server_dir'] + '/tmp', shell=True)
docPath = conf['server_dir'] + '/tmp'
f = open(docPath, 'rb')
page = f.read()
f.close()
mimeType = mimetypes.guess_type(docPath)[0]
if mimeType == None:
mimeType = conf['default_mime']
responseHeaders = 'Content-Type: ' + mimeType + '; encoding=UTF-8\nContent-Length: ' + str(len(page)) + '\nConnection: close\r\n\r\n'
c.send(b'HTTP/1.1 ' + code.encode() + b' ' + status.encode() + b'\n')
c.send(responseHeaders.encode())
c.send(page)
log(ip, 'response', docPath)
else:
raise Exception('No data received') #Raise an error
except Exception as msg: #Handle errors without stopping program
log(ip = ip, event = 'error', msg = str(msg)) #Display error message
finally:
try:
c.close() #Close connection
except:
pass
finally:
log(ip = ip, event = 'disconn')
ghorn/casadi | test/python/vectortools.py | Python | lgpl-3.0 | 1,561 | 0.014734
#
# This file is part of CasADi.
#
# CasADi -- A symbolic framework for dynamic optimization.
# Copyright (C) 2010-2014 Joel Andersson, Joris Gillis, Moritz Diehl,
# K.U. Leuven. All rights reserved.
# Copyright (C) 2011-2014 Greg Horn
#
# CasADi is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 3 of the License, or (at your option) any later version.
#
# CasADi is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with CasADi; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
#
#
from casadi import *
import casadi as c
import numpy
import unittest
from types import *
from helpers import *
from casadi.tools import *
class Vectortoolsstests(casadiTestCase):
def test_complement(self):
self.message("complement")
w = [2,1,4,6]
self.assertRaises(RuntimeError,lambda : complement(w,3) )
    self.assertRaises(RuntimeError,lambda : complement(w,6) )
wc = list(complement(w,8))
    self.checkarray(DM(wc),DM([0,3,5,7]),"complement")
if __name__ == '__main__':
unittest.main()
t0mk/ansible | lib/ansible/modules/cloud/ovh/ovh_ip_loadbalancing_backend.py | Python | gpl-3.0 | 11,833 | 0.001859
#!/usr/bin/python
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
ANSIBLE_METADATA = {'status': ['preview'],
'supported_by': 'community',
'version': '1.0'}
DOCUMENTATION = '''
---
module: ovh_ip_loadbalancing_backend
short_description: Manage OVH IP LoadBalancing backends
description:
- Manage OVH (French European hosting provider) LoadBalancing IP backends
version_added: "2.2"
author: Pascal HERAUD @pascalheraud
notes:
- Uses the python OVH Api U(https://github.com/ovh/python-ovh).
    You have to create an application (a key and secret) with a consumer
    key as described at U(https://eu.api.ovh.com/g934.first_step_with_api)
requirements:
- ovh > 0.3.5
options:
name:
required: true
description:
- Name of the LoadBalancing internal name (ip-X.X.X.X)
backend:
required: true
description:
- The IP address of the backend to update / modify / delete
state:
required: false
default: present
choices: ['present', 'absent']
description:
- Determines whether the backend is to be created/modified
or deleted
probe:
required: false
default: none
choices: ['none', 'http', 'icmp' , 'oco']
description:
- Determines the type of probe to use for this backend
weight:
required: false
default: 8
description:
- Determines the weight for this backend
endpoint:
required: true
description:
- The endpoint to use ( for instance ovh-eu)
application_key:
required: true
description:
- The applicationKey to use
    application_secret:
required: true
description:
- The application secret to use
consumer_key:
required: true
description:
- The consumer key to use
timeout:
required: false
default: 120
description:
- The timeout in seconds used to wait for a task to be
completed.
'''
EXAMPLES = '''
# Adds or modifies the backend '212.1.1.1' on a
# loadbalancing 'ip-1.1.1.1'
- ovh_ip_loadbalancing:
name: ip-1.1.1.1
backend: 212.1.1.1
state: present
probe: none
weight: 8
endpoint: ovh-eu
application_key: yourkey
application_secret: yoursecret
consumer_key: yourconsumerkey
# Removes a backend '212.1.1.1' from a loadbalancing 'ip-1.1.1.1'
- ovh_ip_loadbalancing:
name: ip-1.1.1.1
backend: 212.1.1.1
state: absent
endpoint: ovh-eu
application_key: yourkey
application_secret: yoursecret
consumer_key: yourconsumerkey
'''
RETURN = '''
'''
import time
try:
import ovh
import ovh.exceptions
from ovh.exceptions import APIError
HAS_OVH = True
except ImportError:
HAS_OVH = False
def getOvhClient(ansibleModule):
endpoint = ansibleModule.params.get('endpoint')
application_key = ansibleModule.params.get('application_key')
application_secret = ansibleModule.params.get('application_secret')
consumer_key = ansibleModule.params.get('consumer_key')
return ovh.Client(
endpoint=endpoint,
application_key=application_key,
application_secret=application_secret,
consumer_key=consumer_key
)
def waitForNoTask(client, name, timeout):
currentTimeout = timeout
while len(client.get('/ip/loadBalancing/{0}/task'.format(name))) > 0:
time.sleep(1) # Delay for 1 sec
currentTimeout -= 1
if currentTimeout < 0:
return False
return True
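`waitForNoTask` is a concrete instance of the poll-until-timeout pattern. A generic, hedged sketch of that pattern (the name `wait_until` and its signature are ours, not part of the OVH module):

```python
import time

def wait_until(predicate, timeout, interval=1):
    """Poll predicate() until it is true or roughly `timeout` seconds pass.

    Like waitForNoTask above, returns False on timeout and True once
    the condition holds.
    """
    remaining = timeout
    while not predicate():
        time.sleep(interval)
        remaining -= interval
        if remaining < 0:
            return False
    return True

# A condition that already holds returns True without sleeping:
assert wait_until(lambda: True, timeout=0) is True
```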
def main():
module = AnsibleModule(
argument_spec=dict(
name=dict(required=True),
backend=dict(required=True),
weight=dict(default=8, type='int'),
probe=dict(default='none',
choices=['none', 'http', 'icmp', 'oco']),
state=dict(default='present', choices=['present', 'absent']),
endpoint=dict(required=True),
application_key=dict(required=True, no_log=True),
application_secret=dict(required=True, no_log=True),
consumer_key=dict(required=True, no_log=True),
timeout=dict(default=120, type='int')
)
)
if not HAS_OVH:
        module.fail_json(msg='ovh-api python module '
'is required to run this module ')
# Get parameters
name = module.params.get('name')
state = module.params.get('state')
backend = module.params.get('backend')
weight = long(module.params.get('weight'))
probe = module.params.get('probe')
timeout = module.params.get('timeout')
# Connect to OVH API
client = getOvhClient(module)
# Check that the load balancing exists
try:
loadBalancings = client.get('/ip/loadBalancing')
except APIError as apiError:
module.fail_json(
msg='Unable to call OVH api for getting the list of loadBalancing, '
'check application key, secret, consumerkey and parameters. '
'Error returned by OVH api was : {0}'.format(apiError))
if name not in loadBalancings:
module.fail_json(msg='IP LoadBalancing {0} does not exist'.format(name))
# Check that no task is pending before going on
try:
if not waitForNoTask(client, name, timeout):
module.fail_json(
msg='Timeout of {0} seconds while waiting for no pending '
'tasks before executing the module '.format(timeout))
except APIError as apiError:
module.fail_json(
msg='Unable to call OVH api for getting the list of pending tasks '
'of the loadBalancing, check application key, secret, consumerkey '
'and parameters. Error returned by OVH api was : {0}'
.format(apiError))
try:
backends = client.get('/ip/loadBalancing/{0}/backend'.format(name))
except APIError as apiError:
module.fail_json(
msg='Unable to call OVH api for getting the list of backends '
'of the loadBalancing, check application key, secret, consumerkey '
'and parameters. Error returned by OVH api was : {0}'
.format(apiError))
backendExists = backend in backends
moduleChanged = False
if state == "absent":
if backendExists:
# Remove backend
try:
client.delete(
'/ip/loadBalancing/{0}/backend/{1}'.format(name, backend))
if not waitForNoTask(client, name, timeout):
module.fail_json(
msg='Timeout of {0} seconds while waiting for completion '
'of removing backend task'.format(timeout))
except APIError as apiError:
module.fail_json(
msg='Unable to call OVH api for deleting the backend, '
'check application key, secret, consumerkey and '
'parameters. Error returned by OVH api was : {0}'
.format(apiError))
moduleChanged = True
else:
if backendExists:
# Get properties
try:
backendProperties = client.get(
'/ip/loadBalancing/{0}/backend/{1}'.format(name, backend))
except APIError as apiError:
module.fail_json(
            msg='Unable to call OVH api for getting the
newvem/pytz | pytz/zoneinfo/US/Eastern.py | Python | mit | 9,981 | 0.213105
'''tzinfo timezone information for US/Eastern.'''
from pytz.tzinfo import DstTzInfo
from pytz.tzinfo import memorized_datetime as d
from pytz.tzinfo import memorized_ttinfo as i
class Eastern(DstTzInfo):
'''US/Eastern timezone definition. See datetime.tzinfo for details'''
zone = 'US/Eastern'
_utc_transition_times = [
d(1,1,1,0,0,0),
d(1918,3,31,7,0,0),
d(1918,10,27,6,0,0),
d(1919,3,30,7,0,0),
d(1919,10,26,6,0,0),
d(1920,3,28,7,0,0),
d(1920,10,31,6,0,0),
d(1921,4,24,7,0,0),
d(1921,9,25,6,0,0),
d(1922,4,30,7,0,0),
d(1922,9,24,6,0,0),
d(1923,4,29,7,0,0),
d(1923,9,30,6,0,0),
d(1924,4,27,7,0,0),
d(1924,9,28,6,0,0),
d(1925,4,26,7,0,0),
d(1925,9,27,6,0,0),
d(1926,4,25,7,0,0),
d(1926,9,26,6,0,0),
d(1927,4,24,7,0,0),
d(1927,9,25,6,0,0),
d(1928,4,29,7,0,0),
d(1928,9,30,6,0,0),
d(1929,4,28,7,0,0),
d(1929,9,29,6,0,0),
d(1930,4,27,7,0,0),
d(1930,9,28,6,0,0),
d(1931,4,26,7,0,0),
d(1931,9,27,6,0,0),
d(1932,4,24,7,0,0),
d(1932,9,25,6,0,0),
d(1933,4,30,7,0,0),
d(1933,9,24,6,0,0),
d(1934,4,29,7,0,0),
d(1934,9,30,6,0,0),
d(1935,4,28,7,0,0),
d(1935,9,29,6,0,0),
d(1936,4,26,7,0,0),
d(1936,9,27,6,0,0),
d(1937,4,25,7,0,0),
d(1937,9,26,6,0,0),
d(1938,4,24,7,0,0),
d(1938,9,25,6,0,0),
d(1939,4,30,7,0,0),
d(1939,9,24,6,0,0),
d(1940,4,28,7,0,0),
d(1940,9,29,6,0,0),
d(1941,4,27,7,0,0),
d(1941,9,28,6,0,0),
d(1942,2,9,7,0,0),
d(1945,8,14,23,0,0),
d(1945,9,30,6,0,0),
d(1946,4,28,7,0,0),
d(1946,9,29,6,0,0),
d(1947,4,27,7,0,0),
d(1947,9,28,6,0,0),
d(1948,4,25,7,0,0),
d(1948,9,26,6,0,0),
d(1949,4,24,7,0,0),
d(1949,9,25,6,0,0),
d(1950,4,30,7,0,0),
d(1950,9,24,6,0,0),
d(1951,4,29,7,0,0),
d(1951,9,30,6,0,0),
d(1952,4,27,7,0,0),
d(1952,9,28,6,0,0),
d(1953,4,26,7,0,0),
d(1953,9,27,6,0,0),
d(1954,4,25,7,0,0),
d(1954,9,26,6,0,0),
d(1955,4,24,7,0,0),
d(1955,10,30,6,0,0),
d(1956,4,29,7,0,0),
d(1956,10,28,6,0,0),
d(1957,4,28,7,0,0),
d(1957,10,27,6,0,0),
d(1958,4,27,7,0,0),
d(1958,10,26,6,0,0),
d(1959,4,26,7,0,0),
d(1959,10,25,6,0,0),
d(1960,4,24,7,0,0),
d(1960,10,30,6,0,0),
d(1961,4,30,7,0,0),
d(1961,10,29,6,0,0),
d(1962,4,29,7,0,0),
d(1962,10,28,6,0,0),
d(1963,4,28,7,0,0),
d(1963,10,27,6,0,0),
d(1964,4,26,7,0,0),
d(1964,10,25,6,0,0),
d(1965,4,25,7,0,0),
d(1965,10,31,6,0,0),
d(1966,4,24,7,0,0),
d(1966,10,30,6,0,0),
d(1967,4,30,7,0,0),
d(1967,10,29,6,0,0),
d(1968,4,28,7,0,0),
d(1968,10,27,6,0,0),
d(1969,4,27,7,0,0),
d(1969,10,26,6,0,0),
d(1970,4,26,7,0,0),
d(1970,10,25,6,0,0),
d(1971,4,25,7,0,0),
d(1971,10,31,6,0,0),
d(1972,4,30,7,0,0),
d(1972,10,29,6,0,0),
d(1973,4,29,7,0,0),
d(1973,10,28,6,0,0),
d(1974,1,6,7,0,0),
d(1974,10,27,6,0,0),
d(1975,2,23,7,0,0),
d(1975,10,26,6,0,0),
d(1976,4,25,7,0,0),
d(1976,10,31,6,0,0),
d(1977,4,24,7,0,0),
d(1977,10,30,6,0,0),
d(1978,4,30,7,0,0),
d(1978,10,29,6,0,0),
d(1979,4,29,7,0,0),
d(1979,10,28,6,0,0),
d(1980,4,27,7,0,0),
d(1980,10,26,6,0,0),
d(1981,4,26,7,0,0),
d(1981,10,25,6,0,0),
d(1982,4,25,7,0,0),
d(1982,10,31,6,0,0),
d(1983,4,24,7,0,0),
d(1983,10,30,6,0,0),
d(1984,4,29,7,0,0),
d(1984,10,28,6,0,0),
d(1985,4,28,7,0,0),
d(1985,10,27,6,0,0),
d(1986,4,27,7,0,0),
d(1986,10,26,6,0,0),
d(1987,4,5,7,0,0),
d(1987,10,25,6,0,0),
d(1988,4,3,7,0,0),
d(1988,10,30,6,0,0),
d(1989,4,2,7,0,0),
d(1989,10,29,6,0,0),
d(1990,4,1,7,0,0),
d(1990,10,28,6,0,0),
d(1991,4,7,7,0,0),
d(1991,10,27,6,0,0),
d(1992,4,5,7,0,0),
d(1992,10,25,6,0,0),
d(1993,4,4,7,0,0),
d(1993,10,31,6,0,0),
d(1994,4,3,7,0,0),
d(1994,10,30,6,0,0),
d(1995,4,2,7,0,0),
d(1995,10,29,6,0,0),
d(1996,4,7,7,0,0),
d(1996,10,27,6,0,0),
d(1997,4,6,7,0,0),
d(1997,10,26,6,0,0),
d(1998,4,5,7,0,0),
d(1998,10,25,6,0,0),
d(1999,4,4,7,0,0),
d(1999,10,31,6,0,0),
d(2000,4,2,7,0,0),
d(2000,10,29,6,0,0),
d(2001,4,1,7,0,0),
d(2001,10,28,6,0,0),
d(2002,4,7,7,0,0),
d(2002,10,27,6,0,0),
d(2003,4,6,7,0,0),
d(2003,10,26,6,0,0),
d(2004,4,4,7,0,0),
d(2004,10,31,6,0,0),
d(2005,4,3,7,0,0),
d(2005,10,30,6,0,0),
d(2006,4,2,7,0,0),
d(2006,10,29,6,0,0),
d(2007,3,11,7,0,0),
d(2007,11,4,6,0,0),
d(2008,3,9,7,0,0),
d(2008,11,2,6,0,0),
d(2009,3,8,7,0,0),
d(2009,11,1,6,0,0),
d(2010,3,14,7,0,0),
d(2010,11,7,6,0,0),
d(2011,3,13,7,0,0),
d(2011,11,6,6,0,0),
d(2012,3,11,7,0,0),
d(2012,11,4,6,0,0),
d(2013,3,10,7,0,0),
d(2013,11,3,6,0,0),
d(2014,3,9,7,0,0),
d(2014,11,2,6,0,0),
d(2015,3,8,7,0,0),
d(2015,11,1,6,0,0),
d(2016,3,13,7,0,0),
d(2016,11,6,6,0,0),
d(2017,3,12,7,0,0),
d(2017,11,5,6,0,0),
d(2018,3,11,7,0,0),
d(2018,11,4,6,0,0),
d(2019,3,10,7,0,0),
d(2019,11,3,6,0,0),
d(2020,3,8,7,0,0),
d(2020,11,1,6,0,0),
d(2021,3,14,7,0,0),
d(2021,11,7,6,0,0),
d(2022,3,13,7,0,0),
d(2022,11,6,6,0,0),
d(2023,3,12,7,0,0),
d(2023,11,5,6,0,0),
d(2024,3,10,7,0,0),
d(2024,11,3,6,0,0),
d(2025,3,9,7,0,0),
d(2025,11,2,6,0,0),
d(2026,3,8,7,0,0),
d(2026,11,1,6,0,0),
d(2027,3,14,7,0,0),
d(2027,11,7,6,0,0),
d(2028,3,12,7,0,0),
d(2028,11,5,6,0,0),
d(2029,3,11,7,0,0),
d(2029,11,4,6,0,0),
d(2030,3,10,7,0,0),
d(2030,11,3,6,0,0),
d(2031,3,9,7,0,0),
d(2031,11,2,6,0,0),
d(2032,3,14,7,0,0),
d(2032,11,7,6,0,0),
d(2033,3,13,7,0,0),
d(2033,11,6,6,0,0),
d(2034,3,12,7,0,0),
d(2034,11,5,6,0,0),
d(2035,3,11,7,0,0),
d(2035,11,4,6,0,0),
d(2036,3,9,7,0,0),
d(2036,11,2,6,0,0),
d(2037,3,8,7,0,0),
d(2037,11,1,6,0,0),
]
_transition_info = [
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EWT'),
i(-14400,3600,'EPT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
i(-14400,3600,'EDT'),
i(-18000,0,'EST'),
obtitus/py-boinc-plotter | pyBoincPlotter/__init__.py | Python | gpl-3.0 | 914 | 0
#!/usr/bin/env python
# This file is part of the py-boinc-plotter,
# which provides parsing and plotting of boinc statistics and
# badge information.
# Copyright (C) 2013 obtitus@gmail.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# | END LICENCE
import boinc
import changePrefs
import boinccmd
import browser
import plot
NendoTaka/CodeForReference | CodeWars/7kyu/divisibleBySeven.py | Python | mit | 262 | 0.003817
def helper(m):
if m < 10:
return (m, 0)
count = 1
k = m//10
k -= 2 * (m % 10)
    if k > 100:
a = helper(k)
count += a[1]
k = a[0]
    return (k, count)
def seven(m):
return helper(m)
# your code
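The recursive `helper` above applies the classic divisibility-by-7 shortcut: drop the last digit and subtract twice it, repeating while three or more digits remain. An iterative sketch of the same rule (our own rewrite for illustration; it may take one extra step on boundary values such as 100):

```python
def seven_rule(m):
    """Repeatedly replace m with (m // 10) - 2 * (m % 10) while m >= 100.

    Returns (final_value, steps); each step preserves divisibility by 7,
    so m is divisible by 7 exactly when the final value is.
    """
    steps = 0
    while m >= 100:
        m = m // 10 - 2 * (m % 10)
        steps += 1
    return m, steps

print(seven_rule(1603))  # (7, 2): 1603 -> 154 -> 7
```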
whyflyru/django-seo | djangoseo/utils.py | Python | bsd-3-clause | 7,375 | 0.00122
# -*- coding: utf-8 -*-
import logging
import re
import importlib
import django
import six
from django.contrib.sites.shortcuts import get_current_site
from django.utils.functional import lazy
from django.utils.safestring import mark_safe
from django.utils.module_loading import import_string
from django.utils.html import conditional_escape
from django.conf import settings
from django.db import models
from django.db.models import Q
from django.urls import (URLResolver as RegexURLResolver, URLPattern as RegexURLPattern, Resolver404, get_resolver,
clear_url_caches)
logger = logging.getLogger(__name__)
class NotSet(object):
""" A singleton to identify unset values (where None would have meaning) """
def __str__(self):
return "NotSet"
def __repr__(self):
return self.__str__()
NotSet = NotSet()
class Literal(object):
""" Wrap literal values so that the system knows to treat them that way """
def __init__(self, value):
self.value = value
def _pattern_resolve_to_name(pattern, path):
if django.VERSION < (2, 0):
match = pattern.regex.search(path)
else:
match = pattern.pattern.regex.search(path)
if match:
name = ""
if pattern.name:
name = pattern.name
elif hasattr(pattern, '_callback_str'):
name = pattern._callback_str
else:
name = "%s.%s" % (pattern.callback.__module__, pattern.callback.func_name)
return name
def _resolver_resolve_to_name(resolver, path):
tried = []
django1 = django.VERSION < (2, 0)
if django1:
match = resolver.regex.search(path)
else:
match = resolver.pattern.regex.search(path)
if match:
new_path = path[match.end():]
for pattern in resolver.url_patterns:
try:
if isinstance(pattern, RegexURLPattern):
name = _pattern_resolve_to_name(pattern, new_path)
elif isinstance(pattern, RegexURLResolver):
name = _resolver_resolve_to_name(pattern, new_path)
except Resolver404 as e:
if django1:
tried.extend([(pattern.regex.pattern + ' ' + t) for t in e.args[0]['tried']])
else:
tried.extend([(pattern.pattern.regex.pattern + ' ' + t) for t in e.args[0]['tried']])
else:
if name:
return name
if django1:
tried.append(pattern.regex.pattern)
else:
tried.append(pattern.pattern.regex.pattern)
raise Resolver404({'tried': tried, 'path': new_path})
def resolve_to_name(path, urlconf=None):
try:
return _resolver_resolve_to_name(get_resolver(urlconf), path)
except Resolver404:
return None
def _replace_quot(match):
unescape = lambda v: v.replace('"', '"').replace('&', '&')
return u'<%s%s>' % (unescape(match.group(1)), unescape(match.group(3)))
def escape_tags(value, valid_tags):
""" Strips text from the given html string, leaving only tags.
This functionality requires BeautifulSoup, nothing will be
done otherwise.
This isn't perfect. Someone could put javascript in here:
<a onClick="alert('hi');">test</a>
So if you use valid_tags, you still need to trust your data entry.
Or we could try:
- only escape the non matching bits
- use BeautifulSoup to understand the elements, escape everything
else and remove potentially harmful attributes (onClick).
- Remove this feature entirely. Half-escaping things securely is
very difficult, developers should not be lured into a false
sense of security.
"""
# 1. escape everything
value = conditional_escape(value)
# 2. Reenable certain tags
if valid_tags:
# TODO: precompile somewhere once?
tag_re = re.compile(r'<(\s*/?\s*(%s))(.*?\s*)>' %
u'|'.join(re.escape(tag) for tag in valid_tags))
value = tag_re.sub(_replace_quot, value)
# Allow comments to be hidden
value = value.replace("<!--", "<!--").replace("-->", "-->")
return mark_safe(value)
def _get_seo_content_types(seo_models):
""" Returns a list of content types from the models defined in settings
(SEO_MODELS)
"""
from django.contrib.contenttypes.models import ContentType
try:
return [ContentType.objects.get_for_model(m).id for m in seo_models]
except: # previously caught DatabaseError
# Return an empty list if this is called too early
return []
def get_seo_content_types(seo_models):
return lazy(_get_seo_content_types, list)(seo_models)
def _reload_urlconf():
"""
Reload Django URL configuration and clean caches
"""
    module = importlib.import_module(settings.ROOT_URLCONF)
if six.PY2:
reload(module)
else:
importlib.reload(module)
    clear_url_caches()
def register_model_in_admin(model, admin_class=None):
"""
Register model in Django admin interface
"""
from django.contrib import admin
admin.site.register(model, admin_class)
_reload_urlconf()
def create_dynamic_model(model_name, app_label='djangoseo', **attrs):
"""
Create dynamic Django model
"""
module_name = '%s.models' % app_label
default_attrs = {
'__module__': module_name,
'__dynamic__': True
}
attrs.update(default_attrs)
if six.PY2:
model_name = str(model_name)
return type(model_name, (models.Model,), attrs)
def import_tracked_models():
"""
Import models
"""
redirects_models = getattr(settings, 'SEO_TRACKED_MODELS', [])
models = []
for model_path in redirects_models:
try:
model = import_string(model_path)
models.append(model)
except ImportError as e:
logging.warning("Failed to import model from path '%s'" % model_path)
return models
def handle_seo_redirects(request):
"""
Handle SEO redirects. Create Redirect instance if exists redirect pattern.
:param request: Django request
"""
from .models import RedirectPattern, Redirect
if not getattr(settings, 'SEO_USE_REDIRECTS', False):
return
full_path = request.get_full_path()
current_site = get_current_site(request)
subdomain = getattr(request, 'subdomain', '')
redirect_patterns = RedirectPattern.objects.filter(
Q(site=current_site),
Q(subdomain=subdomain) | Q(all_subdomains=True)
).order_by('all_subdomains')
for redirect_pattern in redirect_patterns:
if re.match(redirect_pattern.url_pattern, full_path):
kwargs = {
'site': current_site,
'old_path': full_path,
'new_path': redirect_pattern.redirect_path,
'subdomain': redirect_pattern.subdomain,
'all_subdomains': redirect_pattern.all_subdomains
}
try:
Redirect.objects.get_or_create(**kwargs)
except Exception:
logger.warning('Failed to create redirection', exc_info=True, extra=kwargs)
break
kiwix/gutenberg | gutenbergtozim/urls.py | Python | gpl-3.0 | 7,238 | 0.000829
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# vim: ai ts=4 sts=4 et sw=4 nu
from __future__ import (unicode_literals, absolute_import,
division, print_function)
import os
import platform
from collections import defaultdict
from gutenbergtozim.database import Book, BookFormat, Url
from gutenbergtozim.utils import FORMAT_MATRIX, exec_cmd
from gutenbergtozim import logger
try:
import urlparse
except ImportError:
import urllib.parse as urlparse
from playhouse.csv_loader import load_csv
class UrlBuilder:
"""
Url builder for the files of a Gutenberg book.
Example:
>>> builder = UrlBuilder()
>>> builder.with_id(<some_id>)
>>> builder.with_base(UrlBuilder.BASE_{ONE|TWO|THREE})
>>> url = builder.build()
"""
SERVER_NAME = "aleph_gutenberg_org"
RSYNC = "rsync://aleph.gutenberg.org/gutenberg/"
BASE_ONE = 'http://aleph.gutenberg.org/'
BASE_TWO = 'http://aleph.gutenberg.org/cache/epub/'
BASE_THREE = 'http://aleph.gutenberg.org/etext'
def __init__(self):
self.base = self.BASE_ONE
def build(self):
"""
        Build a URL whose directory layout depends on whether the base
        url is `BASE_ONE` or `BASE_TWO`.
The former generates urls according to the Url pattern:
id: 10023 -> pattern: <base-url>/1/0/0/2/10023
The latter generates urls according to the Url pattern:
id: 10023 -> pattern: <base-url>/10023
There's no implementation for the book Id's 0-10, because
these books do not exist.
"""
if int(self.b_id) > 10:
if self.base == self.BASE_ONE:
base_url = os.path.join(
os.path.join(*list(str(self.b_id))[:-1]), str(self.b_id))
url = os.path.join(self.base, base_url)
elif self.base == self.BASE_TWO:
url = os.path.join(self.base, str(self.b_id))
elif self.base == self.BASE_THREE:
url = self.base
else:
logger.warning('Figuring out the url of books \
with an ID of {ID <= 10} is not implemented')
return None
        return url
def with_base(self, base):
self.base = base
def with_id(self, b_id):
self.b_id = b_id
def __unicode__(self):
        return self.build()
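The `BASE_ONE` layout that `build()` documents — every digit of the id except the last becomes a nested directory — can be reproduced in isolation. A minimal sketch using `posixpath` so the separator is always `/` (like the original, it only handles ids above 10):

```python
import posixpath

def gutenberg_subpath(book_id):
    # e.g. 10023 -> "1/0/0/2/10023": leading digits form nested directories.
    digits = list(str(book_id))
    return posixpath.join(posixpath.join(*digits[:-1]), str(book_id))

print(gutenberg_subpath(10023))  # 1/0/0/2/10023
```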
def get_urls(book):
"""
Get all possible urls that could point to the
book on either of the two mirrors.
param: book: The book you want the possible urls from
returns: a list of all possible urls sorted by their probability
"""
filtered_book = [bf.format for bf in
BookFormat.select().where(BookFormat.book == book)]
# Strip out the encoding of the file
def f(x):
return x.mime.split(';')[0].strip()
available_formats = [
{x.pattern.format(id=book.id): {'mime': f(x), 'id': book.id}}
for x in filtered_book
if f(x) in FORMAT_MATRIX.values()]
files = sort_by_mime_type(available_formats)
return build_urls(files)
def sort_by_mime_type(files):
"""
Reverse the passed in `files` dict and return a dict
that is sorted by `{mimetype: {filetype, id}}` instead of
by `{filetype: mimetype}`.
"""
mime = defaultdict(list)
for f in files:
for k, v in f.items():
mime[v['mime']].append({'name': k, 'id': v['id']})
return dict(mime)
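To make the reshaping concrete, here is a self-contained restatement of `sort_by_mime_type` with hypothetical input (the filenames and ids are invented):

```python
from collections import defaultdict

def sort_by_mime_type(files):
    # Same grouping as the function above: {filename: {mime, id}} entries
    # are regrouped into {mime: [{name, id}, ...]}.
    mime = defaultdict(list)
    for f in files:
        for name, v in f.items():
            mime[v['mime']].append({'name': name, 'id': v['id']})
    return dict(mime)

files = [
    {'9.epub': {'mime': 'application/epub+zip', 'id': 9}},
    {'9.pdf': {'mime': 'application/pdf', 'id': 9}},
]
print(sort_by_mime_type(files)['application/pdf'])  # [{'name': '9.pdf', 'id': 9}]
```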
def build_urls(files):
mapping = {
'application/epub+zip': build_epub,
'application/pdf': build_pdf,
'text/html': build_html
}
for i in mapping:
if i in files:
possible_url = mapping[i](files[i])
filtre = [u for u in possible_url if Url.get_or_none(url=urlparse.urlparse(u).path[1:])]
if len(filtre) == 0 and len(possible_url) != 0:
files[i] = possible_url
else:
files[i] = filtre
return files
def index_of_substring(lst, substrings):
for i, s in enumerate(lst):
for substring in substrings:
if substring in s:
return i
return -1
def build_epub(files):
"""
Build the posssible urls of the epub file.
"""
urls = []
b_id = str(files[0]['id'])
u = UrlBuilder()
u.with_id(b_id)
u.with_base(UrlBuilder.BASE_TWO)
if not u.build():
return []
name = ''.join(['pg', b_id])
url = os.path.join(u.build(), name + '.epub')
urls.append(url)
return urls
def build_pdf(files):
"""
Build the posssible urls of the pdf files.
"""
urls = []
b_id = str(files[0]['id'])
u = UrlBuilder()
u.with_base(UrlBuilder.BASE_TWO)
u.with_id(b_id)
u1 = UrlBuilder()
u1.with_base(UrlBuilder.BASE_ONE)
u1.with_id(b_id)
if not u.build():
return []
for i in files:
if 'images' not in i['name']:
url = os.path.join(u.build(), i['name'])
urls.append(url)
url_dash1 = os.path.join(u1.build(), b_id + '-' + 'pdf' + '.pdf')
url_dash = os.path.join(u.build(), b_id + '-' + 'pdf' + '.pdf')
url_normal = os.path.join(u.build(), b_id + '.pdf')
url_pg = os.path.join(u.build(), 'pg' + b_id + '.pdf')
urls.extend([url_dash, url_normal, url_pg, url_dash1])
return list(set(urls))
def build_html(files):
"""
Build the posssible urls of the html files.
"""
urls = []
b_id = str(files[0]['id'])
file_names = [i['name'] for i in files]
u = UrlBuilder()
u.with_id(b_id)
if not u.build():
return []
if all(['-h.html' not in file_names, '-h.zip' in file_names]):
for i in files:
url = os.path.join(u.build(), i['name'])
urls.append(url)
url_zip = os.path.join(u.build(), b_id + '-h' + '.zip')
# url_utf8 = os.path.join(u.build(), b_id + '-8' + '.zip')
url_html = os.path.join(u.build(), b_id + '-h' + '.html')
url_htm = os.path.join(u.build(), b_id + '-h' + '.htm')
u.with_base(UrlBuilder.BASE_TWO)
name = ''.join(['pg', b_id])
html_utf8 = os.path.join(u.build(), name + '.html.utf8')
u.with_base(UrlBuilder.BASE_THREE)
file_index = index_of_substring(files, ['html', 'htm'])
file_name = files[file_index]['name']
etext_nums = []
etext_nums.extend(range(90, 100))
etext_nums.extend(range(0, 6))
etext_names = ["{0:0=2d}".format(i) for i in etext_nums]
etext_urls = []
for i in etext_names:
etext_urls.append(os.path.join(u.build() + i, file_name))
urls.extend([url_zip, url_htm, url_html, html_utf8])
urls.extend(etext_urls)
return list(set(urls))
def setup_urls():
file_with_url = os.path.join("tmp", "file_on_{}".format(UrlBuilder.SERVER_NAME))
cmd = ["bash", "-c", "rsync -a --list-only {} > {}".format(UrlBuilder.RSYNC, file_with_url)]
exec_cmd(cmd)
in_place_opt = ["-i", ".bak"] if platform.system() == "Darwin" else ["-i"]
cmd = ["sed"] + in_place_opt + [r"s#.* \(.*\)$#\\1#", file_with_url]
exec_cmd(cmd)
field_names = ['url']
load_csv(Url, file_with_url, field_names=field_names)
if __name__ == '__main__':
book = Book.get(id=9)
print(get_urls(book))
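The PDF variants in `build_pdf` above are just filename patterns built around the book id. A minimal sketch of that enumeration (the base URL and helper name are illustrative assumptions; the real code goes through `UrlBuilder`):

```python
def candidate_pdf_urls(base, book_id):
    """Illustrative only: the name variants tried by build_pdf above."""
    b = str(book_id)
    names = [b + "-pdf.pdf", b + ".pdf", "pg" + b + ".pdf"]
    # build_pdf uses list(set(...)); sorting here just makes the order stable
    return sorted(base.rstrip("/") + "/" + n for n in names)

urls = candidate_pdf_urls("http://example.org/files/9", 9)
# urls == ['http://example.org/files/9/9-pdf.pdf',
#          'http://example.org/files/9/9.pdf',
#          'http://example.org/files/9/pg9.pdf']
```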
|
CyberVines/Universal-Quantum-Cymatics | Record_RX.py | Python | mit | 6,310 | 0.004279 | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# SPDX-License-Identifier: GPL-3.0
#
# GNU Radio Python Flow Graph
# Title: Record_RX
# Author: Justin Ried
# GNU Radio version: 3.8.1.0
from distutils.version import StrictVersion
if __name__ == '__main__':
import ctypes
import sys
if sys.platform.startswith('linux'):
try:
x11 = ctypes.cdll.LoadLibrary('libX11.so')
x11.XInitThreads()
        except:
print("Warning: failed to XInitThreads()")
from PyQt5 import Qt
from gnuradio import qtgui
from gnuradio.filter import firdes
import sip
from gnuradio import blocks
from gnuradio import gr
import sys
import signal
from argparse import ArgumentParser
from gnuradio.eng_arg import eng_float, intx
from gnuradio import eng_notation
import osmosdr
import time
from gnuradio import qtgui
class Record_RX(gr.top_block, Qt.QWidget):
def __init__(self):
gr.top_block.__init__(self, "Record_RX")
Qt.QWidget.__init__(self)
self.setWindowTitle("Record_RX")
qtgui.util.check_set_qss()
try:
self.setWindowIcon(Qt.QIcon.fromTheme('gnuradio-grc'))
except:
pass
self.top_scroll_layout = Qt.QVBoxLayout()
self.setLayout(self.top_scroll_layout)
self.top_scroll = Qt.QScrollArea()
self.top_scroll.setFrameStyle(Qt.QFrame.NoFrame)
self.top_scroll_layout.addWidget(self.top_scroll)
self.top_scroll.setWidgetResizable(True)
self.top_widget = Qt.QWidget()
self.top_scroll.setWidget(self.top_widget)
self.top_layout = Qt.QVBoxLayout(self.top_widget)
self.top_grid_layout = Qt.QGridLayout()
self.top_layout.addLayout(self.top_grid_layout)
self.settings = Qt.QSettings("GNU Radio", "Record_RX")
try:
if StrictVersion(Qt.qVersion()) < StrictVersion("5.0.0"):
self.restoreGeometry(self.settings.value("geometry").toByteArray())
else:
self.restoreGeometry(self.settings.value("geometry"))
except:
pass
##################################################
# Variables
##################################################
self.samp_rate = samp_rate = 2e6
##################################################
# Blocks
##################################################
self.qtgui_freq_sink_x_0 = qtgui.freq_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1
)
self.qtgui_freq_sink_x_0.set_update_time(0.10)
self.qtgui_freq_sink_x_0.set_y_axis(-140, 10)
self.qtgui_freq_sink_x_0.set_y_label('Relative Gain', 'dB')
self.qtgui_freq_sink_x_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, 0.0, 0, "")
self.qtgui_freq_sink_x_0.enable_autoscale(False)
self.qtgui_freq_sink_x_0.enable_grid(False)
self.qtgui_freq_sink_x_0.set_fft_average(1.0)
self.qtgui_freq_sink_x_0.enable_axis_labels(True)
self.qtgui_freq_sink_x_0.enable_control_panel(False)
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "dark blue"]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in range(1):
if len(labels[i]) == 0:
self.qtgui_freq_sink_x_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_freq_sink_x_0.set_line_label(i, labels[i])
self.qtgui_freq_sink_x_0.set_line_width(i, widths[i])
self.qtgui_freq_sink_x_0.set_line_color(i, colors[i])
self.qtgui_freq_sink_x_0.set_line_alpha(i, alphas[i])
self._qtgui_freq_sink_x_0_win = sip.wrapinstance(self.qtgui_freq_sink_x_0.pyqwidget(), Qt.QWidget)
self.top_grid_layout.addWidget(self._qtgui_freq_sink_x_0_win)
self.osmosdr_source_0 = osmosdr.source(
args="numchan=" + str(1) + " " + ''
)
self.osmosdr_source_0.set_sample_rate(samp_rate)
self.osmosdr_source_0.set_center_freq(462725000, 0)
self.osmosdr_source_0.set_freq_corr(0, 0)
self.osmosdr_source_0.set_gain(10, 0)
self.osmosdr_source_0.set_if_gain(25, 0)
self.osmosdr_source_0.set_bb_gain(16, 0)
self.osmosdr_source_0.set_antenna('', 0)
self.osmosdr_source_0.set_bandwidth(0, 0)
self.blocks_file_sink_0 = blocks.file_sink(gr.sizeof_gr_complex*1, '/root/Desktop/CV', False)
self.blocks_file_sink_0.set_unbuffered(False)
##################################################
# Connections
##################################################
self.connect((self.osmosdr_source_0, 0), (self.blocks_file_sink_0, 0))
self.connect((self.osmosdr_source_0, 0), (self.qtgui_freq_sink_x_0, 0))
def closeEvent(self, event):
self.settings = Qt.QSettings("GNU Radio", "Record_RX")
self.settings.setValue("geometry", self.saveGeometry())
event.accept()
def get_samp_rate(self):
return self.samp_rate
def set_samp_rate(self, samp_rate):
self.samp_rate = samp_rate
self.osmosdr_source_0.set_sample_rate(self.samp_rate)
self.qtgui_freq_sink_x_0.set_frequency_range(0, self.samp_rate)
def main(top_block_cls=Record_RX, options=None):
if StrictVersion("4.5.0") <= StrictVersion(Qt.qVersion()) < StrictVersion("5.0.0"):
style = gr.prefs().get_string('qtgui', 'style', 'raster')
Qt.QApplication.setGraphicsSystem(style)
qapp = Qt.QApplication(sys.argv)
tb = top_block_cls()
tb.start()
tb.show()
def sig_handler(sig=None, frame=None):
Qt.QApplication.quit()
signal.signal(signal.SIGINT, sig_handler)
signal.signal(signal.SIGTERM, sig_handler)
timer = Qt.QTimer()
timer.start(500)
timer.timeout.connect(lambda: None)
def quitting():
tb.stop()
tb.wait()
qapp.aboutToQuit.connect(quitting)
qapp.exec_()
if __name__ == '__main__':
main()
|
davy39/eric | Preferences/Ui_ShortcutDialog.py | Python | gpl-3.0 | 4,515 | 0.003544 | # -*- coding: utf-8 -*-
# Form implementation generated from reading ui file './Preferences/ShortcutDialog.ui'
#
# Created: Tue Nov 18 17:53:56 2014
# by: PyQt5 UI code generator 5.3.2
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore, QtGui, QtWidgets
class Ui_ShortcutDialog(object):
def setupUi(self, ShortcutDialog):
ShortcutDialog.setObjectName("ShortcutDialog")
ShortcutDialog.resize(539, 134)
self.vboxlayout = QtWidgets.QVBoxLayout(ShortcutDialog)
        self.vboxlayout.setObjectName("vboxlayout")
self.shortcutsGroup = QtWidgets.QGroupBox(ShortcutDialog)
self.shortcutsGroup.setTitle("")
self.shortcutsGroup.setObjectName("shortcutsGroup")
self.gridLayout = QtWidgets.QGridLayout(self.shortcutsGroup)
        self.gridLayout.setObjectName("gridLayout")
self.primaryButton = QtWidgets.QRadioButton(self.shortcutsGroup)
self.primaryButton.setFocusPolicy(QtCore.Qt.NoFocus)
self.primaryButton.setChecked(True)
self.primaryButton.setObjectName("primaryButton")
self.gridLayout.addWidget(self.primaryButton, 0, 0, 1, 1)
self.primaryClearButton = QtWidgets.QPushButton(self.shortcutsGroup)
self.primaryClearButton.setFocusPolicy(QtCore.Qt.NoFocus)
self.primaryClearButton.setObjectName("primaryClearButton")
self.gridLayout.addWidget(self.primaryClearButton, 0, 1, 1, 1)
self.keyEdit = QtWidgets.QLineEdit(self.shortcutsGroup)
self.keyEdit.setReadOnly(True)
self.keyEdit.setObjectName("keyEdit")
self.gridLayout.addWidget(self.keyEdit, 0, 2, 1, 1)
self.alternateButton = QtWidgets.QRadioButton(self.shortcutsGroup)
self.alternateButton.setFocusPolicy(QtCore.Qt.NoFocus)
self.alternateButton.setObjectName("alternateButton")
self.gridLayout.addWidget(self.alternateButton, 1, 0, 1, 1)
self.alternateClearButton = QtWidgets.QPushButton(self.shortcutsGroup)
self.alternateClearButton.setEnabled(False)
self.alternateClearButton.setFocusPolicy(QtCore.Qt.NoFocus)
self.alternateClearButton.setObjectName("alternateClearButton")
self.gridLayout.addWidget(self.alternateClearButton, 1, 1, 1, 1)
self.alternateKeyEdit = QtWidgets.QLineEdit(self.shortcutsGroup)
self.alternateKeyEdit.setReadOnly(True)
self.alternateKeyEdit.setObjectName("alternateKeyEdit")
self.gridLayout.addWidget(self.alternateKeyEdit, 1, 2, 1, 1)
self.vboxlayout.addWidget(self.shortcutsGroup)
self.buttonBox = QtWidgets.QDialogButtonBox(ShortcutDialog)
self.buttonBox.setOrientation(QtCore.Qt.Horizontal)
self.buttonBox.setStandardButtons(QtWidgets.QDialogButtonBox.Cancel|QtWidgets.QDialogButtonBox.Ok)
self.buttonBox.setObjectName("buttonBox")
self.vboxlayout.addWidget(self.buttonBox)
self.retranslateUi(ShortcutDialog)
self.primaryButton.toggled['bool'].connect(self.primaryClearButton.setEnabled)
self.alternateButton.toggled['bool'].connect(self.alternateClearButton.setEnabled)
self.buttonBox.rejected.connect(ShortcutDialog.reject)
QtCore.QMetaObject.connectSlotsByName(ShortcutDialog)
ShortcutDialog.setTabOrder(self.keyEdit, self.alternateKeyEdit)
ShortcutDialog.setTabOrder(self.alternateKeyEdit, self.buttonBox)
def retranslateUi(self, ShortcutDialog):
_translate = QtCore.QCoreApplication.translate
ShortcutDialog.setWindowTitle(_translate("ShortcutDialog", "Edit Shortcut"))
ShortcutDialog.setWhatsThis(_translate("ShortcutDialog", "Press your shortcut keys and select OK"))
self.primaryButton.setToolTip(_translate("ShortcutDialog", "Select to change the primary keyboard shortcut"))
self.primaryButton.setText(_translate("ShortcutDialog", "Primary Shortcut:"))
self.primaryClearButton.setToolTip(_translate("ShortcutDialog", "Press to clear the key sequence buffer."))
self.primaryClearButton.setText(_translate("ShortcutDialog", "Clear"))
self.alternateButton.setToolTip(_translate("ShortcutDialog", "Select to change the alternative keyboard shortcut"))
self.alternateButton.setText(_translate("ShortcutDialog", "Alternative Shortcut:"))
self.alternateClearButton.setToolTip(_translate("ShortcutDialog", "Press to clear the key sequence buffer."))
self.alternateClearButton.setText(_translate("ShortcutDialog", "Clear"))
|
odoousers2014/LibrERP | base_address_contacts/res_partner_address.py | Python | agpl-3.0 | 10,827 | 0.00351 | # -*- coding: utf-8 -*-
##############################################################################
#
# Copyright (C) 2013 - TODAY Denero Team. (<http://www.deneroteam.com>)
# All Rights Reserved
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
from openerp.osv import orm, fields
from openerp import addons
from openerp.tools.translate import _
class res_partner_title(orm.Model):
_inherit = "res.partner.title"
_order = "sequence"
_columns = {
'sequence': fields.integer('Sequence'),
}
_defaults = {
'sequence': 1,
}
class res_contact_function(orm.Model):
_name = "res.contact.function"
_description = "Contact Function"
_order = "name"
_columns = {
'name': fields.char('Name', size=32),
}
class res_partner_address_contact(orm.Model):
_name = "res.partner.address.contact"
_description = "Address Contact"
def name_get(self, cr, uid, ids, context=None):
res = []
for rec in self.browse(cr, uid, ids, context=context):
if rec.title:
res.append((rec.id, rec.title.name + ' ' + rec.last_name + ' ' + (rec.first_name or '')))
else:
res.append((rec.id, rec.last_name + ' ' + (rec.first_name or '')))
return res
def _name_get_full(self, cr, uid, ids, prop, unknow_none, context=None):
result = {}
for rec in self.browse(cr, uid, ids, context=context):
if rec.title:
result[rec.id] = rec.title.name + ' ' + rec.last_name + ' ' + (rec.first_name or '')
else:
result[rec.id] = rec.last_name + ' ' + (rec.first_name or '')
return result
_columns = {
'complete_name': fields.function(_name_get_full, string='Name', size=64, type="char", store=False, select=True),
'name': fields.char('Name', size=64, ),
'last_name': fields.char('Last Name', size=64, required=True),
'first_name': fields.char('First Name', size=64),
'mobile': fields.char('Mobile', size=64),
'fisso': fields.char('Phone', size=64),
'title': fields.many2one('res.partner.title', 'Title', domain=[('domain', '=', 'contact')]),
'website': fields.char('Website', size=120),
'address_id': fields.many2one('res.partner.address', 'Address'),
'partner_id': fields.related(
'address_id', 'partner_id', type='many2one', relation='res.partner', string='Main Employer'),
'lang_id': fields.many2one('res.lang', 'Language'),
'country_id': fields.many2one('res.country', 'Nationality'),
'birthdate': fields.char('Birthdate', size=64),
'active': fields.boolean('Active', help="If the active field is set to False,\
it will allow you to hide the partner contact without removing it."),
'email': fields.char('E-Mail', size=240),
'comment': fields.text('Notes', translate=True),
'photo': fields.binary('Photo'),
'function': fields.char("Function", size=64),
'function_id': fields.many2one('res.contact.function', 'Function'),
}
def _get_photo(self, cr, uid, context=None):
photo_path = addons.get_module_resource('base_address_contacts', 'images', 'photo.png')
return open(photo_path, 'rb').read().encode('base64')
_defaults = {
'name': '/',
'photo': _get_photo,
'active': True,
'address_id': lambda self, cr, uid, context: context.get('address_id', False),
}
_order = "last_name"
def name_search(self, cr, uid, name='', args=None, operator='ilike', context=None, limit=None):
if not args:
args = []
if context is None:
context = {}
if name:
ids = self.search(
cr, uid, ['|', ('last_name', operator, name), ('first_name', operator, name)] + args, limit=limit,
context=context)
else:
ids = self.search(cr, uid, args, limit=limit, context=context)
return self.name_get(cr, uid, ids, context=context)
def create(self, cr, uid, vals, context=None):
if context is None:
context = {}
name = ''
update = False
if vals.get('last_name', False):
name += vals['last_name']
update = True
if vals.get('first_name', False):
name += ' ' + vals['first_name']
update = True
if update:
vals['name'] = name
return super(res_partner_address_contact, self).create(cr, uid, vals, context=context)
def write(self, cr, uid, ids, vals, context=None):
if context is None:
context = {}
name = ''
update = False
if vals.get('last_name', False):
name += vals['last_name']
update = True
if vals.get('first_name', False):
name += ' ' + vals['first_name']
update = True
if update:
vals['name'] = name
return super(res_partner_address_contact, self).write(cr, uid, ids, vals, context)
class res_partner_address(orm.Model):
_inherit = 'res.partner.address'
def get_full_name(self, cr, uid, ids, field_name, arg, context=None):
res = {}
for re in self.browse(cr, uid, ids, context=context):
addr = ''
if re.partner_id:
if re.partner_id.name != re.name:
addr = re.name or ''
if re.name and (re.city or re.country_id):
addr += ', '
addr += (re.city or '') + ', ' + (re.street or '')
if re.partner_id and context.get('contact_display', False) == 'partner_address':
addr = "%s: %s" % (re.partner_id.name, addr.strip())
else:
addr = addr.strip()
res[re.id] = addr or ''
return res
def name_get(self, cr, uid, ids, context=None):
if not len(ids):
return []
res = []
length = context.get('name_lenght', False) or 45
for record in self.browse(cr, uid, ids, context=context):
name = record.complete_name or record.name or ''
if len(name) > length:
name = name[:length] + '...'
res.append((record.id, name))
return res
def name_search(self, cr, user, name, args=None, operator='ilike', context=None, limit=100):
if not args:
args = []
if context is None:
context = {}
ids = []
name_array = name.split()
search_domain = []
for n in name_array:
search_domain.append('|')
search_domain.append(('name', operator, n))
search_domain.append(('complete_name', operator, n))
ids = self.search(cr, user, search_domain + args, limit=limit, context=context)
return self.name_get(cr, user, ids, context=context)
_columns = {
        'partner_id': fields.many2one('res.partner', 'Partner Name', ondelete='set null', select=True, help="Keep empty for a private address, not related to partner.", required=True),
'contact_ids': fields.one2many('res.partner.address.contact', 'address_id', 'Functions and Contacts'),
'mobile': fields.char('Mobile', size=64),
'pec': fields.char('PEC', size=64),
        'complete_name': fields.function(get_full_name, m
google-research/tf-slim | tf_slim/ops/framework_ops.py | Python | apache-2.0 | 2,660 | 0.002256 | # coding=utf-8
# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Classes and functions used to construct graphs."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# pylint: disable=g-direct-tensorflow-import
from tensorflow.python.framework import ops
__all__ = ['get_graph_from_inputs',
'get_name_scope']
def get_graph_from_inputs(op_input_list, graph=None):
"""Returns the appropriate graph to use for the given inputs.
1. If `graph` is provided, we validate that all inputs in `op_input_list` are
from the same graph.
2. Otherwise, we attempt to select a graph from the first Operation- or
Tensor-valued input in `op_input_list`, and validate that all other
such inputs are in the same graph.
  3. If the graph was not specified and it could not be inferred from
`op_input_list`, we attempt to use the default graph.
Args:
op_input_list: A list of inputs to an operation, which may include `Tensor`,
`Operation`, and other objects that may be converted to a graph element.
    graph: (Optional) The explicit graph to use.
Raises:
TypeError: If `op_input_list` is not a list or tuple, or if graph is not a
Graph.
ValueError: If a graph is explicitly passed and not all inputs are from it,
or if the inputs are from multiple graphs, or we could not find a graph
and there was no default graph.
Returns:
The appropriate graph to use for the given inputs.
"""
# pylint: disable=protected-access
return ops._get_graph_from_inputs(op_input_list, graph)
def get_name_scope():
"""Returns the current name scope of the default graph.
For example:
```python
with tf.name_scope('scope1'):
with tf.name_scope('scope2'):
print(tf.contrib.framework.get_name_scope())
```
would print the string `scope1/scope2`.
Returns:
A string representing the current name scope.
"""
return ops.get_default_graph().get_name_scope()
|
asnorkin/sentiment_analysis | site/lib/python2.7/site-packages/scipy/optimize/_basinhopping.py | Python | mit | 27,548 | 0.000399 | """
basinhopping: The basinhopping global optimization algorithm
"""
from __future__ import division, print_function, absolute_import
import numpy as np
from numpy import cos, sin
import scipy.optimize
import collections
from scipy._lib._util import check_random_state
__all__ = ['basinhopping']
class Storage(object):
"""
Class used to store the lowest energy structure
"""
def __init__(self, minres):
self._add(minres)
def _add(self, minres):
self.minres = minres
self.minres.x = np.copy(minres.x)
def update(self, minres):
if minres.fun < self.minres.fun:
self._add(minres)
return True
else:
return False
def get_lowest(self):
return self.minres
class BasinHoppingRunner(object):
"""This class implements the core of the basinhopping algorithm.
x0 : ndarray
The starting coordinates.
minimizer : callable
The local minimizer, with signature ``result = minimizer(x)``.
The return value is an `optimize.OptimizeResult` object.
step_taking : callable
This function displaces the coordinates randomly. Signature should
be ``x_new = step_taking(x)``. Note that `x` may be modified in-place.
accept_tests : list of callables
Each test is passed the kwargs `f_new`, `x_new`, `f_old` and
`x_old`. These tests will be used to judge whether or not to accept
the step. The acceptable return values are True, False, or ``"force
accept"``. If any of the tests return False then the step is rejected.
If the latter, then this will override any other tests in order to
accept the step. This can be used, for example, to forcefully escape
from a local minimum that ``basinhopping`` is trapped in.
disp : bool, optional
Display status messages.
"""
def __init__(self, x0, minimizer, step_taking, accept_tests, disp=False):
self.x = np.copy(x0)
self.minimizer = minimizer
self.step_taking = step_taking
self.accept_tests = accept_tests
self.disp = disp
self.nstep = 0
# initialize return object
self.res = scipy.optimize.OptimizeResult()
self.res.minimization_failures = 0
# do initial minimization
minres = minimizer(self.x)
if not minres.success:
self.res.minimization_failures += 1
if self.disp:
print("warning: basinhopping: local minimization failure")
self.x = np.copy(minres.x)
self.energy = minres.fun
if self.disp:
print("basinhopping step %d: f %g" % (self.nstep, self.energy))
# initialize storage class
self.storage = Storage(minres)
if hasattr(minres, "nfev"):
self.res.nfev = minres.nfev
if hasattr(minres, "njev"):
self.res.njev = minres.njev
if hasattr(minres, "nhev"):
self.res.nhev = minres.nhev
def _monte_carlo_step(self):
"""Do one monte carlo iteration
Randomly displace the coordinates, minimize, and decide whether
or not to accept the new coordinates.
"""
# Take a random step. Make a copy of x because the step_taking
# algorithm might change x in place
x_after_step = np.copy(self.x)
x_after_step = self.step_taking(x_after_step)
# do a local minimization
minres = self.minimizer(x_after_step)
x_after_quench = minres.x
energy_after_quench = minres.fun
        if not minres.success:
self.res.minimization_failures += 1
if self.disp:
print("warning: basinhopping: local minimization failure")
        if hasattr(minres, "nfev"):
self.res.nfev += minres.nfev
if hasattr(minres, "njev"):
self.res.njev += minres.njev
if hasattr(minres, "nhev"):
self.res.nhev += minres.nhev
        # Accept the move based on self.accept_tests. If any test is False,
        # then reject the step. If any test returns the special string
        # 'force accept', accept the step regardless. This can be used
# to forcefully escape from a local minimum if normal basin hopping
# steps are not sufficient.
accept = True
for test in self.accept_tests:
testres = test(f_new=energy_after_quench, x_new=x_after_quench,
f_old=self.energy, x_old=self.x)
if testres == 'force accept':
accept = True
break
elif not testres:
accept = False
# Report the result of the acceptance test to the take step class.
# This is for adaptive step taking
if hasattr(self.step_taking, "report"):
self.step_taking.report(accept, f_new=energy_after_quench,
x_new=x_after_quench, f_old=self.energy,
x_old=self.x)
return accept, minres
def one_cycle(self):
"""Do one cycle of the basinhopping algorithm
"""
self.nstep += 1
new_global_min = False
accept, minres = self._monte_carlo_step()
if accept:
self.energy = minres.fun
self.x = np.copy(minres.x)
new_global_min = self.storage.update(minres)
# print some information
if self.disp:
self.print_report(minres.fun, accept)
if new_global_min:
print("found new global minimum on step %d with function"
" value %g" % (self.nstep, self.energy))
# save some variables as BasinHoppingRunner attributes
self.xtrial = minres.x
self.energy_trial = minres.fun
self.accept = accept
return new_global_min
def print_report(self, energy_trial, accept):
"""print a status update"""
minres = self.storage.get_lowest()
print("basinhopping step %d: f %g trial_f %g accepted %d "
" lowest_f %g" % (self.nstep, self.energy, energy_trial,
accept, minres.fun))
class AdaptiveStepsize(object):
"""
Class to implement adaptive stepsize.
This class wraps the step taking class and modifies the stepsize to
ensure the true acceptance rate is as close as possible to the target.
Parameters
----------
takestep : callable
The step taking routine. Must contain modifiable attribute
takestep.stepsize
accept_rate : float, optional
The target step acceptance rate
interval : int, optional
Interval for how often to update the stepsize
factor : float, optional
The step size is multiplied or divided by this factor upon each
update.
verbose : bool, optional
Print information about each update
"""
def __init__(self, takestep, accept_rate=0.5, interval=50, factor=0.9,
verbose=True):
self.takestep = takestep
self.target_accept_rate = accept_rate
self.interval = interval
self.factor = factor
self.verbose = verbose
self.nstep = 0
self.nstep_tot = 0
self.naccept = 0
def __call__(self, x):
return self.take_step(x)
def _adjust_step_size(self):
old_stepsize = self.takestep.stepsize
accept_rate = float(self.naccept) / self.nstep
if accept_rate > self.target_accept_rate:
            # We're accepting too many steps. This generally means we're
            # trapped in a basin. Take bigger steps.
            self.takestep.stepsize /= self.factor
        else:
            # We're not accepting enough steps. Take smaller steps.
            self.takestep.stepsize *= self.factor
if self.verbose:
print("adaptive stepsize: acceptance rate %f target %f new "
"stepsize %g old stepsize %g" % (accept_rate,
self.target_accept_rate, self.takestep.stepsize,
old_stepsize))
def take_step(self, x):
self.ns |
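The stepsize update in `_adjust_step_size` above is a simple multiplicative control law: grow the step when the observed acceptance rate exceeds the target, shrink it otherwise. A stripped-down, self-contained sketch (function and parameter names here are illustrative, not the scipy API):

```python
def adjust(stepsize, naccept, nstep, target=0.5, factor=0.9):
    """Grow the step when too many moves are accepted, shrink it otherwise."""
    if naccept / nstep > target:
        # too many acceptances usually means we are stuck in one basin
        return stepsize / factor
    return stepsize * factor

s = adjust(1.0, 40, 50)   # 80% accepted -> step grows
t = adjust(1.0, 10, 50)   # 20% accepted -> step shrinks
```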
korovkin/WNNotifier | notifier/parcon/options.py | Python | apache-2.0 | 381 | 0.010499 |
class Options(object):
    def __init__(self, m, d={}, **defaults):
self.values = {}
self.values.update(defaults)
self.values.update(d)
self.values.update(m)
def __getattr__(self, name):
return self.values[name]
    __getitem__ = __getattr__
    def __iter__(self):
        # iterate over items(); iterating a dict directly yields only keys
        for k, v in self.values.items():
            yield k, v
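A short usage sketch of the precedence `__init__` establishes (`m` overrides `d`, which overrides the keyword defaults). The class is repeated here, with `items()` in `__iter__` so iteration yields pairs, purely to keep the snippet self-contained:

```python
class Options(object):
    def __init__(self, m, d={}, **defaults):
        self.values = {}
        self.values.update(defaults)  # lowest priority
        self.values.update(d)
        self.values.update(m)         # highest priority
    def __getattr__(self, name):
        return self.values[name]
    __getitem__ = __getattr__
    def __iter__(self):
        for k, v in self.values.items():
            yield k, v

opts = Options({"color": "red"}, {"color": "blue", "size": 2},
               color="green", depth=1)
# opts.color == "red"; opts["size"] == 2; opts.depth == 1
```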
elianerpereira/gtg | GTG/tests/test_backend_tomboy.py | Python | gpl-3.0 | 15,680 | 0.000319 | # -*- coding: utf-8 -*-
# -----------------------------------------------------------------------------
# Getting Things GNOME! - a personal organizer for the GNOME desktop
# Copyright (c) 2008-2013 - Lionel Dricot & Bertrand Rousseau
#
# This program is free software: you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
# -----------------------------------------------------------------------------
""" Tests for the tomboy backend """
from datetime import datetime
from dbus.mainloop.glib import DBusGMainLoop
import dbus
import dbus.glib
import dbus.service
import errno
import gobject
import math
import os
import random
import signal
import sys
import tempfile
import threading
import time
import unittest
import uuid
from GTG.backends import BackendFactory
from GTG.backends.genericbackend import GenericBackend
from GTG.core.datastore import DataStore
PID_TOMBOY = False
class TestBackendTomboy(unittest.TestCase):
""" Tests for the tomboy backend """
def setUp(self):
thread_tomboy = threading.Thread(target=self.spawn_fake_tomboy_server)
thread_tomboy.start()
thread_tomboy.join()
# only the test process should go further, the dbus server one should
# stop here
if not PID_TOMBOY:
return
# we create a custom dictionary listening to the server, and register
# it in GTG.
additional_dic = {}
additional_dic["use this fake connection instead"] = (
FakeTomboy.BUS_NAME, FakeTomboy.BUS_PATH, FakeTomboy.BUS_INTERFACE)
additional_dic[GenericBackend.KEY_ATTACHED_TAGS] = \
[GenericBackend.ALLTASKS_TAG]
additional_dic[GenericBackend.KEY_DEFAULT_BACKEND] = True
dic = BackendFactory().get_new_backend_dict('backend_tomboy',
additional_dic)
self.datastore = DataStore()
self.backend = self.datastore.register_backend(dic)
# waiting for the "start_get_tasks" to settle
time.sleep(1)
# we create a dbus session to speak with the server
self.bus = dbus.SessionBus()
obj = self.bus.get_object(FakeTomboy.BUS_NAME, FakeTomboy.BUS_PATH)
self.tomboy = dbus.Interface(obj, FakeTomboy.BUS_INTERFACE)
def spawn_fake_tomboy_server(self):
# the fake tomboy server has to be in a different process,
# otherwise it will lock on the GIL.
# For details, see
# http://lists.freedesktop.org/archives/dbus/2007-January/006921.html
# we use a lockfile to make sure the server is running before we start
# the test
global PID_TOMBOY
lockfile_fd, lockfile_path = tempfile.mkstemp()
PID_TOMBOY = os.fork()
if PID_TOMBOY:
# we wait in polling that the server has been started
while True:
try:
fd = os.open(lockfile_path,
os.O_CREAT | os.O_EXCL | os.O_RDWR)
except OSError, e:
if e.errno != errno.EEXIST:
raise
time.sleep(0.3)
continue
os.close(fd)
break
else:
FakeTomboy()
os.close(lockfile_fd)
os.unlink(lockfile_path)
def tearDown(self):
if not PID_TOMBOY:
return
self.datastore.save(quit=True)
time.sleep(0.5)
self.tomboy.FakeQuit()
# FIXME: self.bus.close()
os.kill(PID_TOMBOY, signal.SIGKILL)
os.waitpid(PID_TOMBOY, 0)
def test_everything(self):
# we cannot use separate test functions because we only want a single
# FakeTomboy dbus server running
if not PID_TOMBOY:
return
for function in dir(self):
if function.startswith("TEST_"):
getattr(self, function)()
self.tomboy.Reset()
for tid in self.datastore.get_all_tasks():
self.datastore.request_task_deletion(tid)
time.sleep(0.1)
def TEST_processing_tomboy_notes(self):
self.backend.set_attached_tags([GenericBackend.ALLTASKS_TAG])
# adding a note
note = self.tomboy.CreateNamedNote(str(uuid.uuid4()))
self.backend._process_tomboy_note(note)
self.assertEqual(len(self.datastore.get_all_tasks()), 1)
tid = self.backend.sync_engine.sync_memes.get_local_id(note)
task = self.datastore.get_task(tid)
# re-adding that (should not change anything)
self.backend._process_tomboy_note(note)
self.assertEqual(len(self.datastore.get_all_tasks()), 1)
self.assertEqual(
self.backend.sync_engine.sync_memes.get_local_id(note), tid)
# removing the note and updating gtg
self.tomboy.DeleteNote(note)
self.backend.set_task(task)
self.assertEqual(len(self.datastore.get_all_tasks()), 0)
def TEST_set_task(self):
self.backend.set_attached_tags([GenericBackend.ALLTASKS_TAG])
        # adding a task
task = self.datastore.requester.new_task()
task.set_title("title")
self.backend.set_task(task)
        self.assertEqual(len(self.tomboy.ListAllNotes()), 1)
note = self.tomboy.ListAllNotes()[0]
self.assertEqual(str(self.tomboy.GetNoteTitle(note)), task.get_title())
# re-adding that (should not change anything)
self.backend.set_task(task)
self.assertEqual(len(self.tomboy.ListAllNotes()), 1)
self.assertEqual(note, self.tomboy.ListAllNotes()[0])
# removing the task and updating tomboy
self.datastore.request_task_deletion(task.get_id())
self.backend._process_tomboy_note(note)
self.assertEqual(len(self.tomboy.ListAllNotes()), 0)
def TEST_update_newest(self):
self.backend.set_attached_tags([GenericBackend.ALLTASKS_TAG])
task = self.datastore.requester.new_task()
task.set_title("title")
self.backend.set_task(task)
note = self.tomboy.ListAllNotes()[0]
gtg_modified = task.get_modified()
tomboy_modified = self._modified_string_to_datetime(
self.tomboy.GetNoteChangeDate(note))
# no-one updated, nothing should happen
self.backend.set_task(task)
self.assertEqual(gtg_modified, task.get_modified())
self.assertEqual(tomboy_modified,
self._modified_string_to_datetime(
self.tomboy.GetNoteChangeDate(note)))
# we update the GTG task
UPDATED_GTG_TITLE = "UPDATED_GTG_TITLE"
task.set_title(UPDATED_GTG_TITLE)
self.backend.set_task(task)
self.assertTrue(gtg_modified < task.get_modified())
self.assertTrue(tomboy_modified <=
self._modified_string_to_datetime(
self.tomboy.GetNoteChangeDate(note)))
self.assertEqual(task.get_title(), UPDATED_GTG_TITLE)
self.assertEqual(self.tomboy.GetNoteTitle(note), UPDATED_GTG_TITLE)
gtg_modified = task.get_modified()
tomboy_modified = self._modified_string_to_datetime(
self.tomboy.GetNoteChangeDate(note))
# we update the TOMBOY task
UPDATED_TOMBOY_TITLE = "UPDATED_TOMBOY_TITLE"
# the resolution of tomboy notes changed time is 1 second, so we need
# to wait. This *shouldn't* be needed in the actual code because
# tomboy signals are always a few seconds late.
time.sleep(1)
        self.tomboy.SetNoteContents(note, UPDATED_TOMBOY_TITLE)
darkpeach/AlgorithmCoding | _206_ReverseLinkedList.py | Python | epl-1.0 | 297 | 0.003367 | class Solution(object):
    def reverseList(self, head):
"""
:type head: ListNode
:rtype: ListNode
"""
pre = None
while head:
temp = head.next
head.next = pre
pre = head
            head = temp
return pre |
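A quick way to sanity-check the iterative reversal above is to run it against a throwaway singly linked list. The `ListNode` class and `reverse_list` helper below are a hypothetical, self-contained mirror of the method (the original file assumes a `ListNode` defined elsewhere):

```python
class ListNode(object):
    def __init__(self, val, next=None):
        self.val = val
        self.next = next

def reverse_list(head):
    # same pointer dance as reverseList() above
    pre = None
    while head:
        temp = head.next
        head.next = pre
        pre = head
        head = temp
    return pre

head = ListNode(1, ListNode(2, ListNode(3)))
rev = reverse_list(head)
vals = []
while rev:
    vals.append(rev.val)
    rev = rev.next
print(vals)  # [3, 2, 1]
```

Note that the original head node becomes the tail after reversal, so its `next` is `None`.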
jameswatt2008/jameswatt2008.github.io | python/Python核心编程/网络编程/截图和代码/概述、SOCKET/多进程copy文件/test/tarfile.py | Python | gpl-2.0 | 92,927 | 0.001614 | #! /usr/bin/python3.5
#-------------------------------------------------------------------
# tarfile.py
#-------------------------------------------------------------------
# Copyright (C) 2002 Lars Gustaebel <lars@gustaebel.de>
# All rights reserved.
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation
# files (the "Software"), to deal in the Software without
# restriction, including without limitation the rights to use,
# copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following
# conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
#
"""Read from and write to tar format archives.
"""
version = "0.9.0"
__author__ = "Lars Gust\u00e4bel (lars@gustaebel.de)"
__date__ = "$Date: 2011-02-25 17:42:01 +0200 (Fri, 25 Feb 2011) $"
__cvsid__ = "$Id: tarfile.py 88586 2011-02-25 15:42:01Z marc-andre.lemburg $"
__credits__ = "Gustavo Niemeyer, Niels Gust\u00e4bel, Richard Townsend."
#---------
# Imports
#---------
from builtins import open as bltn_open
import sys
import os
import io
import shutil
import stat
import time
import struct
import copy
import re
try:
import grp, pwd
except ImportError:
grp = pwd = None
# os.symlink on Windows prior to 6.0 raises NotImplementedError
symlink_exception = (AttributeError, NotImplementedError)
try:
    # OSError (winerror=1314) will be raised if the caller does not hold the
# SeCreateSymbolicLinkPrivilege privilege
symlink_exception += (OSError,)
except NameError:
pass
# from tarfile import *
__all__ = ["TarFile", "TarInfo", "is_tarfile", "TarError"]
#---------------------------------------------------------
# tar constants
#---------------------------------------------------------
NUL = b"\0" # the null character
BLOCKSIZE = 512 # length of processing blocks
RECORDSIZE = BLOCKSIZE * 20 # length of records
GNU_MAGIC = b"ustar \0" # magic gnu tar string
POSIX_MAGIC = b"ustar\x0000" # magic posix tar string
LENGTH_NAME = 100 # maximum length of a filename
LENGTH_LINK = 100 # maximum length of a linkname
LENGTH_PREFIX = 155 # maximum length of the prefix field
REGTYPE = b"0" # regular file
AREGTYPE = b"\0" # regular file
LNKTYPE = b"1" # link (inside tarfile)
SYMTYPE = b"2" # symbolic link
CHRTYPE = b"3" # character special device
BLKTYPE = b"4" # block special device
DIRTYPE = b"5" # directory
FIFOTYPE = b"6" # fifo special device
CONTTYPE = b"7" # contiguous file
GNUTYPE_LONGNAME = b"L" # GNU tar longname
GNUTYPE_LONGLINK = b"K" # GNU tar longlink
GNUTYPE_SPARSE = b"S" # GNU tar sparse file
XHDTYPE = b"x" # POSIX.1-2001 extended header
XGLTYPE = b"g" # POSIX.1-2001 global header
SOLARIS_XHDTYPE = b"X" # Solaris extended header
USTAR_FORMAT = 0 # POSIX.1-1988 (ustar) format
GNU_FORMAT = 1 # GNU tar format
PAX_FORMAT = 2 # POSIX.1-2001 (pax) format
DEFAULT_FORMAT = GNU_FORMAT
#---------------------------------------------------------
# tarfile constants
#---------------------------------------------------------
# File types that tarfile supports:
SUPPORTED_TYPES = (REGTYPE, AREGTYPE, LNKTYPE,
SYMTYPE, DIRTYPE, FIFOTYPE,
CONTTYPE, CHRTYPE, BLKTYPE,
GNUTYPE_LONGNAME, GNUTYPE_LONGLINK,
GNUTYPE_SPARSE)
# File types that will be treated as a regular file.
REGULAR_TYPES = (REGTYPE, AREGTYPE,
CONTTYPE, GNUTYPE_SPARSE)
# File types that are part of the GNU tar format.
GNU_TYPES = (GNUTYPE_LONGNAME, GNUTYPE_LONGLINK,
GNUTYPE_SPARSE)
# Fields from a pax header that override a TarInfo attribute.
PAX_FIELDS = ("path", "linkpath", "size", "mtime",
"uid", "gid", "uname", "gname")
# Fields from a pax header that are affected by hdrcharset.
PAX_NAME_FIELDS = {"path", "linkpath", "uname", "gname"}
# Fields in a pax header that are numbers, all other fields
# are treated as strings.
PAX_NUMBER_FIELDS = {
"atime": float,
"ctime": float,
"mtime": float,
"uid": int,
"gid": int,
"size": int
}
#---------------------------------------------------------
# initialization
#---------------------------------------------------------
if os.name in ("nt", "ce"):
ENCODING = "utf-8"
else:
ENCODING = sys.getfilesystemencoding()
#---------------------------------------------------------
# Some useful functions
#---------------------------------------------------------
def stn(s, length, encoding, errors):
"""Convert a string to a null-terminated bytes object.
"""
s = s.encode(encoding, errors)
return s[:length] + (length - len(s)) * NUL
def nts(s, encoding, errors):
"""Convert a null-terminated bytes object to a string.
"""
p = s.find(b"\0")
if p != -1:
s = s[:p]
return s.decode(encoding, errors)
def nti(s):
"""Convert a number field to a python number.
"""
# There are two possible encodings for a number field, see
# itn() below.
if s[0] in (0o200, 0o377):
n = 0
for i in range(len(s) - 1):
n <<= 8
n += s[i + 1]
if s[0] == 0o377:
n = -(256 ** (len(s) - 1) - n)
else:
try:
s = nts(s, "ascii", "strict")
n = int(s.strip() or "0", 8)
except ValueError:
raise InvalidHeaderError("invalid header")
return n
def itn(n, digits=8, format=DEFAULT_FORMAT):
"""Convert a python number to a number field.
"""
# POSIX 1003.1-1988 requires numbers to be encoded as a string of
# octal digits followed by a null-byte, this allows values up to
# (8**(digits-1))-1. GNU tar allows storing numbers greater than
# that if necessary. A leading 0o200 or 0o377 byte indicate this
# particular encoding, the following digits-1 bytes are a big-endian
# base-256 representation. This allows values up to (256**(digits-1))-1.
# A 0o200 byte indicates a positive number, a 0o377 byte a negative
# number.
if 0 <= n < 8 ** (digits - 1):
s = bytes("%0*o" % (digits - 1, int(n)), "ascii") + NUL
elif format == GNU_FORMAT and -256 ** (digits - 1) <= n < 256 ** (digits - 1):
if n >= 0:
s = bytearray([0o200])
else:
s = bytearray([0o377])
n = 256 ** digits + n
for i in range(digits - 1):
s.insert(1, n & 0o377)
n >>= 8
else:
raise ValueError("overflow in number field")
return s
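The two number-field encodings that the comments in `itn()` describe can be sketched in isolation. `encode_octal`, `encode_base256` and `decode` below are simplified illustrations of the same scheme, not the module's own `itn()`/`nti()`:

```python
def encode_octal(n, digits=8):
    # POSIX: octal digits followed by a NUL byte
    return ("%0*o" % (digits - 1, n)).encode("ascii") + b"\x00"

def encode_base256(n, digits=8):
    # GNU: a 0o200 (positive) or 0o377 (negative) marker byte,
    # then digits-1 bytes in big-endian base-256
    s = bytearray([0o200 if n >= 0 else 0o377])
    if n < 0:
        n += 256 ** digits
    for _ in range(digits - 1):
        s.insert(1, n & 0xFF)
        n >>= 8
    return bytes(s)

def decode(s):
    if s[0] in (0o200, 0o377):
        n = int.from_bytes(s[1:], "big")
        return n - 256 ** (len(s) - 1) if s[0] == 0o377 else n
    return int(s.rstrip(b"\x00") or b"0", 8)

print(decode(encode_octal(0o777)))   # 511
print(decode(encode_base256(-1)))    # -1
```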
def calc_chksums(buf):
"""Calculate the checksum for a member's header by summing up all
characters except for the chksum field which is treated as if
it was filled with spaces. According to the GNU tar sources,
some tars (Sun and NeXT) calculate chksum with signed char,
which will be different if there are chars in the buffer with
the high bit set. So we calculate two checksums, unsigned and
signed.
"""
unsigned_chksum = 256 + sum(struct.unpack_from("148B8x356B", buf))
signed_chksum = 256 + sum(struct.unpack_from("148b8x356b", buf))
    return unsigned_chksum, signed_chksum
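To see the checksum rule from the docstring in action, the sketch below applies the same unpack patterns to a blank 512-byte header; it is a standalone illustration, assuming only the layout described above:

```python
import struct

def tar_chksums(buf):
    # 148 bytes, skip the 8-byte chksum field (counted as spaces: 8 * 32 = 256),
    # then the remaining 356 bytes -- 148 + 8 + 356 = 512, one header block
    unsigned = 256 + sum(struct.unpack_from("148B8x356B", buf))
    signed = 256 + sum(struct.unpack_from("148b8x356b", buf))
    return unsigned, signed

header = bytearray(512)
header[0:4] = b"name"                # pretend member name
print(tar_chksums(bytes(header)))    # (673, 673): 256 + sum(b"name")
```

The two values only differ when the header contains bytes with the high bit set, which is exactly the Sun/NeXT quirk the docstring mentions.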
mdaif/olympia | apps/api/middleware.py | Python | bsd-3-clause | 3,064 | 0 | from django.conf import settings
from django.contrib.auth.models import AnonymousUser
import commonware.log
import waffle
from users.models import UserProfile
from .models import Access
from .oauth import OAuthServer
log = commonware.log.getLogger('z.api')
class RestOAuthMiddleware(object):
"""
This is based on https://github.com/amrox/django-tastypie-two-legged-oauth
with permission.
"""
def process_request(self, request):
# Do not process the request if the flag is off.
if not waffle.switch_is_active('drf'):
return
path_ = request.get_full_path()
try:
_, lang, platform, api, rest = path_.split('/', 4)
except ValueError:
return
# For now we only want these to apply to the API.
if not api.lower() == 'api':
return
if not settings.SITE_URL:
raise ValueError('SITE_URL is not specified')
# Set up authed_from attribute.
if not hasattr(request, 'authed_from'):
request.authed_from = []
auth_header_value = request.META.get('HTTP_AUTHORIZATION')
if (not auth_header_value and
'oauth_token' not in request.META['QUERY_STRING']):
            request.user = AnonymousUser()
log.info('No HTTP_AUTHORIZATION header')
return
        # Build the OAuth authorization header.
auth_header = {'Authorization': auth_header_value}
method = getattr(request, 'signed_method', request.method)
oauth = OAuthServer()
# Only 2-legged OAuth scenario.
log.info('Trying 2 legged OAuth')
try:
valid, oauth_request = oauth.verify_request(
request.build_absolute_uri(),
method, headers=auth_header,
require_resource_owner=False)
except ValueError:
log.error('ValueError on verifying_request', exc_info=True)
return
if not valid:
log.error(u'Cannot find APIAccess token with that key: %s'
% oauth_request._params[u'oauth_consumer_key'])
return
uid = Access.objects.filter(
key=oauth_request.client_key).values_list(
            'user_id', flat=True)[0]
if not uid:
log.error(u'Cannot find Access with that key: %s'
% oauth_request.client_key)
return
request.user = UserProfile.objects.get(pk=uid)
# But you cannot have one of these roles.
denied_groups = set(['Admins'])
roles = set(request.user.groups.values_list('name', flat=True))
if roles and roles.intersection(denied_groups):
log.info(u'Attempt to use API with denied role, user: %s'
% request.user.pk)
# Set request attributes back to None.
request.user = None
return
if request.user:
request.authed_from.append('RestOAuth')
log.info('Successful OAuth with user: %s' % request.user)
|
hatwar/buyback-erpnext | erpnext/accounts/utils.py | Python | agpl-3.0 | 17,424 | 0.024908 | # Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
# License: GNU General Public License v3. See license.txt
from __future__ import unicode_literals
import frappe
from frappe.utils import nowdate, cstr, flt, now, getdate, add_months
from frappe import throw, _
from frappe.utils import formatdate
import frappe.desk.reportview
# imported to enable erpnext.accounts.utils.get_account_currency
from erpnext.accounts.doctype.account.account import get_account_currency
class FiscalYearError(frappe.ValidationError): pass
class BudgetError(frappe.ValidationError): pass
@frappe.whitelist()
def get_fiscal_year(date=None, fiscal_year=None, label="Date", verbose=1, company=None):
return get_fiscal_years(date, fiscal_year, label, verbose, company)[0]
def get_fiscal_years(transaction_date=None, fiscal_year=None, label="Date", verbose=1, company=None):
# if year start date is 2012-04-01, year end date should be 2013-03-31 (hence subdate)
cond = " ifnull(disabled, 0) = 0"
if fiscal_year:
cond += " and fy.name = %(fiscal_year)s"
else:
cond += " and %(transaction_date)s >= fy.year_start_date and %(transaction_date)s <= fy.year_end_date"
if company:
cond += """ and (not exists(select name from `tabFiscal Year Company` fyc where fyc.parent = fy.name)
or exists(select company from `tabFiscal Year Company` fyc where fyc.parent = fy.name and fyc.company=%(company)s ))"""
fy = frappe.db.sql("""select fy.name, fy.year_start_date, fy.year_end_date from `tabFiscal Year` fy
where %s order by fy.year_start_date desc""" % cond, {
"fiscal_year": fiscal_year,
"transaction_date": transaction_date,
"company": company
})
if not fy:
error_msg = _("""{0} {1} not in any active Fiscal Year. For more details check {2}.""").format(label, formatdate(transaction_date), "https://erpnext.com/kb/accounts/fiscal-year-error")
if verbose==1: frappe.msgprint(error_msg)
raise FiscalYearError, error_msg
return fy
def validate_fiscal_year(date, fiscal_year, label=_("Date"), doc=None):
years = [f[0] for f in get_fiscal_years(date, label=label)]
if fiscal_year not in years:
if doc:
doc.fiscal_year = years[0]
else:
throw(_("{0} '{1}' not in Fiscal Year {2}").format(label, formatdate(date), fiscal_year))
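The lookup that `get_fiscal_years()` performs in SQL amounts to a date-range scan ordered newest first. A minimal in-memory sketch of that behaviour (`find_fiscal_year` and its tuple format are illustrative, not part of this module):

```python
from datetime import date

def find_fiscal_year(d, years):
    """years: iterable of (name, start, end) tuples, newest first,
    mirroring the ordering of the SQL above."""
    for name, start, end in years:
        if start <= d <= end:
            return name
    raise ValueError("%s not in any active fiscal year" % d)

years = [
    ("2015-2016", date(2015, 4, 1), date(2016, 3, 31)),
    ("2014-2015", date(2014, 4, 1), date(2015, 3, 31)),
]
print(find_fiscal_year(date(2015, 6, 15), years))  # 2015-2016
```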
@frappe.whitelist()
def get_balance_on(account=None, date=None, party_type=None, party=None, in_account_currency=True):
if not account and frappe.form_dict.get("account"):
account = frappe.form_dict.get("account")
if not date and frappe.form_dict.get("date"):
date = frappe.form_dict.get("date")
if not party_type and frappe.form_dict.get("party_type"):
party_type = frappe.form_dict.get("party_type")
if not party and frappe.form_dict.get("party"):
party = frappe.form_dict.get("party")
cond = []
if date:
cond.append("posting_date <= '%s'" % date)
else:
# get balance of all entries that exist
date = nowdate()
try:
year_start_date = get_fiscal_year(date, verbose=0)[1]
except FiscalYearError:
if getdate(date) > getdate(nowdate()):
# if fiscal year not found and the date is greater than today
# get fiscal year for today's date and its corresponding year start date
year_start_date = get_fiscal_year(nowdate(), verbose=1)[1]
else:
# this indicates that it is a date older than any existing fiscal year.
# hence, assuming balance as 0.0
return 0.0
if account:
acc = frappe.get_doc("Account", account)
if not frappe.flags.ignore_account_permission:
acc.check_permission("read")
# for pl accounts, get balance within a fiscal year
if acc.report_type == 'Profit and Loss':
cond.append("posting_date >= '%s' and voucher_type != 'Period Closing Voucher'" \
% year_start_date)
# different filter for group and ledger - improved performance
if acc.is_group:
cond.append("""exists (
select name from `tabAccount` ac where ac.name = gle.account
and ac.lft >= %s and ac.rgt <= %s
)""" % (acc.lft, acc.rgt))
# If group and currency same as company,
# always return balance based on debit and credit in company currency
if acc.account_currency == frappe.db.get_value("Company", acc.company, "default_currency"):
in_account_currency = False
else:
cond.append("""gle.account = "%s" """ % (account.replace('"', '\\"'), ))
if party_type and party:
cond.append("""gle.party_type = "%s" and gle.party = "%s" """ %
(party_type.replace('"', '\\"'), party.replace('"', '\\"')))
if account or (party_type and party):
if in_account_currency:
select_field = "sum(ifnull(debit_in_account_currency, 0)) - sum(ifnull(credit_in_account_currency, 0))"
else:
select_field = "sum(ifnull(debit, 0)) - sum(ifnull(credit, 0))"
		bal = frappe.db.sql("""
			SELECT {0}
FROM `tabGL Entry` gle
WHERE {1}""".format(select_field, " and ".join(cond)))[0][0]
# if bal is None, return 0
return flt(bal)
@frappe.whitelist()
def add_ac(args=None):
if not args:
args = frappe.local.form_dict
args.pop("cmd")
ac = frappe.new_doc("Account")
ac.update(args)
ac.old_parent = ""
ac.freeze_account = "No"
ac.insert()
return ac.name
@frappe.whitelist()
def add_cc(args=None):
if not args:
args = frappe.local.form_dict
args.pop("cmd")
cc = frappe.new_doc("Cost Center")
cc.update(args)
cc.old_parent = ""
cc.insert()
return cc.name
def reconcile_against_document(args):
"""
	Cancel JV, update against document, split if required and resubmit JV
"""
for d in args:
check_if_jv_modified(d)
validate_allocated_amount(d)
# cancel JV
jv_obj = frappe.get_doc('Journal Entry', d['voucher_no'])
jv_obj.make_gl_entries(cancel=1, adv_adj=1)
# update ref in JV Detail
update_against_doc(d, jv_obj)
# re-submit JV
jv_obj = frappe.get_doc('Journal Entry', d['voucher_no'])
jv_obj.make_gl_entries(cancel = 0, adv_adj =1)
def check_if_jv_modified(args):
"""
check if there is already a voucher reference
check if amount is same
check if jv is submitted
"""
ret = frappe.db.sql("""
select t2.{dr_or_cr} from `tabJournal Entry` t1, `tabJournal Entry Account` t2
where t1.name = t2.parent and t2.account = %(account)s
and t2.party_type = %(party_type)s and t2.party = %(party)s
and ifnull(t2.reference_type, '') in ("", "Sales Order", "Purchase Order")
and t1.name = %(voucher_no)s and t2.name = %(voucher_detail_no)s
and t1.docstatus=1 """.format(dr_or_cr = args.get("dr_or_cr")), args)
if not ret:
throw(_("""Payment Entry has been modified after you pulled it. Please pull it again."""))
def validate_allocated_amount(args):
if args.get("allocated_amt") < 0:
throw(_("Allocated amount can not be negative"))
elif args.get("allocated_amt") > args.get("unadjusted_amt"):
		throw(_("Allocated amount cannot be greater than unadjusted amount"))
def update_against_doc(d, jv_obj):
"""
Updates against document, if partial amount splits into rows
"""
jv_detail = jv_obj.get("accounts", {"name": d["voucher_detail_no"]})[0]
jv_detail.set(d["dr_or_cr"], d["allocated_amt"])
jv_detail.set('debit' if d['dr_or_cr']=='debit_in_account_currency' else 'credit',
d["allocated_amt"]*flt(jv_detail.exchange_rate))
original_reference_type = jv_detail.reference_type
original_reference_name = jv_detail.reference_name
jv_detail.set("reference_type", d["against_voucher_type"])
jv_detail.set("reference_name", d["against_voucher"])
if d['allocated_amt'] < d['unadjusted_amt']:
jvd = frappe.db.sql("""
select cost_center, balance, against_account, is_advance,
account_type, exchange_rate, account_currency
from `tabJournal Entry Account` where name = %s
""", d['voucher_detail_no'], as_dict=True)
amount_in_account_currency = flt(d['unadjusted_amt']) - flt(d['allocated_amt'])
amount_in_company_currency = amount_in_account_currency * flt(jvd[0]['exchange_rate'])
# new entry with balance amount
ch = jv_obj.append("accounts")
ch.account = d['account']
ch.account_type = jvd[0]['account_type']
ch.account_currency = jvd[0]['account_currency']
ch.exchange_rate = jvd[0]['exchange_rate']
ch.party_type = d["party_type"]
ch.party = d["party"]
ch.cost_center = cstr(jvd[0]["cost_center"])
ch.balance = flt(jvd[0]["balance"])
MuteSpirit/mute_expect | paramiko_process.py | Python | mit | 4,439 | 0.008335 | #!/usr/bin/env python
# -*- coding: utf-8 -*-
################################################################################
## @todo Add delaybeforesend, delayafterclose, delayafterterminate
## # Most Linux machines don't like delaybeforesend to be below 0.03 (30 ms).
## # self.delaybeforesend = 0.05 # Sets sleep time used just before sending data to child. Time in seconds.
## # self.delayafterclose = 0.1 # Sets delay in close() method to allow kernel time to update process status. Time in seconds.
## # self.delayafterterminate = 0.1 # Sets delay in terminate() method to allow kernel time to update process status. Time in seconds.
## @todo Use __init__'s arg 'logfile' for configure logging
## @todo Implement 'cwd' and 'env' usage
## @todo Add def __str__
import paramiko
from interactive_process import InteractiveProcess
from logging_decorator import trace_method
################################################################################
class ParamikoProcess(InteractiveProcess):
@trace_method
def __init__(self, transport, command, args=[], timeout=30, maxread=2000, searchwindowsize=None, logfile=None, cwd=None, env=None):
self._terminated = True
self._exitstatus = None
self._timeout = timeout
self._maxread = maxread # max bytes to read at one time into buffer
self._stringIo = None
self._buffer = '' # This is the read buffer. See maxread.
self._searchwindowsize = searchwindowsize # Anything before searchwindowsize point is preserved, but not searched.
self._name = '<' + repr(self) + '>' # File-like object.
self._closed = True
# allow dummy instances for subclasses that may not use transport, command or args.
if transport is None:
self._chan = None
self._name = '<factory incomplete>'
        elif not isinstance(transport, paramiko.Transport):
            raise TypeError('1st arg must be a paramiko.Transport object')
else:
if command is None:
self._cmd_line = None
self._name = '<factory incomplete>'
else:
if type (args) != type([]):
raise TypeError ('The argument, args, must be a list.')
                cmd_line = ' '.join([command] + args)
self._chan = transport.open_session()
self._chan.setblocking(0)
self._chan.get_pty() # is required for using 'picocom'
                self._chan.exec_command(cmd_line)
self._chan_fd = self._chan.fileno()
self._closed = False
self._terminated = False
def __del__(self):
"""This makes sure that no system resources a | re left open. Python only
garbage collects Python objects. OS file descriptors are not Python
objects, so they must be handled explicitly. If the child file
descriptor was opened outside of this class (passed to the constructor)
then this does not close it. """
if not self._closed:
# It is possible for __del__ methods to execute during the
# teardown of the Python VM itself. Thus self.close() may
# trigger an exception because os.close may be None.
# -- Fernando Perez
try:
self.close()
except AttributeError:
pass
def close (self, force=True): # File-like object.
if self._stringIo is not None:
self._stringIo.close()
if self._chan is not None:
self._chan.close()
    def isatty(self): # File-like object.
        return True # self._chan.get_pty() was called implicitly
def read (self, size=-1): # File-like object.
pass
def read_nonblocking (self, size=1, timeout=-1):
if self._closed:
raise ValueError ('I/O operation on closed file in read_nonblocking().')
        if timeout == -1:
            timeout = self._timeout
def readline (self, size=-1): # File-like object.
pass
def send(self, s):
pass
def sendline(self, s=''):
pass
def terminate(self, force=False):
pass
def wait(self):
pass
def expect(self, pattern, timeout=-1, searchwindowsize=None):
pass
def expect_list(self, pattern_list, timeout=-1, searchwindowsize=-1):
pass
def expect_exact(self, pattern_list, timeout=-1, searchwindowsize=-1):
pass
|
adiyengar/Spirit | example/project/settings/prod_local.py | Python | mit | 781 | 0.002561 | from __future__ import unicode_literals
import os
import sys
from .prod import *
DEBUG = True
TEMPLATE_DEBUG = True
# https://docs.djangoproject.com/en/dev/ref/settings/#admins
ADMINS = (('Adi', 'adi@u.northwestern.edu'), )
# Secret key generator: https://djskgen.herokuapp.com/
# You should set your key as an environ variable
SECRET_KEY = os.environ.get("SECRET_KEY", "uxi*44khd4ao#f!8ux$+^1=f*7r6thl@14y-4#q2*14ci4%zre")
# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
ALLOWED_HOSTS = ['localhost']
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': 'sighht_deploy',
'USER': 'postgres',
'PASSWORD': 'Ad!020687shEsh',
'HOST': 'localhost',
'PORT': '5432',
}
}
|
PhloxAR/phloxar | PhloxAR/dc1394/frame.py | Python | apache-2.0 | 6,774 | 0.000443 | # -----------------------------------------------------------------------------
#
# -*- coding: utf-8 -*-
#
# phlox-libdc1394/dc1394/frame.py
#
# Copyright (C) 2016, by Matthias Yang Chen <matthias_cy@outlook.com>
# All rights reserved.
#
# phlox-libdc1394 is free software: you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# phlox-libdc1394 is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with phlox-libdc1394. If not,
# see <http://www.gnu.org/licenses/>.
# -----------------------------------------------------------------------------
from __future__ import division, print_function
from __future__ import absolute_import, unicode_literals
from ctypes import ARRAY, c_byte
from numpy import ndarray
from .core import *
__all__ = ['Frame']
class Frame(ndarray):
"""
A frame returned by the camera.
All metadata are retained as attributes of the resulting image.
"""
_cam = None
_frame = None
def __new__(cls, camera, frame):
"""
Convert a dc1394 frame into a Frame instance.
:param camera:
:param frame:
:return:
"""
dtype = ARRAY(c_byte, frame.contents.image_bytes)
buf = dtype.from_address(frame.contents.image)
width, height = frame.contents.size
pixels = width * height
endian = frame.contents.little_endian and '<' or '>'
type_str = '%su%i' % (endian, frame.contents.image_bytes / pixels)
img = ndarray.__new__(cls, shape=(height, width), dtype=type_str, buffer=buf)
img.frame_id = frame.contents.id
img.frames_behind = frame.contents.frames_behind
img.position = frame.contents.position
img.packet_size = frame.contents.packet_size
img.packets_per_frame = frame.contents.packet_per_frame
img.timestamp = frame.contents.timestamp
img.video_mode = video_modes[frame.contents.video_mode]
img.data_depth = frame.contents.data_depth
img.color_coding = color_codings[frame.contents.color_coding]
img.color_filter = frame.contents.color_filter
img.yuv_byte_order = frame.contents.yuv_byte_order
        img.stride = frame.contents.stride
# save camera and frame for enqueue()
img._frame = frame
        img._cam = camera
return img
def __array_finalize__(self, img):
"""
Finalize the new Image class array.
If called with an image object, inherit the properties of that image.
"""
if img is None:
return
# do not inherit _frame and _cam since we also get called on copy()
# and should not hold references to the frame in this case
for key in ["position", "color_coding", "color_filter",
"yuv_byte_order", "stride", "packet_size",
"packets_per_frame", "timestamp", "frames_behind",
"frame_id", "data_depth", "video_mode"]:
setattr(self, key, getattr(img, key, None))
def enqueue(self):
"""
Returns a frame to the ring buffer once it has been used.
This method is also called implicitly on ``del``.
Only call this method on the original frame obtained from
Camera.dequeue` and not on its views, new-from-templates or
copies. Otherwise an AttributeError will be raised.
"""
if not hasattr(self, "_frame"): # or self.base is not None:
raise AttributeError("can only enqueue the original frame")
if self._frame is not None:
dll.dc1394_capture_enqueue(self._cam, self._frame)
self._frame = None
self._cam = None
    # from contextlib import closing
# with closing(camera.dequeue()) as im:
# do stuff with im
close = enqueue
def __del__(self):
try:
self.enqueue()
except AttributeError:
pass
@property
def corrupt(self):
"""
        Whether this frame is corrupt.
Returns ``True`` if the given frame has been detected to be
corrupt (missing data, corrupted data, overrun buffer, etc.) and
``False`` otherwise.
.. note::
Certain types of corruption may go undetected in which case
``False`` will be returned erroneously. The ability to
detect corruption also varies between platforms.
.. note::
Corrupt frames still need to be enqueued with `enqueue`
when no longer needed by the user.
"""
return bool(dll.dc1394_capture_is_frame_corrupt(self._cam, self._frame))
def to_rgb(self):
"""
Convert the image to an RGB image.
Array shape is: (image.shape[0], image.shape[1], 3)
Uses the dc1394_convert_to_RGB() function for the conversion.
"""
res = ndarray(3 * self.size, dtype='u1')
shape = self.shape
inp = ndarray(shape=len(self.data), buffer=self.data, dtype='u1')
dll.dc1394_convert_to_RGB8(inp, res, shape[1], shape[0],
self.yuv_byte_order, self.color_coding,
self.data_depth)
res.shape = shape[0], shape[1], 3
return res
def to_mono8(self):
"""
        Convert the image to 8 bit gray scale.
Uses the dc1394_convert_to_MONO8() function
"""
res = ndarray(self.size, dtype='u1')
shape = self.shape
inp = ndarray(shape=len(self.data), buffer=self.data, dtype='u1')
dll.dc1394_convert_to_MONO8(inp, res, shape[1], shape[0],
self.yuv_byte_order, self.color_coding,
self.data_depth)
res.shape = shape
return res
def to_yuv422(self):
"""
        Convert the image to YUV422 color format.
Uses the dc1394_convert_to_YUV422() function
"""
res = ndarray(self.size, dtype='u1')
shape = self.shape
inp = ndarray(shape=len(self.data), buffer=self.data, dtype='u1')
dll.dc1394_convert_to_YUV422(inp, res, shape[1], shape[0],
self.yuv_byte_order, self.color_coding,
self.data_depth)
return ndarray(shape=shape, buffer=res.data, dtype='u2')
|
mozilla/kitsune | kitsune/customercare/migrations/0002_auto_20210716_0556.py | Python | bsd-3-clause | 640 | 0 | # Generated by Django 2.2.23 on 2021-07-16 05:56
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('customercare', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='reply',
name='user',
),
migrations.RemoveField(
model_name='tweet',
name='reply_to',
),
migrations.DeleteModel(
name='TwitterAccount',
),
migrations.DeleteModel(
name='Reply',
),
migrations.DeleteModel(
name='Tweet',
),
]
|
drewcsillag/skunkweb | pylibs/pargen/__init__.py | Python | gpl-2.0 | 951 | 0.011567 | #
# Copyright (C) 2001 Andrew T. Csillag <drew_csillag@geocities.com>
#
# You may distribute under the terms of either the GNU General
# Public License or the SkunkWeb License, as specified in the
# README file.
#
"""
This module implements a parser generator,
similar, but not identical to well-known tools such as Yacc or
Bison.
So unless you are writing a parser for some kind of
computer language (that isn't HTML or XML), you probably can
pass this module by. If you are writing a parser,
pargen should
be able to help.
The input file to pargen is a file of the following format, somewhat similar to a Yacc grammar:
#comments begin with a #
:methodToCall: ruleName : rightHandSideItem1 [ rightHandSideItem2 ...]
You then run pargen on this file and produce (hopefully) a parsing
table for your grammar either in marshalled form, or a Python modular form
(the default).
The Parser module has the rest of the details.
"""
|
code4romania/czl-scrape | scrapy/czlscrape/utils.py | Python | mpl-2.0 | 2,713 | 0 | import re
from scrapy.selector import SelectorList
DIACRITICS_RULES = [
(r'[șş]', 's'),
(r'[ȘŞ]', 'S'),
(r'[țţ]', 't'),
(r'[ȚŢ]', 'T'),
(r'[ăâ]', 'a'),
(r'[ĂÂ]', 'A'),
(r'[î]', 'i'),
(r'[Î]', 'I'),
]
ROMANIAN_MONTHS = {
'ianuarie': 1,
'februarie': 2,
'martie': 3,
'aprilie': 4,
'mai': 5,
'iunie': 6,
'iulie': 7,
'august': 8,
'septembrie': 9,
'octombrie': 10,
'noiembrie': 11,
'decembrie': 12,
}
DOC_EXTENSIONS = [".docs", ".doc", ".txt", ".crt", ".xls", ".xml", ".pdf",
| ".docx", ".xlsx", ]
def guess_initiative_type(text: str, rules: list) -> str:
"""
Try to identify the type of a law initiative from its description.
Use a best guess approach. The rules are provided by the caller as a list
of tuples. Each tuple is composed of a search string and the initiative
type it matches to.
:param text: the description of the initiative
:param rules: the rules of identification expressed as a list of tuples
:return: the type of initiative if a rule matches; "OTHER" if no rule
matches
"""
text = strip_diacritics(text)
for search_string, initiative_type in rules:
if search_string in text:
return initiative_type
else:
return "OTHER"
def strip_diacritics(text: str) -> str:
"""
Replace all diacritics in the given text with their regular counterparts.
:param text: the text to look into
:return: the text without diacritics
"""
result = text
for search_pattern, replacement in DIACRITICS_RULES:
result = re.sub(search_pattern, replacement, result)
return result
def romanian_month_number(text: str) -> int:
"""
Return the number of the given month identified by its Romanian name.
:param text: the name of the month in Romanian
:return: the number of the month if the month name is recognized,
otherwise None
"""
return ROMANIAN_MONTHS.get(text.lower())
def extract_documents(selector_list: SelectorList):
"""
Extract white-listed documents from CSS selectors.
Generator function. Search for links to white-listed document types and
return all matching ones. Each entry has two properties. "type" contains
the link text, "url" contains the link URL.
:param selector_list: a SelectorList
:return: a generator
"""
for link_selector in selector_list:
url = link_selector.css('::attr(href)').extract_first()
if any(url.endswith(ext) for ext in DOC_EXTENSIONS):
yield {
'type': link_selector.css('::text').extract_first(),
'url': url,
}
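As a standalone illustration of the two helpers above, the following sketch duplicates the diacritics table and uses a made-up rules list (the search strings and type names are assumptions, not taken from the real scraper):

```python
import re

# Same replacement table as DIACRITICS_RULES above.
RULES = [
    (r'[șş]', 's'), (r'[ȘŞ]', 'S'),
    (r'[țţ]', 't'), (r'[ȚŢ]', 'T'),
    (r'[ăâ]', 'a'), (r'[ĂÂ]', 'A'),
    (r'[î]', 'i'), (r'[Î]', 'I'),
]

def strip_diacritics(text):
    # Apply each regex substitution in turn.
    for pattern, replacement in RULES:
        text = re.sub(pattern, replacement, text)
    return text

def guess_initiative_type(text, rules):
    # First match wins; fall back to "OTHER".
    text = strip_diacritics(text)
    for search_string, initiative_type in rules:
        if search_string in text:
            return initiative_type
    return "OTHER"

# Hypothetical caller-supplied rules (search string -> type).
TYPE_RULES = [("lege", "LAW"), ("hotarare", "DECISION")]

print(strip_diacritics("Sănătate"))                          # -> Sanatate
print(guess_initiative_type("proiect de lege", TYPE_RULES))  # -> LAW
```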
adriaanvuik/solid_state_physics | lll.py | Python | bsd-2-clause | 6,649 | 0.00015
# Copyright 2011-2013 Kwant authors.
#
# This file is part of Kwant. It is subject to the license terms in the file
# LICENSE.rst found in the top-level directory of this distribution and at
# http://kwant-project.org/license. A list of Kwant authors can be found in
# the file AUTHORS.rst at the top-level directory of this distribution and at
# http://kwant-project.org/authors.
__all__ = ['lll', 'cvp', 'voronoi']
import numpy as np
from itertools import product
def gs_coefficient(a, b):
"""Gram-Schmidt coefficient."""
return np.dot(a, b) / np.linalg.norm(b)**2
def gs(mat):
"""Compute Gram-Schmidt decomposition on a matrix."""
mat = np.copy(mat)
for i in range(len(mat)):
for j in range(i):
mat[i] -= gs_coefficient(mat[i], mat[j]) * mat[j]
return mat
def is_c_reduced(vecs, c):
"""Check if a basis is c-reduced."""
vecs = gs(vecs)
r = np.apply_along_axis(lambda x: np.linalg.norm(x)**2, 1, vecs)
return np.all((r[: -1] / r[1:]) < c)
def lll(basis, c=1.34):
"""
Calculate a reduced lattice basis using LLL algorithm.
Reduce a basis of a lattice to an almost orthonormal form. For details see
e.g. http://en.wikipedia.org/wiki/LLL-algorithm.
Parameters
----------
basis : 2d array-like of floats
The lattice basis vectors to be reduced.
c : float
Reduction parameter for the algorithm. Must be larger than 1 1/3,
since otherwise a solution is not guaranteed to exist.
Returns
-------
reduced_basis : numpy array
The basis vectors of the LLL-reduced basis.
transformation : numpy integer array
Coefficient matrix for tranforming from the reduced basis to the
original one.
"""
vecs = np.copy(basis)
if vecs.ndim != 2:
raise ValueError('`vecs` must be a 2d array-like object.')
if vecs.shape[0] > vecs.shape[1]:
raise ValueError('The number of basis vectors exceeds the '
'space dimensionality.')
vecs_orig = np.copy(vecs)
vecsstar = np.copy(vecs)
m = vecs.shape[0]
u = np.identity(m)
def ll_reduce(i):
for j in reversed(range(i)):
vecs[i] -= np.round(u[i, j]) * vecs[j]
u[i] -= np.round(u[i, j]) * u[j]
# Initialize values.
for i in range(m):
for j in range(i):
u[i, j] = gs_coefficient(vecs[i], vecsstar[j])
vecsstar[i] -= u[i, j] * vecsstar[j]
ll_reduce(i)
# Main part of LLL algorithm.
i = 0
while i < m-1:
if (np.linalg.norm(vecsstar[i]) ** 2 <
c * np.linalg.norm(vecsstar[i+1]) ** 2):
i += 1
else:
vecsstar[i+1] += u[i+1, i] * vecsstar[i]
u[i, i] = gs_coefficient(vecs[i], vecsstar[i+1])
u[i, i+1] = u[i+1, i] = 1
u[i+1, i+1] = 0
vecsstar[i] -= u[i, i] * vecsstar[i+1]
vecs[[i, i+1]] = vecs[[i+1, i]]
vecsstar[[i, i+1]] = vecsstar[[i+1, i]]
u[[i, i+1]] = u[[i+1, i]]
for j in range(i+2, m):
u[j, i] = gs_coefficient(vecs[j], vecsstar[i])
u[j, i+1] = gs_coefficient(vecs[j], vecsstar[i+1])
if abs(u[i+1, i]) > 0.5:
ll_reduce(i+1)
i = max(i-1, 0)
coefs = np.linalg.lstsq(vecs_orig.T, vecs.T)[0]
if not np.allclose(np.round(coefs), coefs, atol=1e-6):
raise RuntimeError('LLL algorithm instability.')
if not is_c_reduced(vecs, c):
raise RuntimeError('LLL algorithm instability.')
return vecs, np.array(np.round(coefs), int)
def cvp(vec, basis, n=1):
"""
Solve the n-closest vector problem for a vector, given a basis.
This algorithm performs poorly in general, so it should be supplied
with LLL-reduced bases.
Parameters
----------
vec : 1d array-like of floats
The lattice vectors closest to this vector are to be found.
basis : 2d array-like of floats
Sequence of basis vectors
n : int
Number of lattice vectors closest to the point that need to be found.
Returns
-------
coords : numpy array
An array with the coefficients of the lattice vectors closest to the
requested point.
Notes
-----
This function can also be used to solve the `n` shortest lattice vector
problem if the `vec` is zero, and `n+1` points are requested
(and the first output is ignored).
"""
# Calculate coordinates of the starting point in this basis.
basis = np.asarray(basis)
if basis.ndim != 2:
raise ValueError('`basis` must be a 2d array-like object.')
vec = np.asarray(vec)
center_coords = np.array(np.round(np.linalg.lstsq(basis.T, vec)[0]), int)
# Cutoff radius for n-th nearest neighbor.
rad = 1
nth_dist = np.inf
while True:
r = np.round(rad * np.linalg.cond(basis)) + 1
points = np.mgrid[tuple(slice(i - r, i + r) for i in center_coords)]
points = points.reshape(basis.shape[0], -1).T
if len(points) < n:
rad += 1
continue
point_coords = np.dot(points, basis)
point_coords -= vec.T
distances = np.sqrt(np.sum(point_coords**2, 1))
order = np.argsort(distances)
distances = distances[order]
if distances[n - 1] < nth_dist:
nth_dist = distances[n - 1]
rad += 1
else:
return np.array(points[order][:n], int)
def voronoi(basis):
"""
Return an array of lattice vectors forming its Voronoi cell.
Parameters
----------
basis : 2d array-like of floats
Basis vectors for which the Voronoi neighbors have to be found.
Returns
-------
voronoi_neighbors : numpy array of ints
All the lattice vectors that may potentially neighbor the origin.
Notes
-----
This algorithm does not calculate the minimal Voronoi cell of the lattice
and can be optimized. Its main aim is flood-fill, however, and better
safe than sorry.
"""
basis = np.asarray(basis)
if basis.ndim != 2:
raise ValueError('`basis` must be a 2d array-like object.')
displacements = list(product(*(len(basis) * [[0, .5]])))[1:]
vertices = np.array([cvp(np.dot(vec, basis), basis)[0] for vec in
displacements])
vertices = np.array(np.round((vertices - displacements) * 2), int)
for i in range(len(vertices)):
if not np.any(vertices[i]):
vertices[i] += 2 * np.array(displacements[i])
vertices = np.concatenate([vertices, -vertices])
return vertices
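The Gram-Schmidt coefficient at the heart of `gs` above is simply &lt;a, b&gt; / |b|^2; a dependency-free sketch of the same quantity with plain Python tuples (no NumPy):

```python
def gs_coefficient(a, b):
    # Projection coefficient of a onto b: <a, b> / |b|^2.
    dot = sum(x * y for x, y in zip(a, b))
    norm_sq = sum(y * y for y in b)
    return dot / norm_sq

# Projecting (2, 2) onto the x-axis keeps only the x component.
print(gs_coefficient((2, 2), (1, 0)))  # -> 2.0
print(gs_coefficient((1, 1), (0, 2)))  # -> 0.5
```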
davidedelvento/temperanotes | test_temperanotes.py | Python | apache-2.0 | 9,430 | 0.016331
import temperanotes
import pytest, bisect
@pytest.fixture
def idiot_temp():
temp = [1, 1.05, 1.1, 1.15, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9] # not a temperament, just a set of numbers for testing
assert len(temp) == 12 # need 12 notes for the chromatic scale
return temp
def test_note_names():
exclude = ['B#', 'Cb', 'E#', 'Fb']
assert len(temperanotes.note_names_sharp) == 12
assert len(temperanotes.note_names_flat) == 12
for note in "ABCDEFG":
assert note in temperanotes.note_names_sharp
assert note in temperanotes.note_names_flat
note_accidental = note + "#"
if not note_accidental in exclude:
assert note_accidental in temperanotes.note_names_sharp
note_accidental = note + "b"
if not note_accidental in exclude:
assert note_accidental in temperanotes.note_names_flat
def test_get_key_index():
assert temperanotes.get_key_index('A') == 0
assert temperanotes.get_key_index('C') == 3
assert temperanotes.get_key_index('F') == 8
assert temperanotes.get_key_index('F#') == 9
assert temperanotes.get_key_index('G#') == 11
assert temperanotes.get_key_index('Ab') == 11
def test_normal_octave_in_C(idiot_temp):
# when starting from C,
# A is the 10th semitone of the chromatic scale, i.e. idiot_temp[9]
expected_freq = [440.0 / idiot_temp[9] * i for i in idiot_temp]
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 0, notes_high = 12, key = 'C', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_normal_octave(idiot_temp):
expected_freq = [440.0 * i for i in idiot_temp]
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 0, notes_high = 12, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_lower_octave(idiot_temp):
expected_freq = [440.0 / 2 * i for i in idiot_temp]
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 12, notes_high = 0, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_one_octave_and_one_note(idiot_temp):
expected_freq = [440.0 * i for i in idiot_temp] + [440.0 * 2]
assert len(expected_freq) == 13 # obvious, but making sure no simple bugs in test itself
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 0, notes_high = 13, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_one_octave_and_one_note_per_direction(idiot_temp):
expected_freq_lo = [440.0 / 2 * i for i in idiot_temp]
expected_freq_hi = [440.0 * i for i in idiot_temp]
expected_freq = [440.0 / 4 * idiot_temp[-1]] + expected_freq_lo + expected_freq_hi + [440.0 * 2]
assert len(expected_freq) == 24 + 2 # obvious, but making sure no simple bugs in test itself
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 13, notes_high = 13, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_one_octave_and_half_per_direction(idiot_temp):
expected_freq_lolo = [440.0 / 4 * i for i in idiot_temp]
expected_freq_lo = [440.0 / 2 * i for i in idiot_temp]
expected_freq_hi = [440.0 * i for i in idiot_temp]
expected_freq_hihi = [440.0 * 2 * i for i in idiot_temp]
expected_freq = expected_freq_lolo[6:] + expected_freq_lo + expected_freq_hi + expected_freq_hihi[:6]
assert len(expected_freq) == 48 - 12 # obvious, but making sure no simple bugs in test itself
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 18, notes_high = 18, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_two_octaves(idiot_temp):
expected_freq_lo = [440.0 / 2 * i for i in idiot_temp]
expected_freq_hi = [440.0 * i for i in idiot_temp]
expected_freq = expected_freq_lo + expected_freq_hi
assert len(expected_freq) == 24 # obvious, but making sure no simple bugs in test itself
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 12, notes_high = 12, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_four_octaves(idiot_temp):
expected_freq_lolo = [440.0 / 4 * i for i in idiot_temp]
expected_freq_lo = [440.0 / 2 * i for i in idiot_temp]
expected_freq_hi = [440.0 * i for i in idiot_temp]
expected_freq_hihi = [440.0 * 2 * i for i in idiot_temp]
expected_freq = expected_freq_lolo + expected_freq_lo + expected_freq_hi + expected_freq_hihi
assert len(expected_freq) == 48 # obvious, but making sure no simple bugs in test itself
actual_freq = temperanotes.frequencies(temperament = idiot_temp, notes_low = 24, notes_high = 24, key = 'A', base_freq = 440.0, key_freq = 'A')
assert actual_freq == expected_freq
def test_equal_temp():
expected = [1., 2. ** (1./12), 2. ** (1./6), 2. ** (1./4), 2. ** (1./3), 2. ** (5./12), 2. ** (1./2), 2. ** (7./12), 2. ** (2./3), 2. ** (3./4), 2. ** (5./6), 2. ** (11./12)]
actual = temperanotes.equal_temperament()
assert actual == expected
def test_cents():
expected = [100 * i for i in range(12)]
actual = temperanotes.to_cents(temperanotes.equal_temperament())
assert actual == expected
def test_read_temperament_nocents():
data = """#This is a comment
1
1.01 # this is another comment
1.3
1.4
# more comments
1.5
1.6
1.7
1.8
1.9
1.10
1.11
1.12"""
expected = [1, 1.01, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10, 1.11, 1.12]
actual, cents = temperanotes.read_temperament(data)
assert actual == expected
assert len(cents) == 0
def test_read_temperament_withcents_and_math():
data = """#This is a comment
1, 100
sqrt(2), 200 # this is another comment
1.3, 4 ** (1/3) # 1.58 must round to 2
2 ** 1/12, 500
# more comments
1.5, 600
1.6, 700
1.7, 900
1.8, 1000
1.9, 2000 # comments can appear anywhere
1.10, 3000
1.11, 1
1.12, 7
# comments at the end"""
expected = [1, 1.4142135623730951, 1.3, 0.1666666666666666666666666, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10, 1.11, 1.12]
actual, cents = temperanotes.read_temperament(data)
assert actual == expected
assert cents == [100, 200, 2, 500, 600, 700, 900, 1000, 2000, 3000, 1, 7]
def test_read_incorrect_temperaments():
data = 11 * "1, 100\n"
with pytest.raises(SystemExit):
temperanotes.read_temperament(data)
data = 13 * "1, 100\n"
with pytest.raises(SystemExit):
temperanotes.read_temperament(data)
def test_read_more_entries_cents():
data = (5 * "1, 100\n" +
2 * "2, 150, 200\n" + # additional data
5 * "7, 200\n")
with pytest.raises(SystemExit):
temperanotes.read_temperament(data)
def test_read_incorrect_cents():
data = (5 * "1, 100\n" +
2 * "2,\n" + # missing some cents (with comma)
5 * "7, 200\n")
with pytest.raises(SystemExit):
temperanotes.read_temperament(data)
def test_read_missing_cents():
data = (5 * "1, 100\n" +
2 * "2\n" + # missing some cents (without comma)
5 * "7, 200\n")
with pytest.raises(SystemExit):
temperanotes.read_temperament(data)
def test_read_file_with_errors():
data = (5 * "1, 100\n" +
2 * "foo_bar, 200\n" + # syntax error in frequencies
5 * " |
Staffjoy/client_python | staffjoy/resources/chomp_task.py | Python | mit | 166 | 0
from staffjoy.resource import Resource
class ChompTask(Resource):
PATH = "internal/tasking/chomp/{sche | dule_id}"
ENVELOPE = None
ID_NAME = "schedule_id"
IDSIA/sacred | examples/log_example.py | Python | mit | 999 | 0
#!/usr/bin/env python
# coding=utf-8
""" An example showcasing the logging system of Sacred."""
import logging
from sacred import Experiment
ex = Experiment("log_example")
# set up a custom logger
logger = logging.getLogger("mylogger")
logger.handlers = []
ch = logging.StreamHandler()
formatter = logging.Formatter('[%(levelname).1s] %(name)s >> "%(message)s"')
ch.setFormatter(formatter)
logger.addHandler(ch)
logger.setLevel("INFO")
# attach it to the experiment
ex.logger = logger
@ex.config
def cfg():
number = 2
got_gizmo = False
@ex.capture
def transmogrify(got_gizmo, number, _log):
if got_gizmo:
_log.debug("Got gizmo. Performing transmogrification...")
return number * 42
else:
_log.warning("No gizmo. Can't transmogrify!")
return 0
@ex.automain
def main(number, _log):
_log.info("Attempting to transmogrify %d...", number)
result = transmogrify()
_log.info("Transmogrification complete: %d", result)
return result
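The custom format string wired up above can be exercised without Sacred at all; a minimal standalone sketch using only the stdlib logging module:

```python
import logging

# Same format string as in the Sacred example above.
formatter = logging.Formatter('[%(levelname).1s] %(name)s >> "%(message)s"')

# Build a record by hand instead of going through a logger call.
record = logging.LogRecord("mylogger", logging.WARNING, "example.py", 0,
                           "No gizmo. Can't transmogrify!", None, None)
print(formatter.format(record))
# -> [W] mylogger >> "No gizmo. Can't transmogrify!"
```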
Pal3love/otRebuilder | Package/otRebuilder/Dep/fontTools/ttLib/tables/TupleVariation.py | Python | mit | 21,422 | 0.025955
from __future__ import print_function, division, absolute_import
from fontTools.misc.py23 import *
from fontTools.misc.fixedTools import fixedToFloat, floatToFixed
from fontTools.misc.textTools import safeEval
import array
import io
import logging
import struct
import sys
# https://www.microsoft.com/typography/otspec/otvarcommonformats.htm
EMBEDDED_PEAK_TUPLE = 0x8000
INTERMEDIATE_REGION = 0x4000
PRIVATE_POINT_NUMBERS = 0x2000
DELTAS_ARE_ZERO = 0x80
DELTAS_ARE_WORDS = 0x40
DELTA_RUN_COUNT_MASK = 0x3f
POINTS_ARE_WORDS = 0x80
POINT_RUN_COUNT_MASK = 0x7f
TUPLES_SHARE_POINT_NUMBERS = 0x8000
TUPLE_COUNT_MASK = 0x0fff
TUPLE_INDEX_MASK = 0x0fff
log = logging.getLogger(__name__)
class TupleVariation(object):
def __init__(self, axes, coordinates):
self.axes = axes.copy()
self.coordinates = coordinates[:]
def __repr__(self):
axes = ",".join(sorted(["%s=%s" % (name, value) for (name, value) in self.axes.items()]))
return "<TupleVariation %s %s>" % (axes, self.coordinates)
def __eq__(self, other):
return self.coordinates == other.coordinates and self.axes == other.axes
def getUsedPoints(self):
result = set()
for i, point in enumerate(self.coordinates):
if point is not None:
result.add(i)
return result
def hasImpact(self):
"""Returns True if this TupleVariation has any visible impact.
If the result is False, the TupleVariation can be omitted from the font
without making any visible difference.
"""
for c in self.coordinates:
if c is not None:
return True
return False
def toXML(self, writer, axisTags):
writer.begintag("tuple")
writer.newline()
for axis in axisTags:
value = self.axes.get(axis)
if value is not None:
minValue, value, maxValue = (float(v) for v in value)
defaultMinValue = min(value, 0.0) # -0.3 --> -0.3; 0.7 --> 0.0
defaultMaxValue = max(value, 0.0) # -0.3 --> 0.0; 0.7 --> 0.7
if minValue == defaultMinValue and maxValue == defaultMaxValue:
writer.simpletag("coord", axis=axis, valu | e=value)
else:
writer.simpletag("coord", axis=axis, value=value, min=minValue, max=maxValue)
writer.newline()
wrote_any_deltas = False
for i, delta in enumerate(self.coordinates):
if type(delta) == tuple and len(delta) == 2:
writer.simpletag("delta", pt=i, x=delta[0], y=delta[1])
writer.newline()
wrote_any_deltas = True
elif type(delta) == int:
writer.simpletag("delta", cvt=i, value=delta)
writer.newline()
wrote_any_deltas = True
elif delta is not None:
log.error("bad delta format")
writer.comment("bad delta #%d" % i)
writer.newline()
wrote_any_deltas = True
if not wrote_any_deltas:
writer.comment("no deltas")
writer.newline()
writer.endtag("tuple")
writer.newline()
def fromXML(self, name, attrs, _content):
if name == "coord":
axis = attrs["axis"]
value = float(attrs["value"])
defaultMinValue = min(value, 0.0) # -0.3 --> -0.3; 0.7 --> 0.0
defaultMaxValue = max(value, 0.0) # -0.3 --> 0.0; 0.7 --> 0.7
minValue = float(attrs.get("min", defaultMinValue))
maxValue = float(attrs.get("max", defaultMaxValue))
self.axes[axis] = (minValue, value, maxValue)
elif name == "delta":
if "pt" in attrs:
point = safeEval(attrs["pt"])
x = safeEval(attrs["x"])
y = safeEval(attrs["y"])
self.coordinates[point] = (x, y)
elif "cvt" in attrs:
cvt = safeEval(attrs["cvt"])
value = safeEval(attrs["value"])
self.coordinates[cvt] = value
else:
log.warning("bad delta format: %s" %
", ".join(sorted(attrs.keys())))
def compile(self, axisTags, sharedCoordIndices, sharedPoints):
tupleData = []
assert all(tag in axisTags for tag in self.axes.keys()), ("Unknown axis tag found.", self.axes.keys(), axisTags)
coord = self.compileCoord(axisTags)
if coord in sharedCoordIndices:
flags = sharedCoordIndices[coord]
else:
flags = EMBEDDED_PEAK_TUPLE
tupleData.append(coord)
intermediateCoord = self.compileIntermediateCoord(axisTags)
if intermediateCoord is not None:
flags |= INTERMEDIATE_REGION
tupleData.append(intermediateCoord)
points = self.getUsedPoints()
if sharedPoints == points:
# Only use the shared points if they are identical to the actually used points
auxData = self.compileDeltas(sharedPoints)
usesSharedPoints = True
else:
flags |= PRIVATE_POINT_NUMBERS
numPointsInGlyph = len(self.coordinates)
auxData = self.compilePoints(points, numPointsInGlyph) + self.compileDeltas(points)
usesSharedPoints = False
tupleData = struct.pack('>HH', len(auxData), flags) + bytesjoin(tupleData)
return (tupleData, auxData, usesSharedPoints)
def compileCoord(self, axisTags):
result = []
for axis in axisTags:
_minValue, value, _maxValue = self.axes.get(axis, (0.0, 0.0, 0.0))
result.append(struct.pack(">h", floatToFixed(value, 14)))
return bytesjoin(result)
def compileIntermediateCoord(self, axisTags):
needed = False
for axis in axisTags:
minValue, value, maxValue = self.axes.get(axis, (0.0, 0.0, 0.0))
defaultMinValue = min(value, 0.0) # -0.3 --> -0.3; 0.7 --> 0.0
defaultMaxValue = max(value, 0.0) # -0.3 --> 0.0; 0.7 --> 0.7
if (minValue != defaultMinValue) or (maxValue != defaultMaxValue):
needed = True
break
if not needed:
return None
minCoords = []
maxCoords = []
for axis in axisTags:
minValue, value, maxValue = self.axes.get(axis, (0.0, 0.0, 0.0))
minCoords.append(struct.pack(">h", floatToFixed(minValue, 14)))
maxCoords.append(struct.pack(">h", floatToFixed(maxValue, 14)))
return bytesjoin(minCoords + maxCoords)
@staticmethod
def decompileCoord_(axisTags, data, offset):
coord = {}
pos = offset
for axis in axisTags:
coord[axis] = fixedToFloat(struct.unpack(">h", data[pos:pos+2])[0], 14)
pos += 2
return coord, pos
@staticmethod
def compilePoints(points, numPointsInGlyph):
# If the set consists of all points in the glyph, it gets encoded with
# a special encoding: a single zero byte.
if len(points) == numPointsInGlyph:
return b"\0"
# In the 'gvar' table, the packing of point numbers is a little surprising.
# It consists of multiple runs, each being a delta-encoded list of integers.
# For example, the point set {17, 18, 19, 20, 21, 22, 23} gets encoded as
# [6, 17, 1, 1, 1, 1, 1, 1]. The first value (6) is the run length minus 1.
# There are two types of runs, with values being either 8 or 16 bit unsigned
# integers.
points = list(points)
points.sort()
numPoints = len(points)
# The binary representation starts with the total number of points in the set,
# encoded into one or two bytes depending on the value.
if numPoints < 0x80:
result = [bytechr(numPoints)]
else:
result = [bytechr((numPoints >> 8) | 0x80) + bytechr(numPoints & 0xff)]
MAX_RUN_LENGTH = 127
pos = 0
lastValue = 0
while pos < numPoints:
run = io.BytesIO()
runLength = 0
useByteEncoding = None
while pos < numPoints and runLength <= MAX_RUN_LENGTH:
curValue = points[pos]
delta = curValue - lastValue
if useByteEncoding is None:
useByteEncoding = 0 <= delta <= 0xff
if useByteEncoding and (delta > 0xff or delta < 0):
# we need to start a new run (which will not use byte encoding)
break
# TODO This never switches back to a byte-encoding from a short-encoding.
# That's suboptimal.
if useByteEncoding:
run.write(bytechr(delta))
else:
run.write(bytechr(delta >> 8))
run.write(bytechr(delta & 0xff))
lastValue = curValue
pos += 1
runLength += 1
if useByteEncoding:
runHeader = bytechr(runLength - 1)
else:
runHeader = bytechr((runLength - 1) | POINTS_ARE_WORDS)
result.append(runHeader)
result.append(run.getvalue())
return bytesjoin(result)
@staticmethod
def decompilePoints_(numPoints, data, offset, tableTag):
"""(numPoints, data, offset, tableTag) --> ([point1, point2, ...], newOffset)"""
assert tableTag in ('cvar', 'gvar')
pos = offset
numPointsInData = byteord(data[pos])
pos += 1
if (numPointsInData & POINTS_ARE_WORDS) != 0:
numPointsInData = (numPointsInData & POINT_RUN_COUNT_MASK) << 8 | byteord(data[pos])
pos += 1
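The point-number packing described in the compilePoints comments (sorted points stored as deltas, each run prefixed by its length minus one) can be illustrated with a tiny standalone encoder; this sketch assumes a single byte-sized run and returns a plain list, not the table's actual binary format:

```python
def delta_encode(points):
    # Sort the point numbers, then store each as the difference from the
    # previous one; prefix the run with its length minus one.
    points = sorted(points)
    deltas = []
    last = 0
    for p in points:
        deltas.append(p - last)
        last = p
    return [len(points) - 1] + deltas

# Matches the worked example in the compilePoints comment.
print(delta_encode({17, 18, 19, 20, 21, 22, 23}))
# -> [6, 17, 1, 1, 1, 1, 1, 1]
```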
rsignell-usgs/ocean_map | code/west_coast/surf_vel.py | Python | mit | 2,525 | 0.047921
"""
Created on Wed Apr 18 16:02:24 2012
@author: rsignell
"""
import netCDF4
import numpy as np
import datetime
import scipy.interpolate
def surf_vel(x,y,url,date_mid=datetime.datetime.utcnow(),uvar='u',vvar='v',isurf_layer=0,lonvar='lon',latvar='lat',
tvar='time',hours_ave=24,lon360=False,ugrid=False,lonlat_sub=1,time_sub=1):
nc=netCDF4.Dataset(url)
lon = nc.variables[lonvar][:]-360.*lon360
lat = nc.variables[latvar][:]
if ugrid:
lon2d=lon
lat2d=lat
elif lon.ndim==1:
# ai and aj are logical arrays, True in subset region
igood = np.where((lon>=x.min()) & (lon<=x.max()))
jgood = np.where((lat>=y.min()) & (lat<=y.max()))
bi=np.arange(igood[0].min(),igood[0].max()+1,lonlat_sub)
bj=np.arange(jgood[0].min(),jgood[0].max()+1,lonlat_sub)
[lon2d,lat2d]=np.meshgrid(lon[bi],lat[bj])
elif lon.ndim==2:
igood=np.where(((lon>=x.min())&(lon<=x.max())) & ((lat>=y.min())&(lat<=y.max())))
bj=np.arange(igood[0].min(),igood[0].max()+1,lonlat_sub)
bi=np.arange(igood[1].min(),igood[1].max()+1,lonlat_sub)
lon2d=nc.variables[lonvar][bj,bi]
lat2d=nc.variables[latvar][bj,bi]
else:
print('uh oh')
#desired_stop_date=datetime.datetime(2011,9,9,17,00) # specific time (UTC)
desired_stop_date=date_mid+datetime.timedelta(0,3600.*hours_ave/2.)
istop = netCDF4.date2index(desired_stop_date,nc.variables[tvar],select='nearest')
actual_stop_date=netCDF4.num2date(nc.variables[tvar][istop],nc.variables[tvar].units)
start_date=actual_stop_date-datetime.timedelta(0,3600.*hours_ave)
istart = netCDF4.date2index(start_date,nc.variables[tvar],select='nearest')
if ugrid:
u1=np.mean(nc.variables[uvar][istart:istop:time_sub,isurf_layer,:],axis=0)
v1=np.mean(nc.variables[vvar][istart:istop:time_sub,isurf_layer,:],axis=0)
else:
print('reading u...')
u1=np.mean(nc.variables[uvar][istart:istop:time_sub,isurf_layer,bj,bi],axis=0)
print('reading v...')
v1=np.mean(nc.variables[vvar][istart:istop:time_sub,isurf_layer,bj,bi],axis=0)
xx2,yy2=np.meshgrid(x,y)
ui=scipy.interpolate.griddata((lon2d.flatten(),lat2d.flatten()),u1.flatten(),(xx2,yy2),method='linear',fill_value=0.0)
vi=scipy.interpolate.griddata((lon2d.flatten(),lat2d.flatten()),v1.flatten(),(xx2,yy2),method='linear',fill_value=0.0)
ui[np.isnan(ui)]=0.0
vi[np.isnan(vi)]=0.0
return ui,vi
yusuf-musleh/Expense-Tracker | expense_tracker/expense_tracker/settings.py | Python | mit | 3,157 | 0.001267
"""
Django settings for expense_tracker project.
Generated by 'django-admin startproject' using Django 1.10.2.
For more information on this file, see
https://docs.djangoproject.com/en/1.10/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.10/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '%wm#(m4jd8f9iipb)d6@nr#_fr@n8vnsur96#xxs$!0m627ewe'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = [
'tracker.apps.TrackerConfig',
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'expense_tracker.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'expense_tracker.wsgi.application'
# Database
# https://docs.djangoproject.com/en/1.10/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
# Password validation
# https://docs.djangoproject.com/en/1.10/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/1.10/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.10/howto/static-files/
STATIC_URL = '/static/'
mattmilten/PySCIPOpt | tests/test_branch_probing_lp.py | Python | mit | 2,957 | 0.005749
from pyscipopt import Model, Branchrule, SCIP_RESULT, quicksum
class MyBranching(Branchrule):
def __init__(self, model, cont):
self.model = model
self.cont = cont
self.count = 0
self.was_called_val = False
self.was_called_int = False
def branchexeclp(self, allowaddcons):
self.count += 1
if self.count >= 2:
return {"result": SCIP_RESULT.DIDNOTRUN}
assert allowaddcons
assert not self.model.inRepropagation()
assert not self.model.inProbing()
self.model.startProbing()
assert not self.model.isObjChangedProbing()
self.model.fixVarProbing(self.cont, 2.0)
self.model.constructLP()
self.model.solveProbingLP()
self.model.getLPObjVal()
self.model.endProbing()
self.integral = self.model.getLPBranchCands()[0][0]
if self.count == 1:
down, eq, up = self.model.branchVarVal(self.cont, 1.3)
self.model.chgVarLbNode(down, self.cont, -1.5)
self.model.chgVarUbNode(up, self.cont, 3.0)
self.was_called_val = True
down2, eq2, up2 = self.model.branchVar(self.integral)
self.was_called_int = True
self.model.createChild(6, 7)
return {"result": SCIP_RESULT.BRANCHED}
m = Model()
m.setIntParam("presolving/maxrounds", 0)
#m.setLongintParam("lp/rootiterlim", 3)
m.setRealParam("limits/time", 60)
x0 = m.addVar(lb=-2, ub=4)
r1 = m.addVar()
r2 = m.addVar()
y0 = m.addVar(lb=3)
t = m.addVar(lb=None)
l = m.addVar(vtype="I", lb=-9, ub=18)
u = m.addVar(vtype="I", lb=-3, ub=99)
more_vars = []
for i in range(1000):
more_vars.append(m.addVar(vtype="I", lb= -12, ub=40))
m.addCons(quicksum(v for v in more_vars) <= (40 - i) * quicksum(v for v in more_vars[::2]))
for i in range(1000):
more_vars.append(m.addVar(vtype="I", lb= -52, ub=10))
m.addCons(quicksum(v for v in more_vars[50::2]) <= (40 - i) * quicksum(v for v in more_vars[405::2]))
m.addCons(r1 >= x0)
m.addCons(r2 >= -x0)
m.addCons(y0 == r1 +r2)
#m.addCons(t * l + l * u >= 4)
m.addCons(t + l + 7* u <= 300)
m.addCons(t >= quicksum(v for v in more_vars[::3]) - 10 * more_vars[5] + 5* more_vars[9])
m.addCons(more_vars[3] >= l + 2)
m.addCons(7 <= quicksum(v for v in more_vars[::4]) - x0)
m.addCons(quicksum(v for v in more_vars[::2]) + l <= quicksum(v for v in more_vars[::4]))
m.setObjective(t - quicksum(j*v for j, v in enumerate(more_vars[20:-40])))
#m.addCons(t >= r1 * (r1 - x0) + r2 * (r2 + x0))
my_branchrule = MyBranching(m, x0)
m.includeBranchrule(my_branchrule, "test branch", "test branching and probing and lp functions",
priority=10000000, maxdepth=3, maxbounddist=1)
m.optimize()
print("x0", m.getVal(x0))
print("r1", m.getVal(r1))
print("r2", m.getVal(r2))
print("y0", m.getVal(y0))
print("t", m.getVal(t))
assert my_branchrule.was_called_val
assert my_branchrule.was_called_int
adamBrinek/tuned | tuned/tests/profiles/test_loader.py | Python | gpl-2.0 | 3,581 | 0.023736
import unittest
import tempfile
import shutil
import os.path
import tuned.profiles.exceptions
from tuned.profiles.loader import Loader
from flexmock import flexmock
class MockProfile(object):
def __init__(self, name, config):
self.name = name
self.options = {}
self.units = {}
self.test_config = config
class MockProfileFactory(object):
def create(self, name, config):
return MockProfile(name, config)
class MockProfileMerger(object):
def merge(self, profiles):
new = MockProfile("merged", {})
new.test_merged = profiles
return new
class LoaderTestCase(unittest.TestCase):
def setUp(self):
self.factory = MockProfileFactory()
self.merger = MockProfileMerger()
self.loader = Loader(self._tmp_load_dirs, self.factory, self.merger)
@classmethod
def setUpClass(cls):
tmpdir1 = tempfile.mkdtemp()
tmpdir2 = tempfile.mkdtemp()
cls._tmp_load_dirs = [tmpdir1, tmpdir2]
cls._create_profile(tmpdir1, "default", "[main]\n\n[network]\ntype=net\ndevices=em*\n\n[disk]\nenabled=false\n")
cls._create_profile(tmpdir1, "invalid", "INVALID")
cls._create_profile(tmpdir1, "expand", "[expand]\ntype=script\nscript=runme.sh\n")
cls._create_profile(tmpdir2, "empty", "")
cls._create_profile(tmpdir1, "custom", "[custom]\ntype=one\n")
cls._create_profile(tmpdir2, "custom", "[custom]\ntype=two\n")
@classmethod
def tearDownClass(cls):
for tmp_dir in cls._tmp_load_dirs:
shutil.rmtree(tmp_dir, True)
@classmethod
def _create_profile(cls, load_dir, profile_name, tuned_conf_content):
profile_dir = os.path.join(load_dir, profile_name)
conf_name = os.path.join(profile_dir, "tuned.conf")
os.mkdir(profile_dir)
with open(conf_name, "w") as conf_file:
conf_file.write(tuned_conf_content)
def test_init(self):
Loader([], None, None)
Loader(["/tmp"], None, None)
Loader(["/foo", "/bar"], None, None)
def test_init_wrong_type(self):
with self.assertRaises(TypeError):
Loader(False, self.factory, self.merger)
def test_load(self):
profile = self.loader.load("default")
self.assertIn("main", profile.test_config)
self.assertIn("disk", profile.test_config)
self.assertEqual(profile.test_config["network"]["devices"], "em*")
def test_load_empty(self):
profile = self.loader.load("empty")
self.assertDictEqual(profile.test_config, {})
def test_load_invalid(self):
with self.assertRaises(tuned.profiles.exceptions.InvalidProfileException):
invalid_config = self.loader.load("invalid")
def test_load_nonexistent(self):
with self.assertRaises(tuned.profiles.exceptions.InvalidProfileException):
config = self.loader.load("nonexistent")
def test_load_order(self):
profile = self.loader.load("custom")
self.assertEqual(profile.test_config["custom"]["type"], "two")
def test_default_load(self):
profile = self.loader.load("empty")
self.assertIs(type(profile), MockProfile)
def test_script_expand_names(self):
profile = self.loader.load("expand")
expected_name = os.path.join(self._tmp_load_dirs[0], "expand", "runme.sh")
self.assertEqual(profile.test_config["expand"]["script"], expected_name)
def test_load_multiple_profiles(self):
profile = self.loader.load(["default", "expand"])
self.assertEqual(len(profile.test_merged), 2)
def test_include_directive(self):
profile1 = MockProfile("first", {})
profile1.options = {"include": "default"}
profile2 = MockProfile("second", {})
flexmock(self.factory).should_receive("create").and_return(profile1).and_return(profile2).twice()
profile = self.loader.load("empty")
self.assertEqual(len(profile.test_merged), 2)
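`test_load_order` above asserts that when the same profile name exists in several load directories, the later directory wins ("custom" resolves to `type=two` from the second tmpdir). That resolution rule can be sketched on its own; `pick_profile` is a hypothetical helper (not part of tuned) that models each directory as a dict:

```python
def pick_profile(load_dirs, name):
    # Later load directories override earlier ones, which is the behavior
    # test_load_order asserts for the duplicated "custom" profile.
    found = None
    for directory in load_dirs:  # each directory modeled as a dict here
        if name in directory:
            found = directory[name]
    return found
```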
dirkmueller/kiwi | kiwi/filesystem/ext4.py | Python | gpl-3.0 | 1,355 | 0
# Copyright (c) 2015 SUSE Linux GmbH. All rights reserved.
#
# This file is part of kiwi.
#
# kiwi is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# kiwi is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with kiwi. If not, see <http://www.gnu.org/licenses/>
# project
from kiwi.filesystem.base import FileSystemBase
from kiwi.command import Command
class FileSystemExt4(FileSystemBase):
"""
**Implements creation of ext4 filesystem**
"""
def create_on_device(self, label: str = None):
"""
Create ext4 filesystem on block device
:param string label: label name
"""
device = self.device_provider.get_device()
if label:
self.custom_args['create_options'].append('-L')
self.custom_args['create_options'].append(label)
Command.run(
['mkfs.ext4'] + self.custom_args['create_options'] + [device]
)
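The command assembled by `create_on_device` can be previewed without touching a device. `build_mkfs_args` is a hypothetical helper (not part of kiwi) that mirrors the option handling above:

```python
def build_mkfs_args(create_options, device, label=None):
    # Reproduce the argument order used by create_on_device:
    #   mkfs.ext4 <custom create options> [-L <label>] <device>
    opts = list(create_options)
    if label:
        opts += ["-L", label]
    return ["mkfs.ext4"] + opts + [device]
```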
wdv4758h/arandr | screenlayout/widget.py | Python | gpl-3.0 | 16,992 | 0.006062
# ARandR -- Another XRandR GUI
# Copyright (C) 2008 -- 2011 chrysn <chrysn@fsfe.org>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import division
import os
import stat
import pango
import pangocairo
import gobject, gtk
from .auxiliary import Position, Size, NORMAL, ROTATIONS, InadequateConfiguration
from .xrandr import XRandR, Feature
from .snap import Snap
import gettext
gettext.install('arandr')
class ARandRWidget(gtk.DrawingArea):
__gsignals__ = {
'expose-event':'override', # FIXME: still needed?
'changed':(gobject.SIGNAL_RUN_LAST, gobject.TYPE_NONE, ()),
}
def __init__(self, factor=8, display=None, force_version=False):
super(ARandRWidget, self).__init__()
self._factor = factor
self.set_size_request(1024//self.factor, 1024//self.factor) # best guess for now
self.connect('button-press-event', self.click)
self.set_events(gtk.gdk.BUTTON_PRESS_MASK)
self.setup_draganddrop()
self._xrandr = XRandR(display=display, force_version=force_version)
#################### widget features ####################
def _set_factor(self, f):
self._factor = f
self._update_size_request()
self._force_repaint()
factor = property(lambda self: self._factor, _set_factor)
def abort_if_unsafe(self):
if not len([x for x in self._xrandr.configuration.outputs.values() if x.active]):
d = gtk.MessageDialog(None, gtk.DIALOG_MODAL, gtk.MESSAGE_WARNING, gtk.BUTTONS_YES_NO, _("Your configuration does not include an active monitor. Do you want to apply the configuration?"))
result = d.run()
d.destroy()
if result == gtk.RESPONSE_YES:
return False
else:
return True
return False
def error_message(self, message):
d = gtk.MessageDialog(None, gtk.DIALOG_MODAL, gtk.MESSAGE_ERROR, gtk.BUTTONS_CLOSE, message)
d.run()
d.destroy()
def _update_size_request(self):
max_gapless = sum(max(o.size) if o.active else 0 for o in self._xrandr.configuration.outputs.values()) # this ignores that some outputs might not support rotation, but will always err on the side of caution.
# have some buffer
usable_size = int(max_gapless * 1.1)
# don't request too large a window, but make sure every possible combination fits
xdim = min(self._xrandr.state.virtual.max[0], usable_size)
ydim = min(self._xrandr.state.virtual.max[1], usable_size)
self.set_size_request(xdim//self.factor, ydim//self.factor)
#################### loading ####################
def load_from_file(self, file):
data = open(file).read()
template = self._xrandr.load_from_string(data)
self._xrandr_was_reloaded()
return template
def load_from_x(self):
self._xrandr.load_from_x()
self._xrandr_was_reloaded()
return self._xrandr.DEFAULTTEMPLATE
def _xrandr_was_reloaded(self):
self.sequence = sorted(self._xrandr.outputs)
self._lastclick = (-1,-1)
self._update_size_request()
if self.window:
self._force_repaint()
self.emit('changed')
def save_to_x(self):
self._xrandr.save_to_x()
self.load_from_x()
def save_to_file(self, file, template=None, additional=None):
data = self._xrandr.save_to_shellscript_string(template, additional)
open(file, 'w').write(data)
os.chmod(file, stat.S_IRWXU)
self.load_from_file(file)
#################### doing changes ####################
def _set_something(self, which, on, data):
old = getattr(self._xrandr.configuration.outputs[on], which)
setattr(self._xrandr.configuration.outputs[on], which, data)
try:
self._xrandr.check_configuration()
except InadequateConfiguration:
setattr(self._xrandr.configuration.outputs[on], which, old)
raise
self._force_repaint()
self.emit('changed')
def set_position(self, on, pos):
self._set_something('position', on, pos)
def set_rotation(self, on, rot):
self._set_something('rotation', on, rot)
def set_resolution(self, on, res):
self._set_something('mode', on, res)
def set_primary(self, on, primary):
o = self._xrandr.configuration.outputs[on]
if primary and not o.primary:
for o2 in self._xrandr.outputs:
self._xrandr.configuration.outputs[o2].primary = False
o.primary = True
elif not primary and o.primary:
o.primary = False
else:
return
self._force_repaint()
self.emit('changed')
def set_active(self, on, active):
v = self._xrandr.state.virtual
o = self._xrandr.configuration.outputs[on]
if not active and o.active:
o.active = False
# don't delete: allow user to re-enable without state being lost
if active and not o.active:
if hasattr(o, 'position'):
o.active = True # nothing can go wrong, position already set
else:
pos = Position((0,0))
for m in self._xrandr.state.outputs[on].modes:
# determine first possible mode
if m[0]<=v.max[0] and m[1]<=v.max[1]:
mode = m
break
else:
raise InadequateConfiguration("Smallest mode too large for virtual.")
o.active = True
o.position = pos
o.mode = mode
o.rotation = NORMAL
self._force_repaint()
self.emit('changed')
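When `set_active` enables an output that has no remembered position, it scans the output's mode list and takes the first mode that fits inside the virtual screen, raising `InadequateConfiguration` if none does. That selection rule, pulled out as a hypothetical standalone helper (using `ValueError` in place of ARandR's exception type):

```python
def first_fitting_mode(modes, virtual_max):
    # Return the first (width, height) mode that fits within the virtual
    # screen bounds, mirroring the for/else loop in set_active above.
    for mode in modes:
        if mode[0] <= virtual_max[0] and mode[1] <= virtual_max[1]:
            return mode
    raise ValueError("Smallest mode too large for virtual.")
```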
#################### painting ####################
def do_expose_event(self, event):
cr = pangocairo.CairoContext(self.window.cairo_create())
cr.rectangle(event.area.x, event.area.y, event.area.width, event.area.height)
cr.clip()
# clear
cr.set_source_rgb(0,0,0)
cr.rectangle(0,0,*self.window.get_size())
cr.fill()
cr.save()
cr.scale(1/self.factor, 1/self.factor)
cr.set_line_width(self.factor*1.5)
self._draw(self._xrandr, cr)
def _draw(self, xrandr, cr):
cfg = xrandr.configuration
state = xrandr.state
cr.set_source_rgb(0.25,0.25,0.25)
cr.rectangle(0,0,*state.virtual.max)
cr.fill()
cr.set_source_rgb(0.5,0.5,0.5)
cr.rectangle(0,0,*cfg.virtual)
cr.fill()
for on in self.sequence:
o = cfg.outputs[on]
if not o.active: continue
rect = (o.tentative_position if hasattr(o, 'tentative_position') else o.position) + tuple(o.size)
center = rect[0]+rect[2]/2, rect[1]+rect[3]/2
# paint rectangle
cr.set_source_rgba(1,1,1,0.7)
cr.rectangle(*rect)
cr.fill()
cr.set_source_rgb(0,0,0)
cr.rectangle(*rect)
cr.stroke()
# set up for text
cr.save()
textwidth = rect[3 if o.rotation.is_odd else 2]
widthperchar = textwidth/len(on)
textheight = int(widthperchar * 0.8) # i think this looks nice and won't overflow even for wide fonts
newdescr = pango.FontDescription("sans")
newdescr.set_size(textheight * pango.SCALE)
# create text
layout = cr.crea
Azure/azure-sdk-for-python | sdk/signalr/azure-mgmt-signalr/azure/mgmt/signalr/models/_signal_rmanagement_client_enums.py | Python | mit | 5,623 | 0.005869
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from enum import Enum, EnumMeta
from six import with_metaclass
class _CaseInsensitiveEnumMeta(EnumMeta):
def __getitem__(self, name):
return super().__getitem__(name.upper())
def __getattr__(cls, name):
"""Return the enum member matching `name`
We use __getattr__ instead of descriptors or inserting into the enum
class' __dict__ in order to support `name` and `value` being both
properties for enum members (which live in the class' __dict__) and
enum members themselves.
""" |
try:
return cls._member_map_[name.upper()]
except KeyError:
raise AttributeError(name)
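A minimal, self-contained demonstration of what this metaclass buys: both item and attribute lookups on the enum become case-insensitive. The `Color` enum below is hypothetical (it is not part of the SDK), and `_CIMeta` trims the metaclass above to its essentials without the `six.with_metaclass` shim:

```python
from enum import Enum, EnumMeta

class _CIMeta(EnumMeta):
    # Same idea as _CaseInsensitiveEnumMeta above, trimmed to essentials.
    def __getitem__(cls, name):
        return super().__getitem__(name.upper())

    def __getattr__(cls, name):
        try:
            return cls._member_map_[name.upper()]
        except KeyError:
            raise AttributeError(name)

class Color(str, Enum, metaclass=_CIMeta):
    RED = "Red"
    BLUE = "Blue"
```

With this in place, `Color["red"]`, `Color["RED"]`, and even `Color.blue` all resolve to the canonical members.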
class ACLAction(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Default action when no other rule matches
"""
ALLOW = "Allow"
DENY = "Deny"
class CreatedByType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""The type of identity that created the resource.
"""
USER = "User"
APPLICATION = "Application"
MANAGED_IDENTITY = "ManagedIdentity"
KEY = "Key"
class FeatureFlags(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""FeatureFlags is the supported features of Azure SignalR service.
* ServiceMode: Flag for backend server for SignalR service. Values allowed: "Default": have
your own backend server; "Serverless": your application doesn't have a backend server;
"Classic": for backward compatibility. Support both Default and Serverless mode but not
recommended; "PredefinedOnly": for future use.
* EnableConnectivityLogs: "true"/"false", to enable/disable the connectivity log category
respectively.
* EnableMessagingLogs: "true"/"false", to enable/disable the messaging log category
respectively.
* EnableLiveTrace: Live Trace allows you to know what's happening inside Azure SignalR service,
it will give you live traces in real time, it will be helpful when you developing your own
Azure SignalR based web application or self-troubleshooting some issues. Please note that live
traces are counted as outbound messages that will be charged. Values allowed: "true"/"false",
to enable/disable live trace feature.
"""
SERVICE_MODE = "ServiceMode"
ENABLE_CONNECTIVITY_LOGS = "EnableConnectivityLogs"
ENABLE_MESSAGING_LOGS = "EnableMessagingLogs"
ENABLE_LIVE_TRACE = "EnableLiveTrace"
class KeyType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""The keyType to regenerate. Must be either 'primary' or 'secondary'(case-insensitive).
"""
PRIMARY = "Primary"
SECONDARY = "Secondary"
SALT = "Salt"
class ManagedIdentityType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Represent the identity type: systemAssigned, userAssigned, None
"""
NONE = "None"
SYSTEM_ASSIGNED = "SystemAssigned"
USER_ASSIGNED = "UserAssigned"
class PrivateLinkServiceConnectionStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Indicates whether the connection has been Approved/Rejected/Removed by the owner of the
service.
"""
PENDING = "Pending"
APPROVED = "Approved"
REJECTED = "Rejected"
DISCONNECTED = "Disconnected"
class ProvisioningState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Provisioning state of the resource.
"""
UNKNOWN = "Unknown"
SUCCEEDED = "Succeeded"
FAILED = "Failed"
CANCELED = "Canceled"
RUNNING = "Running"
CREATING = "Creating"
UPDATING = "Updating"
DELETING = "Deleting"
MOVING = "Moving"
class ScaleType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""The scale type applicable to the sku.
"""
NONE = "None"
MANUAL = "Manual"
AUTOMATIC = "Automatic"
class ServiceKind(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""The kind of the service - e.g. "SignalR" for "Microsoft.SignalRService/SignalR"
"""
SIGNAL_R = "SignalR"
RAW_WEB_SOCKETS = "RawWebSockets"
class SharedPrivateLinkResourceStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Status of the shared private link resource
"""
PENDING = "Pending"
APPROVED = "Approved"
REJECTED = "Rejected"
DISCONNECTED = "Disconnected"
TIMEOUT = "Timeout"
class SignalRRequestType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Allowed request types. The value can be one or more of: ClientConnection, ServerConnection,
RESTAPI.
"""
CLIENT_CONNECTION = "ClientConnection"
SERVER_CONNECTION = "ServerConnection"
RESTAPI = "RESTAPI"
TRACE = "Trace"
class SignalRSkuTier(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Optional tier of this particular SKU. 'Standard' or 'Free'.
``Basic`` is deprecated, use ``Standard`` instead.
"""
FREE = "Free"
BASIC = "Basic"
STANDARD = "Standard"
PREMIUM = "Premium"
class UpstreamAuthType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
"""Gets or sets the type of auth. None or ManagedIdentity is supported now.
"""
NONE = "None"
MANAGED_IDENTITY = "ManagedIdentity"
davidcox/glumpy | glumpy/image.py | Python | bsd-3-clause | 8,858 | 0.008806
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#-----------------------------------------------------------------------------
# Copyright (C) 2009-2010 Nicolas P. Rougier
#
# Distributed under the terms of the BSD License. The full license is in
# the file COPYING, distributed as part of this software.
#-----------------------------------------------------------------------------
import numpy as np
import OpenGL.GL as gl
import texture, shader, colormap, color
class Image(object):
''' '''
def __init__(self, Z, format=None, cmap=colormap.IceAndFire, vmin=None, vmax=None,
interpolation='nearest', origin='lower', lighted=False,
gridsize=(0.0,0.0,0.0), elevation = 0.0):
''' Creates a texture from numpy array.
Parameters:
-----------
Z : numpy array
Z may be a float32 or uint8 array with following shapes:
* M
* MxN
* MxNx[1,2,3,4]
format: [None | 'A' | 'LA' | 'RGB' | 'RGBA']
Specify the texture format to use. Most of the time it is possible to
find it automatically, but there are a few cases where it is not
possible to decide. For example an array with shape (M,3) can be
considered as 2D alpha texture of size (M,3) or a 1D RGB texture of
size (M,).
interpolation: 'nearest', 'bilinear' or 'bicubic'
Interpolation method.
vmin: scalar
Minimal representable value.
vmax: scalar
Maximal representable value.
origin: 'lower' or 'upper'
Place the [0,0] index of the array in the upper left or lower left
corner.
'''
self._lut = None
self._interpolation = interpolation
self._lighted = lighted
self._gridsize = gridsize
self._elevation = elevation
self._texture = texture.Texture(Z)
self._origin = origin
self._vmin = vmin
self._vmax = vmax
self._data = Z
self.cmap = cmap # This takes care of actual build
self._shader = None
self.build()
def build(self):
''' Build shader '''
interpolation = self._interpolation
gridsize = self._gridsize
elevation = self._elevation
lighted = self._lighted
cmap = self._cmap
self._shader = None
# Source format is RGB or RGBA, no need of a colormap
if self._texture.src_format in [gl.GL_RGB,gl.GL_RGBA]:
if interpolation == 'bicubic':
self._shader = shader.Bicubic(False, lighted=lighted, gridsize=gridsize, elevation=elevation)
elif interpolation == 'bilinear':
self._shader = shader.Bilinear(False, lighted=lighted, gridsize=gridsize, elevation=elevation)
else:
self._shader = None
# Source format is not RGB or RGBA
else:
if cmap:
if interpolation == 'bicubic':
self._shader = shader.Bicubic(True, lighted=lighted, gridsize=gridsize, elevation=elevation)
elif interpolation == 'bilinear':
self._shader = shader.Bilinear(True, lighted=lighted, gridsize=gridsize, elevation=elevation)
else:
self._shader = shader.Nearest(True, lighted=lighted, gridsize=gridsize, elevation=elevation)
else:
if interpolation == 'bicubic':
self._shader = shader.Bicubic(False, lighted=lighted, gridsize=gridsize, elevation=elevation)
elif interpolation == 'bilinear':
self._shader = shader.Bilinear(False, lighted=lighted, gridsize=gridsize, elevation=elevation)
else:
self._shader = None
self.update()
@property
def shape(self):
''' Underlying array shape. '''
return self._data.shape
@property
def data(self):
''' Underlying array '''
return self._data
@property
def texture(self):
''' Underlying texture '''
return self._texture
@property
def shader(self):
''' Currently active shader '''
return self._shader
@property
def format(self):
''' Array representation format (string). '''
format = self._texture.src_format
if format == gl.GL_ALPHA:
return 'A'
elif format == gl.GL_LUMINANCE_ALPHA:
return 'LA'
elif format == gl.GL_RGB:
return 'RGB'
elif format == gl.GL_RGBA:
return 'RGBA'
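The `format` property above maps GL constants back to the string names that the `__init__` docstring uses. The inverse guess, deriving a format name from an array shape, is exactly where the ambiguity the docstring warns about lives. This hypothetical helper (not part of glumpy) sketches that mapping:

```python
def guess_format(shape):
    # 1-D / 2-D arrays read as alpha-only; a trailing axis of length
    # 1/2/3/4 reads as A/LA/RGB/RGBA.  A shape like (M, 3) is genuinely
    # ambiguous (2-D alpha vs. 1-D RGB), which is why Image.__init__
    # accepts an explicit `format` argument to settle it.
    if len(shape) <= 2:
        return "A"
    return {1: "A", 2: "LA", 3: "RGB", 4: "RGBA"}[shape[-1]]
```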
def _get_cmap(self):
return self._cmap
def _set_cmap(self, cmap):
self._cmap = cmap
colors = self.cmap.LUT['rgb'][1:].flatten().view((np.float32,3))
self._lut = texture.Texture(colors)
cmap = property(_get_cmap, _set_cmap,
doc=''' Colormap to be used to represent the array. ''')
def _get_elevation(self):
return self._elevation
def _set_elevation(self, elevation):
# Do we need to re-build shader ?
if not (elevation*self._elevation):
self._elevation = elevation
self.build()
elif self._shader:
self._elevation = elevation
self._shader._elevation = elevation
elevation = property(_get_elevation, _set_elevation,
doc=''' Image elevation. ''')
def _get_origin(self):
return self._origin
def _set_origin(self, origin):
self._origin = origin
origin = property(_get_origin, _set_origin,
doc=''' Place the [0,0] index of the array in the upper
left or lower left corner. ''')
def _get_lighted(self):
return self._lighted
def _set_lighted(self, lighted):
self._lighted = lighted
self.build()
lighted = property(_get_lighted, _set_lighted,
doc=''' Indicate whether image is ligthed. ''')
def _get_interpolation(self):
return self._interpolation
def _set_interpolation(self, interpolation):
self._interpolation = interpolation
self.build()
interpolation = property(_get_interpolation, _set_interpolation,
doc=''' Interpolation method. ''')
def _get_vmin(self):
return self._vmin
def _set_vmin(self, vmin):
self._vmin = vmin
vmin = property(_get_vmin, _set_vmin,
doc=''' Minimal representable value. ''')
def _get_vmax(self):
return self._vmax
def _set_vmax(self, vmax):
self._vmax = vmax
vmax = property(_get_vmax, _set_vmax,
doc=''' Maximal representable value. ''')
def _get_gridsize(self):
return self._gridsize
def _get_gridsize_x(self):
return self._gridsize[0]
def _get_gridsize_y(self):
return self._gridsize[1]
def _get_gridsize_z(self):
return self._gridsize[2]
def _set_gridsize(self, gridsize):
# Do we need to re-build shader ?
x,y,z = gridsize
x,y,z = max(0,x),max(0,y),max(0,z)
_x,_y,_z = self._gridsize
self._gridsize = x,y,z
if not (x+y+z)*(_x+_y+_z) and (x+y+z)+(_x+_y+_z):
self.build()
elif self._shader:
self._shader._gridsize = x,y,z
def _set_gridsize_x(self, x):
self.gridsize = (max(0,x), self._gridsize[1], self._gridsize[2])
def _set_gridsize_y(self, y):
self.gridsize = (self._gridsize[0], max(0,y), self._gridsize[2])
def _set_gridsize_z(self, z):
self.gridsize = (self._gridsize[0], self._gridsize[1], max(0,z))
gridsize = property(_get_gridsize, _set_gridsize,
doc=''' Image grid (x,y,z). ''')
def update(self):
''' Data update. '''
if self.vmin is None:
vmin = self.data.min()
else:
vmin = self.vmin
if self.vmax is None:
vmax = self._data.max()
else:
vmax = self.vmax
if vmin == vmax:
vmin, vmax = 0, 1
larsmans/cython | pyximport/pyxbuild.py | Python | apache-2.0 | 5,193 | 0.006162
"""Build a Pyrex file from .pyx source to .so loadable module using
the installed distutils infrastructure. Call:
out_fname = pyx_to_dll("foo.pyx")
"""
import os
import sys
from distutils.dist import Distribution
from distutils.errors import DistutilsArgError, DistutilsError, CCompilerError
from distutils.extension import Extension
from distutils.util import grok_environment_error
try:
from Cython.Distutils import build_ext
HAS_CYTHON = True
except ImportError:
HAS_CYTHON = False
DEBUG = 0
_reloads={}
def pyx_to_dll(filename, ext = None, force_rebuild = 0,
build_in_temp=False, pyxbuild_dir=None, setup_args={},
reload_support=False, inplace=False):
"""Compile a PYX file to a DLL and return the name of the generated .so
or .dll."""
assert os.path.exists(filename), "Could not find %s" % os.path.abspath(filename)
path, name = os.path.split(os.path.abspath(filename))
if not ext:
modname, extension = os.path.splitext(name)
assert extension in (".pyx", ".py"), extension
if not HAS_CYTHON:
filename = filename[:-len(extension)] + '.c'
ext = Extension(name=modname, sources=[filename])
if not pyxbuild_dir:
pyxbuild_dir = os.path.join(path, "_pyxbld")
package_base_dir = path
for package_name in ext.name.split('.')[-2::-1]:
package_base_dir, pname = os.path.split(package_base_dir)
if pname != package_name:
# something is wrong - package path doesn't match file path
package_base_dir = None
break
script_args=setup_args.get("script_args",[])
if DEBUG or "--verbose" in script_args:
quiet = "--verbose"
else:
quiet = "--quiet"
args = [quiet, "build_ext"]
if force_rebuild:
args.append("--force")
if inplace and package_base_dir:
args.extend(['--build-lib', package_base_dir])
if ext.name == '__init__' or ext.name.endswith('.__init__'):
# package => provide __path__ early
if not hasattr(ext, 'cython_directives'):
ext.cython_directives = {'set_initial_path' : 'SOURCEFILE'}
elif 'set_initial_path' not in ext.cython_directives:
ext.cython_directives['set_initial_path'] = 'SOURCEFILE'
if HAS_CYTHON and build_in_temp:
args.append("--pyrex-c-in-temp")
sargs = setup_args.copy()
sargs.update(
{"script_name": None,
"script_args": args + script_args} )
dist = Distribution(sargs)
if not dist.ext_modules:
dist.ext_modules = []
dist.ext_modules.append(ext)
if HAS_CYTHON:
dist.cmdclass = {'build_ext': build_ext}
build = dist.get_command_obj('build')
build.build_base = pyxbuild_dir
# read the distutils config files once, but skip the project's own setup.cfg
cfgfiles = dist.find_config_files()
try: cfgfiles.remove('setup.cfg')
except ValueError: pass
dist.parse_config_files(cfgfiles)
try:
ok = dist.parse_command_line()
except DistutilsArgError:
raise
if DEBUG:
print("options (after parsing command line):")
dist.dump_option_dicts()
assert ok
try:
obj_build_ext = dist.get_command_obj("build_ext")
dist.run_commands()
so_path = obj_build_ext.get_outputs()[0]
if obj_build_ext.inplace:
# Python distutils get_outputs() returns a wrong so_path
# when --inplace ; see http://bugs.python.org/issue5977
# workaround:
so_path = os.path.join(os.path.dirname(filename),
os.path.basename(so_path))
if reload_support:
org_path = so_path
timestamp = os.path.getmtime(org_path)
global _reloads
last_timestamp, last_path, count = _reloads.get(org_path, (None, None, 0))
if last_timestamp == timestamp:
so_path = last_path
else:
basename = os.path.basename(org_path)
while count < 100:
count += 1
r_path = os.path.join(obj_build_ext.build_lib,
basename + '.reload%s'%count)
try:
import shutil  # late import: shutil is only needed for reload_support (a debugging feature)
shutil.copy2(org_path, r_path)
so_path = r_path
except IOError:
continue
break
else:
# used up all 100 slots
raise ImportError("reload count for %s reached maximum"%org_path)
_reloads[org_path]=(timestamp, so_path, count)
return so_path
except KeyboardInterrupt:
sys.exit(1)
except (IOError, os.error):
exc = sys.exc_info()[1]
error = grok_environment_error(exc)
if DEBUG:
sys.stderr.write(error + "\n")
raise
if __name__=="__main__":
pyx_to_dll("dummy.pyx")
import test
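The `.reloadN` copy loop near the end of `pyx_to_dll` exists because an extension module that is already imported cannot safely be overwritten in place, so each rebuild is copied to a fresh slot name until a limit is hit. The slot-naming rule, isolated as a hypothetical helper (not part of pyximport):

```python
import os

def next_reload_path(build_lib, so_basename, count, limit=100):
    # Each rebuild gets a fresh ".reloadN" file name; once `limit` slots
    # are used up, the loop above gives up with ImportError.
    if count >= limit:
        raise ImportError("reload count for %s reached maximum" % so_basename)
    count += 1
    return os.path.join(build_lib, so_basename + ".reload%d" % count), count
```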
openego/oeplatform | modelview/migrations/0026_auto_20160315_1447.py | Python | agpl-3.0 | 479 | 0
# -*- coding: utf-8 -*-
# Generated by Django 1.9 on 2016-03-15 13:47
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("modelview", "0025_auto_20160315_1446")]
operations = [
migrations.AlterField(
model_name="energymodel",
name="mathematical_objective_other_text",
field=models.CharField(blank=True, max_length=200),
)
]
nanshe-org/nanshe_workflow | nanshe_workflow/data.py | Python | apache-2.0 | 24,568 | 0.000122
__author__ = "John Kirkham <kirkhamj@janelia.hhmi.org>"
__date__ = "$Nov 05, 2015 13:54$"
import collections
from contextlib import contextmanager
import errno
import itertools
import glob
import numbers
import os
import shutil
import tempfile
import uuid
import zipfile
import scandir
import h5py
import numpy
import tifffile
import zarr
import dask
import dask.array
import dask.delayed
import dask.distributed
import kenjutsu.format
from kenjutsu.measure import len_slices
from kenjutsu.blocks import num_blocks, split_blocks
from builtins import (
map as imap,
range as irange,
zip as izip,
)
from past.builtins import unicode
from nanshe_workflow.ipy import display, FloatProgress
def io_remove(name):
if not os.path.exists(name):
return
elif os.path.isfile(name):
os.remove(name)
elif os.path.isdir(name):
shutil.rmtree(name)
else:
raise ValueError("Unable to remove path, '%s'." % name)
@dask.delayed
def dask_rm_file(fname):
try:
os.remove(fname)
except IOError as e:
if e.errno != errno.ENOENT:
raise
return fname
@dask.delayed
def dask_rm_dir(dname, *deps):
try:
shutil.rmtree(dname)
except IOError as e:
if e.errno != errno.ENOENT:
raise
return dname
def dask_rm_tree(dirname):
dirname = os.path.abspath(dirname)
def _int_dask_rm_tree(dname):
deps = []
for pobj in scandir.scandir(dname):
if pobj.is_file(follow_symlinks=False):
deps.append(dask_rm_file(pobj.path))
if pobj.is_dir(follow_symlinks=False):
deps.append(dask_rm_dir(
pobj.path, *_int_dask_rm_tree(pobj.path)
))
return deps
return dask_rm_dir(dirname, *_int_dask_rm_tree(dirname))
def dask_io_remove(name, executor=None):
name = os.path.abspath(name)
tmp_dir = "tmp_nanshe_workflow_{0}_".format(
os.path.splitext(os.path.basename(name))[0]
)
tmp_dir = tempfile.mkdtemp(prefix=tmp_dir)
if os.path.exists(name):
os.rename(name, os.path.join(tmp_dir, name))
rm_task = dask_rm_tree(tmp_dir)
rm_task = dask.delayed(io_remove)(rm_task)
if executor is None:
return rm_task
f = executor.compute(rm_task)
dask.distributed.fire_and_forget(f)
return f
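`dask_io_remove` uses a rename-then-delete trick: the path is moved into a throwaway staging directory so its name is freed immediately, while the actual deletion can happen later in the background. The same idea can be sketched with the standard library alone; `safe_remove` is a hypothetical name (not part of this module) and deletes synchronously for simplicity:

```python
import os
import shutil
import tempfile

def safe_remove(path):
    # Rename-first removal: stage the path inside a throwaway directory
    # (created alongside it, so os.rename stays on one filesystem), which
    # frees the original name immediately; then delete the staging area.
    staging = tempfile.mkdtemp(prefix=".rm_", dir=os.path.dirname(os.path.abspath(path)))
    if os.path.exists(path):
        os.rename(path, os.path.join(staging, os.path.basename(path)))
    shutil.rmtree(staging)
```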
def zip_dir(dirname, compression=zipfile.ZIP_STORED, allowZip64=True):
dirname = os.path.abspath(dirname)
zipname = dirname + os.extsep + "zip"
if os.path.exists(zipname):
os.remove(zipname)
num_files = sum([len(fns) for _1, _2, fns in scandir.walk(dirname)])
progress_bar = FloatProgress(min=0.0, max=float(num_files))
display(progress_bar)
with zipfile.ZipFile(zipname,
mode="w",
compression=compression,
allowZip64=allowZip64) as fh:
for path, dnames, fnames in scandir.walk(dirname):
fnames = sorted(fnames)
for each_fname in fnames:
each_fname = os.path.join(path, each_fname)
each_fname_rel = os.path.relpath(each_fname, dirname)
fh.write(each_fname, each_fname_rel)
progress_bar.value += 1
return zipname
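Stripped of the notebook progress bar, the core of `zip_dir` is plain `zipfile` plus a directory walk, archiving each file under its path relative to the root. This standalone sketch (hypothetical `zip_dir_simple`, using `os.walk` instead of `scandir.walk`) shows that core:

```python
import os
import zipfile

def zip_dir_simple(dirname):
    # Archive every file under dirname, storing paths relative to it,
    # mirroring zip_dir above minus the progress reporting.
    zipname = dirname + os.extsep + "zip"
    with zipfile.ZipFile(zipname, "w", zipfile.ZIP_STORED, allowZip64=True) as fh:
        for path, _dirs, fnames in os.walk(dirname):
            for fname in sorted(fnames):
                full = os.path.join(path, fname)
                fh.write(full, os.path.relpath(full, dirname))
    return zipname
```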
def concat_dask(dask_arr):
n_blocks = dask_arr.shape
result = dask_arr.copy()
for i in irange(-1, -1 - len(n_blocks), -1):
result2 = result[..., 0]
for j in itertools.product(*[
irange(e) for e in n_blocks[:i]
]):
result2[j] = dask.array.concatenate(
result[j].tolist(),
axis=i
)
result = result2
result = result[()]
return result
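`concat_dask` walks the block axes from last to first, concatenating the blocks along each axis in turn until a single array remains. A pure-Python analogue for a 2-D grid of list-blocks (hypothetical `concat_blocks`, no dask required) makes the pattern concrete:

```python
def concat_blocks(blocks):
    # blocks is a grid: a list of block-rows, where each block is a 2-D
    # list.  Join along the last axis first (rows within a block-row),
    # then stack the block-rows, like concat_dask does for dask arrays.
    out = []
    for block_row in blocks:
        for r in range(len(block_row[0])):
            out.append([v for block in block_row for v in block[r]])
    return out
```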
def dask_load_hdf5(fn, dn, chunks=None):
with h5py.File(fn) as fh:
shape = fh[dn].shape
dtype = fh[dn].dtype
if chunks is None:
chunks = fh[dn].chunks
else:
chunks = tuple(
es if ec == -1 else ec for es, ec in zip(shape, chunks)
)
def _read_chunk(fn, dn, idx):
with h5py.File(fn) as fh:
return fh[dn][idx]
a = numpy.empty(
num_blocks(shape, chunks),
dtype=object
)
for i, s in izip(*split_blocks(shape, chunks, index=True)[:2]):
a[i] = dask.array.from_delayed(
dask.delayed(_read_chunk)(fn, dn, s),
len_slices(s),
dtype
)
a = concat_dask(a)
return a
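Inside `dask_load_hdf5`, a user-supplied chunk spec is normalized so that `-1` means "use the full extent along that axis". That one-liner, isolated as a hypothetical helper:

```python
def normalize_chunks(shape, chunks):
    # A chunk size of -1 is replaced by the full axis length, matching
    # the normalization in dask_load_hdf5 above.
    return tuple(s if c == -1 else c for s, c in zip(shape, chunks))
```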
def dask_store_zarr(filename, datasetnames, datasets, executor):
if len(datasetnames) != len(datasets):
raise ValueError(
"Need `datasetnames` and `datasets` to have the same length."
)
with open_zarr(filename, "w") as fh:
status = None
dask_arrays = []
zarr_arrays = []
for each_datasetname, each_dataset in izip(datasetnames, datasets):
each_dask_array = dask.array.asarray(each_dataset)
each_zarr_array = fh.create_dataset(
each_datasetname,
shape=each_dask_array.shape,
dtype=each_dask_array.dtype,
chunks=True
)
each_dask_array = each_dask_array.rechunk(each_zarr_array.chunks)
dask_arrays.append(each_dask_array)
zarr_arrays.append(each_zarr_array)
status = executor.compute(dask.array.store(
dask_arrays, zarr_arrays, lock=False, compute=False
))
dask.distributed.fire_and_forget(status)
dask.distributed.progress(status, notebook=False)
print("")
def save_tiff(fn, a):
if os.path.exists(fn):
os.remove(fn)
with tifffile.TiffWriter(fn, bigtiff=True) as tif:
for i in irange(a.shape[0]):
tif.save(numpy.asarray(a[i]))
class DistributedDirectoryStore(zarr.DirectoryStore):
def __delitem__(self, key):
path = os.path.join(self.path, key)
if os.path.exists(path):
dask.distributed.fire_and_forget(dask_io_remove(path).persist())
else:
raise KeyError(key)
def __setitem__(self, key, value):
# Delete in parallel, asynchronously.
# Immediately makes room for new key-value pair.
try:
del self[key]
except KeyError:
pass
super(DistributedDirectoryStore, self).__setitem__(key, value)
@contextmanager
def open_zarr(name, mode="r"):
if not os.path.exists(name) and mode in ["a", "w"]:
store = DistributedDirectoryStore(name)
yield zarr.open_group(store, mode)
elif os.path.isdir(name):
store = DistributedDirectoryStore(name)
yield zarr.open_group(store, mode)
elif zipfile.is_zipfile(name):
with zarr.ZipStore(name, mode=mode, compression=0, allowZip64=True) as store:
yield zarr.open_group(store, mode)
else:
raise NotImplementedError("Unable to open '%s'." % name)
def zip_zarr(name, executor=None):
name_z = zip_dir(name)
name_rm = os.extsep + name
os.rename(name, name_rm)
shutil.move(name_z, name)
if executor is None:
io_remove(name_rm)
else:
dask_io_remove(name_rm, executor=executor)
def hdf5_to_zarr(hdf5_file, zarr_file):
def copy(name, h5py_obj):
if isinstance(h5py_obj, h5py.Group):
zarr_obj = zarr_file.create_group(name)
elif isinstance(h5py_obj, h5py.Dataset):
zarr_obj = zarr_file.create_dataset(
name,
data=h5py_obj,
chunks=h5py_obj.chunks
)
else:
raise NotImplementedError(
"No Zarr type analogue for HDF5 type,"
" '%s'." % str(type(h5py_obj))
)
zarr_obj.attrs.update(h5py_obj.attrs)
hdf5_file.visititems(copy)
def _zarr_visitvalues(group, func):
def _visit(obj):
yield obj
keys = sorted(getattr(obj, "keys", lambda: [])())
for each_key in keys:
for each_obj in _visit(obj[each_key]):
yield each_obj
for each_obj in itertools.islice(_visit( |
zmap/ztag | ztag/transforms/__init__.py | Python | apache-2.0 | 1,377 | 0.000726
from bacnet import BACNetTransform
from cwmp import CWMPTransform
from dns import DNSTransform
from ftp import FTPTransform
from http import HTTPTransform
from http import HTTPWWWTransform
from https import HTTPSTransform
from https import HTTPSGetTransform
from https import HTTPSWWWTransform
from https import HeartbleedTransform
from https import RSAExportTransform
from https import DHETransform
from https import DHEExportTransform
from https import ECDHETransform
from https import TLSv10Transform
from https import TLSv11Transform
from https import TLSv12Transform
from https import TLSv13Transform
from https import SSLv3Transform
from imap import IMAPStartTLSTransform
from imap import IMAPSTransform
from modbus import ModbusTransform
from ntp import NTPTransform
from pop3 import POP3StartTLSTransform
from pop3 import POP3STransform
from s7 import S7Transform
from smtp import SMTPStartTLSTransform
from smtp import SMTPSTransform
from ssh import SSHV2Transform
from telnet import TelnetTransform
from upnp import UPnPTransform
from fox import NiagaraFoxTransform
from dnp3 import DNP3Transform
from sslv2 import SSLv2Transform
from smb import SMBTransform
from oracle import OracleTransform
from postgres import PostgresTransform
from mongodb import MongoDBTransform
from mssql import MSSQLTransform
from mysql import MySQLTransform
from ipp import IPPTransform
joelstanner/django-imager | imager/imager/tests.py | Python | mit | 2,457 | 0.000814
from __future__ import print_function
from django.test import TestCase
from django.contrib.auth.models import User
from imager_images.models import Photo, Album
from django.test import Client
import factory
class UserFactory(factory.django.DjangoModelFactory):
class Meta:
model = User
django_get_or_create = ('username',)
username = 'Bob'
password = factory.PostGenerationMethodCall('set_password', 'password')
class PhotoFactory(factory.django.DjangoModelFactory):
class Meta:
model = Photo
profile = UserFactory.create(username='Bobby').ImagerProfile
photo = 'picture.jpeg'
class AlbumFactory(factory.django.DjangoModelFactory):
class Meta:
model = Album
profile = UserFactory.create(username='Freddy').ImagerProfile
class TestHomepageViews(TestCase):
STOCKPHOTO_URL = '/media/default_stock_photo_640_360.jpg'
def setUp(self):
self.bob = UserFactory.create()
self.alice = UserFactory.create(username='Alice')
self.bobphoto = PhotoFactory.create(profile=self.bob.ImagerProfile)
        self.publicbobphoto = PhotoFactory.create(profile=self.bob.ImagerProfile,
published='pb')
def test_empty_url_finds_home_page(self):
response = self.client.get('/')
self.assertTemplateUsed(response, 'index.html')
def test_home_page_photo_is_user_photo_or_default(self):
response = self.client.get('/')
self.assertEqual(
response.context['random_photo'],
self.publicbobphoto.photo.url)
def test_home_page_photo_is_stock_if_no_user_photos(self):
self.publicbobphoto.delete()
response = self.client.get('/')
self.assertEqual(
response.context['random_photo'],
self.STOCKPHOTO_URL)
class TestRegistrationViews(TestCase):
def setUp(self):
self.client = Client()
def test_register_page_works(self):
response = self.client.get('/accounts/register/')
self.assertEqual(response.status_code, 200)
def test_register_new_user(self):
self.client.post('/accounts/register/',
{'username': 'bobby',
'email': 'bobby@example.com',
'password1': 'test',
'password2': 'test'}
)
self.assertEqual(len(User.objects.all()), 1)
|
drepetto/chiplotle | chiplotle/geometry/shapes/line_displaced.py | Python | gpl-3.0 | 763 | 0.002621
from chiplotle.geometry.shapes.path import path
from chiplotle.geometry.transforms.perpendicular_displace \
import perpendicular_displace
def line_displaced(start_coord, end_coord, displacements):
    '''Returns a Path defined as a line spanning points `start_coord` and
`end_coord`, displaced by scalars `displacements`.
The number of points in the path is determined by the lenght of
`displacements`.
'''
p = path([start_coord, end_coord])
    perpendicular_displace(p, displacements)
return p
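perpendicular_displace (imported above) offsets each sample point normal to the segment. A dependency-free sketch of the underlying 2D math, with hypothetical names:

```python
import math

def displaced_points(start, end, displacements):
    # Sample the segment uniformly and push each sample along the unit
    # normal by the corresponding displacement scalar.
    (x0, y0), (x1, y1) = start, end
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length      # unit direction
    nx, ny = -uy, ux                       # unit normal
    n = len(displacements)
    points = []
    for i, d in enumerate(displacements):
        t = i / (n - 1) if n > 1 else 0.0
        points.append((x0 + t * dx + d * nx, y0 + t * dy + d * ny))
    return points

pts = displaced_points((0, 0), (10, 0), [0, 2, 0])
# midpoint pushed 2 units off the horizontal segment: (5.0, 2.0)
```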
if __name__ == '__main__':
from chiplotle import *
import math
disp = [math.sin(i**0.7 / 3.14159 * 2) * 100 for i in range(200)]
line = line_displaced(Coordinate(0, 0), Coordinate(1000, 1000), disp)
io.view(line)
|
enthought/etsproxy | enthought/chaco/base.py | Python | bsd-3-clause | 79 | 0
# proxy module
from __future__ import absolute_import
from chaco.base import *
|
mlund/pyha | pyha/openmm.py | Python | mit | 2,333 | 0.030004
from simtk.openmm import app
import simtk.openmm as mm
from simtk import unit
from itertools import combinations
import numpy as np
def findForce(system, forcetype, add=True):
""" Finds a specific force in the system force list - added if not found."""
for force in system.getForces():
if isinstance(force, forcetype):
return force
    if add:
system.addForce(forcetype())
return findForce(system, forcetype)
return None
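The find-or-add pattern in findForce can be exercised without OpenMM; the mock classes below stand in for a System and a force type (names are illustrative):

```python
class MockSystem:
    def __init__(self):
        self._forces = []
    def getForces(self):
        return self._forces
    def addForce(self, force):
        self._forces.append(force)

class HarmonicForce:
    pass

def find_force(system, forcetype, add=True):
    # Same recursion as findForce: on a miss, append a fresh instance and
    # search again so the stored object is always the one returned.
    for force in system.getForces():
        if isinstance(force, forcetype):
            return force
    if add:
        system.addForce(forcetype())
        return find_force(system, forcetype)
    return None

system = MockSystem()
first = find_force(system, HarmonicForce)   # created on demand
again = find_force(system, HarmonicForce)   # found, not duplicated
```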
def setGlobalForceParameter(force, key, value):
for i in range(force.getNumGlobalParameters()):
if force.getGlobalParameterName(i)==key:
print('setting force parameter', key, '=', value)
force.setGlobalParameterDefaultValue(i, value);
def atomIndexInResidue(residue):
""" list of atom index in residue """
index=[]
for a in list(residue.atoms()):
index.append(a.index)
return index
def getResiduePositions(residue, positions):
""" Returns array w. atomic positions of residue """
    ndx = atomIndexInResidue(residue)
return np.array(positions)[ndx]
def uniquePairs(index):
""" list of unique, internal pairs """
return list(combinations( range(index[0],index[-1]+1),2 ) )
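uniquePairs assumes a contiguous, sorted index list and enumerates every internal pair. For example, four consecutive atom indices yield C(4, 2) = 6 pairs:

```python
from itertools import combinations

def unique_pairs(index):
    # All unordered pairs over the contiguous range index[0]..index[-1].
    return list(combinations(range(index[0], index[-1] + 1), 2))

pairs = unique_pairs([3, 4, 5, 6])
# [(3, 4), (3, 5), (3, 6), (4, 5), (4, 6), (5, 6)]
```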
def addHarmonicConstraint(harmonicforce, pairlist, positions, threshold, k):
""" add harmonic bonds between pairs if distance is smaller than threshold """
print('Constraint force constant =', k)
for i,j in pairlist:
distance = unit.norm( positions[i]-positions[j] )
if distance<threshold:
harmonicforce.addBond( i,j,
distance.value_in_unit(unit.nanometer),
k.value_in_unit( unit.kilojoule/unit.nanometer**2/unit.mole ))
print("added harmonic bond between", i, j, 'with distance',distance)
def addExclusions(nonbondedforce, pairlist):
""" add nonbonded exclusions between pairs """
for i,j in pairlist:
nonbondedforce.addExclusion(i,j)
def rigidifyResidue(residue, harmonicforce, positions, nonbondedforce=None,
threshold=6.0*unit.angstrom, k=2500*unit.kilojoule/unit.nanometer**2/unit.mole):
""" make residue rigid by adding constraints and nonbonded exclusions """
index = atomIndexInResidue(residue)
pairlist = uniquePairs(index)
    addHarmonicConstraint(harmonicforce, pairlist, positions, threshold, k)
if nonbondedforce is not None:
for i,j in pairlist:
print('added nonbonded exclusion between', i, j)
            nonbondedforce.addExclusion(i,j)
|
wbrefvem/openshift-ansible | roles/lib_openshift/library/oc_edit.py | Python | apache-2.0 | 55,673 | 0.001293
#!/usr/bin/env python
# pylint: disable=missing-docstring
# flake8: noqa: T001
# ___ ___ _ _ ___ ___ _ _____ ___ ___
# / __| __| \| | __| _ \ /_\_ _| __| \
# | (_ | _|| .` | _|| / / _ \| | | _|| |) |
# \___|___|_|\_|___|_|_\/_/_\_\_|_|___|___/_ _____
# | \ / _ \ | \| |/ _ \_ _| | __| \_ _|_ _|
# | |) | (_) | | .` | (_) || | | _|| |) | | | |
# |___/ \___/ |_|\_|\___/ |_| |___|___/___| |_|
#
# Copyright 2016 Red Hat, Inc. and/or its affiliates
# and other contributors as indicated by the @author tags.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# -*- -*- -*- Begin included fragment: lib/import.py -*- -*- -*-
'''
OpenShiftCLI class that wraps the oc commands in a subprocess
'''
# pylint: disable=too-many-lines
from __future__ import print_function
import atexit
import copy
import fcntl
import json
import os
import re
import shutil
import subprocess
import tempfile
# pylint: disable=import-error
try:
import ruamel.yaml as yaml
except ImportError:
import yaml
from ansible.module_utils.basic import AnsibleModule
# -*- -*- -*- End included fragment: lib/import.py -*- -*- -*-
# -*- -*- -*- Begin included fragment: doc/edit -*- -*- -*-
DOCUMENTATION = '''
---
module: oc_edit
short_description: Modify, and idempotently manage openshift objects.
description:
- Modify openshift objects programmatically.
options:
state:
description:
- Currently present is only supported state.
required: true
default: present
choices: ["present"]
aliases: []
kubeconfig:
description:
- The path for the kubeconfig file to use for authentication
required: false
default: /etc/origin/master/admin.kubeconfig
aliases: []
debug:
description:
- Turn on debug output.
required: false
default: False
aliases: []
name:
description:
- Name of the object that is being queried.
required: false
default: None
aliases: []
namespace:
description:
- The namespace where the object lives.
required: false
default: str
aliases: []
kind:
description:
- The kind attribute of the object.
required: True
default: None
choices:
- bc
- buildconfig
- configmaps
- dc
- deploymentconfig
- imagestream
- imagestreamtag
- is
- istag
- namespace
- project
- projects
- node
- ns
- persistentvolume
- pv
- rc
- replicationcontroller
- routes
- scc
- secret
- securitycontextconstraints
- service
- svc
aliases: []
file_name:
description:
- The file name in which to edit
required: false
default: None
aliases: []
file_format:
description:
- The format of the file being edited.
required: false
default: yaml
aliases: []
content:
description:
- Content of the file
required: false
default: None
aliases: []
edits:
description:
- a list of dictionaries with a yedit format for edits
required: false
default: None
aliases: []
force:
description:
- Whether or not to force the operation
required: false
default: None
aliases: []
separator:
description:
- The separator format for the edit.
required: false
default: '.'
aliases: []
author:
- "Kenny Woodson <kwoodson@redhat.com>"
extends_documentation_fragment: []
'''
EXAMPLES = '''
oc_edit:
kind: rc
name: hawkular-cassandra-rc
namespace: openshift-infra
content:
spec.template.spec.containers[0].resources.limits.memory: 512
spec.template.spec.containers[0].resources.requests.memory: 256
'''
# -*- -*- -*- End included fragment: doc/edit -*- -*- -*-
# -*- -*- -*- Begin included fragment: ../../lib_utils/src/class/yedit.py -*- -*- -*-
class YeditException(Exception): # pragma: no cover
''' Exception class for Yedit '''
pass
# pylint: disable=too-many-public-methods
class Yedit(object): # pragma: no cover
''' Class to modify yaml files '''
re_valid_key = r"(((\[-?\d+\])|([0-9a-zA-Z%s/_-]+)).?)+$"
re_key = r"(?:\[(-?\d+)\])|([0-9a-zA-Z{}/_-]+)"
com_sep = set(['.', '#', '|', ':'])
# pylint: disable=too-many-arguments
def __init__(self,
filename=None,
content=None,
content_type='yaml',
separator='.',
backup=False):
self.content = content
self._separator = separator
self.filename = filename
self.__yaml_dict = content
self.content_type = content_type
self.backup = backup
self.load(content_type=self.content_type)
if self.__yaml_dict is None:
self.__yaml_dict = {}
@property
def separator(self):
''' getter method for separator '''
return self._separator
@separator.setter
def separator(self, inc_sep):
''' setter method for separator '''
self._separator = inc_sep
@property
def yaml_dict(self):
''' getter method for yaml_dict '''
return self.__yaml_dict
@yaml_dict.setter
def yaml_dict(self, value):
''' setter method for yaml_dict '''
self.__yaml_dict = value
@staticmethod
def parse_key(key, sep='.'):
'''parse the key allowing the appropriate separator'''
common_separators = list(Yedit.com_sep - set([sep]))
return re.findall(Yedit.re_key.format(''.join(common_separators)), key)
@staticmethod
def valid_key(key, sep='.'):
'''validate the incoming key'''
common_separators = list(Yedit.com_sep - set([sep]))
if not re.match(Yedit.re_valid_key.format(''.join(common_separators)), key):
return False
return True
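Yedit.parse_key turns a dotted key such as 'spec.containers[0].image' into (array_index, dict_key) capture pairs. A standalone run of the same regex:

```python
import re

RE_KEY = r"(?:\[(-?\d+)\])|([0-9a-zA-Z{}/_-]+)"

def parse_key(key, sep='.'):
    # Fill the character class with the separators *not* in use, exactly
    # as Yedit.parse_key does, so only the active separator splits keys.
    common = ''.join({'.', '#', '|', ':'} - {sep})
    return re.findall(RE_KEY.format(common), key)

parts = parse_key('spec.containers[0].image')
# [('', 'spec'), ('', 'containers'), ('0', ''), ('', 'image')]
```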
# pylint: disable=too-many-return-statements,too-many-branches
@staticmethod
def remove_entry(data, key, index=None, value=None, sep='.'):
''' remove data at location key '''
if key == '' and isinstance(data, dict):
if value is not None:
data.pop(value)
elif index is not None:
raise YeditException("remove_entry for a dictionary does not have an index {}".format(index))
else:
data.clear()
return True
elif key == '' and isinstance(data, list):
ind = None
if value is not None:
try:
ind = data.index(value)
except ValueError:
return False
elif index is not None:
ind = index
else:
del data[:]
if ind is not None:
data.pop(ind)
return True
if not (key and Yedit.valid_key(key, sep)) and \
isinstance(data, (list, dict)):
return None
key_indexes = Yedit.parse_key(key, sep)
for arr_ind, dict_key in key_indexes[:-1]:
            if dict_key and isinstance(data, dict):
data = data.get(dict_key)
elif (arr_ind and isinstance(data, list) and
int(arr_ind) <= len(data) - 1):
data = data[int(arr_ind)]
else:
return None
# process last index for remove
# expected list entry
if key_indexes[-1][0]:
if isinstance(data, list) and int(key_indexes[-1][0]) <= len(data) - 1: # noqa: E501
                del data[int(key_indexes[-1][0])]
return True
# expected dict entry
elif key_indexes[-1][1]:
if isinstance(data, dict):
                del data[key_indexes[-1][1]]
                return True
10se1ucgo/LoLTrivia | plugins/debug.py | Python | mit | 4,487 | 0.003567
# Based on Rapptz's RoboDanny's repl cog
import contextlib
import inspect
import logging
import re
import sys
import textwrap
import traceback
from io import StringIO
from typing import *
from typing import Pattern
import discord
from discord.ext import commands
# i took this from somewhere and i cant remember where
md: Pattern = re.compile(r"^(([ \t]*`{3,4})([^\n]*)(?P<code>[\s\S]+?)(^[ \t]*\2))", re.MULTILINE)
logger = logging.getLogger(__name__)
class BotDebug(object):
def __init__(self, client: commands.Bot):
self.client = client
self.last_eval = None
@commands.command(hidden=True)
async def exec(self, ctx: commands.Context, *, cmd: str):
result, stdout, stderr = await self.run(ctx, cmd, use_exec=True)
await self.send_output(ctx, result, stdout, stderr)
@commands.command(hidden=True)
async def eval(self, ctx: commands.Context, *, cmd: str):
scope = {"_": self.last_eval, "last": self.last_eval}
result, stdout, stderr = await self.run(ctx, cmd, use_exec=False, extra_scope=scope)
self.last_eval = result
await self.send_output(ctx, result, stdout, stderr)
async def send_output(self, ctx: commands.Context, result: str, stdout: str, stderr: str):
print(result, stdout, stderr)
if result is not None:
await ctx.send(f"Result: `{result}`")
if stdout:
logger.info(f"exec stdout: \n{stdout}")
await ctx.send("stdout:")
await self.send_split(ctx, stdout)
if stderr:
logger.error(f"exec stderr: \n{stderr}")
await ctx.send("stderr:")
await self.send_split(ctx, stderr)
async def run(self, ctx: commands.Context, cmd: str, use_exec: bool, extra_scope: dict=None) -> Tuple[Any, str, str]:
if not self.client.is_owner(ctx.author):
return None, "", ""
# note: exec/eval inserts __builtins__ if a custom version is not defined (or set to {} or whatever)
scope: Dict[str, Any] = {'bot': self.client, 'ctx': ctx, 'discord': discord}
if extra_scope:
scope.update(extra_scope)
match: Match = md.match(cmd)
code: str = match.group("code").strip() if match else cmd.strip('` \n')
logger.info(f"Executing code '{code}'")
result = None
with std_redirect() as (stdout, stderr):
try:
if use_exec:
# wrap in async function to run in loop and allow await calls
func = f"async def run():\n{textwrap.indent(code, ' ')}"
exec(func, scope)
result = await scope['run']()
else:
result = eval(code, scope)
# eval doesn't allow `await`
if inspect.isawaitable(result):
result = await result
except (SystemExit, KeyboardInterrupt):
raise
except Exception:
await self.on_error(ctx)
else:
await ctx.message.add_reaction('✅')
return result, stdout.getvalue(), stderr.getvalue()
async def on_error(self, ctx: commands.Context):
# prepend a "- " to each line and use ```diff``` syntax highlighting to color the error message red.
# also strip lines 2 and 3 of the traceback which includes full path to the file, irrelevant for repl code.
        # yes i know error[:1] is basically error[0] but i want it to stay as a list
logger.exception("Error in exec code")
error = traceback.format_exc().splitlines()
        error = textwrap.indent('\n'.join(error[:1] + error[3:]), '- ', lambda x: True)
await ctx.send("Traceback:")
await self.send_split(ctx, error, prefix="```diff\n")
async def send_split(self, ctx: commands.Context, text: str, *, prefix="```\n", postfix="\n```"):
max_len = 2000 - (len(prefix) + len(postfix))
text: List[str] = [text[x:x + max_len] for x in range(0, len(text), max_len)]
print(text)
for message in text:
await ctx.send(f"{prefix}{message}{postfix}")
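The slicing in send_split is a generic fixed-size chunker that keeps each Discord message under the 2000-character cap once the prefix and postfix are accounted for. In isolation:

```python
def chunk_text(text, max_len):
    # Successive slices of at most max_len characters; the final chunk
    # may be shorter, and an empty string yields no chunks.
    return [text[x:x + max_len] for x in range(0, len(text), max_len)]

message = "abcdefghij" * 3          # 30 characters
chunks = chunk_text(message, 8)     # lengths 8, 8, 8, 6
```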
@contextlib.contextmanager
def std_redirect():
stdout = sys.stdout
stderr = sys.stderr
sys.stdout = StringIO()
sys.stderr = StringIO()
yield sys.stdout, sys.stderr
sys.stdout = stdout
sys.stderr = stderr
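std_redirect swaps sys.stdout and sys.stderr for StringIO buffers; note the version above does not restore the originals if the body raises. The same capture with a try/finally, exercised directly:

```python
import sys
from io import StringIO
from contextlib import contextmanager

@contextmanager
def std_redirect():
    # Same swap as above, but the finally clause restores the real
    # streams even when the with-body raises.
    stdout, stderr = sys.stdout, sys.stderr
    sys.stdout, sys.stderr = StringIO(), StringIO()
    try:
        yield sys.stdout, sys.stderr
    finally:
        sys.stdout, sys.stderr = stdout, stderr

with std_redirect() as (out, err):
    print("captured line")
```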
def init(bot: commands.Bot, cfg: dict):
bot.add_cog(BotDebug(bot))
|
cdiener/rater | app.py | Python | mit | 11,748 | 0.008257
# all the imports
import sqlite3
from flask import Flask, request, session, g, redirect, url_for, \
abort, render_template, flash
from queries import *
from functools import wraps
from contextlib import closing
from wtforms import SelectField, PasswordField, validators
from flask_wtf import Form
from flask_wtf.file import FileField, FileRequired, FileAllowed
from os import urandom, path
import uuid
import pandas as pd
PERSON_COLS = (14, 0, 1 ,4, 2, 3, 8, 5, 7, 6, 9, 10, 11)
app = Flask(__name__)
if path.exists("deploy_conf.py"):
app.config.from_pyfile("deploy_conf.py")
print("Using deploy configuration...")
else:
app.config.from_pyfile("config.py")
print("Using mock configuration...")
# The forms
###########
rating_validator = validators.AnyOf(['0', '1', '2', '3', '4'],
message="You forgot to select an option")
class LoginForm(Form):
token = PasswordField('Enter your Token:', [validators.Required(),
validators.AnyOf(app.config['USERS'].keys(), message="Invalid token!")])
class PersonForm(Form):
pos = SelectField('What do YOU think is the position of the applicant?',
[rating_validator], choices=[('100', 'Pick one..'), ('0', 'Undergrad'),
('1', 'M.Sc./Ph.D. student or health pro.'),
('2', 'Postdoc, Associate Prof. or M.D.'), ('3', 'Principal Investigator')])
inst = SelectField('How do you rate the institution?', [rating_validator],
choices=[('100', 'Pick one..'), ('0', 'dubious'), ('1', 'average'),
('2', 'national leader'), ('3', 'international leader')])
dist = SelectField('How do you rate the travel distance?', [rating_validator],
choices=[('100', 'Pick one..'), ('0', 'local'), ('1', 'national'),
('2', 'international')])
topic = SelectField('How do you rate the research topic?',
[rating_validator], choices=[('100', 'Pick one..'), ('0', '0 - bad'),
('1', '1 - average'), ('2', '2 - amazing')])
class AbstractForm(Form):
abstract = SelectField('How do you rate the abstract(s)?', [rating_validator],
choices=[('100', 'Pick one..'), ('0', '0 - insufficient'), ('1', '1 - barely acceptable'),
('2', '2 - acceptable'), ('3', '3 - pretty good'), ('4', '4 - amazing')])
english = SelectField('How do you rate the quality of English?', [rating_validator],
choices=[('100', 'Pick one..'), ('0', 'insufficient'), ('1', 'acceptable'),
('2', 'fluent')])
class ImportForm(Form):
persons = FileField('Choose an applicant file', [FileAllowed(['.csv'])])
posters = FileField('Choose a poster abstracts file', [FileAllowed(['.csv'])])
talks = FileField('Choose a talk abstracts file', [FileAllowed(['.csv'])])
###########
# Utility functions
###########
def connect_db():
return sqlite3.connect(app.config['DATABASE'])
def add_fakes(db, n, base_id=1):
from faker import Faker
fake = Faker()
for i in range(n):
vals = (str(base_id+i), fake.first_name(), fake.last_name(), fake.email(),
'NA', fake.date(), fake.military_ship(), fake.company(), fake.state(),
fake.country(), 'Ph.D.', fake.job(), fake.sentence(), fake.sentence(),
fake.text(750), fake.name(), fake.company(), fake.sentence(),
fake.text(450), fake.name(), fake.company())
db.execute(insert_complete, vals)
db.commit()
def init_db(n=0):
with closing(connect_db()) as db:
with app.open_resource('schema.sql', mode='r') as f:
db.cursor().executescript(f.read())
if n > 0: add_fakes(db, n)
def make_token(n, word=None):
if word:
return uuid.uuid5(uuid.NAMESPACE_DNS, word + app.config['SECRET_KEY']).hex[0:n]
return uuid.uuid4().hex[0:n]
def tokenize(users):
return {make_token(16, u[0]): u for u in users}
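make_token derives stable per-user tokens with uuid5 (seeded by the app's secret key) and random ones with uuid4. The deterministic half in isolation, with a placeholder secret:

```python
import uuid

SECRET = "example-secret"  # stands in for app.config['SECRET_KEY']

def make_token(n, word=None):
    if word:
        # uuid5 is deterministic: the same word + secret always yields
        # the same token, so tokens can be regenerated server-side.
        return uuid.uuid5(uuid.NAMESPACE_DNS, word + SECRET).hex[0:n]
    return uuid.uuid4().hex[0:n]

a = make_token(16, "alice")
b = make_token(16, "alice")   # identical to a
c = make_token(16, "bob")     # different user, different token
```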
@app.before_request
def before_request():
g.db = connect_db()
@app.teardown_request
def teardown_request(exception):
db = getattr(g, 'db', None)
if db is not None:
db.close()
def login_required(f):
@wraps(f)
def decorated_function(*args, **kwargs):
if not 'user' in session:
return redirect(url_for('login'))
return f(*args, **kwargs)
return decorated_function
##########
# Routes
@app.route('/login', methods=['POST', 'GET'])
def login():
form = LoginForm(request.form)
if request.method == 'POST' and form.validate():
token = form.token.data
session['user'] = app.config['USERS'][token][0]
session['role'] = app.config['USERS'][token][1]
session['rated'] = 0
session['p'] = None # next applicant to review
session['a'] = None # next abstract to review
flash('Thank you. Logging in...')
return redirect(url_for('show_entries'))
return render_template('login.html', form=form, n_user=len(app.config['USERS']))
@app.route('/')
@login_required
def show_entries():
data = {}
cur = g.db.execute(review_count, (session['user'],))
data['nrev'] = cur.fetchone()[0]
cur = g.db.execute("select count(*) from ratings")
data['ntotrev'] = cur.fetchone()[0]
cur = g.db.execute("select count(*) from abstracts")
data['nabstotrev'] = cur.fetchone()[0]
cur = g.db.execute(person_count)
data['ntot'] = cur.fetchone()[0]
cur = g.db.execute(abstract_count)
data['nabstot'] = cur.fetchone()[0]
cur = g.db.execute(abstract_rev_count, (session['user'],))
data['nabsrev'] = cur.fetchone()[0]
return render_template('index.html',user=session['user'],
role=session['role'], **data)
@app.route('/logout')
def logout():
session.pop('user', None)
session.pop('role', None)
session.pop('rated', None)
flash('You were logged out')
return redirect(url_for('login'))
@app.route('/applicants', methods=['GET', 'POST'])
@login_required
def rate_person():
if request.method == 'GET':
cur = g.db.execute(next_person, (session['user'],))
session['p'] = cur.fetchone()
form = PersonForm(request.form)
if request.method == 'POST' and form.validate() and session['p']:
g.db.execute(insert_person_rating, (session['p'][0], session['user'],
form.pos.data, form.inst.data, form.dist.data, form.topic.data))
g.db.commit()
session['rated'] += 1
return redirect(url_for('added', type='applicant'))
return render_template('applicants.html', form=form, p=session['p'],
user=session['user'], role=session['role'])
@app.route('/abstracts', methods=['GET', 'POST'])
@login_required
def rate_abstract():
if session['role'] != 'all':
return render_template('message.html', type='error', title='Nope...',
message='You are not allowed to review abstracts :(',
user=session['user'], role=session['role'])
if request.method == 'GET':
        cur = g.db.execute(next_abstract, (session['user'],))
session['a'] = cur.fetchone()
form = AbstractForm(request.form)
if request.method == 'POST' and form.validate() and session['a']:
        g.db.execute(insert_abstract_rating, (session['a'][0], session['user'],
form.abstract.data, form.english.data))
g.db.commit()
session['rated'] += 1
return redirect(url_for('added', type='abstract'))
return render_template('abstracts.html', form=form, a=session['a'],
user=session['user'], role=session['role'])
@app.route('/added/<type>')
@login_required
def added(type):
return render_template('added.html', rated=session['rated'],
user=session['user'], role=session['role'], type=type)
@app.route('/results')
@login_required
def results():
persons = pd.read_sql("select * from persons", g.db)
ratings = pd.read_sql(average_ratings, g.db)
abstracts = pd.read_sql(average_abstracts, g.db)
persons = pd.merge(persons, ratings, on='pid', how='left')
persons = pd.merge(persons, abstracts, on='pid', how='left', suffixes=('_applicant', '_abstract'))
persons["total"] = persons[['p_position', 'p_institution', 'p_distance',
'p_topic', 'p_abstract']].sum(axis=1).fillna(0)
persons = persons.sort_values(by="total", ascending=False)
persons.to_csv('static/res.csv', encoding='utf-8') |
jskinn/robot-vision-experiment-framework | trials/slam/tests/test_visual_slam.py | Python | bsd-2-clause | 4,146 | 0.004342
# Copyright (c) 2017, John Skinner
import unittest
import numpy as np
import pickle
import database.tests.test_entity
import util.dict_utils as du
import util.transform as tf
import core.sequence_type
import trials.slam.visual_slam as vs
import trials.slam.tracking_state as ts
class TestSLAMTrialResult(database.tests.test_entity.EntityContract, unittest.TestCase):
def get_class(self):
return vs.SLAMTrialResult
def make_instance(self, *args, **kwargs):
states = [ts.TrackingState.NOT_INITIALIZED, ts.TrackingState.OK, ts.TrackingState.LOST]
kwargs = du.defaults(kwargs, {
'system_id': np.random.randint(10, 20),
'trajectory': {
np.random.uniform(0, 600): tf.Transform(location=np.random.uniform(-1000, 1000, 3),
rotation=np.random.uniform(0, 1, 4))
for _ in range(100)
},
'ground_truth_trajectory': {
np.random.uniform(0, 600): tf.Transform(location=np.random.uniform(-1000, 1000, 3),
rotation=np.random.uniform(0, 1, 4))
for _ in range(100)
},
'tracking_stats': {
np.random.uniform(0, 600): states[np.random.randint(0, len(states))]
for _ in range(100)
},
'sequence_type': core.sequence_type.ImageSequenceType.SEQUENTIAL,
'system_settings': {
'a': np.random.randint(20, 30)
}
})
return vs.SLAMTrialResult(*args, **kwargs)
def assert_models_equal(self, trial_result1, trial_result2):
"""
Helper to assert that two SLAM trial results models are equal
:param trial_result1:
:param trial_result2:
:return:
"""
if not isinstance(trial_result1, vs.SLAMTrialResult) or not isinstance(trial_result2, vs.SLAMTrialResult):
self.fail('object was not a SLAMTrialResult')
self.assertEqual(trial_result1.identifier, trial_result2.identifier)
self.assertEqual(trial_result1.system_id, trial_result2.system_id)
self.assertEqual(trial_result1.success, trial_result2.success)
self._assertTrajectoryEqual(trial_result1.trajectory, trial_result2.trajectory)
self._assertTrajectoryEqual(trial_result1.ground_truth_trajectory, trial_result2.ground_truth_trajectory)
self.assertEqual(trial_result1.tracking_stats, trial_result2.tracking_stats)
def assert_serialized_equal(self, s_model1, s_model2):
self.assertEqual(set(s_model1.keys()), set(s_model2.keys()))
for key in s_model1.keys():
            if (key != 'ground_truth_trajectory' and
                    key != 'trajectory' and
                    key != 'tracking_stats'):
self.assertEqual(s_model1[key], s_model2[key])
traj1 = pickle.loads(s_model1['trajectory'])
traj2 = pickle.loads(s_model2['trajectory'])
self._assertTrajectoryEqual(traj1, traj2)
traj1 = pickle.loads(s_model1['ground_truth_trajectory'])
traj2 = pickle.loads(s_model2['ground_truth_trajectory'])
self._assertTrajectoryEqual(traj1, traj2)
stats1 = pickle.loads(s_model1['tracking_stats'])
stats2 = pickle.loads(s_model2['tracking_stats'])
        self.assertEqual(stats1, stats2)
def _assertTrajectoryEqual(self, traj1, traj2):
        self.assertEqual(sorted(traj1.keys()), sorted(traj2.keys()))
for time in traj1.keys():
self.assertTrue(np.array_equal(traj1[time].location, traj2[time].location),
"Locations are not equal")
self.assertTrue(np.array_equal(traj1[time].rotation_quat(w_first=True),
                                           traj2[time].rotation_quat(w_first=True)),
"Rotations {0} and {1} are not equal".format(tuple(traj1[time].rotation_quat(w_first=True)),
tuple(traj2[time].rotation_quat(w_first=True))))
|
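assert_serialized_equal above relies on pickle round-tripping the trajectory and tracking dictionaries unchanged. That property in miniature:

```python
import pickle

trajectory = {0.0: (1.0, 2.0, 3.0), 1.5: (4.0, 5.0, 6.0)}
blob = pickle.dumps(trajectory)   # what the serialized model stores
restored = pickle.loads(blob)     # equal to, but distinct from, the original
```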
openstack/tacker | tacker/tests/unit/vnflcm/test_utils.py | Python | apache-2.0 | 2,800 | 0
# Copyright (c) 2020 NTT DATA
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import ddt
from oslo_config import cfg
from tacker.tests.unit import base
from tacker.tests.unit.vnflcm import fakes
from tacker.tests import uuidsentinel
from tacker.vnflcm import utils as vnflcm_utils
@ddt.ddt
class VnfLcmUtilsTestCase(base.TestCase):
    @ddt.data(
{'image_path': 'cirros-0.5.2-x86_64-disk.img',
'extracted_path': 'cirros-0.5.2-x86_64-disk.img'},
{'image_path': '../ImageFiles/image/cirros-0.5.2-x86_64-disk.img',
'extracted_path': 'ImageFiles/image/cirros-0.5.2-x86_64-disk.img'},
{'image_path': '../../Files/image/cirros-0.5.2-x86_64-disk.img',
'extracted_path': 'Files/image/cirros-0.5.2-x86_64-disk.img'}
)
@ddt.unpack
def test_create_grant_request_with_software_image_path(self, image_path,
extracted_path):
vnf_package_id = uuidsentinel.package_uuid
vnfd_dict = fakes.get_vnfd_dict(image_path=image_path)
vnf_software_images = vnflcm_utils._create_grant_request(
vnfd_dict, vnf_package_id)
vnf_package_path = cfg.CONF.vnf_package.vnf_package_csar_path
expected_image_path = os.path.join(vnf_package_path, vnf_package_id,
extracted_path)
self.assertEqual(expected_image_path,
vnf_software_images['VDU1'].image_path)
def test_get_param_data_with_flavour_description(self):
vnfd_dict = fakes.get_vnfd_dict()
vnfd_dict.update({'imports': []})
instantiate_vnf_req = fakes.get_instantiate_vnf_request_obj()
param_value = vnflcm_utils._get_param_data(vnfd_dict,
instantiate_vnf_req)
expected_flavour_description = 'A simple flavor'
self.assertEqual(expected_flavour_description,
param_value['flavour_description'])
def test_topology_template_param_of_vnf_dict(self):
vnf_dict = fakes.vnf_dict()
vnf_keys = vnf_dict['vnfd']['attributes']['vnfd_simple']
self.assertIn('node_templates', vnf_keys)
self.assertIn('policies', vnf_keys)
self.assertIn('groups', vnf_keys)
|
sprymix/importkit | importkit/yaml/validator/tests/test_types.py | Python | mit | 5,672 | 0.000705 | ##
# Copyright (c) 2008-2010 Sprymix Inc.
# All rights reserved.
#
# See LICENSE for details.
##
import collections
from importkit.yaml import validator
from importkit.yaml.validator.tests.base import SchemaTest, raises, result
class TestTypes(SchemaTest):
def setUp(self):
super().setUp()
self.schema = self.get_schema('types.Schema')
@raises(validator.SchemaValidationError, 'expected none')
def test_validator_types_none_fail1(self):
"""
none: '12'
"""
@result(key='none', value=None)
def test_validator_types_none_result(self):
"""
none:
"""
@raises(validator.SchemaValidationError, 'expected integer')
def test_validator_types_int_fail1(self):
"""
int: '12'
"""
@raises(validator.SchemaValidationError, 'expected integer')
def test_validator_types_int_fail2(self):
"""
int: 123.2
"""
@result(key='int', value=31415)
def test_validator_types_int_result(self):
"""
int: 31415
"""
@raises(validator.SchemaValidationError, 'expected number (int or float)')
def test_validator_types_number_fail1(self):
"""
number: [123, 1]
"""
@result(key='number', value=31415)
def test_validator_types_number_int_result(self):
"""
number: 31415
"""
@result(key='number', value=31415.2)
def test_validator_types_number_float_result(self):
"""
number: 31415.2
"""
@raises(validator.SchemaValidationError, 'expected text (number or str)')
def test_validator_types_text_fail1(self):
"""
text: [123, 1]
"""
@result(key='text', value='31415')
def test_validator_types_text_int_result(self):
"""
text: 31415
"""
@result(key='text', value='31415.123')
def test_validator_types_text_float_result(self):
"""
text: 31415.123
"""
@result(key='bool', value=True)
def test_validator_types_bool_yes_result(self):
"""
bool: yes
"""
@result(key='bool', value=True)
def test_validator_types_bool_True_result(self):
"""
bool: True
"""
@result(key='bool', value=True)
def test_validator_types_bool_true_result(self):
"""
bool: true
"""
@result(key='bool', value=False)
def test_validator_types_bool_yes_result2(self):
"""
bool: no
"""
@result(key='bool', value=False)
def test_validator_types_bool_True_result2(self):
"""
bool: false
"""
@raises(validator.SchemaValidationError, 'expected bool')
def test_validator_types_bool_fail1(self):
"""
bool: 1
"""
@raises(validator.SchemaValidationError, 'expected bool')
def test_validator_types_bool_fail2(self):
"""
bool: 'yes'
"""
@raises(validator.SchemaValidationError, 'mapping expected')
def test_validator_types_map_fail1(self):
"""
dict: 'WRONG'
"""
@raises(validator.SchemaValidationError, "unexpected key 'wrongkey'")
def test_validator_types_map_fail2(self):
"""
dict:
wrongkey: 1
"""
@result(key='dict', value={'test1': 3, 'test2': 'a'})
def test_validator_types_map_defaults(self):
"""
dict:
"""
@raises(validator.SchemaValidationError, 'the number of elements in mapping must not be less than 2')
def test_validator_types_map_constraints1(self):
"""
fdict:
a: "1"
"""
@raises(validator.SchemaValidationError, 'the number of elements in mapping must not exceed 3')
def test_validator_types_map_constraints2(self):
"""
fdict:
a: "1"
b: "2"
c: "3"
d: "4"
"""
@result(key='fdict', value={'a': "1", 'b': "2"})
def test_validator_types_map_constraints_ok(self):
"""
fdict:
a: "1"
b: "2"
"""
@raises(validator.SchemaValidationError, "duplicate mapping key 'A'")
def test_validator_types_map_duplicate_key_check(self):
"""
fdict:
A: "1"
A: "2"
"""
@result(key='fdict', value={'a': "1", ('b', 'c'): "2"})
def test_validator_types_map_nonscalar_key(self):
"""
fdict:
a: "1"
[b, c]: "2"
"""
@result(key='redict', value={'UPPERCASE': 10, 'lowercase': '10', '12345': True})
def test_validator_type_map_pattern_key_ok(self):
"""
redict:
UPPERCASE: 10
lowercase: '10'
"""
@raises(validator.SchemaValidationError, "unexpected key '1'")
def test_validator_type_map_pattern_key_fail(self):
"""
redict:
1: 10
"""
@result(key='minmax', value=3)
def test_validator_types_int_minmax(self):
"""
minmax: 3
"""
@raises(validator.SchemaValidationError, 'range-min validation failed')
def test_validator_types_int_minmax_fail(self):
"""
minmax: 2
"""
@raises(validator.SchemaValidationError, 'range-max-ex validation failed')
    def test_validator_types_int_minmax_fail2(self):
"""
minmax: 20
"""
@result(key='odict', value=collections.OrderedDict([('A', 1), ('B', 2), ('C', 3), ('D', 4)]))
    def test_validator_types_ordered_map(self):
"""
odict:
A: 1
B: 2
C: 3
D: 4
"""
|
penzance/ab-testing-tool | ab_tool/tests/test_experiment_pages.py | Python | mit | 28,898 | 0.005225 | from ab_tool.tests.common import (SessionTestCase, TEST_COURSE_ID,
TEST_OTHER_COURSE_ID, NONEXISTENT_TRACK_ID, NONEXISTENT_EXPERIMENT_ID,
APIReturn, LIST_MODULES)
from django.core.urlresolvers import reverse
from ab_tool.models import (Experiment, InterventionPointUrl)
from ab_tool.exceptions import (EXPERIMENT_TRACKS_ALREADY_FINALIZED,
NO_TRACKS_FOR_EXPERIMENT, UNAUTHORIZED_ACCESS,
INTERVENTION_POINTS_ARE_INSTALLED)
import json
from mock import patch
class TestExperimentPages(SessionTestCase):
""" Tests related to Experiment and Experiment pages and methods """
def test_create_experiment_view(self):
""" Tests edit_experiment template renders for url 'create_experiment' """
response = self.client.get(reverse("ab_testing_tool_create_experiment"))
self.assertOkay(response)
self.assertTemplateUsed(response, "ab_tool/edit_experiment.html")
def test_create_experiment_view_unauthorized(self):
""" Tests edit_experiment template does not render for url 'create_experiment'
when unauthorized """
self.set_roles([])
response = self.client.get(reverse("ab_testing_tool_create_experiment"), follow=True)
self.assertTemplateNotUsed(response, "ab_tool/create_experiment.html")
self.assertTemplateUsed(response, "ab_tool/not_authorized.html")
def test_edit_experiment_view(self):
""" Tests edit_experiment template renders when authenticated """
experiment = self.create_test_experiment()
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(experiment.id,)))
self.assertTemplateUsed(response, "ab_tool/edit_experiment.html")
def test_edit_experiment_view_started_experiment(self):
""" Tests edit_experiment template renders when experiment has started """
experiment = self.create_test_experiment()
experiment.tracks_finalized = True
experiment.save()
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(experiment.id,)))
self.assertTemplateUsed(response, "ab_tool/edit_experiment.html")
def test_edit_experiment_view_with_tracks_weights(self):
""" Tests edit_experiment template renders properly with track weights """
experiment = self.create_test_experiment()
experiment.assignment_method = Experiment.WEIGHTED_PROBABILITY_RANDOM
track1 = self.create_test_track(name="track1", experiment=experiment)
track2 = self.create_test_track(name="track2", experiment=experiment)
self.create_test_track_weight(experiment=experiment, track=track1)
self.create_test_track_weight(experiment=experiment, track=track2)
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(experiment.id,)))
self.assertTemplateUsed(response, "ab_tool/edit_experiment.html")
def test_edit_experiment_view_unauthorized(self):
""" Tests edit_experiment template doesn't render when unauthorized """
self.set_roles([])
experiment = self.create_test_experiment(course_id=TEST_OTHER_COURSE_ID)
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(experiment.id,)),
follow=True)
self.assertTemplateNotUsed(response, "ab_tool/edit_experiment.html")
        self.assertTemplateUsed(response, "ab_tool/not_authorized.html")
    def test_edit_experiment_view_nonexistent(self):
"""Tests edit_experiment when experiment does not exist"""
e_id = NONEXISTENT_EXPERIMENT_ID
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(e_id,)))
self.assertTemplateNotUsed(response, "ab_tool/edit_experiment.html")
self.assertEquals(response.status_code, 404)
def test_edit_experiment_view_wrong_course(self):
""" Tests edit_experiment when attempting to access a experiment from a different course """
experiment = self.create_test_experiment(course_id=TEST_OTHER_COURSE_ID)
response = self.client.get(reverse("ab_testing_tool_edit_experiment", args=(experiment.id,)))
self.assertError(response, UNAUTHORIZED_ACCESS)
def test_edit_experiment_view_last_modified_updated(self):
""" Tests edit_experiment to confirm that the last updated timestamp changes """
experiment = self.create_test_experiment()
experiment.name += " (updated)"
response = self.client.post(reverse("ab_testing_tool_submit_edit_experiment",
args=(experiment.id,)),
content_type="application/json",
data=experiment.to_json())
self.assertEquals(response.content, "success")
updated_experiment = Experiment.objects.get(id=experiment.id)
self.assertLess(experiment.updated_on, updated_experiment.updated_on,
response)
def test_submit_create_experiment(self):
""" Tests that create_experiment creates a Experiment object verified by
DB count when uniformRandom is true"""
Experiment.get_placeholder_course_experiment(TEST_COURSE_ID)
num_experiments = Experiment.objects.count()
experiment = {
"name": "experiment", "notes": "hi", "uniformRandom": True,
"csvUpload": False,
"tracks": [{"id": None, "weighting": None, "name": "A"}]
}
response = self.client.post(
reverse("ab_testing_tool_submit_create_experiment"), follow=True,
content_type="application/json", data=json.dumps(experiment)
)
self.assertEquals(num_experiments + 1, Experiment.objects.count(), response)
def test_submit_create_experiment_csv_upload(self):
""" Tests that create_experiment creates a Experiment object verified by
DB count when csvUpload is True and no track weights are specified"""
Experiment.get_placeholder_course_experiment(TEST_COURSE_ID)
num_experiments = Experiment.objects.count()
experiment = {
"name": "experiment", "notes": "hi", "uniformRandom": False,
"csvUpload": True,
"tracks": [{"id": None, "name": "A"}]
}
response = self.client.post(
reverse("ab_testing_tool_submit_create_experiment"), follow=True,
content_type="application/json", data=json.dumps(experiment)
)
self.assertEquals(num_experiments + 1, Experiment.objects.count(), response)
def test_submit_create_experiment_with_weights_as_assignment_method(self):
""" Tests that create_experiment creates a Experiment object verified by
DB count when uniformRandom is false and the tracks have weightings """
Experiment.get_placeholder_course_experiment(TEST_COURSE_ID)
num_experiments = Experiment.objects.count()
experiment = {
"name": "experiment", "notes": "hi", "uniformRandom": False,
"csvUpload": False,
"tracks": [{"id": None, "weighting": 100, "name": "A"}]
}
response = self.client.post(
reverse("ab_testing_tool_submit_create_experiment"), follow=True,
content_type="application/json", data=json.dumps(experiment)
)
self.assertEquals(num_experiments + 1, Experiment.objects.count(), response)
def test_submit_create_experiment_unauthorized(self):
"""Tests that create_experiment creates a Experiment object verified by DB count"""
self.set_roles([])
Experiment.get_placeholder_course_experiment(TEST_COURSE_ID)
num_experiments = Experiment.objects.count()
experiment = {"name": "experiment", "notes": "hi"}
response = self.client.post(
reverse("ab_testing_tool_submit_create_experiment"), follow=True,
content_type="application/json", data=json.dumps(experiment)
)
self.assertEquals(num_experiments, Experiment.objects.count())
self.assertTemplateUsed( |
newvem/pytz | pytz/zoneinfo/Africa/Lagos.py | Python | mit | 475 | 0.044211 | '''tzinfo timezone information for Africa/Lagos.'''
from pytz.tzinfo import DstTzInfo
from pytz.tzinfo import memorized_datetime as d
from pytz.tzinfo import memorized_ttinfo as i
class Lagos(DstTzInfo):
'''Africa/Lagos timezone definition. See datetime.tzinfo for details'''
zone = 'Africa/Lagos'
_utc_transition_times = [
d(1,1,1,0,0,0),
d(1919,8,31,23,46,24),
]
_transition_info = [
i(840,0,'LMT'),
i(3600,0,'WAT'),
]
Lagos = Lagos()
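Read together, the two lists above pair positionally: before 1919-08-31T23:46:24Z the zone uses LMT (UTC offset 840 s), afterwards WAT (UTC offset 3600 s). A stdlib-only sketch of that lookup (illustrative, not pytz's implementation):

```python
from datetime import datetime, timedelta

# (utc_transition_time, (utcoffset_seconds, dst_seconds, tzname)),
# mirroring the d(...) / i(...) pairs above.
TRANSITIONS = [
    (datetime(1, 1, 1), (840, 0, 'LMT')),
    (datetime(1919, 8, 31, 23, 46, 24), (3600, 0, 'WAT')),
]

def info_at(utc_dt):
    """Return the (utcoffset, dst, tzname) tuple in effect at a UTC instant."""
    current = TRANSITIONS[0][1]
    for when, info in TRANSITIONS:
        if utc_dt >= when:
            current = info
    return current

def to_local(utc_dt):
    """Convert a naive UTC datetime to local wall time plus the zone name."""
    offset_s, _dst, name = info_at(utc_dt)
    return utc_dt + timedelta(seconds=offset_s), name
```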
|
SepehrMN/nest-simulator | pynest/examples/synapsecollection.py | Python | gpl-2.0 | 5,672 | 0.000705 | # -*- coding: utf-8 -*-
#
# synapsecollection.py
#
# This file is part of NEST.
#
# Copyright (C) 2004 The NEST Initiative
#
# NEST is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 2 of the License, or
# (at your option) any later version.
#
# NEST is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with NEST. If not, see <http://www.gnu.org/licenses/>.
"""
Example script to show some of the possibilities of the SynapseCollection class. We
connect neurons, and get the SynapseCollection with a GetConnections call. To get
a better understanding of the connections, we plot the weights between the
sources and targets.
"""
import nest
import matplotlib.pyplot as plt
import numpy as np
def makeMatrix(sources, targets, weights):
"""
Returns a matrix with the weights between the source and target node_ids.
"""
aa = np.zeros((max(sources)+1, max(targets)+1))
for src, trg, wght in zip(sources, targets, weights):
aa[src, trg] += wght
return aa
def plotMatrix(srcs, tgts, weights, title, pos):
"""
Plots weight matrix.
"""
plt.subplot(pos)
plt.matshow(makeMatrix(srcs, tgts, weights), fignum=False)
plt.xlim([min(tgts)-0.5, max(tgts)+0.5])
plt.xlabel('target')
plt.ylim([max(srcs)+0.5, min(srcs)-0.5])
plt.ylabel('source')
plt.title(title)
plt.colorbar(fraction=0.046, pad=0.04)
"""
Start with a simple, one_to_one example.
We create the neurons, connect them, and get the connections. From this we can
get the connected sources, targets, and weights. The corresponding matrix will
be the identity matrix, as we have a one_to_one connection.
"""
nest.ResetKernel()
nrns = nest.Create('iaf_psc_alpha', 10)
nest.Connect(nrns, nrns, 'one_to_one')
conns = nest.GetConnections(nrns, nrns) # This returns a SynapseCollection
# We can get desired information of the SynapseCollection with simple get() call.
g = conns.get(['source', 'target', 'weight'])
srcs = g['source']
tgts = g['target']
weights = g['weight']
# Plot the matrix consisting of the weights between the sources and targets
plt.figure(figsize=(12, 10))
plotMatrix(srcs, tgts, weights, 'Uniform weight', 121)
"""
Add some weights to the connections, and plot the updated weight matrix.
"""
# We can set data of the connections with a simple set() call.
w = [{'weight': x*1.0} for x in range(1, 11)]
conns.set(w)
weights = conns.weight
plotMatrix(srcs, tgts, weights, 'Set weight', 122)
"""
We can also plot an all_to_all connection, with uniformly distributed weights,
and different number of sources and targets.
"""
nest.ResetKernel()
pre = nest.Create('iaf_psc_alpha', 10)
post = nest.Create('iaf_psc_delta', 5)
nest.Connect(pre, post,
syn_spec={'weight':
{'distribution': 'uniform', 'low': 0.5, 'high': 4.5}})
# Get a SynapseCollection with all connections
conns = nest.GetConnections()
srcs = conns.source
tgts = conns.target
weights = conns.weight
plt.figure(figsize=(12, 10))
plotMatrix(srcs, tgts, weights, 'All to all connection', 111)
"""
Lastly, we'll do an example that is a bit more complex. We connect different
neurons with different rules, synapse models and weight distributions, and get
different SynapseCollections by calling GetConnections with different inputs.
"""
nest.ResetKernel()
nrns = nest.Create('iaf_psc_alpha', 15)
nest.Connect(nrns[:5], nrns[:5],
'one_to_one',
{'synapse_model': 'stdp_synapse',
'weight': {'distribution': 'normal', 'mu': 5.0, 'sigma': 2.0}})
nest.Connect(nrns[:10], nrns[5:12],
{'rule': 'pairwise_bernoulli', 'p': 0.4},
{'weight': 4.0})
nest.Connect(nrns[5:10], nrns[:5],
{'rule': 'fixed_total_number', 'N': 5},
{'weight': 3.0})
nest.Connect(nrns[10:], nrns[:12],
'all_to_all',
{'synapse_model': 'stdp_synapse',
'weight': {'distribution': 'uniform', 'low': 1., 'high': 5.}})
nest.Connect(nrns, nrns[12:],
{'rule': 'fixed_indegree', 'indegree': 3})
# First get a SynapseCollection consisting of all the connections
conns = nest.GetConnections()
srcs = conns.source
tgts = conns.target
weights = conns.weight
plt.figure(figsize=(14, 12))
plotMatrix(list(srcs), list(tgts), weights, 'All connections', 221)
# Get SynapseCollection consisting of a subset of connections
conns = nest.GetConnections(nrns[:10], nrns[:10])
g = conns.get(['source', 'target', 'weight'])
srcs = g['source']
tgts = g['target']
weights = g['weight']
plotMatrix(srcs, tgts, weights, 'Connections of the first ten neurons', 222)
# Get SynapseCollection consisting of just the stdp_synapses
conns = nest.GetConnections(synapse_model='stdp_synapse')
g = conns.get(['source', 'target', 'weight'])
srcs = g['source']
tgts = g['target']
weights = g['weight']
plotMatrix(srcs, tgts, weights, 'Connections with stdp_synapse', 223)
# Get SynapseCollection consisting of the fixed_total_number connections, but set
# weight before plotting
conns = nest.GetConnections(nrns[5:10], nrns[:5])
w = [{'weight': x*1.0} for x in range(1, 6)]
conns.set(w)
g = conns.get(['source', 'target', 'weight'])
srcs = g['source']
tgts = g['target']
weights = g['weight']
plotMatrix(srcs, tgts, weights, 'fixed_total_number, set weight', 224)
plt.show()
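A standalone sanity check of the accumulation performed by `makeMatrix` above; the snippet restates the helper so it is self-contained, and shows that repeated (source, target) pairs sum their weights:

```python
import numpy as np

def make_matrix(sources, targets, weights):
    # Same accumulation as makeMatrix() above: one cell per (source, target),
    # duplicate pairs add up.
    aa = np.zeros((max(sources) + 1, max(targets) + 1))
    for src, trg, wght in zip(sources, targets, weights):
        aa[src, trg] += wght
    return aa

m = make_matrix([0, 1, 1], [1, 0, 0], [2.0, 3.0, 4.0])
```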
|
nwjs/chromium.src | tools/polymer/html_to_js.py | Python | bsd-3-clause | 1,414 | 0.009194 | # Copyright 2020 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
# Inlines an HTML file into a JS (or TS) file at a location specified by a
# placeholder. This is useful for implementing Web Components using JS modules,
# where all the HTML needs to reside in the JS file (no more HTML imports).
import argparse
import sys
import io
from os import path, getcwd
from polymer import process_v3_ready
_CWD = getcwd()
def main(argv):
  parser = argparse.ArgumentParser()
parser.add_argument('--in_folder', required=True)
parser.add_argument('--out_folder', required=True)
parser.add_argument('--js_files', required=True, nargs="*")
args = parser.parse_args(argv)
in_folder = path.normpath(path.join(_CWD, args.in_folder))
out_folder = path.normpath(path.join(_CWD, args.out_folder))
results = []
for js_file in args.js_files:
# Detect file extension, since it can be either a .ts or .js file.
extension = path.splitext(js_file)[1]
html_file = path.basename(js_file).replace(extension, '.html')
result = process_v3_ready(
path.join(in_folder, js_file), path.join(in_folder, html_file))
with io.open(path.join(out_folder, result[1]), mode='wb') as f:
for l in result[0]:
f.write(l.encode('utf-8'))
return
if __name__ == '__main__':
main(sys.argv[1:])
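The per-file mapping in the loop above (companion .html derived from a .js or .ts path) can be exercised in isolation; `companion_html` is a hypothetical name used only for this sketch:

```python
from os import path

def companion_html(js_file):
    # Mirrors the loop above: detect the extension, keep the basename,
    # and swap the extension for .html.
    extension = path.splitext(js_file)[1]
    return path.basename(js_file).replace(extension, '.html')
```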
|
google-research/fitvid | metrics.py | Python | apache-2.0 | 3,220 | 0.013665 | # Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Metrics."""
import numpy as np
import tensorflow.compat.v2 as tf
import tensorflow_gan as tfgan
import tensorflow_hub as hub
i3d_model = None
lpips_model = None
def flatten_video(video):
return np.reshape(video, (-1,) + video.shape[2:])
def psnr(video_1, video_2):
video_1 = flatten_video(video_1)
video_2 = flatten_video(video_2)
dist = tf.image.psnr(video_1, video_2, max_val=1.0)
return np.mean(dist.numpy())
def ssim(video_1, video_2):
video_1 = flatten_video(video_1)
video_2 = flatten_video(video_2)
dist = tf.image.ssim(video_1, video_2, max_val=1.0)
return np.mean(dist.numpy())
def psnr_image(target_image, out_image):
dist = tf.image.psnr(target_image, out_image, max_val=1.0)
return np.mean(dist.numpy())
def psnr_per_frame(target_video, out_video):
max_val = 1.0
mse = np.mean(np.square(out_video - target_video), axis=(2, 3, 4))
return 20 * np.log10(max_val) - 10.0 * np.log10(mse)
def lpips_image(generated_image, real_image):
global lpips_model
  # Placeholder: the LPIPS network is not wired up in this release, so the
  # reported distance is always 0.0.
  result = tf.convert_to_tensor(0.0)
return result
def lpips(video_1, video_2):
video_1 = flatten_video(video_1)
video_2 = flatten_video(video_2)
dist = lpips_image(video_1, video_2)
return np.mean(dist.numpy())
def fvd_preprocess(videos, target_resolution):
videos = tf.convert_to_tensor(videos * 255.0, dtype=tf.float32)
videos_shape = videos.shape.as_list()
all_frames = tf.reshape(videos, [-1] + videos_shape[-3:])
resized_videos = tf.image.resize(all_frames, size=target_resolution)
target_shape = [videos_shape[0], -1] + list(target_resolution) + [3]
output_videos = tf.reshape(resized_videos, target_shape)
scaled_videos = 2. * tf.cast(output_videos, tf.float32) / 255. - 1
return scaled_videos
def create_id3_embedding(videos):
"""Get id3 embeddings."""
  global i3d_model
module_spec = 'https://tfhub.dev/deepmind/i3d-kinetics-400/1'
if not i3d_model:
    base_model = hub.load(module_spec)
input_tensor = base_model.graph.get_tensor_by_name('input_frames:0')
i3d_model = base_model.prune(input_tensor, 'RGB/inception_i3d/Mean:0')
output = i3d_model(videos)
return output
def calculate_fvd(real_activations, generated_activations):
return tfgan.eval.frechet_classifier_distance_from_activations(
real_activations, generated_activations)
def fvd(video_1, video_2):
video_1 = fvd_preprocess(video_1, (224, 224))
video_2 = fvd_preprocess(video_2, (224, 224))
x = create_id3_embedding(video_1)
y = create_id3_embedding(video_2)
result = calculate_fvd(x, y)
return result.numpy()
def inception_score(images):
return tfgan.eval.inception_score(images)
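For reference, `psnr` above computes the standard peak signal-to-noise ratio; a NumPy-only restatement (assuming inputs already scaled to [0, 1]) makes the formula explicit:

```python
import numpy as np

def psnr_np(a, b, max_val=1.0):
    # PSNR = 20*log10(max_val) - 10*log10(MSE)
    mse = np.mean(np.square(np.asarray(a, dtype=np.float64) - b))
    return 20 * np.log10(max_val) - 10 * np.log10(mse)

x = np.zeros((4, 4))
y = np.full((4, 4), 0.5)  # MSE = 0.25, so PSNR = 10*log10(4) dB
```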
|
ahmedaljazzar/edx-platform | openedx/core/djangoapps/programs/tasks/v1/tests/test_tasks.py | Python | agpl-3.0 | 21,370 | 0.002387 | """
Tests for programs celery tasks.
"""
import json
from datetime import datetime, timedelta
import ddt
import httpretty
import mock
import pytz
from waffle.testutils import override_switch
from celery.exceptions import MaxRetriesExceededError
from django.conf import settings
from django.test import override_settings, TestCase
from edx_oauth2_provider.tests.factories import ClientFactory
from edx_rest_api_client import exceptions
from edx_rest_api_client.client import EdxRestApiClient
from lms.djangoapps.certificates.tests.factories import GeneratedCertificateFactory
from openedx.core.djangoapps.catalog.tests.mixins import CatalogIntegrationMixin
from openedx.core.djangoapps.certificates.config import waffle
from openedx.core.djangoapps.content.course_overviews.tests.factories import CourseOverviewFactory
from openedx.core.djangoapps.credentials.tests.mixins import CredentialsApiConfigMixin
from openedx.core.djangoapps.programs.tasks.v1 import tasks
from openedx.core.djangoapps.site_configuration.tests.factories import SiteFactory
from openedx.core.djangolib.testing.utils import skip_unless_lms
from student.tests.factories import UserFactory
CREDENTIALS_INTERNAL_SERVICE_URL = 'https://credentials.example.com'
TASKS_MODULE = 'openedx.core.djangoapps.programs.tasks.v1.tasks'
@skip_unless_lms
class GetAwardedCertificateProgramsTestCase(TestCase):
"""
Test the get_certified_programs function
"""
def make_credential_result(self, **kwargs):
"""
Helper to make dummy results from the credentials API
"""
result = {
'id': 1,
'username': 'dummy-username',
'credential': {
'credential_id': None,
'program_uuid': None,
},
'status': 'dummy-status',
'uuid': 'dummy-uuid',
'certificate_url': 'http://credentials.edx.org/credentials/dummy-uuid/'
}
result.update(**kwargs)
return result
@mock.patch(TASKS_MODULE + '.get_credentials')
def test_get_certified_programs(self, mock_get_credentials):
"""
Ensure the API is called and results handled correctly.
"""
student = UserFactory(username='test-username')
mock_get_credentials.return_value = [
self.make_credential_result(status='awarded', credential={'program_uuid': 1}),
]
result = tasks.get_certified_programs(student)
self.assertEqual(mock_get_credentials.call_args[0], (student,))
self.assertEqual(mock_get_credentials.call_args[1], {'credential_type': 'program'})
self.assertEqual(result, [1])
@skip_unless_lms
class AwardProgramCertificateTestCase(TestCase):
"""
Test the award_program_certificate function
"""
@httpretty.activate
def test_award_program_certificate(self):
"""
Ensure the correct API call gets made
"""
test_username = 'test-username'
test_client = EdxRestApiClient('http://test-server', jwt='test-token')
httpretty.register_uri(
httpretty.POST,
'http://test-server/credentials/',
)
tasks.award_program_certificate(test_client, test_username, 123, datetime(2010, 5, 30))
expected_body = {
'username': test_username,
'credential': {
'program_uuid': 123,
'type': tasks.PROGRAM_CERTIFICATE,
},
'attributes': [
{
'name': 'visible_date',
'value': '2010-05-30T00:00:00Z',
}
]
}
self.assertEqual(json.loads(httpretty.last_request().body), expected_body)
@skip_unless_lms
@ddt.ddt
@mock.patch(TASKS_MODULE + '.award_program_certificate')
@mock.patch(TASKS_MODULE + '.get_certified_programs')
@mock.patch(TASKS_MODULE + '.get_completed_programs')
@override_settings(CREDENTIALS_SERVICE_USERNAME='test-service-username')
class AwardProgramCertificatesTestCase(CatalogIntegrationMixin, CredentialsApiConfigMixin, TestCase):
"""
Tests for the 'award_program_certificates' celery task.
"""
def setUp(self):
super(AwardProgramCertificatesTestCase, self).setUp()
self.create_credentials_config()
self.student = UserFactory.create(username='test-student')
self.site = SiteFactory()
self.catalog_integration = self.create_catalog_integration()
ClientFactory.create(name='credentials')
UserFactory.create(username=settings.CREDENTIALS_SERVICE_USERNAME)
def test_completion_check(
self,
mock_get_completed_programs,
mock_get_certified_programs, # pylint: disable=unused-argument
mock_award_program_certificate, # pylint: disable=unused-argument
):
"""
Checks that the Programs API is used correctly to determine completed
programs.
"""
tasks.award_program_certificates.delay(self.student.username).get()
        mock_get_completed_programs.assert_called_with(self.site, self.student)
@ddt.data(
([1], [2, 3]),
([], [1, 2, 3]),
([1, 2, 3], []),
)
@ddt.unpack
def test_awarding_certs(
self,
already_awarded_program_uuids,
expected_awarded_program_uuids,
mock_get_completed_programs,
mock_get_certified_programs,
mock_award_program_certificate,
):
"""
Checks that the Credentials API is used to award certificates for
the proper programs.
"""
mock_get_completed_programs.return_value = {1: 1, 2: 2, 3: 3}
        mock_get_certified_programs.return_value = already_awarded_program_uuids
tasks.award_program_certificates.delay(self.student.username).get()
actual_program_uuids = [call[0][2] for call in mock_award_program_certificate.call_args_list]
self.assertEqual(actual_program_uuids, expected_awarded_program_uuids)
actual_visible_dates = [call[0][3] for call in mock_award_program_certificate.call_args_list]
        self.assertEqual(actual_visible_dates, expected_awarded_program_uuids)  # program uuids are same as mock dates
@ddt.data(
('credentials', 'enable_learner_issuance'),
)
@ddt.unpack
def test_retry_if_config_disabled(
self,
disabled_config_type,
disabled_config_attribute,
*mock_helpers
):
"""
Checks that the task is aborted if any relevant api configs are
disabled.
"""
getattr(self, 'create_{}_config'.format(disabled_config_type))(**{disabled_config_attribute: False})
with mock.patch(TASKS_MODULE + '.LOGGER.warning') as mock_warning:
with self.assertRaises(MaxRetriesExceededError):
tasks.award_program_certificates.delay(self.student.username).get()
self.assertTrue(mock_warning.called)
for mock_helper in mock_helpers:
self.assertFalse(mock_helper.called)
def test_abort_if_invalid_username(self, *mock_helpers):
"""
Checks that the task will be aborted and not retried if the username
passed was not found, and that an exception is logged.
"""
with mock.patch(TASKS_MODULE + '.LOGGER.exception') as mock_exception:
tasks.award_program_certificates.delay('nonexistent-username').get()
self.assertTrue(mock_exception.called)
for mock_helper in mock_helpers:
self.assertFalse(mock_helper.called)
def test_abort_if_no_completed_programs(
self,
mock_get_completed_programs,
mock_get_certified_programs,
mock_award_program_certificate,
):
"""
Checks that the task will be aborted without further action if there
are no programs for which to award a certificate.
"""
mock_get_completed_programs.return_value = {}
tasks.award_program_certificates.delay(self.student.username).get()
self.assertTrue(mock_get_completed_programs.called)
self.assertFalse(mock_get_certified_programs.called)
self.assertFalse(mock_ |
Kobzol/debug-visualizer | debugger/lldbc/lldb_io_manager.py | Python | gpl-3.0 | 3,121 | 0.000641 | # -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Jakub Beranek
#
# This file is part of Devi.
#
# Devi is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, version 3 of the License, or
# (at your option) any later version.
#
# Devi is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Devi. If not, see <http://www.gnu.org/licenses/>.
#
import os
import tempfile
import threading
class LldbIOManager(object):
@staticmethod
def create_pipe():
tmpdir = tempfile.gettempdir()
temp_name = next(tempfile._get_candidate_names())
fifo = os.path.join(tmpdir, temp_name + ".fifo")
os.mkfifo(fifo)
return os.path.abspath(fifo)
def __init__(self):
self.file_threads = []
self.file_paths = []
self.stdin = None
self.stdout = None
self.stderr = None
def _open_file(self, attribute, mode, file_path):
setattr(self, attribute, open(file_path, mode, buffering=0))
def _close_file(self, attribute):
try:
if getattr(self, attribute):
                getattr(self, attribute).close()
setattr(self, attribute, None)
except:
pass
def handle_io(self):
if len(self.file_threads) > 0:
return
        stdin, stdout, stderr = [LldbIOManager.create_pipe()
for _ in xrange(3)]
self.file_paths += (stdin, stdout, stderr)
self.file_threads.append(threading.Thread(target=self._open_file,
args=["stdin",
"w",
stdin]))
self.file_threads.append(threading.Thread(target=self._open_file,
args=["stdout",
"r",
stdout]))
self.file_threads.append(threading.Thread(target=self._open_file,
args=["stderr",
"r",
stderr]))
map(lambda thread: thread.start(), self.file_threads)
return (stdin, stdout, stderr)
def stop_io(self):
map(lambda thread: thread.join(), self.file_threads)
self._close_file("stdin")
self._close_file("stdout")
self._close_file("stderr")
self.file_threads = []
for path in self.file_paths:
try:
os.remove(path)
except:
pass
self.file_paths = []
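`create_pipe` above leans on the private `tempfile._get_candidate_names()` helper; a sketch of the same effect with only public APIs (POSIX-only, since it needs `os.mkfifo`):

```python
import os
import tempfile

def create_fifo():
    # mkdtemp() yields a unique directory, so a fixed filename inside it
    # cannot collide; no need for tempfile's private name generator.
    d = tempfile.mkdtemp()
    fifo = os.path.join(d, 'devi.fifo')
    os.mkfifo(fifo)
    return os.path.abspath(fifo)
```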
|
drongh/vnpy | vn.trader/gateway.py | Python | mit | 14,192 | 0.007792 | # encoding: UTF-8
from eventEngine import *
# Default empty values
EMPTY_STRING = ''
EMPTY_UNICODE = u''
EMPTY_INT = 0
EMPTY_FLOAT = 0.0
# Direction constants
DIRECTION_NONE = u'无方向'
DIRECTION_LONG = u'多'
DIRECTION_SHORT = u'空'
DIRECTION_UNKNOWN = u'未知'
DIRECTION_NET = u'净'
# Offset (open/close) constants
OFFSET_NONE = u'无开平'
OFFSET_OPEN = u'开仓'
OFFSET_CLOSE = u'平仓'
OFFSET_UNKNOWN = u'未知'
# Order status constants
STATUS_NOTTRADED = u'未成交'
STATUS_PARTTRADED = u'部分成交'
STATUS_ALLTRADED = u'全部成交'
STATUS_CANCELLED = u'已撤销'
STATUS_UNKNOWN = u'未知'
# Product type constants
PRODUCT_EQUITY = u'股票'
PRODUCT_FUTURES = u'期货'
PRODUCT_OPTION = u'期权'
PRODUCT_INDEX = u'指数'
PRODUCT_COMBINATION = u'组合'
# Option type constants
OPTION_CALL = u'看涨期权'
OPTION_PUT = u'看跌期权'
########################################################################
class VtGateway(object):
"""Trading gateway interface"""
#----------------------------------------------------------------------
def __init__(self, eventEngine):
"""Constructor"""
self.eventEngine = eventEngine
#----------------------------------------------------------------------
def onTick(self, tick):
"""Push market tick data."""
# Generic event
event1 = Event(type_=EVENT_TICK)
event1.dict_['data'] = tick
self.eventEngine.put(event1)
# Event keyed by the specific contract symbol
event2 = Event(type_=EVENT_TICK+tick.vtSymbol)
event2.dict_['data'] = tick
self.eventEngine.put(event2)
#----------------------------------------------------------------------
def onTrade(self, trade):
"""Push trade (fill) data."""
# The trade ID is usually only known after the fact,
# so only the generic event is pushed.
event1 = Event(type_=EVENT_TRADE)
event1.dict_['data'] = trade
self.eventEngine.put(event1)
#----------------------------------------------------------------------
def onOrder(self, order):
"""Push order status updates."""
# Generic event
event1 = Event(type_=EVENT_ORDER)
event1.dict_['data'] = order
self.eventEngine.put(event1)
# Event keyed by the specific order ID
event2 = Event(type_=EVENT_ORDER+order.vtOrderID)
event2.dict_['data'] = order
self.eventEngine.put(event2)
#----------------------------------------------------------------------
def onPosition(self, position):
"""Push position data."""
# Generic event
event1 = Event(type_=EVENT_POSITION)
event1.dict_['data'] = position
self.eventEngine.put(event1)
# Event keyed by the specific position name
event2 = Event(type_=EVENT_POSITION+position.vtPositionName)
event2.dict_['data'] = position
self.eventEngine.put(event2)
#----------------------------------------------------------------------
def onAccount(self, account):
"""Push account data."""
# Generic event
event1 = Event(type_=EVENT_ACCOUNT)
event1.dict_['data'] = account
self.eventEngine.put(event1)
# Event keyed by the specific account ID
event2 = Event(type_=EVENT_ACCOUNT+account.vtAccountID)
event2.dict_['data'] = account
self.eventEngine.put(event2)
#----------------------------------------------------------------------
def onError(self, error):
"""Push error information."""
# Generic event
event1 = Event(type_=EVENT_ERROR)
event1.dict_['data'] = error
self.eventEngine.put(event1)
#----------------------------------------------------------------------
def onLog(self, log):
"""Push log messages."""
# Generic event
event1 = Event(type_=EVENT_LOG)
event1.dict_['data'] = log
self.eventEngine.put(event1)
#----------------------------------------------------------------------
def onContract(self, contract):
"""Push basic contract information."""
# Generic event
event1 = Event(type_=EVENT_CONTRACT)
event1.dict_['data'] = contract
self.eventEngine.put(event1)
#----------------------------------------------------------------------
def connect(self):
"""Connect to the trading server."""
pass
#----------------------------------------------------------------------
def subscribe(self):
"""Subscribe to market data."""
pass
#----------------------------------------------------------------------
def sendOrder(self):
"""Send an order."""
pass
#----------------------------------------------------------------------
def cancelOrder(self):
"""Cancel an order."""
pass
#----------------------------------------------------------------------
def close(self):
"""Close the connection."""
pass
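The on* callbacks above all follow one fan-out pattern: push a generic event, then (where an identifier exists) a second event whose type string is suffixed with that identifier, so subscribers can listen either globally or per symbol/order/account. A hedged sketch with a stub event engine (the `EVENT_TICK` value and the `CTP.IF1701` symbol are assumptions for illustration; the real constants live in the eventEngine module):

```python
EVENT_TICK = 'eTick.'  # assumed value of the vnpy event-type constant

class Event(object):
    def __init__(self, type_):
        self.type_ = type_
        self.dict_ = {}

class StubEventEngine(object):
    """Collects events instead of dispatching them, for illustration."""
    def __init__(self):
        self.queue = []
    def put(self, event):
        self.queue.append(event)

class StubTick(object):
    vtSymbol = 'CTP.IF1701'  # hypothetical contract symbol

engine = StubEventEngine()
tick = StubTick()

# Same fan-out as VtGateway.onTick: one generic event, one per-symbol event.
event1 = Event(type_=EVENT_TICK)
event1.dict_['data'] = tick
engine.put(event1)
event2 = Event(type_=EVENT_TICK + tick.vtSymbol)
event2.dict_['data'] = tick
engine.put(event2)

print([e.type_ for e in engine.queue])  # ['eTick.', 'eTick.CTP.IF1701']
```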
########################################################################
class VtBaseData(object):
"""Base class for data pushed by callbacks; other data classes inherit from it."""
#----------------------------------------------------------------------
def __init__(self):
"""Constructor"""
self.gatewayName = EMPTY_STRING # Gateway name
self.rawData = None # Raw data
########################################################################
class VtTickData(VtBaseData):
"""Tick market data class."""
#----------------------------------------------------------------------
def __init__(self):
"""Constructor"""
super(VtTickData, self).__init__()
# Symbol-related fields
self.symbol = EMPTY_STRING # Contract symbol
self.vtSymbol = EMPTY_STRING # Unique symbol in the vt system, usually "GatewayName.symbol"
# Trade data
self.lastPrice = EMPTY_FLOAT # Last traded price
self.volume = EMPTY_INT # Last traded volume
self.openInterest = EMPTY_INT # Open interest
self.tickTime = EMPTY_STRING # Update time
# Five-level order book
self.bidPrice1 = EMPTY_FLOAT
self.bidPrice2 = EMPTY_FLOAT
self.bidPrice3 = EMPTY_FLOAT
self.bidPrice4 = EMPTY_FLOAT
self.bidPrice5 = EMPTY_FLOAT
self.askPrice1 = EMPTY_FLOAT
self.askPrice2 = EMPTY_FLOAT
self.askPrice3 = EMPTY_FLOAT
self.askPrice4 = EMPTY_FLOAT
self.askPrice5 = EMPTY_FLOAT
self.bidVolume1 = EMPTY_INT
self.bidVolume2 = EMPTY_INT
self.bidVolume3 = EMPTY_INT
self.bidVolume4 = EMPTY_INT
self.bidVolume5 = EMPTY_INT
self.askVolume1 = EMPTY_INT
self.askVolume2 = EMPTY_INT
self.askVolume3 = EMPTY_INT
self.askVolume4 = EMPTY_INT
self.askVolume5 = EMPTY_INT
########################################################################
class VtTradeData(VtBaseData):
"""Trade (fill) data class."""
#----------------------------------------------------------------------
def __init__(self):
"""Constructor"""
super(VtTradeData, self).__init__()
# Symbol and ID fields
self.symbol = EMPTY_STRING # Contract symbol
self.vtSymbol = EMPTY_STRING # Unique symbol in the vt system, usually "GatewayName.symbol"
self.tradeID = EMPTY_STRING # Trade ID
self.vtTradeID = EMPTY_STRING # Unique trade ID in the vt system, usually "GatewayName.tradeID"
self.orderID = EMPTY_STRING # Order ID
self.vtOrderID = EMPTY_STRING # Unique order ID in the vt system, usually "GatewayName.orderID"
# Trade details
self.direction = EMPTY_UNICODE # Trade direction
self.offset = EMPTY_UNICODE # Open/close offset
self.price = EMPTY_FLOAT # Trade price
self.volume = EMPTY_INT # Trade volume
self.tradeTime = EMPTY_STRING # Trade time
########################################################################
class VtOrderData(VtBaseData):
"""Order data class."""
#----------------------------------------------------------------------
def __init__(self):
"""Constructor"""
super(VtOrderData, self).__init__()
# Symbol and ID fields
self.symbol = EMPTY_STRING # Contract symbol
self.vtSymbol = EMPTY_STRING # Unique symbol in the vt system, usually "GatewayName.symbol"
self.orderID = EMPTY_STRING # Order ID
self.vtOrderID = EMPTY_STRING # Unique order ID in the vt system, usually "GatewayName.orderID"
# Order details
self.direction = EMPTY_UNICODE # Order direction
self.offset = EMPTY_UNICODE # Open/close offset
self
ArcherSys/ArcherSys | skulpt/src/lib/posixfile.py | Python | mit | 72 | 0 |
raise NotImplementedError("posixfile is not yet implemented in Skulpt")
XENON1T/cax | setup.py | Python | isc | 2,712 | 0.001106 |
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from setuptools import setup, find_packages
PROJECT = 'cax'
VERSION = '5.2.1'
with open('README.rst') as readme_file:
readme = readme_file.read()
with open('HISTORY.rst') as history_file:
history = history_file.read()
requirements = [
'checksumdir', 'scp', 'pagerduty-api', 'pymongo', 'paramiko',
'numpy', 'sympy', 'pytz',
]
test_requirements = [
'pytest', 'mongomock',
]
setup(
name='cax',
version=VERSION,
description="Copying Around XENON1T data",
long_description=readme + '\n\n' + history,
author="Christopher Tunnell",
author_email='ctunnell@nikhef.nl',
url='https://github.com/tunnell/cax',
packages=find_packages(),
include_package_data=True,
install_requires=requirements,
data_files=[ ('cax', ['cax/cax.json']),
('cax/host_config', ['cax/host_config/tegner_bash_p3.config', 'cax/host_config/tegner_bash_p2.config', 'cax/host_config/midway_bash_p3.config', 'cax/host_config/midway_bash_p2.config', 'cax/host_config/xe1tdatamanager_bash_p3.config', 'cax/host_config/xe1tdatamanager_bash_p2.config'])
],
license="ISCL",
zip_safe=False,
keywords='cax',
classifiers=[
'Intended Audience :: System Administrators',
'Development Status :: 5 - Production/Stable',
'License :: OSI Approved :: ISC License (ISCL)',
'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
],
test_suite='tests',
tests_require=test_requirements,
setup_requires=['pytest-runner'],
entry_points={
'console_scripts': [
'cax = cax.main:main',
'massive-cax = cax.main:massive',
'caxer = cax.main:main', # For uniformity with paxer
'cax-process = cax.tasks.process:main',
'cax-mv = cax.main:move',
'cax-rm = cax.main:remove',
'cax-stray = cax.main:stray',
'cax-status = cax.main:status',
'massive-tsm = cax.main:massive_tsmclient',
'cax-tsm-remove = cax.main:remove_from_tsm',
'cax-tsm-watch = cax.main:cax_tape_log_file',
'ruciax = cax.main:ruciax',
'ruciax-rm = cax.main:remove_from_rucio',
'massive-ruciax = cax.main:massiveruciax',
'ruciax-check = cax.main:ruciax_status',
'ruciax-purge = cax.main:ruciax_purge',
'ruciax-download = cax.main:ruciax_download',
'ruciax-locator = cax.main:ruciax_locator',
],
},
)
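Each console_scripts entry above uses setuptools' `name = package.module:function` format; at install time each line becomes an executable that imports the module and calls the function. A small sketch of how such a spec string decomposes (parsing by hand here only for illustration; setuptools does this internally):

```python
spec = 'cax-process = cax.tasks.process:main'

# Split "name = target" once, then "module:function".
name, target = (part.strip() for part in spec.split('=', 1))
module_path, func_name = target.split(':')

print(name)         # cax-process
print(module_path)  # cax.tasks.process
print(func_name)    # main
```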
|
danielmt/vshard | vendor/github.com/youtube/vitess/py/vtproto/throttlerdata_pb2.py | Python | mit | 7,327 | 0.006824 |
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: throttlerdata.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='throttlerdata.proto',
package='throttlerdata',
syntax='proto3',
serialized_pb=_b('\n\x13throttlerdata.proto\x12\rthrottlerdata\"\x11\n\x0fMaxRatesRequest\"{\n\x10MaxRatesResponse\x12\x39\n\x05rates\x18\x01 \x03(\x0b\x32*.throttlerdata.MaxRatesResponse.RatesEntry\x1a,\n\nRatesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\x03:\x02\x38\x01\"!\n\x11SetMaxRateRequest\x12\x0c\n\x04rate\x18\x01 \x01(\x03\"#\n\x12SetMaxRateResponse\x12\r\n\x05names\x18\x01 \x03(\tb\x06proto3')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_MAXRATESREQUEST = _descriptor.Descriptor(
name='MaxRatesRequest',
full_name='throttlerdata.MaxRatesRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=38,
serialized_end=55,
)
_MAXRATESRESPONSE_RATESENTRY = _descriptor.Descriptor(
name='RatesEntry',
full_name='throttlerdata.MaxRatesResponse.RatesEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='throttlerdata.MaxRatesResponse.RatesEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='value', full_name='throttlerdata.MaxRatesResponse.RatesEntry.value', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=_descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\001')),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=136,
serialized_end=180,
)
_MAXRATESRESPONSE = _descriptor.Descriptor(
name='MaxRatesResponse',
full_name='throttlerdata.MaxRatesResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='rates', full_name='throttlerdata.MaxRatesResponse.rates', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[_MAXRATESRESPONSE_RATESENTRY, ],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=57,
serialized_end=180,
)
_SETMAXRATEREQUEST = _descriptor.Descriptor(
name='SetMaxRateRequest',
full_name='throttlerdata.SetMaxRateRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='rate', full_name='throttlerdata.SetMaxRateRequest.rate', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=182,
serialized_end=215,
)
_SETMAXRATERESPONSE = _descriptor.Descriptor(
name='SetMaxRateResponse',
full_name='throttlerdata.SetMaxRateResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='names', full_name='throttlerdata.SetMaxRateResponse.names', index=0,
number=1, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=217,
serialized_end=252,
)
_MAXRATESRESPONSE_RATESENTRY.containing_type = _MAXRATESRESPONSE
_MAXRATESRESPONSE.fields_by_name['rates'].message_type = _MAXRATESRESPONSE_RATESENTRY
DESCRIPTOR.message_types_by_name['MaxRatesRequest'] = _MAXRATESREQUEST
DESCRIPTOR.message_types_by_name['MaxRatesResponse'] = _MAXRATESRESPONSE
DESCRIPTOR.message_types_by_name['SetMaxRateRequest'] = _SETMAXRATEREQUEST
DESCRIPTOR.message_types_by_name['SetMaxRateResponse'] = _SETMAXRATERESPONSE
MaxRatesRequest = _reflection.GeneratedProtocolMessageType('MaxRatesRequest', (_message.Message,), dict(
DESCRIPTOR = _MAXRATESREQUEST,
__module__ = 'throttlerdata_pb2'
# @@protoc_insertion_point(class_scope:throttlerdata.MaxRatesRequest)
))
_sym_db.RegisterMessage(MaxRatesRequest)
MaxRatesResponse = _reflection.GeneratedProtocolMessageType('MaxRatesResponse', (_message.Message,), dict(
RatesEntry = _reflection.GeneratedProtocolMessageType('RatesEntry', (_message.Message,), dict(
DESCRIPTOR = _MAXRATESRESPONSE_RATESENTRY,
__module__ = 'throttlerdata_pb2'
# @@protoc_insertion_point(class_scope:throttlerdata.MaxRatesResponse.RatesEntry)
))
,
DESCRIPTOR = _MAXRATESRESPONSE,
__module__ = 'throttlerdata_pb2'
# @@protoc_insertion_point(class_scope:throttlerdata.MaxRatesResponse)
))
_sym_db.RegisterMessage(MaxRatesResponse)
_sym_db.RegisterMessage(MaxRatesResponse.RatesEntry)
SetMaxRateRequest = _reflection.GeneratedProtocolMessageType('SetMaxRateRequest', (_message.Message,), dict(
DESCRIPTOR = _SETMAXRATEREQUEST,
__module__ = 'throttlerdata_pb2'
# @@protoc_insertion_point(class_scope:throttlerdata.SetMaxRateRequest)
))
_sym_db.RegisterMessage(SetMaxRateRequest)
SetMaxRateResponse = _reflection.GeneratedProtocolMessageType('SetMaxRateResponse', (_message.Message,), dict(
DESCRIPTOR = _SETMAXRATERESPONSE,
__module__ = 'throttlerdata_pb2'
# @@protoc_insertion_point(class_scope:throttlerdata.SetMaxRateResponse)
))
_sym_db.RegisterMessage(SetMaxRateResponse)
_MAXRATESRESPONSE_RATESENTRY.has_options = True
_MAXRATESRESPONSE_RATESENTRY._options = _descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\001'))
import abc
from grpc.beta import implementations as beta_implementations
from grpc.framework.common import cardinality
from grpc.framework.interfaces.face import utilities as face_utilities
# @@protoc_insertion_point(module_scope)