Fake event handler, same as :py:meth:`WorldView.on_menu_exit()` but force-disables mouse exclusivity.
def on_menu_exit(self, new):
    """
    Fake event handler, same as :py:meth:`WorldView.on_menu_exit()` but
    force-disables mouse exclusivity.
    """
    super(WorldViewMouseRotatable, self).on_menu_exit(new)
    self.world.peng.window.toggle_exclusivity(False)
Keyboard event handler handling only the escape key. If an escape key press is detected, mouse exclusivity is toggled via :py:meth:`PengWindow.toggle_exclusivity()`\ .
def on_key_press(self, symbol, modifiers):
    """
    Keyboard event handler handling only the escape key.

    If an escape key press is detected, mouse exclusivity is toggled via
    :py:meth:`PengWindow.toggle_exclusivity()`\ .
    """
    if symbol == key.ESCAPE:
        self.world.peng.window.toggle_exclusivity()
        return pyglet.event.EVENT_HANDLED
Handles mouse motion and rotates the attached camera accordingly. For more information about how to customize mouse movement, see the class documentation of :py:class:`WorldViewMouseRotatable()`\ .
def on_mouse_motion(self, x, y, dx, dy):
    """
    Handles mouse motion and rotates the attached camera accordingly.

    For more information about how to customize mouse movement, see the
    class documentation of :py:class:`WorldViewMouseRotatable()`\ .
    """
    if not self.world.peng.window.exclusive:
        return
    m = self.world.peng.cfg["controls.mouse.sensitivity"]
    x, y = self.rot
    x, y = x + dx * m, y + dy * m
    y = max(-90, min(90, y))  # clamp pitch
    x %= 360                  # wrap yaw
    newrot = (x, y)
    self.rot = newrot
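The rotation update is plain arithmetic and can be sketched standalone; `sensitivity` here is a hypothetical stand-in for the `controls.mouse.sensitivity` config value read in the handler above:

```python
def apply_mouse_delta(rot, dx, dy, sensitivity=0.15):
    # rot is the current (yaw, pitch) pair in degrees
    x, y = rot
    x, y = x + dx * sensitivity, y + dy * sensitivity
    y = max(-90, min(90, y))  # clamp pitch so the camera cannot flip over
    x %= 360                  # wrap yaw into [0, 360)
    return x, y

# Yaw wraps past 360 degrees, pitch is clamped at 90
print(apply_mouse_delta((350, 80), 20, 30, sensitivity=1))  # → (10, 90)
```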
Starts a new step. Returns a context manager which allows you to report an error.
def step(self, step_name):
    """Starts a new step. Returns a context manager which allows you to report an error."""
    @contextmanager
    def step_context(step_name):
        if self.event_receiver.current_case is not None:
            raise Exception('cannot open a step within a step')
        self.event_receiver.begin_case(step_name, self.now_seconds(), self.name)
        try:
            yield self.event_receiver
        except:
            etype, evalue, tb = sys.exc_info()
            self.event_receiver.error('%r' % [etype, evalue, tb])
            raise
        finally:
            self.event_receiver.end_case(step_name, self.now_seconds())
    return step_context(step_name)
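A minimal sketch of the same begin/error/end pattern, with a plain list standing in for the event receiver (the `events` list and the tuple layout are illustrative, not part of the original API):

```python
import sys
from contextlib import contextmanager

@contextmanager
def step(step_name, events):
    # begin/end events always bracket the step; errors are recorded and re-raised
    events.append(("begin", step_name))
    try:
        yield events
    except Exception:
        events.append(("error", repr(sys.exc_info()[0])))
        raise
    finally:
        events.append(("end", step_name))

events = []
try:
    with step("demo", events):
        raise ValueError("boom")
except ValueError:
    pass
print(events)
```

Note how `finally` guarantees the `end` event even when the body raises, which is what lets the step always be closed in the real implementation.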
Converts the given resource name to a file path. A resource path is of the format ``<app>:<cat1>.<cat2>.<name>``, where cat1 and cat2 can be repeated as often as desired. ``ext`` is the file extension to use, e.g. ``.png`` or similar. As an example, the resource name ``peng3d:some.category.foo`` with the extension ``.png`` results in the path ``<basepath>/assets/peng3d/some/category/foo.png``\ . This resource naming scheme is used by most other methods of this class. Note that it is currently not possible to define multiple base paths to search through.
def resourceNameToPath(self, name, ext=""):
    """
    Converts the given resource name to a file path.

    A resource path is of the format ``<app>:<cat1>.<cat2>.<name>`` where
    cat1 and cat2 can be repeated as often as desired.

    ``ext`` is the file extension to use, e.g. ``.png`` or similar.

    As an example, the resource name ``peng3d:some.category.foo`` with the
    extension ``.png`` results in the path
    ``<basepath>/assets/peng3d/some/category/foo.png``\ .

    This resource naming scheme is used by most other methods of this class.

    Note that it is currently not possible to define multiple base paths to
    search through.
    """
    nsplit = name.split(":")[1].split(".")
    return os.path.join(self.basepath, "assets", name.split(":")[0], *nsplit) + ext
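The name-to-path mapping can be reproduced standalone; `basepath` is passed in explicitly here, where the original reads it from the instance:

```python
import os.path

def resource_name_to_path(basepath, name, ext=""):
    # "<app>:<cat1>.<cat2>.<name>" -> "<basepath>/assets/<app>/<cat1>/<cat2>/<name><ext>"
    app, _, rest = name.partition(":")
    return os.path.join(basepath, "assets", app, *rest.split(".")) + ext

print(resource_name_to_path("base", "peng3d:some.category.foo", ".png"))
# e.g. base/assets/peng3d/some/category/foo.png on POSIX systems
```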
Returns whether or not the resource with the given name and extension exists. This does not mean that the resource is meaningful; it simply signals that the file exists.
def resourceExists(self, name, ext=""):
    """
    Returns whether or not the resource with the given name and extension exists.

    This does not mean that the resource is meaningful; it simply signals
    that the file exists.
    """
    return os.path.exists(self.resourceNameToPath(name, ext))
Adds a new texture category with the given name. If the category already exists, it will be overridden.
def addCategory(self, name):
    """
    Adds a new texture category with the given name.

    If the category already exists, it will be overridden.
    """
    self.categories[name] = {}
    self.categoriesTexCache[name] = {}
    self.categoriesTexBin[name] = pyglet.image.atlas.TextureBin(self.texsize, self.texsize)
    self.peng.sendEvent("peng3d:rsrc.category.add", {"peng": self.peng, "category": name})
Gets the texture associated with the given name and category. ``category`` must have been created using :py:meth:`addCategory()` before. If it was loaded previously, a cached version will be returned. If it was not loaded, it will be loaded and inserted into the cache. See :py:meth:`loadTex()` for more information.
def getTex(self, name, category):
    """
    Gets the texture associated with the given name and category.

    ``category`` must have been created using :py:meth:`addCategory()` before.

    If it was loaded previously, a cached version will be returned. If it
    was not loaded, it will be loaded and inserted into the cache.

    See :py:meth:`loadTex()` for more information.
    """
    if category not in self.categoriesTexCache:
        return self.getMissingTex(category)
    if name not in self.categoriesTexCache[category]:
        self.loadTex(name, category)
    return self.categoriesTexCache[category][name]
Loads the texture of the given name and category. All textures currently must be PNG files, although support for more formats may be added soon. If the texture cannot be found, a missing texture will instead be returned. See :py:meth:`getMissingTexture()` for more information. Currently, all texture mipmaps will be generated and the filters will be set to :py:const:`GL_NEAREST` for the magnification filter and :py:const:`GL_NEAREST_MIPMAP_LINEAR` for the minification filter. This results in a pixelated texture and not a blurry one.
def loadTex(self, name, category):
    """
    Loads the texture of the given name and category.

    All textures currently must be PNG files, although support for more
    formats may be added soon.

    If the texture cannot be found, a missing texture will instead be
    returned. See :py:meth:`getMissingTexture()` for more information.

    Currently, all texture mipmaps will be generated and the filters will be
    set to :py:const:`GL_NEAREST` for the magnification filter and
    :py:const:`GL_NEAREST_MIPMAP_LINEAR` for the minification filter. This
    results in a pixelated texture and not a blurry one.
    """
    try:
        img = pyglet.image.load(self.resourceNameToPath(name, ".png"))
    except FileNotFoundError:
        img = self.getMissingTexture()
    texreg = self.categoriesTexBin[category].add(img)
    #texreg = texreg.get_transform(True,True)
    # Mirrors the image due to how pyglet's coordinate system works
    # Strange behavior, sometimes needed and sometimes not
    self.categories[category][name] = texreg
    target = texreg.target
    texid = texreg.id
    texcoords = texreg.tex_coords
    # Prevents texture bleeding with texture sizes that are powers of 2,
    # else weird lines may appear at certain angles.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_LINEAR)
    glGenerateMipmap(GL_TEXTURE_2D)
    out = target, texid, texcoords
    self.categoriesTexCache[category][name] = out
    self.peng.sendEvent("peng3d:rsrc.tex.load", {"peng": self.peng, "name": name, "category": category})
    return out
Returns a texture to be used as a placeholder for missing textures. A default missing texture file is provided in the assets folder of the source distribution. It consists of a simple checkerboard pattern of purple and black; this image may be copied to any project using peng3d for similar behavior. If this texture cannot be found, a pattern is created in memory: simply a solid square of purple. This texture will also be cached separately from other textures.
def getMissingTexture(self):
    """
    Returns a texture to be used as a placeholder for missing textures.

    A default missing texture file is provided in the assets folder of the
    source distribution. It consists of a simple checkerboard pattern of
    purple and black; this image may be copied to any project using peng3d
    for similar behavior.

    If this texture cannot be found, a pattern is created in memory: simply
    a solid square of purple.

    This texture will also be cached separately from other textures.
    """
    if self.missingTexture is None:
        if self.resourceExists(self.missingtexturename, ".png"):
            self.missingTexture = pyglet.image.load(self.resourceNameToPath(self.missingtexturename, ".png"))
        else:
            # Falls back to creating the pattern in memory
            self.missingTexture = pyglet.image.create(1, 1, pyglet.image.SolidColorImagePattern([255, 0, 255, 255]))
    return self.missingTexture
Adds a new texture from the given image. ``img`` may be any object that supports pyglet-style copying in form of the ``blit_to_texture()`` method. This can be used to add textures that come from non-file sources, e.g. render-to-texture.
def addFromTex(self, name, img, category):
    """
    Adds a new texture from the given image.

    ``img`` may be any object that supports pyglet-style copying in form of
    the ``blit_to_texture()`` method.

    This can be used to add textures that come from non-file sources, e.g.
    render-to-texture.
    """
    texreg = self.categoriesTexBin[category].add(img)
    #texreg = texreg.get_transform(True,True)
    # Mirrors the image due to how pyglet's coordinate system works
    # Strange behavior, sometimes needed and sometimes not
    self.categories[category][name] = texreg
    target = texreg.target
    texid = texreg.id
    texcoords = texreg.tex_coords
    # Prevents texture bleeding with texture sizes that are powers of 2,
    # else weird lines may appear at certain angles.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_LINEAR)
    glGenerateMipmap(GL_TEXTURE_2D)
    out = target, texid, texcoords
    self.categoriesTexCache[category][name] = out
    return out
Gets the model object by the given name. If it was loaded previously, a cached version will be returned. If it was not loaded, it will be loaded and inserted into the cache.
def getModel(self, name):
    """
    Gets the model object by the given name.

    If it was loaded previously, a cached version will be returned. If it
    was not loaded, it will be loaded and inserted into the cache.
    """
    if name in self.modelobjcache:
        return self.modelobjcache[name]
    return self.loadModel(name)
Loads the model of the given name. The model will also be inserted into the cache.
def loadModel(self, name):
    """
    Loads the model of the given name.

    The model will also be inserted into the cache.
    """
    m = model.Model(self.peng, self, name)
    self.modelobjcache[name] = m
    self.peng.sendEvent("peng3d:rsrc.model.load", {"peng": self.peng, "name": name})
    return m
Gets the model data associated with the given name. If it was loaded, a cached copy will be returned. If it was not loaded, it will be loaded and cached.
def getModelData(self, name):
    """
    Gets the model data associated with the given name.

    If it was loaded, a cached copy will be returned. If it was not loaded,
    it will be loaded and cached.
    """
    if name in self.modelcache:
        return self.modelcache[name]
    return self.loadModelData(name)
Loads the model data of the given name. The model file must always be a .json file.
def loadModelData(self, name):
    """
    Loads the model data of the given name.

    The model file must always be a .json file.
    """
    path = self.resourceNameToPath(name, ".json")
    try:
        data = json.load(open(path, "r"))
    except Exception:
        # Temporary
        print("Exception during model load: ")
        import traceback; traceback.print_exc()
        return {}  # will probably cause other exceptions later on, TODO

    out = {}

    if data.get("version", 1) == 1:
        # Currently only one version, basic future-proofing
        # This version should get incremented with breaking changes to the structure

        # Materials
        out["materials"] = {}
        for matname, matdata in data.get("materials", {}).items():
            m = model.Material(self, matname, matdata)
            out["materials"][matname] = m
        out["default_material"] = out["materials"][data.get("default_material", list(out["materials"].keys())[0])]

        # Bones
        out["bones"] = {"__root__": model.RootBone(self, "__root__", {"start_rot": [0, 0], "length": 0})}
        for bonename, bonedata in data.get("bones", {}).items():
            b = model.Bone(self, bonename, bonedata)
            out["bones"][bonename] = b
        for bonename, bone in out["bones"].items():
            if bonename == "__root__":
                continue
            bone.setParent(out["bones"][bone.bonedata["parent"]])

        # Regions
        out["regions"] = {}
        for regname, regdata in data.get("regions", {}).items():
            r = model.Region(self, regname, regdata)
            # Fall back to the default material by object, but look up
            # explicitly given materials by name
            r.material = out["materials"][regdata["material"]] if "material" in regdata else out["default_material"]
            r.bone = out["bones"][regdata.get("bone", "__root__")]
            out["bones"][regdata.get("bone", "__root__")].addRegion(r)
            out["regions"][regname] = r

        # Animations
        out["animations"] = {}
        out["animations"]["static"] = model.Animation(self, "static", {"type": "static", "bones": {}})
        for aniname, anidata in data.get("animations", {}).items():
            a = model.Animation(self, aniname, anidata)
            a.setBones(out["bones"])
            out["animations"][aniname] = a
        out["default_animation"] = out["animations"][data.get("default_animation", "static")]
    else:
        raise ValueError("Unknown version %s of model '%s'" % (data.get("version", 1), name))

    self.modelcache[name] = out
    return out
Sets the background of the Container. Similar to :py:meth:`peng3d.gui.SubMenu.setBackground()`\ , but only affects the region covered by the Container.
def setBackground(self, bg):
    """
    Sets the background of the Container.

    Similar to :py:meth:`peng3d.gui.SubMenu.setBackground()`\ , but only
    affects the region covered by the Container.
    """
    self.bg = bg
    if isinstance(bg, (list, tuple)):
        if len(bg) == 3 and isinstance(bg, list):
            bg.append(255)
        self.bg_vlist.colors = bg * 4
    elif bg in ["flat", "gradient", "oldshadow", "material"]:
        self.bg = ContainerButtonBackground(self, borderstyle=bg, batch=self.batch2d)
        self.redraw()
Adds a widget to this container. Note that trying to add the Container to itself will be ignored.
def addWidget(self, widget):
    """
    Adds a widget to this container.

    Note that trying to add the Container to itself will be ignored.
    """
    if self is widget:
        # Prevents adding the container to itself, which would cause
        # a recursion loop on redraw
        return
    self.widgets[widget.name] = widget
Draws the submenu and its background. Note that this leaves the OpenGL state set to 2d drawing and may modify the scissor settings.
def draw(self):
    """
    Draws the submenu and its background.

    Note that this leaves the OpenGL state set to 2d drawing and may modify
    the scissor settings.
    """
    if not self.visible:
        # Simple visibility check, has to be tested to see if it works properly
        return
    if not isinstance(self.submenu, Container):
        glEnable(GL_SCISSOR_TEST)
        glScissor(*self.pos + self.size)
    SubMenu.draw(self)
    if not isinstance(self.submenu, Container):
        glDisable(GL_SCISSOR_TEST)
Redraws the background and any child widgets.
def on_redraw(self):
    """
    Redraws the background and any child widgets.
    """
    x, y = self.pos
    sx, sy = self.size
    self.bg_vlist.vertices = [x, y, x + sx, y, x + sx, y + sy, x, y + sy]
    self.stencil_vlist.vertices = [x, y, x + sx, y, x + sx, y + sy, x, y + sy]
    if isinstance(self.bg, Background):
        if not self.bg.initialized:
            self.bg.init_bg()
            self.bg.initialized = True
        self.bg.redraw_bg()
Redraws the background and contents, including the scrollbar. This method will also check the scrollbar for any movement and will be called automatically on movement of the slider.
def on_redraw(self):
    """
    Redraws the background and contents, including the scrollbar.

    This method will also check the scrollbar for any movement and will be
    called automatically on movement of the slider.
    """
    n = self._scrollbar.n
    self.offset_y = -n  # Causes the content to move in the opposite direction of the slider
    # Size of scrollbar
    sx = 24  # Currently constant, TODO: add dynamic sx of scrollbar
    sy = self.size[1]
    # Pos of scrollbar
    x = self.size[0] - sx
    y = 0  # Currently constant, TODO: add dynamic y-pos of scrollbar
    # Dynamic pos/size may be added via align/lambda/etc.
    # Note that the values are written to the _* variant of the attribute
    # to avoid 3 unnecessary redraws
    self._scrollbar._size = sx, sy
    self._scrollbar._pos = x, y
    self._scrollbar._nmax = self.content_height
    super(ScrollableContainer, self).on_redraw()
AABB collision checker that can be used for most axis-aligned collisions. Intended for use in widgets to check if the mouse is within the bounds of a particular widget.
def mouse_aabb(mpos, size, pos):
    """
    AABB collision checker that can be used for most axis-aligned collisions.

    Intended for use in widgets to check if the mouse is within the bounds
    of a particular widget.
    """
    return pos[0] <= mpos[0] <= pos[0] + size[0] and pos[1] <= mpos[1] <= pos[1] + size[1]
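Usage is straightforward; note that the check is inclusive on all four edges:

```python
def mouse_aabb(mpos, size, pos):
    # pos is the lower-left corner of the box, size its (width, height)
    return pos[0] <= mpos[0] <= pos[0] + size[0] and pos[1] <= mpos[1] <= pos[1] + size[1]

print(mouse_aabb((5, 5), (10, 10), (0, 0)))   # → True, inside the box
print(mouse_aabb((15, 5), (10, 10), (0, 0)))  # → False, outside on the x axis
print(mouse_aabb((10, 10), (10, 10), (0, 0))) # → True, edges count as inside
```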
Adds a category with the given name. If the category already exists, a :py:exc:`KeyError` will be thrown. Use :py:meth:`updateCategory()` instead if you want to update a category.
def addCategory(self, name, nmin=0, n=0, nmax=100):
    """
    Adds a category with the given name.

    If the category already exists, a :py:exc:`KeyError` will be thrown.
    Use :py:meth:`updateCategory()` instead if you want to update a category.
    """
    assert isinstance(name, basestring)  # py2 compat is done at the top
    if name in self.categories:
        raise KeyError("Category with name '%s' already exists" % name)
    self.categories[name] = [nmin, n, nmax]
    self.redraw()
Smartly updates the given category. Only values that are given will be updated, others will be left unchanged. If the category does not exist, a :py:exc:`KeyError` will be thrown. Use :py:meth:`addCategory()` instead if you want to add a category.
def updateCategory(self, name, nmin=None, n=None, nmax=None):
    """
    Smartly updates the given category.

    Only values that are given will be updated, others will be left
    unchanged.

    If the category does not exist, a :py:exc:`KeyError` will be thrown.
    Use :py:meth:`addCategory()` instead if you want to add a category.
    """
    # smart update, only stuff that was given
    if name not in self.categories:
        raise KeyError("No Category with name '%s'" % name)
    if nmin is not None:
        self.categories[name][0] = nmin
    if n is not None:
        self.categories[name][1] = n
    if nmax is not None:
        self.categories[name][2] = nmax
    self.redraw()
    self.doAction("progresschange")
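The "smart update" boils down to overwriting only the slots that were actually passed; a standalone sketch with a plain dict standing in for the widget state:

```python
def update_category(categories, name, nmin=None, n=None, nmax=None):
    # Only values that are not None overwrite the stored [nmin, n, nmax] triple
    if name not in categories:
        raise KeyError("No Category with name '%s'" % name)
    for idx, val in enumerate((nmin, n, nmax)):
        if val is not None:
            categories[name][idx] = val

cats = {"hp": [0, 50, 100]}
update_category(cats, "hp", n=75)
print(cats["hp"])  # → [0, 75, 100]; nmin and nmax were left unchanged
```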
Deletes the category with the given name. If the category does not exist, a :py:exc:`KeyError` will be thrown.
def deleteCategory(self, name):
    """
    Deletes the category with the given name.

    If the category does not exist, a :py:exc:`KeyError` will be thrown.
    """
    if name not in self.categories:
        raise KeyError("No Category with name '%s'" % name)
    del self.categories[name]
    self.redraw()
Helper property containing the percentage this slider is "filled". This property is read-only.
@property
def p(self):
    """
    Helper property containing the percentage this slider is "filled".

    This property is read-only.
    """
    return (self.n - self.nmin) / max((self.nmax - self.nmin), 1)
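The same formula as a free function; the `max(..., 1)` guard avoids a division by zero when the range is empty:

```python
def fill_percentage(n, nmin, nmax):
    # Fraction of the [nmin, nmax] range that n covers, in [0, 1]
    return (n - nmin) / max(nmax - nmin, 1)

print(fill_percentage(75, 50, 100))  # → 0.5
print(fill_percentage(5, 5, 5))      # → 0.0, no crash thanks to the guard
```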
Adds a new layer to the stack, optionally at the specified z-value. ``layer`` must be an instance of Layer or subclasses. ``z`` can be used to override the index of the layer in the stack. Defaults to ``-1`` for appending.
def addLayer(self, layer, z=-1):
    """
    Adds a new layer to the stack, optionally at the specified z-value.

    ``layer`` must be an instance of Layer or subclasses.

    ``z`` can be used to override the index of the layer in the stack.
    Defaults to ``-1`` for appending.
    """
    # The z-value is the index this layer should be inserted at, or -1 for appending
    if not isinstance(layer, Layer):
        raise TypeError("layer must be an instance of Layer!")
    if z == -1:
        self.layers.append(layer)
    else:
        self.layers.insert(z, layer)
Map a buffer region using this attribute as an accessor.
def _get_region(self, buffer, start, count):
    '''Map a buffer region using this attribute as an accessor.

    The returned region can be modified as if the buffer was a contiguous
    array of this attribute (though it may actually be interleaved or
    otherwise non-contiguous).

    The returned region consists of a contiguous array of component data
    elements.  For example, if this attribute uses 3 floats per vertex, and
    the `count` parameter is 4, the number of floats mapped will be
    ``3 * 4 = 12``.

    :Parameters:
        `buffer` : `AbstractMappable`
            The buffer to map.
        `start` : int
            Offset of the first vertex to map.
        `count` : int
            Number of vertices to map.

    :rtype: `AbstractBufferRegion`
    '''
    byte_start = self.stride * start
    byte_size = self.stride * count
    array_count = self.count * count
    if self.stride == self.size or not array_count:
        # non-interleaved
        ptr_type = ctypes.POINTER(self.c_type * array_count)
        return buffer.get_region(byte_start, byte_size, ptr_type)
    else:
        # interleaved
        byte_start += self.offset
        byte_size -= self.offset
        elem_stride = self.stride // ctypes.sizeof(self.c_type)
        elem_offset = self.offset // ctypes.sizeof(self.c_type)
        ptr_type = ctypes.POINTER(
            self.c_type * int(count * elem_stride - elem_offset))
        region = buffer.get_region(byte_start, byte_size, ptr_type)
        return vertexbuffer.IndirectArrayRegion(
            region, array_count, self.count, elem_stride)
Draw vertices in the domain.
def _draw(self, mode, vertex_list=None):
    '''Draw vertices in the domain.

    If `vertex_list` is not specified, all vertices in the domain are drawn.
    This is the most efficient way to render primitives.

    If `vertex_list` specifies a `VertexList`, only primitives in that list
    will be drawn.

    :Parameters:
        `mode` : int
            OpenGL drawing mode, e.g. ``GL_POINTS``, ``GL_LINES``, etc.
        `vertex_list` : `VertexList`
            Vertex list to draw, or ``None`` for all lists in this domain.
    '''
    glPushClientAttrib(GL_CLIENT_VERTEX_ARRAY_BIT)
    for buffer, attributes in self.buffer_attributes:
        buffer.bind()
        for attribute in attributes:
            attribute.enable()
            attribute.set_pointer(attribute.buffer.ptr)
    if vertexbuffer._workaround_vbo_finish:
        glFinish()

    if vertex_list is not None:
        glDrawArrays(mode, vertex_list.start, vertex_list.count)
    else:
        starts, sizes = self.allocator.get_allocated_regions()
        primcount = len(starts)
        if primcount == 0:
            pass
        elif primcount == 1:
            # Common case
            glDrawArrays(mode, starts[0], int(sizes[0]))
        elif gl_info.have_version(1, 4):
            starts = (GLint * primcount)(*starts)
            sizes = (GLsizei * primcount)(*sizes)
            glMultiDrawArrays(mode, starts, sizes, primcount)
        else:
            for start, size in zip(starts, sizes):
                glDrawArrays(mode, start, size)

    for buffer, _ in self.buffer_attributes:
        buffer.unbind()
    glPopClientAttrib()
Patches the :py:mod:`pyglet.graphics.vertexattribute`\ , :py:mod:`pyglet.graphics.vertexbuffer` and :py:mod:`pyglet.graphics.vertexdomain` modules. This patch is only needed with Python 3.x and will be applied automatically when initializing :py:class:`Peng()`\ . The patches consist of simply converting some list indices, slices and other numbers to integers from floats with .0. These patches have not been tested thoroughly, but work with at least ``test.py`` and ``test_gui.py``\ . Can be enabled and disabled via :confval:`pyglet.patch.patch_float2int`\ .
def patch_float2int():
    """
    Patches the :py:mod:`pyglet.graphics.vertexattribute`\ ,
    :py:mod:`pyglet.graphics.vertexbuffer` and
    :py:mod:`pyglet.graphics.vertexdomain` modules.

    This patch is only needed with Python 3.x and will be applied
    automatically when initializing :py:class:`Peng()`\ .

    The patches consist of simply converting some list indices, slices and
    other numbers to integers from floats with .0.

    These patches have not been tested thoroughly, but work with at least
    ``test.py`` and ``test_gui.py``\ .

    Can be enabled and disabled via :confval:`pyglet.patch.patch_float2int`\ .
    """
    pyglet.graphics.vertexattribute.AbstractAttribute.get_region = _get_region
    pyglet.graphics.vertexbuffer.MappableVertexBufferObject.bind = _bind
    pyglet.graphics.vertexbuffer.IndirectArrayRegion.__setitem__ = _iar__setitem__
    pyglet.graphics.vertexdomain.VertexDomain.draw = _draw
Registers the given pyglet-style event handler for the given pyglet event. This function allows pyglet-style event handlers to receive events bridged through the peng3d event system. Internally, this function creates a lambda function that decodes the arguments and then calls the pyglet-style event handler. The ``raiseErrors`` flag is passed through to the peng3d event system and will cause any errors raised by this handler to be ignored. .. seealso:: See :py:meth:`~peng3d.peng.Peng.addEventListener()` for more information.
def register_pyglet_handler(peng, func, event, raiseErrors=False):
    """
    Registers the given pyglet-style event handler for the given pyglet event.

    This function allows pyglet-style event handlers to receive events
    bridged through the peng3d event system. Internally, this function
    creates a lambda function that decodes the arguments and then calls the
    pyglet-style event handler.

    The ``raiseErrors`` flag is passed through to the peng3d event system
    and will cause any errors raised by this handler to be ignored.

    .. seealso::
       See :py:meth:`~peng3d.peng.Peng.addEventListener()` for more
       information.
    """
    peng.addEventListener("pyglet:%s" % event, (lambda data: func(*data["args"])), raiseErrors)
Adds a callback to the specified action. All other positional and keyword arguments will be stored and passed to the function upon activation.
def addAction(self, action, func, *args, **kwargs):
    """
    Adds a callback to the specified action.

    All other positional and keyword arguments will be stored and passed to
    the function upon activation.
    """
    if not hasattr(self, "actions"):
        self.actions = {}
    if action not in self.actions:
        self.actions[action] = []
    self.actions[action].append((func, args, kwargs))
Helper method that calls all callbacks registered for the given action.
def doAction(self, action):
    """
    Helper method that calls all callbacks registered for the given action.
    """
    if not hasattr(self, "actions"):
        return
    for f, args, kwargs in self.actions.get(action, []):
        f(*args, **kwargs)
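The two methods together implement a tiny callback registry; a self-contained sketch of the same pattern:

```python
class Actionable:
    # Minimal sketch of the addAction/doAction pattern described above
    def add_action(self, action, func, *args, **kwargs):
        if not hasattr(self, "actions"):
            self.actions = {}
        self.actions.setdefault(action, []).append((func, args, kwargs))

    def do_action(self, action):
        if not hasattr(self, "actions"):
            return  # nothing registered yet
        for f, args, kwargs in self.actions.get(action, []):
            f(*args, **kwargs)

calls = []
w = Actionable()
w.add_action("click", calls.append, "clicked")
w.do_action("click")
w.do_action("unknown")  # silently does nothing, as in the original
print(calls)  # → ['clicked']
```

The `hasattr` guards mirror the original's lazy initialization, which lets the mixin work even if a subclass never calls the constructor.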
Generates a new ID. If ``reuse_ids`` was false, the new ID will be read from an internal counter which is also automatically increased. This means that the newly generated ID is already reserved. If ``reuse_ids`` was true, this method starts counting up from ``start_id`` until it finds an ID that is not currently known. Note that the ID is not reserved; this means that calling this method simultaneously from multiple threads may cause the same ID to be returned twice. Additionally, if the ID is greater or equal to ``max_id``\ , an :py:exc:`AssertionError` is raised.
def genNewID(self):
    """
    Generates a new ID.

    If ``reuse_ids`` was false, the new ID will be read from an internal
    counter which is also automatically increased. This means that the
    newly generated ID is already reserved.

    If ``reuse_ids`` was true, this method starts counting up from
    ``start_id`` until it finds an ID that is not currently known. Note
    that the ID is not reserved; this means that calling this method
    simultaneously from multiple threads may cause the same ID to be
    returned twice.

    Additionally, if the ID is greater or equal to ``max_id``\ , an
    :py:exc:`AssertionError` is raised.
    """
    if self.reuse_ids:
        i = self.start_id
        while True:
            if i not in self._data["reg"]:
                assert i <= self.max_id
                return i  # no need to change any variables
            i += 1
    else:
        with self.id_lock:
            # new id creation in lock, to avoid issues with multiple threads
            i = self._data["next_id"]
            assert i <= self.max_id
            self._data["next_id"] += 1
            return i
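With `reuse_ids` enabled, ID generation is a linear scan for the first free slot; a standalone sketch of that branch:

```python
def gen_new_id(reg, start_id=0, max_id=2**32):
    # reg maps already-assigned IDs to names; find the first unused ID
    i = start_id
    while i in reg:
        i += 1
    assert i <= max_id
    return i

print(gen_new_id({0: "air", 1: "stone", 3: "dirt"}))  # → 2, the first gap
print(gen_new_id({}))                                  # → 0, nothing assigned yet
```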
Registers a name to the registry. name is the name of the object and must be a string. force_id can be optionally set to override the automatic ID generation and force a specific ID. Note that using force_id is discouraged since it may cause problems when reuse_ids is false.
def register(self, name, force_id=None):
    """
    Registers a name to the registry.

    ``name`` is the name of the object and must be a string.

    ``force_id`` can be optionally set to override the automatic ID
    generation and force a specific ID.

    Note that using ``force_id`` is discouraged, since it may cause problems
    when ``reuse_ids`` is false.
    """
    with self.registry_lock:
        if force_id is None:
            new_id = self.genNewID()
        else:
            new_id = force_id
        self._data["reg"][new_id] = name
        return new_id
Takes in an object and normalizes it to its ID/integer representation. Currently, only integers and strings may be passed in, else a :py:exc:`TypeError` will be thrown.
def normalizeID(self, in_id):
    """
    Takes in an object and normalizes it to its ID/integer representation.

    Currently, only integers and strings may be passed in, else a
    :py:exc:`TypeError` will be thrown.
    """
    if isinstance(in_id, int):
        assert in_id in self._data["reg"]
        return in_id
    elif isinstance(in_id, str):
        assert in_id in self._data["reg"].inv
        return self._data["reg"].inv[in_id]
    else:
        raise TypeError("Only int and str can be converted to IDs")
Takes in an object and normalizes it to its name/string. Currently, only integers and strings may be passed in, else a :py:exc:`TypeError` will be thrown.
def normalizeName(self, in_name):
    """
    Takes in an object and normalizes it to its name/string.

    Currently, only integers and strings may be passed in, else a
    :py:exc:`TypeError` will be thrown.
    """
    if isinstance(in_name, str):
        assert in_name in self._data["reg"].inv
        return in_name
    elif isinstance(in_name, int):
        assert in_name in self._data["reg"]
        return self._data["reg"][in_name]
    else:
        raise TypeError("Only int and str can be converted to names")
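Both normalizers are thin lookups over the ID-to-name mapping; a sketch using a plain dict plus a computed inverse in place of the bidirectional dict the original relies on:

```python
def normalize_id(reg, in_id):
    # reg: id -> name; accepts either form and returns the integer ID
    if isinstance(in_id, int):
        assert in_id in reg
        return in_id
    elif isinstance(in_id, str):
        inv = {v: k for k, v in reg.items()}  # stand-in for the bidict .inv view
        assert in_id in inv
        return inv[in_id]
    else:
        raise TypeError("Only int and str can be converted to IDs")

reg = {1: "stone", 2: "dirt"}
print(normalize_id(reg, "dirt"))  # → 2
print(normalize_id(reg, 1))       # → 1
```

Recomputing the inverse on every call is of course less efficient than a real bidirectional dict; it only serves to make the example self-contained.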
Sets the view used to the specified ``name``\ . The name must be known to the world or else a :py:exc:`ValueError` is raised.
def setView(self, name):
    """
    Sets the view used to the specified ``name``\ .

    The name must be known to the world or else a :py:exc:`ValueError` is
    raised.
    """
    if name not in self.world.views:
        raise ValueError("Invalid viewname for world!")
    self.viewname = name
    self.view = self.world.getView(self.viewname)
Sets up the attributes used by :py:class:`Layer3D()` and calls :py:meth:`Layer3D.predraw()`\ .
def predraw(self):
    """
    Sets up the attributes used by :py:class:`Layer3D()` and calls
    :py:meth:`Layer3D.predraw()`\ .
    """
    self.cam = self.view.cam
    super(LayerWorld, self).predraw()
Adds the given layer at the given Z index. If ``z_index`` is not given, the Z index specified by the layer will be used.
def addLayer(self, layer, z_index=None):
    """
    Adds the given layer at the given Z index.

    If ``z_index`` is not given, the Z index specified by the layer will be
    used.
    """
    if z_index is None:
        z_index = layer.z_index
    i = 0
    for l, z in self.layers:
        if z > z_index:
            break
        i += 1
    self._layers[layer.name] = layer
    self.layers.insert(i, [layer, z_index])
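The insertion-point search keeps the layer list sorted by Z index; the same logic standalone:

```python
def insert_layer(layers, layer, z_index):
    # layers is a list of [layer, z] pairs kept in ascending z order;
    # walk until we pass z_index, then insert there
    i = 0
    for _, z in layers:
        if z > z_index:
            break
        i += 1
    layers.insert(i, [layer, z_index])

layers = [["background", 0], ["hud", 10]]
insert_layer(layers, "world", 5)
print([l for l, _ in layers])  # → ['background', 'world', 'hud']
```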
Redraws the given layer. :raises ValueError: If there is no Layer with the given name.
def redraw_layer(self, name):
    """
    Redraws the given layer.

    :raises ValueError: If there is no Layer with the given name.
    """
    if name not in self._layers:
        raise ValueError("Layer %s not part of widget, cannot redraw" % name)
    self._layers[name].on_redraw()
Draws all layers of this LayeredWidget. This should normally be unnecessary, since it is recommended that layers use vertex lists instead of OpenGL immediate mode.
def draw(self):
    """
    Draws all layers of this LayeredWidget.

    This should normally be unnecessary, since it is recommended that layers
    use vertex lists instead of OpenGL immediate mode.
    """
    super(LayeredWidget, self).draw()
    for layer, _ in self.layers:
        layer._draw()
Deletes all layers within this LayeredWidget before deleting itself. Recommended to call if you are removing the widget but not yet exiting the interpreter.
def delete(self):
    """
    Deletes all layers within this LayeredWidget before deleting itself.

    Recommended to call if you are removing the widget, but not yet exiting
    the interpreter.
    """
    for layer, _ in self.layers:
        layer.delete()
    self.layers = []
    self._layers = {}
    super(LayeredWidget, self).delete()
Called when the layer should be redrawn. If a subclass uses the :py:meth:`initialize()` method, it is very important to also call the superclass method to prevent crashes.
def on_redraw(self):
    """
    Called when the layer should be redrawn.

    If a subclass uses the :py:meth:`initialize()` method, it is very
    important to also call the superclass method to prevent crashes.
    """
    super(WidgetLayer, self).on_redraw()
    if not self._initialized:
        self.initialize()
        self._initialized = True
Property to be used for setting and getting the border of the layer. Note that setting this property causes an immediate redraw.
def border(self):
    """
    Property to be used for setting and getting the border of the layer.

    Note that setting this property causes an immediate redraw.
    """
    if callable(self._border):
        return util.WatchingList(self._border(*(self.widget.pos+self.widget.size)),self._wlredraw_border)
    else:
        return util.WatchingList(self._border,self._wlredraw_border)
Property to be used for setting and getting the offset of the layer. Note that setting this property causes an immediate redraw.
def offset(self):
    """
    Property to be used for setting and getting the offset of the layer.

    Note that setting this property causes an immediate redraw.
    """
    if callable(self._offset):
        return util.WatchingList(self._offset(*(self.widget.pos+self.widget.size)),self._wlredraw_offset)
    else:
        return util.WatchingList(self._offset,self._wlredraw_offset)
Returns the absolute position and size of the layer. This method is intended for use in vertex position calculation, as the border and offset have already been applied. The returned value is a 4-tuple of ``(sx,sy,ex,ey)``\ . The two values starting with an s are the "start" position, or the lower-left corner. The second pair of values signify the "end" position, or upper-right corner.
def getPos(self):
    """
    Returns the absolute position and size of the layer.

    This method is intended for use in vertex position calculation, as the border and offset have already been applied.

    The returned value is a 4-tuple of ``(sx,sy,ex,ey)``\ . The two values starting with an s are the "start" position, or the lower-left corner. The second pair of values signify the "end" position, or upper-right corner.
    """
    # Returns sx,sy,ex,ey
    # sx,sy are bottom-left/lowest
    # ex,ey are top-right/highest
    sx,sy = self.widget.pos[0]+self.border[0]+self.offset[0], self.widget.pos[1]+self.border[1]+self.offset[1]
    ex,ey = self.widget.pos[0]+self.widget.size[0]-self.border[0]+self.offset[0], self.widget.pos[1]+self.widget.size[1]-self.border[1]+self.offset[1]
    return sx,sy,ex,ey
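The arithmetic in ``getPos()`` can be sketched standalone. ``layer_pos`` is a hypothetical helper (not part of the real Layer API) and the position, size, border and offset values are made-up examples:

```python
# Hypothetical standalone sketch of the getPos() arithmetic.
# pos/size describe the widget, border/offset the layer; all values are examples.
def layer_pos(pos, size, border, offset):
    # Start corner: widget origin moved inward by the border, then shifted by the offset
    sx = pos[0] + border[0] + offset[0]
    sy = pos[1] + border[1] + offset[1]
    # End corner: widget origin plus size, moved inward by the border, then shifted
    ex = pos[0] + size[0] - border[0] + offset[0]
    ey = pos[1] + size[1] - border[1] + offset[1]
    return sx, sy, ex, ey

print(layer_pos((10, 20), (100, 50), (4, 4), (2, -3)))  # (16, 21, 108, 63)
```

Note how the offset shifts both corners by the same amount while the border shrinks the rectangle symmetrically.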
Returns the size of the layer with the border size already subtracted.
def getSize(self):
    """
    Returns the size of the layer, with the border size already subtracted.
    """
    return self.widget.size[0]-self.border[0]*2,self.widget.size[1]-self.border[1]*2
Adds an image to the internal registry. ``rsrc`` should be a 2-tuple of ``(resource_name,category)``\ .
def addImage(self,name,rsrc):
    """
    Adds an image to the internal registry.

    ``rsrc`` should be a 2-tuple of ``(resource_name,category)``\ .
    """
    self.imgs[name]=self.widget.peng.resourceMgr.getTex(*rsrc)
Switches the active image to the given name. :raises ValueError: If there is no such image
def switchImage(self,name):
    """
    Switches the active image to the given name.

    :raises ValueError: If there is no such image
    """
    if name not in self.imgs:
        raise ValueError("No image of name '%s'"%name)
    elif self.cur_img==name:
        return
    self.cur_img = name
    self.on_redraw()
Re-draws the text by calculating its position. Currently, the text will always be centered on the position of the layer.
def redraw_label(self):
    """
    Re-draws the text by calculating its position.

    Currently, the text will always be centered on the position of the layer.
    """
    # Convenience variables
    x,y,_,_ = self.getPos()
    sx,sy = self.getSize()
    self._label.x = x+sx/2.
    self._label.y = y+sy/2.
    self._label.width = sx
    # Height is not set, would look weird otherwise
    #self._label.height = sx
    self._label._update()
Re-draws the text by calculating its position. Currently, the text will always be centered on the position of the layer.
def redraw_label(self):
    """
    Re-draws the text by calculating its position.

    Currently, the text will always be centered on the position of the layer.
    """
    # Convenience variables
    x,y,_,_ = self.getPos()
    sx,sy = self.getSize()
    if self.font_name is not None:
        self._label.font_name = self.font_name
    if self.font_size is not None:
        self._label.font_size = self.font_size
    if self.font_color is not None:
        self._label.color = self.font_color
    self._label.x = x+sx/2.
    self._label.y = y+sy/2.
    self._label.width = sx
    # Height is not set, would look weird otherwise
    #self._label.height = sx
    self._label._update()
Overrideable function that generates the colors to be used by various styles. Should return a 5-tuple of ``(bg,o,i,s,h)``\ . ``bg`` is the base color of the background. ``o`` is the outer color, it is usually the same as the background color. ``i`` is the inner color, it is usually lighter than the background color. ``s`` is the shadow color, it is usually quite a bit darker than the background. ``h`` is the highlight color, it is usually quite a bit lighter than the background. The returned values may also be statically overridden by setting the :py:attr:`color_<var>` attribute to anything but ``None``\ .
def getColors(self):
    """
    Overrideable function that generates the colors to be used by various styles.

    Should return a 5-tuple of ``(bg,o,i,s,h)``\ .

    ``bg`` is the base color of the background.

    ``o`` is the outer color, it is usually the same as the background color.

    ``i`` is the inner color, it is usually lighter than the background color.

    ``s`` is the shadow color, it is usually quite a bit darker than the background.

    ``h`` is the highlight color, it is usually quite a bit lighter than the background.

    The returned values may also be statically overridden by setting the :py:attr:`color_<var>` attribute to anything but ``None``\ .
    """
    bg = self.widget.submenu.bg[:3] if isinstance(self.widget.submenu.bg,list) or isinstance(self.widget.submenu.bg,tuple) else [242,241,240]
    bg = bg if self.color_bg is None else self.color_bg
    o,i = bg, [min(bg[0]+8,255),min(bg[1]+8,255),min(bg[2]+8,255)]
    s,h = [max(bg[0]-40,0),max(bg[1]-40,0),max(bg[2]-40,0)], [min(bg[0]+12,255),min(bg[1]+12,255),min(bg[2]+12,255)]
    o = o if self.color_o is None else self.color_o
    i = i if self.color_i is None else self.color_i
    s = s if self.color_s is None else self.color_s
    h = h if self.color_h is None else self.color_h
    # Outer,Inner,Shadow,Highlight
    return bg,o,i,s,h
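The color derivation above boils down to offsetting each background channel and clamping to the 0..255 range. A minimal sketch (``derive_colors`` is an illustrative name; the default background ``[242, 241, 240]`` comes from the code above):

```python
def derive_colors(bg):
    # Outer color equals the background; the others are offset and clamped to 0..255
    clamp = lambda v: max(0, min(255, v))
    o = list(bg)                      # outer: same as background
    i = [clamp(c + 8) for c in bg]    # inner: slightly lighter
    s = [clamp(c - 40) for c in bg]   # shadow: noticeably darker
    h = [clamp(c + 12) for c in bg]   # highlight: slightly lighter than inner
    return list(bg), o, i, s, h

bg, o, i, s, h = derive_colors([242, 241, 240])
print(i, s, h)  # [250, 249, 248] [202, 201, 200] [254, 253, 252]
```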
Called to generate the vertices used by this layer. The length of the output of this method should be three times the :py:attr:`n_vertices` attribute. See the source code of this method for more information about the order of the vertices.
def genVertices(self):
    """
    Called to generate the vertices used by this layer.

    The length of the output of this method should be three times the :py:attr:`n_vertices` attribute.

    See the source code of this method for more information about the order of the vertices.
    """
    sx,sy,ex,ey = self.getPos()
    b = self.bborder

    # Vertex Naming
    # Y
    # |1  2  3  4
    # |5  6  7  8
    # |9  10 11 12
    # |13 14 15 16
    # +------> X

    # Border order
    # 4 2-tuples
    # Each marks x,y offset from the respective corner
    # tuples are in order topleft,topright,bottomleft,bottomright
    # indices:
    # 0,1:topleft; 2,3:topright; 4,5:bottomleft; 6,7:bottomright
    # For a simple border that is even, just repeat the first tuple three more times

    v1 = sx, ey
    v2 = sx+b[0], ey
    v3 = ex-b[2], ey
    v4 = ex, ey
    v5 = sx, ey-b[1]
    v6 = sx+b[0], ey-b[1]
    v7 = ex-b[2], ey-b[3]
    v8 = ex, ey-b[3]
    v9 = sx, sy+b[5]
    v10= sx+b[4], sy+b[5]
    v11= ex-b[6], sy+b[7]
    v12= ex, sy+b[7]
    v13= sx, sy
    v14= sx+b[4], sy
    v15= ex-b[6], sy
    v16= ex, sy

    # Layer is separated into 9 sections, naming:
    # 1 2 3
    # 4 5 6
    # 7 8 9
    # Within each section, vertices are given counter-clockwise, starting with the bottom-left
    # 4 3
    # 1 2
    # This is important when assigning colors

    q1 = v5 +v6 +v2 +v1
    q2 = v6 +v7 +v3 +v2
    q3 = v7 +v8 +v4 +v3
    q4 = v9 +v10+v6 +v5
    q5 = v10+v11+v7 +v6
    q6 = v11+v12+v8 +v7
    q7 = v13+v14+v10+v9
    q8 = v14+v15+v11+v10
    q9 = v15+v16+v12+v11

    return q1+q2+q3+q4+q5+q6+q7+q8+q9
DEPRECATED Reads a mesh saved in the HDF5 format.
def read_h5(hdfstore, group = ""):
    """
    DEPRECATED
    Reads a mesh saved in the HDF5 format.
    """
    hdf = hdfstore
    m = Mesh()
    m.elements.data = hdf["elements/connectivity"]
    m.nodes.data = hdf["nodes/xyz"]
    for key in hdf.keys():
        if key.startswith("/nodes/sets"):
            k = key.replace("/nodes/sets/", "")
            m.nodes.sets[k] = set(hdf[key])
        if key.startswith("/elements/sets"):
            k = key.replace("/elements/sets/", "")
            m.elements.sets[k] = set(hdf[key])
        if key.startswith("/elements/surfaces"):
            k = key.replace("/elements/surfaces/", "")
            m.elements.surfaces[k] = hdf[key]
        if key.startswith("/fields/"):
            if key.endswith("/metadata"):
                tag = key.split("/")[2]
                f = Field()
                f.metadata = hdf["fields/{0}/metadata".format(tag)]
                f.data = hdf["fields/{0}/data".format(tag)]
                f.master = m
                m.add_field(tag, f)
    hdf.close()
    return m
Reads a GMSH MSH file and returns a :class:`Mesh` instance. :arg path: path to MSH file. :type path: str
def read_msh(path):
    """
    Reads a GMSH MSH file and returns a :class:`Mesh` instance.

    :arg path: path to MSH file.
    :type path: str
    """
    elementMap = { 15:"point1",
                    1:"line2",
                    2:"tri3",
                    3:"quad4",
                    4:"tetra4",
                    5:"hexa8",
                    6:"prism6",
                    7:"pyra4",
                  }
    lines = np.array(open(path, "r").readlines())
    locs = {}
    nl = len(lines)
    for i in range(nl):
        line = lines[i].lower().strip()
        if line.startswith("$"):
            if line.startswith("$end"):
                locs[env].append(i)
            else:
                env = line[1:]
                locs[env] = [i]
    nodes = pd.read_csv(
            io.StringIO("\n".join(
                    lines[locs["nodes"][0]+2:locs["nodes"][1]])),
            sep = " ",
            names = ["labels", "x", "y", "z"])
    elements = {"labels":[], "etype":[], "conn": [], "tags":[], "sets":{}}
    for line in lines[locs["elements"][0]+2:locs["elements"][1]]:
        d = np.array([int(w) for w in line.split()])
        elements["labels"].append( d[0] )
        elements["etype"].append(elementMap[d[1]] )
        elements["tags"].append( d[3: 3+d[2]] )
        elements["conn"].append(d[3+d[2]:])
    elements["labels"] = np.array(elements["labels"])
    physicalNames = {}
    for line in lines[locs["physicalnames"][0]+2:locs["physicalnames"][1]]:
        w = line.split()
        physicalNames[int(w[1])] = w[2].replace('"', '')
    sets = {}
    tags = np.array([t[0] for t in elements["tags"]])
    for k in physicalNames.keys():
        sets[physicalNames[k]] = np.array([t == k for t in tags])
    """
    for tag, values in sets.items():
        elements["sets"][tag] = elements["labels"][values]
    """
    elements["sets"] = sets
    return Mesh(nlabels = nodes["labels"],
                coords = np.array(nodes[["x", "y", "z"]]),
                elabels = elements["labels"],
                conn = elements["conn"],
                types = elements["etype"],
                esets = elements["sets"])
Reads an Abaqus INP file.
def read_inp(path): """ Reads Abaqus inp file """ def lineInfo(line): out = {"type": "data"} if line[0] == "*": if line[1] == "*": out["type"] = "comment" out["text"] = line[2:] else: out["type"] = "command" words = line[1:].split(",") out["value"] = words[0].strip() out["options"] = {} for word in words[1:]: key, value = [s.strip() for s in word.split("=")] out["options"][key] = value return out def elementMapper(inpeltype): if inpeltype == "t3d2": return "Line2" if inpeltype[:3] in ["cps", "cpe", "cax"]: if inpeltype[3] == "3": return "tri3" if inpeltype[3] == "4": return "quad4" if inpeltype[:3] in ["c3d"]: if inpeltype[3] == "4": return "tetra4" if inpeltype[3] == "5": return "pyra5" if inpeltype[3] == "6": return "prism6" if inpeltype[3] == "8": return "hexa8" nlabels = [] coords = [] nsets = {} elabels = [] etypes = [] connectivity = [] esets = {} surfaces = {} # File preprocessing lines = np.array([l.strip().lower() for l in open(path).readlines()]) lines = [line for line in lines if len(line) != 0] # Data processing env, setlabel = None, None for line in lines: d = lineInfo(line) if d["type"] == "command": env = d["value"] # Nodes if env == "node": opt = d["options"] currentset = None if "nset" in opt.keys(): currentset = opt["nset"] nsets[currentset] = [] # Elements if env == "element": opt = d["options"] eltype = elementMapper(opt["type"]) currentset = None if "elset" in opt.keys(): currentset = opt["elset"] esets[currentset] = [] # Nsets if env == "nset": opt = d["options"] currentset = opt["nset"] nsets[currentset] = [] # Elsets if env == "elset": opt = d["options"] currentset = opt["elset"] esets[currentset] = [] # Surfaces if env == "surface": opt = d["options"] currentsurface = opt["name"] if opt["type"] == "element": surfaces[currentsurface] = [] if d["type"] == "data": words = line.strip().split(",") if env == "node": label = int(words[0]) nlabels.append(label) coords.append( np.array([np.float64(w) for w in words[1:4]]) ) if currentset != None: 
nsets[currentset].append(label) if env == "element": label = int(words[0]) elabels.append(label) connectivity.append( np.array( [np.int32(w) for w in words[1:] if len(w) != 0 ]) ) etypes.append(eltype) if currentset != None: esets[currentset].append(label) if env == "nset": nsets[currentset] += [int(w) for w in words if len(w) != 0] if env == "elset": esets[currentset] += [int(w) for w in words if len(w) != 0] if env == "surface": if opt["type"] == "element": surfaces[currentsurface].append([w.strip() for w in words]) surfaces2 = {} for tag, surface in surfaces.items(): surfaces2[tag] = [] for sdata in surface: labels = esets[sdata[0]] face = int(sdata[1].split("s")[1].strip())-1 for label in labels: surfaces2[tag].append((label, face)) return Mesh(nlabels = nlabels, coords = coords, nsets = nsets, elabels = elabels, etypes = etypes, connectivity = connectivity, esets = esets,)
Dumps the mesh to XDMF format.
def write_xdmf(mesh, path, dataformat = "XML"): """ Dumps the mesh to XDMF format. """ pattern = Template(open(MODPATH + "/templates/mesh/xdmf.xdmf").read()) attribute_pattern = Template(open(MODPATH + "/templates/mesh/xdmf_attribute.xdmf").read()) # MAPPINGS cell_map = { "tri3": 4, "quad4": 5, "tetra4": 6, "pyra5": 7, "prism6": 8, "hexa8": 9} # REFERENCES nodes, elements = mesh.nodes.data, mesh.elements.data fields = mesh.fields # NUMBERS Ne, Nn = len(elements), len(nodes) # NODES nodes_map = np.arange(nodes.index.max()+1) nodes_map[nodes.index] = np.arange(len(nodes.index)) nodes_map[0] = -1 # ELEMENTS cols = ["n{0}".format(i) for i in range(elements.shape[1]-1)] connectivities = mesh.elements.data[cols].as_matrix() connectivities[np.isnan(connectivities)] = 0 connectivities = connectivities.astype(np.int32) connectivities = nodes_map[connectivities] labels = np.array(elements.index) etypes = np.array([cell_map[t] for t in elements.etype]) lconn = Ne + (connectivities != -1).sum() # FIELDS fields_string = "" field_data = {} for tag, field in fields.items(): field_data[tag] = {} field.data.sort_index(inplace = True) fshape = field.data.shape[1] if fshape == 1: ftype = "Scalar" elif fshape == 3: ftype = "Vector" elif fshape == 2: ftype = "Vector" # UGLY HACK... field = copy.copy(field) field.data["v3"] = np.zeros_like(field.data.index) fields[tag] = field # BACK TO NORMAL elif fshape == 6: ftype = "Tensor6" elif fshape == 4: ftype = "Tensor6" # UGLY HACK... 
field = copy.copy(field) field.data["v13"] = np.zeros_like(field.data.index) field.data["v23"] = np.zeros_like(field.data.index) fields[tag] = field # BACK TO NORMAL if field.metadata.position == "Nodal": position = "Node" if field.metadata.position == "Element": position = "Cell" field_data[tag]["TAG"] = tag field_data[tag]["ATTRIBUTETYPE"] = ftype field_data[tag]["FORMAT"] = dataformat field_data[tag]["FIELD_DIMENSION"] = " ".join([str(l) for l in field.data.shape]) field_data[tag]["POSITION"] = position if dataformat == "XML": #NODES nodes_string = "\n".join([11*" " + "{0} {1} {2}".format( n.x, n.y, n.z) for i, n in nodes.iterrows()]) # ELEMENTS elements_string = "" for i in range(Ne): elements_string += 11*" " + str(etypes[i]) + " " c = connectivities[i] c = c[np.where(c != -1)] elements_string += " ".join([str(i) for i in c]) + "\n" elements_strings = elements_string[:-1] # FIELDS for tag, field in fields.items(): fdata = field.data.to_csv(sep = " ", index = False, header = False).split("\n") fdata = [11 * " " + l for l in fdata] fdata = "\n".join(fdata) field_data[tag]["DATA"] = fdata fields_string += attribute_pattern.substitute(**field_data[tag]) elif dataformat == "HDF": hdf = pd.HDFStore(path + ".h5") hdf.put("COORDS", mesh.nodes.data[list("xyz")]) flatconn = np.zeros(lconn, dtype = np.int32) pos = 0 for i in range(Ne): c = connectivities[i] c = c[np.where(c != -1)] lc = len(c) flatconn[pos] = etypes[i] flatconn[pos + 1 + np.arange(lc)] = c pos += 1 + lc hdf.put("CONNECTIVITY", pd.DataFrame(flatconn)) nodes_string = 11*" " + "{0}.h5:/COORDS/block0_values".format(path) elements_string = 11*" " + "{0}.h5:/CONNECTIVITY/block0_values".format(path) for tag, field in fields.items(): fstrings[tag] = fstrings[tag].replace("#DATA", 11*" " + "{0}.h5:/FIELDS/{1}/block0_values".format(path, tag)) fields_string += fstrings[tag] hdf.put("FIELDS/{0}".format(tag), fields.data) hdf.close() """ pattern = pattern.replace("#ELEMENT_NUMBER", str(Ne)) pattern = 
pattern.replace("#CONN_DIMENSION", str(lconn)) pattern = pattern.replace("#CONN_PATH", elements_string) pattern = pattern.replace("#NODE_NUMBER", str(Nn)) pattern = pattern.replace("#NODE_PATH", nodes_string) pattern = pattern.replace("#DATAFORMAT", dataformat) pattern = pattern.replace("#ATTRIBUTES", fields_string) """ fields_string = "\n".join([attribute_pattern.substitute(**value) for key, value in field_data.items()]) pattern = pattern.substitute( ELEMENT_NUMBER = str(Ne), CONN_DIMENSION = str(lconn), CONN_PATH = elements_string, NODE_NUMBER = str(Nn), NODE_PATH = nodes_string, DATAFORMAT = dataformat, ATTRIBUTES = fields_string) open(path + ".xdmf", "wb").write(pattern)
Exports the mesh to the INP format.
def write_inp(mesh, path = None, maxwidth = 40, sections = "solid"): """ Exports the mesh to the INP format. """ def set_to_inp(sets, keyword): ss = "" for sk in sets.keys(): labels = sets[sk].loc[sets[sk]].index.values labels = list(labels) labels.sort() if len(labels)!= 0: ss += "*{0}, {0}={1}\n".format(keyword, sk) ss += argiope.utils.list_to_string(labels) + "\n" return ss.strip() # DATA mesh = mesh.copy() # NODES nodes_output = (mesh.nodes.coords.to_csv(header = False).split()) nodes_output = ("\n".join([" " + s.replace(",", ", ") for s in nodes_output])) # NODE SETS if "sets" in mesh.nodes.columns.levels[0]: nsets = set_to_inp(mesh.nodes.sets, "NSET") else: nsets = "**" # SURFACES surf_output = [] if "surfaces" in mesh.elements.keys(): sk = mesh.elements.surfaces.keys() for sindex in np.unique(sk.labels[0]): slabel = sk.levels[0][sindex] surface = mesh.elements.surfaces[slabel] if surface.values.sum() != 0: mesh.surface_to_element_sets(slabel) surf_output.append( "*SURFACE, TYPE=ELEMENT, NAME={0}".format(slabel)) for findex in surface.keys(): if surface[findex].sum() != 0: surf_output.append(" _SURF_{0}_FACE{1}, S{1}".format(slabel, findex[1:])) else: surf_output.append("**") # ELEMENTS elements_output = "" for etype, group in mesh.elements.groupby((("type", "solver", ""),)): els = group.conn.replace(0, np.nan).to_csv(header = False, float_format='%.0f').split() elements_output += "*ELEMENT, TYPE={0}\n".format(etype) elements_output += ("\n".join([" " + s.strip().strip(","). 
replace(",", ", ") for s in els])) elements_output += "\n" elements_output = elements_output.strip() el_sets = {} # MATERIALS section_output = "" for material, group in mesh.elements.groupby("materials"): slabel = "_MAT_{0}".format(material) section_output += "*ELSET, ELSET=_MAT_{0}\n{1}\n".format( material, argiope.utils.list_to_string(group.index.values)) #mesh.elements[("sets", slabel, "")] = False #mesh.elements.loc[group.index, ("sets", slabel, "")] = True if sections == "solid": section_output += "*SOLID SECTION, ELSET=_MAT_{0}, MATERIAL={0}\n".format( material) # ELEMENTS SETS if "sets" in mesh.elements.columns.levels[0]: esets = set_to_inp(mesh.elements.sets.swaplevel(1,0, axis = 1)[""],"ELSET") else: esets = "**" """ ek = mesh.elements.sets.keys() for esindex in np.unique(ek.labels[0]): eslabel = ek.levels[0][esindex] eset = mesh.elements.sets[slabel] """ # PATTERN pattern = Template(open(MODPATH + "/templates/mesh/inp.inp").read()) pattern = pattern.substitute( NODES = nodes_output, NODE_SETS = nsets, ELEMENTS = elements_output, ELEMENT_SETS = esets, ELEMENT_SURFACES = "\n".join(surf_output), SECTIONS = section_output.strip()) pattern = pattern.strip() if path == None: return pattern else: open(path, "w").write(pattern)
Connectivity builder using Numba for a speed boost.
def _make_conn(shape):
    """
    Connectivity builder using Numba for a speed boost.
    """
    shape = np.array(shape)
    Ne = shape.prod()
    if len(shape) == 2:
        nx, ny = np.array(shape) + 1
        conn = np.zeros((Ne, 4), dtype = np.int32)
        counter = 0
        pattern = np.array([0,1,1+nx,nx])
        for j in range(shape[1]):
            for i in range(shape[0]):
                conn[counter] = pattern + 1 + i + j*nx
                counter += 1
    if len(shape) == 3:
        nx, ny, nz = np.array(shape) + 1
        conn = np.zeros((Ne, 8), dtype = np.int32)
        counter = 0
        pattern = np.array([0,1,1+nx,nx,nx*ny,1+nx*ny,1+(nx+1)*ny,(nx+1)*ny])
        for k in range(shape[2]):
            for j in range(shape[1]):
                for i in range(shape[0]):
                    conn[counter] = pattern + 1 + i + j*nx + k*nx*ny
                    counter += 1
    return conn
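The 2D branch of ``_make_conn`` can be sketched in pure Python (same pattern arithmetic, without NumPy or Numba; ``make_conn_2d`` is an illustrative name). Node labels are 1-based and each quad lists its corners counter-clockwise:

```python
def make_conn_2d(shape):
    # shape = (elements in x, elements in y); nodes per row is shape[0] + 1
    nx = shape[0] + 1
    pattern = [0, 1, 1 + nx, nx]  # bottom-left, bottom-right, top-right, top-left
    conn = []
    for j in range(shape[1]):
        for i in range(shape[0]):
            base = 1 + i + j * nx  # label of the element's bottom-left node
            conn.append([p + base for p in pattern])
    return conn

print(make_conn_2d((2, 2)))
# [[1, 2, 5, 4], [2, 3, 6, 5], [4, 5, 8, 7], [5, 6, 9, 8]]
```

Each element simply shifts the same corner pattern by its bottom-left node label, which is why the builder is a good fit for Numba's loop compilation.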
Returns a structured mesh. :arg shape: 2 or 3 integers (eg: shape = (10, 10, 10)). :type shape: tuple :arg dim: 2 or 3 floats (eg: dim = (4., 2., 1.)) :type dim: tuple .. note:: This function does not use GMSH for mesh generation.
def structured_mesh(shape = (2,2,2), dim = (1.,1.,1.)):
    """
    Returns a structured mesh.

    :arg shape: 2 or 3 integers (eg: shape = (10, 10, 10)).
    :type shape: tuple
    :arg dim: 2 or 3 floats (eg: dim = (4., 2., 1.))
    :type dim: tuple

    .. note::
       This function does not use GMSH for mesh generation.

    >>> import argiope as ag
    >>> mesh = ag.mesh.structured_mesh(shape = (10,10,10), dim = (1.,1.,1.))
    """
    # PREPROCESSING
    shape = np.array(shape)
    dim = np.array(dim)
    Ne = shape.prod()
    Nn = (shape + 1).prod()
    # LABELS
    nindex = np.arange(Nn) + 1
    eindex = np.arange(Ne) + 1
    # COORDINATES
    coords = [np.linspace(0., dim[i], shape[i] + 1) for i in range(len(shape))]
    coords = np.array(np.meshgrid(*coords))
    coords = np.array([c.swapaxes(0,1).flatten("F") for c in coords]).T
    if len(shape) == 2:
        c = coords
        coords = np.zeros((Nn, 3))
        coords[:, :2] = c
    # CONNECTIVITY
    conn = _make_conn(shape)
    # MESH INSTANCE
    mesh = Mesh(nlabels = nindex,
                coords = coords,
                elabels = eindex,
                conn = conn)
    if len(shape) == 2:
        mesh.elements[("type", "argiope")] = "quad4"
    if len(shape) == 3:
        mesh.elements[("type", "argiope")] = "hexa8"
    return mesh
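The node-coordinate part of ``structured_mesh`` amounts to an evenly spaced grid over the given dimensions. A pure-Python sketch for the 2D case (``grid_coords_2d`` is an illustrative helper; the actual node ordering produced by ``np.meshgrid`` above may differ):

```python
def grid_coords_2d(shape, dim):
    # (shape[0] x shape[1]) elements -> (shape[0]+1) x (shape[1]+1) nodes
    nx, ny = shape[0] + 1, shape[1] + 1
    coords = []
    for j in range(ny):
        for i in range(nx):
            # evenly spaced positions from 0 to dim along each axis, z fixed at 0
            coords.append((dim[0] * i / shape[0], dim[1] * j / shape[1], 0.0))
    return coords

print(len(grid_coords_2d((10, 10), (1., 1.))))  # 121 nodes
```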
Sets the node data. :arg nlabels: node labels. Items must be strictly positive and int typed in 1D array-like with shape :math:`(N_n)`. :type nlabels: 1D uint typed array-like :arg coords: node coordinates. Must be float typed 2D array-like of shape :math:`(N_n \times 3)`. :type coords: 2D float typed array-like :arg nsets: node sets. Contains boolean array-like of shape :math:`(N_n)`. :type nsets: dict
def set_nodes(self, nlabels = [], coords = [], nsets = {}, **kwargs):
    r"""
    Sets the node data.

    :arg nlabels: node labels. Items must be strictly positive and int typed in 1D array-like with shape :math:`(N_n)`.
    :type nlabels: 1D uint typed array-like
    :arg coords: node coordinates. Must be float typed 2D array-like of shape :math:`(N_n \times 3)`.
    :type coords: 2D float typed array-like
    :arg nsets: node sets. Contains boolean array-like of shape :math:`(N_n)`.
    :type nsets: dict
    """
    # DATA PREPROCESSING
    nlabels = np.array(nlabels).astype(np.int64)
    coords = np.array(coords).astype(np.float64)
    if (nlabels < 0).sum() > 0:
        raise ValueError("Node labels must be strictly positive.")
    if len(nlabels) != len(coords):
        raise ValueError("'nlabels' and 'coords' must have the same length")
    if coords.shape[1] != 3:
        raise ValueError("coordinates must be 3 dimensional.")
    # ATTRIBUTES CREATION
    columns = pd.MultiIndex.from_tuples((("coords", "x"),
                                         ("coords", "y"),
                                         ("coords", "z")))
    self.nodes = pd.DataFrame(data = coords,
                              columns = columns,
                              index = nlabels)
    self.nodes.index.name = "node"
    for k, v in nsets.items():
        v = np.array(v)
        if v.dtype != 'bool':
            raise ValueError("Sets must be boolean array-likes.")
        self.nodes["sets", k] = v
    self.nodes["sets", "all"] = True
Sets the element data. :arg elabels: element labels. Items must be strictly positive and int typed in 1D array-like with shape :math:`(N_e)`. :type elabels: 1D uint typed array-like :arg types: element types chosen among argiope specific element types. :type types: str typed array-like :arg stypes: element types chosen in solver (depends on the chosen solver) specific element types. :type stypes: str typed array-like :arg conn: connectivity table. In order to deal with non rectangular tables, :math:`0` can be used to fill missing data. :type conn: uint typed array-like :arg esets: element sets. Contains boolean array-like of shape :math:`(N_e)`. :type esets: dict :arg surfaces: surfaces. Contains boolean array-like of shape :math:`(N_e, N_s)` with :math:`N_s` being the maximum number of faces on a single element. :type surfaces: dict :arg materials: material keys. Any number of materials can be used. :type materials: str typed array-like
def set_elements(self, elabels = None, types = None, stypes = "",
                 conn = None, esets = {}, surfaces = {}, materials = "",
                 **kwargs):
    """
    Sets the element data.

    :arg elabels: element labels. Items must be strictly positive and int typed in 1D array-like with shape :math:`(N_e)`.
    :type elabels: 1D uint typed array-like
    :arg types: element types chosen among argiope specific element types.
    :type types: str typed array-like
    :arg stypes: element types chosen in solver (depends on the chosen solver) specific element types.
    :type stypes: str typed array-like
    :arg conn: connectivity table. In order to deal with non rectangular tables, :math:`0` can be used to fill missing data.
    :type conn: uint typed array-like
    :arg esets: element sets. Contains boolean array-like of shape :math:`(N_e)`.
    :type esets: dict
    :arg surfaces: surfaces. Contains boolean array-like of shape :math:`(N_e, N_s)` with :math:`N_s` being the maximum number of faces on a single element.
    :type surfaces: dict
    :arg materials: material keys. Any number of materials can be used.
    :type materials: str typed array-like
    """
    # COLUMNS BUILDING
    if elabels is None:
        warnings.warn(
            "Since no element labels were provided, no elements were created",
            Warning)
        self.elements = None
    else:
        columns = pd.MultiIndex.from_tuples([("type", "argiope", "")])
        self.elements = pd.DataFrame(data = types,
                                     columns = columns,
                                     index = elabels)
        self.elements.index.name = "element"
        self.elements.loc[:, ("type", "solver", "")] = stypes
        # Connectivity
        c = pd.DataFrame(conn, index = elabels)
        c.fillna(0, inplace = True)
        c[:] = c.values.astype(np.int32)
        c.columns = pd.MultiIndex.from_product(
            [["conn"],
             ["n{0}".format(n) for n in np.arange(c.shape[1])],
             [""]])
        self.elements = self.elements.join(c)
        # Sets
        for k, v in esets.items():
            self.elements[("sets", k, "")] = v
        self.elements["sets", "all", ""] = True
        # Surfaces
        for k, v in surfaces.items():
            for fk, vv in v.items():
                self.elements[("surfaces", k, "s{0}".format(fk))] = vv
        # Materials
        self.elements[("materials", "", "")] = materials
        self.elements.sort_index(axis = 1, inplace = True)
Sets the fields.
def set_fields(self, fields = None, **kwargs):
    """
    Sets the fields.
    """
    self.fields = []
    if fields != None:
        for field in fields:
            self.fields.append(field)
Adds the given fields to the list of fields.
def add_fields(self, fields = None, **kwargs):
    """
    Add the fields into the list of fields.
    """
    if fields != None:
        for field in fields:
            self.fields.append(field)
Checks element definitions.
def check_elements(self):
    """
    Checks element definitions.
    """
    # ELEMENT TYPE CHECKING
    existing_types = set(self.elements.type.argiope.values.flatten())
    allowed_types = set(ELEMENTS.keys())
    if (existing_types <= allowed_types) == False:
        raise ValueError("Element types {0} not in known elements {1}".format(
                         existing_types - allowed_types, allowed_types))
    print("<Elements: OK>")
Returns the dimension of the embedded space of each element.
def space(self):
    """
    Returns the dimension of the embedded space of each element.
    """
    return self.elements.type.argiope.map(lambda t: ELEMENTS[t].space)
Returns the number of vertices of each element according to its type.
def nvert(self):
    """
    Returns the number of vertices of each element according to its type.
    """
    return self.elements.type.argiope.map(lambda t: ELEMENTS[t].nvert)
Returns the decomposition of the elements. Inputs: * into: must be in ['edges', 'faces', 'simplices', 'angles'] * loc: None or labels of the chosen elements. * at: must be in ['labels', 'coords']
def split(self, into = "edges", loc = None, at = "labels", sort_index = True):
    """
    Returns the decomposition of the elements.

    Inputs:
    * into: must be in ['edges', 'faces', 'simplices', 'angles']
    * loc: None or labels of the chosen elements.
    * at: must be in ['labels', 'coords']
    """
    if type(loc) == type(None):
        elements = self.elements
    else:
        elements = self.elements.loc[loc]
    out = []
    for etype, group in elements.groupby([("type", "argiope", "")]):
        try:
            output_maps = getattr(ELEMENTS[etype], into)
            for om in range(len(output_maps)):
                oshape = len(output_maps[om])
                conn = group.conn
                columns = pd.MultiIndex.from_product(
                    [(om,), np.arange(oshape)],
                    names = [into, "vertex"])
                data = (conn.values[:, output_maps[om]]
                            .reshape(len(conn), oshape))
                df = pd.DataFrame(data = data,
                                  columns = columns,
                                  index = conn.index).stack((0,1))
                out.append(df)
        except:
            print("Can not extract '{0}' from '{1}'".format(into, etype))
    if len(out) != 0:
        out = pd.concat(out)
        out.sort_index(inplace = True)
        if at == "coords":
            data = self.nodes.coords.loc[out.values].values
            out = pd.DataFrame(index = out.index,
                               data = data,
                               columns = ["x", "y", "z"])
    return out
Returns a dataframe containing volume and centroids of all the elements.
def centroids_and_volumes(self, sort_index = True):
    """
    Returns a dataframe containing volume and centroids of all the elements.
    """
    elements = self.elements
    out = []
    for etype, group in self.elements.groupby([("type", "argiope", "")]):
        etype_info = ELEMENTS[etype]
        simplices_info = etype_info.simplices
        index = group.index
        simplices_data = self.split(into = "simplices",
                                    loc = index,
                                    at = "coords")
        simplices = simplices_data.values.reshape(index.size,
                                                  simplices_info.shape[0],
                                                  simplices_info.shape[1],
                                                  3)
        edges = simplices[:,:,1:] - simplices[:,:,:1]
        simplices_centroids = simplices.mean(axis = 2)
        if etype_info.space == 2:
            simplices_volumes = np.linalg.norm(
                np.cross(edges[:,:,0], edges[:,:,1], axis = 2),
                axis = 2)/2.
        elif etype_info.space == 3:
            simplices_volumes = (np.cross(edges[:,:,0],
                                          edges[:,:,1], axis = 2)
                                 * edges[:,:,2]).sum(axis = 2) / 6.
        elements_volumes = simplices_volumes.sum(axis = 1)
        elements_centroids = ((simplices_volumes.reshape(*simplices_volumes.shape, 1)
                               * simplices_centroids).sum(axis = 1)
                              / elements_volumes.reshape(*elements_volumes.shape, 1))
        volumes_df = pd.DataFrame(index = index,
                                  data = elements_volumes,
                                  columns = pd.MultiIndex.from_product(
                                      [["volume"], [""]]))
        centroids_df = pd.DataFrame(index = index,
                                    data = elements_centroids,
                                    columns = pd.MultiIndex.from_product(
                                        [["centroid"], ["x", "y", "z"]]))
        out.append(pd.concat([volumes_df, centroids_df], axis = 1))
    out = pd.concat(out)
    if sort_index:
        out.sort_index(inplace = True)
    return out.sort_index(axis = 1)
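The per-simplex rule used above reduces, in the 2D case, to the cross-product area formula; the element centroid is then the volume-weighted mean of its simplex centroids. A scalar sketch for a single triangle (``tri_area_centroid`` is an illustrative name):

```python
def tri_area_centroid(p0, p1, p2):
    # Two edge vectors from the first vertex
    e1 = (p1[0] - p0[0], p1[1] - p0[1])
    e2 = (p2[0] - p0[0], p2[1] - p0[1])
    # Area = |e1 x e2| / 2 (z component of the cross product)
    area = abs(e1[0] * e2[1] - e1[1] * e2[0]) / 2.0
    # Centroid = mean of the three vertices
    centroid = tuple(sum(c) / 3.0 for c in zip(p0, p1, p2))
    return area, centroid

print(tri_area_centroid((0., 0.), (1., 0.), (0., 1.)))
```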
Returns the internal angles of all elements and the associated statistics.
def angles(self, zfill = 3):
    """
    Returns the internal angles of all elements and the associated statistics.
    """
    elements = self.elements.sort_index(axis = 1)
    etypes = elements[("type", "argiope")].unique()
    out = []
    for etype in etypes:
        etype_info = ELEMENTS[etype]
        angles_info = etype_info.angles
        loc = elements[("type", "argiope", "")] == etype
        index = elements.loc[loc].index
        angles_data = self.split(into = "angles",
                                 loc = loc,
                                 at = "coords")
        data = angles_data.values.reshape(index.size,
                                          angles_info.shape[0],
                                          angles_info.shape[1],
                                          3)
        edges = data[:,:,[0,2],:] - data[:,:,1:2,:]
        edges /= np.linalg.norm(edges, axis = 3).reshape(index.size,
                                                         angles_info.shape[0],
                                                         2, 1)
        angles = np.degrees(np.arccos(
            (edges[:,:,0] * edges[:,:,1]).sum(axis = 2)))
        deviation = angles - etype_info.optimal_angles
        angles_df = pd.DataFrame(index = index,
                                 data = angles,
                                 columns = pd.MultiIndex.from_product(
                                     [["angles"],
                                      ["a" + "{0}".format(s).zfill(zfill)
                                       for s in range(angles_info.shape[0])]]))
        deviation_df = pd.DataFrame(index = index,
                                    data = deviation,
                                    columns = pd.MultiIndex.from_product(
                                        [["deviation"],
                                         ["d" + "{0}".format(s).zfill(zfill)
                                          for s in range(angles_info.shape[0])]]))
        df = pd.concat([angles_df, deviation_df], axis = 1).sort_index(axis = 1)
        df["stats", "max_angle"] = df.angles.max(axis = 1)
        df["stats", "min_angle"] = df.angles.min(axis = 1)
        df["stats", "max_angular_deviation"] = df.deviation.max(axis = 1)
        df["stats", "min_angular_deviation"] = df.deviation.min(axis = 1)
        df["stats", "max_abs_angular_deviation"] = abs(df.deviation).max(axis = 1)
        df = df.sort_index(axis = 1)
        out.append(df)
    out = pd.concat(out).sort_index(axis = 1)
    return out
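The per-corner computation above is the arccosine of the dot product of the two normalized edge vectors leaving the corner. A scalar sketch for one corner (``interior_angle`` is an illustrative name):

```python
import math

def interior_angle(prev_pt, corner, next_pt):
    # Edge vectors pointing away from the corner
    v1 = (prev_pt[0] - corner[0], prev_pt[1] - corner[1])
    v2 = (next_pt[0] - corner[0], next_pt[1] - corner[1])
    # Normalize, take the dot product, convert to degrees
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(cosang))

print(interior_angle((1., 0.), (0., 0.), (0., 1.)))
```

Comparing each angle to the element type's optimal angle (90° for quads, 60° for triangles) then gives the deviation statistics.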
Returns the edge lengths and aspect ratio of all elements.
def edges(self, zfill=3):
    """
    Returns the edge lengths and aspect ratio of all elements.
    """
    edges = self.split("edges", at="coords").unstack()
    edges["lx"] = edges.x[1] - edges.x[0]
    edges["ly"] = edges.y[1] - edges.y[0]
    edges["lz"] = edges.z[1] - edges.z[0]
    edges["l"] = np.linalg.norm(edges[["lx", "ly", "lz"]], axis=1)
    edges = (edges.l).unstack()
    edges.columns = pd.MultiIndex.from_product(
        [["length"], ["e" + "{0}".format(s).zfill(zfill)
                      for s in np.arange(edges.shape[1])]])
    edges[("stats", "lmax")] = edges.length.max(axis=1)
    edges[("stats", "lmin")] = edges.length.min(axis=1)
    edges[("stats", "aspect_ratio")] = edges.stats.lmax / edges.stats.lmin
    return edges.sort_index(axis=1)
Returns mesh quality and geometric stats.
def stats(self):
    """
    Returns mesh quality and geometric stats.
    """
    cv = self.centroids_and_volumes()
    angles = self.angles()
    edges = self.edges()
    return pd.concat([cv, angles[["stats"]], edges[["stats"]]],
                     axis=1).sort_index(axis=1)
Makes a node set from an element set.
def element_set_to_node_set(self, tag):
    """
    Makes a node set from an element set.
    """
    nodes, elements = self.nodes, self.elements
    loc = (elements.conn[elements[("sets", tag, "")]]
           .stack().stack().unique())
    loc = loc[loc != 0]
    nodes[("sets", tag)] = False
    nodes.loc[loc, ("sets", tag)] = True
Converts a node set to a surface.
def node_set_to_surface(self, tag):
    """
    Converts a node set to a surface.
    """
    # Create a dummy node with label 0
    nodes = self.nodes.copy()
    dummy = nodes.iloc[0].copy()
    dummy["coords"] *= np.nan
    dummy["sets"] = True
    nodes.loc[0] = dummy
    # Getting element surfaces
    element_surfaces = self.split("surfaces").unstack()
    # killer hack !
    surf = pd.DataFrame(
        nodes.sets[tag].loc[element_surfaces.values.flatten()]
        .values.reshape(element_surfaces.shape)
        .prod(axis=1)
        .astype(bool),
        index=element_surfaces.index).unstack().fillna(False)
    for k in surf.keys():
        self.elements["surfaces", tag, "f{0}".format(k[1] + 1)] = surf.loc[:, k]
Creates element sets corresponding to a surface.
def surface_to_element_sets(self, tag):
    """
    Creates element sets corresponding to a surface.
    """
    surface = self.elements.surfaces[tag]
    for findex in surface.keys():
        if surface[findex].sum() != 0:
            self.elements[("sets",
                           "_SURF_{0}_FACE{1}".format(tag, findex[1:]),
                           "")] = surface[findex]
Returns the mesh as a matplotlib polygon collection (tested only on 2D meshes).
def to_polycollection(self, *args, **kwargs):
    """
    Returns the mesh as a matplotlib polygon collection (tested only on 2D
    meshes).
    """
    from matplotlib import collections
    nodes, elements = self.nodes, self.elements.reset_index()
    verts = []
    index = []
    for etype, group in elements.groupby([("type", "argiope", "")]):
        index += list(group.index)
        nvert = ELEMENTS[etype].nvert
        conn = group.conn.values[:, :nvert].flatten()
        coords = nodes.coords[["x", "y"]].loc[conn].values.reshape(
            len(group), nvert, 2)
        verts += list(coords)
    verts = np.array(verts)
    verts = verts[np.argsort(index)]
    return collections.PolyCollection(verts, *args, **kwargs)
Returns the mesh as a matplotlib.tri.Triangulation instance (2D only).
def to_triangulation(self):
    """
    Returns the mesh as a matplotlib.tri.Triangulation instance (2D only).
    """
    from matplotlib.tri import Triangulation
    conn = self.split("simplices").unstack()
    coords = self.nodes.coords.copy()
    node_map = pd.Series(data=np.arange(len(coords)), index=coords.index)
    conn = node_map.loc[conn.values.flatten()].values.reshape(*conn.shape)
    return Triangulation(coords.x.values, coords.y.values, conn)
Returns fields metadata as a dataframe.
def fields_metadata(self):
    """
    Returns fields metadata as a dataframe.
    """
    return (pd.concat([f.metadata() for f in self.fields], axis=1)
            .transpose()
            .sort_values(["step_num", "frame", "label", "position"]))
Returns metadata as a pandas Series.
def metadata(self):
    """
    Returns metadata as a pandas Series.
    """
    return pd.Series({
        "part": self.part,
        "step_num": self.step_num,
        "step_label": self.step_label,
        "frame": self.frame,
        "frame_value": self.frame_value,
        "label": self.label,
        "position": self.position,
    })
Checks if required directories exist and creates them if needed.
def make_directories(self):
    """
    Checks if required directories exist and creates them if needed.
    """
    if not os.path.isdir(self.workdir):
        os.mkdir(self.workdir)
Runs the post-proc script.
def run_postproc(self):
    """
    Runs the post-proc script.
    """
    t0 = time.time()
    if self.verbose:
        print('#### POST-PROCESSING "{0}" USING POST-PROCESSOR "{1}"'.format(
            self.label, self.solver.upper()))
    if self.solver == "abaqus":
        command = '{0} viewer noGUI={1}_abqpp.py'.format(self.solver_path,
                                                         self.label)
        process = subprocess.Popen(command,
                                   cwd=self.workdir,
                                   shell=True,
                                   stdout=subprocess.PIPE,
                                   stderr=subprocess.STDOUT)
        for line in iter(process.stdout.readline, b''):
            line = line.rstrip().decode('utf8')
            print(" ", line)
    t1 = time.time()
    if self.verbose:
        print(' => POST-PROCESSED {0}: DURATION = {1:.2f}s >'.format(
            self.label, t1 - t0))
Makes the mesh using gmsh.
def run_gmsh(self):
    """
    Makes the mesh using gmsh.
    """
    argiope.utils.run_gmsh(gmsh_path=self.gmsh_path,
                           gmsh_space=self.gmsh_space,
                           gmsh_options=self.gmsh_options,
                           name=self.file_name + ".geo",
                           workdir=self.workdir)
    self.mesh = argiope.mesh.read_msh(self.workdir + self.file_name + ".msh")
Reads a history output report.
def read_history_report(path, steps, x_name=None):
    """
    Reads a history output report.
    """
    data = pd.read_csv(path, delim_whitespace=True)
    if x_name is not None:
        data[x_name] = data.X
        del data["X"]
    data["step"] = 0
    t = 0.
    for i in range(len(steps)):
        dt = steps[i].duration
        loc = data[data.t == t].index
        if len(loc) == 2:
            data.loc[loc[1]:, "step"] = i
        t += dt
    return data
Reads a field output report.
def read_field_report(path, data_flag="*DATA", meta_data_flag="*METADATA"):
    """
    Reads a field output report.
    """
    text = open(path).read()
    mdpos = text.find(meta_data_flag)
    dpos = text.find(data_flag)
    mdata = io.StringIO("\n".join(text[mdpos:dpos].split("\n")[1:]))
    data = io.StringIO("\n".join(text[dpos:].split("\n")[1:]))
    data = pd.read_csv(data, index_col=0)
    data = data.groupby(data.index).mean()
    mdata = pd.read_csv(mdata, sep="=", header=None, index_col=0)[1]
    mdata = mdata.to_dict()
    out = {}
    out["step_num"] = int(mdata["step_num"])
    out["step_label"] = mdata["step_label"]
    out["frame"] = int(mdata["frame"])
    out["frame_value"] = float(mdata["frame_value"])
    out["part"] = mdata["instance"]
    position_map = {"NODAL": "node",
                    "ELEMENT_CENTROID": "element",
                    "WHOLE_ELEMENT": "element"}
    out["position"] = position_map[mdata["position"]]
    out["label"] = mdata["label"]
    out["data"] = data
    field_class = getattr(argiope.mesh, mdata["argiope_class"])
    return field_class(**out)
Converts a list-like to a string with a given line width.
def list_to_string(l=range(200), width=40, indent=" "):
    """
    Converts a list-like to a string with a given line width.
    """
    l = [str(v) + "," for v in l]
    counter = 0
    out = "" + indent
    for w in l:
        s = len(w)
        if counter + s > width:
            out += "\n" + indent
            counter = 0
        out += w
        counter += s
    return out.strip(",")
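A quick way to sanity-check the wrapping behaviour is to repeat the same logic outside the module. This standalone sketch (the name `wrap_list` is ours, not argiope's) mirrors the loop above:

```python
def wrap_list(items, width=40, indent=" "):
    # Same logic as list_to_string: join comma-separated items,
    # breaking to a new indented line once `width` characters would
    # be exceeded, then strip the trailing comma.
    parts = [str(v) + "," for v in items]
    counter = 0
    out = "" + indent
    for w in parts:
        if counter + len(w) > width:
            out += "\n" + indent
            counter = 0
        out += w
        counter += len(w)
    return out.strip(",")

print(wrap_list(range(5), width=6))  # → " 0,1,2,\n 3,4"
```

Note that the width check happens before each item is appended, so a line can end exactly at `width` but never beyond it (except for a single item longer than `width`).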
Returns an Abaqus INP formatted string for a given linear equation.
def _equation(nodes=(1, 2), dofs=(1, 1), coefficients=(1., 1.), comment=None):
    """
    Returns an Abaqus INP formatted string for a given linear equation.
    """
    N = len(nodes)
    if comment is None:
        out = ""
    else:
        out = "**EQUATION: {0}\n".format(comment)
    out += "*EQUATION\n {0}\n ".format(N)
    out += "\n ".join([",".join([str(nodes[i]),
                                 str(int(dofs[i])),
                                 str(coefficients[i])])
                       for i in range(N)])
    return out
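To see what this helper emits, here is a standalone copy of the string assembly with the same defaults (a sketch for illustration only; no Abaqus is required to run it):

```python
def equation_inp(nodes=(1, 2), dofs=(1, 1), coefficients=(1., 1.),
                 comment=None):
    # Builds a *EQUATION card: an optional comment line, the number of
    # terms, then one "node,dof,coefficient" triplet per line.
    N = len(nodes)
    out = "" if comment is None else "**EQUATION: {0}\n".format(comment)
    out += "*EQUATION\n {0}\n ".format(N)
    out += "\n ".join(",".join([str(nodes[i]),
                                str(int(dofs[i])),
                                str(coefficients[i])])
                      for i in range(N))
    return out

print(equation_inp())
```

With the defaults this prints a two-term card tying dof 1 of nodes 1 and 2 with unit coefficients.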
Returns a set as an INP string with the unsorted option.
def _unsorted_set(df, label, **kwargs):
    """
    Returns a set as an INP string with the unsorted option.
    """
    out = "*NSET, NSET={0}, UNSORTED\n".format(label)
    labels = df.index.values
    return out + argiope.utils.list_to_string(labels, **kwargs)
Parses the API response and raises appropriate errors if raise_errors was set to True
def parse_response(self, response):
    """Parses the API response and raises appropriate errors if raise_errors
    was set to True.

    :param response: response from requests http call
    :returns: dictionary of response
    :rtype: dict
    """
    payload = None
    try:
        if isinstance(response.json, collections.Callable):
            payload = response.json()
        else:
            # json isn't callable in old versions of requests
            payload = response.json
    except ValueError:
        # response does not have JSON content
        payload = response.content
    if not self._raise_errors:
        return payload
    else:
        if response.status_code == 401:
            raise AuthenticationError(payload['message'])
        elif response.status_code == 500:
            raise ServerError(payload['message'])
        elif isinstance(payload, dict) and not payload['success']:
            raise APIError(payload['message'])
        else:
            return payload
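The error-raising branch boils down to a status-code lookup. A minimal standalone sketch of that mapping — the three exception classes here are stand-ins for the client's own, and `raise_for_payload` is a hypothetical name:

```python
class AuthenticationError(Exception):
    pass

class ServerError(Exception):
    pass

class APIError(Exception):
    pass

def raise_for_payload(status_code, payload):
    # Mirrors the branch order above: 401 -> auth error, 500 -> server
    # error, otherwise an unsuccessful payload -> generic API error.
    if status_code == 401:
        raise AuthenticationError(payload['message'])
    elif status_code == 500:
        raise ServerError(payload['message'])
    elif isinstance(payload, dict) and not payload['success']:
        raise APIError(payload['message'])
    return payload
```

Checking `success` last matters: an HTTP-level failure (401/500) should win over the application-level flag in the body.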
Builds the url for the specified method and arguments and returns the response as a dictionary.
def _get(self, method, **kwargs):
    """Builds the url for the specified method and arguments and returns
    the response as a dictionary.
    """
    payload = kwargs.copy()
    payload['api_key'] = self.api_key
    payload['api_secret'] = self.api_secret
    to = payload.pop('to', None)
    if to:
        if isinstance(to, basestring):
            payload['to'] = to
        else:
            # Presumably it's a list or tuple
            for num_i, fax_num in enumerate(to):
                payload['to[%d]' % num_i] = fax_num
    files = payload.pop('files', [])
    if not isinstance(files, (list, tuple)):
        files = (files,)
    req_files = {}
    for file_i, f in enumerate(files):
        if isinstance(f, basestring):
            req_files['filename[%d]' % file_i] = open(f, 'rb')
        else:
            f.seek(0)
            req_files['filename[%d]' % file_i] = f
    url = '%s/v%d/%s' % (self.BASE_URL, self.VERSION, method)
    r = requests.post(url, data=payload, files=req_files)
    return self.parse_response(r)
Returns the material definition as a string in Abaqus INP format.
def write_inp(self):
    """
    Returns the material definition as a string in Abaqus INP format.
    """
    template = self.get_template()
    return template.substitute({"class": self.__class__.__name__,
                                "label": self.label}).strip()
Returns the material definition as a string in Abaqus INP format.
def write_inp(self):
    """
    Returns the material definition as a string in Abaqus INP format.
    """
    template = self.get_template()
    plastic_table = self.get_plastic_table()
    return template.substitute({
        "class": self.__class__.__name__,
        "label": self.label,
        "young_modulus": self.young_modulus,
        "poisson_ratio": self.poisson_ratio,
        "plastic_table": (plastic_table[["stress", "plastic_strain"]]
                          .to_csv(header=False, index=False, sep=",")
                          .strip())}).strip()
Calculates the plastic data.
def get_plastic_table(self):
    """
    Calculates the plastic data.
    """
    E = self.young_modulus
    sy = self.yield_stress
    n = self.hardening_exponent
    eps_max = self.max_strain
    Np = self.strain_data_points
    ey = sy / E
    s = 10. ** np.linspace(0., np.log10(eps_max / ey), Np)
    strain = ey * s
    stress = sy * s ** n
    plastic_strain = strain - stress / E
    return pd.DataFrame({"strain": strain,
                         "stress": stress,
                         "plastic_strain": plastic_strain})
Calculates the plastic data.
def get_plastic_table(self):
    """
    Calculates the plastic data.
    """
    K = self.consistency
    sy = self.yield_stress
    n = self.hardening_exponent
    eps_max = self.max_strain
    Np = self.strain_data_points
    plastic_strain = np.linspace(0., eps_max, Np)
    stress = sy + K * plastic_strain ** n
    return pd.DataFrame({"stress": stress,
                         "plastic_strain": plastic_strain})
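This table implements a Ludwik-type hardening law, stress = sy + K * eps_p**n. A plain-Python spot check of one point on the curve (the function name and the numeric values are illustrative, not from argiope):

```python
def ludwik_stress(plastic_strain, yield_stress, consistency, exponent):
    # Flow stress for a given plastic strain under the Ludwik law
    # used by get_plastic_table above: sigma = sy + K * eps_p**n.
    return yield_stress + consistency * plastic_strain ** exponent

# With sy = 100, K = 50 and n = 0.5, a plastic strain of 4 %
# hardens the material by 50 * sqrt(0.04) = 10 stress units.
print(ludwik_stress(0.04, 100.0, 50.0, 0.5))  # → 110.0
```

At zero plastic strain the expression reduces to the yield stress, which is why the tabulated curve starts exactly at `sy`.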
Returns the DNA/DNA melting temp using nearest-neighbor thermodynamics.
def temp(s, DNA_c=5000.0, Na_c=10.0, Mg_c=20.0, dNTPs_c=10.0,
         uncorrected=False):
    '''
    Returns the DNA/DNA melting temp using nearest-neighbor thermodynamics.

    This function returns better results than EMBOSS DAN because it uses
    updated thermodynamics values and takes into account initialization
    parameters from the work of SantaLucia (1998). Corrects for mono- and
    divalent cation concentrations.

    Arguments:
    - DNA_c: DNA concentration [nM]
    - Na_c: Na+ concentration [mM]
    - Mg_c: Mg2+ concentration [mM]
    - dNTPs_c: dNTP concentration [mM]
    - uncorrected: skip the cation concentration correction?
    '''
    from math import log, sqrt

    R = 1.987  # Universal gas constant (cal/(K*mol))
    s = s.upper()
    dh, ds = _tercorr(s)
    k = DNA_c * 1e-9
    # Adapted from Table 1 in Allawi and SantaLucia (1997).
    # delta H (kcal/mol)
    dh_coeffs = {"AA": -7.9, "TT": -7.9,
                 "AT": -7.2, "TA": -7.2,
                 "CA": -8.5, "TG": -8.5,
                 "GT": -8.4, "AC": -8.4,
                 "CT": -7.8, "AG": -7.8,
                 "GA": -8.2, "TC": -8.2,
                 "CG": -10.6, "GC": -9.8,
                 "GG": -8.0, "CC": -8.0}
    # delta S (eu)
    ds_coeffs = {"AA": -22.2, "TT": -22.2,
                 "AT": -20.4, "TA": -21.3,
                 "CA": -22.7, "TG": -22.7,
                 "GT": -22.4, "AC": -22.4,
                 "CT": -21.0, "AG": -21.0,
                 "GA": -22.2, "TC": -22.2,
                 "CG": -27.2, "GC": -24.4,
                 "GG": -19.9, "CC": -19.9}
    # Multiplies the number of times each nuc pair is in the sequence by the
    # appropriate coefficient, then returns the sum of all the pairs
    dh = dh + sum(_overcount(s, pair) * coeff
                  for pair, coeff in dh_coeffs.items())
    ds = ds + sum(_overcount(s, pair) * coeff
                  for pair, coeff in ds_coeffs.items())
    # GC fraction of the sequence
    fgc = len([x for x in s if x in 'GC']) / float(len(s))

    # Melting temperature
    tm = (1000 * dh) / (ds + (R * log(k)))
    if uncorrected:
        return tm - 273.15

    MNa = Na_c * 1e-3
    MMg = Mg_c * 1e-3
    MdNTPs = dNTPs_c * 1e-3
    # Free magnesium concentration
    Ka = 3e4  # association constant in biological buffers
    D = (Ka * MdNTPs - Ka * MMg + 1)**2 + (4 * Ka * MMg)
    Fmg = (-(Ka * MdNTPs - Ka * MMg + 1) + sqrt(D)) / (2 * Ka)
    cation_ratio = sqrt(Fmg) / MNa if MNa > 0 else 7.0
    if cation_ratio < 0.22:
        tm = 1 / ((1 / tm) +
                  ((4.29 * fgc - 3.95) * log(MNa) +
                   0.94 * log(MNa)**2) * 1e-5)
    else:
        a = 3.92
        d = 1.42
        g = 8.31
        Fmg = MMg
        if cation_ratio < 6.0:
            a = a * (0.843 - 0.352 * sqrt(MNa) * log(MNa))
            d = d * (1.279 - 4.03 * log(MNa) * 1e-3 -
                     8.03 * log(MNa)**2 * 1e-3)
            g = g * (0.486 - 0.258 * log(MNa) + 5.25 * log(MNa)**3 * 1e-3)
        tm = 1 / ((1 / tm) +
                  (a - 0.911 * log(Fmg) + fgc * (6.26 + d * log(Fmg)) +
                   1 / (2 * (len(s) - 1)) *
                   (-48.2 + 52.5 * log(Fmg) + g * log(Fmg)**2)) * 1e-5)
    return tm - 273.15
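The `_overcount` helper used for the pair sums (defined elsewhere in the module) must count overlapping occurrences, since dinucleotide stacks share bases. A minimal sketch of that behaviour, with an illustrative name of our own:

```python
def overcount(seq, pair):
    # Counts occurrences of `pair` in `seq`, allowing overlaps --
    # str.count would miss the middle "AA" in "AAAA".
    return sum(1 for i in range(len(seq) - len(pair) + 1)
               if seq[i:i + len(pair)] == pair)

print(overcount("AAAA", "AA"))  # → 3
```

Compare with `"AAAA".count("AA")`, which returns 2 because it only counts non-overlapping matches.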
Writes an xy_report based on xy data.
def write_xy_report(odb, path, tags, columns, steps):
    """
    Writes an xy_report based on xy data.
    """
    xyData = [session.XYDataFromHistory(name=columns[i],
                                        odb=odb,
                                        outputVariableName=tags[i],
                                        steps=steps)
              for i in xrange(len(tags))]
    session.xyReportOptions.setValues(numDigits=8, numberFormat=SCIENTIFIC)
    session.writeXYReport(fileName=path, appendMode=OFF, xyData=xyData)
Writes a field report and rewrites it in a cleaner format.
def write_field_report(odb, path, label, argiope_class, variable, instance,
                       output_position, step=-1, frame=-1,
                       sortItem='Node Label'):
    """
    Writes a field report and rewrites it in a cleaner format.
    """
    stepKeys = get_steps(odb)
    step = xrange(len(stepKeys))[step]
    frame = xrange(get_frames(odb, stepKeys[step]))[frame]
    nf = NumberFormat(numDigits=9, precision=0, format=SCIENTIFIC)
    session.fieldReportOptions.setValues(printTotal=OFF,
                                         printMinMax=OFF,
                                         numberFormat=nf)
    leaf = dgo.LeafFromPartInstance(partInstanceName=instance)
    session.viewports['Viewport: 1'].odbDisplay.displayGroup.replace(leaf=leaf)
    session.writeFieldReport(fileName=path,
                             append=OFF,
                             sortItem=sortItem,
                             odb=odb,
                             step=step,
                             frame=frame,
                             outputPosition=output_position,
                             variable=variable)
    lines = [line.strip() for line in open(path).readlines()]
    isdata = -1
    data = []
    for line in lines:
        if isdata == 1:
            if len(line) == 0:
                isdata -= 1
            else:
                data.append(line)
        elif isdata < 1:
            if line.startswith("--"):
                isdata += 1
    data = "\n".join([",".join(line.split()) for line in data
                      if len(line) != 0])
    # HEADER
    header = str(output_position).lower() + ","
    header += ",".join([v[1] for v in variable[0][2]]) + "\n"
    # METADATA
    metadata = (
        ("label", label),
        ("argiope_class", argiope_class),
        ("odb", odb.path),
        ("instance", instance),
        ("position", output_position),
        ("step_num", step),
        ("step_label", stepKeys[step]),
        ("frame", frame),
        ("frame_value", odb.steps[stepKeys[step]].frames[frame].frameValue),
    )
    out = "*METADATA\n{0}\n*DATA\n{1}".format(
        "\n".join(["{0}={1}".format(k, v) for k, v in metadata]),
        header + data)
    open(path, "w").write(out)
Display a dashboard from the dashboard file(s) provided in the DASHBOARDS argument: paths and/or URLs for dashboards (URLs must start with http or https).
def start(dashboards, once, secrets):
    """Display a dashboard from the dashboard file(s) provided in the
    DASHBOARDS argument: paths and/or URLs for dashboards (URLs must start
    with http or https).
    """
    if secrets is None:
        secrets = os.path.join(os.path.expanduser("~"),
                               ".doodledashboard/secrets")

    try:
        loaded_secrets = try_read_secrets_file(secrets)
    except InvalidSecretsException as err:
        click.echo(get_error_message(err, default="Secrets file is invalid"),
                   err=True)
        raise click.Abort()

    read_configs = ["""
    dashboard:
      display:
        type: console
    """]
    for dashboard_file in dashboards:
        read_configs.append(read_file(dashboard_file))

    dashboard_config = DashboardConfigReader(initialise_component_loader(),
                                             loaded_secrets)
    try:
        dashboard = read_dashboard_from_config(dashboard_config, read_configs)
    except YAMLError as err:
        click.echo(
            get_error_message(err, default="Dashboard configuration is invalid"),
            err=True)
        raise click.Abort()

    try:
        DashboardValidator().validate(dashboard)
    except ValidationException as err:
        click.echo(
            get_error_message(err, default="Dashboard configuration is invalid"),
            err=True)
        raise click.Abort()

    explain_dashboard(dashboard)
    click.echo("Dashboard running...")

    while True:
        try:
            DashboardRunner(dashboard).cycle()
        except SecretNotFound as err:
            click.echo(
                get_error_message(err,
                                  default="Datafeed didn't have required secret"),
                err=True)
            raise click.Abort()
        if once:
            break
View the output of the datafeeds and/or notifications used in your DASHBOARDS.
def view(action, dashboards, secrets):
    """View the output of the datafeeds and/or notifications used in your
    DASHBOARDS."""
    if secrets is None:
        secrets = os.path.join(os.path.expanduser("~"),
                               ".doodledashboard/secrets")

    try:
        loaded_secrets = try_read_secrets_file(secrets)
    except InvalidSecretsException as err:
        click.echo(get_error_message(err, default="Secrets file is invalid"),
                   err=True)
        raise click.Abort()

    dashboard_config = DashboardConfigReader(initialise_component_loader(),
                                             loaded_secrets)
    read_configs = [read_file(f) for f in dashboards]
    dashboard = read_dashboard_from_config(dashboard_config, read_configs)

    try:
        messages = DashboardRunner(dashboard).poll_datafeeds()
    except SecretNotFound as err:
        click.echo(
            get_error_message(err,
                              default="Datafeed didn't have required secret"),
            err=True)
        raise click.Abort()

    cli_output = {"source-data": messages}
    if action == "notifications":
        cli_output["notifications"] = []
        for notification in dashboard.notifications:
            notification_output = notification.create(messages)
            filtered_messages = messages
            if isinstance(notification, FilteredNotification):
                filtered_messages = notification.filter_messages(messages)
            cli_output["notifications"].append({
                "filtered-messages": filtered_messages,
                "notification": str(notification_output)
            })

    json_output = json.dumps(cli_output, sort_keys=True, indent=4,
                             cls=MessageJsonEncoder)
    click.echo(json_output)
List components that are available on your machine
def list(component_type):
    """List components that are available on your machine"""
    config_loader = initialise_component_loader()
    component_types = sorted({
        "displays": lambda: config_loader.load_by_type(ComponentType.DISPLAY),
        "datafeeds": lambda: config_loader.load_by_type(ComponentType.DATA_FEED),
        "filters": lambda: config_loader.load_by_type(ComponentType.FILTER),
        "notifications": lambda: config_loader.load_by_type(
            ComponentType.NOTIFICATION)
    }.items(), key=lambda t: t[0])

    def print_ids(creators):
        ids = {c.id_key_value[1] if hasattr(c, "id_key_value") else c.get_id()
               for c in creators}
        for i in sorted(ids):
            click.echo(" - %s" % i)

    for k, v in component_types:
        if component_type == k or component_type == "all":
            click.echo("Available %s:" % k)
            print_ids(v())
            if component_type == "all":
                click.echo("")
Parses the section of configuration pertaining to a component.
def parse(self, config):
    """
    Parses the section of configuration pertaining to a component.

    :param config: dict of the specific config section
    :return: the parsed component
    """
    if "type" not in config:
        raise InvalidConfigurationException(
            "The dashboard configuration has not defined a 'type'. %s" % config)

    component_type = config["type"]
    component_config = self._get_config_by_id(self._component_configs,
                                              component_type)
    if not component_config:
        raise ComponentNotFoundForType(component_type)

    options = config.get("options", {})
    component = self._parse_item(component_config, options, config)
    component.name = config.get("name", "")
    return component