package aoc2021
import (
"fmt"
"strconv"
"strings"
)
/*
--- Day 11: Dumbo Octopus ---
You enter a large cavern full of rare bioluminescent dumbo octopuses! They seem to not like the Christmas lights on your submarine, so you turn them off for now.
There are 100 octopuses arranged neatly in a 10 by 10 grid. Each octopus slowly gains energy over time and flashes brightly for a moment when its energy is full. Although your lights are off, maybe you could navigate through the cave without disturbing the octopuses if you could predict when the flashes of light will happen.
Each octopus has an energy level - your submarine can remotely measure the energy level of each octopus (your puzzle input). For example:
5483143223
2745854711
5264556173
6141336146
6357385478
4167524645
2176841721
6882881134
4846848554
5283751526
The energy level of each octopus is a value between 0 and 9. Here, the top-left octopus has an energy level of 5, the bottom-right one has an energy level of 6, and so on.
You can model the energy levels and flashes of light in steps. During a single step, the following occurs:
First, the energy level of each octopus increases by 1.
Then, any octopus with an energy level greater than 9 flashes. This increases the energy level of all adjacent octopuses by 1, including octopuses that are diagonally adjacent. If this causes an octopus to have an energy level greater than 9, it also flashes. This process continues as long as new octopuses keep having their energy level increased beyond 9. (An octopus can only flash at most once per step.)
Finally, any octopus that flashed during this step has its energy level set to 0, as it used all of its energy to flash.
Adjacent flashes can cause an octopus to flash on a step even if it begins that step with very little energy. Consider the middle octopus with 1 energy in this situation:
Before any steps:
11111
19991
19191
19991
11111
After step 1:
34543
40004
50005
40004
34543
After step 2:
45654
51115
61116
51115
45654
An octopus is highlighted when it flashed during the given step.
Here is how the larger example above progresses:
Before any steps:
5483143223
2745854711
5264556173
6141336146
6357385478
4167524645
2176841721
6882881134
4846848554
5283751526
After step 1:
6594254334
3856965822
6375667284
7252447257
7468496589
5278635756
3287952832
7993992245
5957959665
6394862637
After step 2:
8807476555
5089087054
8597889608
8485769600
8700908800
6600088989
6800005943
0000007456
9000000876
8700006848
After step 3:
0050900866
8500800575
9900000039
9700000041
9935080063
7712300000
7911250009
2211130000
0421125000
0021119000
After step 4:
2263031977
0923031697
0032221150
0041111163
0076191174
0053411122
0042361120
5532241122
1532247211
1132230211
After step 5:
4484144000
2044144000
2253333493
1152333274
1187303285
1164633233
1153472231
6643352233
2643358322
2243341322
After step 6:
5595255111
3155255222
3364444605
2263444496
2298414396
2275744344
2264583342
7754463344
3754469433
3354452433
After step 7:
6707366222
4377366333
4475555827
3496655709
3500625609
3509955566
3486694453
8865585555
4865580644
4465574644
After step 8:
7818477333
5488477444
5697666949
4608766830
4734946730
4740097688
6900007564
0000009666
8000004755
6800007755
After step 9:
9060000644
7800000976
6900000080
5840000082
5858000093
6962400000
8021250009
2221130009
9111128097
7911119976
After step 10:
0481112976
0031112009
0041112504
0081111406
0099111306
0093511233
0442361130
5532252350
0532250600
0032240000
After step 10, there have been a total of 204 flashes. Fast forwarding, here is the same configuration every 10 steps:
After step 20:
3936556452
5686556806
4496555690
4448655580
4456865570
5680086577
7000009896
0000000344
6000000364
4600009543
After step 30:
0643334118
4253334611
3374333458
2225333337
2229333338
2276733333
2754574565
5544458511
9444447111
7944446119
After step 40:
6211111981
0421111119
0042111115
0003111115
0003111116
0065611111
0532351111
3322234597
2222222976
2222222762
After step 50:
9655556447
4865556805
4486555690
4458655580
4574865570
5700086566
6000009887
8000000533
6800000633
5680000538
After step 60:
2533334200
2743334640
2264333458
2225333337
2225333338
2287833333
3854573455
1854458611
1175447111
1115446111
After step 70:
8211111164
0421111166
0042111114
0004211115
0000211116
0065611111
0532351111
7322235117
5722223475
4572222754
After step 80:
1755555697
5965555609
4486555680
4458655580
4570865570
5700086566
7000008666
0000000990
0000000800
0000000000
After step 90:
7433333522
2643333522
2264333458
2226433337
2222433338
2287833333
2854573333
4854458333
3387779333
3333333333
After step 100:
0397666866
0749766918
0053976933
0004297822
0004229892
0053222877
0532222966
9322228966
7922286866
6789998766
After 100 steps, there have been a total of 1656 flashes.
Given the starting energy levels of the dumbo octopuses in your cavern, simulate 100 steps. How many total flashes are there after 100 steps?
*/
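The three step rules above (increment everything, cascade flashes, reset flashed octopuses to 0) can be sketched as one standalone function, separate from the Day11Grid implementation that follows. The plain int grid and the names below are illustrative only, not part of this package:

```go
package main

import "fmt"

// step advances the grid one step and returns the number of flashes,
// following the three rules: increment all, cascade flashes with an
// explicit worklist, then reset every flashed cell to 0.
func step(grid [][]int) int {
	h, w := len(grid), len(grid[0])
	var stack [][2]int
	flashed := make(map[[2]int]bool)
	// Rule 1: every octopus gains one energy.
	for y := 0; y < h; y++ {
		for x := 0; x < w; x++ {
			grid[y][x]++
			if grid[y][x] > 9 {
				stack = append(stack, [2]int{x, y})
				flashed[[2]int{x, y}] = true
			}
		}
	}
	// Rule 2: each flash bumps all eight neighbours; new flashes join the worklist.
	for len(stack) > 0 {
		p := stack[len(stack)-1]
		stack = stack[:len(stack)-1]
		for dy := -1; dy <= 1; dy++ {
			for dx := -1; dx <= 1; dx++ {
				nx, ny := p[0]+dx, p[1]+dy
				if (dx == 0 && dy == 0) || nx < 0 || nx >= w || ny < 0 || ny >= h {
					continue
				}
				grid[ny][nx]++
				key := [2]int{nx, ny}
				if grid[ny][nx] > 9 && !flashed[key] {
					flashed[key] = true
					stack = append(stack, key)
				}
			}
		}
	}
	// Rule 3: flashed octopuses used all their energy.
	for key := range flashed {
		grid[key[1]][key[0]] = 0
	}
	return len(flashed)
}

func main() {
	// The 5x5 example from the puzzle text: all eight 9s flash,
	// which pushes the middle 1 over the edge as well.
	grid := [][]int{
		{1, 1, 1, 1, 1},
		{1, 9, 9, 9, 1},
		{1, 9, 1, 9, 1},
		{1, 9, 9, 9, 1},
		{1, 1, 1, 1, 1},
	}
	fmt.Println(step(grid)) // 9
}
```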
type Day11Point struct {
X int
Y int
Value int
Grid *Day11Grid
Checked bool
}
func (p *Day11Point) Increment() bool {
p.Value += 1
return p.Value == 10
}
func (p *Day11Point) IsFlashing() bool {
return p.Value > 9
}
type Day11Grid struct {
Points [][]*Day11Point
Width int
Height int
StepCount int
Flashing []*Day11Point
}
func NewDay11Grid(data string) *Day11Grid {
lines := strings.Split(data, "\n")
y := 0
grid := &Day11Grid{}
for _, line := range lines {
line = strings.Trim(line, " ")
row := make([]*Day11Point, 0)
for x := 0; x < len(line); x++ {
value, _ := strconv.Atoi(line[x : x+1])
point := &Day11Point{X: x, Y: y, Grid: grid, Value: value}
row = append(row, point)
}
grid.Points = append(grid.Points, row)
y += 1
}
grid.Height = y
grid.Width = len(lines[0])
return grid
}
var Reset = "\033[0m"
var Red = "\033[31m"
var Green = "\033[32m"
var Yellow = "\033[33m"
var Blue = "\033[34m"
var Purple = "\033[35m"
var Cyan = "\033[36m"
var Gray = "\033[37m"
var White = "\033[97m"
func (g *Day11Grid) Debug() string {
line := ""
for y := 0; y < g.Height; y++ {
for x := 0; x < g.Width; x++ {
p := g.GetPoint(x, y)
if p.IsFlashing() {
// line = fmt.Sprintf("%v %v%v%v", line, White, 0, Reset)
line = fmt.Sprintf("%v %v", line, 0)
} else {
line = fmt.Sprintf("%v %v", line, p.Value)
}
}
line += "\n"
}
return line
}
func (g *Day11Grid) FormALine() string {
line := ""
for y := 0; y < g.Height; y++ {
for x := 0; x < g.Width; x++ {
p := g.GetPoint(x, y)
value := p.Value
if value > 9 {
value = 0
}
l := fmt.Sprintf("%v%v", line, value)
line = l
}
}
return line
}
func (g *Day11Grid) NonFlashingNeighbours(p *Day11Point) []*Day11Point {
neighbours := make([]*Day11Point, 0)
for nx := p.X - 1; nx <= p.X+1; nx++ {
for ny := p.Y - 1; ny <= p.Y+1; ny++ {
np := g.GetPoint(nx, ny)
if np == nil {
continue
}
if np == p {
continue
}
if np.IsFlashing() {
continue
}
neighbours = append(neighbours, np)
}
}
return neighbours
}
func (g *Day11Grid) Reset() {
for _, p := range g.Flashing {
p.Value = 0
}
for x := 0; x < g.Width; x++ {
for y := 0; y < g.Height; y++ {
p := g.GetPoint(x, y)
p.Checked = false
if p.Value > 9 {
p.Value = 0
}
}
}
g.Flashing = make([]*Day11Point, 0)
}
func (g *Day11Grid) GetPoint(x int, y int) *Day11Point {
if x < 0 || x >= g.Width {
return nil
}
if y < 0 || y >= g.Height {
return nil
}
return g.Points[y][x]
}
// Step should:
// 1. increment all octopuses
// 2. walk the cascade of flashes
// 3. return the number of flashes
func (g *Day11Grid) Step(DEBUG bool) int {
g.StepCount += 1
// first, mark all as not flashing
g.Reset()
// then, increment everything
for x := 0; x < g.Width; x++ {
for y := 0; y < g.Height; y++ {
p := g.GetPoint(x, y)
p.Increment()
}
}
// now everything is incremented, we walk each entry and see if it is flashing; if it is,
// we increment all neighbours if we can and see if they flash
flashes := 0
depth := 0
for y := 0; y < g.Height; y++ {
for x := 0; x < g.Width; x++ {
p := g.GetPoint(x, y)
if !p.Checked && p.IsFlashing() {
flashes += g.FlashWalk(g.StepCount, depth+1, p, p)
}
p.Checked = true
}
}
if DEBUG {
fmt.Printf("Step[%v] (post flashwalk) total flashes=%v\n%v\n", g.StepCount, flashes, g.Debug())
}
return flashes
}
func (g *Day11Grid) FlashWalk(step int, depth int, p *Day11Point, originPoint *Day11Point) int {
if !p.IsFlashing() {
return 0
}
p.Checked = true
count := 1
neighbours := g.NonFlashingNeighbours(p)
for _, np := range neighbours {
if np == p || np == originPoint {
continue
}
causedFlash := np.Increment()
if causedFlash {
count += g.FlashWalk(step, depth+1, np, originPoint)
}
}
return count
}
// rename this to the year and day in question
func (app *Application) Y2021D11P1() {
DEBUG := true
data := DAY_2021_11_TEST_DATA_1
grid := NewDay11Grid(data)
fmt.Printf("[0]\n%v\n", grid.Debug())
grid.Step(DEBUG)
fmt.Printf("[1]\n%v\n", grid.Debug())
grid.Step(DEBUG)
fmt.Printf("[2]\n%v\n", grid.Debug())
data1 := DAY_2021_11_TEST_DATA
grid1 := NewDay11Grid(data1)
fmt.Printf("[0]\n%v\n", grid1.Debug())
total1 := 0
for x := 1; x <= 100; x++ {
total1 += grid1.Step(DEBUG)
// fmt.Printf("[%v]\n%v\n", x, grid1.Debug())
}
fmt.Printf("Total of %v flashes.\n", total1)
data2 := DAY_2021_11_DATA
grid2 := NewDay11Grid(data2)
fmt.Printf("[0]\n%v\n", grid2.Debug())
total2 := 0
for x := 1; x <= 100; x++ {
total2 += grid2.Step(DEBUG)
}
fmt.Printf("Total of %v flashes.\n", total2)
}
/*
--- Part Two ---
It seems like the individual flashes aren't bright enough to navigate. However, you might have a better option: the flashes seem to be synchronizing!
In the example above, the first time all octopuses flash simultaneously is step 195:
*/
func (app *Application) Y2021D11P2() {
data1 := DAY_2021_11_TEST_DATA
grid1 := NewDay11Grid(data1)
fmt.Printf("[0]\n%v\n", grid1.Debug())
for x := 1; x <= 192; x++ {
grid1.Step(false)
}
grid1.Step(true)
grid1.Step(true)
grid1.Step(true)
data2 := DAY_2021_11_DATA
grid2 := NewDay11Grid(data2)
for {
flashes := grid2.Step(false)
if flashes == 100 {
fmt.Printf("100 flashes at step %v.\n", grid2.StepCount)
break
}
}
}
type Day11Response struct {
Lines []string
Index int
}
func (app *Application) Y2021D11P2Api() *Day11Response {
data2 := DAY_2021_11_DATA
grid2 := NewDay11Grid(data2)
lines := make([]string, 0)
index := 0
for {
flashes := grid2.Step(false)
line := grid2.FormALine()
lines = append(lines, line)
index++
if flashes == 100 {
break
}
}
for i := 0; i < 10; i++ {
grid2.Step(false)
line := grid2.FormALine()
lines = append(lines, line)
}
return &Day11Response{Lines: lines, Index: index}
}
// rename and uncomment this to the year and day in question once complete for a gold star!
// func (app *Application) Y20XXDXXP1Render() {
// }
// rename and uncomment this to the year and day in question once complete for a gold star!
// func (app *Application) Y20XXDXXP2Render() {
// }
// this is what we will reflect and call - so both parts will run. It's up to you to make it print nicely etc.
// The app reference has a CLI for logging.
// func (app *Application) Y2021D11() {
// app.Y2021D11P1()
// app.Y2021D11P2()
// }
package pixelpusher
import (
"encoding/binary"
"image/color"
"sync"
)
// pointInTriangle tries to decide if the given x and y are within the triangle defined by p0, p1 and p2
// areaMod is 1.0 divided by (the area of the triangle, times 2)
func pointInTriangle(x, y int32, p0, p1, p2 *Pos, areaMod float32) bool {
s := areaMod * float32(p0.y*p2.x-p0.x*p2.y+(p2.y-p0.y)*x+(p0.x-p2.x)*y)
t := areaMod * float32(p0.x*p1.y-p0.y*p1.x+(p0.y-p1.y)*x+(p1.x-p0.x)*y)
return s > 0 && t > 0 && (1-s-t) > 0
}
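The barycentric sign test used by pointInTriangle can be checked in isolation. This standalone sketch inlines the same area formula and sign test with a simplified point type; the names here (`pt`, `inTriangle`) are illustrative, not the package's own:

```go
package main

import "fmt"

type pt struct{ x, y int32 }

// inTriangle reports whether (x, y) lies strictly inside the triangle
// p0-p1-p2, using the same barycentric coordinates (s, t): the point is
// inside when s > 0, t > 0 and s + t < 1.
func inTriangle(x, y int32, p0, p1, p2 pt) bool {
	area := 0.5 * float32(-p1.y*p2.x+p0.y*(-p1.x+p2.x)+p0.x*(p1.y-p2.y)+p1.x*p2.y)
	m := 1.0 / (area * 2.0) // precomputed once per triangle in the real code
	s := m * float32(p0.y*p2.x-p0.x*p2.y+(p2.y-p0.y)*x+(p0.x-p2.x)*y)
	t := m * float32(p0.x*p1.y-p0.y*p1.x+(p0.y-p1.y)*x+(p1.x-p0.x)*y)
	return s > 0 && t > 0 && (1-s-t) > 0
}

func main() {
	p0, p1, p2 := pt{0, 0}, pt{10, 0}, pt{0, 10}
	fmt.Println(inTriangle(2, 2, p0, p1, p2)) // true: inside
	fmt.Println(inTriangle(9, 9, p0, p1, p2)) // false: past the hypotenuse
}
```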
// area returns the area of a triangle
func area(p0, p1, p2 *Pos) float32 {
	return 0.5 * float32(-p1.y*p2.x+p0.y*(-p1.x+p2.x)+p0.x*(p1.y-p2.y)+p1.x*p2.y)
}
}
// drawPartialTriangle draws a part of a triangle
// areaMod is 1.0 divided by (the area of the triangle, times 2)
func drawPartialTriangle(wg *sync.WaitGroup, pixels []uint32, p1, p2, p3 *Pos, minX, maxX, minY, maxY int32, areaMod float32, colorValue uint32, pitch int32) {
defer wg.Done()
for y := minY; y < maxY; y++ {
offset := y * pitch
for x := minX; x < maxX; x++ {
if pointInTriangle(x, y, p1, p2, p3, areaMod) {
pixels[offset+x] = colorValue
}
}
}
}
// Triangle draws a triangle, concurrently.
// cores is the number of goroutines that will be used.
// pitch is the "width" of the pixel buffer.
func Triangle(cores int, pixels []uint32, x1, y1, x2, y2, x3, y3 int32, c color.RGBA, pitch int32) {
var wg sync.WaitGroup
p1 := &Pos{x1, y1}
p2 := &Pos{x2, y2}
p3 := &Pos{x3, y3}
minY, maxY := MinMax3(y1, y2, y3)
minX, maxX := MinMax3(x1, x2, x3)
// Triangle area, with modifications, for performance
areaMod := 1.0 / (area(p1, p2, p3) * 2.0)
colorValue := binary.BigEndian.Uint32([]uint8{c.A, c.R, c.G, c.B})
ylength := (maxY - minY)
if ylength == 0 {
// This is not a triangle, but a horizontal line, since ylength == 0
HorizontalLineFast(pixels, minY, minX, maxX, c, pitch)
return
}
// cores-1 because of the final part after the for loop
ystep := ylength / int32(cores-1)
if ystep == 0 {
return
}
var minYCore int32
var maxYCore int32
for y := minY; y < (maxY - ystep); y += ystep {
minYCore = y
maxYCore = y + ystep
wg.Add(1)
// TODO: Create a slice of pixels by looking at the min and max values, then pass only that slice on!
go drawPartialTriangle(&wg, pixels, p1, p2, p3, minX, maxX, minYCore, maxYCore, areaMod, colorValue, pitch)
}
// Draw the final part, if there are a few pixels missing at the end
if maxYCore < maxY {
wg.Add(1)
go drawPartialTriangle(&wg, pixels, p1, p2, p3, minX, maxX, maxYCore, maxY, areaMod, colorValue, pitch)
}
wg.Wait()
}
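The row-banding that Triangle uses — splitting the y-range into one band per goroutine and joining on a sync.WaitGroup — can be shown on its own. This sketch fills a flat pixel buffer in parallel bands; the function and names are illustrative, not part of this package:

```go
package main

import (
	"fmt"
	"sync"
)

// fillBands splits rows [0, height) into bands of roughly height/cores rows
// and fills each band's pixels with value in its own goroutine. Bands never
// overlap, so the goroutines write to disjoint slices of the buffer.
func fillBands(cores int, pixels []uint32, width, height int, value uint32) {
	var wg sync.WaitGroup
	step := height / cores
	if step == 0 {
		step = 1
	}
	for y := 0; y < height; y += step {
		minY, maxY := y, y+step
		if maxY > height {
			maxY = height // clamp the final band
		}
		wg.Add(1)
		go func(minY, maxY int) {
			defer wg.Done()
			for row := minY; row < maxY; row++ {
				for x := 0; x < width; x++ {
					pixels[row*width+x] = value
				}
			}
		}(minY, maxY)
	}
	wg.Wait()
}

func main() {
	const w, h = 8, 8
	pixels := make([]uint32, w*h)
	fillBands(4, pixels, w, h, 0xff00ff00)
	fmt.Println(pixels[0], pixels[w*h-1]) // every pixel is filled
}
```

Clamping the final band (rather than spawning an extra goroutine for the leftover rows) avoids the gap Triangle guards against with its `maxYCore < maxY` check.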
// WireTriangle draws a wireframe triangle, concurrently.
// Core is the number of goroutines that will be used.
// pitch is the "width" of the pixel buffer.
func WireTriangle(cores int, pixels []uint32, x1, y1, x2, y2, x3, y3 int32, c color.RGBA, pitch int32) {
if cores >= 3 {
var wg sync.WaitGroup
wg.Add(3)
go func(wg *sync.WaitGroup) {
defer wg.Done()
Line(pixels, x1, y1, x2, y2, c, pitch)
}(&wg)
go func(wg *sync.WaitGroup) {
defer wg.Done()
Line(pixels, x1, y1, x3, y3, c, pitch)
}(&wg)
go func(wg *sync.WaitGroup) {
defer wg.Done()
Line(pixels, x2, y2, x3, y3, c, pitch)
}(&wg)
wg.Wait()
} else {
Line(pixels, x1, y1, x2, y2, c, pitch)
Line(pixels, x1, y1, x3, y3, c, pitch)
Line(pixels, x2, y2, x3, y3, c, pitch)
}
}
package cmd
import (
"os"
"strings"
"time"
"github.com/ryanberckmans/est/core"
"github.com/spf13/cobra"
)
var scheduleCmd = &cobra.Command{
Use: "schedule",
Short: "Display a predicted, probabilistic schedule for unstarted, estimated tasks",
Long: `Display a predicted, probabilistic schedule for unstarted, estimated tasks.
The schedule predicts the date on which all tasks will be delivered.
The prediction is based on a monte carlo simulation of how long future tasks
will actually take, based on personalized accuracy of historical task estimates.
Personalized historical task estimates are partially faked if fewer than twenty
tasks are done.
The prediction is most accurate when most of your tasks' estimated and actual
hours are under 16 hours; your history of done tasks numbers in the dozens;
and you track the majority of your working time in est.
The prediction is most useful if your expected future tasks are estimated in
est, because those are the tasks predicted by 'est schedule'. If many of your
future tasks are not estimated in est, or the tasks estimated in est are never
actually worked on, then the usefulness of 'est schedule' will be reduced.
In future, 'est schedule' should allow selection of tasks to estimate, e.g.
all tasks related to one project, and merging of estfiles for team scheduling.
`,
Run: func(cmd *cobra.Command, args []string) {
runSchedule()
},
}
var scheduleDisplayDatesOnly bool
func runSchedule() {
core.WithEstConfigAndFile(func(ec *core.EstConfig, ef *core.EstFile) {
ts := ef.Tasks.IsNotDeleted().IsEstimated().IsNotStarted().IsNotDone()
now := time.Now()
os.Stdout.WriteString("Predicting delivery schedule for unstarted, estimated tasks...")
rs := core.PadFakeHistoricalEstimateAccuracyRatios(
ef.HistoricalEstimateAccuracyRatios().Ratios(), ef.FakeHistoricalEstimateAccuracyRatios)
dates := core.DeliverySchedule(globalWorkTimes, now, rs, ts)
ss := core.RenderDeliverySchedule(dates)
os.Stdout.WriteString("done\n")
if scheduleDisplayDatesOnly {
s := strings.Join(ss[:], "\n") + "\n"
os.Stdout.WriteString(s)
} else {
// Convert dates into wall clock days in future, because termui supports only float data
dd := make([]float64, len(dates))
for i := range dates {
dd[i] = dates[i].Sub(now).Hours() / 24
}
core.PredictedDeliveryDateChart(dd, ts, ss[:])
}
}, func() {
// failed to load estconfig or estfile. Err printed elsewhere.
os.Exit(1)
})
}
func init() {
scheduleCmd.PersistentFlags().BoolVarP(&scheduleDisplayDatesOnly, "dates-only", "d", false, "display dates only, no chart, non-interactively")
rootCmd.AddCommand(scheduleCmd)
}
package tree
// Visitor visits every node in the tree. The visitor-pattern is used to traverse
// the grammar-tree. Every tree-node has an 'Accept' method that lets the visitor
// visit itself and all of its children. Because of golang's type-system, every
// implementation of this interface would normally have to implement every one of
// its many methods. To avoid that, one can create a default-visitor
// (DelegatingVisitor below), which only has noop-methods, and then set just the
// visit-methods that should be custom.
type Visitor interface {
VisitParameter(*Parameter)
VisitIdentifier(*Identifier)
VisitLetBinding(*LetBinding)
VisitCallArgument(*CallArgument)
VisitListTypeName(*ListTypeName)
VisitTestStatement(*TestStatement)
VisitStringLiteral(*StringLiteral)
VisitNumberLiteral(*NumberLiteral)
VisitListExpression(*ListExpression)
VisitCallExpression(*CallExpression)
VisitEmptyStatement(*EmptyStatement)
VisitYieldStatement(*YieldStatement)
VisitBreakStatement(*BreakStatement)
VisitBlockStatement(*StatementBlock)
VisitWildcardNode(*WildcardNode)
VisitAssertStatement(*AssertStatement)
VisitUnaryExpression(*UnaryExpression)
VisitImportStatement(*ImportStatement)
VisitAssignStatement(*AssignStatement)
VisitReturnStatement(*ReturnStatement)
VisitTranslationUnit(*TranslationUnit)
VisitCreateExpression(*CreateExpression)
VisitInvalidStatement(*InvalidStatement)
VisitFieldDeclaration(*FieldDeclaration)
VisitGenericTypeName(*GenericTypeName)
VisitOptionalTypeName(*OptionalTypeName)
VisitConcreteTypeName(*ConcreteTypeName)
VisitClassDeclaration(*ClassDeclaration)
VisitBinaryExpression(*BinaryExpression)
VisitMethodDeclaration(*MethodDeclaration)
VisitPostfixExpression(*PostfixExpression)
VisitImplementStatement(*ImplementStatement)
VisitRangedLoopStatement(*RangedLoopStatement)
VisitExpressionStatement(*ExpressionStatement)
VisitForEachLoopStatement(*ForEachLoopStatement)
VisitConditionalStatement(*ConditionalStatement)
VisitListSelectExpression(*ListSelectExpression)
VisitFieldSelectExpression(*ChainExpression)
VisitConstructorDeclaration(*ConstructorDeclaration)
}
type DelegatingVisitor struct {
ParameterVisitor func(*Parameter)
IdentifierVisitor func(*Identifier)
LetBindingVisitor func(*LetBinding)
CallArgumentVisitor func(*CallArgument)
ListTypeNameVisitor func(*ListTypeName)
TestStatementVisitor func(*TestStatement)
StringLiteralVisitor func(*StringLiteral)
NumberLiteralVisitor func(*NumberLiteral)
ListExpressionVisitor func(*ListExpression)
CallExpressionVisitor func(*CallExpression)
EmptyStatementVisitor func(*EmptyStatement)
WildcardNodeVisitor func(*WildcardNode)
BreakStatementVisitor func(*BreakStatement)
YieldStatementVisitor func(*YieldStatement)
BlockStatementVisitor func(*StatementBlock)
AssertStatementVisitor func(*AssertStatement)
UnaryExpressionVisitor func(*UnaryExpression)
ImportStatementVisitor func(*ImportStatement)
AssignStatementVisitor func(*AssignStatement)
ReturnStatementVisitor func(*ReturnStatement)
TranslationUnitVisitor func(*TranslationUnit)
CreateExpressionVisitor func(*CreateExpression)
InvalidStatementVisitor func(*InvalidStatement)
FieldDeclarationVisitor func(*FieldDeclaration)
PostfixExpressionVisitor func(*PostfixExpression)
ImplementStatementVisitor func(*ImplementStatement)
GenericTypeNameVisitor func(*GenericTypeName)
OptionalTypeNameVisitor func(*OptionalTypeName)
ConcreteTypeNameVisitor func(*ConcreteTypeName)
ClassDeclarationVisitor func(*ClassDeclaration)
BinaryExpressionVisitor func(*BinaryExpression)
MethodDeclarationVisitor func(*MethodDeclaration)
RangedLoopStatementVisitor func(*RangedLoopStatement)
ExpressionStatementVisitor func(*ExpressionStatement)
ForEachLoopStatementVisitor func(*ForEachLoopStatement)
ConditionalStatementVisitor func(*ConditionalStatement)
ListSelectExpressionVisitor func(*ListSelectExpression)
FieldSelectExpressionVisitor func(*ChainExpression)
ConstructorDeclarationVisitor func(*ConstructorDeclaration)
}
func NewEmptyVisitor() *DelegatingVisitor {
return &DelegatingVisitor{
ParameterVisitor: func(*Parameter) {},
IdentifierVisitor: func(*Identifier) {},
LetBindingVisitor: func(*LetBinding) {},
CallArgumentVisitor: func(*CallArgument) {},
ListTypeNameVisitor: func(*ListTypeName) {},
TestStatementVisitor: func(*TestStatement) {},
StringLiteralVisitor: func(*StringLiteral) {},
NumberLiteralVisitor: func(*NumberLiteral) {},
CallExpressionVisitor: func(*CallExpression) {},
ListExpressionVisitor: func(*ListExpression) {},
EmptyStatementVisitor: func(*EmptyStatement) {},
YieldStatementVisitor: func(*YieldStatement) {},
WildcardNodeVisitor: func(*WildcardNode) {},
BreakStatementVisitor: func(*BreakStatement) {},
BlockStatementVisitor: func(*StatementBlock) {},
AssertStatementVisitor: func(*AssertStatement) {},
UnaryExpressionVisitor: func(*UnaryExpression) {},
ImportStatementVisitor: func(*ImportStatement) {},
AssignStatementVisitor: func(*AssignStatement) {},
ReturnStatementVisitor: func(*ReturnStatement) {},
TranslationUnitVisitor: func(*TranslationUnit) {},
CreateExpressionVisitor: func(*CreateExpression) {},
InvalidStatementVisitor: func(*InvalidStatement) {},
FieldDeclarationVisitor: func(*FieldDeclaration) {},
PostfixExpressionVisitor: func(*PostfixExpression) {},
GenericTypeNameVisitor: func(*GenericTypeName) {},
ConcreteTypeNameVisitor: func(*ConcreteTypeName) {},
ClassDeclarationVisitor: func(*ClassDeclaration) {},
BinaryExpressionVisitor: func(*BinaryExpression) {},
MethodDeclarationVisitor: func(*MethodDeclaration) {},
RangedLoopStatementVisitor: func(*RangedLoopStatement) {},
ExpressionStatementVisitor: func(*ExpressionStatement) {},
ForEachLoopStatementVisitor: func(*ForEachLoopStatement) {},
ConditionalStatementVisitor: func(*ConditionalStatement) {},
ListSelectExpressionVisitor: func(*ListSelectExpression) {},
FieldSelectExpressionVisitor: func(*ChainExpression) {},
ImplementStatementVisitor: func(*ImplementStatement) {},
OptionalTypeNameVisitor: func(*OptionalTypeName) {},
ConstructorDeclarationVisitor: func(*ConstructorDeclaration) {},
}
}
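The noop-default-plus-override idea behind NewEmptyVisitor can be demonstrated in miniature, independent of this package's node types. All names below (`miniVisitor`, `leaf`, `branch`) are illustrative:

```go
package main

import "fmt"

// A tiny two-node AST with a delegating visitor: the defaults are noops,
// and callers override only the hooks they care about.
type miniVisitor struct {
	visitLeaf   func(*leaf)
	visitBranch func(*branch)
}

func newNoopVisitor() *miniVisitor {
	return &miniVisitor{
		visitLeaf:   func(*leaf) {},
		visitBranch: func(*branch) {},
	}
}

type node interface{ accept(v *miniVisitor) }

type leaf struct{ value int }

type branch struct{ left, right node }

func (l *leaf) accept(v *miniVisitor) { v.visitLeaf(l) }

// accept visits children first, then the branch itself,
// mirroring a recursive Accept on a real tree.
func (b *branch) accept(v *miniVisitor) {
	b.left.accept(v)
	b.right.accept(v)
	v.visitBranch(b)
}

func main() {
	root := &branch{left: &leaf{1}, right: &branch{left: &leaf{2}, right: &leaf{3}}}
	sum := 0
	v := newNoopVisitor()
	v.visitLeaf = func(l *leaf) { sum += l.value } // override only the leaf hook
	root.accept(v)
	fmt.Println(sum) // 6
}
```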
func (visitor *DelegatingVisitor) VisitParameter(node *Parameter) {
visitor.ParameterVisitor(node)
}
func (visitor *DelegatingVisitor) VisitIdentifier(node *Identifier) {
visitor.IdentifierVisitor(node)
}
func (visitor *DelegatingVisitor) VisitCallArgument(node *CallArgument) {
visitor.CallArgumentVisitor(node)
}
func (visitor *DelegatingVisitor) VisitListTypeName(node *ListTypeName) {
visitor.ListTypeNameVisitor(node)
}
func (visitor *DelegatingVisitor) VisitTestStatement(node *TestStatement) {
visitor.TestStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitStringLiteral(node *StringLiteral) {
visitor.StringLiteralVisitor(node)
}
func (visitor *DelegatingVisitor) VisitNumberLiteral(node *NumberLiteral) {
visitor.NumberLiteralVisitor(node)
}
func (visitor *DelegatingVisitor) VisitCallExpression(node *CallExpression) {
visitor.CallExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitEmptyStatement(node *EmptyStatement) {
visitor.EmptyStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitYieldStatement(node *YieldStatement) {
visitor.YieldStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitBlockStatement(node *StatementBlock) {
visitor.BlockStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitAssertStatement(node *AssertStatement) {
visitor.AssertStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitUnaryExpression(node *UnaryExpression) {
visitor.UnaryExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitImportStatement(node *ImportStatement) {
visitor.ImportStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitAssignStatement(node *AssignStatement) {
visitor.AssignStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitReturnStatement(node *ReturnStatement) {
visitor.ReturnStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitTranslationUnit(node *TranslationUnit) {
visitor.TranslationUnitVisitor(node)
}
func (visitor *DelegatingVisitor) VisitCreateExpression(node *CreateExpression) {
visitor.CreateExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitInvalidStatement(node *InvalidStatement) {
visitor.InvalidStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitFieldDeclaration(node *FieldDeclaration) {
visitor.FieldDeclarationVisitor(node)
}
func (visitor *DelegatingVisitor) VisitGenericTypeName(node *GenericTypeName) {
visitor.GenericTypeNameVisitor(node)
}
func (visitor *DelegatingVisitor) VisitListSelectExpression(node *ListSelectExpression) {
visitor.ListSelectExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitConcreteTypeName(node *ConcreteTypeName) {
visitor.ConcreteTypeNameVisitor(node)
}
func (visitor *DelegatingVisitor) VisitClassDeclaration(node *ClassDeclaration) {
visitor.ClassDeclarationVisitor(node)
}
func (visitor *DelegatingVisitor) VisitBinaryExpression(node *BinaryExpression) {
visitor.BinaryExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitMethodDeclaration(node *MethodDeclaration) {
visitor.MethodDeclarationVisitor(node)
}
func (visitor *DelegatingVisitor) VisitRangedLoopStatement(node *RangedLoopStatement) {
visitor.RangedLoopStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitExpressionStatement(node *ExpressionStatement) {
visitor.ExpressionStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitForEachLoopStatement(node *ForEachLoopStatement) {
visitor.ForEachLoopStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitConditionalStatement(node *ConditionalStatement) {
visitor.ConditionalStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitFieldSelectExpression(node *ChainExpression) {
visitor.FieldSelectExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitWildcardNode(node *WildcardNode) {
visitor.WildcardNodeVisitor(node)
}
func (visitor *DelegatingVisitor) VisitConstructorDeclaration(node *ConstructorDeclaration) {
visitor.ConstructorDeclarationVisitor(node)
}
func (visitor *DelegatingVisitor) VisitPostfixExpression(node *PostfixExpression) {
visitor.PostfixExpressionVisitor(node)
}
func (visitor *DelegatingVisitor) VisitBreakStatement(node *BreakStatement) {
visitor.BreakStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitOptionalTypeName(node *OptionalTypeName) {
visitor.OptionalTypeNameVisitor(node)
}
func (visitor *DelegatingVisitor) VisitLetBinding(node *LetBinding) {
visitor.LetBindingVisitor(node)
}
func (visitor *DelegatingVisitor) VisitImplementStatement(node *ImplementStatement) {
visitor.ImplementStatementVisitor(node)
}
func (visitor *DelegatingVisitor) VisitListExpression(expression *ListExpression) {
visitor.ListExpressionVisitor(expression)
}
type nodeReporter interface {
reportNodeEncounter(kind NodeKind)
}
func NewReportingVisitor(reporter nodeReporter) Visitor {
return &DelegatingVisitor{
ParameterVisitor: func(*Parameter) {
reporter.reportNodeEncounter(ParameterNodeKind)
},
IdentifierVisitor: func(*Identifier) {
reporter.reportNodeEncounter(IdentifierNodeKind)
},
CallArgumentVisitor: func(*CallArgument) {
reporter.reportNodeEncounter(CallArgumentNodeKind)
},
ListTypeNameVisitor: func(*ListTypeName) {
reporter.reportNodeEncounter(ListTypeNameNodeKind)
},
TestStatementVisitor: func(*TestStatement) {
reporter.reportNodeEncounter(TestStatementNodeKind)
},
StringLiteralVisitor: func(*StringLiteral) {
reporter.reportNodeEncounter(StringLiteralNodeKind)
},
NumberLiteralVisitor: func(*NumberLiteral) {
reporter.reportNodeEncounter(NumberLiteralNodeKind)
},
CallExpressionVisitor: func(*CallExpression) {
reporter.reportNodeEncounter(CallExpressionNodeKind)
},
BreakStatementVisitor: func(*BreakStatement) {
reporter.reportNodeEncounter(BreakStatementNodeKind)
},
EmptyStatementVisitor: func(*EmptyStatement) {
reporter.reportNodeEncounter(EmptyStatementNodeKind)
},
YieldStatementVisitor: func(*YieldStatement) {
reporter.reportNodeEncounter(YieldStatementNodeKind)
},
BlockStatementVisitor: func(*StatementBlock) {
reporter.reportNodeEncounter(StatementBlockNodeKind)
},
AssertStatementVisitor: func(*AssertStatement) {
reporter.reportNodeEncounter(AssertStatementNodeKind)
},
UnaryExpressionVisitor: func(*UnaryExpression) {
reporter.reportNodeEncounter(UnaryExpressionNodeKind)
},
ImportStatementVisitor: func(*ImportStatement) {
reporter.reportNodeEncounter(ImportStatementNodeKind)
},
AssignStatementVisitor: func(*AssignStatement) {
reporter.reportNodeEncounter(AssignStatementNodeKind)
},
ReturnStatementVisitor: func(*ReturnStatement) {
reporter.reportNodeEncounter(ReturnStatementNodeKind)
},
TranslationUnitVisitor: func(*TranslationUnit) {
reporter.reportNodeEncounter(TranslationUnitNodeKind)
},
CreateExpressionVisitor: func(*CreateExpression) {
reporter.reportNodeEncounter(CreateExpressionNodeKind)
},
InvalidStatementVisitor: func(*InvalidStatement) {
reporter.reportNodeEncounter(InvalidStatementNodeKind)
},
FieldDeclarationVisitor: func(*FieldDeclaration) {
reporter.reportNodeEncounter(FieldDeclarationNodeKind)
},
GenericTypeNameVisitor: func(*GenericTypeName) {
reporter.reportNodeEncounter(GenericTypeNameNodeKind)
},
ConcreteTypeNameVisitor: func(*ConcreteTypeName) {
reporter.reportNodeEncounter(ConcreteTypeNameNodeKind)
},
ClassDeclarationVisitor: func(*ClassDeclaration) {
reporter.reportNodeEncounter(ClassDeclarationNodeKind)
},
BinaryExpressionVisitor: func(*BinaryExpression) {
reporter.reportNodeEncounter(BinaryExpressionNodeKind)
},
MethodDeclarationVisitor: func(*MethodDeclaration) {
reporter.reportNodeEncounter(MethodDeclarationNodeKind)
},
RangedLoopStatementVisitor: func(*RangedLoopStatement) {
reporter.reportNodeEncounter(RangedLoopStatementNodeKind)
},
ExpressionStatementVisitor: func(*ExpressionStatement) {
reporter.reportNodeEncounter(ExpressionStatementNodeKind)
},
ForEachLoopStatementVisitor: func(*ForEachLoopStatement) {
reporter.reportNodeEncounter(ForEachLoopStatementNodeKind)
},
ConditionalStatementVisitor: func(*ConditionalStatement) {
reporter.reportNodeEncounter(ConditionalStatementNodeKind)
},
ListSelectExpressionVisitor: func(*ListSelectExpression) {
reporter.reportNodeEncounter(ListSelectExpressionNodeKind)
},
ConstructorDeclarationVisitor: func(*ConstructorDeclaration) {
reporter.reportNodeEncounter(ConstructorDeclarationNodeKind)
},
FieldSelectExpressionVisitor: func(*ChainExpression) {
reporter.reportNodeEncounter(ChainExpressionNodeKind)
},
PostfixExpressionVisitor: func(*PostfixExpression) {
reporter.reportNodeEncounter(PostfixExpressionNodeKind)
},
WildcardNodeVisitor: func(*WildcardNode) {
reporter.reportNodeEncounter(WildcardNodeKind)
},
LetBindingVisitor: func(*LetBinding) {
reporter.reportNodeEncounter(LetBindingNodeKind)
},
OptionalTypeNameVisitor: func(*OptionalTypeName) {
reporter.reportNodeEncounter(OptionalTypeNameNodeKind)
},
ImplementStatementVisitor: func(*ImplementStatement) {
reporter.reportNodeEncounter(ImplementStatementNodeKind)
},
ListExpressionVisitor: func(*ListExpression) {
reporter.reportNodeEncounter(ListExpressionNodeKind)
},
}
}
type Counter struct {
nodes map[NodeKind]int
visitor Visitor
}
func NewCounter() *Counter {
counter := &Counter{nodes: map[NodeKind]int{}}
counter.visitor = NewReportingVisitor(counter)
return counter
}
func (counter *Counter) reportNodeEncounter(kind NodeKind) {
counter.nodes[kind] = counter.nodes[kind] + 1
}
func (counter *Counter) Count(node Node) {
node.AcceptRecursive(counter.visitor)
}
func (counter *Counter) ValueFor(kind NodeKind) int {
return counter.nodes[kind]
}
type SingleFunctionVisitor struct {
visit func(Node)
}
func VisitWith(visitor func(Node)) *SingleFunctionVisitor {
return &SingleFunctionVisitor{visit: visitor}
}
func (visitor *SingleFunctionVisitor) VisitParameter(node *Parameter) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitIdentifier(node *Identifier) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitLetBinding(node *LetBinding) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitCallArgument(node *CallArgument) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitListTypeName(node *ListTypeName) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitTestStatement(node *TestStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitStringLiteral(node *StringLiteral) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitNumberLiteral(node *NumberLiteral) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitCallExpression(node *CallExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitEmptyStatement(node *EmptyStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitYieldStatement(node *YieldStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitBreakStatement(node *BreakStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitBlockStatement(node *StatementBlock) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitWildcardNode(node *WildcardNode) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitAssertStatement(node *AssertStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitUnaryExpression(node *UnaryExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitImportStatement(node *ImportStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitAssignStatement(node *AssignStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitReturnStatement(node *ReturnStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitTranslationUnit(node *TranslationUnit) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitCreateExpression(node *CreateExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitInvalidStatement(node *InvalidStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitFieldDeclaration(node *FieldDeclaration) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitGenericTypeName(node *GenericTypeName) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitOptionalTypeName(node *OptionalTypeName) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitConcreteTypeName(node *ConcreteTypeName) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitClassDeclaration(node *ClassDeclaration) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitBinaryExpression(node *BinaryExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitMethodDeclaration(node *MethodDeclaration) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitPostfixExpression(node *PostfixExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitImplementStatement(node *ImplementStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitRangedLoopStatement(node *RangedLoopStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitExpressionStatement(node *ExpressionStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitForEachLoopStatement(node *ForEachLoopStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitConditionalStatement(node *ConditionalStatement) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitListSelectExpression(node *ListSelectExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitFieldSelectExpression(node *ChainExpression) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitConstructorDeclaration(node *ConstructorDeclaration) {
visitor.visit(node)
}
func (visitor *SingleFunctionVisitor) VisitListExpression(node *ListExpression) {
visitor.visit(node)
} | pkg/compiler/grammar/tree/visitor.go | 0.658966 | 0.574037 | visitor.go | starcoder |
package model2d
import (
"fmt"
"math"
"github.com/heustis/tsp-solver-go/model"
)
// Edge2D represents the line segment between two points.
type Edge2D struct {
Start *Vertex2D `json:"start"`
End *Vertex2D `json:"end"`
vector *Vertex2D
length float64
}
// DistanceIncrease returns the difference in length between the edge
// and the two edges formed by inserting the vertex between the edge's start and end.
// For example, if start->end has a length of 5, start->vertex has a length of 3,
// and vertex->end has a length of 6, this will return 4 (i.e. 6 + 3 - 5)
func (e *Edge2D) DistanceIncrease(vertex model.CircuitVertex) float64 {
return e.Start.DistanceTo(vertex) + e.End.DistanceTo(vertex) - e.length
}
func (e *Edge2D) Equals(other interface{}) bool {
	// Compare pointers first, for performance, but then check start and end points, in case the same edge is created multiple times.
	if e == other {
		return true
	} else if otherEdge, okay := other.(*Edge2D); okay {
		return e.Start.Equals(otherEdge.Start) && e.End.Equals(otherEdge.End)
	} else {
		return false
	}
}
// GetStart returns the start vertex of the edge.
func (e *Edge2D) GetStart() model.CircuitVertex {
return e.Start
}
// GetEnd returns the end vertex of the edge.
func (e *Edge2D) GetEnd() model.CircuitVertex {
return e.End
}
// GetLength returns the length of the edge.
func (e *Edge2D) GetLength() float64 {
return e.length
}
// GetVector returns the normalized (length=1.0) vector from the edge's start to the edge's end.
func (e *Edge2D) GetVector() *Vertex2D {
if e.vector == nil {
e.vector = NewVertex2D((e.End.X-e.Start.X)/e.length, (e.End.Y-e.Start.Y)/e.length)
}
return e.vector
}
// Intersects checks if the two edges go through at least one identical point (returns true if they do).
func (e *Edge2D) Intersects(other model.CircuitEdge) bool {
otherEdge2D := other.(*Edge2D)
// See http://paulbourke.net/geometry/pointlineplane/
eDeltaX := e.End.X - e.Start.X
eDeltaY := e.End.Y - e.Start.Y
otherDeltaX := otherEdge2D.End.X - otherEdge2D.Start.X
otherDeltaY := otherEdge2D.End.Y - otherEdge2D.Start.Y
denominator := otherDeltaY*eDeltaX - otherDeltaX*eDeltaY
startToStartDeltaX := e.Start.X - otherEdge2D.Start.X
startToStartDeltaY := e.Start.Y - otherEdge2D.Start.Y
if math.Abs(denominator) < model.Threshold {
// Edges are parallel, check if the edges are on the same line, then return true if lines overlap.
// To do this, use the same math as the denominator, with the edges being this edge and an edge from this start to the other start.
// If this is also parallel, the line segments are on the same infinite line.
startToStartParallel := (otherEdge2D.Start.Y-e.Start.Y)*eDeltaX - (otherEdge2D.Start.X-e.Start.X)*eDeltaY
return math.Abs(startToStartParallel) < model.Threshold && (model.IsBetween(e.Start.X, otherEdge2D.Start.X, otherEdge2D.End.X) ||
model.IsBetween(e.End.X, otherEdge2D.Start.X, otherEdge2D.End.X) ||
model.IsBetween(otherEdge2D.Start.X, e.Start.X, e.End.X) ||
model.IsBetween(otherEdge2D.End.X, e.Start.X, e.End.X))
}
// Determine the percentage of this edge's length from the start to the intersecting point.
// Needs to be between 0 and 1, negative indicates the intersection is before the start, greater than 1 indicates that it is after the end.
numeratorE := otherDeltaX*startToStartDeltaY - otherDeltaY*startToStartDeltaX
if intersectDistPercentE := numeratorE / denominator; intersectDistPercentE < -model.Threshold || intersectDistPercentE > 1.0+model.Threshold {
return false
}
// Check that the intersecting point exists within the other edge's length based on its percentage.
numeratorOther := eDeltaX*startToStartDeltaY - eDeltaY*startToStartDeltaX
intersectDistPercentOther := numeratorOther / denominator
return intersectDistPercentOther >= -model.Threshold && intersectDistPercentOther < 1.0+model.Threshold
}
// Merge creates a new edge starting from this edge's start vertex and ending at the supplied edge's end vertex.
func (e *Edge2D) Merge(other model.CircuitEdge) model.CircuitEdge {
return NewEdge2D(e.Start, other.GetEnd().(*Vertex2D))
}
// Split creates two new edges "start-to-vertex" and "vertex-to-end" based on this edge and the supplied vertex.
func (e *Edge2D) Split(vertex model.CircuitVertex) (model.CircuitEdge, model.CircuitEdge) {
return NewEdge2D(e.Start, vertex.(*Vertex2D)), NewEdge2D(vertex.(*Vertex2D), e.End)
}
// String prints the edge as a string.
func (e *Edge2D) String() string {
return fmt.Sprintf(`{"start":%s,"end":%s}`, e.Start, e.End)
}
// NewEdge2D creates a edge from the starting Vertex2D to the ending Vertex2D.
func NewEdge2D(start *Vertex2D, end *Vertex2D) *Edge2D {
length := start.DistanceTo(end)
return &Edge2D{
Start: start,
End: end,
vector: nil,
length: length,
}
}
var _ model.CircuitEdge = (*Edge2D)(nil) | model2d/edge2d.go | 0.91837 | 0.777553 | edge2d.go | starcoder |
package Euler2D
import (
"fmt"
"image/color"
"time"
"github.com/notargets/avs/chart2d"
"github.com/notargets/avs/functions"
graphics2D "github.com/notargets/avs/geometry"
utils2 "github.com/notargets/avs/utils"
"github.com/notargets/gocfd/utils"
)
type PlotMeta struct {
Plot bool
Scale float64
TranslateX, TranslateY float64
Field FlowFunction
FieldMinP, FieldMaxP *float64 // nil if no forced min, max
FrameTime time.Duration
StepsBeforePlot int
LineType chart2d.LineType
}
type ChartState struct {
chart *chart2d.Chart2D
fs *functions.FSurface
gm *graphics2D.TriMesh
}
func (c *Euler) GetPlotField(Q [4]utils.Matrix, plotField FlowFunction) (field utils.Matrix) {
var (
Kmax = c.dfr.K
Np = c.dfr.SolutionElement.Np
NpFlux = c.dfr.FluxElement.Np
fld utils.Matrix
skipInterp bool
)
switch {
case plotField <= Energy:
fld = Q[int(plotField)]
case plotField < ShockFunction:
fld = utils.NewMatrix(Np, Kmax)
fldD := fld.DataP
for ik := 0; ik < Kmax*Np; ik++ {
fldD[ik] = c.FSFar.GetFlowFunction(Q, ik, plotField)
}
case plotField == ShockFunction:
var sf *ModeAliasShockFinder
if c.Limiter != nil {
sf = c.Limiter.ShockFinder[0]
} else {
sf = NewAliasShockFinder(c.dfr.SolutionElement, 2)
}
fld = utils.NewMatrix(Np, Kmax)
fldD := fld.DataP
for k := 0; k < Kmax; k++ {
qElement := Q[0].Col(k)
m := sf.ShockIndicator(qElement.DataP)
for i := 0; i < Np; i++ {
ik := k + i*Kmax
fldD[ik] = m
}
}
case plotField == EpsilonDissipation:
field = c.Dissipation.GetScalarEpsilonPlotField(c)
skipInterp = true
case plotField == EpsilonDissipationC0:
field = c.Dissipation.GetC0EpsilonPlotField(c)
skipInterp = true
case plotField == XGradientDensity || plotField == YGradientDensity ||
plotField == XGradientXMomentum || plotField == YGradientXMomentum ||
plotField == XGradientYMomentum || plotField == YGradientYMomentum ||
plotField == XGradientEnergy || plotField == YGradientEnergy:
GradX, GradY := utils.NewMatrix(NpFlux, Kmax), utils.NewMatrix(NpFlux, Kmax)
DOFX, DOFY := utils.NewMatrix(NpFlux, Kmax), utils.NewMatrix(NpFlux, Kmax)
var varNum int
switch plotField {
case XGradientDensity, YGradientDensity:
varNum = 0
case XGradientXMomentum, YGradientXMomentum:
varNum = 1
case XGradientYMomentum, YGradientYMomentum:
varNum = 2
case XGradientEnergy, YGradientEnergy:
varNum = 3
}
c.GetSolutionGradientUsingRTElement(-1, varNum, Q, GradX, GradY, DOFX, DOFY)
if plotField < 300 {
field = GradX
} else {
field = GradY
}
skipInterp = true
}
if !skipInterp {
field = c.dfr.FluxInterp.Mul(fld)
}
return
}
func (c *Euler) PlotQ(pm *PlotMeta, width, height int) {
var (
Q = c.RecombineShardsKBy4(c.Q)
plotField = pm.Field
delay = pm.FrameTime
lineType = pm.LineType
scale = pm.Scale
translate = [2]float32{float32(pm.TranslateX), float32(pm.TranslateY)}
oField = c.GetPlotField(Q, plotField)
fI = c.dfr.ConvertScalarToOutputMesh(oField)
)
if c.chart.gm == nil {
c.chart.gm = c.dfr.OutputMesh()
}
c.chart.fs = functions.NewFSurface(c.chart.gm, [][]float32{fI}, 0)
fmt.Printf(" Plot>%s min,max = %8.5f,%8.5f\n", pm.Field.String(), oField.Min(), oField.Max())
c.PlotFS(width, height,
pm.FieldMinP, pm.FieldMaxP, 0.99*oField.Min(), 1.01*oField.Max(),
scale, translate, lineType)
utils.SleepFor(int(delay.Milliseconds()))
}
func (c *Euler) PlotFS(width, height int,
fminP, fmaxP *float64, fmin, fmax float64,
scale float64, translate [2]float32, ltO ...chart2d.LineType) {
var (
fs = c.chart.fs
trimesh = fs.Tris
lt = chart2d.NoLine
specifiedScale = fminP != nil || fmaxP != nil
autoScale = !specifiedScale
)
if c.chart.chart == nil {
box := graphics2D.NewBoundingBox(trimesh.GetGeometry())
box = box.Scale(float32(scale))
box = box.Translate(translate)
//c.chart.chart = chart2d.NewChart2D(1900, 1080, box.XMin[0], box.XMax[0], box.XMin[1], box.XMax[1])
//c.chart.chart = chart2d.NewChart2D(3840, 2160, box.XMin[0], box.XMax[0], box.XMin[1], box.XMax[1])
c.chart.chart = chart2d.NewChart2D(width, height, box.XMin[0], box.XMax[0], box.XMin[1], box.XMax[1])
go c.chart.chart.Plot()
if specifiedScale {
// Scale field min/max to preset values
switch {
case fminP != nil && fmaxP != nil:
fmin, fmax = *fminP, *fmaxP
case fminP != nil:
fmin = *fminP
case fmaxP != nil:
fmax = *fmaxP
}
colorMap := utils2.NewColorMap(float32(fmin), float32(fmax), 1.)
c.chart.chart.AddColorMap(colorMap)
}
}
if autoScale {
// Autoscale field min/max every time
colorMap := utils2.NewColorMap(float32(fmin), float32(fmax), 1.)
c.chart.chart.AddColorMap(colorMap)
}
white := color.RGBA{R: 255, G: 255, B: 255, A: 1}
black := color.RGBA{R: 0, G: 0, B: 0, A: 1}
_, _ = white, black
if len(ltO) != 0 {
lt = ltO[0]
}
if err := c.chart.chart.AddFunctionSurface("FSurface", *fs, lt, white); err != nil {
panic("unable to add function surface series")
}
} | model_problems/Euler2D/plot.go | 0.70304 | 0.4575 | plot.go | starcoder |
package model
import (
"fmt"
"reflect"
"strconv"
)
// Model is a utility type to access and manipulate struct information.
type Model struct {
Fields []Field // struct fields (pointers to fields)
ref interface{}
tag string
}
func structType(s interface{}) reflect.Value {
v := reflect.ValueOf(s)
if v.Kind() == reflect.Ptr {
v = v.Elem()
}
if v.Kind() != reflect.Struct {
panic(fmt.Errorf("the value must be a struct or a pointer to a struct; got %T", v.Type()))
}
return v
}
// New allocates a Model, extracting and caching information about fields that have the given `tag`.
func New(s interface{}, tag string) Model {
v := reflect.ValueOf(s)
if v.Type().Kind() != reflect.Ptr || v.Type().Elem().Kind() != reflect.Struct {
panic(fmt.Errorf("the model must be pointer to a struct; got %T", v.Type()))
}
m := Model{
ref: s,
tag: tag,
}
sv, st := fields(structType(s), tag)
m.Fields = make([]Field, len(sv))
for i := 0; i < len(sv); i++ {
m.Fields[i] = Field{
TagName: st[i].Tag.Get(tag),
StructField: st[i],
Interface: sv[i].Addr().Interface(),
}
}
return m
}
// Returns the real values of the tagged fields. This cannot be cached.
func Values(s interface{}, tag string) []interface{} {
sv, st := fields(structType(s), tag)
values := make([]interface{}, 0, len(sv))
for i, value := range sv {
if value.Type().Kind() == reflect.Struct && st[i].Tag.Get(tag) == "" {
values = append(values, Values(value.Interface(), tag)...)
continue
}
values = append(values, value.Interface())
}
return values
}
// Returns a map of names (tags) and their values
// Example: m.Map()["last_name"]
func (m Model) Map() map[string]interface{} {
values := Values(m.ref, m.tag)
ret := make(map[string]interface{}, len(values))
for i, value := range values {
ret[m.TagNames()[i]] = value
}
return ret
}
// Returns an array of values of tagged fields
func (m Model) Values() []interface{} {
return Values(m.ref, m.tag)
}
// Returns the "real" name of all struct's fields
func (m Model) Names() []string {
ret := make([]string, len(m.Fields))
for i, field := range m.Fields {
ret[i] = field.Name
}
return ret
}
func TagNames(s interface{}, tag string) []string {
return New(s, tag).TagNames()
}
// Returns the tagged names of all struct's fields
func (m Model) TagNames() []string {
ret := make([]string, len(m.Fields))
for i, field := range m.Fields {
ret[i] = field.TagName
}
return ret
}
// Repeat returns the given string as many times as the len of model.Fields
// Useful when building insert statements with Query
// Example:
// people.Repeat("?")
// ["?", "?", "?"] // len is the number of mapped people struct's fields
func (m Model) Repeat(s string) []string {
ret := make([]string, len(m.Fields))
for i := range m.Fields {
ret[i] = s
}
return ret
}
// RepeatInc returns the given string suffixed with its 1-based index, as many times as the len of model.Fields.
// Useful when building insert statements (like in postgres) with the Query struct.
// Example:
// people.RepeatInc("$")
// ["$1", "$2", "$3"]
func (m Model) RepeatInc(s string) []string {
ret := make([]string, len(m.Fields))
for i := range m.Fields {
ret[i] = s + strconv.Itoa(i+1)
}
return ret
}
// Assign returns an array of assignment strings:
// Example:
// ["name=?" "surname=?"]
func (m Model) Assign() []string {
ret := make([]string, len(m.Fields))
for i, field := range m.Fields {
// I'm not sure for this use case this is more performant:
// string(append([]byte(field.TagName), "=?"...))
ret[i] = field.TagName + "=?"
}
return ret
}
// Interfaces returns the underlining interface of each struct's field.
// Useful when binding results to our struct.
func Interfaces(s interface{}, tag string) []interface{} {
return New(s, tag).Interfaces()
}
// Interfaces returns the underlining interface of each struct's field.
// Useful when binding results to our struct.
func (m Model) Interfaces() []interface{} {
ret := make([]interface{}, len(m.Fields))
for i, field := range m.Fields {
ret[i] = field.Interface
}
return ret
}
// MapInterface returns a map of names (tags) and their interfaces
// Example:
// m.MapInterface()["last_name"]
func (m Model) MapInterface() map[string]interface{} {
interfaces := m.Interfaces()
ret := make(map[string]interface{}, len(interfaces))
for i, value := range interfaces {
ret[m.TagNames()[i]] = value
}
return ret
} | model.go | 0.720172 | 0.463809 | model.go | starcoder |
package strmatcher
import (
"math/bits"
"sort"
"strings"
"unsafe"
)
// PrimeRK is the prime base used in Rabin-Karp algorithm.
const PrimeRK = 16777619
// RollingHash calculates the polynomial rolling hash of the given string, extending a provided suffix hash.
func RollingHash(hash uint32, input string) uint32 {
for i := len(input) - 1; i >= 0; i-- {
hash = hash*PrimeRK + uint32(input[i])
}
return hash
}
// MemHash is the hash function used by go map, it utilizes available hardware instructions(behaves
// as aeshash if aes instruction is available).
// With different seed, each MemHash<seed> performs as distinct hash functions.
func MemHash(seed uint32, input string) uint32 {
return uint32(strhash(unsafe.Pointer(&input), uintptr(seed))) // nosemgrep
}
const (
mphMatchTypeCount = 2 // Full and Domain
)
type mphRuleInfo struct {
rollingHash uint32
matchers [mphMatchTypeCount][]uint32
}
// MphMatcherGroup is an implementation of MatcherGroup.
// It implements Rabin-Karp algorithm and minimal perfect hash table for Full and Domain matcher.
type MphMatcherGroup struct {
rules []string // RuleIdx -> pattern string, index 0 reserved for failed lookup
values [][]uint32 // RuleIdx -> registered matcher values for the pattern (Full Matcher takes precedence)
level0 []uint32 // RollingHash & Mask -> seed for Memhash
level0Mask uint32 // Mask restricting RollingHash to 0 ~ len(level0)
level1 []uint32 // Memhash<seed> & Mask -> stored index for rules
level1Mask uint32 // Mask for restricting Memhash<seed> to 0 ~ len(level1)
ruleInfos *map[string]mphRuleInfo
}
func NewMphMatcherGroup() *MphMatcherGroup {
return &MphMatcherGroup{
rules: []string{""},
values: [][]uint32{nil},
level0: nil,
level0Mask: 0,
level1: nil,
level1Mask: 0,
ruleInfos: &map[string]mphRuleInfo{}, // Only used for building, destroyed after build complete
}
}
// AddFullMatcher implements MatcherGroupForFull.
func (g *MphMatcherGroup) AddFullMatcher(matcher FullMatcher, value uint32) {
pattern := strings.ToLower(matcher.Pattern())
g.addPattern(0, "", pattern, matcher.Type(), value)
}
// AddDomainMatcher implements MatcherGroupForDomain.
func (g *MphMatcherGroup) AddDomainMatcher(matcher DomainMatcher, value uint32) {
pattern := strings.ToLower(matcher.Pattern())
hash := g.addPattern(0, "", pattern, matcher.Type(), value) // For full domain match
g.addPattern(hash, pattern, ".", matcher.Type(), value) // For partial domain match
}
func (g *MphMatcherGroup) addPattern(suffixHash uint32, suffixPattern string, pattern string, matcherType Type, value uint32) uint32 {
fullPattern := pattern + suffixPattern
info, found := (*g.ruleInfos)[fullPattern]
if !found {
info = mphRuleInfo{rollingHash: RollingHash(suffixHash, pattern)}
g.rules = append(g.rules, fullPattern)
g.values = append(g.values, nil)
}
info.matchers[matcherType] = append(info.matchers[matcherType], value)
(*g.ruleInfos)[fullPattern] = info
return info.rollingHash
}
// Build builds a minimal perfect hash table for insert rules.
// Algorithm used: Hash, displace, and compress. See http://cmph.sourceforge.net/papers/esa09.pdf
func (g *MphMatcherGroup) Build() error {
ruleCount := len(*g.ruleInfos)
g.level0 = make([]uint32, nextPow2(ruleCount/4))
g.level0Mask = uint32(len(g.level0) - 1)
g.level1 = make([]uint32, nextPow2(ruleCount))
g.level1Mask = uint32(len(g.level1) - 1)
// Create buckets based on all rule's rolling hash
buckets := make([][]uint32, len(g.level0))
for ruleIdx := 1; ruleIdx < len(g.rules); ruleIdx++ { // Traverse rules starting from index 1 (0 reserved for failed lookup)
ruleInfo := (*g.ruleInfos)[g.rules[ruleIdx]]
bucketIdx := ruleInfo.rollingHash & g.level0Mask
buckets[bucketIdx] = append(buckets[bucketIdx], uint32(ruleIdx))
g.values[ruleIdx] = append(ruleInfo.matchers[Full], ruleInfo.matchers[Domain]...) // nolint:gocritic
}
g.ruleInfos = nil // Set ruleInfos nil to release memory
// Sort buckets in descending order with respect to each bucket's size
bucketIdxs := make([]int, len(buckets))
for bucketIdx := range buckets {
bucketIdxs[bucketIdx] = bucketIdx
}
sort.Slice(bucketIdxs, func(i, j int) bool { return len(buckets[bucketIdxs[i]]) > len(buckets[bucketIdxs[j]]) })
// Exercise Hash, Displace, and Compress algorithm to construct minimal perfect hash table
occupied := make([]bool, len(g.level1)) // Whether a second-level hash has been already used
hashedBucket := make([]uint32, 0, 4) // Second-level hashes for each rule in a specific bucket
for _, bucketIdx := range bucketIdxs {
bucket := buckets[bucketIdx]
hashedBucket = hashedBucket[:0]
seed := uint32(0)
for len(hashedBucket) != len(bucket) {
for _, ruleIdx := range bucket {
memHash := MemHash(seed, g.rules[ruleIdx]) & g.level1Mask
if occupied[memHash] { // Collision occurred with this seed
for _, hash := range hashedBucket { // Revert all values in this hashed bucket
occupied[hash] = false
g.level1[hash] = 0
}
hashedBucket = hashedBucket[:0]
seed++ // Try next seed
break
}
occupied[memHash] = true
g.level1[memHash] = ruleIdx // The final value in the hash table
hashedBucket = append(hashedBucket, memHash)
}
}
g.level0[bucketIdx] = seed // Displacement value for this bucket
}
return nil
}
// Lookup searches for input in minimal perfect hash table and returns its index. 0 indicates not found.
func (g *MphMatcherGroup) Lookup(rollingHash uint32, input string) uint32 {
i0 := rollingHash & g.level0Mask
seed := g.level0[i0]
i1 := MemHash(seed, input) & g.level1Mask
if n := g.level1[i1]; g.rules[n] == input {
return n
}
return 0
}
// Match implements MatcherGroup.Match.
func (g *MphMatcherGroup) Match(input string) []uint32 {
matches := [][]uint32{}
hash := uint32(0)
for i := len(input) - 1; i >= 0; i-- {
hash = hash*PrimeRK + uint32(input[i])
if input[i] == '.' {
if mphIdx := g.Lookup(hash, input[i:]); mphIdx != 0 {
matches = append(matches, g.values[mphIdx])
}
}
}
if mphIdx := g.Lookup(hash, input); mphIdx != 0 {
matches = append(matches, g.values[mphIdx])
}
switch len(matches) {
case 0:
return nil
case 1:
return matches[0]
default:
result := []uint32{}
for i := len(matches) - 1; i >= 0; i-- {
result = append(result, matches[i]...)
}
return result
}
}
// MatchAny implements MatcherGroup.MatchAny.
func (g *MphMatcherGroup) MatchAny(input string) bool {
hash := uint32(0)
for i := len(input) - 1; i >= 0; i-- {
hash = hash*PrimeRK + uint32(input[i])
if input[i] == '.' {
if g.Lookup(hash, input[i:]) != 0 {
return true
}
}
}
return g.Lookup(hash, input) != 0
}
func nextPow2(v int) int {
if v <= 1 {
return 1
}
const MaxUInt = ^uint(0)
n := (MaxUInt >> bits.LeadingZeros(uint(v))) + 1
return int(n)
}
//go:noescape
//go:linkname strhash runtime.strhash
func strhash(p unsafe.Pointer, h uintptr) uintptr | common/strmatcher/matchergroup_mph.go | 0.685318 | 0.455441 | matchergroup_mph.go | starcoder |
package main
import (
"math"
"math/rand"
)
// Distribution provides an interface to model a statistical distribution.
type Distribution interface {
Advance()
Get() float64 // should be idempotent
}
// NormalDistribution models a normal distribution.
type NormalDistribution struct {
Mean float64
StdDev float64
value float64
}
func ND(mean, stddev float64) *NormalDistribution {
return &NormalDistribution{Mean: mean, StdDev: stddev}
}
// Advance advances this distribution. Since a normal distribution is
// stateless, this is just overwrites the internal cache value.
func (d *NormalDistribution) Advance() {
d.value = rand.NormFloat64()*d.StdDev + d.Mean
}
// Get returns the last computed value for this distribution.
func (d *NormalDistribution) Get() float64 {
return d.value
}
// UniformDistribution models a uniform distribution.
type UniformDistribution struct {
Low float64
High float64
value float64
}
func UD(low, high float64) *UniformDistribution {
return &UniformDistribution{Low: low, High: high}
}
// Advance advances this distribution. Since a uniform distribution is
// stateless, this just overwrites the internal cached value.
func (d *UniformDistribution) Advance() {
x := rand.Float64() // uniform
x *= d.High - d.Low
x += d.Low
d.value = x
}
// Get returns the last computed value for this distribution.
func (d *UniformDistribution) Get() float64 {
return d.value
}
// RandomWalkDistribution is a stateful random walk. Initialize it with an
// underlying distribution, which is used to compute the new step value.
type RandomWalkDistribution struct {
Step Distribution
State float64 // optional
}
func WD(step Distribution, state float64) *RandomWalkDistribution {
return &RandomWalkDistribution{Step: step, State: state}
}
// Advance computes the next value of this distribution and stores it.
func (d *RandomWalkDistribution) Advance() {
d.Step.Advance()
d.State += d.Step.Get()
}
// Get returns the last computed value for this distribution.
func (d *RandomWalkDistribution) Get() float64 {
return d.State
}
// ClampedRandomWalkDistribution is a stateful random walk, with minimum and
// maximum bounds. Initialize it with a Min, Max, and an underlying
// distribution, which is used to compute the new step value.
type ClampedRandomWalkDistribution struct {
Step Distribution
Min float64
Max float64
State float64 // optional
}
func CWD(step Distribution, min, max, state float64) *ClampedRandomWalkDistribution {
return &ClampedRandomWalkDistribution{
Step: step,
Min: min,
Max: max,
State: state,
}
}
// Advance computes the next value of this distribution and stores it.
func (d *ClampedRandomWalkDistribution) Advance() {
d.Step.Advance()
d.State += d.Step.Get()
if d.State > d.Max {
d.State = d.Max
}
if d.State < d.Min {
d.State = d.Min
}
}
// Get returns the last computed value for this distribution.
func (d *ClampedRandomWalkDistribution) Get() float64 {
return d.State
}
// MonotonicRandomWalkDistribution is a stateful random walk that only
// increases. Initialize it with a State and an underlying distribution,
// which is used to compute the new step value. The sign of each value drawn
// from the underlying distribution is made positive.
type MonotonicRandomWalkDistribution struct {
Step Distribution
State float64
}
// Advance computes the next value of this distribution and stores it.
func (d *MonotonicRandomWalkDistribution) Advance() {
d.Step.Advance()
d.State += math.Abs(d.Step.Get())
}
func (d *MonotonicRandomWalkDistribution) Get() float64 {
return d.State
}
func MWD(step Distribution, state float64) *MonotonicRandomWalkDistribution {
return &MonotonicRandomWalkDistribution{Step: step, State: state}
}
type ConstantDistribution struct {
State float64
}
// Advance is a no-op: a constant distribution never changes.
func (d *ConstantDistribution) Advance() {
}
func (d *ConstantDistribution) Get() float64 {
return d.State
} | cmd/bulk_data_gen/distribution.go | 0.925626 | 0.627152 | distribution.go | starcoder |
package data
import (
"math"
"github.com/calummccain/coxeter/vector"
)
const (
eVal53n = 3.0884042547 // math.Pi / math.Atan(P)
pVal53n = 6.0
eVal53nTrunc = 3.0884042547 // math.Pi / math.Atan(P)
pVal53nTrunc = 11.465072284 // math.Pi / math.Atan(math.Sqrt(1.0/(7.0+4.0*Rt2)))
eVal53nRect = 3.0884042547 // math.Pi / math.Atan(P)
pVal53nRect = 1e100 //∞
)
func GoursatTetrahedron53n(n float64) GoursatTetrahedron {
cot := 1.0 / math.Pow(math.Tan(math.Pi/n), 2.0)
cf := (P + 2.0) * cot / (1 + cot)
ce := P2 * cot
fe := P2 * (1 + cot) / (P + 2.0)
var cv, fv, ev float64
if math.Abs(n-pVal53n) < BoundaryEps {
cv = 3.0
fv = 2.0 * (1.0 - Rt_5)
ev = P_2
} else {
cv = P4 * cot / (3.0 - cot)
fv = P4 * (1.0 + cot) / ((P + 2) * (3.0 - cot))
ev = P2 / (3.0 - cot)
}
return GoursatTetrahedron{
V: vector.Vec4{W: 1, X: P, Y: P_1, Z: 0},
E: vector.Vec4{W: 1, X: P, Y: 0, Z: 0},
F: vector.Vec4{W: 3 - P, X: P, Y: 0, Z: 1},
C: vector.Vec4{W: 1, X: 0, Y: 0, Z: 0},
CFE: vector.Vec4{W: 0, X: 0, Y: 1, Z: 0},
CFV: vector.Vec4{W: 0, X: 1, Y: -P2, Z: -P},
CEV: vector.Vec4{W: 0, X: 0, Y: 0, Z: 1},
FEV: vector.Vec4{W: P2*cot - 1.0, X: P3 * cot, Y: 0, Z: P2 * cot},
CF: cf,
CE: ce,
CV: cv,
FE: fe,
FV: fv,
EV: ev,
}
}
func Coxeter53n(n float64) Coxeter {
cos := math.Pow(math.Cos(math.Pi/n), 2.0)
return Coxeter{
P: 5.0,
Q: 3.0,
R: n,
A: func(v vector.Vec4) vector.Vec4 { return vector.Vec4{W: v.W, X: v.X, Y: -v.Y, Z: v.Z} },
B: func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: v.W, X: 0.5 * (P*v.X + v.Y + v.Z*P_1), Y: 0.5 * (v.X - v.Y*P_1 - P*v.Z), Z: 0.5 * (v.X*P_1 - P*v.Y + v.Z)}
},
C: func(v vector.Vec4) vector.Vec4 { return vector.Vec4{W: v.W, X: v.X, Y: v.Y, Z: -v.Z} },
D: func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{
W: (2*P*Rt5*cos-1)*v.W - (2*Rt5*cos-2*P_1)*v.X - (2*Rt5*cos*P_1-2*P_2)*v.Z,
X: 2*P3*cos*v.W + (1-2*P2*cos)*v.X - 2*P*cos*v.Z,
Y: v.Y,
Z: 2*P2*cos*v.W - 2*P*cos*v.X + (1-2*cos)*v.Z,
}
},
FaceReflections: []string{
"", "c", "bc", "acacbabc", "cacbabc",
"cbabc", "bacbabc", "cbabacbabc", "babacbabc",
"abc", "acbabc", "abacbabc",
},
GoursatTetrahedron: GoursatTetrahedron53n(n),
}
}
func Honeycomb53n(n float64) Honeycomb {
cot := 1.0 / math.Pow(math.Tan(math.Pi/n), 2.0)
space := Boundaries(n, eVal53n, pVal53n)
var scale func(vector.Vec4) vector.Vec4
if space == 'p' {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: Rt3 * v.W, X: v.X, Y: v.Y, Z: v.Z}
}
} else if space == 'e' {
// TODO
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: v.W, X: v.X, Y: v.Y, Z: v.Z}
}
} else {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{
W: P2 * math.Sqrt(math.Abs(cot/(cot-3.0))) * v.W,
X: math.Sqrt(math.Abs((P2*cot-1.0)/(cot-3.0))) * v.X,
Y: math.Sqrt(math.Abs((P2*cot-1.0)/(cot-3.0))) * v.Y,
Z: math.Sqrt(math.Abs((P2*cot-1.0)/(cot-3.0))) * v.Z,
}
}
}
var innerProd func(vector.Vec4, vector.Vec4) float64
if space == 'p' {
innerProd = func(a, b vector.Vec4) float64 { return 3.0*a.W*b.W - a.X*b.X - a.Y*b.Y - a.Z*b.Z }
} else {
innerProd = func(a, b vector.Vec4) float64 {
return (P4*cot*a.W*b.W - (P2*cot-1.0)*(a.X*b.X+a.Y*b.Y+a.Z*b.Z)) / math.Abs(3.0-cot)
}
}
return Honeycomb{
Coxeter: Coxeter53n(n),
CellType: "spherical",
Vertices: []vector.Vec4{
{W: 1, X: 1, Y: 1, Z: 1},
{W: 1, X: 1, Y: 1, Z: -1},
{W: 1, X: 1, Y: -1, Z: 1},
{W: 1, X: 1, Y: -1, Z: -1},
{W: 1, X: -1, Y: 1, Z: 1},
{W: 1, X: -1, Y: 1, Z: -1},
{W: 1, X: -1, Y: -1, Z: 1},
{W: 1, X: -1, Y: -1, Z: -1},
{W: 1, X: 0, Y: P, Z: P_1},
{W: 1, X: 0, Y: P, Z: -P_1},
{W: 1, X: 0, Y: -P, Z: P_1},
{W: 1, X: 0, Y: -P, Z: -P_1},
{W: 1, X: P, Y: P_1, Z: 0},
{W: 1, X: P, Y: -P_1, Z: 0},
{W: 1, X: -P, Y: P_1, Z: 0},
{W: 1, X: -P, Y: -P_1, Z: 0},
{W: 1, X: P_1, Y: 0, Z: P},
{W: 1, X: -P_1, Y: 0, Z: P},
{W: 1, X: P_1, Y: 0, Z: -P},
{W: 1, X: -P_1, Y: 0, Z: -P},
},
Edges: [][2]int{
{0, 8}, {0, 12}, {0, 16}, {1, 9},
{1, 12}, {1, 18}, {2, 10}, {2, 13},
{2, 16}, {3, 11}, {3, 13}, {3, 18},
{4, 8}, {4, 14}, {4, 17}, {5, 9},
{5, 14}, {5, 19}, {6, 10}, {6, 15},
{6, 17}, {7, 11}, {7, 15}, {7, 19},
{8, 9}, {10, 11}, {12, 13}, {14, 15},
{16, 17}, {18, 19},
},
Faces: [][]int{
{0, 16, 2, 13, 12}, {1, 12, 13, 3, 18},
{0, 12, 1, 9, 8}, {0, 8, 4, 17, 16},
{2, 16, 17, 6, 10}, {1, 18, 19, 5, 9},
{4, 8, 9, 5, 14}, {5, 19, 7, 15, 14},
{6, 17, 4, 14, 15}, {3, 13, 2, 10, 11},
{3, 11, 7, 19, 18}, {11, 10, 6, 15, 7},
},
EVal: eVal53n,
PVal: pVal53n,
Space: space,
Scale: scale,
InnerProduct: innerProd,
}
}
func Honeycomb53nTrunc(n float64) Honeycomb {
cot := 1.0 / math.Pow(math.Tan(math.Pi/n), 2.0)
space := Boundaries(n, eVal53nTrunc, pVal53nTrunc)
A := (P4 - 1.0) * 0.2
B := 1.0 + Rt_5
C := A * P_1
D := B * P_1
E := (1 - Rt_5) * 0.5
F := (1 + Rt_5) * 0.5
G := P_1 * Rt_5
var scale func(vector.Vec4) vector.Vec4
if space == 'p' {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: math.Sqrt(P2+0.2*P_2) * v.W, X: v.X, Y: v.Y, Z: v.Z}
}
} else if space == 'e' {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: v.W, X: v.X, Y: v.Y, Z: v.Z}
}
} else {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{
W: P * math.Sqrt(math.Abs(cot/(1.0+P_4*0.2-P_2*cot*0.2))) * v.W,
X: math.Sqrt(math.Abs((cot-P_2)/(1.0+P_4*0.2-P_2*cot*0.2))) * v.X,
Y: math.Sqrt(math.Abs((cot-P_2)/(1.0+P_4*0.2-P_2*cot*0.2))) * v.Y,
Z: math.Sqrt(math.Abs((cot-P_2)/(1.0+P_4*0.2-P_2*cot*0.2))) * v.Z,
}
}
}
var innerProd func(vector.Vec4, vector.Vec4) float64
if space == 'p' {
innerProd = func(a, b vector.Vec4) float64 { return (P2+0.2*P_2)*a.W*b.W - (a.X*b.X + a.Y*b.Y + a.Z*b.Z) }
} else {
innerProd = func(a, b vector.Vec4) float64 {
return (P2*cot*a.W*b.W - (cot-P_2)*(a.X*b.X+a.Y*b.Y+a.Z*b.Z)) / math.Abs(1.0+P_4*0.2-P_2*cot*0.2)
}
}
return Honeycomb{
Coxeter: Coxeter53n(n),
CellType: "spherical",
Vertices: []vector.Vec4{
{W: 1, X: B, Y: C, Z: -E},
{W: 1, X: B, Y: C, Z: E},
{W: 1, X: P, Y: G, Z: 0},
{W: 1, X: P, Y: -G, Z: 0},
{W: 1, X: B, Y: -C, Z: -E},
{W: 1, X: B, Y: -C, Z: E},
{W: 1, X: F, Y: -A, Z: D},
{W: 1, X: A, Y: -D, Z: F},
{W: 1, X: D, Y: -F, Z: A},
{W: 1, X: C, Y: -E, Z: B},
{W: 1, X: C, Y: E, Z: B},
{W: 1, X: G, Y: 0, Z: P},
{W: 1, X: D, Y: F, Z: A},
{W: 1, X: A, Y: D, Z: F},
{W: 1, X: F, Y: A, Z: D},
{W: 1, X: E, Y: B, Z: C},
{W: 1, X: 0, Y: P, Z: G},
{W: 1, X: -E, Y: B, Z: C},
{W: 1, X: -F, Y: A, Z: D},
{W: 1, X: -A, Y: D, Z: F},
{W: 1, X: -D, Y: F, Z: A},
{W: 1, X: -C, Y: E, Z: B},
{W: 1, X: -G, Y: 0, Z: P},
{W: 1, X: -C, Y: -E, Z: B},
{W: 1, X: -D, Y: -F, Z: A},
{W: 1, X: -A, Y: -D, Z: F},
{W: 1, X: -F, Y: -A, Z: D},
{W: 1, X: -E, Y: -B, Z: C},
{W: 1, X: E, Y: -B, Z: C},
{W: 1, X: 0, Y: -P, Z: G},
{W: 1, X: 0, Y: -P, Z: -G},
{W: 1, X: E, Y: -B, Z: -C},
{W: 1, X: -E, Y: -B, Z: -C},
{W: 1, X: F, Y: -A, Z: -D},
{W: 1, X: A, Y: -D, Z: -F},
{W: 1, X: D, Y: -F, Z: -A},
{W: 1, X: C, Y: -E, Z: -B},
{W: 1, X: G, Y: 0, Z: -P},
{W: 1, X: C, Y: E, Z: -B},
{W: 1, X: D, Y: F, Z: -A},
{W: 1, X: F, Y: A, Z: -D},
{W: 1, X: A, Y: D, Z: -F},
{W: 1, X: E, Y: B, Z: -C},
{W: 1, X: -E, Y: B, Z: -C},
{W: 1, X: 0, Y: P, Z: -G},
{W: 1, X: -B, Y: C, Z: -E},
{W: 1, X: -B, Y: C, Z: E},
{W: 1, X: -P, Y: G, Z: 0},
{W: 1, X: -P, Y: -G, Z: 0},
{W: 1, X: -B, Y: -C, Z: E},
{W: 1, X: -B, Y: -C, Z: -E},
{W: 1, X: -A, Y: -D, Z: -F},
{W: 1, X: -F, Y: -A, Z: -D},
{W: 1, X: -D, Y: -F, Z: -A},
{W: 1, X: -C, Y: -E, Z: -B},
{W: 1, X: -G, Y: 0, Z: -P},
{W: 1, X: -C, Y: E, Z: -B},
{W: 1, X: -D, Y: F, Z: -A},
{W: 1, X: -F, Y: A, Z: -D},
{W: 1, X: -A, Y: D, Z: -F},
},
Edges: [][2]int{
{0, 1}, {0, 2}, {0, 41}, {1, 2}, {1, 13},
{2, 3}, {3, 4}, {3, 5}, {4, 5}, {4, 34},
{5, 7}, {6, 7}, {6, 8}, {6, 28}, {7, 8},
{8, 9}, {9, 10}, {9, 11}, {10, 11}, {10, 12},
{11, 22}, {12, 13}, {12, 14}, {13, 14}, {14, 15},
{15, 16}, {15, 17}, {16, 17}, {16, 44}, {17, 18},
{18, 19}, {18, 20}, {19, 20}, {19, 46}, {20, 21},
{21, 22}, {21, 23}, {22, 23}, {23, 24}, {24, 25},
{24, 26}, {25, 26}, {25, 49}, {26, 27}, {27, 28},
{27, 29}, {28, 29}, {29, 30}, {30, 31}, {30, 32},
{31, 32}, {31, 33}, {32, 52}, {33, 34}, {33, 35},
{34, 35}, {35, 36}, {36, 37}, {36, 38}, {37, 38},
{37, 55}, {38, 39}, {39, 40}, {39, 41}, {40, 41},
{40, 42}, {42, 43}, {42, 44}, {43, 44}, {43, 58},
{45, 46}, {45, 47}, {45, 59}, {46, 47}, {47, 48},
{48, 49}, {48, 50}, {49, 50}, {50, 51}, {51, 52},
{51, 53}, {52, 53}, {53, 54}, {54, 55}, {54, 56},
{55, 56}, {56, 57}, {57, 58}, {57, 59}, {58, 59},
},
Faces: [][]int{
{0, 1, 2}, {3, 4, 5}, {6, 7, 8}, {9, 10, 11}, {12, 13, 14},
{15, 16, 17}, {18, 19, 20}, {21, 22, 23}, {24, 25, 26}, {27, 28, 29},
{30, 31, 32}, {33, 34, 35}, {36, 37, 38}, {39, 40, 41}, {42, 43, 44},
{45, 46, 47}, {48, 49, 50}, {51, 52, 53}, {54, 55, 56}, {57, 58, 59},
{0, 1, 13, 14, 15, 16, 44, 42, 40, 41}, {1, 2, 3, 5, 7, 8, 9, 10, 12, 13},
{0, 2, 3, 4, 34, 35, 36, 38, 39, 41}, {10, 11, 22, 21, 20, 18, 17, 15, 14, 12},
{4, 5, 7, 6, 28, 29, 30, 31, 33, 34}, {6, 8, 9, 11, 22, 23, 24, 26, 27, 28},
{25, 26, 27, 29, 30, 32, 52, 51, 50, 49}, {31, 32, 52, 53, 54, 55, 37, 36, 35, 33},
{37, 38, 39, 40, 42, 43, 58, 57, 56, 55}, {16, 17, 18, 19, 46, 45, 59, 58, 43, 44},
{19, 20, 21, 23, 24, 25, 49, 48, 47, 46}, {45, 47, 48, 50, 51, 53, 54, 56, 57, 59},
},
EVal: eVal53nTrunc,
PVal: pVal53nTrunc,
Space: space,
Scale: scale,
InnerProduct: innerProd,
}
}
func Honeycomb53nRect(n float64) Honeycomb {
cot := 1.0 / math.Pow(math.Tan(math.Pi/n), 2.0)
space := Boundaries(n, eVal53nRect, pVal53nRect)
var scale func(vector.Vec4) vector.Vec4
if space == 'e' {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{W: v.W, X: v.X, Y: v.Y, Z: v.Z}
}
} else {
scale = func(v vector.Vec4) vector.Vec4 {
return vector.Vec4{
W: P * math.Sqrt(cot) * v.W,
X: math.Sqrt(math.Abs(cot-P_2)) * v.X,
Y: math.Sqrt(math.Abs(cot-P_2)) * v.Y,
Z: math.Sqrt(math.Abs(cot-P_2)) * v.Z,
}
}
}
innerProd := func(a, b vector.Vec4) float64 {
return P2*cot*a.W*b.W - (cot-P_2)*(a.X*b.X+a.Y*b.Y+a.Z*b.Z)
}
return Honeycomb{
Coxeter: Coxeter53n(n),
CellType: "spherical",
Vertices: []vector.Vec4{
{W: 1, X: P * 0.5, Y: 0.5, Z: P2 * 0.5},
{W: 1, X: P * 0.5, Y: -0.5, Z: P2 * 0.5},
{W: 1, X: P2 * 0.5, Y: -P * 0.5, Z: 0.5},
{W: 1, X: P, Y: 0, Z: 0},
{W: 1, X: P2 * 0.5, Y: P * 0.5, Z: 0.5},
{W: 1, X: 0, Y: 0, Z: P},
{W: 1, X: 0.5, Y: -P2 * 0.5, Z: P * 0.5},
{W: 1, X: P2 * 0.5, Y: -P * 0.5, Z: -0.5},
{W: 1, X: P2 * 0.5, Y: P * 0.5, Z: -0.5},
{W: 1, X: 0.5, Y: P2 * 0.5, Z: P * 0.5},
{W: 1, X: -P * 0.5, Y: -0.5, Z: P2 * 0.5},
{W: 1, X: -0.5, Y: -P2 * 0.5, Z: P * 0.5},
{W: 1, X: 0, Y: -P, Z: 0},
{W: 1, X: 0.5, Y: -P2 * 0.5, Z: -P * 0.5},
{W: 1, X: P * 0.5, Y: -0.5, Z: -P2 * 0.5},
{W: 1, X: P * 0.5, Y: 0.5, Z: -P2 * 0.5},
{W: 1, X: 0.5, Y: P2 * 0.5, Z: -P * 0.5},
{W: 1, X: 0, Y: P, Z: 0},
{W: 1, X: -0.5, Y: P2 * 0.5, Z: P * 0.5},
{W: 1, X: -P * 0.5, Y: 0.5, Z: P2 * 0.5},
{W: 1, X: -P2 * 0.5, Y: -P * 0.5, Z: 0.5},
{W: 1, X: -0.5, Y: -P2 * 0.5, Z: -P * 0.5},
{W: 1, X: 0, Y: 0, Z: -P},
{W: 1, X: -0.5, Y: P2 * 0.5, Z: -P * 0.5},
{W: 1, X: -P2 * 0.5, Y: P * 0.5, Z: 0.5},
{W: 1, X: -P2 * 0.5, Y: -P * 0.5, Z: -0.5},
{W: 1, X: -P * 0.5, Y: -0.5, Z: -P2 * 0.5},
{W: 1, X: -P * 0.5, Y: 0.5, Z: -P2 * 0.5},
{W: 1, X: -P2 * 0.5, Y: P * 0.5, Z: -0.5},
{W: 1, X: -P, Y: 0, Z: 0},
},
Edges: [][2]int{
{0, 1}, {0, 4}, {0, 5}, {0, 9},
{1, 5}, {1, 2}, {1, 6}, {2, 6},
{2, 7}, {2, 3}, {3, 7}, {3, 8},
{3, 4}, {4, 8}, {4, 9}, {5, 10},
{5, 19}, {6, 11}, {6, 12}, {7, 13},
{7, 14}, {8, 15}, {8, 16}, {9, 17},
{9, 18}, {10, 11}, {10, 20}, {10, 19},
{11, 20}, {11, 12}, {12, 13}, {12, 21},
{13, 21}, {13, 14}, {14, 15}, {14, 22},
{15, 16}, {15, 22}, {16, 17}, {16, 23},
{17, 18}, {17, 23}, {18, 19}, {18, 24},
{19, 24}, {20, 29}, {20, 25}, {21, 25},
{21, 26}, {22, 26}, {22, 27}, {23, 27},
{23, 28}, {24, 28}, {24, 29}, {25, 26},
{26, 27}, {27, 28}, {28, 29}, {29, 25},
},
Faces: [][]int{
{0, 1, 2, 3, 4}, {0, 5, 19, 18, 9}, {1, 6, 11, 10, 5}, {2, 7, 13, 12, 6},
{3, 8, 15, 14, 7}, {4, 9, 17, 16, 8}, {11, 12, 21, 25, 20}, {13, 14, 22, 26, 21},
{15, 16, 23, 27, 22}, {17, 18, 24, 28, 23}, {19, 10, 20, 29, 24}, {25, 26, 27, 28, 29},
{0, 1, 5}, {1, 2, 6}, {2, 3, 7}, {3, 4, 8}, {4, 0, 9},
{5, 10, 19}, {6, 11, 12}, {7, 13, 14}, {8, 15, 16}, {9, 17, 18},
{10, 11, 20}, {12, 13, 21}, {14, 15, 22}, {16, 17, 23}, {18, 19, 24},
{20, 25, 29}, {21, 25, 26}, {22, 26, 27}, {23, 27, 28}, {24, 28, 29},
},
EVal: eVal53nRect,
PVal: pVal53nRect,
Space: space,
Scale: scale,
InnerProduct: innerProd,
}
} | data/53n.go | 0.508544 | 0.569553 | 53n.go | starcoder |
package stat
import (
"fmt"
"math"
"net/url"
"sort"
"strings"
)
// PercentilInt returns the p-th percentile of pre-sorted integer data. 0 <= p <= 100.
func PercentilInt(data []int, p int) int {
n := len(data)
if n == 0 {
return 0
}
if n == 1 {
return data[0]
}
pos := float64(p) * float64(n+1) / 100
fpos := math.Floor(pos)
intPos := int(fpos)
dif := pos - fpos
if intPos < 1 {
return data[0]
}
if intPos >= n {
return data[n-1]
}
lower := data[intPos-1]
upper := data[intPos]
val := float64(lower) + dif*float64(upper-lower)
return int(math.Floor(val + 0.5))
}
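The interpolation above places the p-th percentile at fractional rank p*(n+1)/100 and linearly interpolates between the two neighbouring sorted values. A standalone sketch that mirrors PercentilInt's logic (the helper name is illustrative):

```go
package main

import (
	"fmt"
	"math"
)

// percentile mirrors PercentilInt: rank pos = p*(n+1)/100, then linear
// interpolation between the two neighbouring sorted values, rounded to int.
func percentile(data []int, p int) int {
	n := len(data)
	pos := float64(p) * float64(n+1) / 100
	fpos := math.Floor(pos)
	intPos := int(fpos)
	if intPos < 1 {
		return data[0]
	}
	if intPos >= n {
		return data[n-1]
	}
	lower := data[intPos-1]
	upper := data[intPos]
	val := float64(lower) + (pos-fpos)*float64(upper-lower)
	return int(math.Floor(val + 0.5))
}

func main() {
	data := []int{1, 2, 3, 4, 5, 6, 7, 8, 9, 10} // already sorted
	// p=25: pos = 2.75, interpolates between data[1]=2 and data[2]=3 -> 2.75, rounds to 3
	fmt.Println(percentile(data, 25))
	// p=50: pos = 5.5, midway between 5 and 6 -> 5.5, rounds to 6
	fmt.Println(percentile(data, 50))
}
```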
// percentilFloat64 returns the p-th percentile of pre-sorted float64 data. 0 <= p <= 100.
func percentilFloat64(data []float64, p int) float64 {
n := len(data)
if n == 0 {
return 0
}
if n == 1 {
return data[0]
}
pos := float64(p) * float64(n+1) / 100
fpos := math.Floor(pos)
intPos := int(fpos)
dif := pos - fpos
if intPos < 1 {
return data[0]
}
if intPos >= n {
return data[n-1]
}
lower := data[intPos-1]
upper := data[intPos]
val := lower + dif*(upper-lower)
return val
}
// SixvalInt computes the minimum, p-th percentile, median, average, (100-p)-th percentile and maximum of the values in data.
func SixvalInt(data []int, p int) (min, lq, med, avg, uq, max int) {
min, max = math.MaxInt32, math.MinInt32
sum, n := 0, len(data)
if n == 0 {
return
}
if n == 1 {
min = data[0]
lq = data[0]
med = data[0]
avg = data[0]
uq = data[0]
max = data[0]
return
}
for _, v := range data {
if v < min {
min = v
}
if v > max {
max = v
}
sum += v
}
avg = sum / n
sort.Ints(data)
if n%2 == 1 {
med = data[(n-1)/2]
} else {
med = (data[n/2] + data[n/2-1]) / 2
}
lq = PercentilInt(data, p)
uq = PercentilInt(data, 100-p)
return
}
// DistributionInt returns the percentiles of data at the given levels.
func DistributionInt(data []int, levels []int) (p []int) {
sort.Ints(data)
for _, q := range levels {
p = append(p, PercentilInt(data, q))
}
return p
}
// SixvalFloat64 computes the minimum, p-th percentile, median, average, (100-p)-th percentile and maximum of the values in data.
func SixvalFloat64(data []float64, p int) (min, lq, med, avg, uq, max float64) {
n := len(data)
// Special cases 0 and 1
if n == 0 {
return
}
if n == 1 {
min = data[0]
lq = data[0]
med = data[0]
avg = data[0]
uq = data[0]
max = data[0]
return
}
// First pass (min, max, coarse average)
var sum float64
min, max = math.MaxFloat64, -math.MaxFloat64
for _, v := range data {
if v < min {
min = v
}
if v > max {
max = v
}
sum += v
}
avg = sum / float64(n)
// Second pass: Correct average
var corr float64
for _, v := range data {
corr += v - avg
}
avg += corr / float64(n)
// Median
sort.Float64s(data)
if n%2 == 1 {
med = data[(n-1)/2]
} else {
med = (data[n/2] + data[n/2-1]) / 2
}
// Percentiles
if p < 0 {
p = 0
}
if p > 100 {
p = 100
}
lq = percentilFloat64(data, p)
uq = percentilFloat64(data, 100-p)
return
}
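The second pass in SixvalFloat64 is a standard one-step correction for floating-point summation error: in exact arithmetic the residual sum is zero, so any nonzero correction recovers precision lost in the first pass. A minimal standalone sketch of the idea (helper name is illustrative):

```go
package main

import "fmt"

// correctedMean computes a first-pass mean, then adds the mean of the
// residuals (v - avg). In exact arithmetic the correction is zero; in
// floating point it recovers precision lost in the first summation.
func correctedMean(data []float64) float64 {
	var sum float64
	for _, v := range data {
		sum += v
	}
	avg := sum / float64(len(data))
	var corr float64
	for _, v := range data {
		corr += v - avg
	}
	return avg + corr/float64(len(data))
}

func main() {
	// a large offset plus small increments is the classic case where
	// naive summation loses low-order bits
	data := []float64{1e8 + 0.1, 1e8 + 0.2, 1e8 + 0.3}
	fmt.Println(correctedMean(data))
}
```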
// HistogramChartUrlInt generates a histogram chart URL via the Google Charts API.
func HistogramChartUrlInt(d []int, title, label string) (url_ string) {
url_ = "http://chart.googleapis.com/chart?cht=bvg&chs=600x300&chxs=0,676767,11.5,0,lt,676767&chxt=x&chdlp=b"
url_ += "&chbh=a&chco=404040&chtt=" + url.QueryEscape(strings.Trim(title, " \t\n"))
url_ += "&chdl=" + url.QueryEscape(strings.Trim(label, " \t\n"))
// Decide on number of bins
min, _, _, _, _, max := SixvalInt(d, 25)
cnt := 10
if len(d) <= 10 {
cnt = 3
} else if len(d) <= 15 {
cnt = 5
} else if len(d) > 40 {
cnt = 15
}
step := float64(max-min) / float64(cnt) // initial estimate
// snap the bin width to 1, 2, 5 or 10 times a power of ten
pow := 0
for step > 10 {
pow++
step /= 10
}
var width int
switch {
case step < 1.5:
width = 1
case step < 3:
width = 2
case step < 8:
width = 5
default:
width = 10
}
for ; pow > 0; pow-- {
width *= 10
}
low := (min / width) * width
high := (max/width + 1) * width
cnt = (high - low) / width
// Binify and scale largest bar to 100
var bin []int = make([]int, cnt)
mc := 0
for _, n := range d {
b := (n - low) / width
if b < 0 {
b = 0
} else if b >= cnt {
b = cnt - 1
}
bin[b] = bin[b] + 1
if bin[b] > mc {
mc = bin[b]
}
}
for i, n := range bin {
bin[i] = 100 * n / mc
}
// Output data to url
url_ += fmt.Sprintf("&chxr=0,%d,%d", low+width/2, high-width/2)
url_ += "&chd=t:"
for i, n := range bin {
if i > 0 {
url_ += ","
}
url_ += fmt.Sprintf("%d", n)
}
return
} | stat/stat.go | 0.544801 | 0.46873 | stat.go | starcoder |
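The bin-width selection above snaps the raw step to 1, 2, 5 or 10 times a power of ten, which keeps histogram bucket boundaries human-readable. A standalone sketch of just that rounding (the helper name is illustrative; it mirrors the logic in HistogramChartUrlInt):

```go
package main

import "fmt"

// niceWidth rounds a raw bin step to 1, 2, 5 or 10 times a power of ten,
// mirroring the bin-width selection in HistogramChartUrlInt.
func niceWidth(step float64) int {
	pow := 0
	for step > 10 {
		pow++
		step /= 10
	}
	var width int
	switch {
	case step < 1.5:
		width = 1
	case step < 3:
		width = 2
	case step < 8:
		width = 5
	default:
		width = 10
	}
	for ; pow > 0; pow-- {
		width *= 10
	}
	return width
}

func main() {
	fmt.Println(niceWidth(8.4)) // falls in the default bucket -> 10
	fmt.Println(niceWidth(42))  // 42 -> 4.2 with pow=1, bucket 5 -> 50
}
```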
package chart
import "math"
var (
// Draw contains helpers for drawing common objects.
Draw = &draw{}
)
type draw struct{}
// LineSeries draws a line series with a renderer.
func (d draw) LineSeries(r Renderer, canvasBox Box, xrange, yrange Range, style Style, vs ValueProvider) {
if vs.Len() == 0 {
return
}
cb := canvasBox.Bottom
cl := canvasBox.Left
v0x, v0y := vs.GetValue(0)
x0 := cl + xrange.Translate(v0x)
y0 := cb - yrange.Translate(v0y)
yv0 := yrange.Translate(0)
var vx, vy float64
var x, y int
fill := style.GetFillColor()
if !fill.IsZero() {
style.GetFillOptions().WriteToRenderer(r)
r.MoveTo(x0, y0)
for i := 1; i < vs.Len(); i++ {
vx, vy = vs.GetValue(i)
x = cl + xrange.Translate(vx)
y = cb - yrange.Translate(vy)
r.LineTo(x, y)
}
r.LineTo(x, Math.MinInt(cb, cb-yv0))
r.LineTo(x0, Math.MinInt(cb, cb-yv0))
r.LineTo(x0, y0)
r.Fill()
}
style.GetStrokeOptions().WriteToRenderer(r)
r.MoveTo(x0, y0)
for i := 1; i < vs.Len(); i++ {
vx, vy = vs.GetValue(i)
x = cl + xrange.Translate(vx)
y = cb - yrange.Translate(vy)
r.LineTo(x, y)
}
r.Stroke()
}
// BoundedSeries draws a series that implements BoundedValueProvider.
func (d draw) BoundedSeries(r Renderer, canvasBox Box, xrange, yrange Range, style Style, bbs BoundedValueProvider, drawOffsetIndexes ...int) {
drawOffsetIndex := 0
if len(drawOffsetIndexes) > 0 {
drawOffsetIndex = drawOffsetIndexes[0]
}
cb := canvasBox.Bottom
cl := canvasBox.Left
v0x, v0y1, v0y2 := bbs.GetBoundedValue(0)
x0 := cl + xrange.Translate(v0x)
y0 := cb - yrange.Translate(v0y1)
var vx, vy1, vy2 float64
var x, y int
xvalues := make([]float64, bbs.Len())
xvalues[0] = v0x
y2values := make([]float64, bbs.Len())
y2values[0] = v0y2
style.GetFillAndStrokeOptions().WriteToRenderer(r)
r.MoveTo(x0, y0)
for i := 1; i < bbs.Len(); i++ {
vx, vy1, vy2 = bbs.GetBoundedValue(i)
xvalues[i] = vx
y2values[i] = vy2
x = cl + xrange.Translate(vx)
y = cb - yrange.Translate(vy1)
if i > drawOffsetIndex {
r.LineTo(x, y)
} else {
r.MoveTo(x, y)
}
}
y = cb - yrange.Translate(vy2)
r.LineTo(x, y)
for i := bbs.Len() - 1; i >= drawOffsetIndex; i-- {
vx, vy2 = xvalues[i], y2values[i]
x = cl + xrange.Translate(vx)
y = cb - yrange.Translate(vy2)
r.LineTo(x, y)
}
r.Close()
r.FillStroke()
}
// HistogramSeries draws a value provider as boxes from 0.
func (d draw) HistogramSeries(r Renderer, canvasBox Box, xrange, yrange Range, style Style, vs ValueProvider, barWidths ...int) {
if vs.Len() == 0 {
return
}
// calculate bar width from the x-range if none was provided
seriesLength := vs.Len()
barWidth := int(math.Floor(float64(xrange.GetDomain()) / float64(seriesLength)))
if len(barWidths) > 0 {
barWidth = barWidths[0]
}
cb := canvasBox.Bottom
cl := canvasBox.Left
// for each datapoint, draw a box
for index := 0; index < seriesLength; index++ {
vx, vy := vs.GetValue(index)
y0 := yrange.Translate(0)
x := cl + xrange.Translate(vx)
y := yrange.Translate(vy)
d.Box(r, Box{
Top: cb - y0,
Left: x - (barWidth >> 1),
Right: x + (barWidth >> 1),
Bottom: cb - y,
}, style)
}
}
// MeasureAnnotation measures how big an annotation would be.
func (d draw) MeasureAnnotation(r Renderer, canvasBox Box, style Style, lx, ly int, label string) Box {
style.WriteToRenderer(r)
defer r.ResetStyle()
textBox := r.MeasureText(label)
textWidth := textBox.Width()
textHeight := textBox.Height()
halfTextHeight := textHeight >> 1
pt := style.Padding.GetTop(DefaultAnnotationPadding.Top)
pl := style.Padding.GetLeft(DefaultAnnotationPadding.Left)
pr := style.Padding.GetRight(DefaultAnnotationPadding.Right)
pb := style.Padding.GetBottom(DefaultAnnotationPadding.Bottom)
strokeWidth := style.GetStrokeWidth()
top := ly - (pt + halfTextHeight)
right := lx + pl + pr + textWidth + DefaultAnnotationDeltaWidth + int(strokeWidth)
bottom := ly + (pb + halfTextHeight)
return Box{
Top: top,
Left: lx,
Right: right,
Bottom: bottom,
}
}
// Annotation draws an annotation with a renderer.
func (d draw) Annotation(r Renderer, canvasBox Box, style Style, lx, ly int, label string) {
style.GetTextOptions().WriteToRenderer(r)
defer r.ResetStyle()
textBox := r.MeasureText(label)
textWidth := textBox.Width()
halfTextHeight := textBox.Height() >> 1
style.GetFillAndStrokeOptions().WriteToRenderer(r)
pt := style.Padding.GetTop(DefaultAnnotationPadding.Top)
pl := style.Padding.GetLeft(DefaultAnnotationPadding.Left)
pr := style.Padding.GetRight(DefaultAnnotationPadding.Right)
pb := style.Padding.GetBottom(DefaultAnnotationPadding.Bottom)
textX := lx + pl + DefaultAnnotationDeltaWidth
textY := ly + halfTextHeight
ltx := lx + DefaultAnnotationDeltaWidth
lty := ly - (pt + halfTextHeight)
rtx := lx + pl + pr + textWidth + DefaultAnnotationDeltaWidth
rty := ly - (pt + halfTextHeight)
rbx := lx + pl + pr + textWidth + DefaultAnnotationDeltaWidth
rby := ly + (pb + halfTextHeight)
lbx := lx + DefaultAnnotationDeltaWidth
lby := ly + (pb + halfTextHeight)
r.MoveTo(lx, ly)
r.LineTo(ltx, lty)
r.LineTo(rtx, rty)
r.LineTo(rbx, rby)
r.LineTo(lbx, lby)
r.LineTo(lx, ly)
r.Close()
r.FillStroke()
style.GetTextOptions().WriteToRenderer(r)
r.Text(label, textX, textY)
}
// Box draws a box with a given style.
func (d draw) Box(r Renderer, b Box, s Style) {
s.GetFillAndStrokeOptions().WriteToRenderer(r)
defer r.ResetStyle()
r.MoveTo(b.Left, b.Top)
r.LineTo(b.Right, b.Top)
r.LineTo(b.Right, b.Bottom)
r.LineTo(b.Left, b.Bottom)
r.LineTo(b.Left, b.Top)
r.FillStroke()
}
func (d draw) BoxRotated(r Renderer, b Box, thetaDegrees float64, s Style) {
d.BoxCorners(r, b.Corners().Rotate(thetaDegrees), s)
}
func (d draw) BoxCorners(r Renderer, bc BoxCorners, s Style) {
s.GetFillAndStrokeOptions().WriteToRenderer(r)
defer r.ResetStyle()
r.MoveTo(bc.TopLeft.X, bc.TopLeft.Y)
r.LineTo(bc.TopRight.X, bc.TopRight.Y)
r.LineTo(bc.BottomRight.X, bc.BottomRight.Y)
r.LineTo(bc.BottomLeft.X, bc.BottomLeft.Y)
r.Close()
r.FillStroke()
}
// Text draws text with a given style.
func (d draw) Text(r Renderer, text string, x, y int, style Style) {
style.GetTextOptions().WriteToRenderer(r)
defer r.ResetStyle()
r.Text(text, x, y)
}
func (d draw) MeasureText(r Renderer, text string, style Style) Box {
style.GetTextOptions().WriteToRenderer(r)
defer r.ResetStyle()
return r.MeasureText(text)
}
// TextWithin draws the text within a given box.
func (d draw) TextWithin(r Renderer, text string, box Box, style Style) {
style.GetTextOptions().WriteToRenderer(r)
defer r.ResetStyle()
lines := Text.WrapFit(r, text, box.Width(), style)
linesBox := Text.MeasureLines(r, lines, style)
y := box.Top
switch style.GetTextVerticalAlign() {
case TextVerticalAlignBottom, TextVerticalAlignBaseline: // TODO: build better baseline handling into MeasureText
y = y - linesBox.Height()
case TextVerticalAlignMiddle, TextVerticalAlignMiddleBaseline:
y = (y - linesBox.Height()) >> 1
}
var tx, ty int
for _, line := range lines {
lineBox := r.MeasureText(line)
switch style.GetTextHorizontalAlign() {
case TextHorizontalAlignCenter:
tx = box.Left + ((box.Width() - lineBox.Width()) >> 1)
case TextHorizontalAlignRight:
tx = box.Right - lineBox.Width()
default:
tx = box.Left
}
if style.TextRotationDegrees == 0 {
ty = y + lineBox.Height()
} else {
ty = y
}
d.Text(r, line, tx, ty, style)
y += lineBox.Height() + style.GetTextLineSpacing()
}
} | vendor/github.com/nicholasjackson/bench/vendor/github.com/wcharczuk/go-chart/draw.go | 0.728845 | 0.448607 | draw.go | starcoder |
package scalars
import (
"strconv"
"github.com/saturn4er/graphql"
"github.com/saturn4er/graphql/language/ast"
"github.com/saturn4er/graphql/language/kinds"
"github.com/EGT-Ukraine/go2gql/api/multipart_file"
)
var GraphQLInt64Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "Int64",
Description: "The `Int64` scalar type represents non-fractional signed whole numeric values. Int64 can represent values between -(2^63) and 2^63 - 1. \n" +
"Can be passed as a string",
Serialize: func(value interface{}) interface{} {
switch val := value.(type) {
case int64:
return val
case *int64:
if val == nil {
return nil
}
return int64(*val)
case int32:
return int64(val)
case *int32:
if val == nil {
return nil
}
return int64(*val)
case int:
return int64(val)
case *int:
if val == nil {
return nil
}
return int64(*val)
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseInt(val, 10, 64)
if err != nil {
return nil
}
return value
case int32:
return int64(val)
case int64:
return val
case float32:
return int64(val)
case float64:
return int64(val)
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseInt(valueAST.GetValue().(string), 10, 64)
if err != nil {
return nil
}
return val
}
return nil
},
})
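The string fallback in ParseValue/ParseLiteral above exists because 64-bit integers overflow the exact-integer range of JSON/JavaScript numbers, so clients send them as decimal strings. A minimal standalone sketch of that parse path (illustrative helper, not the graphql package's API):

```go
package main

import (
	"fmt"
	"strconv"
)

// parseInt64 mirrors the ParseValue branches above: accept a decimal string
// or a numeric type, otherwise report failure with ok=false.
func parseInt64(value interface{}) (int64, bool) {
	switch val := value.(type) {
	case string:
		v, err := strconv.ParseInt(val, 10, 64)
		if err != nil {
			return 0, false
		}
		return v, true
	case int32:
		return int64(val), true
	case int64:
		return val, true
	case float64:
		return int64(val), true
	}
	return 0, false
}

func main() {
	// 2^53 + 1 cannot be represented exactly as a float64, but survives as a string
	v, ok := parseInt64("9007199254740993")
	fmt.Println(v, ok)
}
```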
var GraphQLInt32Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "Int32",
Description: "The `Int32` scalar type represents non-fractional signed whole numeric values. Int32 can represent values between -(2^31) and 2^31 - 1. \n" +
"Can be passed as a string",
Serialize: func(value interface{}) interface{} {
switch val := value.(type) {
case int32:
return val
case *int32:
if val == nil {
return nil
}
return int32(*val)
case int:
return int32(val)
case *int:
if val == nil {
return nil
}
return int32(*val)
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseInt(val, 10, 32)
if err != nil {
return nil
}
return int32(value)
case int32:
return val
case *int32:
if val == nil {
return nil
}
return int32(*val)
case int64:
return int32(val)
case *int64:
if val == nil {
return nil
}
return int32(*val)
case float32:
return int32(val)
case float64:
return int32(val)
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseInt(valueAST.GetValue().(string), 10, 32)
if err != nil {
return nil
}
return int32(val)
}
return nil
},
})
var GraphQLUInt64Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "UInt64",
Description: "The `UInt64` scalar type represents non-fractional unsigned whole numeric values. UInt64 can represent values between 0 and 2^64 - 1.\n" +
"Can be passed as a string",
Serialize: func(value interface{}) interface{} {
switch val := value.(type) {
case uint64:
return val
case *uint64:
if val == nil {
return nil
}
return uint64(*val)
case uint32:
return uint64(val)
case *uint32:
if val == nil {
return nil
}
return uint64(*val)
case uint:
return uint64(val)
case *uint:
if val == nil {
return nil
}
return uint64(*val)
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseUint(val, 10, 64)
if err != nil {
return nil
}
return value
case uint32:
return uint64(val)
case uint64:
return val
case float32:
return uint64(val)
case float64:
return uint64(val)
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseUint(valueAST.GetValue().(string), 10, 64)
if err != nil {
return nil
}
return val
}
return nil
},
})
var GraphQLUInt32Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "UInt32",
Description: "The `UInt32` scalar type represents non-fractional unsigned whole numeric values. UInt32 can represent values between 0 and 2^32 - 1.\n" +
"Can be passed as a string",
Serialize: func(value interface{}) interface{} {
switch val := value.(type) {
case uint32:
return val
case *uint32:
if val == nil {
return nil
}
return uint32(*val)
case uint:
return uint32(val)
case *uint:
if val == nil {
return nil
}
return uint32(*val)
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseUint(val, 10, 32)
if err != nil {
return nil
}
return uint32(value)
case uint32:
return val
case float32:
return uint32(val)
case float64:
return uint32(val)
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseUint(valueAST.GetValue().(string), 10, 32)
if err != nil {
return nil
}
return uint32(val)
}
return nil
},
})
var GraphQLFloat32Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "Float32",
Serialize: func(value interface{}) interface{} {
switch v := value.(type) {
case float32:
return v
case *float32:
if v == nil {
return nil
}
return *v
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseFloat(val, 32)
if err != nil {
return nil
}
return float32(value)
case float32:
return val
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseFloat(valueAST.GetValue().(string), 32)
if err != nil {
return nil
}
return float32(val)
}
return nil
},
})
var GraphQLFloat64Scalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "Float64",
Serialize: func(value interface{}) interface{} {
switch val := value.(type) {
case float32:
return float64(val)
case float64:
return val
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch val := value.(type) {
case string:
value, err := strconv.ParseFloat(val, 64)
if err != nil {
return nil
}
return value
case *string:
if val == nil {
return nil
}
value, err := strconv.ParseFloat(*val, 64)
if err != nil {
return nil
}
return value
case float32:
return float64(val)
case *float32:
if val == nil {
return nil
}
return float64(*val)
case float64:
return val
case *float64:
if val == nil {
return nil
}
return *val
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
switch valueAST.GetKind() {
case kinds.IntValue, kinds.StringValue:
val, err := strconv.ParseFloat(valueAST.GetValue().(string), 64)
if err != nil {
return nil
}
return float64(val)
}
return nil
},
})
var NoDataScalar = graphql.NewScalar(graphql.ScalarConfig{
Name: "NoData",
Description: "The `NoData` scalar type represents no data.",
Serialize: func(value interface{}) interface{} {
return nil
},
ParseValue: func(value interface{}) interface{} {
return 0
},
ParseLiteral: func(valueAST ast.Value) interface{} {
return 0
},
})
var MultipartFile = graphql.NewScalar(graphql.ScalarConfig{
Name: "Upload",
Description: "The `Upload` scalar type represents no data.",
Serialize: func(value interface{}) interface{} {
switch t := value.(type) {
case multipart_file.MultipartFile:
return t.Header.Filename
case *multipart_file.MultipartFile:
return t.Header.Filename
}
return nil
},
ParseValue: func(value interface{}) interface{} {
switch t := value.(type) {
case multipart_file.MultipartFile:
return &t
case *multipart_file.MultipartFile:
return t
}
return nil
},
ParseLiteral: func(valueAST ast.Value) interface{} {
return 0
},
}) | api/scalars/scalars.go | 0.649245 | 0.460532 | scalars.go | starcoder |
package balancedtree
import (
"fmt"
"math"
"github.com/puxin71/DesignPatternInGo/linklist"
"github.com/puxin71/DesignPatternInGo/queue"
)
type Node struct {
Value int
LeftChild *Node
RightChild *Node
}
type binaryTree struct {
root *Node
nodeCount int
height int
}
type BinaryTree interface {
BinarySortedAdd(node *Node)
NodeCount() int
Height() int
PreOrderTraversal() []int
InOrderTraversal() []int
PostOrderTraversal() []int
TraverseDepthFirst(node *Node) []int
Root() *Node
// a binary search over 1 trillion items takes about 40 steps
Get(value int) *Node
// deleting a node from a sorted tree is complex and is not implemented here
}
func NewBinaryTree() BinaryTree {
return &binaryTree{root: nil, nodeCount: 0, height: 0}
}
func (t *binaryTree) BinarySortedAdd(node *Node) {
if t.root == nil {
t.root = node
t.nodeCount++
return
}
t.binarySortedAdd(t.root, node)
}
// binarySortedAdd inserts node in sorted order: left child < parent, right child > parent.
func (t *binaryTree) binarySortedAdd(parent, node *Node) {
if node == nil {
return
}
if node.Value == parent.Value {
return
}
if node.Value < parent.Value {
if parent.LeftChild == nil {
parent.LeftChild = node
t.nodeCount++
fmt.Printf("parent: %d, add left child: %d\n", parent.Value, parent.LeftChild.Value)
} else {
t.binarySortedAdd(parent.LeftChild, node)
}
} else {
if parent.RightChild == nil {
parent.RightChild = node
t.nodeCount++
fmt.Printf("parent: %d, add right child: %d\n", parent.Value, parent.RightChild.Value)
} else {
t.binarySortedAdd(parent.RightChild, node)
}
}
}
func (t *binaryTree) NodeCount() int {
return t.nodeCount
}
// Height returns the height of a perfectly balanced tree with this node count, not the actual height.
func (t *binaryTree) Height() int {
return int(math.Log2(float64(t.nodeCount)))
}
func (t *binaryTree) PreOrderTraversal() []int {
var values []int
if t.root == nil {
return nil
}
return t.traversePreOrder(t.root, values)
}
// the algorithm processes a node, and then its left child, and then its right child
func (t *binaryTree) traversePreOrder(node *Node, values []int) []int {
values = append(values, node.Value)
if node.LeftChild != nil {
values = t.traversePreOrder(node.LeftChild, values)
}
if node.RightChild != nil {
values = t.traversePreOrder(node.RightChild, values)
}
return values
}
func (t *binaryTree) InOrderTraversal() []int {
var values []int
if t.root == nil {
return nil
}
values = t.traverseInOrder(t.root, values)
return values
}
// in-order processes the left child, the node, then the right child
func (t *binaryTree) traverseInOrder(node *Node, values []int) []int {
if node.LeftChild != nil {
values = t.traverseInOrder(node.LeftChild, values)
}
values = append(values, node.Value)
if node.RightChild != nil {
values = t.traverseInOrder(node.RightChild, values)
}
return values
}
// this post-order variant visits the right child, the left child, then the node (a mirrored post-order)
func (t *binaryTree) PostOrderTraversal() []int {
var values []int
if t.root == nil {
return nil
}
values = t.traversePostOrder(t.root, values)
return values
}
func (t *binaryTree) traversePostOrder(node *Node, values []int) []int {
if node.RightChild != nil {
values = t.traversePostOrder(node.RightChild, values)
}
if node.LeftChild != nil {
values = t.traversePostOrder(node.LeftChild, values)
}
values = append(values, node.Value)
return values
}
// TraverseDepthFirst is, despite its name, a breadth-first (level-order)
// traversal: nodes come off a FIFO queue, so each level is fully visited
// before the next; a stack would be needed for true depth-first order.
func (t *binaryTree) TraverseDepthFirst(node *Node) []int {
var values []int
if node == nil {
return nil
}
myQueue := queue.NewQueue()
myQueue.Enqueue(t.createCell(node))
for !myQueue.IsEmpty() {
cell := myQueue.Dequeue()
currNode := cell.Value.(Node)
values = append(values, currNode.Value)
if currNode.LeftChild != nil {
myQueue.Enqueue(t.createCell(currNode.LeftChild))
}
if currNode.RightChild != nil {
myQueue.Enqueue(t.createCell(currNode.RightChild))
}
}
return values
}
func (t *binaryTree) createCell(node *Node) *linklist.DoubleLinkCell {
if node == nil {
return nil
}
return &linklist.DoubleLinkCell{Value: *node, Prev: nil, Next: nil}
}
func (t *binaryTree) Root() *Node {
return t.root
}
func (t *binaryTree) Get(value int) *Node {
if t.root == nil {
return nil
}
return t.get(t.root, value)
}
func (t *binaryTree) get(parent *Node, value int) *Node {
if parent.Value == value {
return parent
}
if value < parent.Value {
if parent.LeftChild == nil {
return nil
} else {
return t.get(parent.LeftChild, value)
}
}
if parent.RightChild == nil {
return nil
}
return t.get(parent.RightChild, value)
} | balancedtree/tree.go | 0.636918 | 0.416085 | tree.go | starcoder |
package types
import "strconv"
// Analysis contains analysis files, file settings, analysis settings and the column mapping.
type Analysis struct {
Columns map[string]string
Data []map[string]string
Settings Settings
}
// CircHeatmap is a circular heatmap plot
type CircHeatmap struct {
Name string `json:"name"`
Readouts []CircHeatmapReadout `json:"readouts"`
}
// CircHeatmapReadout contains data on a readout for a circular heatmap
type CircHeatmapReadout struct {
Known bool `json:"known"`
Label string `json:"label"`
Segments map[string]RoundedSegment `json:"segments"`
}
// CircHeatmapLegend is a slice of legend elements
type CircHeatmapLegend []CircHeatmapLegendElement
// CircHeatmapLegendElement contains settings for each metric in the circheatmap.
type CircHeatmapLegendElement struct {
Attribute string `json:"attribute"`
Color string `json:"color"`
Filter float64 `json:"filter"`
Max float64 `json:"max"`
Min float64 `json:"min"`
}
// Matrices holds input data formatted as matrices.
type Matrices struct {
Abundance, Ratio, Score [][]float64
Conditions, Readouts []string
}
// RoundedSegment is a circheatmap segment value rounded to a precision of 2 when marshaled
type RoundedSegment float64
// MarshalJSON rounds segment values to a precision of 2 when exporting to JSON
func (r RoundedSegment) MarshalJSON() ([]byte, error) {
return []byte(strconv.FormatFloat(float64(r), 'f', 2, 64)), nil
}
// ScatterAxesLabels are labels for the x and y axis.
type ScatterAxesLabels struct {
X string `json:"x"`
Y string `json:"y"`
}
// ScatterPlot contains data for a scatter plot
type ScatterPlot struct {
Labels ScatterAxesLabels `json:"labels"`
Name string `json:"name"`
Points []ScatterPoint `json:"points"`
}
// ScatterPoint contains data for a point on a scatter plot
type ScatterPoint struct {
Color string `json:"color"`
Label string `json:"label"`
X float64 `json:"x"`
Y float64 `json:"y"`
}
// Settings contains tool-specific analysis settings.
type Settings struct {
// Shared settings
Abundance string
AbundanceCap float64
AbundanceType string
AutomaticallySetFill bool
Condition string
Control string
Files []string
FillColor string
FillMax float64
FillMin float64
LogBase string
MinAbundance float64
MinConditions int
MockConditionAbundance bool
Normalization string
NormalizationReadout string
ParsimoniousReadoutFiltering bool
Png bool
PrimaryFilter float64
Readout string
ReadoutLength string
Score string
ScoreType string
SecondaryFilter float64
Type string
// condition-condition
ConditionX string
ConditionY string
// correlation
ConditionAbundanceFilter float64
ConditionScoreFilter float64
Correlation string
CytoscapeCutoff float64
IgnoreSourceTargetMatches bool
ReadoutAbundanceFilter float64
ReadoutScoreFilter float64
UseReplicates bool
// dotplot
BiclusteringApprox bool
Clustering string
ClusteringMethod string
ClusteringOptimize bool
ConditionClustering string
ConditionList []string
Distance string
EdgeColor string
InvertColor bool
RatioDimension string
ReadoutClustering string
ReadoutList []string
XLabel string
YLabel string
WriteDistance bool
WriteHeatmap bool
// scv
AbundanceFilterColumn string
ConditionIDType string
ConditionMapColumn string
ConditionMapFile string
GeneFile string
Known string
KnownFile string
OtherAbundance []string
ProteinExpressionFile string
ProteinTissues []string
ReadoutIDType string
ReadoutMapColumn string
ReadoutMapFile string
RnaExpressionFile string
RnaTissues []string
Specificity bool
VerticalHeatmap bool
// specificity
SpecificityMetric string
} | pkg/types/settings.go | 0.725649 | 0.468487 | settings.go | starcoder |
package republique
import (
"fmt"
"math"
)
// Waypoint data for each segment of a move order
type Waypoint struct {
X int32
Y int32
X2 int32
Y2 int32
Speed float64
Elapsed float64
Turns int
Going string
Distance float64
Path string
Prep bool
}
// GetValue gets the grid contents at x,y, or ' ' when x,y is out of bounds
func (m *MapData) GetValue(x, y int32) byte {
	if x < 0 || x >= m.X || y < 0 || y >= m.Y {
		return ' '
	}
	return m.Data[x+y*m.X]
}
// GetWaypoints gets a slice of waypoints for a given command on this map
func (m *MapData) GetWaypoints(command *Command) []Waypoint {
waypoints := []Waypoint{}
x := command.GetGameState().GetGrid().GetX()
y := command.GetGameState().GetGrid().GetY()
switch command.GetGameState().GetOrders() {
case Order_RESTAGE, Order_NO_ORDERS, Order_RALLY:
return waypoints
case Order_DEFEND:
waypoint := Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 5.0,
Turns: 0,
Going: "ready",
Distance: 0.0,
Path: "Deployed, Defensive Formation",
}
for _, v := range command.GetUnits() {
if v.Arm == Arm_ARTILLERY && !v.GetGameState().GetGunsDeployed() {
waypoint.Turns = 1
waypoint.Going = "deploying"
waypoint.Path = "-> Deploy guns to fire"
waypoint.Prep = true
waypoint.Elapsed = 10.0
break
}
}
case Order_FIRE:
// do they need to deploy ?
waypoint := Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 5.0,
Turns: 0,
Going: "ready",
Distance: 0.0,
Path: "Deployed, Ready to fire",
}
for _, v := range command.GetUnits() {
if v.Arm == Arm_ARTILLERY && !v.GetGameState().GetGunsDeployed() {
waypoint.Turns = 1
waypoint.Going = "deploying"
waypoint.Path = "-> Deploy for bombardment (half a turn)"
waypoint.Prep = true
waypoint.Elapsed = 10.0
break
}
}
return append(waypoints, waypoint)
case Order_ENGAGE:
switch command.GetGameState().GetFormation() {
case Formation_MARCH_COLUMN, Formation_RESERVE:
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 20.0,
Turns: 1,
Going: "preparing",
Distance: 0.0,
Path: "-> Form line of Battle (about 1 turn)",
Prep: true,
})
}
case Order_MARCH:
switch command.Arm {
case Arm_ARTILLERY:
for _, v := range command.GetUnits() {
if v.Arm == Arm_ARTILLERY && v.GetGameState().GetGunsDeployed() {
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 10.0,
Turns: 1,
Going: "limbering",
Distance: 0.0,
Path: "-> Limber ready to move (half a turn)",
Prep: true,
})
break
}
}
case Arm_INFANTRY:
if command.GetRank() != Rank_CORPS && command.GetRank() != Rank_ARMY {
switch command.GetGameState().GetFormation() {
case Formation_MARCH_COLUMN, Formation_COLUMN:
default:
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 10.0,
Turns: 1,
Going: "prepare to march",
Distance: 0.0,
Path: "-> Form march columns (about 1 turn)",
Prep: true,
})
}
}
}
}
for k, v := range command.GetGameState().GetObjective() {
if k > 0 {
distance := math.Sqrt(float64((v.X-x)*(v.X-x) + (v.Y-y)*(v.Y-y)))
speed := 1.0
switch {
case command.Rank == Rank_CORPS, command.Rank == Rank_ARMY:
speed = 2.0
case command.Arm == Arm_CAVALRY:
speed = 1.5
case command.Arm == Arm_INFANTRY && command.GetGameState().GetOrders() == Order_MARCH:
speed = 1.4
}
fromGrid := m.GetValue(x-1, y-1)
switch fromGrid {
case 't', 'w':
speed *= 0.8
case 'T', 'W', 'h', 'r':
speed *= 0.6
case 'H':
speed *= 0.5
}
toGrid := m.GetValue(v.X-1, v.Y-1)
switch toGrid {
case 't', 'w':
speed *= 0.7
case 'T', 'W', 'h':
speed *= 0.5
case 'H', 'r':
speed *= 0.4
}
if command.GetArm() == Arm_INFANTRY {
switch command.GetGameState().GetFormation() {
case Formation_MARCH_COLUMN:
speed *= 1.0
case Formation_LINE:
switch command.GetDrill() {
case Drill_LINEAR:
speed *= 0.5
case Drill_MASSED:
speed *= 0.6
case Drill_RAPID:
speed *= 0.7
}
default:
switch command.GetDrill() {
case Drill_LINEAR:
speed *= 0.7
case Drill_MASSED:
speed *= 0.8
case Drill_RAPID:
speed *= 0.9
}
}
}
going := "at a good march"
if command.Arm == Arm_CAVALRY {
if command.GetGameState().GetOrders() == Order_CHARGE {
speed *= 1.5
}
going = "at the trot"
switch {
case speed >= 2.0:
going = "at the gallop"
case speed >= 1.5:
going = "at the trot"
case speed <= 0.4:
going = "very slow going"
case speed <= 0.5:
going = "difficult going"
case speed <= 0.6:
going = "at a slow walk"
				case speed <= 0.8:
					going = "at the walk"
}
} else {
switch {
case speed >= 1.5:
going = "with great speed"
case speed <= 0.4:
going = "very slow"
case speed <= 0.5:
going = "harsh going"
case speed <= 0.6:
going = "slow going"
case speed <= 0.7:
going = "with some delays"
case speed <= 0.8:
going = "with minor delays"
}
}
elapsed := ((distance / (speed * 3.0)) * 60.0) // 20mins to the mile in good order
turns := int(elapsed+10.0) / 20
turnss := ""
if turns != 1 {
turnss = "s"
}
path := fmt.Sprintf("-> %d,%d (%0.1f miles %s, about %d turn%s)", v.X, v.Y, distance, going, turns, turnss)
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: v.X,
Y2: v.Y,
Distance: distance,
Going: going,
Turns: turns,
Elapsed: elapsed,
Speed: speed,
Path: path,
})
x = v.X
y = v.Y
}
}
// then add a deploy option at the end if they are in march order
switch command.GetGameState().GetOrders() {
case Order_MARCH:
switch command.Arm {
case Arm_ARTILLERY:
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 10.0,
Turns: 1,
Going: "deploying",
Distance: 0.0,
Path: "-> Unlimber, prepare for action (half a turn)",
Prep: true,
})
case Arm_INFANTRY:
waypoints = append(waypoints, Waypoint{
X: x,
Y: y,
X2: x,
Y2: y,
Speed: 1.0,
Elapsed: 10.0,
Turns: 1,
Going: "deploying",
Distance: 0.0,
Path: "-> Deploy back to " + upString(command.GetGameState().GetFormation().String()) + " (about 1 turn)",
Prep: true,
})
}
}
return waypoints
} | proto/map.go | 0.543106 | 0.415314 | map.go | starcoder |
package gcm
import (
"encoding/binary"
)
// gcmFieldElement represents a value in GF(2¹²⁸). In order to reflect the GCM
// standard and make binary.BigEndian suitable for marshaling these values, the
// bits are stored in big endian order. For example:
// the coefficient of x⁰ can be obtained by v.low >> 63.
// the coefficient of x⁶³ can be obtained by v.low & 1.
// the coefficient of x⁶⁴ can be obtained by v.high >> 63.
// the coefficient of x¹²⁷ can be obtained by v.high & 1.
type gcmFieldElement struct {
low, high uint64
}
func (z *gcmFieldElement) setBytes(p []byte) {
z.low = binary.BigEndian.Uint64(p[:8])
z.high = binary.BigEndian.Uint64(p[8:])
}
func (x gcmFieldElement) marshal() []byte {
out := make([]byte, 16)
binary.BigEndian.PutUint64(out, x.low)
binary.BigEndian.PutUint64(out[8:], x.high)
return out
}
// gcm represents a Galois Counter Mode with a specific key. See
// https://csrc.nist.gov/groups/ST/toolkit/BCM/documents/proposedmodes/gcm/gcm-revised-spec.pdf
type gcm struct {
y gcmFieldElement
// productTable contains the first sixteen powers of the key, H.
// However, they are in bit reversed order.
productTable [16]gcmFieldElement
}
func New(key []byte) *gcm {
// We precompute 16 multiples of |key|. However, when we do lookups
// into this table we'll be using bits from a field element and
// therefore the bits will be in the reverse order. So normally one
// would expect, say, 4*key to be in index 4 of the table but due to
// this bit ordering it will actually be in index 0010 (base 2) = 2.
var x gcmFieldElement
x.setBytes(key)
var g gcm
g.productTable[reverseBits(1)] = x
for i := 2; i < 16; i += 2 {
g.productTable[reverseBits(i)] = gcmDouble(g.productTable[reverseBits(i/2)])
g.productTable[reverseBits(i+1)] = gcmAdd(g.productTable[reverseBits(i)], x)
}
return &g
}
// reverseBits reverses the order of the bits of 4-bit number in i.
func reverseBits(i int) int {
i = ((i << 2) & 0xc) | ((i >> 2) & 0x3)
i = ((i << 1) & 0xa) | ((i >> 1) & 0x5)
return i
}
// gcmAdd adds two elements of GF(2¹²⁸) and returns the sum.
func gcmAdd(x, y gcmFieldElement) gcmFieldElement {
// Addition in a characteristic 2 field is just XOR.
return gcmFieldElement{x.low ^ y.low, x.high ^ y.high}
}
func Mulx(s []byte) []byte {
var x gcmFieldElement
x.setBytes(s)
return gcmDouble(x).marshal()
}
// gcmDouble returns the result of doubling an element of GF(2¹²⁸).
func gcmDouble(x gcmFieldElement) (double gcmFieldElement) {
msbSet := x.high&1 == 1
// Because of the bit-ordering, doubling is actually a right shift.
double.high = x.high >> 1
double.high |= x.low << 63
double.low = x.low >> 1
// If the most-significant bit was set before shifting then it,
// conceptually, becomes a term of x^128. This is greater than the
// irreducible polynomial so the result has to be reduced. The
// irreducible polynomial is 1+x+x^2+x^7+x^128. We can subtract that to
// eliminate the term at x^128 which also means subtracting the other
// four terms. In characteristic 2 fields, subtraction == addition ==
// XOR.
if msbSet {
double.low ^= 0xe100000000000000
}
return
}
var gcmReductionTable = []uint16{
0x0000, 0x1c20, 0x3840, 0x2460, 0x7080, 0x6ca0, 0x48c0, 0x54e0,
0xe100, 0xfd20, 0xd940, 0xc560, 0x9180, 0x8da0, 0xa9c0, 0xb5e0,
}
// mul returns y*H, where H is the GCM key.
var z gcmFieldElement
for i := 0; i < 2; i++ {
word := y.high
if i == 1 {
word = y.low
}
// Multiplication works by multiplying z by 16 and adding in
// one of the precomputed multiples of H.
for j := 0; j < 64; j += 4 {
msw := z.high & 0xf
z.high >>= 4
z.high |= z.low << 60
z.low >>= 4
z.low ^= uint64(gcmReductionTable[msw]) << 48
// the values in |table| are ordered for
// little-endian bit positions.
t := g.productTable[word&0xf]
z.low ^= t.low
z.high ^= t.high
word >>= 4
}
}
return z
}
const (
gcmBlockSize = 16
)
// UpdateBlocks extends y with more polynomial terms from blocks, based on
// Horner's rule. There must be a multiple of gcmBlockSize bytes in blocks.
func (g *gcm) UpdateBlocks(blocks []byte) {
	if len(blocks)%gcmBlockSize != 0 {
panic("invalid block size")
}
for len(blocks) > 0 {
g.y.low ^= binary.BigEndian.Uint64(blocks)
g.y.high ^= binary.BigEndian.Uint64(blocks[8:])
g.y = g.mul(g.y)
blocks = blocks[gcmBlockSize:]
}
}
// Sum appends the current GHASH value to b and returns the result.
func (g *gcm) Sum(b []byte) []byte {
	return append(b, g.y.marshal()...)
} | internal/gcm/gcm.go | 0.753467 | 0.439086 | gcm.go | starcoder |
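The bit-reversed table indexing is the subtlest part of the construction above: multiples of H are stored at the 4-bit-reversed index, so 2·H lives in slot `reverseBits(2)`. This standalone snippet reproduces `reverseBits` and prints the full 4-bit mapping:

```go
package main

import "fmt"

// reverseBits mirrors the 4-bit reversal used to index the product table:
// first swap the two 2-bit halves, then swap the bits within each half.
func reverseBits(i int) int {
	i = ((i << 2) & 0xc) | ((i >> 2) & 0x3)
	i = ((i << 1) & 0xa) | ((i >> 1) & 0x5)
	return i
}

func main() {
	for i := 0; i < 16; i++ {
		fmt.Printf("%04b -> %04b\n", i, reverseBits(i))
	}
}
```

So, for example, index 1 (0001) lands at slot 8 (1000), while palindromic patterns such as 0110 map to themselves.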
package dslengine
import "fmt"
type (
// Definition is the common interface implemented by all definitions.
Definition interface {
// Context is used to build error messages that refer to the definition.
Context() string
}
// DefinitionSet contains DSL definitions that are executed as one unit.
	// The slice elements may implement the Validate and Source interfaces to enable the
// corresponding behaviors during DSL execution.
DefinitionSet []Definition
// Root is the interface implemented by the DSL root objects.
// These objects contains all the definition sets created by the DSL and can
// be passed to the dsl engine for execution.
Root interface {
// DSLName is displayed by the runner upon executing the DSL.
// Registered DSL roots must have unique names.
DSLName() string
// DependsOn returns the list of other DSL roots this root depends on.
// The DSL engine uses this function to order execution of the DSLs.
DependsOn() []Root
		// IterateSets implements the visitor pattern: it is called by the engine so the
// DSL can control the order of execution. IterateSets calls back the engine via
// the given iterator as many times as needed providing the DSL definitions that
// must be run for each callback.
IterateSets(SetIterator)
// Reset restores the root to pre DSL execution state.
// This is mainly used by tests.
Reset()
}
// Validate is the interface implemented by definitions that can be validated.
	// Validation is done by the DSL engine post execution.
Validate interface {
Definition
// Validate returns nil if the definition contains no validation error.
// The Validate implementation may take advantage of ValidationErrors to report
// more than one errors at a time.
Validate() error
}
// Source is the interface implemented by definitions that can be initialized via DSL.
Source interface {
Definition
// DSL returns the DSL used to initialize the definition if any.
DSL() func()
}
// Finalize is the interface implemented by definitions that require an additional pass
// after the DSL has executed (e.g. to merge generated definitions or initialize default
// values)
Finalize interface {
Definition
// Finalize is run by the DSL runner once the definition DSL has executed and the
// definition has been validated.
Finalize()
}
// SetIterator is the function signature used to iterate over definition sets with
// IterateSets.
SetIterator func(s DefinitionSet) error
// MetadataDefinition is a set of key/value pairs
MetadataDefinition map[string][]string
// TraitDefinition defines a set of reusable properties.
TraitDefinition struct {
// Trait name
Name string
// Trait DSL
DSLFunc func()
}
// ValidationDefinition contains validation rules for an attribute.
ValidationDefinition struct {
// Values represents an enum validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor76.
Values []interface{}
// Format represents a format validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor104.
Format string
// PatternValidationDefinition represents a pattern validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor33
Pattern string
// Minimum represents an minimum value validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor21.
Minimum *float64
// Maximum represents a maximum value validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor17.
Maximum *float64
// MinLength represents an minimum length validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor29.
MinLength *int
// MaxLength represents an maximum length validation as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor26.
MaxLength *int
// Required list the required fields of object attributes as described at
// http://json-schema.org/latest/json-schema-validation.html#anchor61.
Required []string
}
)
// Context returns the generic definition name used in error messages.
func (t *TraitDefinition) Context() string {
if t.Name != "" {
return fmt.Sprintf("trait %#v", t.Name)
}
return "unnamed trait"
}
// DSL returns the initialization DSL.
func (t *TraitDefinition) DSL() func() {
return t.DSLFunc
}
// Context returns the generic definition name used in error messages.
func (v *ValidationDefinition) Context() string {
return "validation"
}
// Merge merges other into v.
func (v *ValidationDefinition) Merge(other *ValidationDefinition) {
if v.Values == nil {
v.Values = other.Values
}
if v.Format == "" {
v.Format = other.Format
}
if v.Pattern == "" {
v.Pattern = other.Pattern
}
if v.Minimum == nil || (other.Minimum != nil && *v.Minimum > *other.Minimum) {
v.Minimum = other.Minimum
}
if v.Maximum == nil || (other.Maximum != nil && *v.Maximum < *other.Maximum) {
v.Maximum = other.Maximum
}
if v.MinLength == nil || (other.MinLength != nil && *v.MinLength > *other.MinLength) {
v.MinLength = other.MinLength
}
if v.MaxLength == nil || (other.MaxLength != nil && *v.MaxLength < *other.MaxLength) {
v.MaxLength = other.MaxLength
}
v.AddRequired(other.Required)
}
// AddRequired merges the required fields from other into v
func (v *ValidationDefinition) AddRequired(required []string) {
for _, r := range required {
found := false
for _, rr := range v.Required {
if r == rr {
found = true
break
}
}
if !found {
v.Required = append(v.Required, r)
}
}
}
// Dup makes a shallow dup of the validation.
func (v *ValidationDefinition) Dup() *ValidationDefinition {
return &ValidationDefinition{
Values: v.Values,
Format: v.Format,
Pattern: v.Pattern,
Minimum: v.Minimum,
Maximum: v.Maximum,
MinLength: v.MinLength,
MaxLength: v.MaxLength,
Required: v.Required,
}
} | dslengine/definitions.go | 0.726037 | 0.475849 | definitions.go | starcoder |
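`ValidationDefinition.Merge` keeps the smaller `Minimum` and the larger `Maximum`, so merged numeric bounds are always the most permissive combination of the two. A trimmed-down sketch of just that pointer logic (the `validation` type here is a hypothetical two-field stand-in for illustration):

```go
package main

import "fmt"

// validation is a hypothetical stand-in for ValidationDefinition, keeping
// only the numeric bounds.
type validation struct {
	Min, Max *float64
}

func f64(v float64) *float64 { return &v }

// merge mirrors the Minimum/Maximum rules in ValidationDefinition.Merge:
// keep the smaller minimum and the larger maximum, i.e. widen the range.
func (v *validation) merge(o *validation) {
	if v.Min == nil || (o.Min != nil && *v.Min > *o.Min) {
		v.Min = o.Min
	}
	if v.Max == nil || (o.Max != nil && *v.Max < *o.Max) {
		v.Max = o.Max
	}
}

func main() {
	v := &validation{Min: f64(2), Max: f64(5)}
	v.merge(&validation{Min: f64(1), Max: f64(10)})
	fmt.Println(*v.Min, *v.Max) // the merged range is the widest of the two
}
```

A nil bound is treated as "unset", so merging with a definition that carries a bound always fills the gap.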
package bullet3
// #cgo CFLAGS: -I"./bullet3/lib/include/bullet" -I"./bullet3/lib/include/bullet_robotics" -I"./bullet3/lib/include/bullet" -I"./bullet3/lib/include"
// #cgo LDFLAGS: -L./bullet3/lib/lib -lBulletRobotics -lBulletInverseDynamics -lBulletInverseDynamicsUtils -lBulletFileLoader -lBulletWorldImporter -lBulletSoftBody -lBulletDynamics -lBulletCollision -lLinearMath -lBullet3Common -lm -lstdc++ -ldl
// #include "bullet.h"
// #include "PhysicsDirectC_API.h"
import "C"
import "github.com/go-gl/mathgl/mgl64"
type Param func(w *World)
func Gravity(g mgl64.Vec3) Param {
return func(w *World) {
w.SetGravity(g)
}
}
func TimeStep(step float64) Param {
return func(w *World) {
w.SetTimeStep(step)
}
}
type World struct {
handle C.b3PhysicsClientHandle
}
func NewWorld(params ...Param) *World {
w := &World{C.b3ConnectPhysicsDirect()}
for _, p := range params {
p(w)
}
return w
}
func (w *World) Destroy() {
C.b3DisconnectSharedMemory(w.handle)
}
func (w *World) SetTimeStep(step float64) {
C.b3SetTimeStep(w.handle, C.double(step))
}
func (w *World) SetGravity(g mgl64.Vec3) {
C.b3SetGravity(w.handle, C.double(g[0]), C.double(g[1]), C.double(g[2]))
}
func (w *World) ResetSimulation() {
C.b3ResetSimulation(w.handle)
}
func (w *World) Step() {
C.b3StepSimulation(w.handle)
}
func (w *World) CreateRigidBody(pos mgl64.Vec3, ori mgl64.Quat, halfExtents mgl64.Vec3, mass float64, shape CollisionShape) *RigidBody {
o := []float64{ori.V[0], ori.V[1], ori.V[2], ori.W}
id := C.b3CreateBody(w.handle, (*C.double)(&pos[0]), (*C.double)(&o[0]), (*C.double)(&halfExtents[0]), C.double(mass), C.int(shape.ShapeType()))
return &RigidBody{
id: int(id),
mass: mass,
collisionShape: shape,
world: w.handle,
}
}
type CollisionShape interface {
ShapeType() int
}
type RigidBody struct {
id int
mass float64
collisionShape CollisionShape
world C.b3PhysicsClientHandle
}
func (r *RigidBody) ID() int {
return r.id
}
func (r *RigidBody) Mass() float64 {
return r.mass
}
func (r *RigidBody) Position() mgl64.Vec3 {
var v mgl64.Vec3
C.b3GetBodyPosition(r.world, C.int(r.id), (*C.double)(&v[0]))
return v
}
func (r *RigidBody) Orientation() mgl64.Quat {
var v [4]float64
C.b3GetBodyOrientation(r.world, C.int(r.id), (*C.double)(&v[0]))
return mgl64.Quat{
V: mgl64.Vec3{v[0], v[1], v[2]},
W: v[3],
}
}
func (r *RigidBody) SetMass(mass float64) {
C.b3SetBodyMass(r.world, C.int(r.id), C.double(mass))
r.mass = mass
}
func (r *RigidBody) ApplyImpulse(impulse, worldPosition mgl64.Vec3) {
	C.b3ApplyImpulse(r.world, C.int(r.id), (*C.double)(&impulse[0]), (*C.double)(&worldPosition[0]))
} | bullet.go | 0.608129 | 0.420064 | bullet.go | starcoder |
package merger
import (
"fmt"
"reflect"
)
// Merge merges a and b together where b has precedence.
// It is not destructive on its parameters, but they must not be modified while
// the merge is in progress.
func Merge(a interface{}, b interface{}) (interface{}, error) {
aKind := reflect.ValueOf(a).Kind()
bKind := reflect.ValueOf(b).Kind()
if kindContains(overwriteables, aKind) && kindContains(overwriteables, bKind) {
// Is overwriteable
return b, nil
} else if kindContains(mergeables, aKind) && kindContains(mergeables, bKind) {
// Is mergeable
if aKind != bKind {
return nil, &MergeError{
errType: ErrDiffKind,
errString: fmt.Sprintf(errDiffKindText, aKind, bKind),
}
}
if aKind == reflect.Array {
return arrayMerge(a, b)
} else if aKind == reflect.Slice {
return sliceMerge(a, b)
} else if aKind == reflect.Struct {
return structMerge(a, b)
} else if aKind == reflect.Map {
return mapMerge(a, b)
}
}
return nil, &MergeError{
errType: ErrMergeUnsupported,
errString: fmt.Sprintf(errMergeUnsupportedText, aKind, bKind),
}
}
// Field order is taken from a, additional fields of b are appended.
// No type assertion is made
// BUG(djboris): When merging structs fields are appended (A.Fields ∥ B.Fields).
// But field types should be taken from b as it has precedence.
func structMerge(a, b interface{}) (interface{}, error) {
aV := reflect.ValueOf(a)
bV := reflect.ValueOf(b)
var commonFields []reflect.StructField
// Add fields of A to commonFields
for i := 0; i < aV.NumField(); i++ {
x := aV.Type().Field(i)
commonFields = append(commonFields, x)
}
// Add fields of B to commonFields
for i := 0; i < bV.NumField(); i++ {
x := bV.Type().Field(i)
if !structFieldContains(commonFields, x.Name) {
commonFields = append(commonFields, x)
} else {
// TODO Overwrite type, according to BUG for this function
}
}
// Construct output struct
resType := reflect.StructOf(commonFields)
resValue := reflect.New(resType).Elem()
for i := range commonFields {
fieldName := commonFields[i].Name
fieldA := aV.FieldByName(fieldName)
fieldB := bV.FieldByName(fieldName)
if fieldA.IsValid() && fieldB.IsValid() {
// Field exists in A and B, do a merge
m, err := Merge(fieldA.Interface(), fieldB.Interface())
if err != nil {
return nil, err
}
resValue.FieldByName(fieldName).Set(reflect.ValueOf(m))
} else if fieldA.IsValid() {
// Field exists in A
resValue.FieldByName(fieldName).Set(fieldA)
} else if fieldB.IsValid() {
// Field exists in B
resValue.FieldByName(fieldName).Set(fieldB)
} else {
return nil, &MergeError{
errType: ErrInvalidFields,
errString: fmt.Sprintf(errInvalidFieldsText, fieldName),
}
}
}
return resValue.Interface(), nil
}
// arrayMerge returns effectively a slice
// Result: a ∥ b
// Type assertion only on Elem type
func arrayMerge(a, b interface{}) (interface{}, error) {
aV := reflect.ValueOf(a)
bV := reflect.ValueOf(b)
if aV.Type().Elem() != bV.Type().Elem() {
return nil, &MergeError{
errType: ErrDiffArrayTypes,
errString: fmt.Sprintf(errDiffArrayTypesText, aV.Type().Elem(), bV.Type().Elem()),
}
}
resType := reflect.ArrayOf(aV.Len()+bV.Len(), aV.Type().Elem())
resValue := reflect.New(resType).Elem()
for i := 0; i < aV.Len(); i++ {
resValue.Index(i).Set(aV.Index(i))
}
for i := 0; i < bV.Len(); i++ {
resValue.Index(aV.Len() + i).Set(bV.Index(i))
}
return resValue.Interface(), nil
}
// Result: a ∪ b
// Type assertion on indexing and value type
func mapMerge(a, b interface{}) (interface{}, error) {
aV := reflect.ValueOf(a)
bV := reflect.ValueOf(b)
if aV.Type().Key() != bV.Type().Key() {
return nil, &MergeError{
errType: ErrDiffMapKeyTypes,
errString: fmt.Sprintf(errDiffMapKeyTypesText, aV.Type().Key(), bV.Type().Key()),
}
}
if aV.Type().Elem() != bV.Type().Elem() {
return nil, &MergeError{
errType: ErrDiffMapValueTypes,
errString: fmt.Sprintf(errDiffMapValueTypesText, aV.Type().Elem(), bV.Type().Elem()),
}
}
res := reflect.MakeMap(aV.Type())
// Copy everything from a
for _, k := range aV.MapKeys() {
res.SetMapIndex(k, aV.MapIndex(k))
}
for _, k := range bV.MapKeys() {
if res.MapIndex(k).Kind() == reflect.Invalid {
// If key was not in a already, add it from b
res.SetMapIndex(k, bV.MapIndex(k))
} else {
// Need to merge both values
resVal := res.MapIndex(k).Interface()
bVal := bV.MapIndex(k).Interface()
m, err := Merge(resVal, bVal)
if err != nil {
return nil, err
}
res.SetMapIndex(k, reflect.ValueOf(m))
}
}
return res.Interface(), nil
}
// sliceMerge concatenates a and b into a fresh slice so that neither
// argument's backing array is modified (appending straight onto aV could
// reuse a's spare capacity and overwrite its elements).
// Type assertion only on Elem type
func sliceMerge(a, b interface{}) (interface{}, error) {
	aV := reflect.ValueOf(a)
	bV := reflect.ValueOf(b)
	if aV.Type().Elem() != bV.Type().Elem() {
		return nil, &MergeError{
			errType: ErrDiffSliceTypes,
			errString: fmt.Sprintf(errDiffSliceTypesText, aV.Type().Elem(), bV.Type().Elem()),
		}
	}
	res := reflect.MakeSlice(aV.Type(), 0, aV.Len()+bV.Len())
	res = reflect.AppendSlice(res, aV)
	return reflect.AppendSlice(res, bV).Interface(), nil
}
// Boolean, numeric and string types (overwrite)
var overwriteables = []reflect.Kind{
reflect.Bool,
reflect.Int,
reflect.Int8,
reflect.Int16,
reflect.Int32,
reflect.Int64,
reflect.Uint,
reflect.Uint8,
reflect.Uint16,
reflect.Uint32,
reflect.Uint64,
reflect.Uintptr,
reflect.Float32,
reflect.Float64,
reflect.Complex64,
reflect.Complex128,
reflect.String,
}
// Array and slice types (concat) + map and struct types (union)
var mergeables = []reflect.Kind{
reflect.Array,
reflect.Map,
reflect.Slice,
reflect.Struct,
}
func structFieldContains(s []reflect.StructField, n string) bool {
for _, z := range s {
if z.Name == n {
return true
}
}
return false
}
func kindContains(s []reflect.Kind, k reflect.Kind) bool {
for _, z := range s {
if z == k {
return true
}
}
return false
} | merge.go | 0.52902 | 0.458409 | merge.go | starcoder |
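The reflect-based map union in `mapMerge` can be illustrated with a monomorphic sketch: keys from `a` are copied first, then keys from `b` either fill gaps or (for scalar values, where `Merge` simply returns `b`) overwrite:

```go
package main

import (
	"fmt"
	"reflect"
)

// mergeMaps is a trimmed-down version of mapMerge above, specialized to
// map[string]int to show the reflect-based union: keys from a are copied
// first, then keys from b fill gaps or overwrite, matching the package's
// "b has precedence" rule for scalar values.
func mergeMaps(a, b map[string]int) map[string]int {
	aV := reflect.ValueOf(a)
	bV := reflect.ValueOf(b)
	res := reflect.MakeMap(aV.Type())
	for _, k := range aV.MapKeys() {
		res.SetMapIndex(k, aV.MapIndex(k))
	}
	for _, k := range bV.MapKeys() {
		res.SetMapIndex(k, bV.MapIndex(k)) // scalar values: b overwrites
	}
	return res.Interface().(map[string]int)
}

func main() {
	out := mergeMaps(map[string]int{"x": 1, "y": 2}, map[string]int{"y": 9, "z": 3})
	fmt.Println(out["x"], out["y"], out["z"])
}
```

The real `mapMerge` also recurses into mergeable value kinds (maps, structs, slices) via `Merge`; this sketch keeps only the overwrite path.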
package ast
// Yes, a lot of the right end points of the statements are off by one, and
// will also error out because of nil pointer dereferences in probably half of the
// cases. I will come back to this later but I seriously cannot be arsed right
// now.
import (
"strings"
"github.com/yjp20/turtle/straw/token"
)
type Node interface {
Pos() token.Pos
End() token.Pos
}
type Statement interface {
Node
statementNode()
}
type Expression interface {
Node
expressionNode()
}
// ---
// General Nodes
type Program struct {
Statements []Statement
}
func (p *Program) Pos() token.Pos { return 0 }
func (p *Program) End() token.Pos { return 0 }
type Comment struct {
TextPos token.Pos
Text string
}
func (c *Comment) Pos() token.Pos { return c.TextPos }
func (c *Comment) End() token.Pos { return c.TextPos + token.Pos(len(c.Text)) }
type CommentGroup struct {
Lines []*Comment
}
func (cg *CommentGroup) Pos() token.Pos { return cg.Lines[0].Pos() }
func (cg *CommentGroup) End() token.Pos { return cg.Lines[len(cg.Lines)-1].End() }
func (cg *CommentGroup) Text() (string, error) {
sb := strings.Builder{}
// TODO: smarter handling of concatenating comment strings
for _, comment := range cg.Lines {
_, err := sb.WriteString(comment.Text[2:])
if err != nil {
return "", err
}
}
return sb.String(), nil
}
// ---
// Statements
type AssignStatement struct {
Left Expression
Right Expression
}
func (ls *AssignStatement) statementNode() {}
func (ls *AssignStatement) Pos() token.Pos { return ls.Left.Pos() }
func (ls *AssignStatement) End() token.Pos { return ls.Right.End() }
type EachStatement struct {
Left Expression
Right Expression
}
func (es *EachStatement) statementNode() {}
func (es *EachStatement) Pos() token.Pos { return es.Left.Pos() }
func (es *EachStatement) End() token.Pos { return es.Right.End() }
type ExpressionStatement struct {
Expression Expression
}
func (es *ExpressionStatement) statementNode() {}
func (es *ExpressionStatement) Pos() token.Pos { return es.Expression.Pos() }
func (es *ExpressionStatement) End() token.Pos { return es.Expression.End() }
type BranchStatement struct {
Keyword token.Token
KeywordPos token.Pos
Label *Identifier
}
func (bs *BranchStatement) statementNode() {}
func (bs *BranchStatement) Pos() token.Pos { return bs.KeywordPos }
func (bs *BranchStatement) End() token.Pos { return bs.Label.End() }
type ReturnStatement struct {
Return token.Pos
Expression Expression
}
func (rs *ReturnStatement) statementNode() {}
func (rs *ReturnStatement) Pos() token.Pos { return rs.Return }
func (rs *ReturnStatement) End() token.Pos { return rs.Expression.End() }
type ForStatement struct {
For token.Pos
Clauses []Statement
Expression Expression
}
func (fs *ForStatement) statementNode() {}
func (fs *ForStatement) Pos() token.Pos { return fs.For }
func (fs *ForStatement) End() token.Pos { return fs.Expression.End() }
// FIXME: Think about how positions work in empty statements. Maybe look at how go does it in their parser?
type EmptyStatement struct{}
func (es *EmptyStatement) statementNode() {}
func (es *EmptyStatement) Pos() token.Pos { return -1 }
func (es *EmptyStatement) End() token.Pos { return -1 }
// ---
// Expressions
type Identifier struct {
NamePos token.Pos
Name string
}
func (id *Identifier) expressionNode() {}
func (id *Identifier) Pos() token.Pos { return id.NamePos }
func (id *Identifier) End() token.Pos { return id.NamePos + token.Pos(len(id.Name)) }
type CallExpression struct {
Expressions []Expression
}
func (cl *CallExpression) expressionNode() {}
func (cl *CallExpression) Pos() token.Pos { return cl.Expressions[0].Pos() }
func (cl *CallExpression) End() token.Pos { return cl.Expressions[len(cl.Expressions)-1].End() }
type ConstructExpression struct {
Construct token.Pos
Type Expression
Value *Tuple
}
func (ce *ConstructExpression) expressionNode() {}
func (ce *ConstructExpression) Pos() token.Pos { return ce.Construct }
func (ce *ConstructExpression) End() token.Pos { return ce.Value.End() }
type Selector struct {
Expression Expression
Selection *Identifier
}
func (s *Selector) expressionNode() {}
func (s *Selector) Pos() token.Pos { return s.Expression.Pos() }
func (s *Selector) End() token.Pos { return s.Selection.End() }
type Indexor struct {
Expression Expression
Index Expression
}
func (i *Indexor) expressionNode() {}
func (i *Indexor) Pos() token.Pos { return i.Expression.Pos() }
func (i *Indexor) End() token.Pos { return i.Index.End() }
type Tuple struct {
Left token.Pos
Right token.Pos
Statements []Statement
}
func (t *Tuple) expressionNode() {}
func (t *Tuple) Pos() token.Pos { return t.Left }
func (t *Tuple) End() token.Pos { return t.Right }
type Block struct {
LeftBrace token.Pos
Statements []Statement
RightBrace token.Pos
}
func (b *Block) expressionNode() {}
func (b *Block) Pos() token.Pos { return b.LeftBrace }
func (b *Block) End() token.Pos { return b.RightBrace }
type If struct {
Conditional Expression
True Statement
False Statement
}
func (i *If) expressionNode() {}
func (i *If) Pos() token.Pos { return i.True.Pos() }
func (i *If) End() token.Pos { return i.False.End() }
type DefaultLiteral struct {
DefaultPos token.Pos
}
func (dl *DefaultLiteral) expressionNode() {}
func (dl *DefaultLiteral) Pos() token.Pos { return dl.DefaultPos }
func (dl *DefaultLiteral) End() token.Pos { return dl.DefaultPos + 1 }
type IntLiteral struct {
IntPos token.Pos
Literal string
Value int64
}
func (il *IntLiteral) expressionNode() {}
func (il *IntLiteral) Pos() token.Pos { return il.IntPos }
func (il *IntLiteral) End() token.Pos { return il.IntPos + token.Pos(len(il.Literal)) }
type FloatLiteral struct {
FloatPos token.Pos
Literal string
Value float64
}
func (fl *FloatLiteral) expressionNode() {}
func (fl *FloatLiteral) Pos() token.Pos { return fl.FloatPos }
func (fl *FloatLiteral) End() token.Pos { return fl.FloatPos + token.Pos(len(fl.Literal)) }
type StringLiteral struct {
StringPos token.Pos
Value string
}
func (sl *StringLiteral) expressionNode() {}
func (sl *StringLiteral) Pos() token.Pos { return sl.StringPos }
func (sl *StringLiteral) End() token.Pos { return sl.StringPos + token.Pos(len(sl.Value)) }
type RuneLiteral struct {
RunePos token.Pos
Value string
}
func (rl *RuneLiteral) expressionNode() {}
func (rl *RuneLiteral) Pos() token.Pos { return rl.RunePos }
func (rl *RuneLiteral) End() token.Pos { return rl.RunePos + token.Pos(len(rl.Value)) }
type RangeLiteral struct {
RangePos token.Pos
RightPos token.Pos
Left Expression
LeftInclusive bool
Right Expression
RightInclusive bool
}
func (rl *RangeLiteral) expressionNode() {}
func (rl *RangeLiteral) Pos() token.Pos { return rl.RangePos }
func (rl *RangeLiteral) End() token.Pos { return rl.RightPos + 1 }
type TrueLiteral struct {
True token.Pos
}
func (tl *TrueLiteral) expressionNode() {}
func (tl *TrueLiteral) Pos() token.Pos { return tl.True }
func (tl *TrueLiteral) End() token.Pos { return tl.True + 4 }
type FalseLiteral struct {
False token.Pos
}
func (fl *FalseLiteral) expressionNode() {}
func (fl *FalseLiteral) Pos() token.Pos { return fl.False }
func (fl *FalseLiteral) End() token.Pos { return fl.False + 5 }
type FunctionDefinition struct {
Func token.Pos
Identifier *Identifier
Args *Tuple
Params *Tuple
Return Expression
Body Statement
}
func (fd *FunctionDefinition) expressionNode() {}
func (fd *FunctionDefinition) Pos() token.Pos { return fd.Func }
func (fd *FunctionDefinition) End() token.Pos { return fd.Body.End() }
type Spread struct {
Elipsis token.Pos
Expression Expression
}
func (s *Spread) expressionNode() {}
func (s *Spread) Pos() token.Pos { return s.Elipsis }
func (s *Spread) End() token.Pos { return s.Expression.End() }
type Prefix struct {
Operator token.Token
OperatorPos token.Pos
Expression Expression
}
func (p *Prefix) expressionNode() {}
func (p *Prefix) Pos() token.Pos { return p.OperatorPos }
func (p *Prefix) End() token.Pos { return p.Expression.End() }
type Infix struct {
Operator token.Token
OperatorPos token.Pos
Left Expression
Right Expression
}
func (i *Infix) expressionNode() {}
func (i *Infix) Pos() token.Pos { return i.Left.Pos() }
func (i *Infix) End() token.Pos { return i.Right.End() }
type Bind struct {
Left Expression
Right Expression
}
func (b *Bind) expressionNode() {}
func (b *Bind) Pos() token.Pos { return b.Left.Pos() }
func (b *Bind) End() token.Pos { return b.Right.End() }
type TypeSpec struct {
Type token.Token
TypePos token.Pos
Params *Tuple
Spec *Tuple
}
func (ts *TypeSpec) expressionNode() {}
func (ts *TypeSpec) Pos() token.Pos { return ts.TypePos }
func (ts *TypeSpec) End() token.Pos { return ts.Spec.End() }
type As struct {
Value Expression
Type Expression
}
func (a *As) expressionNode() {}
func (a *As) Pos() token.Pos { return a.Value.Pos() }
func (a *As) End() token.Pos { return a.Type.End() }
type Match struct {
Match token.Pos
Left token.Pos
Right token.Pos
Item Expression
Conditions []Expression
Bodies []Statement
}
func (m *Match) expressionNode() {}
func (m *Match) Pos() token.Pos { return m.Left }
func (m *Match) End() token.Pos { return m.Right + 1 } | straw/ast/ast.go | 0.526099 | 0.45944 | ast.go | starcoder |
package main
import "container/heap"
/*
Problem:
How do you find the median of a data stream? If an odd number of values has been read from the stream, the median is the middle value once all values are sorted. If an even number of values has been read, the median is the average of the two middle values once all values are sorted.
For example,
the median of [2,3,4] is 3
the median of [2,3] is (2 + 3) / 2 = 2.5
Design a data structure that supports the following two operations:
void addNum(int num) - add an integer from the data stream to the data structure.
double findMedian() - return the median of all elements so far.
Constraints:
At most 50000 calls will be made to addNum and findMedian.
Source: LeetCode (力扣)
Link: https://leetcode-cn.com/problems/shu-ju-liu-zhong-de-zhong-wei-shu-lcof
*/
/*
Approach 1: a max-heap holds the smaller half [n/2], a min-heap holds the larger half [n/2]
Time complexity: insert O(log n), query O(1)
Space complexity: O(n)
Runtime: 112 ms; memory: 13.1 MB
*/
type MedianFinder struct {
maxHeap *Heap
minHeap *Heap
}
/** initialize your data structure here. */
func Constructor() MedianFinder {
return MedianFinder{
maxHeap: NewMaxHeap(),
minHeap: NewMinHeap(),
}
}
func (this *MedianFinder) AddNum(num int) {
if this.maxHeap.Len() != this.minHeap.Len() {
this.maxHeap.push(num)
this.minHeap.push(this.maxHeap.pop())
} else {
this.minHeap.push(num)
this.maxHeap.push(this.minHeap.pop())
}
}
func (this *MedianFinder) FindMedian() float64 {
if this.maxHeap.Len() != this.minHeap.Len() {
return float64(this.maxHeap.Peek())
}
return float64(this.minHeap.Peek()+this.maxHeap.Peek()) / 2
}
type Heap struct {
data []int
isAsc bool
}
func NewMinHeap() *Heap {
h := &Heap{
data: []int{},
isAsc: true,
}
heap.Init(h)
return h
}
func NewMaxHeap() *Heap {
h := &Heap{
data: []int{},
isAsc: false,
}
heap.Init(h)
return h
}
func (h *Heap) Less(i, j int) bool {
if h.isAsc {
return h.data[i] < h.data[j]
}
return h.data[j] < h.data[i]
}
func (h *Heap) Swap(i, j int) {
h.data[i], h.data[j] = h.data[j], h.data[i]
}
func (h *Heap) Len() int {
return len(h.data)
}
func (h *Heap) Push(v interface{}) {
h.data = append(h.data, v.(int))
}
func (h *Heap) Pop() interface{} {
v := h.data[len(h.data)-1]
h.data = h.data[:len(h.data)-1]
return v
}
func (h *Heap) push(v int) {
heap.Push(h, v)
}
func (h *Heap) pop() int {
return heap.Pop(h).(int)
}
func (h *Heap) Peek() int {
return h.data[0]
}
/**
* Your MedianFinder object will be instantiated and called as such:
* obj := Constructor();
* obj.AddNum(num);
* param_2 := obj.FindMedian();
*/ | internal/lcof/41.shu-ju-liu-zhong-de-zhong-wei-shu/main.go | 0.537041 | 0.403567 | main.go | starcoder |
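The two-heap approach above can be sketched as a compact standalone program; `intHeap`, `streamMedian`, and the method names below are illustrative and not part of the solution file:

```go
package main

import (
	"container/heap"
	"fmt"
)

// intHeap is a min-heap of ints; pushing negated values turns it into a max-heap.
type intHeap []int

func (h intHeap) Len() int            { return len(h) }
func (h intHeap) Less(i, j int) bool  { return h[i] < h[j] }
func (h intHeap) Swap(i, j int)       { h[i], h[j] = h[j], h[i] }
func (h *intHeap) Push(x interface{}) { *h = append(*h, x.(int)) }
func (h *intHeap) Pop() interface{} {
	old := *h
	v := old[len(old)-1]
	*h = old[:len(old)-1]
	return v
}

// streamMedian keeps the smaller half in lo (negated, so effectively a
// max-heap) and the larger half in hi; hi may hold at most one extra element.
type streamMedian struct {
	lo, hi *intHeap
}

func newStreamMedian() *streamMedian {
	return &streamMedian{lo: &intHeap{}, hi: &intHeap{}}
}

func (s *streamMedian) add(n int) {
	if s.lo.Len() == s.hi.Len() {
		heap.Push(s.lo, -n)
		heap.Push(s.hi, -heap.Pop(s.lo).(int))
	} else {
		heap.Push(s.hi, n)
		heap.Push(s.lo, -heap.Pop(s.hi).(int))
	}
}

func (s *streamMedian) median() float64 {
	if s.hi.Len() > s.lo.Len() {
		return float64((*s.hi)[0])
	}
	// lo stores negated values, so -(*s.lo)[0] is the max of the smaller half.
	return (float64((*s.hi)[0]) - float64((*s.lo)[0])) / 2
}

func main() {
	s := newStreamMedian()
	for _, n := range []int{2, 3, 4} {
		s.add(n)
	}
	fmt.Println(s.median()) // 3
}
```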
package ccopy
import (
"errors"
"fmt"
"reflect"
)
const tagCcopy = "ccopy"
// Config represents the config for the customizable deep copy.
// Maps between tag value and functions that receive the tagged data and return the same data type.
type Config map[string]interface{}
// Copy deep copies an object respecting the customizations provided in the config.
// Unexported fields of a struct are ignored and will not be copied.
// The types unsafe.Pointer and uintptr are not supported and they will cause a panic.
// A channel will point to the original channel.
func (c Config) Copy(obj interface{}) (interface{}, error) {
ov := reflect.ValueOf(obj)
oc, err := c.copy(ov)
if err != nil {
return nil, err
}
return oc.Interface(), nil
}
func (c Config) copy(ov reflect.Value) (reflect.Value, error) {
if !ov.IsValid() {
return reflect.Value{}, errors.New("invalid value")
}
if t := ov.Type(); t.PkgPath() == "time" && t.Name() == "Time" {
return c.copyTime(ov)
}
switch ov.Kind() {
case reflect.Struct:
return c.copyStruct(ov)
case reflect.Ptr:
return c.copyPointer(ov)
case reflect.Slice:
return c.copySlice(ov)
case reflect.Map:
return c.copyMap(ov)
case reflect.Interface:
return c.copyInterface(ov)
case reflect.Array:
return c.copyArray(ov)
case reflect.Int, reflect.String, reflect.Int64, reflect.Float64, reflect.Bool, reflect.Uint, reflect.Uint64,
reflect.Func, reflect.Chan, reflect.Float32,
reflect.Int8, reflect.Int16, reflect.Int32,
reflect.Complex64, reflect.Complex128,
reflect.Uint8, reflect.Uint16, reflect.Uint32:
return ov, nil
}
panic(fmt.Sprintf("unsupported type: %s", ov.Kind()))
}
func (c Config) copyStruct(ov reflect.Value) (reflect.Value, error) {
oc := reflect.New(ov.Type()).Elem()
ot := ov.Type()
for i := 0; i < ot.NumField(); i++ {
// skip unexported fields
if !ov.Field(i).CanInterface() {
continue
}
tag := ot.Field(i).Tag.Get(tagCcopy)
if tag == "" {
// cannot set zero values, in case of pointers
if v, err := c.copy(ov.Field(i)); err != nil {
return reflect.Zero(ov.Type()), err
} else if !v.IsZero() {
oc.Field(i).Set(v)
}
} else {
fn := c[tag]
if fn == nil {
return reflect.Zero(ov.Type()), fmt.Errorf("missing copy customiser for: %s", tag)
}
values := reflect.ValueOf(fn).Call([]reflect.Value{ov.Field(i)})
// cannot set zero values, in case of pointers
if !values[0].IsZero() {
oc.Field(i).Set(values[0])
}
}
}
return oc, nil
}
func (c Config) copyPointer(ov reflect.Value) (reflect.Value, error) {
if ov.IsNil() {
return ov, nil
}
oc := reflect.New(ov.Type().Elem())
v, err := c.copy(ov.Elem())
if err != nil {
return reflect.Zero(ov.Type()), err
}
if !v.IsZero() {
oc.Elem().Set(v)
}
return oc, nil
}
func (c Config) copyInterface(ov reflect.Value) (reflect.Value, error) {
if ov.IsNil() {
return ov, nil
}
oc := reflect.New(ov.Type()).Elem()
v, err := c.copy(ov.Elem())
if err != nil {
return reflect.Zero(ov.Type()), err
}
oc.Set(v)
return oc, nil
}
func (c Config) copySlice(ov reflect.Value) (reflect.Value, error) {
if ov.IsNil() {
return ov, nil
}
oc := reflect.MakeSlice(ov.Type(), 0, ov.Len())
for i := 0; i < ov.Len(); i++ {
v, err := c.copy(ov.Index(i))
if err != nil {
return reflect.Zero(ov.Type()), err
}
oc = reflect.Append(oc, v)
}
return oc, nil
}
func (c Config) copyArray(ov reflect.Value) (reflect.Value, error) {
oc := reflect.New(ov.Type()).Elem()
slice := oc.Slice3(0, 0, ov.Len())
for i := 0; i < ov.Len(); i++ {
v, err := c.copy(ov.Index(i))
if err != nil {
return reflect.Zero(ov.Type()), err
}
slice = reflect.Append(slice, v)
}
return oc, nil
}
func (c Config) copyMap(ov reflect.Value) (reflect.Value, error) {
if ov.IsNil() {
return ov, nil
}
oc := reflect.MakeMapWithSize(ov.Type(), ov.Len())
iter := ov.MapRange()
for iter.Next() {
k, err := c.copy(iter.Key())
if err != nil {
return reflect.Zero(ov.Type()), err
}
v, err := c.copy(iter.Value())
if err != nil {
return reflect.Zero(ov.Type()), err
}
oc.SetMapIndex(k, v)
}
return oc, nil
}
func (c Config) copyTime(ov reflect.Value) (reflect.Value, error) {
return ov, nil
} | ccopy.go | 0.564339 | 0.528473 | ccopy.go | starcoder |
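The reflect pattern ccopy builds on — allocate with reflect.New, walk exported struct fields, copy with Set, and follow pointer indirection — can be shown in a minimal standalone sketch. `Point`, `Box`, and `deepCopyBox` are hypothetical names for illustration, not part of the ccopy API:

```go
package main

import (
	"fmt"
	"reflect"
)

type Point struct{ X, Y int }

type Box struct {
	Name string
	P    *Point
}

// deepCopyBox copies exported fields of a Box, following one level of pointer
// indirection so the copy does not alias the original *Point.
func deepCopyBox(b Box) Box {
	ov := reflect.ValueOf(b)
	oc := reflect.New(ov.Type()).Elem() // addressable zero value of the same type
	for i := 0; i < ov.NumField(); i++ {
		f := ov.Field(i)
		if f.Kind() == reflect.Ptr && !f.IsNil() {
			np := reflect.New(f.Type().Elem()) // fresh *Point
			np.Elem().Set(f.Elem())            // copy the pointed-to value
			oc.Field(i).Set(np)
		} else {
			oc.Field(i).Set(f)
		}
	}
	return oc.Interface().(Box)
}

func main() {
	orig := Box{Name: "a", P: &Point{1, 2}}
	cp := deepCopyBox(orig)
	cp.P.X = 99
	fmt.Println(orig.P.X, cp.P.X) // 1 99
}
```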
package main
import (
"image"
"image/color"
"log"
"github.com/tajtiattila/blur"
)
func process(current, previous image.Image) image.Image {
img := AbsDiff(blur.Gaussian(previous, 18, blur.ReuseSrc), blur.Gaussian(current, 18, blur.ReuseSrc))
img = blur.Gaussian(img, 12, blur.ReuseSrc)
return Threshold(img, 10)
}
func Threshold(source image.Image, distance uint8) image.Image {
b := source.Bounds()
result := image.NewRGBA(image.Rect(0, 0, b.Max.X, b.Max.Y))
var maxDist = uint16(distance) * (256)
var high = uint32(65535 - maxDist)
var low = uint32(maxDist)
for y := b.Min.Y; y < b.Max.Y; y++ {
for x := b.Min.X; x < b.Max.X; x++ {
r, g, b, _ := source.At(x, y).RGBA()
rn, gn, bn := 65535, 65535, 65535
if r > high || r < low {
rn, gn, bn = 0, 0, 0
}
if g > high || g < low {
rn, gn, bn = 0, 0, 0
}
if b > high || b < low {
rn, gn, bn = 0, 0, 0
}
c := color.RGBA64{uint16(rn), uint16(gn), uint16(bn), 65535}
result.Set(x, y, c)
}
}
return result
}
func AbsDiff(source, target image.Image) image.Image {
b := source.Bounds()
diff := image.NewRGBA(image.Rect(0, 0, b.Max.X, b.Max.Y))
for y := b.Min.Y; y < b.Max.Y; y++ {
for x := b.Min.X; x < b.Max.X; x++ {
r1, g1, b1, _ := source.At(x, y).RGBA()
r2, g2, b2, _ := target.At(x, y).RGBA()
rd := abs(r1, r2)
gd := abs(g1, g2)
bd := abs(b1, b2)
c := color.RGBA64{rd, gd, bd, 65535}
diff.Set(x, y, c)
}
}
return diff
}
func abs(a, b uint32) uint16 {
if a < b {
return uint16(b - a)
}
return uint16(a - b)
}
func HotPixels(current image.Image) (hot int, pct float64) {
total := 0
b := current.Bounds()
for y := b.Min.Y; y < b.Max.Y; y++ {
for x := b.Min.X; x < b.Max.X; x++ {
total++
r, g, b, _ := current.At(x, y).RGBA()
if r > 0 || g > 0 || b > 0 {
hot++
}
}
}
return hot, (float64(hot) / float64(total) * 100.0)
}
func Extract(source, target image.Image) image.Image {
b := source.Bounds()
extracted := image.NewRGBA(image.Rect(0, 0, b.Max.X, b.Max.Y))
for y := b.Min.Y; y < b.Max.Y; y++ {
for x := b.Min.X; x < b.Max.X; x++ {
rs, gs, bs, _ := source.At(x, y).RGBA()
rt, gt, bt, _ := target.At(x, y).RGBA()
if rt > 0 || gt > 0 || bt > 0 {
c := color.RGBA64{uint16(rs), uint16(gs), uint16(bs), 65535}
extracted.Set(x, y, c)
}
}
}
return extracted
}
func Contrast(img image.Image, contrast int) image.Image {
b := img.Bounds()
out := image.NewRGBA(image.Rect(0, 0, b.Max.X, b.Max.Y))
factor := float64(259*(contrast+255)) / float64(255*(259-contrast))
log.Println(factor)
// clamp keeps scaled channel values inside the uint16 range so the
// conversion below cannot wrap around on strong contrast settings.
clamp := func(v float64) uint16 {
if v < 0 {
return 0
}
if v > 65535 {
return 65535
}
return uint16(v)
}
for y := b.Min.Y; y < b.Max.Y; y++ {
for x := b.Min.X; x < b.Max.X; x++ {
r, g, b, _ := img.At(x, y).RGBA()
rn := factor*(float64(r)-32640) + 32640
gn := factor*(float64(g)-32640) + 32640
bn := factor*(float64(b)-32640) + 32640
c := color.RGBA64{clamp(rn), clamp(gn), clamp(bn), 65535}
out.Set(x, y, c)
}
}
return out
} | examples/motion/image.go | 0.648689 | 0.416915 | image.go | starcoder |
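At its core, the motion pipeline in process reduces to a per-channel absolute difference followed by a threshold; a standalone numeric sketch on plain uint8 samples (`absDiff8` and `threshold8` are illustrative helpers, not part of the file above):

```go
package main

import "fmt"

// absDiff8 is the uint8 analogue of the abs helper above: |a-b| without
// relying on signed arithmetic.
func absDiff8(a, b uint8) uint8 {
	if a < b {
		return b - a
	}
	return a - b
}

// threshold8 maps a per-pixel difference to 0 (still) or 255 (motion).
func threshold8(d, cutoff uint8) uint8 {
	if d > cutoff {
		return 255
	}
	return 0
}

func main() {
	prev := []uint8{10, 10, 200}
	cur := []uint8{12, 90, 205}
	for i := range cur {
		fmt.Println(threshold8(absDiff8(prev[i], cur[i]), 10))
	}
	// prints 0, 255, 0: only the middle sample changed enough to count
}
```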
package bp3d
import (
"fmt"
"math"
"sort"
)
// Bin represents a container in which items will be put into.
type Bin struct {
Name string
Width float64
Height float64
Depth float64
MaxWeight float64
Items []*Item // Items that packed in this bin
}
type BinSlice []*Bin
func (bs BinSlice) Len() int { return len(bs) }
func (bs BinSlice) Less(i, j int) bool {
return bs[i].GetVolume() < bs[j].GetVolume()
}
func (bs BinSlice) Swap(i, j int) {
bs[i], bs[j] = bs[j], bs[i]
}
// NewBin constructs new Bin with width w, height h, depth d, and max weight mw.
func NewBin(name string, w, h, d, mw float64) *Bin {
return &Bin{
Name: name,
Width: w,
Height: h,
Depth: d,
MaxWeight: mw,
Items: make([]*Item, 0),
}
}
// GetName returns bin's name.
func (b *Bin) GetName() string {
return b.Name
}
// GetWidth returns bin's width.
func (b *Bin) GetWidth() float64 {
return b.Width
}
// GetHeight returns bin's height.
func (b *Bin) GetHeight() float64 {
return b.Height
}
// GetDepth returns bin's depth.
func (b *Bin) GetDepth() float64 {
return b.Depth
}
// GetVolume returns bin's volume.
func (b *Bin) GetVolume() float64 {
return b.Width * b.Height * b.Depth
}
// GetMaxWeight returns bin's max weight.
func (b *Bin) GetMaxWeight() float64 {
return b.MaxWeight
}
// PutItem tries to put item into pivot p of bin b.
func (b *Bin) PutItem(item *Item, p Pivot) (fit bool) {
item.Position = p
for i := 0; i < 6; i++ {
item.RotationType = RotationType(i)
d := item.GetDimension()
if b.GetWidth() < p[0]+d[0] || b.GetHeight() < p[1]+d[1] || b.GetDepth() < p[2]+d[2] {
continue
}
fit = true
for _, ib := range b.Items {
if ib.Intersect(item) {
fit = false
break
}
}
if fit {
b.Items = append(b.Items, item)
}
return
}
return
}
func (b *Bin) String() string {
return fmt.Sprintf("%s(%vx%vx%v, max_weight:%v)", b.GetName(), b.GetWidth(), b.GetHeight(), b.GetDepth(), b.GetMaxWeight())
}
type RotationType int
const (
RotationType_WHD RotationType = iota
RotationType_HWD
RotationType_HDW
RotationType_DHW
RotationType_DWH
RotationType_WDH
)
var RotationTypeStrings = [...]string{
"RotationType_WHD (w,h,d)",
"RotationType_HWD (h,w,d)",
"RotationType_HDW (h,d,w)",
"RotationType_DHW (d,h,w)",
"RotationType_DWH (d,w,h)",
"RotationType_WDH (w,d,h)",
}
func (rt RotationType) String() string {
return RotationTypeStrings[rt]
}
type Axis int
const (
WidthAxis Axis = iota
HeightAxis
DepthAxis
)
type Pivot [3]float64
type Dimension [3]float64
func (pv Pivot) String() string {
return fmt.Sprintf("%v,%v,%v", pv[0], pv[1], pv[2])
}
var startPosition = Pivot{0, 0, 0}
type Item struct {
Name string
Width float64
Height float64
Depth float64
Weight float64
// Used during packer.Pack()
RotationType RotationType
Position Pivot
}
type ItemSlice []*Item
func (is ItemSlice) Len() int { return len(is) }
func (is ItemSlice) Less(i, j int) bool {
return is[i].GetVolume() > is[j].GetVolume()
}
func (is ItemSlice) Swap(i, j int) {
is[i], is[j] = is[j], is[i]
}
// NewItem returns an Item named name, with width w, height h, depth d, and
// weight wg.
func NewItem(name string, w, h, d, wg float64) *Item {
return &Item{
Name: name,
Width: w,
Height: h,
Depth: d,
Weight: wg,
}
}
func (i *Item) GetName() string {
return i.Name
}
func (i *Item) GetWidth() float64 {
return i.Width
}
func (i *Item) GetHeight() float64 {
return i.Height
}
func (i *Item) GetDepth() float64 {
return i.Depth
}
func (i *Item) GetVolume() float64 {
return i.Width * i.Height * i.Depth
}
func (i *Item) GetWeight() float64 {
return i.Weight
}
func (i *Item) GetDimension() (d Dimension) {
switch i.RotationType {
case RotationType_WHD:
d = Dimension{i.GetWidth(), i.GetHeight(), i.GetDepth()}
case RotationType_HWD:
d = Dimension{i.GetHeight(), i.GetWidth(), i.GetDepth()}
case RotationType_HDW:
d = Dimension{i.GetHeight(), i.GetDepth(), i.GetWidth()}
case RotationType_DHW:
d = Dimension{i.GetDepth(), i.GetHeight(), i.GetWidth()}
case RotationType_DWH:
d = Dimension{i.GetDepth(), i.GetWidth(), i.GetHeight()}
case RotationType_WDH:
d = Dimension{i.GetWidth(), i.GetDepth(), i.GetHeight()}
}
return
}
// Intersect checks whether there's an intersection between item i and item i2.
func (i *Item) Intersect(i2 *Item) bool {
return rectIntersect(i, i2, WidthAxis, HeightAxis) &&
rectIntersect(i, i2, HeightAxis, DepthAxis) &&
rectIntersect(i, i2, WidthAxis, DepthAxis)
}
// rectIntersect checks whether two rectangles from axis x and y of item i1 and i2
// has intersection or not.
func rectIntersect(i1, i2 *Item, x, y Axis) bool {
d1 := i1.GetDimension()
d2 := i2.GetDimension()
cx1 := i1.Position[x] + d1[x]/2
cy1 := i1.Position[y] + d1[y]/2
cx2 := i2.Position[x] + d2[x]/2
cy2 := i2.Position[y] + d2[y]/2
ix := math.Max(cx1, cx2) - math.Min(cx1, cx2)
iy := math.Max(cy1, cy2) - math.Min(cy1, cy2)
return ix < (d1[x]+d2[x])/2 && iy < (d1[y]+d2[y])/2
}
func (i *Item) String() string {
return fmt.Sprintf("%s(%vx%vx%v, weight: %v) pos(%s) rt(%s)", i.GetName(), i.GetWidth(), i.GetHeight(), i.GetDepth(), i.GetWeight(), i.Position, i.RotationType)
}
type Packer struct {
Bins []*Bin
Items []*Item
UnfitItems []*Item // items that don't fit to any bin
}
func NewPacker() *Packer {
return &Packer{
Bins: make([]*Bin, 0),
Items: make([]*Item, 0),
UnfitItems: make([]*Item, 0),
}
}
func (p *Packer) AddBin(bins ...*Bin) {
p.Bins = append(p.Bins, bins...)
}
func (p *Packer) AddItem(items ...*Item) {
p.Items = append(p.Items, items...)
}
func (p *Packer) Pack() error {
sort.Sort(BinSlice(p.Bins))
sort.Sort(ItemSlice(p.Items))
// TODO(gedex): validate bins volumes. this is the reason we need error
// to be returned before iterating items.
for len(p.Items) > 0 {
bin := p.FindFittedBin(p.Items[0])
if bin == nil {
p.unfitItem()
continue
}
p.Items = p.packToBin(bin, p.Items)
}
return nil
}
// unfitItem moves p.Items[0] to p.UnfitItems.
func (p *Packer) unfitItem() {
if len(p.Items) == 0 {
return
}
p.UnfitItems = append(p.UnfitItems, p.Items[0])
p.Items = p.Items[1:]
}
// packToBin packs items to bin b. Returns unpacked items.
func (p *Packer) packToBin(b *Bin, items []*Item) (unpacked []*Item) {
if !b.PutItem(items[0], startPosition) {
if b2 := p.getBiggerBinThan(b); b2 != nil {
return p.packToBin(b2, items)
}
return p.Items
}
// Pack unpacked items.
for _, i := range items[1:] {
var fitted bool
lookup:
// Try available pivots in the current bin that do not intersect with
// items already placed in it.
for pt := 0; pt < 3; pt++ {
for _, ib := range b.Items {
var pv Pivot
switch Axis(pt) {
case WidthAxis:
pv = Pivot{ib.Position[0] + ib.GetWidth(), ib.Position[1], ib.Position[2]}
case HeightAxis:
pv = Pivot{ib.Position[0], ib.Position[1] + ib.GetHeight(), ib.Position[2]}
case DepthAxis:
pv = Pivot{ib.Position[0], ib.Position[1], ib.Position[2] + ib.GetDepth()}
}
if b.PutItem(i, pv) {
fitted = true
break lookup
}
}
}
if !fitted {
for b2 := p.getBiggerBinThan(b); b2 != nil; b2 = p.getBiggerBinThan(b) {
left := p.packToBin(b2, append(b2.Items, i))
if len(left) == 0 {
b = b2
fitted = true
break
}
}
if !fitted {
unpacked = append(unpacked, i)
}
}
}
return
}
func (p *Packer) getBiggerBinThan(b *Bin) *Bin {
v := b.GetVolume()
for _, b2 := range p.Bins {
if b2.GetVolume() > v {
return b2
}
}
return nil
}
// FindFittedBin finds bin in which item i will be fitted into.
func (p *Packer) FindFittedBin(i *Item) *Bin {
for _, b := range p.Bins {
if !b.PutItem(i, startPosition) {
continue
}
if len(b.Items) == 1 && b.Items[0] == i {
// Clear items in bin as we previously just check whether item i
// fits in bin b.
b.Items = []*Item{}
}
return b
}
return nil
} | bp3d.go | 0.775605 | 0.474509 | bp3d.go | starcoder |
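The overlap test in rectIntersect compares the distance between rectangle centers against half the summed extents on each axis; the one-dimensional form of that check can be verified in a standalone sketch (`overlap1D` is an illustrative helper, not part of bp3d):

```go
package main

import (
	"fmt"
	"math"
)

// overlap1D reports whether the segments [p1, p1+d1) and [p2, p2+d2) overlap,
// written in the same center-distance form as rectIntersect: they overlap
// exactly when the centers are closer than half of the summed extents.
func overlap1D(p1, d1, p2, d2 float64) bool {
	c1 := p1 + d1/2
	c2 := p2 + d2/2
	return math.Abs(c1-c2) < (d1+d2)/2
}

func main() {
	fmt.Println(overlap1D(0, 4, 3, 4)) // true: [0,4) and [3,7) share [3,4)
	fmt.Println(overlap1D(0, 4, 4, 4)) // false: touching edges do not overlap
}
```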
package query
import (
"fmt"
"strings"
)
type InsertQuery struct {
table string
ignore bool
columns []string
duplicates []string
values [][]string
placeholder Placeholder
}
func NewInsert(t string) InsertQuery {
return InsertQuery{
table: t,
placeholder: PlaceholderQMark,
}
}
func (q InsertQuery) String() string {
builder := strings.Builder{}
builder.WriteString("INSERT ")
if q.ignore {
builder.WriteString("IGNORE ")
}
builder.WriteString(fmt.Sprintf("INTO %s ", q.table))
builder.WriteString(fmt.Sprintf("(%s) ", strings.Join(q.columns, ", ")))
valueCount := len(q.values)
values := make([]string, 0, valueCount)
for _, v := range q.values {
values = append(values, fmt.Sprintf("(%s)", strings.Join(v, ", ")))
}
builder.WriteString(fmt.Sprintf("VALUES %s ", strings.Join(values, ", ")))
if !q.ignore && len(q.duplicates) > 0 {
builder.WriteString(fmt.Sprintf("ON DUPLICATE KEY UPDATE %s", strings.Join(q.duplicates, ", ")))
}
return strings.Trim(builder.String(), " ")
}
func (q *InsertQuery) Columns(a bool, c ...string) {
if a {
q.columns = append(q.columns, c...)
return
}
q.columns = c
}
func (q *InsertQuery) Duplicates(a bool, d ...string) {
if a {
q.duplicates = append(q.duplicates, d...)
return
}
q.duplicates = d
}
func (q *InsertQuery) Ignore(d bool) {
q.ignore = d
}
func (q *InsertQuery) ValuesPrepared(rows int) {
// TODO maybe throw error
if count := len(q.columns); count > 0 {
q.values = make([][]string, 0, rows)
for i := 0; i < rows; i++ {
q.values = append(q.values, strings.Split(strings.Repeat(q.placeholder.String(), count), ""))
}
}
}
func (q *InsertQuery) Values(a bool, values ...interface{}) error {
if len(values) != len(q.columns) {
return fmt.Errorf("values count does not match column count")
}
if !a {
q.values = make([][]string, 0)
}
tmp := make([]string, 0)
for _, v := range values {
tmp = append(tmp, fmt.Sprintf("%v", v))
}
q.values = append(q.values, tmp)
return nil
} | insert.go | 0.511229 | 0.4081 | insert.go | starcoder |
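ValuesPrepared expands a single-character placeholder into one group of markers per row; a standalone sketch of that expansion for the `?` marker (`valuesClause` is an illustrative helper, not part of the query package):

```go
package main

import (
	"fmt"
	"strings"
)

// valuesClause builds the "VALUES (?, ?, ...), (...)" part of a prepared
// INSERT for cols columns and rows value groups — the shape that
// ValuesPrepared plus String produce together.
func valuesClause(cols, rows int) string {
	row := "(" + strings.TrimSuffix(strings.Repeat("?, ", cols), ", ") + ")"
	groups := make([]string, rows)
	for i := range groups {
		groups[i] = row
	}
	return "VALUES " + strings.Join(groups, ", ")
}

func main() {
	fmt.Println(valuesClause(3, 2)) // VALUES (?, ?, ?), (?, ?, ?)
}
```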
package tpl
func DefaultModels() map[string]Description {
models := make(map[string]Description)
models["basic"] = Description{
Short: "'basic' is the default. This model provides data about the current project, developer, etc.",
Long: `Use the basic model if the template you are creating does not need a
single entity or a list of entities`,
}
models["entity"] = Description{
Short: "Includes same properties as 'basic', but will also bring in an entity",
Long: `An entity model contains information about one entity.
Use the entity type if you need to generate code that accesses columns, primary keys, foreign keys,
and parent and child relations.
This is the most used model type.`,
}
models["entities"] = Description{
Short: "Includes same properties as 'basic', but will also bring in a list of entities",
Long: `The entities model contains information about all entities provided by the following:
1) --entity name1 --entity name2. or
2) --entity-group groupname (a group of entities must be defined on a
connection property in the config or a project)
With a list of entities you can loop through each entity and create code that fits on a single page.
Each entity in the list has access to columns, primary keys, foreign keys, and parent and child relations.`,
}
return models
}
func DefaultTypes() map[string]Description {
models := make(map[string]Description)
models["file"] = Description{
Short: "A file template represents a complete artifact that can be exported as a file",
Long: `A file template represents a complete 'artifact'; for C# and Java this could be a
single class definition or anything else that fits on a page.
A file type can be exported as a file.
`,
}
models["block"] = Description{
Short: "A block template can be used as a reference in another block or file",
Long: `Using blocks as building blocks makes templates more readable.
Blocks enable better code reuse and smaller templates, and should make templates
faster to create.
Reference the block in a snippet, file or another block with the following syntax:
{{ template "block-name" . }}
`,
}
return models
}
func DefaultKeys() map[string]Description {
models := make(map[string]Description)
gen := `The 'key' is used to connect a generic template to parts of a specific project
or a standard setup from a config or language definition.
A set of keys for C# can be totally different from keys in Go, Java or other languages.
`
models["model"] = Description{
Short: "model",
Long: gen + `The 'model' key is normally used on POCOs and other objects representing an entity`,
}
models["interface"] = Description{
Short: "interface",
Long: gen + `interface`,
}
models["repo"] = Description{
Short: "repo",
Long: gen + `repo`,
}
return models
} | src/tpl/defaults.go | 0.747524 | 0.518607 | defaults.go | starcoder |
package state
import (
"github.com/abchain/fabric/core/ledger/statemgmt"
"strings"
)
// CompositeRangeScanIterator - an implementation of interface 'statemgmt.RangeScanIterator'
// This provides a wrapper on top of more than one underlying iterators
type CompositeRangeScanIterator struct {
itrs []*statemgmt.StateDeltaIterator
currentItrs []int
}
func newCompositeRangeScanIterator(itrs ...*statemgmt.StateDeltaIterator) statemgmt.RangeScanIterator {
ret := &CompositeRangeScanIterator{itrs: itrs}
ret.init()
return ret
}
// Seed currentItrs with every iterator index for the first pass.
func (citr *CompositeRangeScanIterator) init() {
for i, _ := range citr.itrs {
citr.currentItrs = append(citr.currentItrs, i)
}
}
func (citr *CompositeRangeScanIterator) next() bool {
if len(citr.currentItrs) == 0 {
return false
}
logger.Debugf("Next task: currentItrNumbers = %v", citr.currentItrs)
var lastPos int
for i, itr := range citr.itrs {
if len(citr.currentItrs) > 0 && i == citr.currentItrs[0] {
citr.currentItrs = citr.currentItrs[1:]
if !itr.NextWithDeleted() {
logger.Debugf("iterator %d is over", i)
continue
}
}
citr.itrs[lastPos] = itr
lastPos++
}
citr.itrs = citr.itrs[:lastPos]
// Scan for the next available values.
// Merge-sort style, from top to bottom; iterators that share the same minimum key are grouped together.
var minKey string
for i, itr := range citr.itrs {
k, _ := itr.GetKeyValue()
if cmp := strings.Compare(k, minKey); minKey == "" || cmp < 0 {
citr.currentItrs = []int{i}
minKey = k
} else if cmp == 0 {
citr.currentItrs = append(citr.currentItrs, i)
}
}
logger.Debugf("Evaluating key = %s currentItrNumbers = %v", minKey, citr.currentItrs)
return len(citr.currentItrs) > 0
}
func (citr *CompositeRangeScanIterator) Next() bool {
for citr.next() {
if k, v := citr.itrs[citr.currentItrs[0]].GetKeyRawValue(); v.IsDeleted() {
logger.Debugf("Evaluating key = %s is deleted key, skip for next", k)
} else {
return true
}
}
return false
}
// GetKeyValue - see interface 'statemgmt.StateDeltaIterator' for details
func (citr *CompositeRangeScanIterator) GetKeyValue() (string, []byte) {
return citr.itrs[citr.currentItrs[0]].GetKeyValue()
}
// Close - see interface 'statemgmt.StateDeltaIterator' for details
func (citr *CompositeRangeScanIterator) Close() {} | core/ledger/statemgmt/state/composite_range_scan_iterator.go | 0.538255 | 0.401336 | composite_range_scan_iterator.go | starcoder |
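The composite iterator is essentially a k-way merge that advances every underlying iterator currently sitting on the minimum key; the same idea over plain sorted slices (`mergeSorted` is an illustrative helper and assumes non-empty string keys):

```go
package main

import "fmt"

// mergeSorted returns the sorted union of already-sorted slices of non-empty
// strings, emitting each distinct key once; every slice currently positioned
// on the minimum key is advanced, mirroring the composite iterator's scan.
func mergeSorted(lists ...[]string) []string {
	idx := make([]int, len(lists))
	var out []string
	for {
		min := ""
		for i, l := range lists {
			if idx[i] < len(l) && (min == "" || l[idx[i]] < min) {
				min = l[idx[i]]
			}
		}
		if min == "" { // every list is exhausted
			return out
		}
		out = append(out, min)
		for i, l := range lists {
			if idx[i] < len(l) && l[idx[i]] == min {
				idx[i]++ // advance every list sitting on the min key
			}
		}
	}
}

func main() {
	fmt.Println(mergeSorted([]string{"a", "c"}, []string{"b", "c", "d"}))
	// [a b c d]
}
```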
package date
import (
"fmt"
"time"
)
type Date struct {
tm time.Time
}
const timeFmt = "2006-01-02"
func Today() Date {
return FromTime(time.Now())
}
func TodayIn(loc *time.Location) Date {
return FromTime(time.Now().In(loc))
}
func Parse(s string) (Date, error) {
tm, err := time.ParseInLocation(timeFmt, s, time.UTC)
if err != nil {
return Date{}, err
} else {
return Date{tm}, err
}
}
func MustParse(s string) Date {
day, err := Parse(s)
if err != nil {
panic(fmt.Sprintf("Invalid date format: %#v", s))
} else {
return day
}
}
func Make(year int, month time.Month, day int) Date {
return Date{time.Date(year, month, day, 0, 0, 0, 0, time.UTC)}
}
func FromTime(tm time.Time) Date {
y, m, d := tm.Date()
return Date{time.Date(y, m, d, 0, 0, 0, 0, time.UTC)}
}
func (d Date) Date() (year int, month time.Month, day int) {
return d.tm.Date()
}
func (d Date) In(loc *time.Location) time.Time {
if loc == time.UTC {
return d.tm
}
yr, mo, dy := d.tm.Date()
return time.Date(yr, mo, dy, 0, 0, 0, 0, loc)
}
func (d Date) InUTC() time.Time {
return d.tm
}
func (d Date) String() string {
return d.tm.Format(timeFmt)
}
func (d Date) StringOr(zero string) string {
if d.IsZero() {
return zero
} else {
return d.String()
}
}
func (d Date) IsZero() bool {
return d.tm.IsZero()
}
func (d Date) Equal(u Date) bool {
return d.tm.Equal(u.tm)
}
func (d Date) Before(u Date) bool {
return d.tm.Before(u.tm)
}
func (d Date) After(u Date) bool {
return d.tm.After(u.tm)
}
func (d Date) AddDays(days int) Date {
return Date{d.tm.AddDate(0, 0, days)}
}
func (d Date) Add(years, months, days int) Date {
return Date{d.tm.AddDate(years, months, days)}
}
func (d Date) Next() Date {
return d.AddDays(1)
}
func (d Date) Prev() Date {
return d.AddDays(-1)
}
// Set satisfies the flag.Value interface.
func (d *Date) Set(s string) error {
u, err := Parse(s)
if err != nil {
return err
}
*d = u
return nil
}
func Min(a, b Date) Date {
if a.Before(b) {
return a
} else {
return b
}
}
func Max(a, b Date) Date {
if a.After(b) {
return a
} else {
return b
}
}
package mysql
import (
"time"
"gitlab.jiagouyun.com/cloudcare-tools/datakit/io"
"gitlab.jiagouyun.com/cloudcare-tools/datakit/plugins/inputs"
)
type dbmMetric struct {
Enabled bool `toml:"enabled"`
}
type dbmStateMeasurement struct {
name string
tags map[string]string
fields map[string]interface{}
ts time.Time
}
func (m *dbmStateMeasurement) LineProto() (*io.Point, error) {
return io.MakePoint(m.name, m.tags, m.fields, m.ts)
}
func (m *dbmStateMeasurement) Info() *inputs.MeasurementInfo {
return &inputs.MeasurementInfo{
Desc: "记录查询语句的执行次数、等待耗时、锁定时间和查询的记录行数等。",
Name: "mysql_dbm_metric",
Fields: map[string]interface{}{
"sum_timer_wait": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total query execution time(nanosecond) per normalized query and schema.",
},
"count_star": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of executed queries per normalized query and schema.",
},
"sum_errors": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of queries run with an error per normalized query and schema.",
},
"sum_lock_time": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total time(nanosecond) spent waiting on locks per normalized query and schema.",
},
"sum_rows_sent": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The number of rows sent per normalized query and schema.",
},
"sum_select_scan": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of full table scans on the first table per normalized query and schema.",
},
"sum_no_index_used": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of queries which do not use an index per normalized query and schema.",
},
"sum_rows_affected": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The number of rows mutated per normalized query and schema.",
},
"sum_rows_examined": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The number of rows examined per normalized query and schema.",
},
"sum_select_full_join": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of full table scans on a joined table per normalized query and schema.",
},
"sum_no_good_index_used": &inputs.FieldInfo{
DataType: inputs.Int,
Type: inputs.Gauge,
Unit: inputs.NCount,
Desc: "The total count of queries which used a sub-optimal index per normalized query and schema.",
},
"message": &inputs.FieldInfo{
DataType: inputs.String,
Type: inputs.String,
Unit: inputs.UnknownType,
Desc: "The text of the normalized statement digest.",
},
},
Tags: map[string]interface{}{
			"digest":          &inputs.TagInfo{Desc: "The digest hash value computed from the original normalized statement."},
			"query_signature": &inputs.TagInfo{Desc: "The hash value computed from digest_text."},
			"schema_name":     &inputs.TagInfo{Desc: "The schema name."},
			"server":          &inputs.TagInfo{Desc: "The server address."},
},
}
}
type dbmRow struct {
schemaName string
digest string
digestText string
querySignature string
countStar uint64
sumTimerWait uint64
sumLockTime uint64
sumErrors uint64
sumRowsAffected uint64
sumRowsSent uint64
sumRowsExamined uint64
sumSelectScan uint64
sumSelectFullJoin uint64
sumNoIndexUsed uint64
sumNoGoodIndexUsed uint64
}
// getRowKey generates a row key from schemaName and querySignature.
func getRowKey(schemaName, querySignature string) string {
return schemaName + querySignature
}
// merge duplicate rows.
func mergeDuplicateRows(rows []dbmRow) []dbmRow {
keyRows := make(map[string]dbmRow)
for _, row := range rows {
keyStr := getRowKey(row.schemaName, row.querySignature)
if keyRow, ok := keyRows[keyStr]; ok {
keyRow.countStar += row.countStar
keyRow.sumTimerWait += row.sumTimerWait
keyRow.sumLockTime += row.sumLockTime
keyRow.sumErrors += row.sumErrors
keyRow.sumRowsAffected += row.sumRowsAffected
keyRow.sumRowsSent += row.sumRowsSent
keyRow.sumRowsExamined += row.sumRowsExamined
keyRow.sumSelectScan += row.sumSelectScan
keyRow.sumSelectFullJoin += row.sumSelectFullJoin
keyRow.sumNoIndexUsed += row.sumNoIndexUsed
keyRow.sumNoGoodIndexUsed += row.sumNoGoodIndexUsed
} else {
keyRows[keyStr] = row
}
}
dbmRows := []dbmRow{}
for _, row := range keyRows {
dbmRows = append(dbmRows, row)
}
return dbmRows
}
// calculate metric based on previous row identified by row key.
func getMetricRows(dbmRows []dbmRow, dbmCache *map[string]dbmRow) ([]dbmRow, map[string]dbmRow) {
newDbmCache := make(map[string]dbmRow)
metricRows := []dbmRow{}
for _, row := range dbmRows {
rowKey := getRowKey(row.schemaName, row.querySignature)
if _, ok := newDbmCache[rowKey]; ok {
l.Warnf("Duplicate querySignature: %s, using the new one", row.querySignature)
}
newDbmCache[rowKey] = row
if oldRow, ok := (*dbmCache)[rowKey]; ok {
diffRow := dbmRow{
digest: row.digest,
digestText: row.digestText,
schemaName: row.schemaName,
querySignature: row.querySignature,
}
isChange := false
if row.countStar >= oldRow.countStar {
value := row.countStar - oldRow.countStar
diffRow.countStar = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumTimerWait >= oldRow.sumTimerWait {
value := row.sumTimerWait - oldRow.sumTimerWait
				diffRow.sumTimerWait = value / 1000 // picoseconds -> nanoseconds
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumLockTime >= oldRow.sumLockTime {
value := row.sumLockTime - oldRow.sumLockTime
				diffRow.sumLockTime = value / 1000 // picoseconds -> nanoseconds
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumErrors >= oldRow.sumErrors {
value := row.sumErrors - oldRow.sumErrors
diffRow.sumErrors = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumRowsAffected >= oldRow.sumRowsAffected {
value := row.sumRowsAffected - oldRow.sumRowsAffected
diffRow.sumRowsAffected = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumRowsSent >= oldRow.sumRowsSent {
value := row.sumRowsSent - oldRow.sumRowsSent
diffRow.sumRowsSent = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumRowsExamined >= oldRow.sumRowsExamined {
value := row.sumRowsExamined - oldRow.sumRowsExamined
diffRow.sumRowsExamined = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumSelectScan >= oldRow.sumSelectScan {
value := row.sumSelectScan - oldRow.sumSelectScan
diffRow.sumSelectScan = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumSelectFullJoin >= oldRow.sumSelectFullJoin {
value := row.sumSelectFullJoin - oldRow.sumSelectFullJoin
diffRow.sumSelectFullJoin = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumNoIndexUsed >= oldRow.sumNoIndexUsed {
value := row.sumNoIndexUsed - oldRow.sumNoIndexUsed
diffRow.sumNoIndexUsed = value
if value > 0 {
isChange = true
}
} else {
continue
}
if row.sumNoGoodIndexUsed >= oldRow.sumNoGoodIndexUsed {
value := row.sumNoGoodIndexUsed - oldRow.sumNoGoodIndexUsed
diffRow.sumNoGoodIndexUsed = value
if value > 0 {
isChange = true
}
} else {
continue
}
// No changes, no metric collected
if !isChange {
continue
}
metricRows = append(metricRows, diffRow)
} else {
continue
}
}
return metricRows, newDbmCache
}
package problem023
import "math"
type queue struct {
innerSlice []boardState
}
type queueEmpty struct{}
func (err queueEmpty) Error() string {
return "queue empty"
}
func newQueue(length, capacity int) *queue {
	return &queue{innerSlice: make([]boardState, length, capacity)}
}
func (q *queue) enqueue(item boardState) {
q.innerSlice = append(q.innerSlice, item)
}
func (q *queue) dequeue() (boardState, error) {
if len(q.innerSlice) > 0 {
item := q.innerSlice[0]
q.innerSlice = q.innerSlice[1:]
return item, nil
} else {
return boardState{}, queueEmpty{}
}
}
type coordinate struct {
column int
row int
}
type board [][]bool
func (thisBoard board) deepCopy() board {
newBoard := make([][]bool, len(thisBoard), len(thisBoard))
for i, column := range thisBoard {
newBoard[i] = make([]bool, len(column), len(column))
for j, row := range column {
newBoard[i][j] = row
}
}
return newBoard
}
type boardState struct {
maze board
steps int
start coordinate
end coordinate
}
func Run(maze [][]bool, startColumn int, startRow int, endColumn int, endRow int) int {
q := newQueue(0, len(maze[0])*len(maze))
q.enqueue(boardState{
maze: board(maze).deepCopy(),
steps: 0,
start: coordinate{column: startColumn, row: startRow},
end: coordinate{column: endColumn, row: endRow},
})
minimumSteps := math.MaxInt32
for state, err := q.dequeue(); err == nil; state, err = q.dequeue() {
switch {
case state.steps >= len(state.maze)*len(state.maze[0]):
fallthrough
case state.start.column >= len(state.maze) || state.start.column < 0:
fallthrough
case state.start.row >= len(state.maze[state.start.column]) || state.start.row < 0:
fallthrough
case state.maze[state.start.column][state.start.row]:
continue
case state.start.column == state.end.column && state.start.row == state.end.row:
if minimumSteps > state.steps {
minimumSteps = state.steps
}
default:
state.maze[state.start.column][state.start.row] = true
state.steps++
q.enqueue(boardState{
maze: state.maze.deepCopy(),
steps: state.steps,
start: coordinate{column: state.start.column - 1, row: state.start.row},
end: coordinate{column: state.end.column, row: state.end.row},
})
q.enqueue(boardState{
maze: state.maze.deepCopy(),
steps: state.steps,
start: coordinate{column: state.start.column, row: state.start.row - 1},
end: coordinate{column: state.end.column, row: state.end.row},
})
q.enqueue(boardState{
maze: state.maze.deepCopy(),
steps: state.steps,
start: coordinate{column: state.start.column, row: state.start.row + 1},
end: coordinate{column: state.end.column, row: state.end.row},
})
q.enqueue(boardState{
maze: state.maze.deepCopy(),
steps: state.steps,
start: coordinate{column: state.start.column + 1, row: state.start.row},
end: coordinate{column: state.end.column, row: state.end.row},
})
}
}
return minimumSteps
}
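The search above copies the entire board for every enqueued state; the same shortest-path question can be answered with a standard BFS that marks cells visited in place. A standalone sketch (illustrative names, standard library only):

```go
package main

import "fmt"

func main() {
	// false = open cell, true = wall, indexed [column][row] as in the
	// package's board type. Walls at (1,0) and (1,1) force a detour.
	maze := [][]bool{
		{false, false, false},
		{true, true, false},
		{false, false, false},
	}
	type pt struct{ c, r int }
	start, end := pt{0, 0}, pt{2, 0}

	// Plain slice-backed FIFO of (cell, steps) pairs.
	type state struct {
		p     pt
		steps int
	}
	queue := []state{{start, 0}}
	maze[start.c][start.r] = true // mark the start as visited
	for len(queue) > 0 {
		cur := queue[0]
		queue = queue[1:]
		if cur.p == end {
			fmt.Println(cur.steps) // 6
			return
		}
		for _, n := range []pt{{cur.p.c - 1, cur.p.r}, {cur.p.c + 1, cur.p.r}, {cur.p.c, cur.p.r - 1}, {cur.p.c, cur.p.r + 1}} {
			if n.c < 0 || n.c >= len(maze) || n.r < 0 || n.r >= len(maze[n.c]) || maze[n.c][n.r] {
				continue
			}
			maze[n.c][n.r] = true // visit each cell at most once
			queue = append(queue, state{n, cur.steps + 1})
		}
	}
	fmt.Println(-1) // unreachable target
}
```

Because each cell is enqueued at most once, the first time the end cell is dequeued its step count is already minimal, so no `minimumSteps` tracking is needed.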
package util
import (
"github.com/uncharted-distil/distil-compute/primitive/compute"
)
const (
// Accuracy identifies model metric based on nearness to the original result.
Accuracy = "accuracy"
// F1 identifies model metric based on precision and recall
F1 = "f1"
// F1Micro identifies model metric based on precision and recall
F1Micro = "f1Micro"
// F1Macro identifies model metric based on precision and recall.
F1Macro = "f1Macro"
// MeanAbsoluteError identifies model metric based on
MeanAbsoluteError = "meanAbsoluteError"
// MeanSquaredError identifies model metric based on the quality of the estimator.
MeanSquaredError = "meanSquaredError"
// NormalizedMutualInformation identifies model metric based on the relationship between variables.
NormalizedMutualInformation = "normalizedMutualInformation"
// RocAuc identifies model metric based on
RocAuc = "rocAuc"
// RocAucMicro identifies model metric based on
RocAucMicro = "rocAucMicro"
// RocAucMacro identifies model metric based on
RocAucMacro = "rocAucMacro"
// RootMeanSquaredError identifies model metric based on
RootMeanSquaredError = "rootMeanSquaredError"
// RootMeanSquaredErrorAvg identifies model metric based on
RootMeanSquaredErrorAvg = "rootMeanSquaredErrorAvg"
// RSquared identifies model metric based on
RSquared = "rSquared"
)
// MetricID uniquely identifies model metric methods
type MetricID string
// Metric defines the ID, display name and description for various model metrics
type Metric struct {
ID MetricID
DisplayName string
Description string
}
var (
allModelLabels = map[string]string{
Accuracy: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(Accuracy)),
F1: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(F1)),
F1Macro: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(F1Macro)),
F1Micro: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(F1Micro)),
RocAuc: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RocAuc)),
RocAucMacro: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RocAucMacro)),
RocAucMicro: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RocAucMicro)),
MeanAbsoluteError: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(MeanAbsoluteError)),
MeanSquaredError: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(MeanSquaredError)),
NormalizedMutualInformation: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(NormalizedMutualInformation)),
RootMeanSquaredError: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RootMeanSquaredError)),
RootMeanSquaredErrorAvg: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RootMeanSquaredErrorAvg)),
RSquared: compute.GetMetricLabel(compute.ConvertProblemMetricToTA2(RSquared)),
}
// AllModelMetrics defines a list of model scoring metrics
AllModelMetrics = map[string]Metric{
Accuracy: {
Accuracy,
allModelLabels[Accuracy],
allModelLabels[Accuracy] + " scores the result based only on the percentage of correct predictions.",
},
F1: {
F1,
allModelLabels[F1],
allModelLabels[F1] + " scoring averages true positives, false negatives and false positives for binary classifications, balancing precision and recall.",
},
F1Macro: {
F1Macro,
allModelLabels[F1Macro],
"F1 Macro scoring averages true positives, false negatives and false positives for all multi-class classification options, balancing precision and recall.",
},
F1Micro: {
F1Micro,
allModelLabels[F1Micro],
			allModelLabels[F1Micro] + " scoring averages true positives, false negatives and false positives for each multi-class classification option, balancing precision and recall.",
},
RocAuc: {
RocAuc,
allModelLabels[RocAuc],
			allModelLabels[RocAuc] + " scoring compares the relationship between inputs and the result for binary classifications.",
},
RocAucMacro: {
RocAucMacro,
allModelLabels[RocAucMacro],
allModelLabels[RocAucMacro] + " scoring compares the relationship between inputs and the result for all multi-class classification options.",
},
RocAucMicro: {
RocAucMicro,
allModelLabels[RocAucMicro],
			allModelLabels[RocAucMicro] + " scoring compares the relationship between inputs and the result for each multi-class classification option.",
},
MeanAbsoluteError: {
MeanAbsoluteError,
allModelLabels[MeanAbsoluteError],
allModelLabels[MeanAbsoluteError] + " measures the average magnitude of errors in a set of predictions.",
},
MeanSquaredError: {
MeanSquaredError,
allModelLabels[MeanSquaredError],
allModelLabels[MeanSquaredError] + " measures the quality of an estimator where values closer to 0 are better.",
},
NormalizedMutualInformation: {
NormalizedMutualInformation,
allModelLabels[NormalizedMutualInformation],
allModelLabels[NormalizedMutualInformation] + " scores the relationship / lack of entropy between variables where 0 is no relationship and 1 is a strong relationship.",
},
RootMeanSquaredError: {
RootMeanSquaredError,
allModelLabels[RootMeanSquaredError],
allModelLabels[RootMeanSquaredError] + " measures the quality of an estimator and the average magnitude of the error.",
},
RootMeanSquaredErrorAvg: {
RootMeanSquaredErrorAvg,
allModelLabels[RootMeanSquaredErrorAvg],
allModelLabels[RootMeanSquaredErrorAvg] + " measures the quality of an estimator and the average magnitude of the error averaged across classifcations options.",
},
RSquared: {
RSquared,
allModelLabels[RSquared],
allModelLabels[RSquared] + " measures the relationship between predictions and their inputs where values closer to 1 suggest a strong correlation.",
},
}
	// TaskMetricMap maps tasks to metrics.
TaskMetricMap = map[string]map[string]Metric{
compute.BinaryTask: {
Accuracy: AllModelMetrics[Accuracy],
F1: AllModelMetrics[F1],
RocAuc: AllModelMetrics[RocAuc],
},
compute.MultiClassTask: {
Accuracy: AllModelMetrics[Accuracy],
			F1Macro:     AllModelMetrics[F1Macro],
			F1Micro:     AllModelMetrics[F1Micro],
RocAucMacro: AllModelMetrics[RocAucMacro],
RocAucMicro: AllModelMetrics[RocAucMicro],
},
compute.SemiSupervisedTask: {
Accuracy: AllModelMetrics[Accuracy],
			F1Micro:     AllModelMetrics[F1Micro],
RocAucMicro: AllModelMetrics[RocAucMicro],
},
compute.ClassificationTask: {
Accuracy: AllModelMetrics[Accuracy],
F1: AllModelMetrics[F1],
			F1Macro:     AllModelMetrics[F1Macro],
			F1Micro:     AllModelMetrics[F1Micro],
RocAuc: AllModelMetrics[RocAuc],
RocAucMacro: AllModelMetrics[RocAucMacro],
RocAucMicro: AllModelMetrics[RocAucMicro],
},
compute.RegressionTask: {
MeanAbsoluteError: AllModelMetrics[MeanAbsoluteError],
MeanSquaredError: AllModelMetrics[MeanSquaredError],
RootMeanSquaredError: AllModelMetrics[RootMeanSquaredError],
RootMeanSquaredErrorAvg: AllModelMetrics[RootMeanSquaredErrorAvg],
RSquared: AllModelMetrics[RSquared],
},
}
)
package mat64
import (
"github.com/gonum/blas"
"github.com/gonum/blas/blas64"
)
var (
symDense *SymDense
_ Matrix = symDense
_ Symmetric = symDense
_ RawSymmetricer = symDense
)
const (
ErrUplo = "mat64: blas64.Symmetric not upper"
)
// SymDense is a symmetric matrix that uses Dense storage.
type SymDense struct {
mat blas64.Symmetric
}
// Symmetric represents a symmetric matrix (where the element at {i, j} equals
// the element at {j, i}). Symmetric matrices are always square.
type Symmetric interface {
Matrix
// Symmetric returns the number of rows/columns in the matrix.
Symmetric() int
}
// A RawSymmetricer can return a view of itself as a BLAS Symmetric matrix.
type RawSymmetricer interface {
RawSymmetric() blas64.Symmetric
}
// NewSymDense constructs an n x n symmetric matrix. If len(mat) == n * n,
// mat will be used to hold the underlying data, or if mat == nil, new data will be allocated.
// The underlying data representation is the same as a Dense matrix, except
// the values of the entries in the lower triangular portion are completely ignored.
func NewSymDense(n int, mat []float64) *SymDense {
if n < 0 {
panic("mat64: negative dimension")
}
if mat != nil && n*n != len(mat) {
panic(ErrShape)
}
if mat == nil {
mat = make([]float64, n*n)
}
return &SymDense{blas64.Symmetric{
N: n,
Stride: n,
Data: mat,
Uplo: blas.Upper,
}}
}
func (s *SymDense) Dims() (r, c int) {
return s.mat.N, s.mat.N
}
func (s *SymDense) Symmetric() int {
return s.mat.N
}
// RawSymmetric returns the matrix as a blas64.Symmetric. The returned
// value must be stored in upper triangular format.
func (s *SymDense) RawSymmetric() blas64.Symmetric {
return s.mat
}
func (s *SymDense) isZero() bool {
return s.mat.N == 0
}
func (s *SymDense) AddSym(a, b Symmetric) {
n := a.Symmetric()
if n != b.Symmetric() {
panic(ErrShape)
}
if s.isZero() {
s.mat = blas64.Symmetric{
N: n,
Stride: n,
Data: use(s.mat.Data, n*n),
Uplo: blas.Upper,
}
} else if s.mat.N != n {
panic(ErrShape)
}
if a, ok := a.(RawSymmetricer); ok {
if b, ok := b.(RawSymmetricer); ok {
amat, bmat := a.RawSymmetric(), b.RawSymmetric()
for i := 0; i < n; i++ {
btmp := bmat.Data[i*bmat.Stride+i : i*bmat.Stride+n]
stmp := s.mat.Data[i*s.mat.Stride+i : i*s.mat.Stride+n]
for j, v := range amat.Data[i*amat.Stride+i : i*amat.Stride+n] {
stmp[j] = v + btmp[j]
}
}
return
}
}
for i := 0; i < n; i++ {
stmp := s.mat.Data[i*s.mat.Stride : i*s.mat.Stride+n]
for j := i; j < n; j++ {
stmp[j] = a.At(i, j) + b.At(i, j)
}
}
}
func (s *SymDense) CopySym(a Symmetric) int {
n := a.Symmetric()
n = min(n, s.mat.N)
switch a := a.(type) {
case RawSymmetricer:
amat := a.RawSymmetric()
if amat.Uplo != blas.Upper {
panic(ErrUplo)
}
for i := 0; i < n; i++ {
copy(s.mat.Data[i*s.mat.Stride+i:i*s.mat.Stride+n], amat.Data[i*amat.Stride+i:i*amat.Stride+n])
}
default:
for i := 0; i < n; i++ {
stmp := s.mat.Data[i*s.mat.Stride : i*s.mat.Stride+n]
for j := i; j < n; j++ {
stmp[j] = a.At(i, j)
}
}
}
return n
}
// SymRankOne performs a symmetric rank-one update to the matrix a and stores
// the result in the receiver
// s = a + alpha * x * x'
func (s *SymDense) SymRankOne(a Symmetric, alpha float64, x []float64) {
n := s.mat.N
var w SymDense
if s == a {
w = *s
}
if w.isZero() {
w.mat = blas64.Symmetric{
N: n,
Stride: n,
Uplo: blas.Upper,
Data: use(w.mat.Data, n*n),
}
} else if n != w.mat.N {
panic(ErrShape)
}
if s != a {
w.CopySym(a)
}
if len(x) != n {
panic(ErrShape)
}
blas64.Syr(alpha, blas64.Vector{Inc: 1, Data: x}, w.mat)
*s = w
return
}
// RankTwo performs a symmetric rank-two update to the matrix a and stores
// the result in the receiver
// s = a + alpha * (x * y' + y * x')
func (s *SymDense) RankTwo(a Symmetric, alpha float64, x, y []float64) {
n := s.mat.N
var w SymDense
if s == a {
w = *s
}
if w.isZero() {
w.mat = blas64.Symmetric{
N: n,
Stride: n,
Uplo: blas.Upper,
Data: use(w.mat.Data, n*n),
}
} else if n != w.mat.N {
panic(ErrShape)
}
if s != a {
w.CopySym(a)
}
if len(x) != n {
panic(ErrShape)
}
if len(y) != n {
panic(ErrShape)
}
blas64.Syr2(alpha, blas64.Vector{Inc: 1, Data: x}, blas64.Vector{Inc: 1, Data: y}, w.mat)
*s = w
return
}
package graphs
type Graph struct {
nodes map[string]bool // Set of nodes
// Map of neighbours. Key is source node and value another map where key is destination node and value is cost
edges map[string]map[string]int
	// reverseEdges maps a destination node to the set of source nodes that have an edge pointing to it
reverseEdges map[string]map[string]bool
}
func New() *Graph {
return &Graph{
nodes: make(map[string]bool),
edges: make(map[string]map[string]int),
reverseEdges: make(map[string]map[string]bool),
}
}
// AddNode initializes the edge maps for a new node. If the node already exists it does nothing.
func (g *Graph) AddNode(name string) {
g.nodes[name] = true
if g.edges[name] == nil {
g.edges[name] = make(map[string]int)
}
if g.reverseEdges[name] == nil {
g.reverseEdges[name] = make(map[string]bool)
}
}
func (g *Graph) DoesNodeExists(name string) bool {
return g.nodes[name]
}
// Caveat: This function will add nodes if not present already
func (g *Graph) AddEdge(source, destination string, cost int) {
g.AddNode(source)
g.AddNode(destination)
g.edges[source][destination] = cost
g.reverseEdges[destination][source] = true
}
// RemoveNode removes a node from the graph.
// This also deletes all edges coming into the node and all edges going from it.
func (g *Graph) RemoveNode(name string) {
	delete(g.nodes, name) // Delete from set of nodes
	for k := range g.reverseEdges[name] {
		delete(g.edges[k], name) // Delete all edges which are pointing to node
	}
	delete(g.reverseEdges, name)
	for dst := range g.edges[name] {
		delete(g.reverseEdges[dst], name) // Unlink outgoing edges from the reverse index
	}
	delete(g.edges, name) // Delete all edges going from node
}
func (g *Graph) RemoveEdge(source, destination string) {
delete(g.edges[source], destination)
delete(g.reverseEdges[destination], source)
}
// Return all edges going from node name
// Second return value indicates whether node is present in graph or not
func (g *Graph) GetEdges(name string) (map[string]int, bool) {
return g.edges[name], g.nodes[name]
}
func (g *Graph) BreadthFirstTraversal(startNode string) {
// TODO (Dheerendra): Implement this properly
}
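The reason for keeping the reverse index is that removing a node only has to walk its incoming sources instead of scanning every node's edge map. A standalone sketch with plain maps (the names here are illustrative, not part of the package):

```go
package main

import "fmt"

func main() {
	// Forward index: edges[src][dst] = cost.
	// Reverse index: reverse[dst] is the set of sources pointing at dst.
	edges := map[string]map[string]int{}
	reverse := map[string]map[string]bool{}

	addEdge := func(src, dst string, cost int) {
		if edges[src] == nil {
			edges[src] = map[string]int{}
		}
		if reverse[dst] == nil {
			reverse[dst] = map[string]bool{}
		}
		edges[src][dst] = cost
		reverse[dst][src] = true
	}

	addEdge("a", "b", 3)
	addEdge("c", "b", 1)

	// Removing "b" only walks its incoming sources, not every node.
	for src := range reverse["b"] {
		delete(edges[src], "b")
	}
	delete(reverse, "b")

	fmt.Println(len(edges["a"]), len(edges["c"])) // 0 0
}
```

The trade-off is the extra bookkeeping on every `AddEdge`/`RemoveEdge`, paid back when removals are frequent or node degrees are small relative to the graph size.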
package suvec
import (
"log"
)
type Type int
const (
Matrix64 Type = iota
Box
Point3d
ErrorMatrix
)
type ErrorCode int
const (
MatchingDimensions = iota
NotSquareMatrix
NotImplemented
NotAScalarType
)
type Mat struct {
typ Type
rows, cols int
data []float64
}
/** ehandle Internal error handler
"operation" failed with op1, and op2 (optional)
**/
func ehandle(e ErrorCode, op1, op2 *Mat, operation string, hint string) *Mat {
switch e {
case MatchingDimensions:
log.Println("Error matching dimensions ", operation)
case NotSquareMatrix:
log.Println("Matrix not square ", operation)
}
return NewErrorMatrix()
}
/** New creates a new Vector of float64
len Length, it's a one-row, len-columns vector by default
vals initial values
Values are initialized to 0 by default.
If vals are given they are filled from left to right;
if vals is larger than len then the remaining values are ignored.
**/
func New(rows, cols int, typ Type) *Mat {
m := new(Mat)
m.typ = typ
m.rows = rows
m.cols = cols
switch m.typ {
case Matrix64:
m.data = make([]float64, rows*cols)
case Box:
m.data = make([]float64, 6)
case Point3d:
m.data = make([]float64, 3)
}
m.Zero()
return m
}
/** NewMatrix is just short for New(n, m, Matrix64)
**/
func NewMatrix(r, c int) *Mat {
return New(r, c, Matrix64)
}
/** NewVector creates a row-Vector with n elements
Short for : New(1, n, Matrix64).Init(vals)
**/
func NewVector(len int, vals ...float64) *Mat {
m := New(1, len, Matrix64)
for i, v := range vals {
if i < m.rows*m.cols {
m.data[i] = v
}
}
return m
}
func NewScalar(val float64) *Mat {
m := New(1, 1, Matrix64)
m.Set(0, 0, val)
return m
}
/** NewErrorMatrix returns a valid Matrix, but one that cannot be used for further computation **/
func NewErrorMatrix() *Mat {
m := New(0, 0, ErrorMatrix)
return m
}
/** NewBox creates a 3x2 matrix for a geometrical representation of a box */
func NewBox(x1, y1, z1, x2, y2, z2 float64) *Mat {
m := New(3, 2, Box)
m.data[0] = x1
m.data[1] = y1
m.data[2] = z1
m.data[3] = x2
m.data[4] = y2
m.data[5] = z2
return m
}
/** Init initializes the values of the matrix in row-order
**/
func (m *Mat) Init(vals ...float64) *Mat {
m.Zero()
for i, v := range vals {
if i < m.rows*m.cols {
m.data[i] = v
}
}
return m
}
/** Duplicate creates a new Instance of the same type and copies the values
**/
func Duplicate(val *Mat) *Mat {
m := new(Mat)
m.typ = val.typ
m.rows = val.rows
m.cols = val.cols
m.data = make([]float64, val.rows*val.cols)
for i := 0; i < m.rows*m.cols; i++ {
m.data[i] = val.data[i]
}
return m
}
/** Create a copy **/
func (m *Mat) Clone() *Mat {
return Duplicate(m)
}
func (m *Mat) SetTo(val *Mat) *Mat {
if m.rows == val.rows && m.cols == val.cols {
for i := 0; i < m.rows*m.cols; i++ {
m.data[i] = val.data[i]
}
} else {
		// error handling: not implemented yet
}
return m
}
func (m *Mat) Zero() *Mat {
for i := 0; i < m.rows; i++ {
for j := 0; j < m.cols; j++ {
m.data[i*m.cols+j] = 0
}
}
return m
}
func Zeros(rows, cols int) *Mat {
z := New(rows, cols, Matrix64)
for i := 0; i < z.rows; i++ {
for j := 0; j < z.cols; j++ {
z.data[i*z.cols+j] = 0
}
}
return z
}
func (m *Mat) Ones() *Mat {
for i := 0; i < m.rows; i++ {
for j := 0; j < m.cols; j++ {
m.data[i*m.cols+j] = 1
}
}
return m
}
func Ones(rows, cols int) *Mat {
z := New(rows, cols, Matrix64)
for i := 0; i < z.rows; i++ {
for j := 0; j < z.cols; j++ {
z.data[i*z.cols+j] = 1
}
}
return z
}
func (m *Mat) Identity() *Mat {
if m.rows != m.cols {
return ehandle(NotSquareMatrix, m, nil, "Identity", "")
}
m.Zero()
for i := 0; i < m.rows; i++ {
m.data[i*m.cols+i] = 1
}
return m
}
func (m *Mat) Get(r, c int) float64 {
return m.data[r*m.cols+c]
}
func (m *Mat) Cols() int {
return m.cols
}
func (m *Mat) Rows() int {
	return m.rows
}
func (m *Mat) Set(r, c int, val float64) *Mat {
m.data[r*m.cols+c] = val
return m
}
func (m *Mat) IsScalar() bool {
if m.rows == 1 && m.cols == 1 {
return true
}
return false
}
func (m *Mat) IsVector() bool {
if m.rows == 1 && m.cols > 1 {
return true
}
if m.cols == 1 && m.rows > 1 {
return true
}
return false
}
func (m *Mat) IsRowVector() bool {
if m.rows == 1 && m.cols > 1 {
return true
}
return false
}
func (m *Mat) IsColumnVector() bool {
if m.cols == 1 && m.rows > 1 {
return true
}
return false
}
func (m *Mat) IsMatrix() bool {
if m.rows > 1 && m.cols > 1 {
return true
}
return false
}
func (m *Mat) IsSameSize(val *Mat) bool {
if m.rows != val.rows || m.cols != val.cols {
return false
}
return true
}
/** T Transposes a Matrix **/
func (m *Mat) T() *Mat {
res := New(m.cols, m.rows, m.typ)
for i := 0; i < m.rows; i++ {
for j := 0; j < m.cols; j++ {
res.data[j*m.rows+i] = m.data[i*m.cols+j]
}
}
return res
}
/** Length of largest array dimension **/
func (m *Mat) Length() *Mat {
	log.Panic("Don't use this anymore")
return nil
}
/** Float32 returns the value of a Scalar as float32 Datatype.
- The input must be a scalar
**/
func (m *Mat) Float32() float32 {
if m.IsScalar() {
return float32(m.data[0])
}
ehandle(NotAScalarType, m, nil, "Float32", "")
return 0
}
/** Float64 returns the value of a Scalar as float64 Datatype.
- The input must be a scalar
**/
func (m *Mat) Float64() float64 {
if m.IsScalar() {
return float64(m.data[0])
}
ehandle(NotAScalarType, m, nil, "Float64", "")
return 0
}
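The row-major index arithmetic behind `T()` — the source element at `i*cols+j` lands at destination index `j*rows+i` — can be checked in a standalone sketch (standard library only):

```go
package main

import "fmt"

func main() {
	// A 2x3 matrix stored row-major:
	//   1 2 3
	//   4 5 6
	rows, cols := 2, 3
	src := []float64{1, 2, 3, 4, 5, 6}

	// The transpose is 3x2, so the destination has `rows` columns.
	dst := make([]float64, rows*cols)
	for i := 0; i < rows; i++ {
		for j := 0; j < cols; j++ {
			dst[j*rows+i] = src[i*cols+j]
		}
	}

	fmt.Println(dst) // [1 4 2 5 3 6]
}
```

Reading `dst` two elements at a time gives the rows of the transpose: (1,4), (2,5), (3,6).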
package trail
import (
"fmt"
"strconv"
)
// RFID15 is a 15 decimal implant number based on ISO 11784 & 11785.
// The data is packed in the 48 least significant bits.
type RFID15 uint64
// ParseRFID15 returns the parsed value if, and only if, s is syntactically correct.
func ParseRFID15(s string) RFID15 {
if len(s) != 15 {
return 0
}
country, err := strconv.ParseUint(s[:3], 10, 10)
if err != nil {
return 0
}
id, err := strconv.ParseUint(s[3:], 10, 38)
if err != nil {
return 0
}
return RFID15(country<<38 | id)
}
// String returns the 15 decimal code.
func (id RFID15) String() string {
return fmt.Sprintf("%03d%012d", id>>38, id&(1<<38-1))
}
// Manufacturer returns the 3 decimal code from the 10-bit country code field.
// See https://www.service-icar.com/tables/Tabella3.php
func (id RFID15) Manufacturer() string {
return id.String()[:3]
}
// ID returns the 12 decimal code from the 38-bit ID field.
func (id RFID15) ID() string {
return id.String()[3:]
}
// UELN (Universal Equine Life Number) is a horse-specific identification number
// that can be used to identify each horse individual.
// See http://www.ueln.net/ueln-presentation/rules-of-attribution-of-the-ueln/
type UELN [15]byte
// ParseUELN parses and normalizes [uppercasing] s into n.
// The return is OK if, and only if, s is syntactically correct.
func ParseUELN(s string) (n UELN, ok bool) {
if len(s) != len(n) {
return
}
for i := range n {
c := s[i]
switch {
case c <= '9' && c >= '0':
break
case i > 5 && c <= 'Z' && c >= 'A':
// letters allowed in national ID
break
case i > 5 && c <= 'z' && c >= 'a':
// convert to uppercase
c = c - ('a' - 'A')
default:
return UELN{}, false
}
n[i] = c
}
ok = true
return
}
// String returns the 15 character code.
func (n UELN) String() string {
return string(n[:])
}
// Country returns the 3 decimal ISO-3166 country code
// of the database which registered the foal at birth.
func (n UELN) Country() string {
return string(n[:3])
}
// Database returns the 3 decimal code of the database
// where the horse has been registered at birth.
func (n UELN) Database() string {
return string(n[3:6])
}
// NationalID returns the 9 character horse national
// identification number given by the stud-book of birth.
func (n UELN) NationalID() string {
return string(n[6:])
}
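The 48-bit packing that `RFID15` uses — a 10-bit country code in the high bits above a 38-bit ID — round-trips with a shift and a mask. A standalone sketch (the sample values are arbitrary; note a 38-bit field cannot hold every 12-digit number, only values below 2^38):

```go
package main

import "fmt"

func main() {
	// RFID15 layout: 10-bit country code << 38, 38-bit ID in the low bits.
	country := uint64(276)     // 3 decimal digits, must fit in 10 bits
	id := uint64(123456789012) // 12 decimal digits, must fit in 38 bits

	packed := country<<38 | id

	// Unpacking reverses the shift and masks off the low 38 bits.
	fmt.Println(packed>>38, packed&(1<<38-1)) // 276 123456789012

	// Formatting zero-pads each field back to its decimal width,
	// exactly as String() does.
	fmt.Printf("%03d%012d\n", packed>>38, packed&(1<<38-1)) // 276123456789012
}
```

This is why `ParseRFID15` uses `ParseUint(s[3:], 10, 38)`: the bit-size argument rejects IDs that would overflow the 38-bit field instead of silently corrupting the country code.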
package dorp
import (
"errors"
"fmt"
"io"
"golang.org/x/crypto/nacl/secretbox"
)
// A SetMessage is the Go representation of the JSON message
// sent to set states
type SetMessage struct {
DoorState string
LightState string
}
// A State is a binary condition of the door or lights.
type State byte
//go:generate stringer -type=State
// Positive and Negative are the two possible states
const (
Negative State = iota
Positive
)
// ErrWrongNumberOfStates may be returned if a function takes a slice of states
// and got the wrong number. This probably will only occur deserializing network data.
var ErrWrongNumberOfStates = errors.New("incorrect state count")
// GenerateNonce creates a 24 byte nonce from the source of randomness rand
func GenerateNonce(rand io.Reader) ([24]byte, error) {
var nonce [24]byte
var n int
for i := 0; i < 3; i++ {
n, _ = rand.Read(nonce[:])
if n == 24 {
break
}
}
if n != 24 {
return nonce, fmt.Errorf("encrypt: unable to read 24 random bytes for nonce (read %d)", n)
}
return nonce, nil
}
// KeyToByteArray converts a key to the [32]bytes required by nacl.secretbox
func KeyToByteArray(key string) ([32]byte, error) {
var k [32]byte
if len(key) != 32 {
return k, fmt.Errorf("key must be 32 bytes (characters) long")
}
n := copy(k[:], []byte(key))
if n != 32 {
return k, fmt.Errorf("copying key failed")
}
return k, nil
}
// ProcessNonceMessage takes the message from the server and the shared key
// and returns the next nonce the server expects
func ProcessNonceMessage(message *[64]byte, key *[32]byte) (*[24]byte, error) {
var nonce [24]byte
copy(nonce[:], message[64-24:])
var nextNonce []byte
var ok bool
nextNonce, ok = secretbox.Open(nextNonce, message[:64-24], &nonce, key)
if !ok {
return nil, fmt.Errorf("unable to open box")
}
n := copy(nonce[:], nextNonce)
if n != 24 {
return nil, fmt.Errorf("received nonce has incorrect length")
}
return &nonce, nil
}
// Thought of a haiku
// I may as well leave it here
// To be found by you | dorp.go | 0.604866 | 0.407569 | dorp.go | starcoder |
package golist
import (
"fmt"
"math/rand"
"sort"
"time"
)
// SliceByte is a slice of type byte.
type SliceByte struct {
data []byte
}
// NewSliceByte returns a pointer to a new SliceByte initialized with the specified elements.
func NewSliceByte(elems ...byte) *SliceByte {
s := new(SliceByte)
s.data = make([]byte, len(elems))
for i := 0; i < len(elems); i++ {
s.data[i] = elems[i]
}
return s
}
// Append adds the elements to the end of SliceByte.
func (s *SliceByte) Append(elems ...byte) *SliceByte {
if s == nil {
return nil
}
s.data = append(s.data, elems...)
return s
}
// Prepend adds the elements to the beginning of SliceByte.
func (s *SliceByte) Prepend(elems ...byte) *SliceByte {
if s == nil {
return nil
}
s.data = append(elems, s.data...)
return s
}
// At returns the element in SliceByte at the specified index.
func (s *SliceByte) At(index int) byte {
if len(s.data) == 0 {
panic("SliceByte does not contain any elements")
}
if index >= len(s.data) || index < 0 {
panic(fmt.Sprintf("index %d outside the range of SliceByte", index))
}
return s.data[index]
}
// Set sets the element of SliceByte at the specified index.
func (s *SliceByte) Set(index int, elem byte) *SliceByte {
if s == nil {
return nil
}
s.data[index] = elem
return s
}
// Insert inserts the elements into SliceByte at the specified index.
func (s *SliceByte) Insert(index int, elems ...byte) *SliceByte {
if s == nil {
return nil
}
// Grow the slice by the number of elements (using the zero value)
var zero byte
for i := 0; i < len(elems); i++ {
s.data = append(s.data, zero)
}
// Use copy to move the upper part of the slice out of the way and open a hole.
copy(s.data[index+len(elems):], s.data[index:])
// Store the new values
for i := 0; i < len(elems); i++ {
s.data[index+i] = elems[i]
}
// Return the result.
return s
}
// Remove removes the element from SliceByte at the specified index.
func (s *SliceByte) Remove(index int) *SliceByte {
if s == nil {
return nil
}
s.data = append(s.data[:index], s.data[index+1:]...)
return s
}
// Filter removes elements from SliceByte that do not satisfy the filter function.
func (s *SliceByte) Filter(fn func(elem byte) bool) *SliceByte {
if s == nil {
return nil
}
data := s.data[:0]
for _, elem := range s.data {
if fn(elem) {
data = append(data, elem)
}
}
s.data = data
return s
}
// Transform modifies each element of SliceByte according to the specified function.
func (s *SliceByte) Transform(fn func(elem byte) byte) *SliceByte {
if s == nil {
return nil
}
for i, elem := range s.data {
s.data[i] = fn(elem)
}
return s
}
// Unique modifies SliceByte to keep only the first occurrence of each element (removing any duplicates).
func (s *SliceByte) Unique() *SliceByte {
if s == nil {
return nil
}
seen := make(map[byte]struct{})
data := s.data[:0]
for _, elem := range s.data {
if _, ok := seen[elem]; !ok {
data = append(data, elem)
seen[elem] = struct{}{}
}
}
s.data = data
return s
}
// Reverse reverses the order of the elements of SliceByte.
func (s *SliceByte) Reverse() *SliceByte {
if s == nil {
return nil
}
for i := len(s.data)/2 - 1; i >= 0; i-- {
opp := len(s.data) - 1 - i
s.Swap(i, opp)
}
return s
}
// Shuffle randomly shuffles the order of the elements in SliceByte.
func (s *SliceByte) Shuffle(seed int64) *SliceByte {
if s == nil {
return nil
}
if seed == 0 {
seed = time.Now().UnixNano()
}
r := rand.New(rand.NewSource(seed))
r.Shuffle(s.Count(), s.Swap)
return s
}
// Data returns the raw elements of SliceByte.
func (s *SliceByte) Data() []byte {
if s == nil {
return nil
}
return s.data
}
// Count returns the number of elements in SliceByte.
func (s *SliceByte) Count() int {
return len(s.data)
}
// Len returns the number of elements in SliceByte (alias for Count).
func (s *SliceByte) Len() int {
return s.Count()
}
// Swap swaps the elements in SliceByte specified by the indices i and j.
func (s *SliceByte) Swap(i, j int) {
s.data[i], s.data[j] = s.data[j], s.data[i]
}
// Less returns true if the SliceByte element at index i is less than the element at index j.
func (s *SliceByte) Less(i, j int) bool {
return s.data[i] < s.data[j]
}
// Sort sorts the elements of SliceByte in increasing order.
func (s *SliceByte) Sort() *SliceByte {
if s == nil {
return nil
}
sort.Sort(s)
return s
}
// Min returns the smallest (least ordered) element in SliceByte.
func (s *SliceByte) Min() byte {
if len(s.data) == 0 {
panic("SliceByte does not contain any elements")
}
// start with the first value
min := s.data[0]
for _, elem := range s.data[1:] {
if elem < min {
min = elem
}
}
return min
}
// Max returns the largest (greatest ordered) element in SliceByte.
func (s *SliceByte) Max() byte {
if len(s.data) == 0 {
panic("SliceByte does not contain any elements")
}
// start with the first value
max := s.data[0]
for _, elem := range s.data[1:] {
if elem > max {
max = elem
}
}
return max
}
// Clone performs a deep copy of SliceByte and returns it
func (s *SliceByte) Clone() *SliceByte {
if s == nil {
return nil
}
s2 := new(SliceByte)
s2.data = make([]byte, len(s.data))
copy(s2.data, s.data)
return s2
}
// Equal returns true if the SliceByte is logically equivalent to the specified SliceByte.
func (s *SliceByte) Equal(s2 *SliceByte) bool {
if s == s2 {
return true
}
if s == nil || s2 == nil {
return false // has to be false because s == s2 tested earlier
}
if len(s.data) != len(s2.data) {
return false
}
for i, elem := range s.data {
if elem != s2.data[i] {
return false
}
}
return true
} | slice_byte.go | 0.807916 | 0.557002 | slice_byte.go | starcoder |
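Insert uses a grow-then-copy pattern to open a hole in the backing array; the same trick works on any plain slice (a sketch, not part of the package):

```go
package main

import "fmt"

// insertBytes mirrors SliceByte.Insert's grow-then-copy pattern on a plain slice.
func insertBytes(s []byte, index int, elems ...byte) []byte {
	s = append(s, make([]byte, len(elems))...) // grow by len(elems) zero bytes
	copy(s[index+len(elems):], s[index:])      // shift the tail right to open a hole
	copy(s[index:], elems)                     // fill the hole with the new elements
	return s
}

func main() {
	fmt.Println(string(insertBytes([]byte("abef"), 2, 'c', 'd'))) // abcdef
}
```

Go's built-in copy handles the overlapping source and destination like memmove, so the tail shifts safely.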
package main
var (
version = "devel-release"
commit = "unknown-hash"
date = "unknown-date"
)
const (
gomodShort = "A tool to visualise and analyse a Go project's dependency graph."
gomodLong = `A CLI tool for interacting with your Go project's dependency graph on various
levels. See the online documentation or the '--help' for specific sub-commands to discover all the
supported workflows.
NB: The '--verbose' flag available on each command takes an optional list of strings that allow you
to only get verbose output for specific pieces of logic. The available domains are:
* init -> Code that runs before any actual dependency processing happens.
* graph -> Interactions on the underlying graph representation used by 'gomod'.
* modinfo -> Retrieval of information about all modules involved in the dependency graph.
* pkginfo -> Retrieval of information about all packages involved in the dependency graph.
* moddeps -> Retrieval of information about the module-level dependency graph (e.g. versions).
* parser -> Parsing of user-specified dependency queries.
* query -> Execution of a parsed user-query on the dependency graph.
* printer -> Printing of the queried dependency graph.
* all -> Covers all domains above.
Without any arguments the behaviour defaults to enabling verbosity on all domains, the
equivalent of passing 'all' as argument.
`
graphShort = "Visualise the dependency graph of a Go module."
graphLong = `Generate a visualisation of the dependency network used by the code in your Go
module.
The command requires a query to be passed to determine what part of the graph
should be printed. The query language itself supports the following syntax:
- Exact or prefix path queries: foo.com/bar or foo.com/bar/...
- Inclusion of test-only dependencies: test(foo.com/bar)
- Dependency queries: 'deps(foo.com/bar)' or 'rdeps(foo.com/bar)'
- Depth-limited variants of the above: 'deps(foo.com/bar, 5)'
- Recursive removal of single-parent leaf-nodes: 'shared(foo.com/bar)'
- Various set operations: X + Y, X - Y, X inter Y, X delta Y.
An example query:
gomod graph -p 'deps(foo.com/bar/...) inter deps(test(test.io/pkg/tool))'
The generated graph is colour and format coded:
- Each module, or group of packages belonging to the same module, has a distinct
colour.
- Test-only dependencies are recognisable by the use of a lighter
colour-palette.
- Test-only edges are recognisable by a light blue colour.
- Edges reflecting indirect module dependencies are marked with dashed instead
of continuous lines.
Other visual aspects (when run through the 'dot' tool) can be tuned with the
'--style' flag. You can specify any formatting options as
'<option>=<value>[,<option>=<value>]'
out of the following list:
- 'scale_nodes': one of 'true' or 'false' (default 'false'). This will scale the
size of each node of the graph based on the number of inbound
and outbound dependencies it has.
- 'cluster': one of 'off', 'shared', 'full' (default 'off'). This option
will generate clusters in the image that force the grouping of
shared dependencies together. The result is a tighter graph of
reduced size with less "holes" but which might have less
visible or understandable edges. When set to 'shared' only
dependencies with a single inbound edge are considered and
clustered according to the commonality of that ancestor. When
set to 'full' any two dependencies that have an identical set
of inbound edges are clustered together.
WARNING: Using the 'cluster' option can dramatically increase
the time required to generate image files, especially
for larger dependency graphs. But it's for the latter
that it can also greatly improve the readability of
the final image.
`
analyseShort = `Analyse the graph of dependencies for this Go module and output interesting
statistics.`
revealShort = "Reveal 'hidden' replace'd modules in your direct and indirect dependencies."
versionShort = "Display the version of the gomod tool."
) | main_strings.go | 0.799951 | 0.437103 | main_strings.go | starcoder |
package common
// AlgorithmID encodes symmetric algorithm parameters for Soter.
type AlgorithmID uint32
const (
algorithmMask = 0xF0000000
algorithmOffset = 28
kdfMask = 0x0F000000
kdfOffset = 24
paddingMask = 0x000F0000
paddingOffset = 16
keyLengthMask = 0x00000FFF
keyLengthOffset = 0
)
// SymmetricAlgorithm indicates symmetric encryption algorithm in AlgorithmID.
type SymmetricAlgorithm int
// Supported SymmetricAlgorithm values.
const (
AesECB SymmetricAlgorithm = 0x1
AesCBC SymmetricAlgorithm = 0x2
AesXTS SymmetricAlgorithm = 0x3
AesGCM SymmetricAlgorithm = 0x4
)
// KeyDerivationFunction indicates key derivation function in AlgorithmID.
type KeyDerivationFunction int
// Supported KeyDerivationFunction values.
const (
NoKDF KeyDerivationFunction = 0x0
PBKDF2HmacSha256 KeyDerivationFunction = 0x1
)
// PaddingAlgorithm indicates padding algorithm in AlgorithmID.
type PaddingAlgorithm int
// Supported PaddingAlgorithm values.
const (
NoPadding PaddingAlgorithm = 0x0
PKCS7Padding PaddingAlgorithm = 0x1
)
// MakeAlgorithmID constructs algorithm ID from components.
func MakeAlgorithmID(algorithm SymmetricAlgorithm, kdf KeyDerivationFunction, padding PaddingAlgorithm, keyBits int) AlgorithmID {
var value uint32
value |= uint32(algorithm) << algorithmOffset
value |= uint32(kdf) << kdfOffset
value |= uint32(padding) << paddingOffset
value |= uint32(keyBits) << keyLengthOffset
return AlgorithmID(value)
}
// Algorithm returns symmetric algorithm component.
func (id AlgorithmID) Algorithm() SymmetricAlgorithm {
return SymmetricAlgorithm((id & algorithmMask) >> algorithmOffset)
}
// KDF returns key derivation function.
func (id AlgorithmID) KDF() KeyDerivationFunction {
return KeyDerivationFunction((id & kdfMask) >> kdfOffset)
}
// Padding returns padding algorithm.
func (id AlgorithmID) Padding() PaddingAlgorithm {
return PaddingAlgorithm((id & paddingMask) >> paddingOffset)
}
// KeyBits returns size of the key in bits.
func (id AlgorithmID) KeyBits() int {
return int((id & keyLengthMask) >> keyLengthOffset)
} | themis/spec/common/soter-alg.go | 0.698535 | 0.402921 | soter-alg.go | starcoder |
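The shift-and-mask layout round-trips cleanly; this standalone sketch re-implements the packing with plain uint32 values for a hypothetical AES-GCM-256 configuration:

```go
package main

import "fmt"

// makeAlgorithmID mirrors the field layout from soter-alg.go: algorithm in
// bits 28-31, KDF in bits 24-27, padding in bits 16-19, key length in bits 0-11.
func makeAlgorithmID(alg, kdf, padding uint32, keyBits int) uint32 {
	return alg<<28 | kdf<<24 | padding<<16 | uint32(keyBits)
}

func main() {
	// AesGCM (0x4), PBKDF2HmacSha256 (0x1), NoPadding (0x0), 256-bit key
	id := makeAlgorithmID(0x4, 0x1, 0x0, 256)
	fmt.Printf("0x%08X alg=%d kdf=%d keyBits=%d\n",
		id, id>>28&0xF, id>>24&0xF, id&0xFFF) // 0x41000100 alg=4 kdf=1 keyBits=256
}
```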
package response
// Python-related response types
const (
PythonDocumentationType = "python_documentation"
PythonSignaturePatternType = "python_signatures"
PythonSignatureCompletionsType = "python_signature_completions"
PythonCompletionsType = "python_completions"
PythonSuggestionsType = "python_suggestions"
PythonSuggestionType = "python_suggestion"
PythonSearchSuggestionType = "python_search_suggestion"
PythonDefinitionType = "python_definition"
PythonTopMembersType = "python_top_members"
)
// PythonExample represents one snippet of code.
type PythonExample struct {
Code string `json:"code"`
Source string `json:"source"`
}
// PythonSignaturePatterns is a collection of patterns.
type PythonSignaturePatterns struct {
Type string `json:"type"`
Signatures []*PythonSignaturePattern `json:"signatures"`
}
// PythonSignaturePattern is an example of a piece of code with a particular
// invocation pattern.
type PythonSignaturePattern struct {
Frequency float64 `json:"frequency"`
Signature string `json:"signature"`
Examples []*PythonExample `json:"examples"`
}
// PythonTypeInfo contains information about an argument type, including its
// canonical name, friendly (human-readable) name, and documentation
type PythonTypeInfo struct {
Name string `json:"name"`
FriendlyName string `json:"friendly_name"`
}
// PythonSignatureToken is a component of a signature completion result.
type PythonSignatureToken struct {
Token string `json:"token"`
TokenType string `json:"tokenType"`
Types []PythonTypeInfo `json:"types,omitempty"`
DocString string `json:"docsString,omitempty"`
Examples []string `json:"examples,omitempty"`
}
// PythonSignature wraps an array of PythonSignatureToken's
type PythonSignature struct {
Frequency float64 `json:"frequency"`
Signature []*PythonSignatureToken `json:"signature"`
}
// PythonSignatureCompletions are a set of argument-level signature completions
type PythonSignatureCompletions struct {
Type string `json:"type"`
Prefix string `json:"prefix"`
ReturnTypes []PythonTypeInfo `json:"returnTypes"`
ArgIndex int `json:"tokenCursorIndex"`
Completions []*PythonSignature `json:"completions"`
}
// PythonSearchSuggestion is a response containing search query suggestions.
// This struct contains PythonDocumentation and other relevant Python
// responses; therefore, while the suggested search queries themselves are not
// Python-specific (each is only a string), we still define the struct here.
type PythonSearchSuggestion struct {
Type string `json:"type"`
RawQuery string `json:"raw_query"`
Identifier string `json:"identifier"`
Documentation *PythonDocumentation `json:"documentation"`
}
// PythonIdentifier is a response containing different string identifiers for a
// Python language entity
type PythonIdentifier struct {
ID string `json:"id"`
RelativeID string `json:"relative_id"`
Name string `json:"name"`
Kind string `json:"kind"`
}
// PythonDocumentation is a response containing documentation for a Python language
// entity (such as a method).
type PythonDocumentation struct {
ID string `json:"id"`
Name string `json:"name"`
Type string `json:"type"`
Kind string `json:"kind"`
Signature string `json:"signature"`
Description string `json:"description"`
LocalCode bool `json:"local_code"`
SeeAlso []PythonSeeAlso `json:"see_also"`
Warnings []PythonWarning `json:"warnings"`
Hints []string `json:"hints"`
StructuredDoc *PythonStructuredDoc `json:"structured_doc"`
Ancestors []PythonIdentifier `json:"ancestors"`
Children []PythonIdentifier `json:"children"`
References []PythonIdentifier `json:"references"`
CuratedExamplePreviews []*CuratedExamplePreview `json:"curated_example_previews"`
}
// PythonSeeAlso contains information about a related identifier.
type PythonSeeAlso struct {
ID string `json:"id"`
Summary string `json:"summary"`
}
// PythonWarning contains a warning for the identifier.
type PythonWarning struct {
Text string `json:"text"`
Type string `json:"type"`
}
// PythonStructuredDoc is a response containing structured information for a Python language entity.
type PythonStructuredDoc struct {
Ident string `json:"ident"`
Parameters []*PythonParameter `json:"parameters"`
Description string `json:"description"`
ReturnType string `json:"return_type"`
}
// PythonParameter is a response containing structured information for a parameter of a Python language class, function or method.
type PythonParameter struct {
Type string `json:"type"`
Name string `json:"name"`
Default string `json:"default"`
Description string `json:"description"`
}
// PythonCompletions is a collection of completions results.
type PythonCompletions struct {
Type string `json:"type"`
Prefix string `json:"prefix"`
PrefixType PythonTypeInfo `json:"prefixType"`
Completions []*PythonCompletion `json:"completions"`
}
// PythonCompletion is a response containing a single completion result.
type PythonCompletion struct {
// Attr contains the suggested attribute with no periods
Attr string `json:"attr"`
// TopLevel is true if this is a root-level package name, in which case no
// period will be shown before the attribute in the completions view.
TopLevel bool `json:"toplevel"`
Identifier string `json:"identifier"`
Type string `json:"type"`
FullIdentifier string `json:"full_identifier"`
}
// PythonSuggestions contains diff suggestions that are for the same
// error.
type PythonSuggestions struct {
Type string `json:"type"`
File string `json:"file"`
Suggestions []*PythonSuggestion `json:"suggestions"`
ID uint64 `json:"hash_id"`
Line int `json:"line"`
Begin int `json:"begin"`
End int `json:"end"`
}
// PythonSuggestion wraps a set of diffs for fixing a potential error.
type PythonSuggestion struct {
Type string `json:"type"`
Score float64 `json:"score"`
PluginID string `json:"plugin_id"`
FileMD5 string `json:"file_md5"`
FileBase64 []byte `json:"file_base64"`
Filename string `json:"filename"`
Diffs []*PythonDiff `json:"diffs"`
}
// PythonDiff contains two strings for making a diff, the line number
// and the token positions of the diff.
type PythonDiff struct {
Type string `json:"type"`
LineNum int `json:"linenum"`
Begin int `json:"begin"`
End int `json:"end"`
Source string `json:"source"`
Destination string `json:"destination"`
LineSrc string `json:"line_src"`
LineDest string `json:"line_dest"`
ID string `json:"hash_id"`
}
// PythonReference represents a reference to a fully-qualified name within a
// program
type PythonReference struct {
Begin int `json:"begin"`
End int `json:"end"`
Original string `json:"expression"`
FullyQualifiedName string `json:"fully_qualified"`
Instance bool `json:"instance"`
NodeType string `json:"node_type"` // "import", "name", "attribute", or "call"
}
// PythonTopMembers represents the top members of a python package.
type PythonTopMembers struct {
Type string `json:"type"`
Root string `json:"root"`
Members []*PythonMember `json:"members"`
}
// PythonTopMembersUnified represents the top members of a python package.
type PythonTopMembersUnified struct {
Type string `json:"type"`
Symbol string `json:"symbol"`
SymbolType string `json:"symbol_type"`
Members []*PythonMember `json:"members"`
TotalMembers int `json:"total_members"`
}
// PythonMember represents a member of a module.
type PythonMember struct {
Attr string `json:"attr"`
Identifier string `json:"identifier"`
Type string `json:"type"`
ID string `json:"id"`
}
// PythonKwargs represents the possible `**kwargs` for a function.
type PythonKwargs struct {
// Name is the name of the `**kwargs` as specified in the arg spec.
Name string `json:"name"`
// Kwargs is the possible `**kwargs` for a function.
Kwargs []*PythonKwarg `json:"kwargs"`
}
// PythonKwarg represents a possible `**kwarg` for a function.
type PythonKwarg struct {
Name string `json:"name"`
Types []string `json:"types"`
} | kite-go/response/python.go | 0.73173 | 0.422803 | python.go | starcoder |
package symexpr
import (
"fmt"
"math"
)
func (*Time) Eval(t float64, x, c, s []float64) float64 { return t }
func (v *Var) Eval(t float64, x, c, s []float64) float64 { return x[v.P] }
func (cnst *Constant) Eval(t float64, x, c, s []float64) float64 { return c[cnst.P] }
func (cnst *ConstantF) Eval(t float64, x, c, s []float64) float64 { return cnst.F }
func (sys *System) Eval(t float64, x, c, s []float64) float64 { return s[sys.P] }
func (u *Neg) Eval(t float64, x, c, s []float64) float64 {
return -1. * u.C.Eval(t, x, c, s)
}
func (u *Abs) Eval(t float64, x, c, s []float64) float64 {
return math.Abs(u.C.Eval(t, x, c, s))
}
func (u *Sqrt) Eval(t float64, x, c, s []float64) float64 {
return math.Sqrt(u.C.Eval(t, x, c, s))
}
func (u *Sin) Eval(t float64, x, c, s []float64) float64 {
return math.Sin(u.C.Eval(t, x, c, s))
}
func (u *Cos) Eval(t float64, x, c, s []float64) float64 {
return math.Cos(u.C.Eval(t, x, c, s))
}
func (u *Tan) Eval(t float64, x, c, s []float64) float64 {
return math.Tan(u.C.Eval(t, x, c, s))
}
func (u *Exp) Eval(t float64, x, c, s []float64) float64 {
return math.Exp(u.C.Eval(t, x, c, s))
}
func (u *Log) Eval(t float64, x, c, s []float64) float64 {
return math.Log(u.C.Eval(t, x, c, s))
}
func (u *PowI) Eval(t float64, x, c, s []float64) float64 {
return math.Pow(u.Base.Eval(t, x, c, s), float64(u.Power))
}
func (u *PowF) Eval(t float64, x, c, s []float64) float64 {
return math.Pow(u.Base.Eval(t, x, c, s), u.Power)
}
func (n *PowE) Eval(t float64, x, c, s []float64) float64 {
return math.Pow(n.Base.Eval(t, x, c, s), n.Power.Eval(t, x, c, s))
}
func (n *Div) Eval(t float64, x, c, s []float64) float64 {
return n.Numer.Eval(t, x, c, s) / n.Denom.Eval(t, x, c, s)
}
func (n *Add) Eval(t float64, x, c, s []float64) float64 {
ret := 0.0
for _, C := range n.CS {
if C == nil {
continue
}
ret += C.Eval(t, x, c, s)
}
return ret
}
func (n *Mul) Eval(t float64, x, c, s []float64) float64 {
ret := 1.0
for _, C := range n.CS {
if C == nil {
continue
}
ret *= C.Eval(t, x, c, s)
}
return ret
}
func RK4(eqn []Expr, ti, tj float64, x_in, x_tmp, c, s []float64) (x_out []float64) {
var k [32][4]float64 // supports systems of up to 32 equations
L := len(x_in)
x_out = make([]float64, L) // allocate the named return; writing into a nil slice would panic
h := tj - ti
for i := 0; i < L; i++ {
k[i][0] = eqn[i].Eval(ti, x_in, c, s)
}
for i := 0; i < L; i++ {
x_tmp[i] = x_in[i] + (h * k[i][0] / 2.0)
}
for i := 0; i < L; i++ {
k[i][1] = eqn[i].Eval(ti, x_tmp, c, s)
}
for i := 0; i < L; i++ {
x_tmp[i] = x_in[i] + (h * k[i][1] / 2.0)
}
for i := 0; i < L; i++ {
k[i][2] = eqn[i].Eval(ti, x_tmp, c, s)
}
for i := 0; i < L; i++ {
x_tmp[i] = x_in[i] + (h * k[i][2])
}
for i := 0; i < L; i++ {
k[i][3] = eqn[i].Eval(ti, x_tmp, c, s)
}
for i := 0; i < L; i++ {
x_out[i] = ((k[i][0] + 2.0*k[i][1] + 2.0*k[i][2] + k[i][3]) * (h / 6.0))
}
return
}
func PRK4(xn int, eqn Expr, ti, tj float64, x_in, x_out, x_tmp, c, s []float64) float64 {
var k [4]float64
L := len(x_in)
h := tj - ti
for i := 0; i < L; i++ {
mid := (0.5 * (x_out[i] - x_in[i]))
x_tmp[i] = x_in[i] + mid
}
k[0] = eqn.Eval(ti, x_in, c, s)
x_tmp[xn] = x_in[xn] + (h * k[0] / 2.0)
k[1] = eqn.Eval(ti, x_tmp, c, s)
x_tmp[xn] = x_in[xn] + (h * k[1] / 2.0)
k[2] = eqn.Eval(ti, x_tmp, c, s)
x_tmp[xn] = x_in[xn] + (h * k[2])
k[3] = eqn.Eval(ti, x_tmp, c, s)
return ((k[0] + 2.0*k[1] + 2.0*k[2] + k[3]) * (h / 6.0))
}
func PrintPRK4(xn int, eqn Expr, ti, to float64, x_in, x_out, x_tmp, c, s []float64) float64 {
var k [4]float64
L := len(x_in)
h := to - ti
for i := 0; i < L; i++ {
x_tmp[i] = x_in[i] + (0.5 * (x_out[i] - x_in[i]))
}
fmt.Printf("in: %v\n", x_in)
fmt.Printf("out: %v\n", x_out)
fmt.Printf("tmp: %v\n", x_tmp)
k[0] = eqn.Eval(ti, x_in, c, s)
x_tmp[xn] = x_in[xn] + (h * k[0] / 2.0)
fmt.Printf("tmp: %v\n", x_tmp)
k[1] = eqn.Eval(ti, x_tmp, c, s)
x_tmp[xn] = x_in[xn] + (h * k[1] / 2.0)
fmt.Printf("tmp: %v\n", x_tmp)
k[2] = eqn.Eval(ti, x_tmp, c, s)
x_tmp[xn] = x_in[xn] + (h * k[2])
fmt.Printf("tmp: %v\n", x_tmp)
k[3] = eqn.Eval(ti, x_tmp, c, s)
fmt.Printf("k: %v\n", k)
ans := ((k[0] + 2.0*k[1] + 2.0*k[2] + k[3]) * (h / 6.0))
fmt.Printf("ans: %.4f => %.4f\n\n", ans, x_out[xn]-x_in[xn])
return ans
} | eval.go | 0.627609 | 0.540439 | eval.go | starcoder |
package point
import (
"fmt"
"github.com/aiseeq/s2l/protocol/api"
"math"
"math/cmplx"
"sort"
)
type Pointer interface {
Point() Point
}
type Point complex128
type Points []Point
type Filter func(pt Point) bool
type Line struct {
A, B Point
}
type Lines []Line
type Circle struct {
Point
R float64
}
func Pt(x, y float64) Point {
return Point(complex(x, y))
}
func Pt0() Point {
return 0
}
func Pt2(a *api.Point2D) Point {
if a == nil {
return 0
}
return Point(complex(a.X, a.Y))
}
func Pt3(a *api.Point) Point {
if a == nil {
return 0
}
return Point(complex(a.X, a.Y))
}
func PtI(a *api.PointI) Point {
if a == nil {
return 0
}
return Point(complex(float64(a.X), float64(a.Y)))
}
func (a Point) Point() Point {
return a
}
func (a Point) String() string {
c := complex128(a)
return fmt.Sprintf("(%.2f, %.2f)", real(c), imag(c))
}
func (a Point) X() float64 {
return real(complex128(a))
}
func (a *Point) SetX(x float64) {
*a = Point(complex(x, imag(*a)))
}
func (a Point) Y() float64 {
return imag(complex128(a))
}
func (a *Point) SetY(y float64) {
*a = Point(complex(real(*a), y))
}
func (a Point) Floor() Point {
c := complex128(a)
return Point(complex(math.Floor(real(c)), math.Floor(imag(c))))
}
func (a Point) To2D() *api.Point2D {
c := complex128(a)
return &api.Point2D{X: float32(real(c)), Y: float32(imag(c))}
}
func (a Point) To3D() *api.Point {
c := complex128(a)
return &api.Point{X: float32(real(c)), Y: float32(imag(c))}
}
func (a Point) Add(x, y float64) Point {
c := complex128(a)
return Point(complex(real(c)+x, imag(c)+y))
}
func (a Point) Mul(x float64) Point {
c := complex128(a)
return Point(complex(real(c)*x, imag(c)*x))
}
func (a Point) Dist(ptr Pointer) float64 {
b := ptr.Point()
return cmplx.Abs(complex128(b) - complex128(a))
}
func (a Point) Dist2(ptr Pointer) float64 {
b := ptr.Point()
c := complex128(b - a)
r := real(c)
i := imag(c)
return r*r + i*i
}
// Manhattan returns the L1 (taxicab) distance between the two points.
func (a Point) Manhattan(ptr Pointer) float64 {
b := ptr.Point()
d := complex128(b) - complex128(a)
return math.Abs(real(d)) + math.Abs(imag(d))
}
// Direction on grid
func (a Point) Dir(ptr Pointer) Point {
b := ptr.Point()
c := complex128(b - a)
return Point(complex(math.Copysign(1, real(c)), math.Copysign(1, imag(c))))
}
func (a Point) Rotate(rad float64) Point {
r, th := cmplx.Polar(complex128(a))
th += rad
return Point(cmplx.Rect(r, th))
}
func (a Point) Len() float64 {
return cmplx.Abs(complex128(a))
}
func (a Point) Norm() Point {
if l := a.Len(); l > 0 {
return a / Point(complex(l, 0))
} else {
return 0
}
}
func (a Point) Towards(ptr Pointer, offset float64) Point {
b := ptr.Point()
return a + (b-a).Norm()*Point(complex(offset, 0))
}
func (a Point) Neighbours4(offset float64) Points {
return Points{a.Add(0, offset), a.Add(offset, 0), a.Add(0, -offset), a.Add(-offset, 0)}
}
func (a Point) NeighboursDiagonal4(offset float64) Points {
return Points{a.Add(offset, offset), a.Add(offset, -offset), a.Add(-offset, -offset), a.Add(-offset, offset)}
}
func (a Point) Neighbours8(offset float64) Points {
return append(a.Neighbours4(offset), a.NeighboursDiagonal4(offset)...)
}
// The caller should check that the result is not nil.
func (a Point) Closest(ps []Point) *Point {
var closest *Point
var dist = math.Inf(1)
for _, b := range ps {
if closest == nil || dist > a.Dist2(b) {
p := b
closest = &p
dist = a.Dist2(*closest)
}
}
return closest
}
func (a Point) IsCloserThan(dist float64, ptr Pointer) bool {
b := ptr.Point()
return a.Dist2(b) < dist*dist
}
func (a Point) IsFurtherThan(dist float64, ptr Pointer) bool {
b := ptr.Point()
return a.Dist2(b) > dist*dist
}
func (a Point) CellCenter() Point {
return a + 0.5 + 0.5i
}
func (ps *Points) Add(p ...Point) {
*ps = append(*ps, p...)
}
// Remove deletes all occurrences of point from ps.
// Filtering into the reused backing array avoids re-slicing while iterating.
func (ps *Points) Remove(point Point) {
res := (*ps)[:0]
for _, p := range *ps {
if p != point {
res = append(res, p)
}
}
*ps = res
}
func (ps Points) Len() int {
return len(ps)
}
func (ps Points) Empty() bool {
return len(ps) == 0
}
func (ps Points) Exists() bool {
return len(ps) > 0
}
func (ps Points) Has(p Point) bool {
for _, pt := range ps {
if p == pt {
return true
}
}
return false
}
func (ps Points) Intersect(ps2 Points) Points {
res := Points{}
pmap := map[Point]bool{}
for _, pt := range ps {
pmap[pt] = true
}
for _, pt := range ps2 {
if pmap[pt] {
res.Add(pt)
}
}
return res
}
func (ps Points) Center() Point {
var sum Point
if len(ps) == 0 {
return 0
}
for _, p := range ps {
sum += p
}
return sum.Mul(1.0 / float64(len(ps)))
}
func (ps Points) ClosestTo(ptr Pointer) Point {
point := ptr.Point()
var closest Point
for _, p := range ps {
if closest == 0 || point.Dist2(closest) > point.Dist2(p) {
closest = p
}
}
return closest
}
func (ps Points) FurthestTo(ptr Pointer) Point {
point := ptr.Point()
var furthest Point
for _, p := range ps {
if furthest == 0 || point.Dist2(furthest) < point.Dist2(p) {
furthest = p
}
}
return furthest
}
func (ps Points) CloserThan(dist float64, ptr Pointer) Points {
pos := ptr.Point()
dist2 := dist * dist
closer := Points{}
for _, p := range ps {
if p.Dist2(pos) <= dist2 {
closer.Add(p)
}
}
return closer
}
func (ps Points) OrderByDistanceTo(ptr Pointer, desc bool) {
// todo: optimize? (via sort by other)
pos := ptr.Point()
sort.Slice(ps, func(i, j int) bool {
return desc != (ps[i].Dist2(pos) < ps[j].Dist2(pos))
})
}
func (ps Points) FirstFurtherThan(dist float64, from Pointer) Point {
dist2 := dist * dist
for _, p := range ps {
if p.Dist2(from) >= dist2 {
return p
}
}
return 0
}
func (ps Points) Filter(filters ...Filter) Points {
if len(filters) == 0 {
return ps
}
res := Points{}
NextPoint:
for _, p := range ps {
for _, filter := range filters {
if !filter(p) {
continue NextPoint
}
}
res = append(res, p)
}
return res
}
func (ls *Lines) Add(l ...Line) {
*ls = append(*ls, l...)
}
func NewCircle(x, y, r float64) *Circle {
return &Circle{Point(complex(x, y)), r}
}
func PtCircle(p Point, r float64) *Circle {
return &Circle{p, r}
}
// Intersect finds the intersection points of the two circles; there may be 0, 1, or 2 of them.
func Intersect(a *Circle, b *Circle) (ps Points) {
if a.X() > b.X() {
return Intersect(b, a) // Try to fix intersection bug. Looks like, first circle should be on the left
}
dx, dy := b.X()-a.X(), b.Y()-a.Y()
lr := a.R + b.R //radius sum
dr := math.Abs(a.R - b.R) //radius difference
ab := math.Sqrt(dx*dx + dy*dy) //center distance
if ab <= lr && ab > dr {
theta1 := math.Atan(dy / dx)
ef := lr - ab
ao := a.R - ef/2
theta2 := math.Acos(ao / a.R)
theta := theta1 + theta2
xc := a.X() + a.R*math.Cos(theta)
yc := a.Y() + a.R*math.Sin(theta)
ps = append(ps, Pt(xc, yc))
if ab < lr { //two intersections
theta3 := math.Acos(ao / a.R)
theta = theta3 - theta1
xd := a.X() + a.R*math.Cos(theta)
yd := a.Y() - a.R*math.Sin(theta)
ps = append(ps, Pt(xd, yd))
}
}
return
}
// Rect returns the min and max corners of the bounding rectangle that fits all points.
func (ps Points) Rect() Points {
min := Pt(math.MaxFloat64, math.MaxFloat64)
max := Pt(-math.MaxFloat64, -math.MaxFloat64)
for _, p := range ps {
if p.X() < min.X() {
min = Pt(p.X(), min.Y())
}
if p.Y() < min.Y() {
min = Pt(min.X(), p.Y())
}
if p.X() > max.X() {
max = Pt(p.X(), max.Y())
}
if p.Y() > max.Y() {
max = Pt(max.X(), p.Y())
}
}
return Points{min, max}
} | lib/point/point.go | 0.792143 | 0.448426 | point.go | starcoder |
package genmap2d
import (
"image/color"
)
// Various tile IDs.
const (
TileIDGrass byte = iota
TileIDWater
TileIDTree
TileIDSand
TileIDMountain
TileIDSnow
TileIDVillage
TileIDMax
)
// TileFromHeight returns the tile ID for a given height.
func (m *Map) TileFromHeight(h int) byte {
if h <= 0 {
return TileIDWater
}
if h <= 2 {
return TileIDSand
}
if h >= 90 {
return TileIDSnow
}
if h >= 70 {
return TileIDMountain
}
if h >= 30 && h <= 60 {
return TileIDTree
}
return TileIDGrass
}
// TileColor returns the color of a tile based on the given tile ID.
func (m *Map) TileColor(tID byte) color.Color {
switch tID {
case TileIDGrass:
return color.RGBA{0x34, 0x8C, 0x31, 0xff}
case TileIDWater:
return color.RGBA{0x00, 0x75, 0x77, 0xff}
case TileIDTree:
return color.RGBA{0x42, 0x69, 0x2f, 0xff}
case TileIDSand:
return color.RGBA{0xc2, 0xb2, 0x80, 0xff}
case TileIDMountain:
return color.RGBA{0x91, 0x8E, 0x85, 0xff}
case TileIDSnow:
return color.RGBA{0xFF, 0xFF, 0xFF, 0xff}
case TileIDVillage:
return color.RGBA{0xFF, 0x00, 0x00, 0xff}
default:
return color.RGBA{0x00, 0x00, 0x00, 0xff}
}
}
// setup assigns all tiles based on the height of a given point.
func (m *Map) setup() {
m.run(func(x, y int) byte {
return m.TileFromHeight(int(m.HeightMap[x][y]) - 120)
})
}
// run executes function 'f' for every point and assigns the returned tile ID.
func (m *Map) run(f func(x, y int) byte) {
for x := 0; x < m.Width; x++ {
for y := 0; y < m.Height; y++ {
m.Cells[x][y] = f(x, y)
}
}
}
// tilesInRadius returns all tile IDs within a given radius around a given point.
func (m *Map) tilesInRadius(x, y, r int) []byte {
var res []byte
for cx := x - r; cx <= x+r; cx++ { // <= so the +r boundary matches the -r start
if cx < 0 || cx >= m.Width {
continue
}
for cy := y - r; cy <= y+r; cy++ {
if cy < 0 || cy >= m.Height {
continue
}
if dist(cx, cy, x, y) > r {
// Not in circle.
continue
}
res = append(res, m.Cells[cx][cy])
}
}
return res
} | genmap2d/tiles.go | 0.662578 | 0.480662 | tiles.go | starcoder |
package v2d
import (
"github.com/chewxy/math32"
)
// Add adds the two vectors.
func (v Vec) Add(w Vec) Vec {
return Vec{X: v.X + w.X, Y: v.Y + w.Y}
}
// Sub subtracts w from v.
func (v Vec) Sub(w Vec) Vec {
return Vec{X: v.X - w.X, Y: v.Y - w.Y}
}
// Neg returns -v.
func (v Vec) Neg() Vec {
return Vec{X: -v.X, Y: -v.Y}
}
// Scale scales the vector with s.
func (v Vec) Scale(s float32) Vec {
return Vec{X: v.X * s, Y: v.Y * s}
}
// Dot takes the dot-product of v and w.
func (v Vec) Dot(w Vec) float32 {
return v.X*w.X + v.Y*w.Y
}
// Cross assumes that v, w are (x,y,0) vectors and returns the z component of 3D cross product only.
func (v Vec) Cross(w Vec) float32 {
return v.X*w.Y - v.Y*w.X
}
// CrossZ returns the cross product of (x, y, 0) and (0, 0, z).
func (v Vec) CrossZ(z float32) Vec {
return Vec{X: v.Y * z, Y: -v.X * z}
}
// CrossZ returns the cross product of (0, 0, z) and (x, y, 0).
func CrossZ(z float32, v Vec) Vec {
return Vec{X: -v.Y * z, Y: v.X * z}
}
// LengthSquared returns the length of the vector squared.
func (v Vec) LengthSquared() float32 {
return v.X*v.X + v.Y*v.Y
}
// Length returns the length of the vector.
func (v Vec) Length() float32 {
return math32.Sqrt(v.X*v.X + v.Y*v.Y)
}
// Unit returns a unit vector in the same direction.
func (v Vec) Unit() Vec {
// pre-scaling to not get underflow / overflow in
// the sqrt in .length
v = v.Scale(min(math32.Abs(1/v.X), math32.Abs(1/v.Y)))
return v.Scale(1.0 / v.Length())
}
// Orth returns a vector w of same length as v that is orthogonal to v such that
// (v, w) are "right handed".
func (v Vec) Orth() (w Vec) {
w.X = -v.Y
w.Y = v.X
return
}
// Eq returns true if v and w are equal within tol.
func (v Vec) Eq(w Vec, tol float32) bool {
return math32.Abs(v.X-w.X) <= tol && math32.Abs(v.Y-w.Y) <= tol
}
// IsZero checks if the vector is (0,0)
func (v Vec) IsZero() bool {
return v.X == 0 && v.Y == 0
}
// Angle returns the angle of the vector (from x axis)
func (v Vec) Angle() float32 {
return math32.Atan2(v.Y, v.X)
}
func min(a, b float32) float32 {
if a < b {
return a
}
return b
} | vec.go | 0.925936 | 0.604574 | vec.go | starcoder |
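A quick self-contained check of the `Orth`/`Cross` relationship documented above: `v.Cross(v.Orth())` equals `v.LengthSquared()`, confirming that (v, w) are right-handed. The `vec` type below is a float64 stand-in, not the package's `Vec`:

```go
package main

import "fmt"

// vec is a minimal float64 stand-in for the package's Vec type.
type vec struct{ X, Y float64 }

func (v vec) orth() vec             { return vec{-v.Y, v.X} }
func (v vec) cross(w vec) float64   { return v.X*w.Y - v.Y*w.X }
func (v vec) lengthSquared() float64 { return v.X*v.X + v.Y*v.Y }

func main() {
	v := vec{3, 4}
	// cross(v, orth(v)) = x² + y², so both values below are 25.
	fmt.Println(v.cross(v.orth()), v.lengthSquared())
}
```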
package network
import (
"math"
"github.com/haashi/go-neural/matrix"
)
// Network : a neural network
type Network struct {
inputs int
hiddenLayers int
outputs int
weights []*matrix.Matrix
learningRate float64
biasWeights []*matrix.Matrix
bias float64
}
// CreateNetwork : create a neural network
func CreateNetwork(inputs int, hiddenLayers int, layersSize []int, outputs int, learningRate float64, bias float64) *Network {
seed = 1103527590
net := Network{
inputs: inputs,
hiddenLayers: hiddenLayers,
outputs: outputs,
learningRate: learningRate,
bias: bias,
}
layersSize = append([]int{inputs}, layersSize...)
layersSize = append(layersSize, outputs)
net.weights = make([]*matrix.Matrix, hiddenLayers+1)
net.biasWeights = make([]*matrix.Matrix, hiddenLayers+1)
for i := 0; i < hiddenLayers+1; i++ {
net.weights[i] = &matrix.Matrix{
Col: layersSize[i+1],
Row: layersSize[i],
Values: initArray(layersSize[i] * layersSize[i+1]),
}
net.biasWeights[i] = &matrix.Matrix{
Col: layersSize[i+1],
Row: 1,
Values: initArray(layersSize[i+1]),
}
}
for k := 0; k < hiddenLayers+1; k++ {
for j := 0; j < net.weights[k].Col; j++ {
for i := 0; i < net.weights[k].Row; i++ {
net.weights[k].Values[i*net.weights[k].Col+j] = rand()
}
net.biasWeights[k].Values[j] = rand()
}
}
return &net
}
// Predict : forward propagation from input to output
func (net *Network) Predict(inputData []float64) *matrix.Matrix {
// forward propagation
inputs := &matrix.Matrix{
Row: 1,
Col: len(inputData),
Values: inputData,
}
for i := 0; i < len(net.weights); i++ {
inputs = matrix.Add(matrix.Dot(inputs, net.weights[i]), matrix.ScalMult(net.biasWeights[i], net.bias))
inputs = matrix.Apply(inputs, sigmoid)
}
return inputs
}
// Train : do a prediction, and backpropagate error
func (net *Network) Train(inputData []float64, targetData []float64) {
// forward propagation, keeping each layer's output for backpropagation
inputs := &matrix.Matrix{
Row: 1,
Col: len(inputData),
Values: inputData,
}
outputs := make([]*matrix.Matrix, len(net.weights))
deltas := make([]*matrix.Matrix, len(net.weights))
temp := inputs
for i := 0; i < len(net.weights); i++ {
temp = matrix.Add(matrix.Dot(temp, net.weights[i]), matrix.ScalMult(net.biasWeights[i], net.bias))
temp = matrix.Apply(temp, sigmoid)
outputs[i] = temp
}
result := temp
// find errors
target := &matrix.Matrix{
Row: 1,
Col: len(targetData),
Values: targetData,
}
deltas[len(deltas)-1] = matrix.Mult(matrix.Sub(target, result), matrix.Apply(result, dsigmoid))
for i := len(deltas) - 2; i >= 0; i-- {
deltas[i] = matrix.Mult(matrix.Dot(deltas[i+1], matrix.Transpose(net.weights[i+1])), matrix.Apply(outputs[i], dsigmoid))
}
for i := len(deltas) - 1; i >= 0; i-- {
layerIn := inputs // input fed to layer i
if i != 0 {
layerIn = outputs[i-1]
}
net.weights[i] = matrix.Add(net.weights[i], matrix.ScalMult(matrix.Dot(matrix.Transpose(layerIn), deltas[i]), net.learningRate))
val := make([]float64, net.weights[i].Row)
for k := 0; k < net.weights[i].Row; k++ {
val[k] = 1.0
}
ones := &matrix.Matrix{
Col: net.weights[i].Row,
Row: 1,
Values: val,
}
net.biasWeights[i] = matrix.Add(net.biasWeights[i], matrix.ScalMult(matrix.Dot(matrix.Transpose(ones), deltas[i]), net.learningRate))
}
}
func sigmoid(z float64) float64 {
return 1.0 / (1 + math.Exp(-1*z))
}
func dsigmoid(z float64) float64 {
return z * (1 - z)
}
func initArray(size int) []float64 {
data := make([]float64, size)
for i := 0; i < size; i++ {
data[i] = 1
}
return data
}
var seed int = 1103527590
func rand() float64 {
val := float64(float32(seed) / float32(0x7fffffff))
seed = (1103515245*seed + 12345) % (0x80000000)
return val
} | network/network.go | 0.748168 | 0.620737 | network.go | starcoder |
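Note that `dsigmoid` above takes the sigmoid's *output* as its argument (σ′(z) = σ(z)·(1 − σ(z))), which is why `Train` applies it to already-activated values. A minimal standalone check:

```go
package main

import (
	"fmt"
	"math"
)

func sigmoid(z float64) float64 { return 1.0 / (1 + math.Exp(-z)) }

// dsigmoid expects s = sigmoid(z), not z itself.
func dsigmoid(s float64) float64 { return s * (1 - s) }

func main() {
	s := sigmoid(0)
	fmt.Println(s, dsigmoid(s)) // 0.5 0.25
}
```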
package gorgonia
import (
"fmt"
"github.com/chewxy/gorgonia/tensor"
tf32 "github.com/chewxy/gorgonia/tensor/f32"
tf64 "github.com/chewxy/gorgonia/tensor/f64"
ti "github.com/chewxy/gorgonia/tensor/i"
"github.com/chewxy/gorgonia/tensor/types"
)
// Value represents a value that Gorgonia accepts
type Value interface {
Type() Type
Shape() types.Shape
Size() int
Dtype() Dtype
Eq(other Value) bool
clone() (Value, error)
zero() Value
fmt.Formatter
}
type Valuer interface {
Value() Value
}
// Tensor is a Value. It wraps over types.Tensor
type Tensor struct {
types.Tensor
}
func NewTensorValue(dt Dtype, shp ...int) Tensor {
var t types.Tensor
switch dt {
case Float64:
t = tf64.NewTensor(tf64.WithShape(shp...))
case Float32:
t = tf32.NewTensor(tf32.WithShape(shp...))
case Int:
t = ti.NewTensor(ti.WithShape(shp...))
default:
panic("Unhandled yet")
}
return Tensor{Tensor: t}
}
func FromTensor(t types.Tensor) Tensor {
return Tensor{Tensor: t}
}
// lazy values - this allows for pooling of tensorTypes
func (t Tensor) Type() Type {
dt, dim := tensorInfo(t.Tensor)
shp := t.Tensor.Shape()
tt := newTensorType(dim, dt)
tt.shape = shp
return tt
}
func (t Tensor) Dtype() Dtype {
return dtypeToDtype(t.Tensor.Dtype())
}
func (t Tensor) Shape() types.Shape { return t.Tensor.Shape() }
func (t Tensor) Eq(other Value) bool {
ot, ok := other.(Tensor)
if !ok {
return false
}
return ot.Tensor.Eq(t.Tensor)
}
func (t Tensor) clone() (Value, error) {
retVal := Tensor{
Tensor: tensor.Clone(t.Tensor),
}
return retVal, nil
}
func (t Tensor) zero() Value {
t.Tensor.Zero()
return t
}
// Scalar is more of a class of values than value, but eh, we'll leave it as is
type Scalar struct {
v interface{}
t Dtype
}
func NewScalarValue(val interface{}) Scalar {
var retVal Scalar
retVal.v = val
switch val.(type) {
case float64:
retVal.t = Float64
case float32:
retVal.t = Float32
case int:
retVal.t = Int
case int64:
retVal.t = Int64
case int32:
retVal.t = Int32
case byte:
retVal.t = Byte
case bool:
retVal.t = Bool
default:
panic("Unhandled type")
}
return retVal
}
func (s Scalar) Type() Type { return s.t }
func (s Scalar) Dtype() Dtype { return s.t }
func (s Scalar) Shape() types.Shape { return types.ScalarShape() }
func (s Scalar) Size() int { return 1 }
func (s Scalar) V() interface{} { return s.v }
func (s Scalar) Eq(other Value) bool {
os, ok := other.(Scalar)
if !ok {
return false
}
return os.v == s.v && os.t == s.t
}
func (s Scalar) sanity() (err error) {
switch s.v.(type) {
case float64:
if s.t != Float64 {
err = NewError(RuntimeError, "Type mismatch. Want Float64. Got %v instead", s.t)
}
case float32:
if s.t != Float32 {
err = NewError(RuntimeError, "Type mismatch. Want Float32. Got %v instead", s.t)
}
case int:
if s.t != Int {
err = NewError(RuntimeError, "Type mismatch. Want Int. Got %v instead", s.t)
}
case int64:
if s.t != Int64 {
err = NewError(RuntimeError, "Type mismatch. Want Int64. Got %v instead", s.t)
}
case int32:
if s.t != Int32 {
err = NewError(RuntimeError, "Type mismatch. Want Int32. Got %v instead", s.t)
}
case byte:
if s.t != Byte {
err = NewError(RuntimeError, "Type mismatch. Want Byte. Got %v instead", s.t)
}
case bool:
if s.t != Bool {
err = NewError(RuntimeError, "Type mismatch. Want Bool. Got %v instead", s.t)
}
default:
err = NewError(RuntimeError, "Scalar %v is of Unsupported type %T", s.v, s.v)
}
return
}
func (s Scalar) clone() (Value, error) {
s2 := Scalar{
v: s.v,
t: s.t,
}
err := s2.sanity()
return s2, err
}
func (s Scalar) zero() Value {
cloned, err := s.clone()
if err != nil {
panic(err) // yes, it's a panic.
}
s2 := cloned.(Scalar)
switch s.t {
case Float64:
s2.v = 0.0
case Float32:
s2.v = float32(0.0)
case Int:
s2.v = 0
case Int64:
s2.v = int64(0)
case Int32:
s2.v = int32(0)
case Byte:
s2.v = byte(0)
case Bool:
s2.v = false
}
return s2
}
func (s Scalar) Format(state fmt.State, c rune) {
if fmter, ok := s.v.(fmt.Formatter); ok {
fmter.Format(state, c)
return // avoid printing the value a second time below
}
fmt.Fprintf(state, "%v", s.v)
} | values.go | 0.668772 | 0.534491 | values.go | starcoder |
package bst
type OrderType string
type BST_Node struct {
Parent *BST_Node
Left *BST_Node
Right *BST_Node
Data int
}
type BST struct {
Root *BST_Node
LeafCount int
}
func NewBST() *BST {
return &BST{
Root: nil,
LeafCount: 0,
}
}
func (bt *BST) Insert(data int) {
if bt.LeafCount == 0 {
bt.Root = &BST_Node{
Parent: nil,
Left: nil,
Right: nil,
Data: data,
}
bt.LeafCount++
} else {
added_leaf := bt.Root.InsertLeaf(data)
if added_leaf == 1 {
bt.LeafCount++
}
}
}
func (b_node *BST_Node) InsertLeaf(data int) int {
if data < b_node.Data {
if b_node.Left == nil {
b_node.Left = &BST_Node{
Parent: b_node,
Left: nil,
Right: nil,
Data: data,
}
return 1
} else {
return b_node.Left.InsertLeaf(data) // propagate so duplicates deeper in the tree aren't counted
}
} else if data > b_node.Data {
if b_node.Right == nil {
b_node.Right = &BST_Node{
Parent: b_node,
Left: nil,
Right: nil,
Data: data,
}
return 1
} else {
return b_node.Right.InsertLeaf(data) // propagate so duplicates deeper in the tree aren't counted
}
} else {
return 0
}
return 1
}
func (b_node *BST_Node) TraverseTree(order_type OrderType, result []int) []int {
if order_type == "pre_order" {
result = append(result, b_node.Data)
if b_node.Left != nil {
result = b_node.Left.TraverseTree(order_type, result)
}
if b_node.Right != nil {
result = b_node.Right.TraverseTree(order_type, result)
}
} else if order_type == "in_order" {
if b_node.Left != nil {
result = b_node.Left.TraverseTree(order_type, result)
}
result = append(result, b_node.Data)
if b_node.Right != nil {
result = b_node.Right.TraverseTree(order_type, result)
}
} else if order_type == "post_order" {
if b_node.Left != nil {
result = b_node.Left.TraverseTree(order_type, result)
}
if b_node.Right != nil {
result = b_node.Right.TraverseTree(order_type, result)
}
result = append(result, b_node.Data)
}
return result
}
func (bt *BST) DepthFirstSearch(order_type OrderType) []int {
var result = []int{}
if bt.LeafCount == 0 {
return result
}
b_node := bt.Root
result = b_node.TraverseTree(order_type, result)
return result
} | data-structures/binary-search-trees/bst/bst.go | 0.64579 | 0.474631 | bst.go | starcoder |
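The subtle point in `InsertLeaf` is duplicate handling: inserting an existing value must not grow `LeafCount`. A self-contained sketch of the same idea, using a hypothetical `insert` that reports whether a node was actually added:

```go
package main

import "fmt"

type node struct {
	left, right *node
	data        int
}

// insert returns the (possibly new) subtree root and whether a node was added.
func insert(n *node, data int) (*node, bool) {
	if n == nil {
		return &node{data: data}, true
	}
	var added bool
	switch {
	case data < n.data:
		n.left, added = insert(n.left, data)
	case data > n.data:
		n.right, added = insert(n.right, data)
	}
	return n, added // duplicates fall through with added == false
}

// inOrder appends the tree's values to out in sorted order.
func inOrder(n *node, out []int) []int {
	if n == nil {
		return out
	}
	out = inOrder(n.left, out)
	out = append(out, n.data)
	return inOrder(n.right, out)
}

func main() {
	var root *node
	count := 0
	for _, v := range []int{5, 3, 8, 3} { // note the duplicate 3
		var added bool
		root, added = insert(root, v)
		if added {
			count++
		}
	}
	fmt.Println(count, inOrder(root, nil)) // 3 [3 5 8]
}
```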
package sc
// FFT implements the Fast Fourier Transform.
// The fast fourier transform analyzes the frequency content of a signal,
// which can be useful for audio analysis or for frequency-domain sound processing (phase vocoder).
type FFT struct {
// A buffer to store spectral data. The buffer's size must correspond to a power of 2.
// LocalBuf is useful here, because processes should not share data between synths.
// (Note: most PV UGens operate on this data in place. Use PV_Copy for parallel processing.)
Buffer Input
// The signal to be analyzed. The signal's rate determines the rate at which the input is read.
In Input
// The amount of offset from the beginning of one FFT analysis frame to the next, measured in multiples of the analysis frame size.
// This can range between 1.0 and values close to (but larger than) 0.0,
// and the default is 0.5 (meaning each frame has a 50% overlap with the preceding/following frames).
Hop Input
// Defines how the data is windowed:
// -1 is rectangular windowing, simple but typically not recommended
// 0 is (the default) Sine windowing, typically recommended for phase-vocoder work
// 1 is Hann windowing, typically recommended for analysis work.
WinType Input
// A simple control allowing FFT analysis to be active (>0) or inactive (<=0).
// This is mainly useful for signal analysis processes which are only intended to analyse at specific times rather than continuously
Active Input
// The windowed audio frames are usually the same size as the buffer.
// If you wish the FFT to be zero-padded then you can specify a window size smaller than the actual buffer size
// (e.g. window size 1024 with buffer size 2048).
// Both values must still be a power of two. Leave this at its default of zero for no zero-padding.
WinSize Input
}
func (fft *FFT) defaults() {
if fft.Hop == nil {
fft.Hop = C(0.5)
}
if fft.WinType == nil {
fft.WinType = C(0)
}
if fft.Active == nil {
fft.Active = C(1)
}
if fft.WinSize == nil {
fft.WinSize = C(0)
}
}
// Rate creates a new ugen at a specific rate.
// If an In signal or Buffer is not provided this method will trigger a runtime panic.
func (fft FFT) Rate(rate int8) Input {
CheckRate(rate)
if fft.Buffer == nil {
panic("FFT expects Buffer to not be nil")
}
if fft.In == nil {
panic("FFT expects In to not be nil")
}
(&fft).defaults()
return NewInput("FFT", rate, 0, 1, fft.Buffer, fft.In, fft.Hop, fft.WinType, fft.Active, fft.WinSize)
} | vendor/github.com/scgolang/sc/fft.go | 0.734691 | 0.6597 | fft.go | starcoder |
package beholder
// ConditionsDataSource .
var ConditionsDataSource = rulesDataSource(ConditionEntity,
rule("Conditions",
"Conditions alter a creature's capabilities in a variety of ways and can arise as a result of a spell, a class feature, a monster's attack, or other effect. Most conditions, such as blinded, are impairments, but a few, such as invisible, can be advantageous.",
"A condition lasts either until it is countered (the prone condition is countered by standing up, for example) or for a duration specified by the effect that imposed the condition.",
"If multiple effects impose the same condition on a creature, each instance of the condition has its own duration, but the condition's effects don't get worse. A creature either has a condition or doesn't.",
"The following definitions specify what happens to a creature while it is subjected to a condition.",
section("Blinded",
"• A blinded creature can't see and automatically fails any ability check that requires sight.",
"• Attack rolls against the creature have advantage, and the creature's attack rolls have disadvantage.",
),
section("Charmed",
"• A charmed creature can't attack the charmer or target the charmer with harmful abilities or magical effects.",
"• The charmer has advantage on any ability check to interact socially with the creature.",
),
section("Deafened",
"• A deafened creature can't hear and automatically fails any ability check that requires hearing.",
),
section("Exhaustion",
"Some special abilities and environmental hazards, such as starvation and the long-term effects of freezing or scorching temperatures, can lead to a special condition called exhaustion. Exhaustion is measured in six levels. An effect can give a creature one or more levels of exhaustion, as specified in the effect's description.",
"",
"<h2>Level Effect</h2>",
" 1 Disadvantage on ability checks",
" 2 Speed halved",
" 3 Disadvantage on attack rolls and saving throws",
" 4 Hit point maximum halved",
" 5 Speed reduced to 0",
" 6 Death",
"",
" If an already exhausted creature suffers another effect that causes exhaustion, its current level of exhaustion increases by the amount specified in the effect's description.",
" A creature suffers the effect of its current level of exhaustion as well as all lower levels. For example, a creature suffering level 2 exhaustion has its speed halved and has disadvantage on ability checks.",
" An effect that removes exhaustion reduces its level as specified in the effect's description, with all exhaustion effects ending if a creature's exhaustion level is reduced below 1.",
" Finishing a long rest reduces a creature's exhaustion level by 1, provided that the creature has also ingested some food and drink.",
),
section("Frightened",
"• A frightened creature has disadvantage on ability checks and attack rolls while the source of its fear is within line of sight.",
"• The creature can't willingly move closer to the source of its fear.",
),
section("Grappled",
"• A grappled creature's speed becomes 0, and it can't benefit from any bonus to its speed.",
"• The condition ends if the grappler is incapacitated (see the condition).",
"• The condition also ends if an effect removes the grappled creature from the reach of the grappler or grappling effect, such as when a creature is hurled away by the <b>thunder wave</b> spell.",
),
section("Incapacitated",
"• An incapacitated creature can't take actions or reactions.",
),
section("Invisible",
"• An invisible creature is impossible to see without the aid of magic or a special sense. For the purpose of hiding, the creature is heavily obscured. The creature's location can be detected by any noise it makes or any tracks it leaves.",
"• Attack rolls against the creature have disadvantage, and the creature's attack rolls have advantage.",
),
section("Paralyzed",
"• A paralyzed creature is incapacitated (see the condition) and can't move or speak.",
"• The creature automatically fails Strength and Dexterity saving throws. Attack rolls against the creature have advantage.",
"• Any attack that hits the creature is a critical hit if the attacker is within 5 feet of the creature.",
),
section("Petrified",
"• A petrified creature is transformed, along with any nonmagical object it is wearing or carrying, into a solid inanimate substance (usually stone). Its weight increases by a factor of ten, and it ceases aging.",
"• The creature is incapacitated (see the condition), can't move or speak, and is unaware of its surroundings.",
"• Attack rolls against the creature have advantage.",
"• The creature automatically fails Strength and Dexterity saving throws.",
"• The creature has resistance to all damage.",
"• The creature is immune to poison and disease, although a poison or disease already in its system is suspended, not neutralized.",
),
section("Poisoned",
"• A poisoned creature has disadvantage on attack rolls and ability checks.",
),
section("Prone",
"• A prone creature's only movement option is to crawl, unless it stands up and thereby ends the condition.",
"• The creature has disadvantage on attack rolls.",
"• An attack roll against the creature has advantage if the attacker is within 5 feet of the creature. Otherwise, the attack roll has disadvantage.",
),
section("Restrained",
"• A restrained creature's speed becomes 0, and it can't benefit from any bonus to its speed.",
"• Attack rolls against the creature have advantage, and the creature's attack rolls have disadvantage.",
"• The creature has disadvantage on Dexterity saving throws.",
),
section("Stunned",
"• A stunned creature is incapacitated (see the condition), can't move, and can speak only falteringly.",
"• The creature automatically fails Strength and Dexterity saving throws.",
"• Attack rolls against the creature have advantage.",
),
section("Unconscious",
"• An unconscious creature is incapacitated (see the condition), can't move or speak, and is unaware of its surroundings",
"• The creature drops whatever it's holding and falls prone.",
"• The creature automatically fails Strength and Dexterity saving throws.",
"• Attack rolls against the creature have advantage.",
"• Any attack that hits the creature is a critical hit if the attacker is within 5 feet of the creature.",
),
),
) | src/data-conditions.go | 0.563858 | 0.690751 | data-conditions.go | starcoder |
package lamb
import (
mat "github.com/nlpodyssey/spago/pkg/mat32"
"github.com/nlpodyssey/spago/pkg/ml/nn"
"github.com/nlpodyssey/spago/pkg/ml/optimizers/gd"
)
var _ gd.MethodConfig = &Config{}
// Config provides configuration settings for Lamb optimizer.
type Config struct {
gd.MethodConfig
StepSize mat.Float
Beta1 mat.Float
Beta2 mat.Float
Epsilon mat.Float
Lambda mat.Float
}
// NewConfig returns a new Lamb Config.
func NewConfig(stepSize, beta1, beta2, epsilon, lambda mat.Float) Config {
if !(beta1 >= 0.0 && beta1 < 1.0) {
panic("lamb: `beta1` must be in the range [0.0, 1.0)")
}
if !(beta2 >= 0.0 && beta2 < 1.0) {
panic("lamb: `beta2` must be in the range [0.0, 1.0)")
}
return Config{
StepSize: stepSize,
Beta1: beta1,
Beta2: beta2,
Epsilon: epsilon,
Lambda: lambda,
}
}
// NewDefaultConfig returns a new Config with generically reasonable default values.
func NewDefaultConfig() Config {
return Config{
StepSize: 0.001,
Beta1: 0.9,
Beta2: 0.999,
Epsilon: 1.0e-8,
Lambda: 0.1,
}
}
var _ gd.Method = &Lamb{}
// Lamb implements the Lamb gradient descent optimization method.
type Lamb struct {
Config
Alpha mat.Float
TimeStep int
}
// New returns a new Lamb optimizer, initialized according to the given configuration.
func New(c Config) *Lamb {
lamb := &Lamb{
Config: c,
Alpha: c.StepSize,
}
lamb.IncExample() // initialize 'alpha' coefficient
return lamb
}
// Label returns the enumeration-like value which identifies this gradient descent method.
func (o *Lamb) Label() int {
return gd.Lamb
}
const (
v int = 0
m int = 1
buf1 int = 2 // contains 'grads.ProdScalar(1.0 - beta1)'
buf2 int = 3 // contains 'grads.Prod(grads).ProdScalar(1.0 - beta2)'
buf3 int = 4
)
// NewSupport returns a new support structure with the given dimensions.
func (o *Lamb) NewSupport(r, c int) *nn.Payload {
supp := make([]mat.Matrix, 5)
supp[v] = mat.NewEmptyDense(r, c)
supp[m] = mat.NewEmptyDense(r, c)
supp[buf1] = mat.NewEmptyDense(r, c)
supp[buf2] = mat.NewEmptyDense(r, c)
supp[buf3] = mat.NewEmptyDense(r, c)
return &nn.Payload{
Label: o.Label(),
Data: supp,
}
}
// IncExample beats the occurrence of a new example.
func (o *Lamb) IncExample() {
o.TimeStep++
o.updateAlpha()
}
func (o *Lamb) updateAlpha() {
o.Alpha = o.StepSize * mat.Sqrt(1.0-mat.Pow(o.Beta2, mat.Float(o.TimeStep))) / (1.0 - mat.Pow(o.Beta1, mat.Float(o.TimeStep)))
}
// Delta returns the difference between the current params and where the method wants it to be.
func (o *Lamb) Delta(param nn.Param) mat.Matrix {
return o.calcDelta(param.Grad(), gd.GetOrSetPayload(param, o).Data, param.Value())
}
// v = v*beta1 + grads*(1.0-beta1)
// m = m*beta2 + (grads*grads)*(1.0-beta2)
// trustRatio = ||weights|| / ||(v / (sqrt(m) + eps)) + (lambda * weights)||
// d = ((v / (sqrt(m) + eps)) + (lambda * weights)) * alpha * trustRatio
func (o *Lamb) calcDelta(grads mat.Matrix, supp []mat.Matrix, weights mat.Matrix) mat.Matrix {
updateV(grads, supp, o.Beta1)
updateM(grads, supp, o.Beta2)
buf := supp[m].Sqrt().AddScalarInPlace(o.Epsilon)
defer mat.ReleaseMatrix(buf)
suppDiv := supp[v].Div(buf)
if o.Lambda != 0.0 {
scaledW := weights.ProdScalar(o.Lambda)
suppDiv.AddInPlace(scaledW)
}
weightsNorm := norm(weights)
adamStepNorm := norm(suppDiv)
trustRatio := mat.Float(1.0)
if !(weightsNorm == 0.0 || adamStepNorm == 0.0) {
trustRatio = weightsNorm / adamStepNorm
}
defer mat.ReleaseMatrix(suppDiv)
supp[buf3].ProdMatrixScalarInPlace(suppDiv, o.Alpha*trustRatio)
return supp[buf3]
}
// v = v*beta1 + grads*(1.0-beta1)
func updateV(grads mat.Matrix, supp []mat.Matrix, beta1 mat.Float) {
supp[v].ProdScalarInPlace(beta1)
supp[buf1].ProdMatrixScalarInPlace(grads, 1.0-beta1)
supp[v].AddInPlace(supp[buf1])
}
// m = m*beta2 + (grads*grads)*(1.0-beta2)
func updateM(grads mat.Matrix, supp []mat.Matrix, beta2 mat.Float) {
supp[m].ProdScalarInPlace(beta2)
sqGrad := grads.Prod(grads)
defer mat.ReleaseMatrix(sqGrad)
supp[buf2].ProdMatrixScalarInPlace(sqGrad, 1.0-beta2)
supp[m].AddInPlace(supp[buf2])
}
func norm(grads mat.Matrix) mat.Float {
sum := mat.Float(0.0)
for _, d := range grads.Data() {
sum += d * d
}
return mat.Sqrt(sum)
} | pkg/ml/optimizers/gd/lamb/lamb.go | 0.842604 | 0.414129 | lamb.go | starcoder |
package election
import (
"encoding/json"
"github.com/paulmach/go.geojson"
"github.com/twpayne/go-polyline"
)
// EncodedGeometry correlates to a GeoJSON geometry, replacing each linear
// ring's coordinates with its encoded-polyline representation.
type EncodedGeometry struct {
Type string `json:"type"`
Coordinates [][]string `json:"coordinates"`
}
// EncodedPolylineFeature correlates to a GeoJSON feature, allowing
// EncodedGeometry to replace Geometry.
type EncodedPolylineFeature struct {
ID string `json:"id,omitempty"`
Type string `json:"type"`
BoundingBox []float64 `json:"bbox,omitempty"`
// Geometry overrides the Feature's Geometry, allowing us to use the
// encoded polyline algorithm.
Geometry *EncodedGeometry `json:"geometry"`
Properties map[string]interface{} `json:"properties"`
}
// MarshalJSON returns JSON in a byte slice from a given encoded polyline
// feature.
func (epf *EncodedPolylineFeature) MarshalJSON() ([]byte, error) {
// Must use *epf to avoid infinite loop.
return json.Marshal(*epf)
}
// EncodedPolylineFeatureCollection correlates to a GeoJSON feature collection,
// but allows flexibility in the features it contains.
type EncodedPolylineFeatureCollection struct {
Type string `json:"type"`
BoundingBox []float64 `json:"bbox,omitempty"`
Features []MarshalJSON `json:"features"`
}
// MarshalJSON returns JSON in a byte slice from a given encoded polyline
// feature collection.
func (epfc *EncodedPolylineFeatureCollection) MarshalJSON() ([]byte, error) {
// Must use *epfc to avoid infinite loop.
return json.Marshal(*epfc)
}
// MarshalJSON provides a common interface between geojson.Feature,
// geojson.FeatureCollection and our encoded polyline implementation.
type MarshalJSON interface {
MarshalJSON() ([]byte, error)
}
func convertToEncodedPolylineFeature(f *geojson.Feature) (*EncodedPolylineFeature, bool) {
if f.Geometry.Type != "MultiPolygon" {
return nil, false
}
// 3 levels of nesting: Multi then polygon then linear ring; linear
// ring is string-encoded.
var encodedMultiPolygon [][]string
for _, polygon := range f.Geometry.MultiPolygon {
var encodedPolygon []string
for _, linearRing := range polygon {
// http://geojson.org/geojson-spec.html#positions: long,lat.
// https://developers.google.com/maps/documentation/utilities/polylineutility: lat,long.
invertedCoords := make([][]float64, len(linearRing))
for i, coords := range linearRing {
invertedCoords[i] = []float64{coords[1], coords[0]}
}
encoded := string(polyline.EncodeCoords(invertedCoords))
encodedPolygon = append(encodedPolygon, encoded)
}
encodedMultiPolygon = append(encodedMultiPolygon, encodedPolygon)
}
encodedGeom := &EncodedGeometry{
Type: "EncodedMultiPolygon",
Coordinates: encodedMultiPolygon,
}
return &EncodedPolylineFeature{
ID: f.ID.(string),
Type: f.Type,
BoundingBox: f.BoundingBox,
Geometry: encodedGeom,
Properties: f.Properties,
}, true
}
func maybeConvertToEncodedPolylineFeatureCollection(fc *geojson.FeatureCollection) MarshalJSON {
var features []MarshalJSON
for _, f := range fc.Features {
epf, ok := convertToEncodedPolylineFeature(f)
if !ok {
features = append(features, f)
continue
}
features = append(features, epf)
}
return &EncodedPolylineFeatureCollection{
Type: fc.Type,
BoundingBox: fc.BoundingBox,
Features: features,
}
} | go_backend/encoded_polyline.go | 0.859605 | 0.437103 | encoded_polyline.go | starcoder |
package datespec
import (
"time"
)
type DateSpec interface {
OccursOn(*Date) bool
}
// DailyDateSpec takes place every day.
type DailyDateSpec struct{}
// UnionDateSpec takes place on days where any of Specs take place.
type UnionDateSpec struct {
Specs []DateSpec
}
// EveryNthDayDateSpec takes place every Count days starting on January 1, 1970.
type EveryNthDayDateSpec struct {
Count int
}
// WeekdayDateSpec takes place every Weekday.
type WeekdayDateSpec struct {
Weekday time.Weekday
}
// EveryNthWeekdayDateSpec takes place on Weekday every Count weeks, starting the week of January 1, 1970.
type EveryNthWeekdayDateSpec struct {
Weekday time.Weekday
Count int
}
// DayOfMonthDateSpec takes place on the Day'th day of the month when Day is positive, or Day days from the end of the month if it's negative (e.g. in January, -1 would take place on January 31)
type DayOfMonthDateSpec struct {
Day int
}
// YearlyDateSpec takes place on the given day every year.
type YearlyDateSpec struct {
Month time.Month
Day int
}
// SingleDayDateSpec takes place once, on the given date.
type SingleDayDateSpec struct {
Year int
Month time.Month
Day int
}
type Date struct {
Year int
Month time.Month
Day int
}
func (s *UnionDateSpec) OccursOn(date *Date) bool {
for _, spec := range s.Specs {
if spec.OccursOn(date) {
return true
}
}
return false
}
func (s *DailyDateSpec) OccursOn(date *Date) bool {
return true
}
func (s *EveryNthDayDateSpec) OccursOn(date *Date) bool {
return daysSinceEpoch(date)%s.Count == 0
}
func (s *WeekdayDateSpec) OccursOn(date *Date) bool {
return date.LocalStartOfDay().Weekday() == s.Weekday
}
func (s *EveryNthWeekdayDateSpec) OccursOn(date *Date) bool {
return date.LocalStartOfDay().Weekday() == s.Weekday && (daysSinceEpoch(date)/7)%s.Count == 0
}
func (s *DayOfMonthDateSpec) OccursOn(date *Date) bool {
if s.Day < 0 {
return time.Date(date.Year, date.Month+1, s.Day+1, 0, 0, 0, 0, time.Local).Day() == date.Day
}
return s.Day == date.Day
}
func (s *YearlyDateSpec) OccursOn(date *Date) bool {
return s.Month == date.Month && s.Day == date.Day
}
func (s *SingleDayDateSpec) OccursOn(date *Date) bool {
return s.Year == date.Year && s.Month == date.Month && s.Day == date.Day
}
func (d *Date) LocalStartOfDay() time.Time {
return time.Date(d.Year, d.Month, d.Day, 0, 0, 0, 0, time.Local)
}
func daysSinceEpoch(date *Date) int {
return daysBetween(date.LocalStartOfDay(), time.Date(1970, time.January, 1, 0, 0, 0, 0, time.Local))
}
// https://dev.to/samwho/get-the-number-of-days-between-two-dates-in-go-5bf3
func daysBetween(a, b time.Time) int {
if a.After(b) {
a, b = b, a
}
days := -a.YearDay()
for year := a.Year(); year < b.Year(); year++ {
days += time.Date(year, time.December, 31, 0, 0, 0, 0, time.UTC).YearDay()
}
days += b.YearDay()
return days
}
// (source: pkg/datespec/datespec.go)
package godeck
import (
"fmt"
"math/rand"
"sort"
"time"
)
// Suit is the numeric value identifying a card's suit
type Suit uint8
const (
// Spade is the value of a type of cards
Spade Suit = iota
// Diamond is the value of a type of cards
Diamond
// Club is the value of a type of cards
Club
// Heart is the value of a type of cards
Heart
// Joker is the value of a type of cards
Joker
)
var suits = [...]Suit{Spade, Diamond, Club, Heart}
// Rank is the value of the number of a card
type Rank uint8
const (
_ Rank = iota
// Ace is the numeric value of the number of a card
Ace
// Two is the numeric value of the number of a card
Two
// Three is the numeric value of the number of a card
Three
// Four is the numeric value of the number of a card
Four
// Five is the numeric value of the number of a card
Five
// Six is the numeric value of the number of a card
Six
// Seven is the numeric value of the number of a card
Seven
// Eight is the numeric value of the number of a card
Eight
// Nine is the numeric value of the number of a card
Nine
// Ten is the numeric value of the number of a card
Ten
// Jack is the numeric value of the number of a card
Jack
// Queen is the numeric value of the number of a card
Queen
// King is the numeric value of the number of a card
King
)
const (
minRank = Ace
maxRank = King
)
// Card is the type of card object
type Card struct {
Suit
Rank
}
func (c Card) String() string {
if c.Suit == Joker {
return c.Suit.String()
}
return fmt.Sprintf("%s of %s", c.Rank.String(), c.Suit.String())
}
// New creates a new deck of cards.
// It accepts option functions to modify the deck.
func New(options ...func([]Card) []Card) []Card {
var cards []Card
for _, suit := range suits {
for rank := minRank; rank <= maxRank; rank++ {
cards = append(cards, Card{
Suit: suit,
Rank: rank,
})
}
}
for _, option := range options {
cards = option(cards)
}
return cards
}
// DefaultSort is the default sorting function of cards
func DefaultSort(cards []Card) []Card {
return Sort(Less)(cards)
}
// Sort is the function used to sort cards in a godeck
func Sort(less func(cards []Card) func(i, j int) bool) func([]Card) []Card {
return func(cards []Card) []Card {
sort.Slice(cards, less(cards))
return cards
}
}
// Less is the function used to compare cards while sorting
func Less(cards []Card) func(i, j int) bool {
return func(i, j int) bool {
return absRank(cards[i]) < absRank(cards[j])
}
}
func absRank(c Card) int {
return int(c.Suit)*int(maxRank) + int(c.Rank)
}
var shuffleRand = rand.New(rand.NewSource(time.Now().Unix()))
// Shuffle shuffles cards randomly in a godeck
func Shuffle(cards []Card) []Card {
ret := make([]Card, len(cards))
permutations := shuffleRand.Perm(len(cards))
for i, j := range permutations {
ret[i] = cards[j]
}
return ret
}
// Jokers returns an option that appends n jokers to the deck
func Jokers(n int) func(cards []Card) []Card {
return func(cards []Card) []Card {
for i := 0; i < n; i++ {
cards = append(cards, Card{
Suit: Joker,
Rank: Rank(i),
})
}
return cards
}
}
// Deck returns an option that repeats the deck n times
func Deck(n int) func(cards []Card) []Card {
return func(cards []Card) []Card {
var ret []Card
for i := 0; i < n; i++ {
ret = append(ret, cards...)
}
return ret
}
}
// Filter returns an option that removes every card for which
// the predicate f returns true
func Filter(f func(card Card) bool) func([]Card) []Card {
return func(cards []Card) []Card {
var ret []Card
for _, c := range cards {
if !f(c) {
ret = append(ret, c)
}
}
return ret
}
}
// (source: card.go)
package bulk
import (
"context"
"math/rand"
"time"
)
// Backoff implements exponential backoff.
// The wait time between retries is a random value between 0 and the "retry envelope".
// The envelope starts at Initial and increases by the factor of Multiplier every retry,
// but is capped at Max.
type Backoff struct {
// Initial is the initial value of the retry envelope, defaults to 1 second.
Initial time.Duration
// Max is the maximum value of the retry envelope, defaults to 30 seconds.
Max time.Duration
// Multiplier is the factor by which the retry envelope increases.
// It should be greater than 1 and defaults to 2.
Multiplier float64
// cur is the current retry envelope
cur time.Duration
}
// Pause returns the next time.Duration that the caller should use to backoff.
func (bo *Backoff) Pause() time.Duration {
if bo.Initial == 0 {
bo.Initial = time.Second
}
if bo.cur == 0 {
bo.cur = bo.Initial
}
if bo.Max == 0 {
bo.Max = 30 * time.Second
}
if bo.Multiplier < 1 {
bo.Multiplier = 2
}
// Select a duration between 1ns and the current max. It might seem
// counterintuitive to have so much jitter, but
// https://www.awsarchitectureblog.com/2015/03/backoff.html argues that
// that is the best strategy.
d := time.Duration(1 + rand.Int63n(int64(bo.cur)))
bo.cur = time.Duration(float64(bo.cur) * bo.Multiplier)
if bo.cur > bo.Max {
bo.cur = bo.Max
}
return d
}
// Retry calls f repeatedly, sleeping bo.Pause() between attempts,
// until f reports stop or returns an error.
func Retry(bo Backoff, f func() (stop bool, err error)) error {
for {
stop, err := f()
if stop || err != nil {
return err
}
p := bo.Pause()
time.Sleep(p)
}
}
// SleepWithContext is similar to time.Sleep, but it can be interrupted by ctx.Done() closing.
func SleepWithContext(ctx context.Context, d time.Duration) error {
t := time.NewTimer(d)
select {
case <-ctx.Done():
t.Stop()
return ctx.Err()
case <-t.C:
return nil
}
}
// RetryWithContext is like Retry, but sleeps through the supplied
// sleep function so the wait can be cancelled via ctx.
func RetryWithContext(ctx context.Context, bo Backoff, f func() (stop bool, err error),
sleep func(context.Context, time.Duration) error) error {
for {
stop, err := f()
if stop || err != nil {
return err
}
p := bo.Pause()
if cerr := sleep(ctx, p); cerr != nil {
return cerr
}
}
}
// (source: bulk/backoff.go)
// +build ignore
package main
import (
"log"
"os"
"text/template"
)
func main() {
for _, typ := range []struct{ Type, Name, MathName string }{
{"int32", "Int32", ""},
{"int64", "Int64", ""},
{"uint32", "Uint32", ""},
{"uint64", "Uint64", ""},
{"float32", "Float32", "Float32"},
{"float64", "Float64", "Float64"},
} {
f, err := os.Create(typ.Type + "_test.go")
if err != nil {
log.Fatal(err)
}
if err = tmpl.Execute(f, typ); err != nil {
log.Fatal(err)
}
f.Close()
}
}
var tmpl = template.Must(template.New("test").Parse(`// Code generated by go run generate-tests.go.
// Copyright 2017 <NAME>. All rights reserved.
// Use of this source code is governed by a
// Modified BSD License that can be found in
// the LICENSE file.
package atomics
import (
"fmt"
{{- if .MathName}}
"math"
{{- end}}
"testing"
"testing/quick"
)
func Test{{.Name}}Default(t *testing.T) {
var v {{.Name}}
if v.Load() != 0 {
t.Fatal("invalid default value")
}
}
func TestNew{{.Name}}(t *testing.T) {
if New{{.Name}}(0) == nil {
t.Fatal("New{{.Name}} returned nil")
}
}
func Test{{.Name}}Raw(t *testing.T) {
var v {{.Name}}
if v.Raw() == nil {
t.Fatal("Raw returned nil")
}
if err := quick.Check(func(v {{.Type}}) bool {
{{- if .MathName}}
return *New{{.Name}}(v).Raw() == math.{{.MathName}}bits(v)
{{- else}}
return *New{{.Name}}(v).Raw() == v
{{- end}}
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Load(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
return New{{.Name}}(v).Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Store(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
var a {{.Name}}
a.Store(v)
return a.Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Swap(t *testing.T) {
if err := quick.Check(func(old, new {{.Type}}) bool {
a := New{{.Name}}(old)
return a.Swap(new) == old && a.Load() == new
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}CompareAndSwap(t *testing.T) {
if err := quick.Check(func(old, new {{.Type}}) bool {
a := New{{.Name}}(old)
return !a.CompareAndSwap(-old, new) &&
a.Load() == old &&
a.CompareAndSwap(old, new) &&
a.Load() == new
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Add(t *testing.T) {
if err := quick.Check(func(v, delta {{.Type}}) bool {
a := New{{.Name}}(v)
v += delta
return a.Add(delta) == v && a.Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Increment(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
a := New{{.Name}}(v)
v++
return a.Increment() == v && a.Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Subtract(t *testing.T) {
if err := quick.Check(func(v, delta {{.Type}}) bool {
a := New{{.Name}}(v)
v -= delta
return a.Subtract(delta) == v && a.Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Decrement(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
a := New{{.Name}}(v)
v--
return a.Decrement() == v && a.Load() == v
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}Reset(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
a := New{{.Name}}(v)
return a.Reset() == v && a.Load() == 0
}, nil); err != nil {
t.Fatal(err)
}
}
func Test{{.Name}}String(t *testing.T) {
if err := quick.Check(func(v {{.Type}}) bool {
return New{{.Name}}(v).String() == fmt.Sprint(v)
}, nil); err != nil {
t.Fatal(err)
}
}
func BenchmarkNew{{.Name}}(b *testing.B) {
for n := 0; n < b.N; n++ {
New{{.Name}}(0)
}
}
func Benchmark{{.Name}}Load(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Load()
}
}
func Benchmark{{.Name}}Store(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Store(4) // RFC 1149.5 specifies 4 as the standard IEEE-vetted random number.
}
}
func Benchmark{{.Name}}Swap(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Swap(4) // RFC 1149.5 specifies 4 as the standard IEEE-vetted random number.
}
}
func Benchmark{{.Name}}CompareAndSwap(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.CompareAndSwap(0, 0)
}
}
func Benchmark{{.Name}}Add(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Add(4) // RFC 1149.5 specifies 4 as the standard IEEE-vetted random number.
}
}
func Benchmark{{.Name}}Increment(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Increment()
}
}
func Benchmark{{.Name}}Subtract(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Subtract(4) // RFC 1149.5 specifies 4 as the standard IEEE-vetted random number.
}
}
func Benchmark{{.Name}}Decrement(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Decrement()
}
}
func Benchmark{{.Name}}Reset(b *testing.B) {
var v {{.Name}}
for n := 0; n < b.N; n++ {
v.Reset()
}
}
`))
// (source: generate-tests.go)
package ripemd160
import "encoding/binary"
// https://homes.esat.kuleuven.be/~bosselae/ripemd160.html
// https://homes.esat.kuleuven.be/~bosselae/ripemd160/pdf/AB-9601/AB-9601.pdf
// nonlinear functions at bit level: exor, mux, -, mux, -
func f(j int, x, y, z uint32) uint32 {
// f(j, x, y, z) = x ⊕ y ⊕ z ( 0 ≤ j ≤ 15)
if 0 <= j && j <= 15 {
return x ^ y ^ z
}
// f(j, x, y, z) = (x ∧ y) ∨ (¬x ∧ z) (16 ≤ j ≤ 31)
if 16 <= j && j <= 31 {
return (x & y) | (^x & z)
}
// f(j, x, y, z) = (x ∨ ¬y) ⊕ z (32 ≤ j ≤ 47)
if 32 <= j && j <= 47 {
return (x | ^y) ^ z
}
// f(j, x, y, z) = (x ∧ z) ∨ (y ∧ ¬z) (48 ≤ j ≤ 63)
if 48 <= j && j <= 63 {
return (x & z) | (y & ^z)
}
// f(j, x, y, z) = x ⊕ (y ∨ ¬z) (64 ≤ j ≤ 79)
if 64 <= j && j <= 79 {
return x ^ (y | ^z)
}
return 0
}
// K1 : K added constants (hexadecimal)
func K1(j int) uint32 {
// K(j) = 00000000x (0 ≤ j ≤ 15)
if 0 <= j && j <= 15 {
return 0x00000000
}
// K(j) = 5A827999x (16 ≤ j ≤ 31)
if 16 <= j && j <= 31 {
return 0x5A827999
}
// K(j) = 6ED9EBA1x (32 ≤ j ≤ 47)
if 32 <= j && j <= 47 {
return 0x6ED9EBA1
}
// K(j) = 8F1BBCDCx (48 ≤ j ≤ 63)
if 48 <= j && j <= 63 {
return 0x8F1BBCDC
}
// K(j) = A953FD4Ex (64 ≤ j ≤ 79)
if 64 <= j && j <= 79 {
return 0xA953FD4E
}
return 0
}
// K2 : K' added constants (hexadecimal)
func K2(j int) uint32 {
// K'(j) = 50A28BE6x (0 ≤ j ≤ 15)
if 0 <= j && j <= 15 {
return 0x50A28BE6
}
// K'(j) = 5C4DD124x (16 ≤ j ≤ 31)
if 16 <= j && j <= 31 {
return 0x5C4DD124
}
// K'(j) = 6D703EF3x (32 ≤ j ≤ 47)
if 32 <= j && j <= 47 {
return 0x6D703EF3
}
// K'(j) = 7A6D76E9x (48 ≤ j ≤ 63)
if 48 <= j && j <= 63 {
return 0x7A6D76E9
}
// K'(j) = 00000000x (64 ≤ j ≤ 79)
if 64 <= j && j <= 79 {
return 0x00000000
}
return 0
}
// r1 : r selection of message word
var r1 = []int{
// r(j) = j (0 ≤ j ≤ 15)
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15,
// r(16..31) = 7, 4, 13, 1, 10, 6, 15, 3, 12, 0, 9, 5, 2, 14, 11, 8
7, 4, 13, 1, 10, 6, 15, 3, 12, 0, 9, 5, 2, 14, 11, 8,
// r(32..47) = 3, 10, 14, 4, 9, 15, 8, 1, 2, 7, 0, 6, 13, 11, 5, 12
3, 10, 14, 4, 9, 15, 8, 1, 2, 7, 0, 6, 13, 11, 5, 12,
// r(48..63) = 1, 9, 11, 10, 0, 8, 12, 4, 13, 3, 7, 15, 14, 5, 6, 2
1, 9, 11, 10, 0, 8, 12, 4, 13, 3, 7, 15, 14, 5, 6, 2,
// r(64..79) = 4, 0, 5, 9, 7, 12, 2, 10, 14, 1, 3, 8, 11, 6, 15, 13
4, 0, 5, 9, 7, 12, 2, 10, 14, 1, 3, 8, 11, 6, 15, 13,
}
// r2 : r' selection of message word
var r2 = []int{
// r'(0..15) = 5, 14, 7, 0, 9, 2, 11, 4, 13, 6, 15, 8, 1, 10, 3, 12
5, 14, 7, 0, 9, 2, 11, 4, 13, 6, 15, 8, 1, 10, 3, 12,
// r'(16..31) = 6, 11, 3, 7, 0, 13, 5, 10, 14, 15, 8, 12, 4, 9, 1, 2
6, 11, 3, 7, 0, 13, 5, 10, 14, 15, 8, 12, 4, 9, 1, 2,
// r'(32..47) = 15, 5, 1, 3, 7, 14, 6, 9, 11, 8, 12, 2, 10, 0, 4, 13
15, 5, 1, 3, 7, 14, 6, 9, 11, 8, 12, 2, 10, 0, 4, 13,
// r'(48..63) = 8, 6, 4, 1, 3, 11, 15, 0, 5, 12, 2, 13, 9, 7, 10, 14
8, 6, 4, 1, 3, 11, 15, 0, 5, 12, 2, 13, 9, 7, 10, 14,
// r'(64..79) = 12, 15, 10, 4, 1, 5, 8, 7, 6, 2, 13, 14, 0, 3, 9, 11
12, 15, 10, 4, 1, 5, 8, 7, 6, 2, 13, 14, 0, 3, 9, 11,
}
// s1 : s amount for rotate left (rol)
var s1 = []uint32{
// s(0..15) = 11, 14, 15, 12, 5, 8, 7, 9, 11, 13, 14, 15, 6, 7, 9, 8
11, 14, 15, 12, 5, 8, 7, 9, 11, 13, 14, 15, 6, 7, 9, 8,
// s(16..31) = 7, 6, 8, 13, 11, 9, 7, 15, 7, 12, 15, 9, 11, 7, 13, 12
7, 6, 8, 13, 11, 9, 7, 15, 7, 12, 15, 9, 11, 7, 13, 12,
// s(32..47) = 11, 13, 6, 7, 14, 9, 13, 15, 14, 8, 13, 6, 5, 12, 7, 5
11, 13, 6, 7, 14, 9, 13, 15, 14, 8, 13, 6, 5, 12, 7, 5,
// s(48..63) = 11, 12, 14, 15, 14, 15, 9, 8, 9, 14, 5, 6, 8, 6, 5, 12
11, 12, 14, 15, 14, 15, 9, 8, 9, 14, 5, 6, 8, 6, 5, 12,
// s(64..79) = 9, 15, 5, 11, 6, 8, 13, 12, 5, 12, 13, 14, 11, 8, 5, 6
9, 15, 5, 11, 6, 8, 13, 12, 5, 12, 13, 14, 11, 8, 5, 6,
}
// s2 : s' amount for rotate left (rol)
var s2 = []uint32{
// s'(0..15) = 8, 9, 9, 11, 13, 15, 15, 5, 7, 7, 8, 11, 14, 14, 12, 6
8, 9, 9, 11, 13, 15, 15, 5, 7, 7, 8, 11, 14, 14, 12, 6,
// s'(16..31) = 9, 13, 15, 7, 12, 8, 9, 11, 7, 7, 12, 7, 6, 15, 13, 11
9, 13, 15, 7, 12, 8, 9, 11, 7, 7, 12, 7, 6, 15, 13, 11,
// s'(32..47) = 9, 7, 15, 11, 8, 6, 6, 14, 12, 13, 5, 14, 13, 13, 7, 5
9, 7, 15, 11, 8, 6, 6, 14, 12, 13, 5, 14, 13, 13, 7, 5,
// s'(48..63) = 15, 5, 8, 11, 14, 14, 6, 14, 6, 9, 12, 9, 12, 5, 15, 8
15, 5, 8, 11, 14, 14, 6, 14, 6, 9, 12, 9, 12, 5, 15, 8,
// s'(64..79) = 8, 5, 12, 9, 12, 5, 14, 6, 8, 13, 6, 5, 15, 13, 11, 11
8, 5, 12, 9, 12, 5, 14, 6, 8, 13, 6, 5, 15, 13, 11, 11,
}
// rol_s denotes cyclic left shift (rotate) over s positions.
func rol(s, x uint32) uint32 {
return (x << s) | (x >> (32 - s))
}
func padding(msg []byte) []byte {
// https://tools.ietf.org/html/rfc1320
// 3.1 Step 1. Append Padding Bits
msgLen := len(msg)
tmp := make([]byte, 64)
tmp[0] = 0x80
bs := make([]byte, msgLen)
copy(bs, msg)
if msgLen%64 < 56 {
bs = append(bs, tmp[0:56-msgLen%64]...)
} else {
bs = append(bs, tmp[0:64+56-msgLen%64]...)
}
// 3.2 Step 2. Append Length
bits := uint64(msgLen * 8)
size := make([]byte, 8)
binary.LittleEndian.PutUint64(size, bits)
bs = append(bs, size...)
return bs
}
func compute(msg []byte) []byte {
t := len(msg) / 64
// initial value (hexadecimal)
// h0 = 67452301x; h1 = EFCDAB89x; h2 = 98BADCFEx; h3 = 10325476x; h4 = C3D2E1F0x;
h0 := uint32(0x67452301)
h1 := uint32(0xEFCDAB89)
h2 := uint32(0x98BADCFE)
h3 := uint32(0x10325476)
h4 := uint32(0xC3D2E1F0)
// RIPEMD-160: pseudo-code
for i := 0; i < t; i++ {
// let X = [];
X := make([]uint32, 16)
for t := 0; t < 16; t++ {
p := i*64 + t*4
X[t] = binary.LittleEndian.Uint32(msg[p : p+4])
}
// A := h0; B := h1; C := h2; D = h3; E = h4;
A1 := h0
B1 := h1
C1 := h2
D1 := h3
E1 := h4
// A' := h0; B' := h1; C' := h2; D' = h3; E' = h4;
A2 := h0
B2 := h1
C2 := h2
D2 := h3
E2 := h4
T := uint32(0)
for j := 0; j < 80; j++ {
// T := rol_s(j) (A ⊞ f(j, B, C, D) ⊞ Xi[r(j)] ⊞ K(j)) ⊞ E;
T = ta(rol(s1[j], ta(ta(ta(A1, f(j, B1, C1, D1)), X[r1[j]]), K1(j))), E1)
// A := E; E := D; D := rol_10(C); C := B; B := T;
A1 = E1
E1 = D1
D1 = rol(10, C1)
C1 = B1
B1 = T
// T := rol_s'(j) (A' ⊞ f(79 - j, B', C', D') ⊞ Xi[r'(j)] ⊞ K'(j)) ⊞ E';
T = ta(rol(s2[j], ta(ta(ta(A2, f(79-j, B2, C2, D2)), X[r2[j]]), K2(j))), E2)
// A' := E'; E' := D'; D' := rol_10(C'); C' := B'; B' := T;
A2 = E2
E2 = D2
D2 = rol(10, C2)
C2 = B2
B2 = T
}
// T := h1 ⊞ C ⊞ D'; h1 := h2 ⊞ D ⊞ E'; h2 := h3 ⊞ E ⊞ A';
T = ta(ta(h1, C1), D2)
h1 = ta(ta(h2, D1), E2)
h2 = ta(ta(h3, E1), A2)
// h3 := h4 ⊞ A ⊞ B'; h4 := h0 ⊞ B ⊞ C'; h0 := T;
h3 = ta(ta(h4, A1), B2)
h4 = ta(ta(h0, B1), C2)
h0 = T
}
hash := make([]byte, 20)
binary.LittleEndian.PutUint32(hash[0:], h0)
binary.LittleEndian.PutUint32(hash[4:], h1)
binary.LittleEndian.PutUint32(hash[8:], h2)
binary.LittleEndian.PutUint32(hash[12:], h3)
binary.LittleEndian.PutUint32(hash[16:], h4)
return hash
}
// ta is 32-bit modular (wrap-around) addition, written ⊞ in the
// RIPEMD-160 pseudocode.
func ta(x, y uint32) uint32 {
return x + y
}
// Digest ...
func Digest(msg []byte) []byte {
return compute(padding(msg))
}
// (source: ripemd160/ripemd160.go)
package tools
import (
"fmt"
"reflect"
"strconv"
"strings"
)
type StrConvert func(in string) (out interface{}, err error)
// TypeConvert container
var TypeConvert map[reflect.Kind]StrConvert
func init() {
TypeConvert = make(map[reflect.Kind]StrConvert)
TypeConvert[reflect.Bool] = BoolConvert
TypeConvert[reflect.Int] = IntConvert
TypeConvert[reflect.Int8] = Int8Convert
TypeConvert[reflect.Int16] = Int16Convert
TypeConvert[reflect.Int32] = Int32Convert
TypeConvert[reflect.Int64] = Int64Convert
TypeConvert[reflect.Uint] = UintConvert
TypeConvert[reflect.Uint8] = Uint8Convert
TypeConvert[reflect.Uint16] = Uint16Convert
TypeConvert[reflect.Uint32] = Uint32Convert
TypeConvert[reflect.Uint64] = Uint64Convert
TypeConvert[reflect.Float32] = Float32Convert
TypeConvert[reflect.Float64] = Float64Convert
TypeConvert[reflect.String] = StringConvert
TypeConvert[reflect.Slice] = SliceConvert
TypeConvert[reflect.Array] = ArrayConvert
TypeConvert[reflect.Map] = MapConvert
TypeConvert[reflect.Struct] = StructConvert
// unimplemented
TypeConvert[reflect.Uintptr] = UintptrConvert
TypeConvert[reflect.Complex64] = Complex64Convert
TypeConvert[reflect.Complex128] = Complex128Convert
TypeConvert[reflect.Chan] = ChanConvert
TypeConvert[reflect.Func] = FuncConvert
TypeConvert[reflect.Interface] = InterfaceConvert
TypeConvert[reflect.Ptr] = PtrConvert
}
func BoolConvert(in string) (out interface{}, err error) {
return parseBool(in)
}
func IntConvert(in string) (out interface{}, err error) {
return strconv.Atoi(in)
}
func Int8Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseInt(in, 0, 8)
out = int8(r)
return
}
func Int16Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseInt(in, 0, 16)
out = int16(r)
return
}
func Int32Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseInt(in, 0, 32)
out = int32(r)
return
}
func Int64Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseInt(in, 0, 64)
out = int64(r)
return
}
func UintConvert(in string) (out interface{}, err error) {
// uint is platform-sized, so parse with strconv.IntSize rather than
// a hard-coded 32 bits.
r, err := strconv.ParseUint(in, 0, strconv.IntSize)
out = uint(r)
return
}
func Uint8Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseUint(in, 0, 8)
out = uint8(r)
return
}
func Uint16Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseUint(in, 0, 16)
out = uint16(r)
return
}
func Uint32Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseUint(in, 0, 32)
out = uint32(r)
return
}
func Uint64Convert(in string) (out interface{}, err error) {
r, err := strconv.ParseUint(in, 0, 64)
out = uint64(r)
return
}
func UintptrConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func Float32Convert(in string) (out interface{}, err error) {
float, err := strconv.ParseFloat(in, 32)
out = float32(float)
return
}
func Float64Convert(in string) (out interface{}, err error) {
float, err := strconv.ParseFloat(in, 64)
out = float64(float)
return
}
func Complex64Convert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func Complex128Convert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func ArrayConvert(in string) (out interface{}, err error) {
out = strings.Split(in, `,`)
return
}
func ChanConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func FuncConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func InterfaceConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func MapConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func PtrConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func SliceConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func StringConvert(in string) (out interface{}, err error) {
out = in
return
}
func StructConvert(in string) (out interface{}, err error) {
panic("unimplemented")
}
func parseBool(str string) (value bool, err error) {
switch str {
case "1", "t", "T", "true", "TRUE", "True", "YES", "yes", "Yes", "y", "ON", "on", "On":
return true, nil
case "0", "f", "F", "false", "FALSE", "False", "NO", "no", "No", "n", "OFF", "off", "Off":
return false, nil
}
return false, fmt.Errorf("parsing \"%s\": invalid syntax", str)
}
// (source: core/tools/type_convert.go)
package base
import (
"errors"
"fmt"
"strconv"
"strings"
"time"
)
// NewEmptyGTS returns an empty GTS
func NewEmptyGTS() *GTS {
return >S{
ClassName: "",
Labels: Labels{},
Attributes: Attributes{},
LastActivity: 0,
Values: [][]interface{}{},
}
}
// NewGTS returns a named GTS
func NewGTS(className string) *GTS {
return >S{
ClassName: className,
Labels: Labels{},
Attributes: Attributes{},
LastActivity: 0,
Values: [][]interface{}{},
}
}
// NewGTSWithLabels returns a named and labeled GTS
func NewGTSWithLabels(className string, labels Labels) *GTS {
return >S{
ClassName: className,
Labels: labels,
Attributes: Attributes{},
LastActivity: 0,
Values: [][]interface{}{},
}
}
// ParseGTSFromString parses a single Sensision-format line into a new GTS
func ParseGTSFromString(sensisionLine string) (*GTS, error) {
ts, lat, long, alt, c, l, a, v, err := parseSensisionLine(strings.TrimSpace(sensisionLine))
if err != nil {
return nil, err
}
gts := >S{
ClassName: c,
Labels: l,
Attributes: a,
Values: [][]interface{}{{ts, lat, long, alt, v}},
}
return gts, nil
}
// ParseGTSFromBytes parses a Sensision-format line into a new GTS
func ParseGTSFromBytes(in []byte) (*GTS, error) {
return ParseGTSFromString(string(in))
}
// ParseGTSArrayFromString parses Sensision-format lines into a new GTS array
func ParseGTSArrayFromString(in string) (GTSList, error) {
gtss := GTSList{}
for _, sensisionLine := range strings.Split(in, "\n") {
gts, err := ParseGTSFromString(sensisionLine)
if err != nil {
return nil, err
}
gtss = append(gtss, gts)
}
return gtss, nil
}
// ParseGTSArrayFromBytes parses Sensision-format lines into a new GTS array
func ParseGTSArrayFromBytes(in []byte) (GTSList, error) {
return ParseGTSArrayFromString(string(in))
}
func parseSensisionLine(in string) (int64, float64, float64, float64, string, Labels, Attributes, interface{}, error) {
var err error
var v interface{}
v = 0
ts := time.Now().UnixNano() / 1000 // default to the current time in microseconds
lat := float64(0)
long := float64(0)
alt := float64(0)
c := ""
l := Labels{}
a := Attributes{}
// 469756475/5496:54965/5496757 test{l=v,ufgsdg=vfivuvf}{a=b} 57486847
g := strings.Split(in, " ")
if len(g) < 3 {
err = errors.New("Cannot parse datapoint: " + in)
return ts, lat, long, alt, c, l, a, v, err
}
ctx := strings.Split(g[0], "/")
if len(ctx) != 3 {
err = errors.New("Cannot parse datapoint: " + in)
return ts, lat, long, alt, c, l, a, v, err
}
if ctx[0] == "" {
err = errors.New("No timestamp provided")
return ts, lat, long, alt, c, l, a, v, err
}
ts, err = strconv.ParseInt(ctx[0], 10, 64)
if err != nil {
err = errors.New("Cannot parse " + ctx[0] + " as valid timestamp")
return ts, lat, long, alt, c, l, a, v, err
}
if ctx[2] != "" {
alt, err = strconv.ParseFloat(ctx[2], 64)
if err != nil {
err = errors.New("Cannot parse " + ctx[2] + " as valid altitude")
return ts, lat, long, alt, c, l, a, v, err
}
}
if ctx[1] != "" {
latlong := strings.Split(ctx[1], ":")
if len(latlong) != 2 {
err = errors.New("Cannot parse " + ctx[1] + " as valid latitude:longitude")
return ts, lat, long, alt, c, l, a, v, err
}
lat, err = strconv.ParseFloat(latlong[0], 64)
if err != nil {
err = errors.New("Cannot parse " + latlong[0] + " as valid latitude")
return ts, lat, long, alt, c, l, a, v, err
}
long, err = strconv.ParseFloat(latlong[1], 64)
if err != nil {
err = errors.New("Cannot parse " + latlong[1] + " as valid longitude")
return ts, lat, long, alt, c, l, a, v, err
}
}
// g[1] = test{l=v,ufgsdg=vfivuvf}{a=b}
classLabelsAttributes := strings.SplitN(g[1], "{", 2)
// test l=v,ufgsdg=vfivuvf}{a=b}
// test l=v,ufgsdg=vfivuvf}
// test }
if len(classLabelsAttributes) != 2 {
err = errors.New("Cannot parse " + g[1] + " as valid class + labels + attributes")
return ts, lat, long, alt, c, l, a, v, err
}
c = classLabelsAttributes[0]
classLabelsAttributes[1] = strings.TrimSuffix(classLabelsAttributes[1], "}")
// contains labels
if classLabelsAttributes[1] != "" {
if strings.Contains(classLabelsAttributes[1], "}{") {
//TODO: handle attributes
classLabelsAttributes[1] = strings.Split(classLabelsAttributes[1], "}{")[0]
}
// attributes cleaned
labelsValue := strings.Split(classLabelsAttributes[1], ",")
for _, labelPair := range labelsValue {
keyVal := strings.Split(labelPair, "=")
if len(keyVal) != 2 {
err = errors.New("Cannot parse " + labelPair + " as valid key and value label")
return ts, lat, long, alt, c, l, a, v, err
}
l[keyVal[0]] = keyVal[1]
}
}
vStr := strings.Join(g[2:], " ")
vStr = strings.Trim(vStr, "'")
if len(vStr) == 0 {
err = errors.New("Cannot parse " + strings.Join(g[2:], " ") + " as valid value")
return ts, lat, long, alt, c, l, a, v, err
}
var vInt int
vInt, err = strconv.Atoi(vStr)
if err == nil {
v = vInt
} else if vInt, err := strconv.ParseInt(vStr, 10, 64); err == nil {
v = vInt
} else {
v = vStr
}
return ts, lat, long, alt, c, l, a, v, nil
}
// SensisionSelector returns the GTS selector (class + labels (+ attributes) )
func (gts *GTS) SensisionSelector(withAttributes bool) string {
s := gts.ClassName + formatLabels(gts.Labels)
if withAttributes {
s += formatAttributes(gts.Attributes)
}
return s
}
// SensisionSelectors returns the GTSList selectors (class + labels (+ attributes) )
func (gtsList GTSList) SensisionSelectors(withAttributes bool) string {
s := ""
for _, gts := range gtsList {
s += gts.SensisionSelector(withAttributes) + "\n"
}
return s
}
// Sensision returns the Sensision format of the GTS
func (gts *GTS) Sensision() string {
s := ""
static := gts.SensisionSelector(false)
for _, dps := range gts.Values {
l := len(dps)
if l == 0 {
continue
}
ts := ""
lat := ""
lng := ""
alt := ""
val := getVal(dps[l-1])
if l >= 2 {
ts = fmt.Sprintf("%v", dps[0])
}
if l >= 4 {
lat = fmt.Sprintf("%v", dps[1])
lng = fmt.Sprintf("%v", dps[2])
}
if l == 5 {
alt = fmt.Sprintf("%v", dps[3])
}
if lat != "" && lng != "" {
s += fmt.Sprintf("%s/%s:%s/%s %s %s", ts, lat, lng, alt, static, val)
} else {
s += fmt.Sprintf("%s//%s %s %s", ts, alt, static, val)
}
s += "\n"
}
return s
}
// Sensision returns the Sensision format of all GTS
func (gtsList GTSList) Sensision() string {
s := ""
for _, gts := range gtsList {
s += gts.Sensision() + "\n"
}
return s
}
func getVal(i interface{}) string {
switch i.(type) {
case int, int8, int16, int32, int64, uint, uint8, uint16, uint32, uint64:
return fmt.Sprintf("%d", i)
case float32, float64:
return fmt.Sprintf("%g", i)
case string:
return fmt.Sprintf("'%s'", i)
case fmt.Stringer:
return fmt.Sprintf("'%s'", i.(fmt.Stringer).String())
}
return fmt.Sprintf("'%v'", i)
}
func formatLabels(labels Labels) string {
s := "{"
if len(labels) > 0 {
pairs := []string{}
for k, v := range labels {
if !strings.HasPrefix(v, "=") && !strings.HasPrefix(v, "~") {
v = "=" + v
}
pairs = append(pairs, k+v)
}
s += strings.Join(pairs, ",")
}
return s + "}"
}
func formatAttributes(attrs Attributes) string {
s := "{"
if attrs != nil {
pairs := []string{}
for k, v := range attrs {
if !strings.HasPrefix(v, "=") && !strings.HasPrefix(v, "~") {
v = "=" + v
}
pairs = append(pairs, k+v)
}
s += strings.Join(pairs, ",")
}
s += "}"
return s
}
// (source: base/gts-helper.go)
package pixelate
import (
"image"
"image/color"
"github.com/Nyarum/img/utils"
)
// Halves the width of the double-width image created by hxl to produce nice
// smooth edges.
func halveWidth(img image.Image) image.Image {
b := img.Bounds()
o := image.NewRGBA(image.Rect(0, 0, b.Dx()/2, b.Dy()))
for y := 0; y < b.Dy(); y++ {
for x := 0; x < b.Dx()/2; x++ {
l := img.At(x*2, y)
r := img.At(x*2+1, y)
o.Set(x, y, utils.Average(l, r))
}
}
return o
}
// Hxl pixelates the Image into equilateral triangles with the width
// given. These are arranged into hexagonal shapes.
func Hxl(img image.Image, width int) image.Image {
b := img.Bounds()
pixelHeight := width * 2
pixelWidth := width
cols := b.Dx() / pixelWidth
rows := b.Dy() / pixelHeight
o := image.NewRGBA(image.Rect(0, 0, pixelWidth*cols*2, pixelHeight*rows))
// Note: "Top" doesn't mean above the x-axis, it means in the triangle
// pointing towards the x-axis.
inTop := func(x, y float64) bool {
return (x >= 0 && y >= x) || (x <= 0 && y >= -x)
}
// Same for "Bottom" this is the triangle below and pointing towards the
// x-axis.
inBottom := func(x, y float64) bool {
return (x >= 0 && y <= -x) || (x <= 0 && y <= x)
}
for col := 0; col < cols; col++ {
for row := 0; row < rows; row++ {
north := []color.Color{}
south := []color.Color{}
for y := 0; y < pixelHeight; y++ {
for x := 0; x < pixelWidth; x++ {
realY := row*pixelHeight + y
realX := col*pixelWidth + x
pixel := img.At(realX, realY)
yOrigin := float64(y - pixelHeight/2)
xOrigin := float64(x - pixelWidth/2)
if inTop(xOrigin, yOrigin) {
north = append(north, pixel)
} else if inBottom(xOrigin, yOrigin) {
south = append(south, pixel)
}
}
}
top := utils.Average(north...)
bot := utils.Average(south...)
for y := 0; y < pixelHeight; y++ {
for x := 0; x < pixelWidth*2; x++ {
realY := row*pixelHeight + y
realX := col*pixelWidth*2 + x
yOrigin := float64(y - pixelHeight/2)
xOrigin := float64(x - pixelWidth*2/2)
if inTop(xOrigin, yOrigin) {
o.Set(realX, realY, top)
} else if inBottom(xOrigin, yOrigin) {
o.Set(realX, realY, bot)
}
}
}
}
}
// Now for the shifted version
offsetY := pixelHeight / 2
offsetX := pixelWidth / 2
for col := -1; col < cols; col++ {
for row := -1; row < rows; row++ {
north := []color.Color{}
south := []color.Color{}
for y := 0; y < pixelHeight; y++ {
for x := 0; x < pixelWidth; x++ {
realY := row*pixelHeight + y + offsetY
realX := col*pixelWidth + x + offsetX
if realX >= 0 && realX < b.Dx() {
pixel := img.At(realX, realY)
yOrigin := float64(y - pixelHeight/2)
xOrigin := float64(x - pixelWidth/2)
if inTop(xOrigin, yOrigin) {
north = append(north, pixel)
} else if inBottom(xOrigin, yOrigin) {
south = append(south, pixel)
}
}
}
}
top := utils.Average(north...)
bot := utils.Average(south...)
for y := 0; y < pixelHeight; y++ {
for x := 0; x < pixelWidth*2; x++ {
realY := row*pixelHeight + y + offsetY
realX := col*pixelWidth*2 + x + offsetX*2
yOrigin := float64(y - pixelHeight/2)
xOrigin := float64(x - pixelWidth*2/2)
if inTop(xOrigin, yOrigin) {
o.Set(realX, realY, top)
} else if inBottom(xOrigin, yOrigin) {
o.Set(realX, realY, bot)
}
}
}
}
}
return halveWidth(o)
} | pixelate/hxl.go |
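The cell-assignment logic in `Hxl` hinges on the two triangle predicates above. A minimal standalone sketch of the same predicates with a small demo (remember the y-axis follows image convention, so "top" is the triangle pointing toward the x-axis, as the original comment notes):

```go
package main

import "fmt"

// inTop reports whether the point (x, y), measured from the cell centre,
// lies in the triangle pointing towards the x-axis from above.
func inTop(x, y float64) bool {
	return (x >= 0 && y >= x) || (x <= 0 && y >= -x)
}

// inBottom is the mirror image: the triangle pointing towards the
// x-axis from below.
func inBottom(x, y float64) bool {
	return (x >= 0 && y <= -x) || (x <= 0 && y <= x)
}

func main() {
	// A point directly above the centre is in the top triangle only.
	fmt.Println(inTop(0, 2), inBottom(0, 2))
	// A point directly below the centre is in the bottom triangle only.
	fmt.Println(inTop(0, -2), inBottom(0, -2))
	// Points off to the side fall in neither triangle.
	fmt.Println(inTop(3, 1), inBottom(3, 1))
}
```

In `Hxl` itself the predicates are evaluated against coordinates shifted to the cell centre, and the `else if` guarantees each pixel contributes to at most one of the two averages.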
package tsdmetrics
import "github.com/rcrowley/go-metrics"
type IntegerHistogram interface {
Clear()
Count() int64
Max() int64
Mean() int64
Min() int64
Percentile(float64) int64
Percentiles([]float64) []int64
Sample() metrics.Sample
Snapshot() IntegerHistogram
StdDev() int64
Sum() int64
Update(int64)
Variance() int64
}
func NewIntegerHistogram(s metrics.Sample) IntegerHistogram {
return &IntHistogram{sample: s}
}
// IntHistogram is the standard implementation of a Histogram and uses a
// Sample to bound its memory use.
type IntHistogram struct {
sample metrics.Sample
}
// Clear clears the histogram and its sample.
func (h *IntHistogram) Clear() { h.sample.Clear() }
// Count returns the number of samples recorded since the histogram was last
// cleared.
func (h *IntHistogram) Count() int64 { return h.sample.Count() }
// Max returns the maximum value in the sample.
func (h *IntHistogram) Max() int64 { return h.sample.Max() }
// Mean returns the mean of the values in the sample.
func (h *IntHistogram) Mean() int64 { return int64(h.sample.Mean()) }
// Min returns the minimum value in the sample.
func (h *IntHistogram) Min() int64 { return h.sample.Min() }
// Percentile returns an arbitrary percentile of the values in the sample.
func (h *IntHistogram) Percentile(p float64) int64 {
return int64(h.sample.Percentile(p))
}
// Percentiles returns a slice of arbitrary percentiles of the values in the
// sample.
func (h *IntHistogram) Percentiles(ps []float64) []int64 {
vals := make([]int64, len(ps))
for i, v := range h.sample.Percentiles(ps) {
vals[i] = int64(v)
}
return vals
}
// Sample returns the Sample underlying the histogram.
func (h *IntHistogram) Sample() metrics.Sample { return h.sample }
// Snapshot returns a read-only copy of the histogram.
func (h *IntHistogram) Snapshot() IntegerHistogram {
return &IntHistogramSnapshot{sample: h.sample.Snapshot().(*metrics.SampleSnapshot)}
}
// StdDev returns the standard deviation of the values in the sample.
func (h *IntHistogram) StdDev() int64 { return int64(h.sample.StdDev()) }
// Sum returns the sum in the sample.
func (h *IntHistogram) Sum() int64 { return int64(h.sample.Sum()) }
// Update samples a new value.
func (h *IntHistogram) Update(v int64) { h.sample.Update(v) }
// Variance returns the variance of the values in the sample.
func (h *IntHistogram) Variance() int64 { return int64(h.sample.Variance()) }
// IntHistogramSnapshot is a read-only copy of another Histogram.
type IntHistogramSnapshot struct {
sample *metrics.SampleSnapshot
}
// Clear panics.
func (*IntHistogramSnapshot) Clear() {
panic("Clear called on a IntHistogramSnapshot")
}
// Count returns the number of samples recorded at the time the snapshot was
// taken.
func (h *IntHistogramSnapshot) Count() int64 { return h.sample.Count() }
// Max returns the maximum value in the sample at the time the snapshot was
// taken.
func (h *IntHistogramSnapshot) Max() int64 { return h.sample.Max() }
// Mean returns the mean of the values in the sample at the time the snapshot
// was taken.
func (h *IntHistogramSnapshot) Mean() int64 { return int64(h.sample.Mean()) }
// Min returns the minimum value in the sample at the time the snapshot was
// taken.
func (h *IntHistogramSnapshot) Min() int64 { return h.sample.Min() }
// Percentile returns an arbitrary percentile of values in the sample at the
// time the snapshot was taken.
func (h *IntHistogramSnapshot) Percentile(p float64) int64 {
return int64(h.sample.Percentile(p))
}
// Percentiles returns a slice of arbitrary percentiles of values in the sample
// at the time the snapshot was taken.
func (h *IntHistogramSnapshot) Percentiles(ps []float64) []int64 {
vals := make([]int64, len(ps))
for i, v := range h.sample.Percentiles(ps) {
vals[i] = int64(v)
}
return vals
}
// Sample returns the Sample underlying the histogram.
func (h *IntHistogramSnapshot) Sample() metrics.Sample { return h.sample }
// Snapshot returns the snapshot.
func (h *IntHistogramSnapshot) Snapshot() IntegerHistogram { return h }
// StdDev returns the standard deviation of the values in the sample at the
// time the snapshot was taken.
func (h *IntHistogramSnapshot) StdDev() int64 { return int64(h.sample.StdDev()) }
// Sum returns the sum in the sample at the time the snapshot was taken.
func (h *IntHistogramSnapshot) Sum() int64 { return h.sample.Sum() }
// Update panics.
func (*IntHistogramSnapshot) Update(int64) {
panic("Update called on a IntHistogramSnapshot")
}
// Variance returns the variance of inputs at the time the snapshot was taken.
func (h *IntHistogramSnapshot) Variance() int64 { return int64(h.sample.Variance()) } | integer_histogram.go |
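One subtlety of the wrapper above: every float64 statistic from the underlying sample (`Mean`, `StdDev`, `Percentile`, `Variance`) is narrowed with a plain `int64(...)` conversion, which truncates toward zero rather than rounding. A tiny sketch of that behaviour:

```go
package main

import "fmt"

// truncate mirrors how IntHistogram narrows float64 statistics to int64:
// a plain conversion, which discards the fractional part (toward zero).
func truncate(v float64) int64 {
	return int64(v)
}

func main() {
	fmt.Println(truncate(2.9))  // 2, not 3
	fmt.Println(truncate(-2.9)) // -2: truncation is toward zero
}
```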
package ms
import (
"encoding/binary"
)
const (
BlocketteHeaderSize = 4
Blockette1000Size = 4
Blockette1001Size = 4
)
// BlocketteHeader stores the header of each miniseed blockette.
type BlocketteHeader struct {
BlocketteType uint16
NextBlockette uint16 // Byte of next blockette, 0 if last blockette
}
// DecodeBlocketteHeader returns a BlocketteHeader from a byte slice.
func DecodeBlocketteHeader(data []byte) BlocketteHeader {
var h [BlocketteHeaderSize]byte
copy(h[:], data)
return BlocketteHeader{
BlocketteType: binary.BigEndian.Uint16(h[0:2]),
NextBlockette: binary.BigEndian.Uint16(h[2:4]),
}
}
// EncodeBlocketteHeader converts a BlocketteHeader into a byte slice.
func EncodeBlocketteHeader(hdr BlocketteHeader) []byte {
var b [BlocketteHeaderSize]byte
binary.BigEndian.PutUint16(b[0:2], hdr.BlocketteType)
binary.BigEndian.PutUint16(b[2:4], hdr.NextBlockette)
h := make([]byte, BlocketteHeaderSize)
copy(h[0:BlocketteHeaderSize], b[:])
return h
}
// Unmarshal converts a byte slice into the BlocketteHeader
func (h *BlocketteHeader) Unmarshal(data []byte) error {
*h = DecodeBlocketteHeader(data)
return nil
}
// Marshal converts a BlocketteHeader into a byte slice.
func (h BlocketteHeader) Marshal() ([]byte, error) {
return EncodeBlocketteHeader(h), nil
}
// Blockette1000 is a Data Only Seed Blockette (excluding header).
type Blockette1000 struct {
Encoding uint8
WordOrder uint8
RecordLength uint8
Reserved uint8
}
// DecodeBlockette1000 returns a Blockette1000 from a byte slice.
func DecodeBlockette1000(data []byte) Blockette1000 {
var b [Blockette1000Size]byte
copy(b[:], data)
return Blockette1000{
Encoding: b[0],
WordOrder: b[1],
RecordLength: b[2],
Reserved: b[3],
}
}
// EncodeBlockette1000 converts a Blockette1000 into a byte slice.
func EncodeBlockette1000(blk Blockette1000) []byte {
var b [Blockette1000Size]byte
b[0] = uint8(blk.Encoding)
b[1] = uint8(blk.WordOrder)
b[2] = uint8(blk.RecordLength)
b[3] = uint8(blk.Reserved)
d := make([]byte, Blockette1000Size)
copy(d[0:Blockette1000Size], b[:])
return d
}
// Unmarshal converts a byte slice into the Blockette1000
func (b *Blockette1000) Unmarshal(data []byte) error {
*b = DecodeBlockette1000(data)
return nil
}
// Marshal converts a Blockette1000 into a byte slice.
func (b Blockette1000) Marshal() ([]byte, error) {
return EncodeBlockette1000(b), nil
}
// Blockette1001 is a "Data Extension Blockette" (excluding header).
type Blockette1001 struct {
TimingQuality uint8
MicroSec int8 // Increased accuracy for start time
Reserved uint8
FrameCount uint8
}
// DecodeBlockette1001 returns a Blockette1001 from a byte slice.
func DecodeBlockette1001(data []byte) Blockette1001 {
var b [Blockette1001Size]byte
copy(b[:], data)
return Blockette1001{
TimingQuality: b[0],
MicroSec: int8(b[1]),
Reserved: b[2],
FrameCount: b[3],
}
}
// EncodeBlockette1001 converts a Blockette1001 into a byte slice.
func EncodeBlockette1001(blk Blockette1001) []byte {
var b [Blockette1001Size]byte
b[0] = blk.TimingQuality
b[1] = uint8(blk.MicroSec)
b[2] = blk.Reserved
b[3] = blk.FrameCount
d := make([]byte, Blockette1001Size)
copy(d[0:Blockette1001Size], b[:])
return d
}
// Unmarshal converts a byte slice into the Blockette1001
func (b *Blockette1001) Unmarshal(data []byte) error {
*b = DecodeBlockette1001(data)
return nil
}
// Marshal converts a Blockette1001 into a byte slice.
func (b Blockette1001) Marshal() ([]byte, error) {
return EncodeBlockette1001(b), nil
} | vendor/github.com/GeoNet/kit/seis/ms/blockette.go |
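The encode/decode pairs above are plain big-endian (de)serialisation. The sketch below round-trips the 4-byte header layout with a local `header` type (not the package's `BlocketteHeader`, but the same two big-endian uint16 fields):

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// header mirrors BlocketteHeader: two big-endian uint16 fields in 4 bytes.
type header struct {
	Type uint16
	Next uint16
}

// encode writes both fields big-endian, as EncodeBlocketteHeader does.
func encode(h header) []byte {
	b := make([]byte, 4)
	binary.BigEndian.PutUint16(b[0:2], h.Type)
	binary.BigEndian.PutUint16(b[2:4], h.Next)
	return b
}

// decode reads the fields back out of the wire form.
func decode(b []byte) header {
	return header{
		Type: binary.BigEndian.Uint16(b[0:2]),
		Next: binary.BigEndian.Uint16(b[2:4]),
	}
}

func main() {
	in := header{Type: 1000, Next: 56}
	out := decode(encode(in))
	fmt.Println(in == out) // the round trip preserves both fields
	fmt.Printf("% x\n", encode(in))
}
```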
package art
import (
"fmt"
"math"
)
func (a *Art) Add(operand ...string) error {
if len(operand) != 2 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
y, err := a.Get(operand[1])
if err != nil {
return err
}
if err := a.Set(operand[0], x+y); err != nil {
return err
}
cf := math.MaxUint64-x < y
a.Register.SetFlag(CF, cf)
return nil
}
func (a *Art) Adc(operand ...string) error {
if len(operand) != 2 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
y, err := a.Get(operand[1])
if err != nil {
return err
}
var z uint64
if a.Register.Flag(CF) {
z = 1
}
if err := a.Set(operand[0], x+y+z); err != nil {
return err
}
cf := math.MaxUint64-x < y || math.MaxUint64-x-y < z
a.Register.SetFlag(CF, cf)
return nil
}
func (a *Art) Sub(operand ...string) error {
if len(operand) != 2 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
y, err := a.Get(operand[1])
if err != nil {
return err
}
if err := a.Set(operand[0], x-y); err != nil {
return err
}
a.Register.SetFlag(CF, x < y)
return nil
}
func (a *Art) Sbb(operand ...string) error {
if len(operand) != 2 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
y, err := a.Get(operand[1])
if err != nil {
return err
}
var z uint64
if a.Register.Flag(CF) {
z = 1
}
if err := a.Set(operand[0], x-y-z); err != nil {
return err
}
cf := x < y || (z == 1 && x == y)
a.Register.SetFlag(CF, cf)
return nil
}
func (a *Art) Inc(operand ...string) error {
if len(operand) != 1 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
if err := a.Set(operand[0], x+1); err != nil {
return err
}
return nil
}
func (a *Art) Dec(operand ...string) error {
if len(operand) != 1 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
if err := a.Set(operand[0], x-1); err != nil {
return err
}
return nil
}
func (a *Art) Neg(operand ...string) error {
if len(operand) != 1 {
return fmt.Errorf("invalid operands length: %v", operand)
}
x, err := a.Get(operand[0])
if err != nil {
return err
}
if err := a.Set(operand[0], 1+^x); err != nil {
return err
}
a.Register.SetFlag(CF, x != 0)
return nil
} | arithmetic.go |
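The carry-flag logic in `Add` relies on the identity that `x + y` overflows a uint64 exactly when `math.MaxUint64 - x < y`, evaluated before the wrapping addition. A standalone sketch of that test (the `addCarry` helper is illustrative, not part of the package):

```go
package main

import (
	"fmt"
	"math"
)

// addCarry returns x+y (wrapping) and whether the addition carried,
// using the same overflow test as Art.Add above.
func addCarry(x, y uint64) (sum uint64, carry bool) {
	return x + y, math.MaxUint64-x < y
}

func main() {
	s, c := addCarry(math.MaxUint64, 1)
	fmt.Println(s, c) // 0 true: wrapped all the way around
	s, c = addCarry(2, 3)
	fmt.Println(s, c) // 5 false
}
```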
package html
import (
"encoding/json"
"reflect"
"strings"
"unicode"
"github.com/murlokswarm/app"
"github.com/pkg/errors"
)
type mapper struct {
completePipeline []string
index int
jsonValue string
}
func newMapper(pipeline []string, jsonValue string) *mapper {
return &mapper{
completePipeline: pipeline,
jsonValue: jsonValue,
}
}
func (m *mapper) pipeline() []string {
return m.completePipeline[m.index:]
}
func (m *mapper) target() string {
return m.completePipeline[m.index]
}
func (m *mapper) fullTarget() string {
return strings.Join(m.completePipeline[:m.index+1], ".")
}
func (m *mapper) MapTo(compo app.Component) (function func(), err error) {
return m.mapTo(reflect.ValueOf(compo))
}
func (m *mapper) mapTo(value reflect.Value) (function func(), err error) {
switch value.Kind() {
case reflect.Ptr:
return m.mapToPointer(value)
case reflect.Struct:
return m.mapToStruct(value)
case reflect.Map:
return m.mapToMap(value)
case reflect.Slice, reflect.Array:
return m.mapToSlice(value)
case reflect.Func:
return m.mapToFunction(value)
default:
return m.mapToValue(value)
}
}
func (m *mapper) mapToPointer(ptr reflect.Value) (function func(), err error) {
if len(m.pipeline()) == 0 {
return m.mapToValue(ptr)
}
if !isExported(m.target()) {
err = errors.Errorf("%s is mapped to an unexported method", m.fullTarget())
return
}
method := ptr.MethodByName(m.target())
if method.IsValid() {
m.index++
return m.mapTo(method)
}
return m.mapTo(ptr.Elem())
}
func (m *mapper) mapToStruct(structure reflect.Value) (function func(), err error) {
if len(m.pipeline()) == 0 {
return m.mapToValue(structure)
}
if !isExported(m.target()) {
err = errors.Errorf(
"%s is mapped to an unexported field or method",
m.fullTarget(),
)
return
}
if method := structure.MethodByName(m.target()); method.IsValid() {
m.index++
return m.mapTo(method)
}
field := structure.FieldByName(m.target())
if !field.IsValid() {
err = errors.Errorf(
"%s is mapped to a nonexistent field or method",
m.fullTarget(),
)
return
}
m.index++
return m.mapTo(field)
}
func (m *mapper) mapToMap(mapv reflect.Value) (function func(), err error) {
if len(m.pipeline()) == 0 {
m.mapToValue(mapv)
return
}
if isExported(m.target()) {
if method := mapv.MethodByName(m.target()); method.IsValid() {
m.index++
return m.mapTo(method)
}
}
err = errors.Errorf(
"%s is mapped to a map value",
m.fullTarget(),
)
return
}
func (m *mapper) mapToSlice(slice reflect.Value) (function func(), err error) {
if len(m.pipeline()) == 0 {
return m.mapToValue(slice)
}
if isExported(m.target()) {
if method := slice.MethodByName(m.target()); method.IsValid() {
m.index++
return m.mapTo(method)
}
}
err = errors.Errorf(
"%s is mapped to a slice value",
m.fullTarget(),
)
return
}
func (m *mapper) mapToFunction(fn reflect.Value) (function func(), err error) {
if len(m.pipeline()) != 0 {
err = errors.Errorf(
"%s is mapped to an unsupported method",
m.fullTarget(),
)
return
}
typ := fn.Type()
if typ.NumIn() > 1 {
err = errors.Errorf(
"%s is mapped to a func that has more than 1 arg",
m.completePipeline,
)
return
}
if typ.NumIn() == 0 {
function = func() {
fn.Call(nil)
}
return
}
arg := reflect.New(typ.In(0))
if err = json.Unmarshal([]byte(m.jsonValue), arg.Interface()); err != nil {
err = errors.Wrapf(err, "%s:", m.completePipeline)
return
}
function = func() {
fn.Call([]reflect.Value{arg.Elem()})
}
return
}
func (m *mapper) mapToValue(value reflect.Value) (function func(), err error) {
if len(m.pipeline()) == 0 {
newValue := reflect.New(value.Type())
if err = json.Unmarshal([]byte(m.jsonValue), newValue.Interface()); err != nil {
err = errors.Wrapf(err, "%s:", m.completePipeline)
return
}
value.Set(newValue.Elem())
return
}
if !isExported(m.target()) {
err = errors.Errorf(
"%s is mapped to an unsupported method",
m.fullTarget(),
)
return
}
method := value.MethodByName(m.target())
if !method.IsValid() {
err = errors.Errorf(
"%s is mapped to an undefined method",
m.fullTarget(),
)
return
}
m.index++
return m.mapTo(method)
}
func isExported(fieldOrMethod string) bool {
return !unicode.IsLower(rune(fieldOrMethod[0]))
} | html/map.go |
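The heart of the mapper is reflect-based navigation: prefer a method with the target's name, fall back to a field, and at the end of the pipeline unmarshal the JSON payload into a freshly allocated value, as `mapToValue` does. A much-reduced sketch of the final field-setting step (the `Profile` type and `setField` helper are illustrative, not part of the package):

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
)

type Profile struct {
	Name string
}

// setField assigns the JSON value to the named exported field of the
// struct pointed to by target, roughly as mapper.mapToValue does.
func setField(target interface{}, field, jsonValue string) error {
	v := reflect.ValueOf(target).Elem().FieldByName(field)
	if !v.IsValid() {
		return fmt.Errorf("%s: no such field", field)
	}
	// Allocate a fresh value of the field's type and unmarshal into it.
	fresh := reflect.New(v.Type())
	if err := json.Unmarshal([]byte(jsonValue), fresh.Interface()); err != nil {
		return err
	}
	v.Set(fresh.Elem())
	return nil
}

func main() {
	p := &Profile{}
	if err := setField(p, "Name", `"gopher"`); err != nil {
		panic(err)
	}
	fmt.Println(p.Name) // gopher
}
```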
package tools
import (
"math"
"math/rand"
"strconv"
"sync"
"time"
)
// A struct that represents a simulator
// seed for a given type of sensor value
type SimulatorSeed struct {
// A float64 that represents the initial value of the seed
Initial float64
// A float64 that represents the value by which the seed moves the cursor
Adjust float64
// A float64 that represents the width of the seed while generating values
Width float64
// A float64 that represents the max deviation from the initial value before the seed curve ends
Peak float64
// A float64 that represents the current value of the seed
Cursor float64
// A string that represents the kind of seed curve to follow
Curve string
}
// A constructor function that generates and returns a SimulatorSeed object.
// Requires the initial, peak, adjust, width and curve name.
// The cursor is set to the value of the initial passed in.
func NewSimulatorSeed(initial, peak, adjust, width float64, curve string) *SimulatorSeed {
// Create a new SimulatorSeed
seed := SimulatorSeed{}
// Assign the values
seed.Initial = initial
seed.Peak = peak
seed.Adjust = adjust
seed.Width = width
seed.Cursor = initial
seed.Curve = curve
// Return the seed
return &seed
}
// A method of SimulatorSeed that serves as the seed curve while rising.
// Increments the Cursor by Adjust until it exceeds the peak.
// However if the reverse bool is set, the opposite occurs.
func (seed *SimulatorSeed) rising(reverse bool) {
for {
// Check the cursor state
if seed.Cursor > seed.Peak && !reverse {
break
} else if seed.Cursor < seed.Peak && reverse {
break
}
// Increment the cursor
seed.Cursor = seed.Cursor + seed.Adjust
time.Sleep(time.Second * 5)
}
}
// A method of SimulatorSeed that serves as the seed curve while falling.
// Decrements the Cursor by Adjust until it goes below the initial.
// However if the reverse bool is set, the opposite occurs.
func (seed *SimulatorSeed) falling(reverse bool) {
for {
// Check the cursor state
if seed.Cursor < seed.Initial && !reverse {
break
} else if seed.Cursor > seed.Initial && reverse {
break
}
// Decrement the cursor
seed.Cursor = seed.Cursor - seed.Adjust
time.Sleep(time.Second * 5)
}
}
// A method of SimulatorSeed that serves as a flip curve.
// Simply flips the Cursor from 0 to 1 after Adjust*5 seconds
// and then flips it back to 0 after the same amount of time.
func (seed *SimulatorSeed) flip() {
time.Sleep(time.Second * time.Duration(seed.Adjust*5))
seed.Cursor = 1
time.Sleep(time.Second * time.Duration(seed.Adjust*5))
seed.Cursor = 0
}
// A method of SimulatorSeed that starts the curve of the seed.
func (seed *SimulatorSeed) StartCurve(wg *sync.WaitGroup) {
// Check the type of curve to start
switch seed.Curve {
case "bell":
seed.rising(false)
time.Sleep(time.Second * 10)
seed.falling(false)
case "revbell":
seed.rising(true)
time.Sleep(time.Second * 10)
seed.falling(true)
case "flip":
seed.flip()
}
// Decrement the waitgroup.
wg.Done()
}
// A struct that represents a Fire Event Simulator
type FireEventSimulator struct {
// A bool tht represents if the simulator is on.
SimulationOn bool
// A pool of SimulatorSeeds for each sensor type.
SimulationSeeds map[string]*SimulatorSeed
}
// A constructor function that generates and returns a FireEventSimulator
// Creates the default seeds for each sensor type.
func NewFireEventSimulator() *FireEventSimulator {
// Create a FireEventSimulator
simulator := FireEventSimulator{}
// Set the simulator to off
simulator.SimulationOn = false
// Create an empty map and assign it
simulator.SimulationSeeds = make(map[string]*SimulatorSeed)
// Create the SimulatorSeeds for each sensor type.
simulator.SimulationSeeds["GAS"] = NewSimulatorSeed(450.0, 900.0, 25.0, 75.0, "bell")
simulator.SimulationSeeds["HUM"] = NewSimulatorSeed(50.0, 22.5, -1.5, 3.0, "revbell")
simulator.SimulationSeeds["TEM"] = NewSimulatorSeed(27.0, 55, 1.5, 3.0, "bell")
simulator.SimulationSeeds["FLM"] = NewSimulatorSeed(0, 1, 12.0, 0, "flip")
return &simulator
}
// A method of FireEventSimulator that starts a Fire Event.
// Requires a LogQueue to log the start and end of the Fire Event.
// Uses a wait group to monitor the completion of each individual seed's event curve.
func (simulator *FireEventSimulator) StartFireEvent(logqueue chan Log) {
// Create a wait group
wg := sync.WaitGroup{}
// Log the start of the fire event
logqueue <- NewOrchSchedlog("(simulator) fire event has started")
// Iterate over the Seed pool
for _, seed := range simulator.SimulationSeeds {
// Increment the wait group
wg.Add(1)
// start the seed curve
go seed.StartCurve(&wg)
}
// Wait for wait group to complete
wg.Wait()
// Log the end of the fire event
logqueue <- NewOrchSchedlog("(simulator) fire event has ended")
}
// A method of FireEventSimulator that returns a
// simulated value for a given sensor type.
func (simulator *FireEventSimulator) GetSimulatedValue(sensortype string) float64 {
// Create a float
var simvalue float64
// Check the sensor type and retrieve the appropriate seed.
// Use that seed and its values to generate a random simulated value.
switch sensortype {
case "HUM":
seed := simulator.SimulationSeeds["HUM"]
simvalue = generaterandomvalue(seed.Cursor, seed.Cursor-seed.Width, 2)
case "TEM":
seed := simulator.SimulationSeeds["TEM"]
simvalue = generaterandomvalue(seed.Cursor, seed.Cursor+seed.Width, 2)
case "GAS":
seed := simulator.SimulationSeeds["GAS"]
simvalue = generaterandomvalue(seed.Cursor, seed.Cursor+seed.Width, 0)
case "FLM":
seed := simulator.SimulationSeeds["FLM"]
simvalue = seed.Cursor
}
// Return the simulated value
return simvalue
}
// A function that generates a sensor value given the value as an unparsed string and the sensor type.
// The mesh orchestrator is used to retrieve data from the simulator when it is on.
// The enableCorrection flag allows the sensor values to be overridden when the sensor is defunct or for debugging.
func GenerateSensorValue(sensorvalue string, sensortype string, meshorchestrator *MeshOrchestrator, enableCorrection bool) float64 {
// Create a float
var generatedvalue float64
// Check if simulation is on or if corrections have been enabled
if meshorchestrator.Simulator.SimulationOn || enableCorrection {
// generate a simulated value, either for a fire event or for baseline seed.
generatedvalue = meshorchestrator.Simulator.GetSimulatedValue(sensortype)
} else {
// parse the provided string sensor value
parsedval, _ := strconv.ParseFloat(sensorvalue, 64)
generatedvalue = parsedval
}
// Return the generated value
return generatedvalue
}
// A function that generates a random float64 number between two given numbers with the precision set.
// The values are sorted for magnitude and the value is generated from the random seed.
// The precision must be a value between 0 and 8 and sets the number of decimal point in the float.
func generaterandomvalue(val1, val2 float64, precision int8) float64 {
// Seed the random generator
rand.Seed(time.Now().UnixNano())
// Check the precision limit
if precision > 8 || precision < 0 {
return 0
}
// Declare the min and max
var min, max float64
// Determine the min and max
if val1 > val2 {
min = val2
max = val1
} else {
min = val1
max = val2
}
// Calculate a truncation factor for the given precision
trunct := math.Pow(10, float64(precision))
// Generate the random value and round it to the precision.
randomvalue := min + rand.Float64()*(max-min)
randomvalue = math.Round(randomvalue*trunct) / trunct
// Return the generated value.
return randomvalue
} | tools/simtools.go |
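`generaterandomvalue` fixes the number of decimal places by scaling with a power of ten, rounding, and scaling back via its `trunct` factor. The rounding step in isolation (hypothetical `roundTo` helper using the same technique):

```go
package main

import (
	"fmt"
	"math"
)

// roundTo rounds v to the given number of decimal places, as
// generaterandomvalue does with its trunct factor.
func roundTo(v float64, places int) float64 {
	factor := math.Pow(10, float64(places))
	return math.Round(v*factor) / factor
}

func main() {
	fmt.Println(roundTo(3.14159, 2)) // 3.14
	fmt.Println(roundTo(2.675, 0))   // 3
}
```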
package spatial
// Polygon is a data type for storing simple polygons.
type Polygon []Line
func (p Polygon) Project(proj ConvertFunc) {
for ri := range p {
for i := range p[ri] {
p[ri][i] = proj(p[ri][i])
}
}
}
func (p Polygon) Copy() Projectable {
var np Polygon
for _, ring := range p {
np = append(np, ring.Copy().(Line))
}
return np
}
func (p Polygon) String() string {
return p.string()
}
func (p Polygon) ClipToBBox(bbox BBox) []Geom {
// Speed-ups for common cases to eliminate the need for calling geos.
if len(p) == 1 && len(p[0].Intersections(bbox.Segments())) == 0 {
if bbox.FullyIn(p[0].BBox()) {
return []Geom{MustNewGeom(Polygon{Line{
bbox.SW, {bbox.NE.X, bbox.SW.Y}, bbox.NE, {bbox.SW.X, bbox.NE.Y},
}})}
}
if p[0].BBox().FullyIn(bbox) {
return []Geom{MustNewGeom(p)}
}
}
return p.clipToBBox(bbox)
}
func (p Polygon) Rewind() {
for _, ring := range p {
ring.Reverse()
}
}
func (p Polygon) FixWinding() {
for n, ring := range p {
if n == 0 {
// First ring must be outer and therefore clockwise.
if !ring.Clockwise() {
ring.Reverse()
}
continue
}
// Count how many of the other rings the point is located in.
// If the number is odd, it's a hole.
var inrings int
for ninner, cring := range p {
if n == ninner {
continue
}
if ring[0].InPolygon(Polygon{cring}) {
inrings++
}
}
if (inrings%2 == 0 && !ring.Clockwise()) || (inrings%2 == 1 && ring.Clockwise()) {
ring.Reverse()
}
}
}
func (p Polygon) ValidTopology() bool {
return len(p.topologyErrors()) == 0
}
func (p Polygon) MustJSON() []byte {
g := MustNewGeom(p)
j, err := g.MarshalJSON()
if err != nil {
panic(err)
}
return j
}
type segErr struct {
Ring int
Seg int
}
func (p Polygon) topologyErrors() (errSegments []segErr) {
for nRing, ring := range p {
for nSeg, seg := range ring.SegmentsWithClosing() {
for nSegCmp, segCmp := range ring.SegmentsWithClosing() {
if nSeg == nSegCmp {
continue
}
ipt, has := seg.Intersection(segCmp)
if has && (ipt != seg[0] && ipt != seg[1]) {
errSegments = append(errSegments, segErr{Ring: nRing, Seg: nSeg})
}
}
}
}
return
} | lib/spatial/polygon.go |
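`FixWinding` rests on two primitives: a clockwise test per ring and an even-odd count of enclosing rings to detect holes. The orientation test is classically done with the shoelace formula; a standalone sketch follows (with y up, a positive signed sum means clockwise here — the package's own `Line.Clockwise` is not shown in this file and may use a different convention):

```go
package main

import "fmt"

type point struct{ x, y float64 }

// clockwise applies the shoelace formula: with y up, a positive sum of
// (x2-x1)*(y2+y1) over the closed ring means the ring winds clockwise.
func clockwise(ring []point) bool {
	var sum float64
	for i, p := range ring {
		q := ring[(i+1)%len(ring)]
		sum += (q.x - p.x) * (q.y + p.y)
	}
	return sum > 0
}

func main() {
	ring := []point{{0, 0}, {0, 1}, {1, 1}, {1, 0}}
	fmt.Println(clockwise(ring)) // true

	// Reversing the ring flips the winding, as Line.Reverse would.
	rev := []point{{1, 0}, {1, 1}, {0, 1}, {0, 0}}
	fmt.Println(clockwise(rev)) // false
}
```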
package cmd
import "tix/logger"
const helpMessage =
`Tix - A command line utility for generating jira, etc. tickets from a markdown document.
Usage: tix [OPTIONS] <markdown file>
-d prints out ticket information instead of creating tickets (shorthand)
-dryrun
prints out ticket information instead of creating tickets
-h prints help for tix (shorthand)
-help
prints help for tix
-q suppresses all log output except errors (shorthand)
-quiet
suppresses all log output except errors
-v enables verbose output (shorthand)
-verbose
enables verbose output
-version
prints tix version
# Settings
Tix expects to find a file called tix.yml in the same directory as the markdown document provided. This settings
file contains information needed by tix in order to communicate with ticketing systems.
The following is the expected format of the settings file:
// yml
github:
no_projects: true // Indicates if tix should use projects or treat root tickets as issues. Defaults to false.
owner: owner // The owner of the github repo (ex - ncipollo)
repo: repo // The github repo (ex - tix)
tickets:
default:
default: default // Fields to be added to both projects and issues
project:
project: project // Fields to be added to projects
issue:
labels: [label1, label2] // Fields to be added to issues
jira:
no_epics: false // Indicates if tix should use epics or treat root tickets as stories / issues. Defaults to false.
url: https://url.to.your.jira.instance.com
tickets:
// All fields should be lower case. Spaces in field names should be included (ex - epic name)
default:
field: value // Fields to be added to any kind of jira issue
epic:
field: value // Fields to be added to epics
issue:
field: value // Fields to be added to issues
task:
field: value // Fields to be added to tasks (sub-issues of issues)
variables:
key: value
envKey: $ENVIRONMENT_VARIABLE
// tix will parse the markdown document and replace each occurrence of "key" with its value (or environment variable
// when a '$' precedes the value)
// end yml
# Markdown
Tix interprets the content of heading elements as a tickets. The indent level of the heading element will indicate the
level of nesting for the ticket. For example:
// Markdown
# Root Ticket
This will be a root level ticket.
## Child Ticket
This ticket will be a child of root ticket.
### Grandchild Ticket
This ticket will be a child of child ticket.
// End Markdown
The parent-child relationship for tickets will translate into specific relationships for ticketing systems. In Jira,
for example, the root ticket could be an epic, the next level down a story, then the final level would be a task.
## Markdown Elements
Tix parses each markdown element into an abstract representation of that element. Tix will then utilize these
abstractions to generate representations which are specific to a ticketing system. For example:
- List type elements will utilize the wiki-media style indentation markers for jira (--)
- Code blocks in jira will use the {code} markers.
## Special Blocks
Tix supports special code blocks which may be used to add fields for a ticket. For example:
%v
# Jira
Tix expects your jira account information to be stored in your environment. Specifically, it will look for the following
variables:
- JIRA_USERNAME: This should be set to your Jira user name (typically an email address).
- JIRA_API_TOKEN: This should be set to your Jira api key. This may be generated by following the instructions found
here: [Jira Api tokens](https://confluence.atlassian.com/cloud/api-tokens-938839638.html)
Jira tickets have the following relationship with heading indent levels:
- #: Epic, if epics are allowed via settings. Story otherwise.
- ##: Story / issue
- ###: Task
# Github
Tix expects your github account information to be stored in your environment. Specifically, it will look for the following variable:
- GITHUB_API_TOKEN: This should be set to your github access token. Access token's may be generated here: [Github API Tokens](https://github.com/settings/tokens)
Github tickets have the following relationship with heading indent levels:
- #: Project, if projects are allowed via settings. Issue otherwise.
- ##: Issue
## Github Settings
The following settings effect issue type tickets:
- assignee: (string) Sets the assignee of the ticket. Should match a github account name (ex - ncipollo).
- assignees: (array) Sets multiple assignees for the ticket. Should match a github account name (ex - [ncipollo, etc]).
- column: (string) Specifies the column the issue should be placed in when it's within a project (ex - To do). Defaults to To do.
- labels: (array) Specifies the labels to apply to the issue (ex - [label1, label2]).
- milestone: (string) Specifies the milestone to add this issue to (ex - v1.0.0). Note: This will create a milestone if it doesn't yet exist. This can cause tix to fail if the milestone already exists but is closed.
- project: (number) The project the issue should be added to (ex - 13). The project should exist and be in the open state. This only takes effect if no_projects is true.
The following settings effect project type tickets:
- columns: (array) Specifies the columns which should be created with the project (ex - [todo, working on it, done]). Defaults to [To do, In progress, Done].
For up to date documentation check out https://github.com/ncipollo/tix/blob/main/README.md
`
const specialBlocks =
"```tix\n" +
"// Adds fields to the ticket, regardless of ticket system\n" +
"field: value\n" +
"```\n" +
"```github\n" +
"// Adds field to the ticket only if the ticketing system is github.\n" +
"field: value\n" +
"```\n" +
"```jira\n" +
"// Adds field to the ticket only if the ticketing system is jira.\n" +
"field: value\n" +
"```"
type HelpCommand struct {
}
func NewHelpCommand() *HelpCommand {
return &HelpCommand{}
}
func (h HelpCommand) Run() error {
logger.Output(helpMessage, specialBlocks)
return nil
} | cmd/help.go |
package stringslice
// MatchFunc is a function that matches elements in a string slice.
type MatchFunc func(string) bool
// Filter returns a slice containing the elements of slice for which match returns true.
func Filter(slice []string, match MatchFunc) []string {
out := make([]string, 0, len(slice))
for _, s := range slice {
if match(s) {
out = append(out, s)
}
}
return out
}
// Unique returns a filter function that returns true for an element if it is
// the first time the function has been called with that element.
func Unique(size int) MatchFunc {
seen := make(map[string]struct{}, size)
return func(s string) bool {
if _, ok := seen[s]; ok {
return false
}
seen[s] = struct{}{}
return true
}
}
// MatchAny returns true if match returns true for any element of slice.
func MatchAny(slice []string, match MatchFunc) bool {
for _, s := range slice {
if match(s) {
return true
}
}
return false
}
// MatchAll returns true if match returns true for all elements of slice.
func MatchAll(slice []string, match MatchFunc) bool {
for _, s := range slice {
if !match(s) {
return false
}
}
return true
}
// Equal returns a filter function that returns true if the element equals the given argument.
func Equal(t string) MatchFunc {
return func(s string) bool { return s == t }
}
// MapFunc is a function that maps a string to a string.
type MapFunc func(string) string
// Map returns a slice containing the result of mapping for each element in the slice.
func Map(slice []string, mapping ...MapFunc) []string {
out := make([]string, len(slice))
for i, s := range slice {
for _, mapping := range mapping {
s = mapping(s)
}
out[i] = s
}
return out
}
// AddPrefix returns a map function that adds the given prefix to the element.
func AddPrefix(prefix string) MapFunc {
return func(s string) string { return prefix + s }
}
// AddSuffix returns a map function that adds the given suffix to the element.
func AddSuffix(suffix string) MapFunc {
return func(s string) string { return s + suffix }
} | stringslice/stringslice.go | 0.885322 | 0.450118 | stringslice.go | starcoder |
package bls12381
import (
"fmt"
"github.com/cloudflare/circl/ecc/bls12381/ff"
)
type isogG1Point struct{ x, y, z ff.Fp }
func (p isogG1Point) String() string { return fmt.Sprintf("x: %v\ny: %v\nz: %v", p.x, p.y, p.z) }
// IsOnCurve returns true if p is a valid point on the isogenous curve.
func (p *isogG1Point) IsOnCurve() bool {
var x2, x3, z2, z3, y2 ff.Fp
y2.Sqr(&p.y) // y2 = y^2
y2.Mul(&y2, &p.z) // y2 = y^2*z
z2.Sqr(&p.z) // z2 = z^2
z3.Mul(&z2, &p.z) // z3 = z^3
z3.Mul(&z3, &g1Isog11.b) // z3 = B*z^3
x2.Sqr(&p.x) // x2 = x^2
x3.Mul(&z2, &g1Isog11.a) // x3 = A*z^2
x3.Add(&x3, &x2) // x3 = x^2 + A*z^2
x3.Mul(&x3, &p.x) // x3 = x^3 + A*x*z^2
x3.Add(&x3, &z3) // x3 = x^3 + A*x*z^2 + Bz^3
return y2.IsEqual(&x3) == 1 && *p != isogG1Point{}
}
// sswu implements the Simplified Shallue-van de Woestijne-Ulas method for
// mapping a field element to a point on the isogenous curve.
func (p *isogG1Point) sswu(u *ff.Fp) {
// Method in Appendix-G.2.1 of
// https://datatracker.ietf.org/doc/html/draft-irtf-cfrg-hash-to-curve-11
tv1, tv2, tv3, tv4 := &ff.Fp{}, &ff.Fp{}, &ff.Fp{}, &ff.Fp{}
xd, x1n, gxd, gx1 := &ff.Fp{}, &ff.Fp{}, &ff.Fp{}, &ff.Fp{}
y, y1, x2n, y2, xn := &ff.Fp{}, &ff.Fp{}, &ff.Fp{}, &ff.Fp{}, &ff.Fp{}
tv1.Sqr(u) // 1. tv1 = u^2
tv3.Mul(&g1sswu.Z, tv1) // 2. tv3 = Z * tv1
tv2.Sqr(tv3) // 3. tv2 = tv3^2
xd.Add(tv2, tv3) // 4. xd = tv2 + tv3
tv4.SetOne() // 5. tv4 = 1
x1n.Add(xd, tv4) // x1n = xd + tv4
x1n.Mul(x1n, &g1Isog11.b) // 6. x1n = x1n * B
xd.Mul(&g1Isog11.a, xd) // 7. xd = A * xd
xd.Neg() // xd = -xd
e1 := xd.IsZero() // 8. e1 = xd == 0
tv4.Mul(&g1sswu.Z, &g1Isog11.a) // 9. tv4 = Z * A
xd.CMov(xd, tv4, e1) // xd = CMOV(xd, tv4, e1)
tv2.Sqr(xd) // 10. tv2 = xd^2
gxd.Mul(tv2, xd) // 11. gxd = tv2 * xd
tv2.Mul(&g1Isog11.a, tv2) // 12. tv2 = A * tv2
gx1.Sqr(x1n) // 13. gx1 = x1n^2
gx1.Add(gx1, tv2) // 14. gx1 = gx1 + tv2
gx1.Mul(gx1, x1n) // 15. gx1 = gx1 * x1n
tv2.Mul(&g1Isog11.b, gxd) // 16. tv2 = B * gxd
gx1.Add(gx1, tv2) // 17. gx1 = gx1 + tv2
tv4.Sqr(gxd) // 18. tv4 = gxd^2
tv2.Mul(gx1, gxd) // 19. tv2 = gx1 * gxd
tv4.Mul(tv4, tv2) // 20. tv4 = tv4 * tv2
y1.ExpVarTime(tv4, g1sswu.c1[:]) // 21. y1 = tv4^c1
y1.Mul(y1, tv2) // 22. y1 = y1 * tv2
x2n.Mul(tv3, x1n) // 23. x2n = tv3 * x1n
y2.Mul(y1, &g1sswu.c2) // 24. y2 = y1 * c2
y2.Mul(y2, tv1) // 25. y2 = y2 * tv1
y2.Mul(y2, u) // 26. y2 = y2 * u
tv2.Sqr(y1) // 27. tv2 = y1^2
tv2.Mul(tv2, gxd) // 28. tv2 = tv2 * gxd
e2 := tv2.IsEqual(gx1) // 29. e2 = tv2 == gx1
xn.CMov(x2n, x1n, e2) // 30. xn = CMOV(x2n, x1n, e2)
y.CMov(y2, y1, e2) // 31. y = CMOV(y2, y1, e2)
e3 := u.Sgn0() ^ y.Sgn0() // 32. e3 = sgn0(u) == sgn0(y)
*tv1 = *y // 33. tv1 = y
tv1.Neg() // tv1 = -y
y.CMov(tv1, y, ^e3) // y = CMOV(tv1, y, e3)
p.x = *xn // 34. return
p.y.Mul(y, xd) // (x,y) = (xn/xd, y/1)
p.z = *xd // (X,Y,Z) = (xn, y*xd, xd)
}
// evalIsogG1 calculates g = g1Isog11(p), where g1Isog11 is an isogeny of
// degree 11 to the curve used in G1.
func (g *G1) evalIsogG1(p *isogG1Point) {
x, y, z := &p.x, &p.y, &p.z
t, zi := &ff.Fp{}, &ff.Fp{}
xNum, xDen, yNum, yDen := &ff.Fp{}, &ff.Fp{}, &ff.Fp{}, &ff.Fp{}
ixn := len(g1Isog11.xNum) - 1
ixd := len(g1Isog11.xDen) - 1
iyn := len(g1Isog11.yNum) - 1
iyd := len(g1Isog11.yDen) - 1
*xNum = g1Isog11.xNum[ixn]
*xDen = g1Isog11.xDen[ixd]
*yNum = g1Isog11.yNum[iyn]
*yDen = g1Isog11.yDen[iyd]
*zi = *z
for (ixn | ixd | iyn | iyd) != 0 {
if ixn > 0 {
ixn--
t.Mul(zi, &g1Isog11.xNum[ixn])
xNum.Mul(xNum, x)
xNum.Add(xNum, t)
}
if ixd > 0 {
ixd--
t.Mul(zi, &g1Isog11.xDen[ixd])
xDen.Mul(xDen, x)
xDen.Add(xDen, t)
}
if iyn > 0 {
iyn--
t.Mul(zi, &g1Isog11.yNum[iyn])
yNum.Mul(yNum, x)
yNum.Add(yNum, t)
}
if iyd > 0 {
iyd--
t.Mul(zi, &g1Isog11.yDen[iyd])
yDen.Mul(yDen, x)
yDen.Add(yDen, t)
}
zi.Mul(zi, z)
}
g.x.Mul(xNum, yDen)
g.y.Mul(yNum, xDen)
g.y.Mul(&g.y, y)
g.z.Mul(xDen, yDen)
g.z.Mul(&g.z, z)
} | ecc/bls12381/g1Isog.go | 0.696268 | 0.47244 | g1Isog.go | starcoder |
package easytime
import (
"time"
)
var (
DateTimeFormat = "2006-01-02 15:04:05"
DateFormat = "2006-01-02"
TimeFormat = "15:04:05"
ShortDateTimeFormat = "20060102150405"
ShortDateFormat = "20060102"
ShortTimeFormat = "150405"
)
// NowString gets the current time as a formatted string.
func NowString() string {
return FormatToString(time.Now(), DateTimeFormat)
}
// IsSameDayWithOffset reports whether the two given times fall on the same day.
// When offset is 0, days are natural calendar days; otherwise the day boundary
// is shifted by the given number of hours.
func IsSameDayWithOffset(time1, time2 time.Time, offset int32) bool {
time1 = time1.Add(time.Duration(-offset) * time.Hour)
time2 = time2.Add(time.Duration(-offset) * time.Hour)
return time1.Year() == time2.Year() && time1.YearDay() == time2.YearDay()
}
// IsSameDay reports whether t1 and t2 fall on the same natural day (offset zero).
func IsSameDay(t1, t2 time.Time) bool {
return IsSameDayWithOffset(t1, t2, 0)
}
// IsSameWeek reports whether the two given times fall in the same ISO week.
func IsSameWeek(time1, time2 time.Time) bool {
	y1, w1 := time1.ISOWeek()
	y2, w2 := time2.ISOWeek()
	return y1 == y2 && w1 == w2
}
// CheckHours reports whether the specified time falls exactly on the hour.
func CheckHours(t1 time.Time) bool {
	_, min, sec := t1.Clock()
	return min == 0 && sec == 0
}
// AddDays adds days to the specified time.
func AddDays(t1 time.Time, days int32) (time.Time, error) {
dd, err := time.ParseDuration("24h")
if err != nil {
return t1, err
}
return t1.Add(time.Duration(days) * dd), nil
}
// AddHours adds hours to the specified time.
func AddHours(t1 time.Time, hours int32) (time.Time, error) {
hh, err := time.ParseDuration("1h")
if err != nil {
return t1, err
}
return t1.Add(time.Duration(hours) * hh), nil
}
// AddMinutes adds minutes to the specified time.
func AddMinutes(t1 time.Time, minutes int32) (time.Time, error) {
mm, err := time.ParseDuration("1m")
if err != nil {
return t1, err
}
return t1.Add(time.Duration(minutes) * mm), nil
}
// AddSeconds adds seconds to the specified time.
func AddSeconds(t1 time.Time, seconds int32) (time.Time, error) {
ss, err := time.ParseDuration("1s")
if err != nil {
return t1, err
}
return t1.Add(time.Duration(seconds) * ss), nil
}
// DiffDays gets the difference in days.
func DiffDays(t1, t2 time.Time) int32 {
return int32(t1.Sub(t2).Hours() / 24)
}
// DiffSeconds gets the difference in seconds.
func DiffSeconds(t1, t2 time.Time) int32 {
return int32(t1.Sub(t2).Seconds())
}
// DiffHours gets the difference in hours.
func DiffHours(t1, t2 time.Time) int32 {
return int32(t1.Sub(t2).Hours())
}
// DiffMinutes gets the difference in minutes.
func DiffMinutes(t1, t2 time.Time) int32 {
return int32(t1.Sub(t2).Minutes())
} | time/time.go | 0.800146 | 0.436922 | time.go | starcoder |
package gortex
import "fmt"
// DeltaRNN cell https://arxiv.org/pdf/1703.08864.pdf
type DeltaRNN struct {
Wr *Matrix
Ur *Matrix
Wx *Matrix
Wh *Matrix
Wo *Matrix
Br *Matrix
Bias *Matrix
A *Matrix
B *Matrix
C *Matrix
}
// MakeDeltaRNN create new cell
func MakeDeltaRNN(x_size, h_size, out_size int) *DeltaRNN {
net := new(DeltaRNN)
net.Wr = RandXavierMat(h_size, x_size)
net.Ur = RandXavierMat(h_size, h_size)
net.Wx = RandXavierMat(h_size, x_size)
net.Wh = RandXavierMat(h_size, h_size)
net.Wo = RandXavierMat(out_size, h_size)
net.Br = RandXavierMat(h_size, 1)
net.Bias = RandXavierMat(h_size, 1)
net.A = RandXavierMat(h_size, 1)
net.B = RandXavierMat(h_size, 1)
net.C = RandXavierMat(h_size, 1)
return net
}
func (rnn *DeltaRNN) GetParameters(namespace string) map[string]*Matrix {
return map[string]*Matrix{
namespace + "_Wr": rnn.Wr,
namespace + "_Ur": rnn.Ur,
namespace + "_Wx": rnn.Wx,
namespace + "_Wh": rnn.Wh,
namespace + "_Wo": rnn.Wo,
namespace + "_A": rnn.A,
namespace + "_B": rnn.B,
namespace + "_C": rnn.C,
namespace + "_Br": rnn.Br,
namespace + "_Bias": rnn.Bias}
}
func (rnn *DeltaRNN) SetParameters(namespace string, parameters map[string]*Matrix) error {
for k, v := range rnn.GetParameters(namespace) {
fmt.Printf("Look for %s parameters\n", k)
if m, ok := parameters[k]; ok {
fmt.Printf("Got %s parameters\n", k)
v.W = m.W
} else {
return fmt.Errorf("Model geometry is not compatible, parameter %s is unknown", k)
}
}
return nil
}
func (rnn *DeltaRNN) Step(g *Graph, x, h_prev *Matrix) (h, y *Matrix) {
// make DeltaRNN computation graph at one time-step
xx := g.Mul(rnn.Wx, x)
hh := g.Mul(rnn.Wh, h_prev)
//r := g.Sigmoid(g.Add(g.Mul(rnn.Wr, x), rnn.Br))
r := g.Sigmoid(g.Add(g.Add(g.Mul(rnn.Wr, x), g.Mul(rnn.Ur, h_prev)), rnn.Br))
// Hadamard product
z1 := g.EMul(rnn.A, xx)
z2 := g.EMul(rnn.B, hh)
z3 := g.EMul(rnn.C, g.EMul(xx, hh))
z := g.Add(g.Add(g.Add(z1, z2), z3), rnn.Bias)
h = g.Add(g.EMul(r, h_prev), g.EMul(g.Sub(r.OnesAs(), r), z))
y = g.Mul(rnn.Wo, g.Tanh(h))
return
} | rnn.delta.rnn.go | 0.681091 | 0.432063 | rnn.delta.rnn.go | starcoder |
package io
import (
"time"
)
type RowsInterface interface {
GetRow(i int) []byte // Position to the i-th record
GetData() []byte // Pointer to the beginning of the data
GetNumRows() int
GetRowLen() int
SetRowLen(int)
}
type RowSeriesInterface interface {
GetMetadataKey() string // The filesystem metadata key for this data
}
type Rows struct {
ColumnInterface
RowsInterface
dataShape []DataShape
data []byte
rowLen int // We allow for a rowLen that might differ from the sum of dataShape for alignment, etc
}
func NewRows(dataShape []DataShape, data []byte) *Rows {
return &Rows{dataShape: dataShape, data: data, rowLen: 0}
}
func (rows *Rows) GetColumn(colname string) (col interface{}) {
var offset int
for _, ds := range rows.GetDataShapes() {
if ds.Name == colname {
switch ds.Type {
case FLOAT32:
return getFloat32Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case FLOAT64:
return getFloat64Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case INT16:
return getInt16Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case INT32:
return getInt32Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case EPOCH, INT64:
return getInt64Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case UINT8:
return getUInt8Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case UINT16:
return getUInt16Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case UINT32:
return getUInt32Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case UINT64:
return getUInt64Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case STRING16:
return getString16Column(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
case BOOL:
fallthrough
case BYTE:
return getByteColumn(offset, int(rows.GetRowLen()), rows.GetNumRows(), rows.GetData())
}
} else {
offset += ds.Type.Size()
}
}
return nil
}
func (rows *Rows) GetDataShapes() []DataShape {
return rows.dataShape
}
func (rows *Rows) Len() int {
return rows.GetNumRows()
}
func (rows *Rows) GetTime() ([]time.Time, error) {
ep := rows.GetColumn("Epoch").([]int64)
ts := make([]time.Time, len(ep))
nsi := rows.GetColumn("Nanoseconds")
if nsi == nil {
for i, secs := range ep {
ts[i] = ToSystemTimezone(time.Unix(secs, 0))
}
} else {
ns := nsi.([]int32)
for i, secs := range ep {
ts[i] = ToSystemTimezone(time.Unix(secs, int64(ns[i])))
}
}
return ts, nil
}
func (rows *Rows) SetRowLen(rowLen int) {
/*
Call this to set a custom row length for this group of rows. This is needed when padding for alignment.
*/
if rowLen < rows.GetRowLen() {
rowLen = rows.GetRowLen() // Make sure the requested rowLen is sane
}
rows.rowLen = rowLen
}
func (rows *Rows) GetRowLen() (len int) {
/*
rowLen can be set directly to allow for alignment, etc, or this will set it based on sum of DataShape
*/
if rows.rowLen == 0 {
for _, shape := range rows.dataShape {
rows.rowLen += shape.Type.Size()
}
}
return rows.rowLen
}
func (rows *Rows) GetNumRows() int {
mylen := rows.GetRowLen()
if mylen == 0 || len(rows.data) == 0 {
return 0
}
return len(rows.data) / mylen
}
func (rows *Rows) GetData() []byte {
return rows.data
}
func (rows *Rows) GetRow(i int) []byte {
rowLen := rows.GetRowLen()
start := i * rowLen
end := start + rowLen
return rows.data[start:end]
}
func (rows *Rows) ToColumnSeries() *ColumnSeries {
cs := NewColumnSeries()
cs.AddColumn("Epoch", rows.GetColumn("Epoch").([]int64))
for _, ds := range rows.GetDataShapes() {
if ds.Name == "Epoch" {
continue
}
cs.AddColumn(ds.Name, rows.GetColumn(ds.Name))
}
return cs
}
type RowSeries struct {
RowSeriesInterface
RowsInterface
ColumnInterface
rows *Rows
metadataKey TimeBucketKey
}
func NewRowSeries(
key TimeBucketKey,
data []byte,
dataShape []DataShape,
rowLen int,
rowType EnumRecordType,
) *RowSeries {
/*
We have to add a column named "Nanoseconds" to the data shapes for a variable record type,
because the read() function for variable types inserts a 32-bit nanoseconds column.
*/
if rowType == VARIABLE {
dataShape = append(dataShape, DataShape{"Nanoseconds", INT32})
}
rows := NewRows(dataShape, data)
rows.SetRowLen(rowLen)
return &RowSeries{
metadataKey: key,
rows: rows,
}
}
func (rs *RowSeries) GetMetadataKey() TimeBucketKey {
return rs.metadataKey
}
func (rs *RowSeries) GetRow(i int) []byte {
return rs.rows.GetRow(i)
}
func (rs *RowSeries) GetData() []byte {
return rs.rows.GetData()
}
func (rs *RowSeries) GetNumRows() int {
return rs.rows.GetNumRows()
}
func (rs *RowSeries) GetRowLen() int {
return rs.rows.GetRowLen()
}
func (rs *RowSeries) SetRowLen(rowLen int) {
rs.rows.SetRowLen(rowLen)
}
func (rs *RowSeries) GetColumn(colname string) (col interface{}) {
return rs.rows.GetColumn(colname)
}
func (rs *RowSeries) GetDataShapes() (ds []DataShape) {
return rs.rows.GetDataShapes()
}
func (rs *RowSeries) Len() int {
return rs.GetNumRows()
}
func (rs *RowSeries) GetTime() ([]time.Time, error) {
return rs.rows.GetTime()
}
func (rs *RowSeries) GetEpoch() (col []int64) {
return getInt64Column(0, int(rs.GetRowLen()), rs.GetNumRows(), rs.GetData())
}
func (rs *RowSeries) ToColumnSeries() (key TimeBucketKey, cs *ColumnSeries) {
key = rs.GetMetadataKey()
cs = NewColumnSeries()
cs.AddColumn("Epoch", rs.GetEpoch())
for _, ds := range rs.rows.GetDataShapes() {
if ds.Name == "Epoch" {
continue
}
cs.AddColumn(ds.Name, rs.GetColumn(ds.Name))
}
return key, cs
} | utils/io/rowseries.go | 0.620966 | 0.510863 | rowseries.go | starcoder |
package radius
func ironRadius(mass float64) float64 {
var radius float64
if mass <= 0.001496 {
radius1 := FractionRadius(mass, 0, 0, 0)
radius2 := PlanetRadiusHelper(mass, 0.001496, 0.09947, 0.002096, 0.1112, 0.002931, 0.1243)
radius = RangeAdjust(mass, radius1, radius2, 0.0, 0.001496)
} else if mass <= 0.002096 {
//radius = quad_trend(-2681.77957240149, 29.1753892167919, 0.0618254642138286, mass);
radius = PlanetRadiusHelper(mass, 0.001496, 0.09947, 0.002096, 0.1112, 0.002931, 0.1243)
} else if mass <= 0.002931 {
//radius = quad_trend(-1688.63294694608, 24.2372061422203, 0.0678174361405534, mass);
radius1 := PlanetRadiusHelper(mass, 0.001496, 0.09947, 0.002096, 0.1112, 0.002931, 0.1243)
radius2 := PlanetRadiusHelper(mass, 0.002096, 0.1112, 0.002931, 0.1243, 0.00409, 0.1387)
radius = RangeAdjust(mass, radius1, radius2, 0.002096, 0.002931)
} else if mass <= 0.00409 {
//radius = quad_trend(-915.413258520028, 18.8629573194402, 0.0768735211649926, mass);
radius1 := PlanetRadiusHelper(mass, 0.002096, 0.1112, 0.002931, 0.1243, 0.00409, 0.1387)
radius2 := PlanetRadiusHelper(mass, 0.002931, 0.1243, 0.00409, 0.1387, 0.005694, 0.1546)
radius = RangeAdjust(mass, radius1, radius2, 0.002931, 0.00409)
} else if mass <= 0.005694 {
//radius = quad_trend(-512.025546524638, 14.9254112879076, 0.086217430425244, mass);
radius1 := PlanetRadiusHelper(mass, 0.002931, 0.1243, 0.00409, 0.1387, 0.005694, 0.1546)
radius2 := PlanetRadiusHelper(mass, 0.00409, 0.1387, 0.005694, 0.1546, 0.007904, 0.1722)
radius = RangeAdjust(mass, radius1, radius2, 0.00409, 0.005694)
} else if mass <= 0.007904 {
//radius = quad_trend(-292.3316061, 11.93892609, 0.0960976238, mass);
radius1 := PlanetRadiusHelper(mass, 0.00409, 0.1387, 0.005694, 0.1546, 0.007904, 0.1722)
radius2 := PlanetRadiusHelper(mass, 0.005694, 0.1546, 0.007904, 0.1722, 0.01094, 0.1915)
radius = RangeAdjust(mass, radius1, radius2, 0.005694, 0.007904)
} else if mass <= 0.01094 {
//radius = quad_trend(-192.8510361, 10.06829345, 0.104668233, mass);
radius1 := PlanetRadiusHelper(mass, 0.005694, 0.1546, 0.007904, 0.1722, 0.01094, 0.1915)
radius2 := PlanetRadiusHelper(mass, 0.007904, 0.1722, 0.01094, 0.1915, 0.01507, 0.2126)
radius = RangeAdjust(mass, radius1, radius2, 0.007904, 0.01094)
} else if mass <= 0.01507 {
//radius = quad_trend(-99.45862138, 7.642892436, 0.1200091513, mass);
radius1 := PlanetRadiusHelper(mass, 0.007904, 0.1722, 0.01094, 0.1915, 0.01507, 0.2126)
radius2 := PlanetRadiusHelper(mass, 0.01094, 0.1915, 0.01507, 0.2126, 0.0207, 0.2356)
radius = RangeAdjust(mass, radius1, radius2, 0.01094, 0.01507)
} else if mass <= 0.0207 {
//radius = quad_trend(-59.86761779, 6.226722237, 0.1323595252, mass);
radius1 := PlanetRadiusHelper(mass, 0.01094, 0.1915, 0.01507, 0.2126, 0.0207, 0.2356)
radius2 := PlanetRadiusHelper(mass, 0.01507, 0.2126, 0.0207, 0.2356, 0.02829, 0.2606)
radius = RangeAdjust(mass, radius1, radius2, 0.01507, 0.0207)
} else if mass <= 0.02829 {
//radius = quad_trend(-36.47985573, 5.080955774, 0.1460554689, mass);
radius1 := PlanetRadiusHelper(mass, 0.01507, 0.2126, 0.0207, 0.2356, 0.02829, 0.2606)
radius2 := PlanetRadiusHelper(mass, 0.0207, 0.2356, 0.02829, 0.2606, 0.0385, 0.2876)
radius = RangeAdjust(mass, radius1, radius2, 0.0207, 0.02829)
} else if mass <= 0.0385 {
//radius = quad_trend(-21.31356831, 4.067999437, 0.1625740583, mass);
radius1 := PlanetRadiusHelper(mass, 0.0207, 0.2356, 0.02829, 0.2606, 0.0385, 0.2876)
radius2 := PlanetRadiusHelper(mass, 0.02829, 0.2606, 0.0385, 0.2876, 0.05212, 0.3167)
radius = RangeAdjust(mass, radius1, radius2, 0.02829, 0.0385)
} else if mass <= 0.05212 {
//radius = quad_trend(-12.81381824, 3.297752085, 0.1796298268, mass);
radius1 := PlanetRadiusHelper(mass, 0.02829, 0.2606, 0.0385, 0.2876, 0.05212, 0.3167)
radius2 := PlanetRadiusHelper(mass, 0.0385, 0.2876, 0.05212, 0.3167, 0.07021, 0.348)
radius = RangeAdjust(mass, radius1, radius2, 0.0385, 0.05212)
} else if mass <= 0.07021 {
//radius = quad_trend(-7.888269424, 2.695209699, 0.1976541102, mass);
radius1 := PlanetRadiusHelper(mass, 0.0385, 0.2876, 0.05212, 0.3167, 0.07021, 0.348)
radius2 := PlanetRadiusHelper(mass, 0.05212, 0.3167, 0.07021, 0.348, 0.09408, 0.3814)
radius = RangeAdjust(mass, radius1, radius2, 0.05212, 0.07021)
} else if mass <= 0.09408 {
//radius = quad_trend(-4.757963763, 2.180931782, 0.218330896, mass);
radius1 := PlanetRadiusHelper(mass, 0.05212, 0.3167, 0.07021, 0.348, 0.09408, 0.3814)
radius2 := PlanetRadiusHelper(mass, 0.07021, 0.348, 0.09408, 0.3814, 0.1254, 0.417)
radius = RangeAdjust(mass, radius1, radius2, 0.07021, 0.09408)
} else if mass <= 0.1254 {
//radius = quad_trend(-2.941685354, 1.782294997, 0.2397586803, mass);
radius1 := PlanetRadiusHelper(mass, 0.07021, 0.348, 0.09408, 0.3814, 0.1254, 0.417)
radius2 := PlanetRadiusHelper(mass, 0.09408, 0.3814, 0.1254, 0.417, 0.1663, 0.4548)
radius = RangeAdjust(mass, radius1, radius2, 0.09408, 0.1254)
} else if mass <= 0.1663 {
//radius = quad_trend(-1.784894626, 1.444859141, 0.2638824172, mass);
radius1 := PlanetRadiusHelper(mass, 0.09408, 0.3814, 0.1254, 0.417, 0.1663, 0.4548)
radius2 := PlanetRadiusHelper(mass, 0.1254, 0.417, 0.1663, 0.4548, 0.2193, 0.4949)
radius = RangeAdjust(mass, radius1, radius2, 0.1254, 0.1663)
} else if mass <= 0.2193 {
//radius = quad_trend(-1.162328645, 1.204797699, 0.2865871433, mass);
radius1 := PlanetRadiusHelper(mass, 0.1254, 0.417, 0.1663, 0.4548, 0.2193, 0.4949)
radius2 := PlanetRadiusHelper(mass, 0.1663, 0.4548, 0.2193, 0.4949, 0.2877, 0.537)
radius = RangeAdjust(mass, radius1, radius2, 0.1663, 0.2193)
} else if mass <= 0.2877 {
//radius = quad_trend(-0.699716184, 0.9702531813, 0.3157745709, mass);
radius1 := PlanetRadiusHelper(mass, 0.1663, 0.4548, 0.2193, 0.4949, 0.2877, 0.537)
radius2 := PlanetRadiusHelper(mass, 0.2193, 0.4949, 0.2877, 0.537, 0.3754, 0.5814)
radius = RangeAdjust(mass, radius1, radius2, 0.2193, 0.2877)
} else if mass <= 0.3754 {
//radius = quad_trend(-0.4577736374, 0.8098210786, 0.3419049902, mass);
radius1 := PlanetRadiusHelper(mass, 0.2193, 0.4949, 0.2877, 0.537, 0.3754, 0.5814)
radius2 := PlanetRadiusHelper(mass, 0.2877, 0.537, 0.3754, 0.5814, 0.4875, 0.6279)
radius = RangeAdjust(mass, radius1, radius2, 0.2877, 0.3754)
} else if mass <= 0.4875 {
//radius = quad_trend(-0.2880355042, 0.6633540435, 0.3729683416, mass);
radius1 := PlanetRadiusHelper(mass, 0.2877, 0.537, 0.3754, 0.5814, 0.4875, 0.6279)
radius2 := PlanetRadiusHelper(mass, 0.3754, 0.5814, 0.4875, 0.6279, 0.6298, 0.6765)
radius = RangeAdjust(mass, radius1, radius2, 0.3754, 0.4875)
} else if mass <= 0.6298 {
//radius = quad_trend(-0.1883400931, 0.5519643698, 0.4035775744, mass);
radius1 := PlanetRadiusHelper(mass, 0.3754, 0.5814, 0.4875, 0.6279, 0.6298, 0.6765)
radius2 := PlanetRadiusHelper(mass, 0.4875, 0.6279, 0.6298, 0.6765, 0.8096, 0.727)
radius = RangeAdjust(mass, radius1, radius2, 0.4875, 0.6298)
} else if mass <= 0.8096 {
//radius = quad_trend(-0.1194866451, 0.4528567076, 0.4386849891, mass);
radius1 := PlanetRadiusHelper(mass, 0.4875, 0.6279, 0.6298, 0.6765, 0.8096, 0.727)
radius2 := PlanetRadiusHelper(mass, 0.6298, 0.6765, 0.8096, 0.727, 1.036, 0.7796)
radius = RangeAdjust(mass, radius1, radius2, 0.6298, 0.8096)
} else if mass <= 1.036 {
//radius = quad_trend(-0.0787318553, 0.3776396675, 0.4728678898, mass);
radius1 := PlanetRadiusHelper(mass, 0.6298, 0.6765, 0.8096, 0.727, 1.036, 0.7796)
radius2 := PlanetRadiusHelper(mass, 0.8096, 0.727, 1.036, 0.7796, 1.319, 0.834)
radius = RangeAdjust(mass, radius1, radius2, 0.8096, 1.036)
} else if mass <= 1.319 {
//radius = quad_trend(-0.0512867047, 0.313006338, 0.5103712488, mass);
radius1 := PlanetRadiusHelper(mass, 0.8096, 0.727, 1.036, 0.7796, 1.319, 0.834)
radius2 := PlanetRadiusHelper(mass, 1.036, 0.7796, 1.319, 0.834, 1.671, 0.8902)
radius = RangeAdjust(mass, radius1, radius2, 1.036, 1.319)
} else if mass <= 1.671 {
//radius = quad_trend(-0.0344294192, 0.2626030543, 0.5475255322, mass);
radius1 := PlanetRadiusHelper(mass, 1.036, 0.7796, 1.319, 0.834, 1.671, 0.8902)
radius2 := PlanetRadiusHelper(mass, 1.319, 0.834, 1.671, 0.8902, 2.108, 0.9481)
radius = RangeAdjust(mass, radius1, radius2, 1.319, 1.671)
} else if mass <= 2.108 {
//radius = quad_trend(-0.0239715508, 0.2230827695, 0.584363039, mass);
radius1 := PlanetRadiusHelper(mass, 1.319, 0.834, 1.671, 0.8902, 2.108, 0.9481)
radius2 := PlanetRadiusHelper(mass, 1.671, 0.8902, 2.108, 0.9481, 2.648, 1.007)
radius = RangeAdjust(mass, radius1, radius2, 1.671, 2.108)
} else if mass <= 2.648 {
//radius = quad_trend(-0.0140840757, 0.176057938, 0.6395547667, mass);
radius1 := PlanetRadiusHelper(mass, 1.671, 0.8902, 2.108, 0.9481, 2.648, 1.007)
radius2 := PlanetRadiusHelper(mass, 2.108, 0.9481, 2.648, 1.007, 3.31, 1.068)
radius = RangeAdjust(mass, radius1, radius2, 2.108, 2.648)
} else if mass <= 3.31 {
//radius = quad_trend(-0.0105419379, 0.154953881, 0.6706011795, mass);
radius1 := PlanetRadiusHelper(mass, 2.108, 0.9481, 2.648, 1.007, 3.31, 1.068)
radius2 := PlanetRadiusHelper(mass, 2.648, 1.007, 3.31, 1.068, 4.119, 1.13)
radius = RangeAdjust(mass, radius1, radius2, 2.648, 3.31)
} else if mass <= 4.119 {
//radius = quad_trend(-0.0070348211, 0.1288995104, 0.718416824, mass);
radius1 := PlanetRadiusHelper(mass, 2.648, 1.007, 3.31, 1.068, 4.119, 1.13)
radius2 := PlanetRadiusHelper(mass, 3.31, 1.068, 4.119, 1.13, 5.103, 1.193)
radius = RangeAdjust(mass, radius1, radius2, 3.31, 4.119)
} else if mass <= 5.103 {
//radius = quad_trend(-0.0047115353, 0.1074741683, 0.7672505662, mass);
radius1 := PlanetRadiusHelper(mass, 3.31, 1.068, 4.119, 1.13, 5.103, 1.193)
radius2 := PlanetRadiusHelper(mass, 4.119, 1.13, 5.103, 1.193, 6.293, 1.257)
radius = RangeAdjust(mass, radius1, radius2, 4.119, 5.103)
} else if mass <= 6.293 {
//radius = quad_trend(-0.003487465, 0.0935246637, 0.8065593536, mass);
radius1 := PlanetRadiusHelper(mass, 4.119, 1.13, 5.103, 1.193, 6.293, 1.257)
radius2 := PlanetRadiusHelper(mass, 5.103, 1.193, 6.293, 1.257, 7.727, 1.321)
radius = RangeAdjust(mass, radius1, radius2, 5.103, 6.293)
} else if mass <= 7.727 {
//radius = quad_trend(-0.0021560003, 0.0748575287, 0.8713031702, mass);
radius1 := PlanetRadiusHelper(mass, 5.103, 1.193, 6.293, 1.257, 7.727, 1.321)
radius2 := PlanetRadiusHelper(mass, 6.293, 1.257, 7.727, 1.321, 9.445, 1.386)
radius = RangeAdjust(mass, radius1, radius2, 6.293, 7.727)
} else if mass <= 9.445 {
//radius = quad_trend(-0.00160772, 0.0654424596, 0.9113174962, mass);
radius1 := PlanetRadiusHelper(mass, 6.293, 1.257, 7.727, 1.321, 9.445, 1.386)
radius2 := PlanetRadiusHelper(mass, 7.727, 1.321, 9.445, 1.386, 11.49, 1.451)
radius = RangeAdjust(mass, radius1, radius2, 7.727, 9.445)
} else if mass <= 11.49 {
//radius = quad_trend(-0.0012172944, 0.0572688997, 0.9536876732, mass);
radius1 := PlanetRadiusHelper(mass, 7.727, 1.321, 9.445, 1.386, 11.49, 1.451)
radius2 := PlanetRadiusHelper(mass, 9.445, 1.386, 11.49, 1.451, 13.92, 1.515)
radius = RangeAdjust(mass, radius1, radius2, 9.445, 11.49)
} else if mass <= 13.92 {
//radius = quad_trend(-7.485494E-4, 0.0453580881, 1.028659131, mass);
radius1 := PlanetRadiusHelper(mass, 9.445, 1.386, 11.49, 1.451, 13.92, 1.515)
radius2 := PlanetRadiusHelper(mass, 11.49, 1.451, 13.92, 1.515, 16.78, 1.579)
radius = RangeAdjust(mass, radius1, radius2, 11.49, 13.92)
} else if mass <= 16.78 {
//radius = quad_trend(-5.83219E-4, 0.0402824467, 1.067276595, mass);
radius1 := PlanetRadiusHelper(mass, 11.49, 1.451, 13.92, 1.515, 16.78, 1.579)
radius2 := PlanetRadiusHelper(mass, 13.92, 1.515, 16.78, 1.579, 20.14, 1.642)
radius = RangeAdjust(mass, radius1, radius2, 13.92, 16.78)
} else if mass <= 20.14 {
//radius = quad_trend(-3.979673E-4, 0.0334429539, 1.129882258, mass);
radius1 := PlanetRadiusHelper(mass, 13.92, 1.515, 16.78, 1.579, 20.14, 1.642)
radius2 := PlanetRadiusHelper(mass, 16.78, 1.579, 20.14, 1.642, 24.05, 1.704)
radius = RangeAdjust(mass, radius1, radius2, 16.78, 20.14)
} else if mass <= 24.05 {
//radius = quad_trend(-2.896199E-4, 0.0286550795, 1.182362194, mass);
radius1 := PlanetRadiusHelper(mass, 16.78, 1.579, 20.14, 1.642, 24.05, 1.704)
radius2 := PlanetRadiusHelper(mass, 20.14, 1.642, 24.05, 1.704, 28.6, 1.765)
radius = RangeAdjust(mass, radius1, radius2, 20.14, 24.05)
} else if mass <= 28.6 {
//radius = quad_trend(-2.251678E-4, 0.0252616764, 1.226694282, mass);
radius1 := PlanetRadiusHelper(mass, 20.14, 1.642, 24.05, 1.704, 28.6, 1.765)
radius2 := PlanetRadiusHelper(mass, 24.05, 1.704, 28.6, 1.765, 33.87, 1.824)
radius = RangeAdjust(mass, radius1, radius2, 24.05, 28.6)
} else if mass <= 33.87 {
//radius = quad_trend(-1.591712E-4, 0.0211388691, 1.290623996, mass);
radius1 := PlanetRadiusHelper(mass, 24.05, 1.704, 28.6, 1.765, 33.87, 1.824)
radius2 := PlanetRadiusHelper(mass, 28.6, 1.765, 33.87, 1.824, 39.94, 1.881)
radius = RangeAdjust(mass, radius1, radius2, 28.6, 33.87)
} else if mass <= 39.94 {
//radius = quad_trend(-1.04791E-4, 0.171250664, 1.364187783, mass);
radius1 := PlanetRadiusHelper(mass, 28.6, 1.765, 33.87, 1.824, 39.94, 1.881)
radius2 := PlanetRadiusHelper(mass, 33.87, 1.824, 39.94, 1.881, 46.92, 1.937)
radius = RangeAdjust(mass, radius1, radius2, 33.87, 39.94)
} else if mass <= 46.92 {
//radius = quad_trend(-9.380878E-5, 0.0161711529, 1.38476825, mass);
radius1 := PlanetRadiusHelper(mass, 33.87, 1.824, 39.94, 1.881, 46.92, 1.937)
radius2 := PlanetRadiusHelper(mass, 39.94, 1.881, 46.92, 1.937, 54.93, 1.99)
radius = RangeAdjust(mass, radius1, radius2, 39.94, 46.92)
} else if mass <= 54.93 {
//radius = quad_trend(-5.440961E-5, 0.0121583483, 1.486312324, mass);
radius1 := PlanetRadiusHelper(mass, 39.94, 1.881, 46.92, 1.937, 54.93, 1.99)
radius2 := PlanetRadiusHelper(mass, 46.92, 1.937, 54.93, 1.99, 64.08, 2.042)
radius = RangeAdjust(mass, radius1, radius2, 46.92, 54.93)
} else if mass <= 64.08 {
//radius = quad_trend(-5.031019E-5, 0.0116704759, 1.500741944, mass);
radius1 := PlanetRadiusHelper(mass, 46.92, 1.937, 54.93, 1.99, 64.08, 2.042)
radius2 := PlanetRadiusHelper(mass, 54.93, 1.99, 64.08, 2.042, 74.51, 2.091)
radius = RangeAdjust(mass, radius1, radius2, 54.93, 64.08)
} else if mass <= 74.51 {
//radius = quad_trend(-3.297829E-5, 0.0092684477, 1.583494853, mass);
radius1 := PlanetRadiusHelper(mass, 54.93, 1.99, 64.08, 2.042, 74.51, 2.091)
radius2 := PlanetRadiusHelper(mass, 64.08, 2.042, 74.51, 2.091, 86.37, 2.138)
radius = RangeAdjust(mass, radius1, radius2, 64.08, 74.51)
} else if mass <= 86.37 {
//radius = quad_trend(-2.420693E-5, 0.0078573106, 1.639942343, mass);
radius1 := PlanetRadiusHelper(mass, 64.08, 2.042, 74.51, 2.091, 86.37, 2.138)
radius2 := PlanetRadiusHelper(mass, 74.51, 2.091, 86.37, 2.138, 99.8, 2.183)
radius = RangeAdjust(mass, radius1, radius2, 74.51, 86.37)
} else if mass <= 99.8 {
//radius = quad_trend(-1.822424E-5, 0.0067435142, 1.691511445, mass);
radius1 := PlanetRadiusHelper(mass, 74.51, 2.091, 86.37, 2.138, 99.8, 2.183)
radius2 := PlanetRadiusHelper(mass, 86.37, 2.138, 99.8, 2.183, 115.0, 2.226)
radius = RangeAdjust(mass, radius1, radius2, 86.37, 99.8)
} else if mass <= 115.0 {
//radius = quad_trend(-1.516304E-5, 0.0060859676, 1.726644882, mass);
radius1 := PlanetRadiusHelper(mass, 86.37, 2.138, 99.8, 2.183, 115.0, 2.226)
radius2 := PlanetRadiusHelper(mass, 99.8, 2.183, 115.0, 2.226, 132.1, 2.266)
radius = RangeAdjust(mass, radius1, radius2, 99.8, 115.0)
} else if mass <= 132.1 {
radius1 := PlanetRadiusHelper(mass, 99.8, 2.183, 115.0, 2.226, 132.1, 2.266)
radius2 := PlanetRadiusHelper2(mass, 115.0, 2.226, 132.1, 2.266)
radius = RangeAdjust(mass, radius1, radius2, 115.0, 132.1)
} else {
//radius = ln_trend(0.8568787518, 0.2885439056, mass);
radius = PlanetRadiusHelper2(mass, 115.0, 2.226, 132.1, 2.266)
}
return radius
}
func halfRockHalfIronRadius(mass, cmf float64) float64 {
adjustForCarbon := true
var radius float64
if mass <= 0.00177 {
adjustForCarbon = false
radius1 := FractionRadius(mass, 0, 0.5, cmf)
radius2 := PlanetRadiusHelper(mass, 0.00177, 0.121, 0.002481, 0.1354, 0.003472, 0.1513)
radius = RangeAdjust(mass, radius1, radius2, 0.0, 0.00177)
} else if mass <= 0.002481 {
radius = PlanetRadiusHelper(mass, 0.00177, 0.121, 0.002481, 0.1354, 0.003472, 0.1513)
} else if mass <= 0.003472 {
//radius = quad_trend(-1373.39582492175, 24.2607963117367, 0.0836627150671445, mass);
radius1 := PlanetRadiusHelper(mass, 0.00177, 0.121, 0.002481, 0.1354, 0.003472, 0.1513)
radius2 := PlanetRadiusHelper(mass, 0.002481, 0.1354, 0.003472, 0.1513, 0.004848, 0.169)
radius = RangeAdjust(mass, radius1, radius2, 0.002481, 0.003472)
} else if mass <= 0.004848 {
//radius = quad_trend(-799.3217849, 19.51372934, 0.0931839832, mass);
radius1 := PlanetRadiusHelper(mass, 0.002481, 0.1354, 0.003472, 0.1513, 0.004848, 0.169)
radius2 := PlanetRadiusHelper(mass, 0.003472, 0.1513, 0.004848, 0.169, 0.006752, 0.1885)
radius = RangeAdjust(mass, radius1, radius2, 0.003472, 0.004848)
} else if mass <= 0.006752 {
//radius = quad_trend(-446.2529913, 15.41813134, 0.1047412297, mass);
radius1 := PlanetRadiusHelper(mass, 0.003472, 0.1513, 0.004848, 0.169, 0.006752, 0.1885)
radius2 := PlanetRadiusHelper(mass, 0.004848, 0.169, 0.006752, 0.1885, 0.00938, 0.2101)
radius = RangeAdjust(mass, radius1, radius2, 0.004848, 0.006752)
} else if mass <= 0.00938 {
//radius = quad_trend(-260.7214329, 12.42513624, 0.1164916409, mass);
radius1 := PlanetRadiusHelper(mass, 0.004848, 0.169, 0.006752, 0.1885, 0.00938, 0.2101)
radius2 := PlanetRadiusHelper(mass, 0.006752, 0.1885, 0.00938, 0.2101, 0.01299, 0.2339)
radius = RangeAdjust(mass, radius1, radius2, 0.006752, 0.00938)
} else if mass <= 0.01299 {
//radius = quad_trend(-152.0702736, 9.994609805, 0.1297303718, mass);
radius1 := PlanetRadiusHelper(mass, 0.006752, 0.1885, 0.00938, 0.2101, 0.01299, 0.2339)
radius2 := PlanetRadiusHelper(mass, 0.00938, 0.2101, 0.01299, 0.2339, 0.01792, 0.26)
radius = RangeAdjust(mass, radius1, radius2, 0.00938, 0.01299)
} else if mass <= 0.01792 {
//radius = quad_trend(-89.11289838, 8.048597336, 0.14438564, mass);
radius1 := PlanetRadiusHelper(mass, 0.00938, 0.2101, 0.01299, 0.2339, 0.01792, 0.26)
radius2 := PlanetRadiusHelper(mass, 0.01299, 0.2339, 0.01792, 0.26, 0.02464, 0.2886)
radius = RangeAdjust(mass, radius1, radius2, 0.01299, 0.01792)
} else if mass <= 0.02464 {
//radius = quad_trend(-52.58495246, 6.493967957, 0.1605145107, mass);
radius1 := PlanetRadiusHelper(mass, 0.01299, 0.2339, 0.01792, 0.26, 0.02464, 0.2886)
// NOTE: the trailing 0.03372 below repeats the mass value where a radius
// (somewhere between 0.2886 and 0.3535, judging by the neighboring points)
// is expected; this looks like a data-entry typo that recurs through the
// next two branches.
radius2 := PlanetRadiusHelper(mass, 0.01792, 0.26, 0.02464, 0.2886, 0.03372, 0.03372)
radius = RangeAdjust(mass, radius1, radius2, 0.01792, 0.02464)
} else if mass <= 0.03372 {
//radius = quad_trend(-31.03774295, 5.236472811, 0.1784172424, mass);
radius1 := PlanetRadiusHelper(mass, 0.01792, 0.26, 0.02464, 0.2886, 0.03372, 0.03372)
radius2 := PlanetRadiusHelper(mass, 0.02464, 0.2886, 0.03372, 0.03372, 0.04595, 0.3535)
radius = RangeAdjust(mass, radius1, radius2, 0.02464, 0.03372)
} else if mass <= 0.04595 {
//radius = quad_trend(-18.41664973, 4.230950314, 0.1979727934, mass);
radius1 := PlanetRadiusHelper(mass, 0.02464, 0.2886, 0.03372, 0.03372, 0.04595, 0.3535)
radius2 := PlanetRadiusHelper(mass, 0.03372, 0.03372, 0.04595, 0.3535, 0.06231, 0.3901)
radius = RangeAdjust(mass, radius1, radius2, 0.03372, 0.04595)
} else if mass <= 0.06231 {
//radius = quad_trend(-11.08681604, 3.437422519, 0.2189591664, mass);
radius1 := PlanetRadiusHelper(mass, 0.03372, 0.03372, 0.04595, 0.3535, 0.06231, 0.3901)
radius2 := PlanetRadiusHelper(mass, 0.04595, 0.3535, 0.06231, 0.3901, 0.08408, 0.4296)
radius = RangeAdjust(mass, radius1, radius2, 0.04595, 0.06231)
} else if mass <= 0.08408 {
//radius = quad_trend(-6.715816383, 2.797551879, 0.241858942, mass);
radius1 := PlanetRadiusHelper(mass, 0.04595, 0.3535, 0.06231, 0.3901, 0.08408, 0.4296)
radius2 := PlanetRadiusHelper(mass, 0.06231, 0.3901, 0.08408, 0.4296, 0.1129, 0.4721)
radius = RangeAdjust(mass, radius1, radius2, 0.06231, 0.08408)
} else if mass <= 0.1129 {
//radius = quad_trend(-4.069306668, 2.276242395, 0.2669812848, mass);
radius1 := PlanetRadiusHelper(mass, 0.06231, 0.3901, 0.08408, 0.4296, 0.1129, 0.4721)
radius2 := PlanetRadiusHelper(mass, 0.08408, 0.4296, 0.1129, 0.4721, 0.1508, 0.5177)
radius = RangeAdjust(mass, radius1, radius2, 0.08408, 0.1129)
} else if mass <= 0.1508 {
//radius = quad_trend(-2.532586328, 1.871009242, 0.2931444403, mass);
radius1 := PlanetRadiusHelper(mass, 0.08408, 0.4296, 0.1129, 0.4721, 0.1508, 0.5177)
radius2 := PlanetRadiusHelper(mass, 0.1129, 0.4721, 0.1508, 0.5177, 0.2003, 0.5663)
radius = RangeAdjust(mass, radius1, radius2, 0.1129, 0.1508)
} else if mass <= 0.2003 {
//radius = quad_trend(-1.57175725, 1.533662152, 0.3221665132, mass);
radius1 := PlanetRadiusHelper(mass, 0.1129, 0.4721, 0.1508, 0.5177, 0.2003, 0.5663)
radius2 := PlanetRadiusHelper(mass, 0.1508, 0.5177, 0.2003, 0.5663, 0.2647, 0.618)
radius = RangeAdjust(mass, radius1, radius2, 0.1508, 0.2003)
} else if mass <= 0.2647 {
//radius = quad_trend(-0.9812585362, 1.25908025, 0.3534744066, mass);
radius1 := PlanetRadiusHelper(mass, 0.1508, 0.5177, 0.2003, 0.5663, 0.2647, 0.618)
radius2 := PlanetRadiusHelper(mass, 0.2003, 0.5663, 0.2647, 0.618, 0.348, 0.6738)
radius = RangeAdjust(mass, radius1, radius2, 0.2003, 0.2647)
} else if mass <= 0.348 {
//radius = quad_trend(-0.6134611137, 1.03373077, 0.3873542869, mass);
radius1 := PlanetRadiusHelper(mass, 0.2003, 0.5663, 0.2647, 0.618, 0.348, 0.6738)
radius2 := PlanetRadiusHelper(mass, 0.2647, 0.618, 0.348, 0.6738, 0.455, 0.7307)
radius = RangeAdjust(mass, radius1, radius2, 0.2647, 0.348)
} else if mass <= 0.455 {
//radius = quad_trend(-0.3947160298, 0.8580784672, 0.4219903835, mass);
radius1 := PlanetRadiusHelper(mass, 0.2647, 0.618, 0.348, 0.6738, 0.455, 0.7307)
radius2 := PlanetRadiusHelper(mass, 0.348, 0.6738, 0.455, 0.7307, 0.5919, 0.7916)
radius = RangeAdjust(mass, radius1, radius2, 0.348, 0.455)
} else if mass <= 0.5919 {
//radius = quad_trend(-0.251475037, 0.7081194719, 0.4605672598, mass);
radius1 := PlanetRadiusHelper(mass, 0.348, 0.6738, 0.455, 0.7307, 0.5919, 0.7916)
radius2 := PlanetRadiusHelper(mass, 0.455, 0.7307, 0.5919, 0.7916, 0.7659, 0.8554)
radius = RangeAdjust(mass, radius1, radius2, 0.455, 0.5919)
} else if mass <= 0.7659 {
//radius = quad_trend(-0.1614376955, 0.5858667696, 0.501384447, mass);
radius1 := PlanetRadiusHelper(mass, 0.455, 0.7307, 0.5919, 0.7916, 0.7659, 0.8554)
radius2 := PlanetRadiusHelper(mass, 0.5919, 0.7916, 0.7659, 0.8554, 0.986, 0.9221)
radius = RangeAdjust(mass, radius1, radius2, 0.5919, 0.7659)
} else if mass <= 0.986 {
//radius = quad_trend(-0.1082398009, 0.4926693781, 0.5415582947, mass);
radius1 := PlanetRadiusHelper(mass, 0.5919, 0.7916, 0.7659, 0.8554, 0.986, 0.9221)
radius2 := PlanetRadiusHelper(mass, 0.7659, 0.8554, 0.986, 0.9221, 1.261, 0.9907)
radius = RangeAdjust(mass, radius1, radius2, 0.7659, 0.986)
} else if mass <= 1.261 {
//radius = quad_trend(-0.0690127077, 0.4045260997, 0.5903311441, mass);
radius1 := PlanetRadiusHelper(mass, 0.7659, 0.8554, 0.986, 0.9221, 1.261, 0.9907)
radius2 := PlanetRadiusHelper(mass, 0.986, 0.9221, 1.261, 0.9907, 1.606, 1.062)
radius = RangeAdjust(mass, radius1, radius2, 0.986, 1.261)
} else if mass <= 1.606 {
//radius = quad_trend(-0.0446111528, 0.3345668417, 0.6397483435, mass);
radius1 := PlanetRadiusHelper(mass, 0.986, 0.9221, 1.261, 0.9907, 1.606, 1.062)
radius2 := PlanetRadiusHelper(mass, 1.261, 0.9907, 1.606, 1.062, 2.036, 1.136)
radius = RangeAdjust(mass, radius1, radius2, 1.261, 1.606)
} else if mass <= 2.036 {
//radius = quad_trend(-0.0323446774, 0.2898923383, 0.679857461, mass);
radius1 := PlanetRadiusHelper(mass, 1.261, 0.9907, 1.606, 1.062, 2.036, 1.136)
radius2 := PlanetRadiusHelper(mass, 1.606, 1.062, 2.036, 1.136, 2.568, 1.211)
radius = RangeAdjust(mass, radius1, radius2, 1.606, 2.036)
} else if mass <= 2.568 {
//radius = quad_trend(-0.0188541262, 0.2277818406, 0.7503921064, mass);
radius1 := PlanetRadiusHelper(mass, 1.606, 1.062, 2.036, 1.136, 2.568, 1.211)
radius2 := PlanetRadiusHelper(mass, 2.036, 1.136, 2.568, 1.211, 3.226, 1.289)
radius = RangeAdjust(mass, radius1, radius2, 2.036, 2.568)
} else if mass <= 3.226 {
//radius = quad_trend(-0.0131731218, 0.1948661011, 0.7974556375, mass);
radius1 := PlanetRadiusHelper(mass, 2.036, 1.136, 2.568, 1.211, 3.226, 1.289)
radius2 := PlanetRadiusHelper(mass, 2.568, 1.211, 3.226, 1.289, 4.032, 1.369)
radius = RangeAdjust(mass, radius1, radius2, 2.568, 3.226)
} else if mass <= 4.032 {
//radius = quad_trend(-0.0089795106, 0.1644288708, 0.8520029117, mass);
radius1 := PlanetRadiusHelper(mass, 2.568, 1.211, 3.226, 1.289, 4.032, 1.369)
radius2 := PlanetRadiusHelper(mass, 3.226, 1.289, 4.032, 1.369, 5.018, 1.451)
radius = RangeAdjust(mass, radius1, radius2, 3.226, 4.032)
} else if mass <= 5.018 {
//radius = quad_trend(-0.0063563019, 0.1406888322, 0.9050771807, mass);
radius1 := PlanetRadiusHelper(mass, 3.226, 1.289, 4.032, 1.369, 5.018, 1.451)
radius2 := PlanetRadiusHelper(mass, 4.032, 1.369, 5.018, 1.451, 6.216, 1.534)
radius = RangeAdjust(mass, radius1, radius2, 4.032, 5.018)
} else if mass <= 6.216 {
//radius = quad_trend(-0.0042731856, 0.1172871044, 0.970053509, mass);
radius1 := PlanetRadiusHelper(mass, 4.032, 1.369, 5.018, 1.451, 6.216, 1.534)
radius2 := PlanetRadiusHelper(mass, 5.018, 1.451, 6.216, 1.534, 7.665, 1.618)
radius = RangeAdjust(mass, radius1, radius2, 5.018, 6.216)
} else if mass <= 7.665 {
//radius = quad_trend(-0.0032875812, 0.1036059286, 1.017013265, mass);
radius1 := PlanetRadiusHelper(mass, 5.018, 1.451, 6.216, 1.534, 7.665, 1.618)
radius2 := PlanetRadiusHelper(mass, 6.216, 1.534, 7.665, 1.618, 9.39, 1.7)
radius = RangeAdjust(mass, radius1, radius2, 6.216, 7.665)
} else if mass <= 9.39 {
//radius = quad_trend(-0.0019141267, 0.0801816636, 1.115866754, mass);
radius1 := PlanetRadiusHelper(mass, 6.216, 1.534, 7.665, 1.618, 9.39, 1.7)
radius2 := PlanetRadiusHelper(mass, 7.665, 1.618, 9.39, 1.7, 11.45, 1.783)
radius = RangeAdjust(mass, radius1, radius2, 7.665, 9.39)
} else if mass <= 11.45 {
//radius = quad_trend(-0.0015393648, 0.0723716241, 1.156159475, mass);
radius1 := PlanetRadiusHelper(mass, 7.665, 1.618, 9.39, 1.7, 11.45, 1.783)
radius2 := PlanetRadiusHelper(mass, 9.39, 1.7, 11.45, 1.783, 13.91, 1.865)
radius = RangeAdjust(mass, radius1, radius2, 9.39, 11.45)
} else if mass <= 13.91 {
//radius = quad_trend(-9.435581E-4, 0.057261966, 1.251053311, mass);
radius1 := PlanetRadiusHelper(mass, 9.39, 1.7, 11.45, 1.783, 13.91, 1.865)
radius2 := PlanetRadiusHelper(mass, 11.45, 1.783, 13.91, 1.865, 16.81, 1.947)
radius = RangeAdjust(mass, radius1, radius2, 11.45, 13.91)
} else if mass <= 16.81 {
//radius = quad_trend(-7.534048E-4, 0.0514204578, 1.295516297, mass);
radius1 := PlanetRadiusHelper(mass, 11.45, 1.783, 13.91, 1.865, 16.81, 1.947)
radius2 := PlanetRadiusHelper(mass, 13.91, 1.865, 16.81, 1.947, 20.21, 2.027)
radius = RangeAdjust(mass, radius1, radius2, 13.91, 16.81)
} else if mass <= 20.21 {
//radius = quad_trend(-5.047244E-4, 0.0422143076, 1.380000531, mass);
radius1 := PlanetRadiusHelper(mass, 13.91, 1.865, 16.81, 1.947, 20.21, 2.027)
radius2 := PlanetRadiusHelper(mass, 16.81, 1.947, 20.21, 2.027, 24.2, 2.106)
radius = RangeAdjust(mass, radius1, radius2, 16.81, 20.21)
} else if mass <= 24.2 {
//radius = quad_trend(-3.750415E-4, 0.0364550938, 1.443426061, mass);
radius1 := PlanetRadiusHelper(mass, 16.81, 1.947, 20.21, 2.027, 24.2, 2.106)
radius2 := PlanetRadiusHelper(mass, 20.21, 2.027, 24.2, 2.106, 28.85, 2.183)
radius = RangeAdjust(mass, radius1, radius2, 20.21, 24.2)
} else if mass <= 28.85 {
//radius = quad_trend(-2.610787E-4, 0.0304093647, 1.522991503, mass);
radius1 := PlanetRadiusHelper(mass, 20.21, 2.027, 24.2, 2.106, 28.85, 2.183)
radius2 := PlanetRadiusHelper(mass, 24.2, 2.106, 28.85, 2.183, 34.23, 2.258)
radius = RangeAdjust(mass, radius1, radius2, 24.2, 28.85)
} else if mass <= 34.23 {
//radius = quad_trend(-1.90016E-4, 0.0259267321, 1.593168402, mass);
radius1 := PlanetRadiusHelper(mass, 24.2, 2.106, 28.85, 2.183, 34.23, 2.258)
radius2 := PlanetRadiusHelper(mass, 28.85, 2.183, 34.23, 2.258, 40.45, 2.331)
radius = RangeAdjust(mass, radius1, radius2, 28.85, 34.23)
} else if mass <= 40.45 {
//radius = quad_trend(-1.446417E-4, 0.022538175, 1.655993898, mass);
radius1 := PlanetRadiusHelper(mass, 28.85, 2.183, 34.23, 2.258, 40.45, 2.331)
radius2 := PlanetRadiusHelper(mass, 34.23, 2.258, 40.45, 2.331, 47.59, 2.401)
radius = RangeAdjust(mass, radius1, radius2, 34.23, 40.45)
} else if mass <= 47.59 {
//radius = quad_trend(-1.04715E-4, 0.0190230332, 1.732853309, mass);
radius1 := PlanetRadiusHelper(mass, 34.23, 2.258, 40.45, 2.331, 47.59, 2.401)
radius2 := PlanetRadiusHelper(mass, 40.45, 2.331, 47.59, 2.401, 55.76, 2.468)
radius = RangeAdjust(mass, radius1, radius2, 40.45, 47.59)
} else if mass <= 55.76 {
//radius = quad_trend(-8.202615E-5, 0.016678137, 1.793060949, mass);
radius1 := PlanetRadiusHelper(mass, 40.45, 2.331, 47.59, 2.401, 55.76, 2.468)
radius2 := PlanetRadiusHelper(mass, 47.59, 2.401, 55.76, 2.468, 65.07, 2.531)
radius = RangeAdjust(mass, radius1, radius2, 47.59, 55.76)
} else if mass <= 65.07 {
//radius = quad_trend(-5.984705E-5, 0.0139982359, 1.873533463, mass);
radius1 := PlanetRadiusHelper(mass, 47.59, 2.401, 55.76, 2.468, 65.07, 2.531)
radius2 := PlanetRadiusHelper(mass, 55.76, 2.468, 65.07, 2.531, 75.65, 2.59)
radius = RangeAdjust(mass, radius1, radius2, 55.76, 65.07)
} else if mass <= 75.65 {
//radius = quad_trend(-4.398699E-5, 0.0117664086, 1.951605315, mass);
radius1 := PlanetRadiusHelper(mass, 55.76, 2.468, 65.07, 2.531, 75.65, 2.59)
radius2 := PlanetRadiusHelper(mass, 65.07, 2.531, 75.65, 2.59, 87.65, 2.645)
radius = RangeAdjust(mass, radius1, radius2, 65.07, 75.65)
} else if mass <= 87.65 {
//radius = quad_trend(-2.918571E-5, 0.0093493602, 2.049748472, mass);
radius1 := PlanetRadiusHelper(mass, 65.07, 2.531, 75.65, 2.59, 87.65, 2.645)
radius2 := PlanetRadiusHelper(mass, 75.65, 2.59, 87.65, 2.645, 101.2, 2.697)
radius = RangeAdjust(mass, radius1, radius2, 75.65, 87.65)
} else if mass <= 101.2 {
//radius = quad_trend(-2.427672E-5, 0.0084222976, 2.093292089, mass);
radius1 := PlanetRadiusHelper(mass, 75.65, 2.59, 87.65, 2.645, 101.2, 2.697)
radius2 := PlanetRadiusHelper(mass, 87.65, 2.645, 101.2, 2.697, 116.5, 2.745)
radius = RangeAdjust(mass, radius1, radius2, 87.65, 101.2)
} else if mass <= 116.5 {
//radius = quad_trend(-1.821786E-5, 0.0071032835, 2.164724855, mass);
radius1 := PlanetRadiusHelper(mass, 87.65, 2.645, 101.2, 2.697, 116.5, 2.745)
radius2 := PlanetRadiusHelper(mass, 101.2, 2.697, 116.5, 2.745, 133.8, 2.789)
radius = RangeAdjust(mass, radius1, radius2, 101.2, 116.5)
} else if mass <= 133.8 {
//radius = quad_trend(-1.286376E-5, 0.0057631526, 2.248182937, mass);
radius1 := PlanetRadiusHelper(mass, 101.2, 2.697, 116.5, 2.745, 133.8, 2.789)
radius2 := PlanetRadiusHelper(mass, 116.5, 2.745, 133.8, 2.789, 153.1, 2.829)
radius = RangeAdjust(mass, radius1, radius2, 116.5, 133.8)
} else if mass <= 153.1 {
radius1 := PlanetRadiusHelper(mass, 116.5, 2.745, 133.8, 2.789, 153.1, 2.829)
radius2 := PlanetRadiusHelper2(mass, 133.8, 2.789, 153.1, 2.829)
radius = RangeAdjust(mass, radius1, radius2, 133.8, 153.1)
} else {
//radius = ln_trend(1.335486915, 0.2968566848, mass);
radius = PlanetRadiusHelper2(mass, 133.8, 2.789, 153.1, 2.829)
}
if adjustForCarbon {
rmf := 0.5
carbonFraction := rmf * cmf
growFactor := (0.05 * carbonFraction) + 1.0
radius *= growFactor
}
return radius
}
package views
import (
"sync"
"github.com/gdamore/tcell/v2"
)
// TextBar is a Widget that provides a single line of text, but with
// distinct left, center, and right areas. Each of the areas can be styled
// differently, and they align to the left, center, and right respectively.
// This is basically a convenience type on top of Text and BoxLayout.
type TextBar struct {
changed bool
style tcell.Style
left Text
right Text
center Text
view View
lview ViewPort
rview ViewPort
cview ViewPort
once sync.Once
WidgetWatchers
}
// SetCenter sets the center text for the textbar. The text is
// always center aligned.
func (t *TextBar) SetCenter(s string, style tcell.Style) {
t.initialize()
if style == tcell.StyleDefault {
style = t.style
}
t.center.SetText(s)
t.center.SetStyle(style)
}
// SetLeft sets the left text for the textbar. It is always left-aligned.
func (t *TextBar) SetLeft(s string, style tcell.Style) {
t.initialize()
if style == tcell.StyleDefault {
style = t.style
}
t.left.SetText(s)
t.left.SetStyle(style)
}
// SetRight sets the right text for the textbar. It is always right-aligned.
func (t *TextBar) SetRight(s string, style tcell.Style) {
t.initialize()
if style == tcell.StyleDefault {
style = t.style
}
t.right.SetText(s)
t.right.SetStyle(style)
}
// SetStyle is used to set a default style to use for the textbar, including
// areas where no text is present. Note that this will not change the text
// already displayed, so call this before changing or setting text.
func (t *TextBar) SetStyle(style tcell.Style) {
t.initialize()
t.style = style
}
func (t *TextBar) initialize() {
t.once.Do(func() {
t.center.SetView(&t.cview)
t.left.SetView(&t.lview)
t.right.SetView(&t.rview)
t.center.SetAlignment(VAlignTop | HAlignCenter)
t.left.SetAlignment(VAlignTop | HAlignLeft)
t.right.SetAlignment(VAlignTop | HAlignRight)
t.center.Watch(t)
t.left.Watch(t)
t.right.Watch(t)
})
}
func (t *TextBar) layout() {
w, _ := t.view.Size()
ww, wh := t.left.Size()
t.lview.Resize(0, 0, ww, wh)
ww, wh = t.center.Size()
t.cview.Resize((w-ww)/2, 0, ww, wh)
ww, wh = t.right.Size()
t.rview.Resize(w-ww, 0, ww, wh)
t.changed = false
}
// SetView sets the View drawing context for this TextBar.
func (t *TextBar) SetView(view View) {
t.initialize()
t.view = view
t.lview.SetView(view)
t.rview.SetView(view)
t.cview.SetView(view)
t.changed = true
}
// Draw draws the TextBar into its View context.
func (t *TextBar) Draw() {
t.initialize()
if t.changed {
t.layout()
}
w, h := t.view.Size()
for y := 0; y < h; y++ {
for x := 0; x < w; x++ {
t.view.SetContent(x, y, ' ', nil, t.style)
}
}
// Draw in reverse order -- if we clip, we will clip at the
// right side.
t.right.Draw()
t.center.Draw()
t.left.Draw()
}
// Resize is called when the TextBar's View changes size, and
// updates the layout.
func (t *TextBar) Resize() {
t.initialize()
t.layout()
t.left.Resize()
t.center.Resize()
t.right.Resize()
t.PostEventWidgetResize(t)
}
// Size implements the Size method for Widget, returning the width
// and height in character cells.
func (t *TextBar) Size() (int, int) {
w, h := 0, 0
ww, wh := t.left.Size()
w += ww
if wh > h {
h = wh
}
ww, wh = t.center.Size()
w += ww
if wh > h {
h = wh
}
ww, wh = t.right.Size()
w += ww
if wh > h {
h = wh
}
return w, h
}
// HandleEvent handles incoming events. The only events handled are
// those for the Text objects; when those change, the TextBar adjusts
// the layout to accommodate.
func (t *TextBar) HandleEvent(ev tcell.Event) bool {
switch ev.(type) {
case *EventWidgetContent:
t.changed = true
return true
}
return false
}
// NewTextBar creates an empty, initialized TextBar.
func NewTextBar() *TextBar {
t := &TextBar{}
t.initialize()
return t
}
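TextBar.Size above aggregates the three segments by summing their widths and taking the tallest one as the height. A standalone sketch of that aggregation, with illustrative names (barSize is not part of the package):

```go
package main

import "fmt"

// barSize combines per-segment (width, height) pairs the way TextBar.Size
// does: widths add up, and the height is that of the tallest segment.
func barSize(sizes [][2]int) (int, int) {
	w, h := 0, 0
	for _, s := range sizes {
		w += s[0]
		if s[1] > h {
			h = s[1]
		}
	}
	return w, h
}

func main() {
	// left 5x1, center 10x2, right 7x1 -> total 22 wide, 2 tall
	w, h := barSize([][2]int{{5, 1}, {10, 2}, {7, 1}})
	fmt.Println(w, h)
}
```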
package vtctld
import (
"fmt"
"sort"
"sync"
"vitess.io/vitess/go/vt/discovery"
"vitess.io/vitess/go/vt/topo/topoproto"
topodatapb "vitess.io/vitess/go/vt/proto/topodata"
)
const (
// tabletMissing represents a missing/non-existent tablet for any metric.
tabletMissing = -1
// These values represent the threshold for replication lag.
lagThresholdDegraded = 60
lagThresholdUnhealthy = 120
// These values represent the health of the tablet: 0 is healthy, 1 is degraded, 2 is unhealthy.
tabletHealthy = 0
tabletDegraded = 1
tabletUnhealthy = 2
)
// yLabel is used to keep track of the cell and type labels of the heatmap.
type yLabel struct {
CellLabel label
TypeLabels []label
}
// label is used to keep track of one label of a heatmap and how many rows it should span.
type label struct {
Name string
Rowspan int
}
// heatmap stores all the needed info to construct the heatmap.
type heatmap struct {
// Data is a 2D array of values of the specified metric.
Data [][]float64
// Aliases is a 2D array holding references to the tablet aliases.
Aliases [][]*topodatapb.TabletAlias
KeyspaceLabel label
CellAndTypeLabels []yLabel
ShardLabels []string
// YGridLines is used to draw gridLines on the map in the right places.
YGridLines []float64
}
type byTabletUID []*discovery.TabletStats
func (a byTabletUID) Len() int { return len(a) }
func (a byTabletUID) Swap(i, j int) { a[i], a[j] = a[j], a[i] }
func (a byTabletUID) Less(i, j int) bool { return a[i].Tablet.Alias.Uid < a[j].Tablet.Alias.Uid }
// availableTabletTypes is an array of tabletTypes that are being considered to display on the heatmap.
// Note: this list must always be sorted by the order they should appear (i.e. MASTER first, then REPLICA, then RDONLY)
var availableTabletTypes = []topodatapb.TabletType{topodatapb.TabletType_MASTER, topodatapb.TabletType_REPLICA, topodatapb.TabletType_RDONLY}
// tabletStatsCache holds the most recent status update received for
// each tablet. The tablets are indexed by uid, so it is different
// than discovery.TabletStatsCache.
type tabletStatsCache struct {
// mu guards access to the fields below.
mu sync.Mutex
// statuses keeps a map of TabletStats.
// The first key is the keyspace, the second key is the shard,
// the third key is the cell, the last key is the tabletType.
// The keys are strings to allow exposing this map as a JSON object in api.go.
statuses map[string]map[string]map[string]map[topodatapb.TabletType][]*discovery.TabletStats
// statusesByAlias is a copy of statuses and will be updated simultaneously.
// The first key is the string representation of the tablet alias.
statusesByAlias map[string]*discovery.TabletStats
}
type topologyInfo struct {
Keyspaces []string
Cells []string
TabletTypes []string
}
func newTabletStatsCache() *tabletStatsCache {
return &tabletStatsCache{
statuses: make(map[string]map[string]map[string]map[topodatapb.TabletType][]*discovery.TabletStats),
statusesByAlias: make(map[string]*discovery.TabletStats),
}
}
// StatsUpdate is part of the discovery.HealthCheckStatsListener interface.
// Upon receiving a new TabletStats, it updates the two maps in tablet_stats_cache.
func (c *tabletStatsCache) StatsUpdate(stats *discovery.TabletStats) {
c.mu.Lock()
defer c.mu.Unlock()
keyspace := stats.Tablet.Keyspace
shard := stats.Tablet.Shard
cell := stats.Tablet.Alias.Cell
tabletType := stats.Tablet.Type
aliasKey := tabletToMapKey(stats)
ts, ok := c.statusesByAlias[aliasKey]
if !stats.Up {
if !ok {
// Tablet doesn't exist and was recently deleted or changed its type. Panic as this is unexpected behavior.
panic(fmt.Sprintf("BUG: tablet (%v) doesn't exist", aliasKey))
}
// The tablet still exists in our cache but was recently deleted or changed its type. Delete it now.
c.statuses[keyspace][shard][cell][tabletType] = remove(c.statuses[keyspace][shard][cell][tabletType], stats.Tablet.Alias)
delete(c.statusesByAlias, aliasKey)
return
}
if !ok {
// Tablet isn't tracked yet so just add it.
_, ok := c.statuses[keyspace]
if !ok {
shards := make(map[string]map[string]map[topodatapb.TabletType][]*discovery.TabletStats)
c.statuses[keyspace] = shards
}
_, ok = c.statuses[keyspace][shard]
if !ok {
cells := make(map[string]map[topodatapb.TabletType][]*discovery.TabletStats)
c.statuses[keyspace][shard] = cells
}
_, ok = c.statuses[keyspace][shard][cell]
if !ok {
types := make(map[topodatapb.TabletType][]*discovery.TabletStats)
c.statuses[keyspace][shard][cell] = types
}
_, ok = c.statuses[keyspace][shard][cell][tabletType]
if !ok {
tablets := make([]*discovery.TabletStats, 0)
c.statuses[keyspace][shard][cell][tabletType] = tablets
}
c.statuses[keyspace][shard][cell][tabletType] = append(c.statuses[keyspace][shard][cell][tabletType], stats)
sort.Sort(byTabletUID(c.statuses[keyspace][shard][cell][tabletType]))
c.statusesByAlias[aliasKey] = stats
return
}
// Tablet already exists so just update it in the cache.
*ts = *stats
}
func tabletToMapKey(stats *discovery.TabletStats) string {
return stats.Tablet.Alias.String()
}
// remove takes in an array and returns it with the specified element removed
// (leaves the array unchanged if element isn't in the array).
func remove(tablets []*discovery.TabletStats, tabletAlias *topodatapb.TabletAlias) []*discovery.TabletStats {
filteredTablets := tablets[:0]
for _, tablet := range tablets {
if !topoproto.TabletAliasEqual(tablet.Tablet.Alias, tabletAlias) {
filteredTablets = append(filteredTablets, tablet)
}
}
return filteredTablets
}
func (c *tabletStatsCache) topologyInfo(selectedKeyspace, selectedCell string) *topologyInfo {
c.mu.Lock()
defer c.mu.Unlock()
return &topologyInfo{
Keyspaces: c.keyspacesLocked("all"),
Cells: c.cellsInTopology(selectedKeyspace),
TabletTypes: makeStringTypeList(c.typesInTopology(selectedKeyspace, selectedCell)),
}
}
func makeStringTypeList(types []topodatapb.TabletType) []string {
var list []string
for _, t := range types {
list = append(list, t.String())
}
return list
}
// keyspacesLocked returns the keyspaces to be displayed in the heatmap based on the dropdown filters.
// It returns one keyspace if a specific one was chosen or returns all of them if 'all' is chosen.
// This method is used by heatmapData to traverse over desired keyspaces and
// topologyInfo to send all available options for the keyspace dropdown.
func (c *tabletStatsCache) keyspacesLocked(keyspace string) []string {
if keyspace != "all" {
return []string{keyspace}
}
var keyspaces []string
for ks := range c.statuses {
keyspaces = append(keyspaces, ks)
}
sort.Strings(keyspaces)
return keyspaces
}
// cellsLocked returns the cells needed to be displayed in the heatmap based on the dropdown filters.
// It returns one cell if a specific one was chosen or returns all of them if 'all' is chosen.
// This method is used by heatmapData to traverse over the desired cells.
func (c *tabletStatsCache) cellsLocked(keyspace, cell string) []string {
if cell != "all" {
return []string{cell}
}
return c.cellsInTopology(keyspace)
}
// tabletTypesLocked returns the tablet types needed to be displayed in the heatmap based on the dropdown filters.
// It returns one tablet type if a specific one was chosen, or all of them if 'all' is chosen for keyspace and/or cell.
// This method is used by heatmapData to traverse over the desired tablet types.
func (c *tabletStatsCache) tabletTypesLocked(keyspace, cell, tabletType string) []topodatapb.TabletType {
if tabletType != "all" {
tabletTypeObj, _ := topoproto.ParseTabletType(tabletType)
return []topodatapb.TabletType{tabletTypeObj}
}
return c.typesInTopology(keyspace, cell)
}
// cellsInTopology returns all the cells in the given keyspace.
// If all keyspaces is chosen, it returns the cells from every keyspace.
// This method is used by topologyInfo to send all available options for the cell dropdown
func (c *tabletStatsCache) cellsInTopology(keyspace string) []string {
keyspaces := c.keyspacesLocked(keyspace)
cells := make(map[string]bool)
// Going through all shards in each keyspace to get all existing cells
for _, ks := range keyspaces {
shardsPerKeyspace := c.statuses[ks]
for s := range shardsPerKeyspace {
cellsInKeyspace := c.statuses[ks][s]
for cl := range cellsInKeyspace {
cells[cl] = true
}
}
}
var cellList []string
for cell := range cells {
cellList = append(cellList, cell)
}
sort.Strings(cellList)
return cellList
}
// typesInTopology returns all the types in the given keyspace and cell.
// If all keyspaces and cells is chosen, it returns the types from every cell in every keyspace.
// This method is used by topologyInfo to send all available options for the tablet type dropdown
func (c *tabletStatsCache) typesInTopology(keyspace, cell string) []topodatapb.TabletType {
keyspaces := c.keyspacesLocked(keyspace)
types := make(map[topodatapb.TabletType]bool)
// Going through the shards in every cell in every keyspace to get existing tablet types
for _, ks := range keyspaces {
cellsPerKeyspace := c.cellsLocked(ks, cell)
for _, cl := range cellsPerKeyspace {
shardsPerKeyspace := c.statuses[ks]
for s := range shardsPerKeyspace {
typesPerShard := c.statuses[ks][s][cl]
for t := range typesPerShard {
types[t] = true
if len(types) == len(availableTabletTypes) {
break
}
}
}
}
}
typesList := sortTypes(types)
return typesList
}
func sortTypes(types map[topodatapb.TabletType]bool) []topodatapb.TabletType {
var listOfTypes []topodatapb.TabletType
for _, tabType := range availableTabletTypes {
if t := types[tabType]; t {
listOfTypes = append(listOfTypes, tabType)
}
}
return listOfTypes
}
func (c *tabletStatsCache) shards(keyspace string) []string {
var shards []string
for s := range c.statuses[keyspace] {
shards = append(shards, s)
}
sort.Strings(shards)
return shards
}
// heatmapData returns a 2D array of data (based on the specified metric) as well as the labels for the heatmap.
func (c *tabletStatsCache) heatmapData(selectedKeyspace, selectedCell, selectedTabletType, selectedMetric string) ([]heatmap, error) {
c.mu.Lock()
defer c.mu.Unlock()
// Get the metric data.
var metricFunc func(stats *discovery.TabletStats) float64
switch selectedMetric {
case "lag":
metricFunc = replicationLag
case "qps":
metricFunc = qps
case "health":
metricFunc = health
default:
return nil, fmt.Errorf("invalid metric: %v; select 'lag', 'qps', or 'health'", selectedMetric)
}
// Get the proper data (unaggregated tablets or aggregated tablets by types)
aggregated := false
if selectedKeyspace == "all" && selectedTabletType == "all" {
aggregated = true
}
keyspaces := c.keyspacesLocked(selectedKeyspace)
var heatmaps []heatmap
for _, keyspace := range keyspaces {
var h heatmap
h.ShardLabels = c.shards(keyspace)
keyspaceLabelSpan := 0
cells := c.cellsLocked(keyspace, selectedCell)
// The loop goes through every outer label (in this case, cell).
for _, cell := range cells {
var cellData [][]float64
var cellAliases [][]*topodatapb.TabletAlias
var cellLabel yLabel
if aggregated {
cellData, cellAliases, cellLabel = c.aggregatedData(keyspace, cell, selectedTabletType, selectedMetric, metricFunc)
} else {
cellData, cellAliases, cellLabel = c.unaggregatedData(keyspace, cell, selectedTabletType, metricFunc)
}
if cellLabel.CellLabel.Rowspan > 0 {
// Iterating over the rows of data for the current cell.
for i := 0; i < len(cellData); i++ {
// Adding the data in reverse to match the format that the plotly map takes in.
h.Data = append([][]float64{cellData[i]}, h.Data...)
if cellAliases != nil {
h.Aliases = append([][]*topodatapb.TabletAlias{cellAliases[i]}, h.Aliases...)
}
}
h.CellAndTypeLabels = append(h.CellAndTypeLabels, cellLabel)
}
keyspaceLabelSpan += cellLabel.CellLabel.Rowspan
}
// Setting the values for the yGridLines by going in reverse and subtracting 0.5 as an offset.
sum := 0
for c := len(h.CellAndTypeLabels) - 1; c >= 0; c-- {
// If the current view is aggregated then we need to traverse the cell labels
// to calculate the values for the grid line since that is the innermost label.
// For example if h.CellAndTypeLabels =
// { CellLabel: {Name: 'cell1', Rowspan: 2}, TypeLabels: nil },
// { CellLabel: {Name: 'cell2', Rowspan: 3}, TypeLabels: nil },
// then the resulting array will be [2.5, 4.5] which specifies the grid line indexes
// starting from 0 which is at the bottom of the heatmap.
if h.CellAndTypeLabels[c].TypeLabels == nil {
sum += h.CellAndTypeLabels[c].CellLabel.Rowspan
h.YGridLines = append(h.YGridLines, (float64(sum) - 0.5))
continue
}
// Otherwise traverse the type labels because that is the innermost label.
// For example if h.CellAndTypeLabels =
// { CellLabel: {Name: 'cell1', Rowspan: 3}, TypeLabels: [{Name: 'Master', Rowspan: 1}, {Name: 'Replica', Rowspan: 2}] },
// { CellLabel: {Name: 'cell2', Rowspan: 3}, TypeLabels: [{Name: 'Master', Rowspan: 1}, {Name: 'Replica', Rowspan: 2}] },
// then the resulting array will be [1.5, 2.5, 4.5, 5.5] which specifies the grid line indexes
// starting from 0 which is at the bottom of the heatmap.
for t := len(h.CellAndTypeLabels[c].TypeLabels) - 1; t >= 0; t-- {
sum += h.CellAndTypeLabels[c].TypeLabels[t].Rowspan
h.YGridLines = append(h.YGridLines, (float64(sum) - 0.5))
}
}
h.KeyspaceLabel = label{Name: keyspace, Rowspan: keyspaceLabelSpan}
heatmaps = append(heatmaps, h)
}
return heatmaps, nil
}
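The grid-line arithmetic described in the comments above can be checked with a stand-alone sketch; `label` and `yLabel` here are simplified local stand-ins for the package's real types, not the actual definitions.

```go
package main

import "fmt"

type label struct {
	Name    string
	Rowspan int
}

type yLabel struct {
	CellLabel  label
	TypeLabels []label
}

// yGridLines mirrors the loop above: walk the labels in reverse, accumulate
// rowspans of the innermost label, and subtract 0.5 as the offset per line.
func yGridLines(labels []yLabel) []float64 {
	var out []float64
	sum := 0
	for c := len(labels) - 1; c >= 0; c-- {
		if labels[c].TypeLabels == nil {
			sum += labels[c].CellLabel.Rowspan
			out = append(out, float64(sum)-0.5)
			continue
		}
		for t := len(labels[c].TypeLabels) - 1; t >= 0; t-- {
			sum += labels[c].TypeLabels[t].Rowspan
			out = append(out, float64(sum)-0.5)
		}
	}
	return out
}

func main() {
	aggregated := []yLabel{
		{CellLabel: label{"cell1", 2}},
		{CellLabel: label{"cell2", 3}},
	}
	fmt.Println(yGridLines(aggregated)) // [2.5 4.5]

	unaggregated := []yLabel{
		{CellLabel: label{"cell1", 3}, TypeLabels: []label{{"Master", 1}, {"Replica", 2}}},
		{CellLabel: label{"cell2", 3}, TypeLabels: []label{{"Master", 1}, {"Replica", 2}}},
	}
	fmt.Println(yGridLines(unaggregated)) // [1.5 2.5 4.5 5.5]
}
```

Both printed slices match the worked examples in the comments, which is a quick way to confirm the reverse-iteration bookkeeping.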
func (c *tabletStatsCache) unaggregatedData(keyspace, cell, selectedType string, metricFunc func(stats *discovery.TabletStats) float64) ([][]float64, [][]*topodatapb.TabletAlias, yLabel) {
// This loop goes through every nested label (in this case, tablet type).
var cellData [][]float64
var cellAliases [][]*topodatapb.TabletAlias
var cellLabel yLabel
cellLabelSpan := 0
tabletTypes := c.tabletTypesLocked(keyspace, cell, selectedType)
shards := c.shards(keyspace)
for _, tabletType := range tabletTypes {
maxRowLength := 0
// The loop calculates the maximum number of rows needed.
for _, shard := range shards {
tabletsCount := len(c.statuses[keyspace][shard][cell][tabletType])
if maxRowLength < tabletsCount {
maxRowLength = tabletsCount
}
}
// dataRowsPerType is a 2D array that will hold the data of the tablets of one (cell, type) combination.
dataRowsPerType := make([][]float64, maxRowLength)
// aliasRowsPerType is a 2D array that will hold the aliases of the tablets of one (cell, type) combination.
aliasRowsPerType := make([][]*topodatapb.TabletAlias, maxRowLength)
for i := range dataRowsPerType {
dataRowsPerType[i] = make([]float64, len(shards))
aliasRowsPerType[i] = make([]*topodatapb.TabletAlias, len(shards))
}
// Filling in the 2D array with tablet data by columns.
for shardIndex, shard := range shards {
for tabletIndex := 0; tabletIndex < maxRowLength; tabletIndex++ {
// If the tablet index exceeds the number of tablets present, the tablet is missing and its data is set to -1 (tabletMissing).
if tabletIndex < len(c.statuses[keyspace][shard][cell][tabletType]) {
dataRowsPerType[tabletIndex][shardIndex] = metricFunc(c.statuses[keyspace][shard][cell][tabletType][tabletIndex])
aliasRowsPerType[tabletIndex][shardIndex] = c.statuses[keyspace][shard][cell][tabletType][tabletIndex].Tablet.Alias
} else {
dataRowsPerType[tabletIndex][shardIndex] = tabletMissing
aliasRowsPerType[tabletIndex][shardIndex] = nil
}
}
}
if maxRowLength > 0 {
cellLabel.TypeLabels = append(cellLabel.TypeLabels, label{Name: tabletType.String(), Rowspan: maxRowLength})
}
cellLabelSpan += maxRowLength
for i := 0; i < len(dataRowsPerType); i++ {
cellData = append(cellData, dataRowsPerType[i])
cellAliases = append(cellAliases, aliasRowsPerType[i])
}
}
cellLabel.CellLabel = label{Name: cell, Rowspan: cellLabelSpan}
return cellData, cellAliases, cellLabel
}
// aggregatedData gets heatmapData by taking the average of the metric value of all tablets within the keyspace and cell of the
// specified type (or from all types if 'all' was selected).
func (c *tabletStatsCache) aggregatedData(keyspace, cell, selectedType, selectedMetric string, metricFunc func(stats *discovery.TabletStats) float64) ([][]float64, [][]*topodatapb.TabletAlias, yLabel) {
shards := c.shards(keyspace)
tabletTypes := c.tabletTypesLocked(keyspace, cell, selectedType)
var cellData [][]float64
dataRow := make([]float64, len(shards))
// This loop goes through each shard in the (keyspace-cell) combination.
for shardIndex, shard := range shards {
var sum, count float64
hasTablets := false
unhealthyFound := false
// Going through all the types of tablets and aggregating their information.
for _, tabletType := range tabletTypes {
tablets, ok := c.statuses[keyspace][shard][cell][tabletType]
if !ok {
continue
}
for _, tablet := range tablets {
hasTablets = true
// If even one tablet is unhealthy then the entire group becomes unhealthy.
metricVal := metricFunc(tablet)
if (selectedMetric == "health" && metricVal == tabletUnhealthy) ||
(selectedMetric == "lag" && metricVal > lagThresholdUnhealthy) {
sum = metricVal
count = 1
unhealthyFound = true
break
}
sum += metricVal
count++
}
if unhealthyFound {
break
}
}
if hasTablets {
dataRow[shardIndex] = (sum / count)
} else {
dataRow[shardIndex] = tabletMissing
}
}
cellData = append(cellData, dataRow)
cellLabel := yLabel{
CellLabel: label{Name: cell, Rowspan: 1},
}
return cellData, nil, cellLabel
}
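The aggregation rule above averages the metric across tablets, but a single unhealthy tablet overrides the average for the whole group. A minimal sketch of that rule for the "health" metric, using assumed constant values (the real `tabletUnhealthy`, `tabletMissing`, etc. are defined elsewhere in the package):

```go
package main

import "fmt"

const (
	tabletMissing   = -1.0
	tabletUnhealthy = 3.0 // assumed value; the real constant lives elsewhere in the package
)

// aggregateHealth mirrors the shard loop above: average the metric values,
// but let any unhealthy tablet dominate the whole group.
func aggregateHealth(values []float64) float64 {
	var sum, count float64
	for _, v := range values {
		if v == tabletUnhealthy {
			return tabletUnhealthy
		}
		sum += v
		count++
	}
	if count == 0 {
		return tabletMissing
	}
	return sum / count
}

func main() {
	fmt.Println(aggregateHealth([]float64{1, 1, 2}))               // average ≈ 1.33
	fmt.Println(aggregateHealth([]float64{1, tabletUnhealthy, 1})) // 3 (unhealthy wins)
	fmt.Println(aggregateHealth(nil))                              // -1 (no tablets)
}
```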
func (c *tabletStatsCache) tabletStats(tabletAlias *topodatapb.TabletAlias) (discovery.TabletStats, error) {
c.mu.Lock()
defer c.mu.Unlock()
ts, ok := c.statusesByAlias[tabletAlias.String()]
if !ok {
return discovery.TabletStats{}, fmt.Errorf("could not find tablet: %v", tabletAlias)
}
return *ts, nil
}
func health(stat *discovery.TabletStats) float64 {
// The tablet is unhealthy if there is a health error.
if stat.Stats.HealthError != "" {
return tabletUnhealthy
}
// The tablet is healthy/degraded/unhealthy depending on the lag.
lag := stat.Stats.SecondsBehindMaster
switch {
case lag >= lagThresholdUnhealthy:
return tabletUnhealthy
case lag >= lagThresholdDegraded:
return tabletDegraded
}
// The tablet is degraded if there was an error previously.
if stat.LastError != nil {
return tabletDegraded
}
// The tablet is healthy or degraded based on serving status.
if !stat.Serving {
return tabletDegraded
}
// All else is ok so tablet is healthy.
return tabletHealthy
}
func replicationLag(stat *discovery.TabletStats) float64 {
return float64(stat.Stats.SecondsBehindMaster)
}
func qps(stat *discovery.TabletStats) float64 {
return stat.Stats.Qps
}
// compile-time interface check
var _ discovery.HealthCheckStatsListener = (*tabletStatsCache)(nil)

// Source file: go/vt/vtctld/tablet_stats_cache.go
package iso20022
// Account between an investor(s) and a fund manager or a fund. The account can contain holdings in any investment fund or investment fund class managed (or distributed) by the fund manager, within the same fund family.
type InvestmentAccount34 struct {
// Unique and unambiguous identification for the account between the account owner and the account servicer.
Identification *AccountIdentification1 `xml:"Id,omitempty"`
// Name of the account. It provides an additional means of identification, and is designated by the account servicer in agreement with the account owner.
Name *Max35Text `xml:"Nm,omitempty"`
// Supplementary registration information applying to a specific block of units for dealing and reporting purposes. The supplementary registration information may be used when all the units are registered, for example, to a funds supermarket, but holdings for each investor have to be reconciled individually.
Designation *Max35Text `xml:"Dsgnt,omitempty"`
// Purpose of the account/source fund type. This is typically linked to an investment product, for example, wrapper, ISA.
Type *AccountType1Choice `xml:"Tp,omitempty"`
// Ownership status of the account, for example, joint owners.
OwnershipType *OwnershipType1Choice `xml:"OwnrshTp"`
// Tax advantage specific to the account.
TaxExemption *TaxExemptionReason1Choice `xml:"TaxXmptn,omitempty"`
// Frequency at which a statement is issued.
StatementFrequency *StatementFrequencyReason1Choice `xml:"StmtFrqcy,omitempty"`
// Currency chosen for reporting purposes by the account owner in agreement with the account servicer.
ReferenceCurrency *ActiveCurrencyCode `xml:"RefCcy,omitempty"`
// Language for all communication concerning the account.
Language *LanguageCode `xml:"Lang,omitempty"`
// Dividend option chosen by the account owner based on the options offered in the prospectus.
IncomePreference *IncomePreference1Code `xml:"IncmPref,omitempty"`
// Method by which the tax (withholding tax) is to be processed i.e. either withheld at source or tax information reported to tax authorities or tax information is reported due to the provision of a tax certificate.
TaxWithholdingMethod *TaxWithholdingMethod2Code `xml:"TaxWhldgMtd,omitempty"`
// Details of the letter of intent.
LetterIntentDetails *LetterIntent1 `xml:"LttrInttDtls,omitempty"`
// Reference of an accumulation rights program, in which sales commissions are based on a customer's present purchases of shares and the aggregate quantity previously purchased by the customer. An accumulation rights program is mainly used in the US market.
AccumulationRightReference *Max35Text `xml:"AcmltnRghtRef,omitempty"`
// Number of account owners or related parties required to authorise transactions on the account.
RequiredSignatoriesNumber *Number `xml:"ReqrdSgntriesNb,omitempty"`
// Name of the investment fund family.
FundFamilyName *Max350Text `xml:"FndFmlyNm,omitempty"`
// Detailed information about the investment fund associated to the account.
FundDetails []*FinancialInstrument29 `xml:"FndDtls,omitempty"`
// Parameters to be applied on deal amount for orders when the amount is a fractional number.
RoundingDetails *RoundingParameters1 `xml:"RndgDtls,omitempty"`
// Party that manages the account on behalf of the account owner, that is manages the registration and booking of entries on the account, calculates balances on the account and provides information about the account.
AccountServicer *PartyIdentification2Choice `xml:"AcctSvcr,omitempty"`
// Specifies information about blocked accounts.
BlockedStatus []*Blocked1 `xml:"BlckdSts,omitempty"`
// Specifies the type of usage of the account.
AccountUsageType *AccountUsageType1Choice `xml:"AcctUsgTp,omitempty"`
// Specifies if documentary evidence has been provided for the foreign resident.
ForeignStatusCertification *Provided1Code `xml:"FrgnStsCertfctn,omitempty"`
// Date the investor signs the open account form.
AccountSignatureDateTime *DateAndDateTimeChoice `xml:"AcctSgntrDtTm,omitempty"`
}
func (i *InvestmentAccount34) AddIdentification() *AccountIdentification1 {
i.Identification = new(AccountIdentification1)
return i.Identification
}
func (i *InvestmentAccount34) SetName(value string) {
i.Name = (*Max35Text)(&value)
}
func (i *InvestmentAccount34) SetDesignation(value string) {
i.Designation = (*Max35Text)(&value)
}
func (i *InvestmentAccount34) AddType() *AccountType1Choice {
i.Type = new(AccountType1Choice)
return i.Type
}
func (i *InvestmentAccount34) AddOwnershipType() *OwnershipType1Choice {
i.OwnershipType = new(OwnershipType1Choice)
return i.OwnershipType
}
func (i *InvestmentAccount34) AddTaxExemption() *TaxExemptionReason1Choice {
i.TaxExemption = new(TaxExemptionReason1Choice)
return i.TaxExemption
}
func (i *InvestmentAccount34) AddStatementFrequency() *StatementFrequencyReason1Choice {
i.StatementFrequency = new(StatementFrequencyReason1Choice)
return i.StatementFrequency
}
func (i *InvestmentAccount34) SetReferenceCurrency(value string) {
i.ReferenceCurrency = (*ActiveCurrencyCode)(&value)
}
func (i *InvestmentAccount34) SetLanguage(value string) {
i.Language = (*LanguageCode)(&value)
}
func (i *InvestmentAccount34) SetIncomePreference(value string) {
i.IncomePreference = (*IncomePreference1Code)(&value)
}
func (i *InvestmentAccount34) SetTaxWithholdingMethod(value string) {
i.TaxWithholdingMethod = (*TaxWithholdingMethod2Code)(&value)
}
func (i *InvestmentAccount34) AddLetterIntentDetails() *LetterIntent1 {
i.LetterIntentDetails = new(LetterIntent1)
return i.LetterIntentDetails
}
func (i *InvestmentAccount34) SetAccumulationRightReference(value string) {
i.AccumulationRightReference = (*Max35Text)(&value)
}
func (i *InvestmentAccount34) SetRequiredSignatoriesNumber(value string) {
i.RequiredSignatoriesNumber = (*Number)(&value)
}
func (i *InvestmentAccount34) SetFundFamilyName(value string) {
i.FundFamilyName = (*Max350Text)(&value)
}
func (i *InvestmentAccount34) AddFundDetails() *FinancialInstrument29 {
newValue := new(FinancialInstrument29)
i.FundDetails = append(i.FundDetails, newValue)
return newValue
}
func (i *InvestmentAccount34) AddRoundingDetails() *RoundingParameters1 {
i.RoundingDetails = new(RoundingParameters1)
return i.RoundingDetails
}
func (i *InvestmentAccount34) AddAccountServicer() *PartyIdentification2Choice {
i.AccountServicer = new(PartyIdentification2Choice)
return i.AccountServicer
}
func (i *InvestmentAccount34) AddBlockedStatus() *Blocked1 {
newValue := new(Blocked1)
i.BlockedStatus = append(i.BlockedStatus, newValue)
return newValue
}
func (i *InvestmentAccount34) AddAccountUsageType() *AccountUsageType1Choice {
i.AccountUsageType = new(AccountUsageType1Choice)
return i.AccountUsageType
}
func (i *InvestmentAccount34) SetForeignStatusCertification(value string) {
i.ForeignStatusCertification = (*Provided1Code)(&value)
}
func (i *InvestmentAccount34) AddAccountSignatureDateTime() *DateAndDateTimeChoice {
i.AccountSignatureDateTime = new(DateAndDateTimeChoice)
return i.AccountSignatureDateTime
}

// Source file: InvestmentAccount34.go
package ring
import (
"crypto/rand"
"math"
"math/bits"
)
func computeMatrixTernary(p float64) (M [][]uint8) {
var g float64
var x uint64
precision := uint64(56)
M = make([][]uint8, 2)
g = p
g *= math.Exp2(float64(precision))
x = uint64(g)
M[0] = make([]uint8, precision-1)
for j := uint64(0); j < precision-1; j++ {
M[0][j] = uint8((x >> (precision - j - 1)) & 1)
}
g = 1 - p
g *= math.Exp2(float64(precision))
x = uint64(g)
M[1] = make([]uint8, precision-1)
for j := uint64(0); j < precision-1; j++ {
M[1][j] = uint8((x >> (precision - j - 1)) & 1)
}
return M
}
// sampleTernary samples coefficients with a ternary distribution on the target polynomial, using the provided sampler matrix.
func (context *Context) sampleTernary(samplerMatrix [][]uint64, p float64, pol *Poly) {
if p == 0 {
panic("cannot sample -> p = 0")
}
var coeff uint64
var sign uint64
var index uint64
if p == 0.5 {
randomBytesCoeffs := make([]byte, context.N>>3)
randomBytesSign := make([]byte, context.N>>3)
if _, err := rand.Read(randomBytesCoeffs); err != nil {
panic("crypto rand error")
}
if _, err := rand.Read(randomBytesSign); err != nil {
panic("crypto rand error")
}
for i := uint64(0); i < context.N; i++ {
coeff = uint64(uint8(randomBytesCoeffs[i>>3])>>(i&7)) & 1
sign = uint64(uint8(randomBytesSign[i>>3])>>(i&7)) & 1
index = (coeff & (sign ^ 1)) | ((sign & coeff) << 1)
for j := range context.Modulus {
pol.Coeffs[j][i] = samplerMatrix[j][index] //(coeff & (sign^1)) | (qi - 1) * (sign & coeff)
}
}
} else {
matrix := computeMatrixTernary(p)
randomBytes := make([]byte, 8)
pointer := uint8(0)
if _, err := rand.Read(randomBytes); err != nil {
panic("crypto rand error")
}
for i := uint64(0); i < context.N; i++ {
coeff, sign, randomBytes, pointer = kysampling(matrix, randomBytes, pointer)
index = (coeff & (sign ^ 1)) | ((sign & coeff) << 1)
for j := range context.Modulus {
pol.Coeffs[j][i] = samplerMatrix[j][index] //(coeff & (sign^1)) | (qi - 1) * (sign & coeff)
}
}
}
}
func (context *Context) SampleTernaryUniform(pol *Poly) {
context.sampleTernary(context.matrixTernary, 1.0/3.0, pol)
}
func (context *Context) SampleTernary(pol *Poly, p float64) {
context.sampleTernary(context.matrixTernary, p, pol)
}
func (context *Context) SampleTernaryMontgomery(pol *Poly, p float64) {
context.sampleTernary(context.matrixTernaryMontgomery, p, pol)
}
// SampleTernaryNew samples a new polynomial with ternary distribution.
func (context *Context) SampleTernaryNew(p float64) (pol *Poly) {
pol = context.NewPoly()
context.SampleTernary(pol, p)
return pol
}
// SampleTernaryMontgomeryNew samples a new polynomial with ternary distribution in Montgomery form.
func (context *Context) SampleTernaryMontgomeryNew(p float64) (pol *Poly) {
pol = context.NewPoly()
context.SampleTernaryMontgomery(pol, p)
return
}
// SampleTernaryNTTNew samples a new polynomial with ternary distribution in the NTT domain.
func (context *Context) SampleTernaryNTTNew(p float64) (pol *Poly) {
pol = context.NewPoly()
context.SampleTernary(pol, p)
context.NTT(pol, pol)
return
}
// SampleTernaryNTT samples coefficients with ternary distribution in the NTT domain on the target polynomial.
func (context *Context) SampleTernaryNTT(pol *Poly, p float64) {
context.SampleTernary(pol, p)
context.NTT(pol, pol)
}
// SampleTernaryMontgomeryNTTNew samples a new polynomial with ternary distribution in the NTT domain and in Montgomery form.
func (context *Context) SampleTernaryMontgomeryNTTNew(p float64) (pol *Poly) {
pol = context.SampleTernaryMontgomeryNew(p)
context.NTT(pol, pol)
return
}
// SampleTernaryMontgomeryNTT samples coefficients with ternary distribution in the NTT domain and in Montgomery form on the target polynomial.
func (context *Context) SampleTernaryMontgomeryNTT(pol *Poly, p float64) {
context.SampleTernaryMontgomery(pol, p)
context.NTT(pol, pol)
}
// sampleTernarySparse samples a key with coefficients uniformly distributed in {-1, 1} and exactly hw nonzero coefficients.
func (context *Context) sampleTernarySparse(samplerMatrix [][]uint64, pol *Poly, hw uint64) {
if hw > context.N {
hw = context.N
}
var mask, j uint64
var coeff uint8
index := make([]uint64, context.N)
for i := uint64(0); i < context.N; i++ {
index[i] = i
}
randomBytes := make([]byte, (uint64(math.Ceil(float64(hw) / 8.0)))) // We sample ceil(hw/8) bytes
pointer := uint8(0)
if _, err := rand.Read(randomBytes); err != nil {
panic("crypto rand error")
}
for i := uint64(0); i < hw; i++ {
mask = (1 << uint64(bits.Len64(context.N-i))) - 1 // rejection sampling of a random variable between [0, len(index)]
j = randInt32(mask)
for j >= context.N-i {
j = randInt32(mask)
}
coeff = (uint8(randomBytes[0]) >> (i & 7)) & 1 // random binary digit [0, 1] from the random bytes
for i := range context.Modulus {
pol.Coeffs[i][index[j]] = samplerMatrix[i][coeff]
}
// Removes the element in position j of the slice (order not preserved)
index[j] = index[len(index)-1]
index = index[:len(index)-1]
pointer += 1
if pointer == 8 {
randomBytes = randomBytes[1:]
pointer = 0
}
}
}
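The index bookkeeping above uses the classic swap-remove idiom: overwrite position j with the last element and shrink the slice. It is O(1) but destroys ordering, which is fine here since `index` only tracks which coefficient positions remain unused. A minimal sketch:

```go
package main

import "fmt"

// swapRemove deletes s[j] in O(1) by moving the last element into its place.
// The slice order is not preserved.
func swapRemove(s []uint64, j int) []uint64 {
	s[j] = s[len(s)-1]
	return s[:len(s)-1]
}

func main() {
	index := []uint64{0, 1, 2, 3, 4}
	index = swapRemove(index, 1)
	fmt.Println(index) // [0 4 2 3]
}
```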
func (context *Context) SampleTernarySparse(pol *Poly, hw uint64) {
context.sampleTernarySparse(context.matrixTernary, pol, hw)
}
func (context *Context) SampleTernarySparseNew(hw uint64) (pol *Poly) {
pol = context.NewPoly()
context.SampleTernarySparse(pol, hw)
return pol
}
func (context *Context) SampleTernarySparseNTT(pol *Poly, hw uint64) {
context.SampleTernarySparse(pol, hw)
context.NTT(pol, pol)
}
func (context *Context) SampleTernarySparseNTTNew(hw uint64) (pol *Poly) {
pol = context.NewPoly()
context.sampleTernarySparse(context.matrixTernaryMontgomery, pol, hw)
context.NTT(pol, pol)
return pol
}
func (context *Context) SampleTernarySparseMontgomery(pol *Poly, hw uint64) {
context.sampleTernarySparse(context.matrixTernaryMontgomery, pol, hw)
}
func (context *Context) SampleSparseMontgomeryNew(hw uint64) (pol *Poly) {
pol = context.NewPoly()
context.sampleTernarySparse(context.matrixTernaryMontgomery, pol, hw)
return pol
}
func (context *Context) SampleTernarySparseMontgomeryNTTNew(hw uint64) (pol *Poly) {
pol = context.SampleSparseMontgomeryNew(hw)
context.NTT(pol, pol)
return pol
}
func (context *Context) SampleTernarySparseMontgomeryNTT(pol *Poly, hw uint64) {
context.SampleTernarySparseMontgomery(pol, hw)
context.NTT(pol, pol)
}

// Source file: HE/ring/ternarySampler.go
package segmentindex
import (
"bytes"
"encoding/binary"
"fmt"
"math"
"sort"
"github.com/pkg/errors"
)
type Tree struct {
nodes []*Node
}
type Node struct {
Key []byte
Start uint64
End uint64
}
func NewTree(capacity int) Tree {
return Tree{
nodes: make([]*Node, 0, capacity),
}
}
func NewBalanced(nodes []Node) Tree {
t := Tree{nodes: make([]*Node, len(nodes))}
// sort the slice just once
sort.Slice(nodes, func(a, b int) bool {
return bytes.Compare(nodes[a].Key, nodes[b].Key) < 0
})
t.buildBalanced(nodes, 0, 0, len(nodes)-1)
return t
}
func (t *Tree) buildBalanced(nodes []Node, targetPos, leftBound, rightBound int) {
t.grow(targetPos)
if leftBound > rightBound {
return
}
mid := (leftBound + rightBound) / 2
t.nodes[targetPos] = &nodes[mid]
t.buildBalanced(nodes, t.left(targetPos), leftBound, mid-1)
t.buildBalanced(nodes, t.right(targetPos), mid+1, rightBound)
}
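NewBalanced lays the tree out in heap order (children of index i live at 2i+1 and 2i+2), placing the middle of each sorted sub-range at the current position. An in-order walk of that layout must therefore reproduce the sorted input. A sketch on plain ints (zero stands in for the nil nodes of the real tree, so values must be nonzero):

```go
package main

import (
	"fmt"
	"sort"
)

// buildBalanced mirrors the method above on plain ints: put the middle of
// vals[lo..hi] at pos, then recurse into the heap-indexed children.
func buildBalanced(tree []int, vals []int, pos, lo, hi int) {
	if lo > hi {
		return
	}
	mid := (lo + hi) / 2
	tree[pos] = vals[mid]
	buildBalanced(tree, vals, 2*pos+1, lo, mid-1)
	buildBalanced(tree, vals, 2*pos+2, mid+1, hi)
}

// inorder walks the heap-laid-out tree left, node, right.
func inorder(tree []int, pos int, out *[]int) {
	if pos >= len(tree) || tree[pos] == 0 {
		return
	}
	inorder(tree, 2*pos+1, out)
	*out = append(*out, tree[pos])
	inorder(tree, 2*pos+2, out)
}

func main() {
	vals := []int{30, 10, 50, 20, 40}
	sort.Ints(vals) // [10 20 30 40 50]

	tree := make([]int, 16) // zero means "empty slot", like the nil nodes above
	buildBalanced(tree, vals, 0, 0, len(vals)-1)

	var got []int
	inorder(tree, 0, &got)
	fmt.Println(got) // [10 20 30 40 50]
}
```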
func (t *Tree) Insert(key []byte, start, end uint64) {
newNode := Node{
Key: key,
Start: start,
End: end,
}
if len(t.nodes) == 0 {
t.nodes = append(t.nodes, &newNode)
return
}
t.insertAt(0, newNode)
}
func (t *Tree) insertAt(nodeID int, newNode Node) {
if !t.exists(nodeID) {
// we are at the target and can insert now
t.grow(nodeID)
t.nodes[nodeID] = &newNode
return
}
if bytes.Equal(newNode.Key, t.nodes[nodeID].Key) {
// this key already exists, which is an unexpected situation for an index
// key
panic(fmt.Sprintf("duplicate key %s", newNode.Key))
}
if bytes.Compare(newNode.Key, t.nodes[nodeID].Key) < 0 {
t.insertAt(t.left(nodeID), newNode)
} else {
t.insertAt(t.right(nodeID), newNode)
}
}
func (t *Tree) Get(key []byte) ([]byte, uint64, uint64) {
if len(t.nodes) == 0 {
return nil, 0, 0
}
return t.getAt(0, key)
}
func (t *Tree) getAt(nodeID int, key []byte) ([]byte, uint64, uint64) {
if !t.exists(nodeID) {
return nil, 0, 0
}
node := t.nodes[nodeID]
if bytes.Equal(node.Key, key) {
return node.Key, node.Start, node.End
}
if bytes.Compare(key, node.Key) < 0 {
return t.getAt(t.left(nodeID), key)
} else {
return t.getAt(t.right(nodeID), key)
}
}
func (t Tree) left(i int) int {
return 2*i + 1
}
func (t Tree) right(i int) int {
return 2*i + 2
}
func (t *Tree) exists(i int) bool {
if i >= len(t.nodes) {
return false
}
return t.nodes[i] != nil
}
// size calculates the exact size of this node on disk, which is helpful to
// figure out each node's offset
func (n *Node) size() int {
if n == nil {
return 0
}
size := 0
size += 4 // uint32 for key length
size += len(n.Key)
size += 8 // uint64 startPos
size += 8 // uint64 endPos
size += 8 // int64 pointer left child
size += 8 // int64 pointer right child
return size
}
func (t *Tree) grow(i int) {
if i < len(t.nodes) {
return
}
oldSize := len(t.nodes)
newSize := oldSize
for newSize <= i {
newSize += oldSize
}
newNodes := make([]*Node, newSize)
copy(newNodes, t.nodes)
for i := range t.nodes {
t.nodes[i] = nil
}
t.nodes = newNodes
}
func (t *Tree) MarshalBinary() ([]byte, error) {
offsets, size := t.calculateDiskOffsets()
buf := bytes.NewBuffer(nil)
for i, node := range t.nodes {
if node == nil {
continue
}
var leftOffset int64
var rightOffset int64
if t.exists(t.left(i)) {
leftOffset = int64(offsets[t.left(i)])
} else {
leftOffset = -1
}
if t.exists(t.right(i)) {
rightOffset = int64(offsets[t.right(i)])
} else {
rightOffset = -1
}
if len(node.Key) > math.MaxUint32 {
return nil, errors.Errorf("max key size is %d", math.MaxUint32)
}
keyLen := uint32(len(node.Key))
if err := binary.Write(buf, binary.LittleEndian, &keyLen); err != nil {
return nil, err
}
if _, err := buf.Write(node.Key); err != nil {
return nil, err
}
if err := binary.Write(buf, binary.LittleEndian, &node.Start); err != nil {
return nil, err
}
if err := binary.Write(buf, binary.LittleEndian, &node.End); err != nil {
return nil, err
}
if err := binary.Write(buf, binary.LittleEndian, &leftOffset); err != nil {
return nil, err
}
if err := binary.Write(buf, binary.LittleEndian, &rightOffset); err != nil {
return nil, err
}
}
bytes := buf.Bytes()
if size != len(bytes) {
return nil, errors.Errorf("corrupt: wrote %d bytes with target %d", len(bytes), size)
}
return bytes, nil
}
// returns individual offsets and the total size; nil nodes contribute zero
func (t *Tree) calculateDiskOffsets() ([]int, int) {
current := 0
out := make([]int, len(t.nodes))
for i, node := range t.nodes {
out[i] = current
size := node.size()
current += size
}
return out, current
}
func (t *Tree) Height() int {
var highestElem int
for i := len(t.nodes) - 1; i >= 0; i-- {
if t.nodes[i] != nil {
highestElem = i
break
}
}
return int(math.Ceil(math.Log2(float64(highestElem))))
}

// Source file: adapters/repos/db/lsmkv/segmentindex/tree.go
package gocv
import (
"github.com/fwessels/go-cv-simd/sse2"
)
//AbsDifferenceSum gets sum of absolute difference of two gray 8-bit images.
// Both images must have the same width and height.
func AbsDifferenceSum(a, b gocvsimd.View) uint64 {
return gocvsimd.SimdSse2AbsDifferenceSum(a, b)
}
// AbsDifferenceSumMasked gets sum of absolute difference of two gray 8-bit images based on gray 8-bit mask.
// Gets the absolute difference sum for all points when mask[i] == index.
// Both images and mask must have the same width and height.
func AbsDifferenceSumMasked(a, b, mask gocvsimd.View, index uint8) uint64 {
return gocvsimd.SimdSse2AbsDifferenceSumMasked(a, b, mask, uint64(index))
}
// AbsDifferenceSums3x3 gets 9 sums of absolute difference of two gray 8-bit images with various relative shifts in neighborhood 3x3.
// Both images must have the same width and height. The image height and width must be equal or greater 3.
// The sums are calculated with central part (indent width = 1) of the current image and with part of the background image with corresponding shift.
// The shifts are lain in the range [-1, 1] for axis x and y.
func AbsDifferenceSums3x3(current, background gocvsimd.View) [9]uint64 {
return gocvsimd.SimdSse2AbsDifferenceSums3x3(current, background)
}
// AbsDifferenceSums3x3Masked gets 9 sums of absolute difference of two gray 8-bit images with various relative shifts in neighborhood 3x3 based on gray 8-bit mask.
// Gets the absolute difference sums for all points when mask[i] == index.
// Both images and mask must have the same width and height. The image height and width must be equal or greater 3.
// The sums are calculated with central part (indent width = 1) of the current image and with part of the background image with the corresponding shift.
// The shifts are lain in the range [-1, 1] for axis x and y.
func AbsDifferenceSums3x3Masked(current, background, mask gocvsimd.View, index uint8) [9]uint64 {
return gocvsimd.SimdSse2AbsDifferenceSums3x3Masked(current, background, mask, uint64(index))
}
// AbsGradientSaturatedSum puts to destination 8-bit gray image saturated sum of absolute gradient for every point of source 8-bit gray image.
// Both images must have the same width and height.
// For border pixels:
// dst[x, y] = 0;
// For other pixels:
// dx = abs(src[x + 1, y] - src[x - 1, y]);
// dy = abs(src[x, y + 1] - src[x, y - 1]);
// dst[x, y] = min(dx + dy, 255);
func AbsGradientSaturatedSum(src, dst gocvsimd.View) {
gocvsimd.SimdSse2AbsGradientSaturatedSum(src, dst)
}
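The formula in the comment above (border pixels zero, interior pixels min(|dx|+|dy|, 255)) is easy to verify with a scalar reference on a tiny image; this is a sketch of the semantics, not the SIMD implementation:

```go
package main

import "fmt"

// absGradientSaturatedSum is a scalar reference for the SIMD routine:
// border pixels are 0; interior pixels get min(|dx|+|dy|, 255).
func absGradientSaturatedSum(src [][]uint8) [][]uint8 {
	h, w := len(src), len(src[0])
	dst := make([][]uint8, h)
	for y := range dst {
		dst[y] = make([]uint8, w)
	}
	abs := func(a, b uint8) int {
		if a > b {
			return int(a - b)
		}
		return int(b - a)
	}
	for y := 1; y < h-1; y++ {
		for x := 1; x < w-1; x++ {
			s := abs(src[y][x+1], src[y][x-1]) + abs(src[y+1][x], src[y-1][x])
			if s > 255 {
				s = 255 // saturate to the 8-bit range
			}
			dst[y][x] = uint8(s)
		}
	}
	return dst
}

func main() {
	src := [][]uint8{
		{0, 0, 0},
		{0, 10, 200},
		{0, 0, 0},
	}
	fmt.Println(absGradientSaturatedSum(src)[1][1]) // dx=|200-0|=200, dy=0 -> 200
}
```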
// AddFeatureDifference adds feature difference to common difference in sum.
// All images must have the same width, height and format (8-bit gray).
// For every point:
// excess = max(lo[i] - value[i], 0) + max(value[i] - hi[i], 0);
// difference[i] += (weight * excess*excess) >> 16;
// This function is used for difference estimation in algorithm of motion detection.
func AddFeatureDifference(value, lo, hi gocvsimd.View, weight uint16, difference gocvsimd.View) {
gocvsimd.SimdSse2AddFeatureDifference(value, lo, hi, uint64(weight), difference)
}
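The per-point update in the comment above (excess from the lo/hi band, then a Q16 fixed-point weight applied to excess squared) can be sketched scalar-wise; the saturation of the 8-bit accumulator is an assumption about the SIMD routine's behavior, stated here explicitly:

```go
package main

import "fmt"

// addFeatureDifference is a scalar reference for the update above:
// excess = max(lo-value, 0) + max(value-hi, 0); difference += (weight*excess²)>>16.
func addFeatureDifference(value, lo, hi uint8, weight uint16, difference uint8) uint8 {
	excess := 0
	if int(lo) > int(value) {
		excess += int(lo) - int(value)
	}
	if int(value) > int(hi) {
		excess += int(value) - int(hi)
	}
	d := int(difference) + (int(weight)*excess*excess)>>16
	if d > 255 {
		d = 255 // assumed: the 8-bit accumulator saturates
	}
	return uint8(d)
}

func main() {
	// value 50 lies 10 above hi=40: excess=10; weight 0x8000 is 0.5 in Q16.
	fmt.Println(addFeatureDifference(50, 0, 40, 0x8000, 0)) // (0x8000*100)>>16 = 50
}
```

Interpreting `weight` as a Q16 fraction (0x8000 ≈ 0.5) makes the `>> 16` in the formula a fixed-point multiply, which is why the difference grows quadratically with the excess.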
// SquaredDifferenceSum calculates sum of squared differences for two 8-bit gray images.
// All images must have the same width and height.
// For every point:
// sum += (a[i] - b[i])*(a[i] - b[i]);
func SquaredDifferenceSum(a, b gocvsimd.View) uint64 {
return gocvsimd.SimdSse2SquaredDifferenceSum(a, b)
}
// SquaredDifferenceSumMasked calculates sum of squared differences for two images with using mask.
// All images must have the same width, height and format (8-bit gray).
// For every point:
// if(mask[i] == index)
// sum += (a[i] - b[i])*(a[i] - b[i]);
func SquaredDifferenceSumMasked(a, b, mask gocvsimd.View, index uint8) uint64 {
return gocvsimd.SimdSse2SquaredDifferenceSumMasked(a, b, mask, uint64(index))
}

// Source file: correlation.go
package yamlpath
import (
"fmt"
"regexp"
"gopkg.in/yaml.v3"
)
type filter func(node, root *yaml.Node) bool
func newFilter(n *filterNode) filter {
if n == nil {
return never
}
switch n.lexeme.typ {
case lexemeFilterAt, lexemeRoot:
path := pathFilterScanner(n)
return func(node, root *yaml.Node) bool {
return len(path(node, root)) > 0
}
case lexemeFilterEquality, lexemeFilterInequality,
lexemeFilterGreaterThan, lexemeFilterGreaterThanOrEqual,
lexemeFilterLessThan, lexemeFilterLessThanOrEqual:
return comparisonFilter(n)
case lexemeFilterMatchesRegularExpression:
return matchRegularExpression(n)
case lexemeFilterNot:
f := newFilter(n.children[0])
return func(node, root *yaml.Node) bool {
return !f(node, root)
}
case lexemeFilterOr:
f1 := newFilter(n.children[0])
f2 := newFilter(n.children[1])
return func(node, root *yaml.Node) bool {
return f1(node, root) || f2(node, root)
}
case lexemeFilterAnd:
f1 := newFilter(n.children[0])
f2 := newFilter(n.children[1])
return func(node, root *yaml.Node) bool {
return f1(node, root) && f2(node, root)
}
default:
return never
}
}
func never(node, root *yaml.Node) bool {
return false
}
func comparisonFilter(n *filterNode) filter {
return nodeToFilter(n, func(l, r string) bool {
return n.lexeme.comparator()(compareNodeValues(l, r))
})
}
func nodeToFilter(n *filterNode, accept func(string, string) bool) filter {
lhsPath := newFilterScanner(n.children[0])
rhsPath := newFilterScanner(n.children[1])
return func(node, root *yaml.Node) (result bool) {
match := false
for _, l := range lhsPath(node, root) {
for _, r := range rhsPath(node, root) {
if !accept(l, r) {
return false
}
match = true
}
}
return match
}
}
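nodeToFilter encodes a "for all pairs, and at least one pair" rule: the filter fails fast on the first rejected (l, r) pair and holds only if some pair was actually compared. That semantics can be isolated on plain string slices:

```go
package main

import "fmt"

// allPairsNonEmpty mirrors nodeToFilter: every (l, r) pair must satisfy
// accept, and at least one pair must exist for the filter to hold.
func allPairsNonEmpty(lhs, rhs []string, accept func(l, r string) bool) bool {
	match := false
	for _, l := range lhs {
		for _, r := range rhs {
			if !accept(l, r) {
				return false
			}
			match = true
		}
	}
	return match
}

func main() {
	eq := func(l, r string) bool { return l == r }
	fmt.Println(allPairsNonEmpty([]string{"a"}, []string{"a"}, eq))      // true
	fmt.Println(allPairsNonEmpty([]string{"a"}, []string{"a", "b"}, eq)) // false: one pair fails
	fmt.Println(allPairsNonEmpty([]string{"a"}, nil, eq))                // false: no pairs
}
```

The empty-rhs case is why a path expression that matches nothing makes the whole comparison filter false rather than vacuously true.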
type filterScanner func(*yaml.Node, *yaml.Node) []string
func emptyScanner(*yaml.Node, *yaml.Node) []string {
return []string{}
}
func newFilterScanner(n *filterNode) filterScanner {
switch {
case n == nil:
return emptyScanner
case n.isItemFilter():
return pathFilterScanner(n)
case n.isLiteral():
return literalFilterScanner(n)
default:
return emptyScanner
}
}
func pathFilterScanner(n *filterNode) filterScanner {
var at bool
switch n.lexeme.typ {
case lexemeFilterAt:
at = true
case lexemeRoot:
at = false
default:
panic("false precondition")
}
subpath := ""
for _, lexeme := range n.subpath {
subpath += lexeme.val
}
path, err := NewPath(subpath)
if err != nil {
return func(node, root *yaml.Node) []string {
return []string{}
}
}
return func(node, root *yaml.Node) []string {
if at {
return values(path.Find(node))
}
return values(path.Find(root))
}
}
func values(nodes []*yaml.Node, err error) []string {
if err != nil {
panic(fmt.Errorf("unexpected error: %v", err)) // should never happen
}
v := []string{}
for _, n := range nodes {
v = append(v, n.Value)
}
return v
}
func literalFilterScanner(n *filterNode) filterScanner {
v := n.lexeme.literalValue()
return func(node, root *yaml.Node) []string {
return []string{v}
}
}
func matchRegularExpression(parseTree *filterNode) filter {
return nodeToFilter(parseTree, stringMatchesRegularExpression)
}
func stringMatchesRegularExpression(s, expr string) bool {
re, _ := regexp.Compile(expr) // regex already compiled during lexing
return re.Match([]byte(s))
}

// Source file: pkg/yamlpath/filter.go
package rlwe
import (
"github.com/tuneinsight/lattigo/v3/ring"
"github.com/tuneinsight/lattigo/v3/utils"
)
// PolyQP represents a polynomial in the ring of polynomials modulo Q*P.
// This type is simply the union type between two ring.Poly, each one
// containing the modulus Q and modulus P coefficients of that polynomial.
// The modulus Q represents the ciphertext modulus and the modulus P
// the special primes for the RNS decomposition during homomorphic
// operations involving keys.
type PolyQP struct {
Q, P *ring.Poly
}
// Equals returns true if the receiver PolyQP is equal to the provided other PolyQP.
// This method checks for equality of its two sub-polynomials.
func (p *PolyQP) Equals(other PolyQP) (v bool) {
if p == &other {
return true
}
v = true
if p.Q != nil {
v = p.Q.Equals(other.Q)
}
if p.P != nil {
v = v && p.P.Equals(other.P)
}
return v
}
// CopyValues copies the coefficients of p1 on the target polynomial.
// This method simply calls the CopyValues method for each of its sub-polynomials.
func (p *PolyQP) CopyValues(other PolyQP) {
if p.Q != nil {
p.Q.CopyValues(other.Q)
}
if p.P != nil {
p.P.CopyValues(other.P)
}
}
// CopyNew creates an exact copy of the target polynomial.
func (p *PolyQP) CopyNew() PolyQP {
if p == nil {
return PolyQP{}
}
var Q, P *ring.Poly
if p.Q != nil {
Q = p.Q.CopyNew()
}
if p.P != nil {
P = p.P.CopyNew()
}
return PolyQP{Q, P}
}
// RingQP is a structure that implements the operation in the ring R_QP.
// This type is simply a union type between the two Ring types representing
// R_Q and R_P.
type RingQP struct {
RingQ, RingP *ring.Ring
}
// NewPoly creates a new polynomial with all coefficients set to 0.
func (r *RingQP) NewPoly() PolyQP {
var Q, P *ring.Poly
if r.RingQ != nil {
Q = r.RingQ.NewPoly()
}
if r.RingP != nil {
P = r.RingP.NewPoly()
}
return PolyQP{Q, P}
}
// NewPolyLvl creates a new polynomial at the given levels, with all coefficients set to 0.
func (r *RingQP) NewPolyLvl(levelQ, levelP int) PolyQP {
var Q, P *ring.Poly
if r.RingQ != nil {
Q = r.RingQ.NewPolyLvl(levelQ)
}
if r.RingP != nil {
P = r.RingP.NewPolyLvl(levelP)
}
return PolyQP{Q, P}
}
// AddLvl adds p1 to p2 coefficient-wise and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) AddLvl(levelQ, levelP int, p1, p2, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.AddLvl(levelQ, p1.Q, p2.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.AddLvl(levelP, p1.P, p2.P, pOut.P)
}
}
// AddNoModLvl adds p1 to p2 coefficient-wise and writes the result on pOut without modular reduction.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) AddNoModLvl(levelQ, levelP int, p1, p2, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.AddNoModLvl(levelQ, p1.Q, p2.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.AddNoModLvl(levelP, p1.P, p2.P, pOut.P)
}
}
// SubLvl subtracts p2 from p1 coefficient-wise and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) SubLvl(levelQ, levelP int, p1, p2, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.SubLvl(levelQ, p1.Q, p2.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.SubLvl(levelP, p1.P, p2.P, pOut.P)
}
}
// NTTLvl computes the NTT of p and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) NTTLvl(levelQ, levelP int, p, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.NTTLvl(levelQ, p.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.NTTLvl(levelP, p.P, pOut.P)
}
}
// InvNTTLvl computes the inverse NTT of p and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) InvNTTLvl(levelQ, levelP int, p, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.InvNTTLvl(levelQ, p.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.InvNTTLvl(levelP, p.P, pOut.P)
}
}
// NTTLazyLvl computes the NTT of p and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
// Output values are in the range [0, 2q-1].
func (r *RingQP) NTTLazyLvl(levelQ, levelP int, p, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.NTTLazyLvl(levelQ, p.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.NTTLazyLvl(levelP, p.P, pOut.P)
}
}
// MFormLvl switches p to the Montgomery domain and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) MFormLvl(levelQ, levelP int, p, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.MFormLvl(levelQ, p.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.MFormLvl(levelP, p.P, pOut.P)
}
}
// InvMFormLvl switches p back from the Montgomery domain to the conventional domain and writes the result on pOut.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) InvMFormLvl(levelQ, levelP int, p, pOut PolyQP) {
if r.RingQ != nil {
r.RingQ.InvMFormLvl(levelQ, p.Q, pOut.Q)
}
if r.RingP != nil {
r.RingP.InvMFormLvl(levelP, p.P, pOut.P)
}
}
// MulCoeffsMontgomeryLvl multiplies p1 by p2 coefficient-wise with a Montgomery modular reduction.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) MulCoeffsMontgomeryLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryLvl(levelP, p1.P, p2.P, p3.P)
}
}
// MulCoeffsMontgomeryConstantLvl multiplies p1 by p2 coefficient-wise with a constant-time Montgomery modular reduction.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
// Result is within [0, 2q-1].
func (r *RingQP) MulCoeffsMontgomeryConstantLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryConstantLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryConstantLvl(levelP, p1.P, p2.P, p3.P)
}
}
// MulCoeffsMontgomeryConstantAndAddNoModLvl multiplies p1 by p2 coefficient-wise with a
// constant-time Montgomery modular reduction and adds the result to p3 without modular reduction.
// Result is within [0, 2q-1].
func (r *RingQP) MulCoeffsMontgomeryConstantAndAddNoModLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryConstantAndAddNoModLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryConstantAndAddNoModLvl(levelP, p1.P, p2.P, p3.P)
}
}
// MulCoeffsMontgomeryAndSubLvl multiplies p1 by p2 coefficient-wise with
// a Montgomery modular reduction and subtracts the result from p3.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) MulCoeffsMontgomeryAndSubLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryAndSubLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryAndSubLvl(levelP, p1.P, p2.P, p3.P)
}
}
// MulCoeffsMontgomeryConstantAndSubNoModLvl multiplies p1 by p2 coefficient-wise with a
// constant-time Montgomery modular reduction and subtracts the result from p3 without modular reduction.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) MulCoeffsMontgomeryConstantAndSubNoModLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryConstantAndSubNoModLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryConstantAndSubNoModLvl(levelP, p1.P, p2.P, p3.P)
}
}
// MulCoeffsMontgomeryAndAddLvl multiplies p1 by p2 coefficient-wise with a
// Montgomery modular reduction and adds the result to p3.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) MulCoeffsMontgomeryAndAddLvl(levelQ, levelP int, p1, p2, p3 PolyQP) {
if r.RingQ != nil {
r.RingQ.MulCoeffsMontgomeryAndAddLvl(levelQ, p1.Q, p2.Q, p3.Q)
}
if r.RingP != nil {
r.RingP.MulCoeffsMontgomeryAndAddLvl(levelP, p1.P, p2.P, p3.P)
}
}
// PermuteNTTWithIndexLvl applies the automorphism X^{5^j} on p1 and writes the result on p2.
// The index of the automorphism must be provided.
// The method is not in-place.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) PermuteNTTWithIndexLvl(levelQ, levelP int, p1 PolyQP, index []uint64, p2 PolyQP) {
if r.RingQ != nil {
r.RingQ.PermuteNTTWithIndexLvl(levelQ, p1.Q, index, p2.Q)
}
if r.RingP != nil {
r.RingP.PermuteNTTWithIndexLvl(levelP, p1.P, index, p2.P)
}
}
// PermuteNTTWithIndexAndAddNoModLvl applies the automorphism X^{5^j} on p1 and adds the result to p2.
// The index of the automorphism must be provided.
// The method is not in-place.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) PermuteNTTWithIndexAndAddNoModLvl(levelQ, levelP int, p1 PolyQP, index []uint64, p2 PolyQP) {
if r.RingQ != nil {
r.RingQ.PermuteNTTWithIndexAndAddNoModLvl(levelQ, p1.Q, index, p2.Q)
}
if r.RingP != nil {
r.RingP.PermuteNTTWithIndexAndAddNoModLvl(levelP, p1.P, index, p2.P)
}
}
// CopyValuesLvl copies the values of p1 on p2.
// The operation is performed at levelQ for the ringQ and levelP for the ringP.
func (r *RingQP) CopyValuesLvl(levelQ, levelP int, p1, p2 PolyQP) {
if r.RingQ != nil {
ring.CopyValuesLvl(levelQ, p1.Q, p2.Q)
}
if r.RingP != nil {
ring.CopyValuesLvl(levelP, p1.P, p2.P)
}
}
// ExtendBasisSmallNormAndCenter extends a small-norm polynomial polyInQ in R_Q to a
// polynomial in R_QP, writing the Q part to polyOutQ (if non-nil) and the P part to polyOutP.
func (r *RingQP) ExtendBasisSmallNormAndCenter(polyInQ *ring.Poly, levelP int, polyOutQ, polyOutP *ring.Poly) {
var coeff, Q, QHalf, sign uint64
Q = r.RingQ.Modulus[0]
QHalf = Q >> 1
if polyInQ != polyOutQ && polyOutQ != nil {
polyOutQ.Copy(polyInQ)
}
for j := 0; j < r.RingQ.N; j++ {
coeff = polyInQ.Coeffs[0][j]
sign = 1
if coeff > QHalf {
coeff = Q - coeff
sign = 0
}
for i, pi := range r.RingP.Modulus[:levelP+1] {
polyOutP.Coeffs[i][j] = (coeff * sign) | (pi-coeff)*(sign^1)
}
}
}
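The inner loop of ExtendBasisSmallNormAndCenter selects between `coeff` and `pi-coeff` without a data-dependent branch, using a 0/1 multiplier combined with OR. A self-contained sketch of that select idiom:

```go
package main

import "fmt"

// branchlessSelect returns a when sign == 1 and b when sign == 0, using the
// multiply-and-or trick from ExtendBasisSmallNormAndCenter: exactly one of
// the two products is nonzero.
func branchlessSelect(a, b, sign uint64) uint64 {
	return a*sign | b*(sign^1)
}

func main() {
	fmt.Println(branchlessSelect(7, 42, 1)) // 7
	fmt.Println(branchlessSelect(7, 42, 0)) // 42
}
```

Avoiding the branch keeps the coefficient extension free of secret-dependent control flow.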
// Copy copies the input polyQP on the target polyQP.
func (p *PolyQP) Copy(polFrom PolyQP) {
if polFrom.Q != nil {
p.Q.Copy(polFrom.Q)
}
if polFrom.P != nil {
p.P.Copy(polFrom.P)
}
}
// GetDataLen returns the length in bytes of the target PolyQP.
func (p *PolyQP) GetDataLen(WithMetadata bool) (dataLen int) {
if WithMetadata {
dataLen = 2
}
if p.Q != nil {
dataLen += p.Q.GetDataLen(WithMetadata)
}
if p.P != nil {
dataLen += p.P.GetDataLen(WithMetadata)
}
return
}
// WriteTo writes a polyQP on the input data slice.
func (p *PolyQP) WriteTo(data []byte) (pt int, err error) {
var inc int
if p.Q != nil {
data[0] = 1
}
if p.P != nil {
data[1] = 1
}
pt = 2
if data[0] == 1 {
if inc, err = p.Q.WriteTo(data[pt:]); err != nil {
return
}
pt += inc
}
if data[1] == 1 {
if inc, err = p.P.WriteTo(data[pt:]); err != nil {
return
}
pt += inc
}
return
}
// DecodePolyNew decodes the input bytes into the target polyQP.
func (p *PolyQP) DecodePolyNew(data []byte) (pt int, err error) {
var inc int
pt = 2
if data[0] == 1 {
p.Q = new(ring.Poly)
if inc, err = p.Q.DecodePolyNew(data[pt:]); err != nil {
return
}
pt += inc
}
if data[1] == 1 {
p.P = new(ring.Poly)
if inc, err = p.P.DecodePolyNew(data[pt:]); err != nil {
return
}
pt += inc
}
return
}
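WriteTo and DecodePolyNew share a tiny wire format: two leading presence bytes flag whether the Q and P parts follow. A toy round-trip of the same presence-flag scheme, with short length-prefixed byte payloads standing in for serialized ring.Poly data (an illustration, not the real encoding):

```go
package main

import "fmt"

// encode serializes two optional payloads behind a 2-byte presence header,
// mirroring PolyQP.WriteTo. Payloads are assumed shorter than 256 bytes.
func encode(q, p []byte) []byte {
	out := []byte{0, 0}
	if q != nil {
		out[0] = 1
		out = append(out, byte(len(q)))
		out = append(out, q...)
	}
	if p != nil {
		out[1] = 1
		out = append(out, byte(len(p)))
		out = append(out, p...)
	}
	return out
}

// decode reads the presence header and recovers whichever payloads are present,
// mirroring PolyQP.DecodePolyNew.
func decode(data []byte) (q, p []byte) {
	pt := 2
	if data[0] == 1 {
		n := int(data[pt])
		q = data[pt+1 : pt+1+n]
		pt += 1 + n
	}
	if data[1] == 1 {
		n := int(data[pt])
		p = data[pt+1 : pt+1+n]
	}
	return
}

func main() {
	q, p := decode(encode([]byte("QQ"), nil))
	fmt.Println(string(q), p == nil) // QQ true
}
```

The two flag bytes are what let a PolyQP with only a Q part (or only a P part) round-trip safely.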
// UniformSamplerQP is a type for sampling polynomials in RingQP.
type UniformSamplerQP struct {
samplerQ, samplerP *ring.UniformSampler
}
// NewUniformSamplerQP instantiates a new UniformSamplerQP from a given PRNG.
func NewUniformSamplerQP(params Parameters, prng utils.PRNG) (s UniformSamplerQP) {
if params.RingQ() != nil {
s.samplerQ = ring.NewUniformSampler(prng, params.RingQ())
}
if params.RingP() != nil {
s.samplerP = ring.NewUniformSampler(prng, params.RingP())
}
return s
}
// Read samples a new polynomial in RingQP and stores it into p.
func (s UniformSamplerQP) Read(p *PolyQP) {
if p.Q != nil && s.samplerQ != nil {
s.samplerQ.Read(p.Q)
}
if p.P != nil && s.samplerP != nil {
s.samplerP.Read(p.P)
}
}
| rlwe/ring_qp.go | 0.772874 | 0.590602 | ring_qp.go | starcoder |
package iso20022
// Completion of a securities settlement instruction, wherein securities are delivered/debited from a securities account and received/credited to the designated securities account.
type Transfer26 struct {
// Unique and unambiguous identifier for a transfer execution, as assigned by a confirming party.
TransferConfirmationReference *Max35Text `xml:"TrfConfRef"`
// Unique and unambiguous identifier for a transfer instruction, as assigned by the instructing party.
TransferReference *Max35Text `xml:"TrfRef"`
// Unique and unambiguous investor's identification of a transfer. This reference can typically be used in a hub scenario to give the reference of the transfer as assigned by the underlying client.
ClientReference *Max35Text `xml:"ClntRef,omitempty"`
// Unambiguous identification of the transfer allocated by the counterparty.
CounterpartyReference *AdditionalReference2 `xml:"CtrPtyRef,omitempty"`
// Date and time at which the transfer was executed.
EffectiveTransferDate *DateAndDateTimeChoice `xml:"FctvTrfDt"`
// Date and time at which the securities are to be exchanged at the International Central Securities Depository (ICSD) or Central Securities Depository (CSD).
RequestedSettlementDate *ISODate `xml:"ReqdSttlmDt,omitempty"`
// Date and time at which the securities were exchanged at the International Central Securities Depository (ICSD) or Central Securities Depository (CSD).
EffectiveSettlementDate *DateAndDateTimeChoice `xml:"FctvSttlmDt,omitempty"`
// Date and time at which a transaction is completed and cleared, i.e., securities are delivered.
TradeDate *DateAndDateTimeChoice `xml:"TradDt,omitempty"`
// Identifies the date on which the investor signed the transfer order form.
TransferOrderDateForm *ISODate `xml:"TrfOrdrDtForm,omitempty"`
// Specifies the reason for the assets transfer.
TransferReason *TransferReason1 `xml:"TrfRsn,omitempty"`
// Identifies whether or not savings plans, withdrawal plans, or switch plans are included in the holdings.
HoldingsPlanType []*HoldingsPlanType1Code `xml:"HldgsPlanTp,omitempty"`
// Information related to the financial instrument received.
FinancialInstrumentDetails *FinancialInstrument13 `xml:"FinInstrmDtls"`
// Total quantity of securities settled.
TotalUnitsNumber *FinancialInstrumentQuantity1 `xml:"TtlUnitsNb"`
// Information about the units to be transferred.
UnitsDetails []*Unit3 `xml:"UnitsDtls,omitempty"`
// Value of a security, as booked in an account. Book value is often different from the current market value of the security.
AveragePrice *ActiveOrHistoricCurrencyAnd13DecimalAmount `xml:"AvrgPric,omitempty"`
// Identifies the currency to be used to transfer the holdings.
TransferCurrency *CurrencyCode `xml:"TrfCcy,omitempty"`
// Indicates whether the transfer results in a change of beneficial owner.
OwnAccountTransferIndicator *YesNoIndicator `xml:"OwnAcctTrfInd,omitempty"`
// Additional specific settlement information for non-regulated traded funds.
NonStandardSettlementInformation *Max350Text `xml:"NonStdSttlmInf,omitempty"`
// Party that receives securities from the delivering agent via the place of settlement, for example, securities central depository.
ReceivingAgentDetails *PartyIdentificationAndAccount93 `xml:"RcvgAgtDtls,omitempty"`
// Party that delivers securities to the receiving agent at the place of settlement, for example, a central securities depository.
DeliveringAgentDetails *PartyIdentificationAndAccount93 `xml:"DlvrgAgtDtls,omitempty"`
}
func (t *Transfer26) SetTransferConfirmationReference(value string) {
t.TransferConfirmationReference = (*Max35Text)(&value)
}
func (t *Transfer26) SetTransferReference(value string) {
t.TransferReference = (*Max35Text)(&value)
}
func (t *Transfer26) SetClientReference(value string) {
t.ClientReference = (*Max35Text)(&value)
}
func (t *Transfer26) AddCounterpartyReference() *AdditionalReference2 {
t.CounterpartyReference = new(AdditionalReference2)
return t.CounterpartyReference
}
func (t *Transfer26) AddEffectiveTransferDate() *DateAndDateTimeChoice {
t.EffectiveTransferDate = new(DateAndDateTimeChoice)
return t.EffectiveTransferDate
}
func (t *Transfer26) SetRequestedSettlementDate(value string) {
t.RequestedSettlementDate = (*ISODate)(&value)
}
func (t *Transfer26) AddEffectiveSettlementDate() *DateAndDateTimeChoice {
t.EffectiveSettlementDate = new(DateAndDateTimeChoice)
return t.EffectiveSettlementDate
}
func (t *Transfer26) AddTradeDate() *DateAndDateTimeChoice {
t.TradeDate = new(DateAndDateTimeChoice)
return t.TradeDate
}
func (t *Transfer26) SetTransferOrderDateForm(value string) {
t.TransferOrderDateForm = (*ISODate)(&value)
}
func (t *Transfer26) AddTransferReason() *TransferReason1 {
t.TransferReason = new(TransferReason1)
return t.TransferReason
}
func (t *Transfer26) AddHoldingsPlanType(value string) {
t.HoldingsPlanType = append(t.HoldingsPlanType, (*HoldingsPlanType1Code)(&value))
}
func (t *Transfer26) AddFinancialInstrumentDetails() *FinancialInstrument13 {
t.FinancialInstrumentDetails = new(FinancialInstrument13)
return t.FinancialInstrumentDetails
}
func (t *Transfer26) AddTotalUnitsNumber() *FinancialInstrumentQuantity1 {
t.TotalUnitsNumber = new(FinancialInstrumentQuantity1)
return t.TotalUnitsNumber
}
func (t *Transfer26) AddUnitsDetails() *Unit3 {
newValue := new(Unit3)
t.UnitsDetails = append(t.UnitsDetails, newValue)
return newValue
}
func (t *Transfer26) SetAveragePrice(value, currency string) {
t.AveragePrice = NewActiveOrHistoricCurrencyAnd13DecimalAmount(value, currency)
}
func (t *Transfer26) SetTransferCurrency(value string) {
t.TransferCurrency = (*CurrencyCode)(&value)
}
func (t *Transfer26) SetOwnAccountTransferIndicator(value string) {
t.OwnAccountTransferIndicator = (*YesNoIndicator)(&value)
}
func (t *Transfer26) SetNonStandardSettlementInformation(value string) {
t.NonStandardSettlementInformation = (*Max350Text)(&value)
}
func (t *Transfer26) AddReceivingAgentDetails() *PartyIdentificationAndAccount93 {
t.ReceivingAgentDetails = new(PartyIdentificationAndAccount93)
return t.ReceivingAgentDetails
}
func (t *Transfer26) AddDeliveringAgentDetails() *PartyIdentificationAndAccount93 {
t.DeliveringAgentDetails = new(PartyIdentificationAndAccount93)
return t.DeliveringAgentDetails
}
| Transfer26.go | 0.791942 | 0.42471 | Transfer26.go | starcoder |
package bulletproof
import (
crand "crypto/rand"
"github.com/gtank/merlin"
"github.com/pkg/errors"
"github.com/coinbase/kryptology/pkg/core/curves"
)
// BatchProve proves that a list of scalars v are in the range n.
// It implements the aggregating logarithmic proofs defined on pg21.
// Instead of taking a single value and a single blinding factor, BatchProve takes in a list of values and list of
// blinding factors.
func (prover *RangeProver) BatchProve(v, gamma []curves.Scalar, n int, proofGenerators RangeProofGenerators, transcript *merlin.Transcript) (*RangeProof, error) {
// Define nm as the total bits required for secrets, calculated as number of secrets * n
m := len(v)
nm := n * m
// nm must be less than or equal to the number of generators generated
if nm > len(prover.generators.G) {
return nil, errors.New("ipp vector length must be less than or equal to maxVectorLength")
}
// In case where nm is less than number of generators precomputed by prover, trim to length
proofG := prover.generators.G[0:nm]
proofH := prover.generators.H[0:nm]
// Check that each elem in v is in range [0, 2^n]
for _, vi := range v {
checkedRange := checkRange(vi, n)
if checkedRange != nil {
return nil, checkedRange
}
}
// L40 on pg19
aL, err := getaLBatched(v, n, prover.curve)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
onenm := get1nVector(nm, prover.curve)
// L41 on pg19
aR, err := subtractPairwiseScalarVectors(aL, onenm)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
alpha := prover.curve.Scalar.Random(crand.Reader)
// Calc A (L44, pg19)
halpha := proofGenerators.h.Mul(alpha)
gaL := prover.curve.Point.SumOfProducts(proofG, aL)
haR := prover.curve.Point.SumOfProducts(proofH, aR)
capA := halpha.Add(gaL).Add(haR)
// L45, 46, pg19
sL := getBlindingVector(nm, prover.curve)
sR := getBlindingVector(nm, prover.curve)
rho := prover.curve.Scalar.Random(crand.Reader)
// Calc S (L47, pg19)
hrho := proofGenerators.h.Mul(rho)
gsL := prover.curve.Point.SumOfProducts(proofG, sL)
hsR := prover.curve.Point.SumOfProducts(proofH, sR)
capS := hrho.Add(gsL).Add(hsR)
// Fiat Shamir for y,z (L49, pg19)
capV := getcapVBatched(v, gamma, proofGenerators.g, proofGenerators.h)
y, z, err := calcyzBatched(capV, capA, capS, transcript, prover.curve)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// Calc t_1, t_2
// See the l(X), r(X), equations on pg 21
// Use l(X)'s and r(X)'s constant and linear terms to derive t_1 and t_2
// (a_l - z*1^n)
zonenm := multiplyScalarToScalarVector(z, onenm)
constantTerml, err := subtractPairwiseScalarVectors(aL, zonenm)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
linearTerml := sL
// zSum term, see equation 71 on pg21
zSum := getSumTermrXBatched(z, n, len(v), prover.curve)
// a_r + z*1^nm
aRPluszonenm, err := addPairwiseScalarVectors(aR, zonenm)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
ynm := getknVector(y, nm, prover.curve)
hadamard, err := multiplyPairwiseScalarVectors(ynm, aRPluszonenm)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
constantTermr, err := addPairwiseScalarVectors(hadamard, zSum)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
linearTermr, err := multiplyPairwiseScalarVectors(ynm, sR)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// t_1 (as the linear coefficient) is the sum of the dot products of l(X)'s linear term dot r(X)'s constant term
// and r(X)'s linear term dot l(X)'s constant term
t1FirstTerm, err := innerProduct(linearTerml, constantTermr)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
t1SecondTerm, err := innerProduct(linearTermr, constantTerml)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
t1 := t1FirstTerm.Add(t1SecondTerm)
// t_2 (as the quadratic coefficient) is the dot product of l(X)'s and r(X)'s linear terms
t2, err := innerProduct(linearTerml, linearTermr)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// L52, pg20
tau1 := prover.curve.Scalar.Random(crand.Reader)
tau2 := prover.curve.Scalar.Random(crand.Reader)
// T_1, T_2 (L53, pg20)
capT1 := proofGenerators.g.Mul(t1).Add(proofGenerators.h.Mul(tau1))
capT2 := proofGenerators.g.Mul(t2).Add(proofGenerators.h.Mul(tau2))
// Fiat shamir for x (L55, pg20)
x, err := calcx(capT1, capT2, transcript, prover.curve)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// Calc l
// Instead of using the expression in the line, evaluate l() at x
sLx := multiplyScalarToScalarVector(x, linearTerml)
l, err := addPairwiseScalarVectors(constantTerml, sLx)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// Calc r
// Instead of using the expression in the line, evaluate r() at x
ynsRx := multiplyScalarToScalarVector(x, linearTermr)
r, err := addPairwiseScalarVectors(constantTermr, ynsRx)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// Calc t hat
// For efficiency, instead of calculating the dot product, evaluate t() at x
zm := getknVector(z, m, prover.curve)
zsquarezm := multiplyScalarToScalarVector(z.Square(), zm)
sumv := prover.curve.Scalar.Zero()
for i := 0; i < m; i++ {
elem := zsquarezm[i].Mul(v[i])
sumv = sumv.Add(elem)
}
deltayzBatched, err := deltayzBatched(y, z, n, m, prover.curve)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
t0 := sumv.Add(deltayzBatched)
tLinear := t1.Mul(x)
tQuadratic := t2.Mul(x.Square())
tHat := t0.Add(tLinear).Add(tQuadratic)
// Calc tau_x (L61, pg20)
tau2xsquare := tau2.Mul(x.Square())
tau1x := tau1.Mul(x)
zsum := prover.curve.Scalar.Zero()
zExp := z.Clone()
for j := 1; j < m+1; j++ {
zExp = zExp.Mul(z)
zsum = zsum.Add(zExp.Mul(gamma[j-1]))
}
taux := tau2xsquare.Add(tau1x).Add(zsum)
// Calc mu (L62, pg20)
mu := alpha.Add(rho.Mul(x))
// Calc IPP (See section 4.2)
hPrime, err := gethPrime(proofH, y, prover.curve)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
// P is redefined in batched case, see bottom equation on pg21.
capPhmu := getPhmuBatched(proofG, hPrime, proofGenerators.h, capA, capS, x, y, z, mu, n, m, prover.curve)
wBytes := transcript.ExtractBytes([]byte("getw"), 64)
w, err := prover.curve.NewScalar().SetBytesWide(wBytes)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
ipp, err := prover.ippProver.rangeToIPP(proofG, hPrime, l, r, tHat, capPhmu, proofGenerators.u.Mul(w), transcript)
if err != nil {
return nil, errors.Wrap(err, "rangeproof prove")
}
out := &RangeProof{
capA: capA,
capS: capS,
capT1: capT1,
capT2: capT2,
taux: taux,
mu: mu,
tHat: tHat,
ipp: ipp,
curve: &prover.curve,
}
return out, nil
}
// See final term of L71 on pg 21
// Sigma_{j=1}^{m} z^{1+j} * (0^{(j-1)*n} || 2^{n} || 0^{(m-j)*n}).
func getSumTermrXBatched(z curves.Scalar, n, m int, curve curves.Curve) []curves.Scalar {
twoN := get2nVector(n, curve)
var out []curves.Scalar
// The final power should be one more than m
zExp := z.Clone()
for j := 0; j < m; j++ {
zExp = zExp.Mul(z)
elem := multiplyScalarToScalarVector(zExp, twoN)
out = append(out, elem...)
}
return out
}
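The vector built above can be checked with small numbers. A toy version over Z_q (the modulus 101 is an illustrative assumption; the real code works in the curve's scalar field):

```go
package main

import "fmt"

const q = 101 // toy modulus (an assumption; the real code works in the curve's scalar field)

// sumTermVector builds Sigma_{j=1}^{m} z^{1+j} * (0^{(j-1)*n} || 2^n || 0^{(m-j)*n})
// over Z_q, mirroring getSumTermrXBatched: block j carries z^{1+j} times the powers of two.
func sumTermVector(z uint64, n, m int) []uint64 {
	out := make([]uint64, 0, n*m)
	zExp := z % q
	for j := 0; j < m; j++ {
		zExp = zExp * z % q // z^{1+(j+1)} for this block
		pow2 := uint64(1)
		for i := 0; i < n; i++ {
			out = append(out, zExp*pow2%q)
			pow2 = pow2 * 2 % q
		}
	}
	return out
}

func main() {
	fmt.Println(sumTermVector(3, 2, 2)) // [9 18 27 54], i.e. z^2*(1,2) || z^3*(1,2)
}
```

Each secret's n-bit block gets its own power of z, which is what lets m range proofs share a single inner-product argument.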
func getcapVBatched(v, gamma []curves.Scalar, g, h curves.Point) []curves.Point {
out := make([]curves.Point, len(v))
for i, vi := range v {
out[i] = getcapV(vi, gamma[i], g, h)
}
return out
}
func getaLBatched(v []curves.Scalar, n int, curve curves.Curve) ([]curves.Scalar, error) {
var aL []curves.Scalar
for _, vi := range v {
aLi, err := getaL(vi, n, curve)
if err != nil {
return nil, err
}
aL = append(aL, aLi...)
}
return aL, nil
}
func calcyzBatched(capV []curves.Point, capA, capS curves.Point, transcript *merlin.Transcript, curve curves.Curve) (curves.Scalar, curves.Scalar, error) {
// Add the A,S values to transcript
for _, capVi := range capV {
transcript.AppendMessage([]byte("addV"), capVi.ToAffineUncompressed())
}
transcript.AppendMessage([]byte("addcapA"), capA.ToAffineUncompressed())
transcript.AppendMessage([]byte("addcapS"), capS.ToAffineUncompressed())
// Read 64 bytes twice from the transcript and map them to the scalars y and z
yBytes := transcript.ExtractBytes([]byte("gety"), 64)
y, err := curve.NewScalar().SetBytesWide(yBytes)
if err != nil {
return nil, nil, errors.Wrap(err, "calcyz NewScalar SetBytesWide")
}
zBytes := transcript.ExtractBytes([]byte("getz"), 64)
z, err := curve.NewScalar().SetBytesWide(zBytes)
if err != nil {
return nil, nil, errors.Wrap(err, "calcyz NewScalar SetBytesWide")
}
return y, z, nil
}
func deltayzBatched(y, z curves.Scalar, n, m int, curve curves.Curve) (curves.Scalar, error) {
// z - z^2
zMinuszsquare := z.Sub(z.Square())
// 1^(n*m)
onenm := get1nVector(n*m, curve)
// <1^nm, y^nm>
onenmdotynm, err := innerProduct(onenm, getknVector(y, n*m, curve))
if err != nil {
return nil, errors.Wrap(err, "deltayz")
}
// (z - z^2)*<1^n, y^n>
termFirst := zMinuszsquare.Mul(onenmdotynm)
// <1^n, 2^n>
onendottwon, err := innerProduct(get1nVector(n, curve), get2nVector(n, curve))
if err != nil {
return nil, errors.Wrap(err, "deltayz")
}
termSecond := curve.Scalar.Zero()
zExp := z.Square()
for j := 1; j < m+1; j++ {
zExp = zExp.Mul(z)
elem := zExp.Mul(onendottwon)
termSecond = termSecond.Add(elem)
}
// (z - z^2)*<1^n, y^n> - z^3*<1^n, 2^n>
out := termFirst.Sub(termSecond)
return out, nil
}
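The closed form (z - z^2)*&lt;1^{nm}, y^{nm}&gt; - Sigma_{j=1}^{m} z^{j+2}*&lt;1^n, 2^n&gt; can be sanity-checked with small integers. A toy version over Z_q (the modulus 101 is an illustrative assumption):

```go
package main

import "fmt"

const q = 101 // toy modulus (an assumption; the real code works in the curve's scalar field)

// deltaYZ computes (z - z^2)*<1^{nm}, y^{nm}> - Sigma_{j=1}^{m} z^{j+2}*<1^n, 2^n>
// over Z_q, mirroring deltayzBatched.
func deltaYZ(y, z uint64, n, m int) uint64 {
	// <1^{nm}, y^{nm}> = 1 + y + ... + y^{nm-1}
	sumY, yExp := uint64(0), uint64(1)
	for i := 0; i < n*m; i++ {
		sumY = (sumY + yExp) % q
		yExp = yExp * y % q
	}
	sum2 := uint64(1)<<n - 1 // <1^n, 2^n> = 2^n - 1
	first := (z + q - z*z%q) % q * sumY % q
	second, zExp := uint64(0), z*z%q
	for j := 1; j <= m; j++ {
		zExp = zExp * z % q
		second = (second + zExp*sum2%q) % q
	}
	return (first + q - second) % q
}

func main() {
	fmt.Println(deltaYZ(2, 3, 2, 1)) // 2
}
```

With y=2, z=3, n=2, m=1: (3-9)*(1+2) - 27*3 = -18 - 81 ≡ 2 (mod 101), matching the function.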
// Bottom equation on pg21.
func getPhmuBatched(proofG, proofHPrime []curves.Point, h, capA, capS curves.Point, x, y, z, mu curves.Scalar, n, m int, curve curves.Curve) curves.Point {
twoN := get2nVector(n, curve)
// h'^(z*y^n + z^2*2^n)
lastElem := curve.NewIdentityPoint()
zExp := z.Clone()
for j := 1; j < m+1; j++ {
// Get subvector of h
hSubvector := proofHPrime[(j-1)*n : j*n]
// z^(j+1)
zExp = zExp.Mul(z)
exp := multiplyScalarToScalarVector(zExp, twoN)
// Final elem
elem := curve.Point.SumOfProducts(hSubvector, exp)
lastElem = lastElem.Add(elem)
}
zynm := multiplyScalarToScalarVector(z, getknVector(y, n*m, curve))
hPrimezynm := curve.Point.SumOfProducts(proofHPrime, zynm)
lastElem = lastElem.Add(hPrimezynm)
// S^x
capSx := capS.Mul(x)
// g^-z --> -z*<1,g>
onenm := get1nVector(n*m, curve)
zNeg := z.Neg()
zinvonen := multiplyScalarToScalarVector(zNeg, onenm)
zgdotonen := curve.Point.SumOfProducts(proofG, zinvonen)
// L66 on pg20
P := capA.Add(capSx).Add(zgdotonen).Add(lastElem)
hmu := h.Mul(mu)
Phmu := P.Sub(hmu)
return Phmu
}
| pkg/bulletproof/range_batch_prover.go | 0.776411 | 0.461988 | range_batch_prover.go | starcoder |
// Package dataflow provides data flow analyses that can be performed on a
// previously constructed control flow graph, including a reaching definitions
// analysis and a live variables analysis for local variables.
package dataflow
// This file contains functions common to all data flow analyses, as well as
// one exported function.
import (
"go/ast"
"go/token"
"golang.org/x/tools/go/loader"
"golang.org/x/tools/go/types"
)
// ReferencedVars returns the sets of local variables that are defined or used
// within the given list of statements (based on syntax).
func ReferencedVars(stmts []ast.Stmt, info *loader.PackageInfo) (def, use map[*types.Var]struct{}) {
def = make(map[*types.Var]struct{})
use = make(map[*types.Var]struct{})
for _, stmt := range stmts {
for _, d := range defs(stmt, info) {
def[d] = struct{}{}
}
for _, u := range uses(stmt, info) {
use[u] = struct{}{}
}
}
return def, use
}
// defs extracts any local variables whose values are assigned in the given statement.
func defs(stmt ast.Stmt, info *loader.PackageInfo) []*types.Var {
idnts := make(map[*ast.Ident]struct{})
switch stmt := stmt.(type) {
case *ast.DeclStmt: // vars (1+) in decl; zero values
ast.Inspect(stmt, func(n ast.Node) bool {
if v, ok := n.(*ast.ValueSpec); ok {
idnts = union(idnts, idents(v))
}
return true
})
case *ast.IncDecStmt: // i++, i--
idnts = idents(stmt.X)
case *ast.AssignStmt: // :=, =, &=, etc. except x[i] (IndexExpr)
for _, x := range stmt.Lhs {
indExp := false
ast.Inspect(x, func(n ast.Node) bool {
if _, ok := n.(*ast.IndexExpr); ok {
indExp = true
return false
}
return true
})
if !indExp {
idnts = union(idnts, idents(x))
}
}
case *ast.RangeStmt: // only [ x, y ] on Lhs
idnts = union(idents(stmt.Key), idents(stmt.Value))
case *ast.TypeSwitchStmt:
// The assigned variable does not have a types.Var
// associated in this stmt; rather, the uses of that
// variable in the case clauses have several different
// types.Vars associated with them, according to type
var vars []*types.Var
ast.Inspect(stmt.Body, func(n ast.Node) bool {
switch cc := n.(type) {
case *ast.CaseClause:
v := typeCaseVar(info, cc)
if v != nil {
vars = append(vars, v)
}
return false
default:
return true
}
})
return vars
}
var vars []*types.Var
// These should all map to types.Vars; any that do not are ignored.
for i := range idnts {
if v, ok := info.ObjectOf(i).(*types.Var); ok {
vars = append(vars, v)
}
}
return vars
}
// typeCaseVar returns the implicit variable associated with a case clause in a
// type switch statement.
func typeCaseVar(info *loader.PackageInfo, cc *ast.CaseClause) *types.Var {
// Removed from go/loader
if v := info.Implicits[cc]; v != nil {
return v.(*types.Var)
}
return nil
}
// uses extracts local variables whose values are used in the given statement.
func uses(stmt ast.Stmt, info *loader.PackageInfo) []*types.Var {
idnts := make(map[*ast.Ident]struct{})
ast.Inspect(stmt, func(n ast.Node) bool {
switch stmt := stmt.(type) {
case *ast.AssignStmt: // mostly rhs of =, :=, &=, etc.
// some LHS are uses, e.g. x[i]
for _, x := range stmt.Lhs {
indExp := false
ast.Inspect(x, func(n ast.Node) bool {
if _, ok := n.(*ast.IndexExpr); ok {
indExp = true
return false
}
return true
})
if indExp || // x[i] is a uses of x and i
(stmt.Tok != token.ASSIGN &&
stmt.Tok != token.DEFINE) { // e.g. +=, ^=, etc.
idnts = union(idnts, idents(x))
}
}
// all RHS are uses
for _, s := range stmt.Rhs {
idnts = union(idnts, idents(s))
}
case *ast.BlockStmt: // no uses, skip - should not appear in cfg
case *ast.BranchStmt: // no uses, skip
case *ast.CaseClause: // no uses, skip
case *ast.CommClause: // no uses, skip
case *ast.DeclStmt: // no uses, skip
case *ast.DeferStmt:
idnts = idents(stmt.Call)
case *ast.ForStmt:
idnts = idents(stmt.Cond)
case *ast.IfStmt:
idnts = idents(stmt.Cond)
case *ast.LabeledStmt: // no uses, skip
case *ast.RangeStmt: // list in _, _ = range [ list ]
idnts = idents(stmt.X)
case *ast.SelectStmt: // no uses, skip
case *ast.SwitchStmt:
idnts = idents(stmt.Tag)
case *ast.TypeSwitchStmt: // no uses, skip
case ast.Stmt: // everything else is all uses
idnts = idents(stmt)
}
return true
})
var vars []*types.Var
// These should all map to types.Vars; any that do not are ignored.
for i := range idnts {
if v, ok := info.ObjectOf(i).(*types.Var); ok {
vars = append(vars, v)
}
}
return vars
}
// idents returns the set of all identifiers in given node.
func idents(node ast.Node) map[*ast.Ident]struct{} {
idents := make(map[*ast.Ident]struct{})
if node == nil {
return idents
}
ast.Inspect(node, func(n ast.Node) bool {
switch n := n.(type) {
case *ast.Ident:
idents[n] = struct{}{}
}
return true
})
return idents
}
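The idents helper is the workhorse of this file: it walks any node and records every *ast.Ident it finds. A self-contained usage sketch of the same ast.Inspect pattern, driven by the standard library parser:

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
)

// collectIdents gathers every identifier in an expression, in traversal
// order, using the same ast.Inspect pattern as the idents helper above.
func collectIdents(expr string) []string {
	node, err := parser.ParseExpr(expr)
	if err != nil {
		return nil
	}
	var names []string
	ast.Inspect(node, func(n ast.Node) bool {
		if id, ok := n.(*ast.Ident); ok {
			names = append(names, id.Name)
		}
		return true
	})
	return names
}

func main() {
	fmt.Println(collectIdents("x[i] + f(y)")) // [x i f y]
}
```

Note that this syntactic walk collects x, i, f, and y alike; it is the later ObjectOf filtering that narrows the set down to local variables.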
func union(one, two map[*ast.Ident]struct{}) map[*ast.Ident]struct{} {
for o := range one {
two[o] = struct{}{}
}
return two
}
| analysis/dataflow/dataflow.go | 0.628749 | 0.522689 | dataflow.go | starcoder |
package mqo
import "math"
type Vector2 struct {
X float32
Y float32
}
type Vector3 struct {
X float32
Y float32
Z float32
}
func (v *Vector3) Len() float32 {
return float32(math.Sqrt(float64(v.X*v.X + v.Y*v.Y + v.Z*v.Z)))
}
func (v *Vector3) Normalize() {
l := v.Len()
if l > 0 {
v.X /= l
v.Y /= l
v.Z /= l
} else {
v.Z = 1
}
}
func (v *Vector3) Add(v2 *Vector3) *Vector3 {
return &Vector3{X: v.X + v2.X, Y: v.Y + v2.Y, Z: v.Z + v2.Z}
}
func (v *Vector3) Sub(v2 *Vector3) *Vector3 {
return &Vector3{X: v.X - v2.X, Y: v.Y - v2.Y, Z: v.Z - v2.Z}
}
func (v *Vector3) ToArray(array []float32) {
array[0] = v.X
array[1] = v.Y
array[2] = v.Z
}
type Vector4 struct {
X float32
Y float32
Z float32
W float32
}
type Document struct {
Scene *Scene
Materials []*Material
Objects []*Object
Plugins []Plugin
}
func NewDocument() *Document {
return &Document{}
}
func (doc *Document) GetPlugins() []Plugin {
return doc.Plugins
}
func (doc *Document) FixObjectID() {
for i, obj := range doc.Objects {
if obj.UID == 0 {
obj.UID = i + 1 // TODO: unique id
}
}
}
func (doc *Document) GetObjectByID(id int) *Object {
for _, obj := range doc.Objects {
if obj.UID == id {
return obj
}
}
return nil
}
type Scene struct {
CameraPos Vector3
CameraLookAt Vector3
CameraRot Vector3
}
type Material struct {
Name string
UID int
Color Vector4
Diffuse float32
Ambient float32
Emission float32
Specular float32
Power float32
Texture string
EmissionColor *Vector3
DoubleSided bool
Shader int
Ex2 *MaterialEx2
}
type MaterialEx2 struct {
ShaderType string
ShaderName string
ShaderParams map[string]interface{}
}
func (m *MaterialEx2) StringParam(name string) string {
if v, ok := m.ShaderParams[name].(string); ok {
return v
}
return ""
}
func (m *MaterialEx2) IntParam(name string) int {
if v, ok := m.ShaderParams[name].(int); ok {
return v
}
return 0
}
func (m *MaterialEx2) FloatParam(name string) float64 {
if v, ok := m.ShaderParams[name].(float64); ok {
return v
}
return 0
}
type Face struct {
UID int
Verts []int
Material int
UVs []Vector2
}
type Object struct {
UID int
Name string
Vertexes []*Vector3
Faces []*Face
Visible bool
Locked bool
Depth int
Shading int
Facet float32
Patch int
Segment int
Mirror int
MirrorDis float32
VertexByUID map[int]int
}
func NewObject(name string) *Object {
return &Object{Name: name, Visible: true, VertexByUID: map[int]int{}, Shading: 1, Facet: 59.5}
}
func (o *Object) Clone() *Object {
var cp = *o
cp.Vertexes = make([]*Vector3, len(o.Vertexes))
for i, v := range o.Vertexes {
vv := *v
cp.Vertexes[i] = &vv
}
cp.Faces = make([]*Face, len(o.Faces))
for i, v := range o.Faces {
d := &Face{Material: v.Material, Verts: make([]int, len(v.Verts)), UVs: make([]Vector2, len(v.UVs))}
copy(d.Verts, v.Verts)
copy(d.UVs, v.UVs)
cp.Faces[i] = d
}
return &cp
}
func (o *Object) GetVertexIndexByID(uid int) int {
if v, ok := o.VertexByUID[uid]; ok {
return v
}
if len(o.Vertexes) >= uid {
return uid - 1
}
return -1
}
type Plugin interface {
PreSerialize(mqo *Document)
PostDeserialize(mqo *Document)
} | mqo/mqo.go | 0.625095 | 0.533276 | mqo.go | starcoder |
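Vector3.Normalize above divides each component by the length and falls back to the +Z axis for a zero-length vector. A minimal standalone copy of that logic (the vec3 type is a local stand-in, not part of the mqo package):

```go
package main

import (
	"fmt"
	"math"
)

// vec3 is a minimal copy of mqo.Vector3's length/normalize logic.
type vec3 struct{ X, Y, Z float32 }

func (v *vec3) len() float32 {
	return float32(math.Sqrt(float64(v.X*v.X + v.Y*v.Y + v.Z*v.Z)))
}

func (v *vec3) normalize() {
	if l := v.len(); l > 0 {
		v.X /= l
		v.Y /= l
		v.Z /= l
	} else {
		v.Z = 1 // degenerate input falls back to the +Z axis
	}
}

func main() {
	v := vec3{X: 3, Y: 4, Z: 0}
	v.normalize()
	// length is ~1 after normalization (within float32 rounding)
	if d := v.len() - 1; d > 1e-6 || d < -1e-6 {
		panic("not unit length")
	}
	zero := vec3{}
	zero.normalize()
	fmt.Println("ok", zero.Z)
}
```

The zero-vector fallback avoids a division by zero and guarantees Normalize always leaves a unit vector behind.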
package main
import (
"fmt"
"github.com/hyperledger/fabric/core/chaincode/shim"
"github.com/hyperledger/fabric/protos/peer"
)
// SimpleAsset implements a simple chaincode to manage an asset.
type SimpleAsset struct {
}
// Init is called during chaincode instantiation to initialize any
// data. Note that chaincode upgrade also calls this function to
// reset or to migrate data.
func (t *SimpleAsset) Init(stub shim.ChaincodeStubInterface) peer.Response {
// Get the args from the transaction proposal
args := stub.GetStringArgs()
if len(args) != 2 {
return shim.Error("Incorrect arguments. Expecting a key and a value")
}
// Set up any variables or assets here by calling stub.PutState().
// We store the key and the value on the ledger.
err := stub.PutState(args[0], []byte(args[1]))
if err != nil {
return shim.Error(fmt.Sprintf("Failed to create asset: %s", args[0]))
}
return shim.Success(nil)
}
// Invoke is called per transaction on the chaincode. Each transaction is
// either a 'get' or a 'set' on the asset created by the Init function. The
// set method may create a new asset by specifying a new key-value pair.
func (t *SimpleAsset) Invoke(stub shim.ChaincodeStubInterface) peer.Response {
// Extract the function and args from the transaction proposal
fn, args := stub.GetFunctionAndParameters()
var result string
var err error
if fn == "set" {
result, err = set(stub, args)
} else { // assume 'get' even if fn is nil
result, err = get(stub, args)
}
if err != nil {
return shim.Error(err.Error())
}
// Return the result as success payload
return shim.Success([]byte(result))
}
// Set stores the asset (both key and value) on the ledger. If the key
// exists, it will override the value with the new one.
func set(stub shim.ChaincodeStubInterface, args []string) (string, error) {
if len(args) != 2 {
return "", fmt.Errorf("Incorrect arguments. Expecting a key and a value")
}
err := stub.PutState(args[0], []byte(args[1]))
if err != nil {
return "", fmt.Errorf("Failed to set asset: %s", args[0])
}
return args[1], nil
}
// Get returns the value of the specified asset key.
func get(stub shim.ChaincodeStubInterface, args []string) (string, error) {
if len(args) != 1 {
return "", fmt.Errorf("Incorrect arguments. Expecting a key")
}
value, err := stub.GetState(args[0])
if err != nil {
return "", fmt.Errorf("Failed to get asset: %s with error: %s", args[0], err)
}
if value == nil {
return "", fmt.Errorf("Asset not found: %s", args[0])
}
return string(value), nil
}
// main function starts up the chaincode in the container during instantiation.
func main() {
if err := shim.Start(new(SimpleAsset)); err != nil {
fmt.Printf("Error starting SimpleAsset chaincode: %s", err)
}
} | chaincode.go | 0.507812 | 0.461077 | chaincode.go | starcoder |
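The get/set semantics above can be exercised without a running Fabric peer by standing in a plain map for the ledger. This is an illustration only — real chaincode must go through shim.ChaincodeStubInterface's PutState/GetState; the ledger type here is hypothetical:

```go
package main

import (
	"errors"
	"fmt"
)

// ledger is a map-backed stand-in for the world state.
type ledger map[string][]byte

// set stores the asset; an existing key is overwritten, as in the
// chaincode's set function.
func (l ledger) set(key, value string) (string, error) {
	l[key] = []byte(value)
	return value, nil
}

// get returns the value of the specified asset key, or an error when
// the key has never been set.
func (l ledger) get(key string) (string, error) {
	v, ok := l[key]
	if !ok {
		return "", errors.New("asset not found: " + key)
	}
	return string(v), nil
}

func main() {
	l := ledger{}
	l.set("a", "10")
	v, _ := l.get("a")
	fmt.Println(v)
	_, err := l.get("b")
	fmt.Println(err != nil)
}
```

Like GetState, the lookup distinguishes "key absent" from "key present with a value", which is why the chaincode checks for a nil value before returning success.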
package dag
import (
"fmt"
"reflect"
"sort"
"strconv"
)
// the marshal* structs are for serialization of the graph data.
type marshalGraph struct {
// Type is always "Graph", for identification as a top level object in the
// JSON stream.
Type string
// Each marshal structure requires a unique ID so that it can be referenced
// by other structures.
ID string `json:",omitempty"`
// Human readable name for this graph.
Name string `json:",omitempty"`
// Arbitrary attributes that can be added to the output.
Attrs map[string]string `json:",omitempty"`
// List of graph vertices, sorted by ID.
Vertices []*marshalVertex `json:",omitempty"`
// List of edges, sorted by Source ID.
Edges []*marshalEdge `json:",omitempty"`
// Any number of subgraphs. A subgraph itself is considered a vertex, and
// may be referenced by either end of an edge.
Subgraphs []*marshalGraph `json:",omitempty"`
// Any lists of vertices that are included in cycles.
Cycles [][]*marshalVertex `json:",omitempty"`
}
func (g *marshalGraph) vertexByID(id string) *marshalVertex {
for _, v := range g.Vertices {
if id == v.ID {
return v
}
}
return nil
}
type marshalVertex struct {
// Unique ID, used to reference this vertex from other structures.
ID string
// Human readable name
Name string `json:",omitempty"`
Attrs map[string]string `json:",omitempty"`
// This is to help transition from the old Dot interfaces. We record if the
// node was a GraphNodeDotter here, so we can call it to get attributes.
graphNodeDotter GraphNodeDotter
}
func newMarshalVertex(v Vertex) *marshalVertex {
dn, ok := v.(GraphNodeDotter)
if !ok {
dn = nil
}
// the name will be quoted again later, so we need to ensure it's properly
// escaped without quotes.
name := strconv.Quote(VertexName(v))
name = name[1 : len(name)-1]
return &marshalVertex{
ID: marshalVertexID(v),
Name: name,
Attrs: make(map[string]string),
graphNodeDotter: dn,
}
}
// vertices is a sort.Interface implementation for sorting vertices by ID
type vertices []*marshalVertex
func (v vertices) Less(i, j int) bool { return v[i].Name < v[j].Name }
func (v vertices) Len() int { return len(v) }
func (v vertices) Swap(i, j int) { v[i], v[j] = v[j], v[i] }
type marshalEdge struct {
// Human readable name
Name string
// Source and Target Vertices by ID
Source string
Target string
Attrs map[string]string `json:",omitempty"`
}
func newMarshalEdge(e Edge) *marshalEdge {
return &marshalEdge{
Name: fmt.Sprintf("%s|%s", VertexName(e.Source()), VertexName(e.Target())),
Source: marshalVertexID(e.Source()),
Target: marshalVertexID(e.Target()),
Attrs: make(map[string]string),
}
}
// edges is a sort.Interface implementation for sorting edges by Source ID
type edges []*marshalEdge
func (e edges) Less(i, j int) bool { return e[i].Name < e[j].Name }
func (e edges) Len() int { return len(e) }
func (e edges) Swap(i, j int) { e[i], e[j] = e[j], e[i] }
// build a marshalGraph structure from a *Graph
func newMarshalGraph(name string, g *Graph) *marshalGraph {
mg := &marshalGraph{
Type: "Graph",
Name: name,
Attrs: make(map[string]string),
}
for _, v := range g.Vertices() {
id := marshalVertexID(v)
if sg, ok := marshalSubgrapher(v); ok {
smg := newMarshalGraph(VertexName(v), sg)
smg.ID = id
mg.Subgraphs = append(mg.Subgraphs, smg)
}
mv := newMarshalVertex(v)
mg.Vertices = append(mg.Vertices, mv)
}
sort.Sort(vertices(mg.Vertices))
for _, e := range g.Edges() {
mg.Edges = append(mg.Edges, newMarshalEdge(e))
}
sort.Sort(edges(mg.Edges))
for _, c := range (&AcyclicGraph{*g}).Cycles() {
var cycle []*marshalVertex
for _, v := range c {
mv := newMarshalVertex(v)
cycle = append(cycle, mv)
}
mg.Cycles = append(mg.Cycles, cycle)
}
return mg
}
// Attempt to return a unique ID for any vertex.
func marshalVertexID(v Vertex) string {
val := reflect.ValueOf(v)
switch val.Kind() {
case reflect.Chan, reflect.Func, reflect.Map, reflect.Ptr, reflect.Slice, reflect.UnsafePointer:
return strconv.Itoa(int(val.Pointer()))
case reflect.Interface:
return strconv.Itoa(int(val.InterfaceData()[1]))
}
if v, ok := v.(Hashable); ok {
h := v.Hashcode()
if h, ok := h.(string); ok {
return h
}
}
	// We could try harder by attempting to read the arbitrary value from the
	// interface, but we shouldn't get here from terraform right now.
	// Fall back to a name, which we hope is unique.
	return VertexName(v)
}
// check for a Subgrapher, and return the underlying *Graph.
func marshalSubgrapher(v Vertex) (*Graph, bool) {
sg, ok := v.(Subgrapher)
if !ok {
return nil, false
}
switch g := sg.Subgraph().DirectedGraph().(type) {
case *Graph:
return g, true
case *AcyclicGraph:
return &g.Graph, true
}
return nil, false
} | terraform/terraform/vendor/github.com/hashicorp/terraform/internal/dag/marshal.go | 0.713631 | 0.419648 | marshal.go | starcoder |
package elastic
// The function_score allows you to modify the score of documents that
// are retrieved by a query. This can be useful if, for example,
// a score function is computationally expensive and it is sufficient
// to compute the score on a filtered set of documents.
// For more details, see
// http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl-function-score-query.html
type FunctionScoreQuery struct {
query Query
filter Filter
boost *float32
maxBoost *float32
scoreMode string
boostMode string
filters []Filter
scoreFuncs []ScoreFunction
minScore *float32
weight *float64
}
// NewFunctionScoreQuery creates a new function score query.
func NewFunctionScoreQuery() FunctionScoreQuery {
return FunctionScoreQuery{
filters: make([]Filter, 0),
scoreFuncs: make([]ScoreFunction, 0),
}
}
func (q FunctionScoreQuery) Query(query Query) FunctionScoreQuery {
q.query = query
return q
}
func (q FunctionScoreQuery) Filter(filter Filter) FunctionScoreQuery {
q.filter = filter
return q
}
func (q FunctionScoreQuery) Add(filter Filter, scoreFunc ScoreFunction) FunctionScoreQuery {
q.filters = append(q.filters, filter)
q.scoreFuncs = append(q.scoreFuncs, scoreFunc)
return q
}
func (q FunctionScoreQuery) AddScoreFunc(scoreFunc ScoreFunction) FunctionScoreQuery {
q.filters = append(q.filters, nil)
q.scoreFuncs = append(q.scoreFuncs, scoreFunc)
return q
}
func (q FunctionScoreQuery) ScoreMode(scoreMode string) FunctionScoreQuery {
q.scoreMode = scoreMode
return q
}
func (q FunctionScoreQuery) BoostMode(boostMode string) FunctionScoreQuery {
q.boostMode = boostMode
return q
}
func (q FunctionScoreQuery) MaxBoost(maxBoost float32) FunctionScoreQuery {
q.maxBoost = &maxBoost
return q
}
func (q FunctionScoreQuery) Boost(boost float32) FunctionScoreQuery {
q.boost = &boost
return q
}
func (q FunctionScoreQuery) MinScore(minScore float32) FunctionScoreQuery {
q.minScore = &minScore
return q
}
// Source returns JSON for the function score query.
func (q FunctionScoreQuery) Source() interface{} {
source := make(map[string]interface{})
query := make(map[string]interface{})
source["function_score"] = query
if q.query != nil {
query["query"] = q.query.Source()
}
if q.filter != nil {
query["filter"] = q.filter.Source()
}
if len(q.filters) == 1 && q.filters[0] == nil {
// Weight needs to be serialized on this level.
if weight := q.scoreFuncs[0].GetWeight(); weight != nil {
query["weight"] = weight
}
// Serialize the score function
query[q.scoreFuncs[0].Name()] = q.scoreFuncs[0].Source()
} else {
funcs := make([]interface{}, len(q.filters))
for i, filter := range q.filters {
hsh := make(map[string]interface{})
if filter != nil {
hsh["filter"] = filter.Source()
}
// Weight needs to be serialized on this level.
if weight := q.scoreFuncs[i].GetWeight(); weight != nil {
hsh["weight"] = weight
}
// Serialize the score function
hsh[q.scoreFuncs[i].Name()] = q.scoreFuncs[i].Source()
funcs[i] = hsh
}
query["functions"] = funcs
}
if q.scoreMode != "" {
query["score_mode"] = q.scoreMode
}
if q.boostMode != "" {
query["boost_mode"] = q.boostMode
}
if q.maxBoost != nil {
query["max_boost"] = *q.maxBoost
}
if q.boost != nil {
query["boost"] = *q.boost
}
if q.minScore != nil {
query["min_score"] = *q.minScore
}
return source
} | vendor/gopkg.in/olivere/elastic.v2/search_queries_fsq.go | 0.871543 | 0.416381 | search_queries_fsq.go | starcoder |
package jwt
import (
"encoding/json"
"fmt"
"reflect"
"time"
"github.com/grafana/grafana/pkg/models"
"gopkg.in/square/go-jose.v2/jwt"
)
func (s *AuthService) initClaimExpectations() error {
if err := json.Unmarshal([]byte(s.Cfg.JWTAuthExpectClaims), &s.expect); err != nil {
return err
}
for key, value := range s.expect {
switch key {
case "iss":
if stringValue, ok := value.(string); ok {
s.expectRegistered.Issuer = stringValue
} else {
return fmt.Errorf("%q expectation has invalid type %T, string expected", key, value)
}
delete(s.expect, key)
case "sub":
if stringValue, ok := value.(string); ok {
s.expectRegistered.Subject = stringValue
} else {
return fmt.Errorf("%q expectation has invalid type %T, string expected", key, value)
}
delete(s.expect, key)
case "aud":
switch value := value.(type) {
case []interface{}:
for _, val := range value {
if val, ok := val.(string); ok {
s.expectRegistered.Audience = append(s.expectRegistered.Audience, val)
} else {
return fmt.Errorf("%q expectation contains value with invalid type %T, string expected", key, val)
}
}
case string:
s.expectRegistered.Audience = []string{value}
default:
return fmt.Errorf("%q expectation has invalid type %T, array or string expected", key, value)
}
delete(s.expect, key)
}
}
return nil
}
func (s *AuthService) validateClaims(claims models.JWTClaims) error {
var registeredClaims jwt.Claims
for key, value := range claims {
switch key {
case "iss":
if stringValue, ok := value.(string); ok {
registeredClaims.Issuer = stringValue
} else {
return fmt.Errorf("%q claim has invalid type %T, string expected", key, value)
}
case "sub":
if stringValue, ok := value.(string); ok {
registeredClaims.Subject = stringValue
} else {
return fmt.Errorf("%q claim has invalid type %T, string expected", key, value)
}
case "aud":
switch value := value.(type) {
case []interface{}:
for _, val := range value {
if val, ok := val.(string); ok {
registeredClaims.Audience = append(registeredClaims.Audience, val)
} else {
return fmt.Errorf("%q claim contains value with invalid type %T, string expected", key, val)
}
}
case string:
registeredClaims.Audience = []string{value}
default:
return fmt.Errorf("%q claim has invalid type %T, array or string expected", key, value)
}
case "exp":
if value == nil {
continue
}
if floatValue, ok := value.(float64); ok {
out := jwt.NumericDate(floatValue)
registeredClaims.Expiry = &out
} else {
return fmt.Errorf("%q claim has invalid type %T, number expected", key, value)
}
case "nbf":
if value == nil {
continue
}
if floatValue, ok := value.(float64); ok {
out := jwt.NumericDate(floatValue)
registeredClaims.NotBefore = &out
} else {
return fmt.Errorf("%q claim has invalid type %T, number expected", key, value)
}
case "iat":
if value == nil {
continue
}
if floatValue, ok := value.(float64); ok {
out := jwt.NumericDate(floatValue)
registeredClaims.IssuedAt = &out
} else {
return fmt.Errorf("%q claim has invalid type %T, number expected", key, value)
}
}
}
expectRegistered := s.expectRegistered
expectRegistered.Time = time.Now()
if err := registeredClaims.Validate(expectRegistered); err != nil {
return err
}
for key, expected := range s.expect {
value, ok := claims[key]
if !ok {
return fmt.Errorf("%q claim is missing", key)
}
if !reflect.DeepEqual(expected, value) {
return fmt.Errorf("%q claim mismatch", key)
}
}
return nil
} | pkg/services/auth/jwt/validation.go | 0.573201 | 0.464537 | validation.go | starcoder |
package volume
import (
"errors"
"github.com/louis030195/protometry/api/vector3"
)
// NewBoxMinMax returns a new box using min max
func NewBoxMinMax(minX, minY, minZ, maxX, maxY, maxZ float64) *Box {
return &Box{
Min: vector3.NewVector3(minX, minY, minZ),
Max: vector3.NewVector3(maxX, maxY, maxZ),
}
}
// NewBoxOfSize returns a box of size centered at center
func NewBoxOfSize(x, y, z, size float64) *Box {
half := size / 2
min := *vector3.NewVector3(x-half, y-half, z-half)
max := *vector3.NewVector3(x+half, y+half, z+half)
return &Box{
Min: &min,
Max: &max,
}
}
// Equal returns whether a box is equal to another
func (b Box) Equal(other Box) bool {
return b.Min.Equal(*other.Min) && b.Max.Equal(*other.Max)
}
// GetCenter returns the center point of the box.
func (b Box) GetCenter() vector3.Vector3 {
return *b.Min.Lerp(b.Max, 0.5)
}
// GetSize returns the size of the box
func (b *Box) GetSize() vector3.Vector3 {
return b.Max.Minus(*b.Min)
}
// Contains returns whether the specified point is contained in this box.
func (b Box) Contains(v vector3.Vector3) bool {
return (b.Min.X <= v.X && v.X <= b.Max.X) &&
(b.Min.Y <= v.Y && v.Y <= b.Max.Y) &&
(b.Min.Z <= v.Z && v.Z <= b.Max.Z)
}
// Fit Returns whether the specified area is fully contained in the other area.
func (b Box) Fit(o Box) bool {
return o.Contains(*b.Max) && o.Contains(*b.Min)
}
// Intersects Returns whether any portion of this area strictly intersects with the specified area or reversely.
func (b Box) Intersects(b2 Box) bool {
return !(b.Max.X < b2.Min.X || b2.Max.X < b.Min.X || b.Max.Y < b2.Min.Y || b2.Max.Y < b.Min.Y || b.Max.Z < b2.Min.Z || b2.Max.Z < b.Min.Z)
}
/* Split split a CUBE into 8 cubes
* 3____7
* 2/___6/|
* | 1__|_5
* 0/___4/
*/
func (b *Box) Split() [8]*Box {
center := b.GetCenter()
return [8]*Box{
NewBoxMinMax(b.Min.X, b.Min.Y, b.Min.Z, center.X, center.Y, center.Z),
NewBoxMinMax(b.Min.X, b.Min.Y, center.Z, center.X, center.Y, b.Max.Z),
NewBoxMinMax(b.Min.X, center.Y, b.Min.Z, center.X, b.Max.Y, center.Z),
NewBoxMinMax(b.Min.X, center.Y, center.Z, center.X, b.Max.Y, b.Max.Z),
NewBoxMinMax(center.X, b.Min.Y, b.Min.Z, b.Max.X, center.Y, center.Z),
NewBoxMinMax(center.X, b.Min.Y, center.Z, b.Max.X, center.Y, b.Max.Z),
NewBoxMinMax(center.X, center.Y, b.Min.Z, b.Max.X, b.Max.Y, center.Z),
NewBoxMinMax(center.X, center.Y, center.Z, b.Max.X, b.Max.Y, b.Max.Z),
}
}
/* SplitFour split a CUBE into 4 cubes
* Vertical
* 1____3
* 0/___2/|
* | 1__|_3
* 0/___2/
* Horizontal
* 1____3
* 1/___3/|
* | 0__|_2
* 0/___2/
*/
func (b *Box) SplitFour(vertical bool) [4]*Box {
center := b.GetCenter()
if vertical {
return [4]*Box{
NewBoxMinMax(b.Min.X, b.Min.Y, b.Min.Z, center.X, b.Max.Y, center.Z),
NewBoxMinMax(b.Min.X, b.Min.Y, center.Z, center.X, b.Max.Y, b.Max.Z),
NewBoxMinMax(center.X, b.Min.Y, b.Min.Z, b.Max.X, b.Max.Y, center.Z),
NewBoxMinMax(center.X, b.Min.Y, center.Z, b.Max.X, b.Max.Y, b.Max.Z),
}
}
return [4]*Box{
NewBoxMinMax(b.Min.X, b.Min.Y, b.Min.Z, center.X, center.Y, b.Max.Z),
NewBoxMinMax(b.Min.X, center.Y, b.Min.Z, center.X, b.Max.Y, b.Max.Z),
NewBoxMinMax(center.X, b.Min.Y, b.Min.Z, b.Max.X, center.Y, b.Max.Z),
NewBoxMinMax(center.X, center.Y, b.Min.Z, b.Max.X, b.Max.Y, b.Max.Z),
}
}
// EncapsulatePoint grows the bounds to include the point. It modifies the box in place and returns it.
func (b *Box) EncapsulatePoint(o vector3.Vector3) *Box {
min := vector3.Min(*b.Min, o)
max := vector3.Max(*b.Max, o)
b.Min = &min
b.Max = &max
return b
}
// EncapsulateBox grows the bounds to include the given box. It modifies the box in place and returns it.
func (b *Box) EncapsulateBox(o Box) *Box {
b.EncapsulatePoint(*o.Min)
b.EncapsulatePoint(*o.Max)
return b
}
// Intersection returns the intersection between two boxes.
func (b Box) Intersection(o Box) vector3.Vector3 {
	panic(errors.New("intersection not implemented"))
} | api/volume/box.go | 0.891899 | 0.545709 | box.go | starcoder
package trade_knife
import (
"github.com/amir-the-h/goex"
"time"
)
// Interval is the timeframe concept and determines duration of each candle.
type Interval string
// Duration Returns actual duration of the interval.
func (i Interval) Duration() time.Duration {
switch i {
case Interval1m:
return time.Minute
case Interval3m:
return time.Minute * 3
case Interval5m:
return time.Minute * 5
case Interval15m:
return time.Minute * 15
case Interval30m:
return time.Minute * 30
case Interval1h:
return time.Hour
case Interval2h:
return time.Hour * 2
case Interval4h:
return time.Hour * 4
case Interval6h:
return time.Hour * 6
case Interval8h:
return time.Hour * 8
case Interval12h:
return time.Hour * 12
case Interval1d:
return time.Hour * 24
case Interval3d:
return time.Hour * 24 * 3
case Interval1w:
return time.Hour * 24 * 7
case Interval1M:
return time.Hour * 24 * 30
}
return time.Duration(0)
}
// KlinePeriod Returns goex.KlinePeriod associated type
func (i Interval) KlinePeriod() goex.KlinePeriod {
switch i {
case Interval3m:
return goex.KLINE_PERIOD_3MIN
case Interval5m:
return goex.KLINE_PERIOD_5MIN
case Interval15m:
return goex.KLINE_PERIOD_15MIN
case Interval30m:
return goex.KLINE_PERIOD_30MIN
case Interval1h:
return goex.KLINE_PERIOD_1H
case Interval2h:
return goex.KLINE_PERIOD_2H
case Interval4h:
return goex.KLINE_PERIOD_4H
case Interval6h:
return goex.KLINE_PERIOD_6H
case Interval8h:
return goex.KLINE_PERIOD_8H
case Interval12h:
return goex.KLINE_PERIOD_12H
case Interval1d:
return goex.KLINE_PERIOD_1DAY
case Interval3d:
return goex.KLINE_PERIOD_3DAY
case Interval1w:
return goex.KLINE_PERIOD_1WEEK
case Interval1M:
return goex.KLINE_PERIOD_1MONTH
}
return goex.KLINE_PERIOD_1MIN
}
// GetPeriod returns the open and close time of the period that the given
// millisecond timestamp falls into for this interval.
func (i Interval) GetPeriod(ts int64) (ot *time.Time, ct *time.Time, err error) {
t := time.Unix(ts/1000, 0)
var o time.Time
switch i {
case Interval1m:
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), 0, 0, time.UTC)
case Interval3m:
m := t.Minute() - (t.Minute() % 3)
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), m, 0, 0, time.UTC)
case Interval5m:
m := t.Minute() - (t.Minute() % 5)
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), m, 0, 0, time.UTC)
case Interval15m:
m := t.Minute() - (t.Minute() % 15)
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), m, 0, 0, time.UTC)
case Interval30m:
m := t.Minute() - (t.Minute() % 30)
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), m, 0, 0, time.UTC)
case Interval1h:
o = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), 0, 0, 0, time.UTC)
case Interval2h:
h := t.Hour() - (t.Hour() % 2)
o = time.Date(t.Year(), t.Month(), t.Day(), h, 0, 0, 0, time.UTC)
case Interval4h:
h := t.Hour() - (t.Hour() % 4)
o = time.Date(t.Year(), t.Month(), t.Day(), h, 0, 0, 0, time.UTC)
case Interval6h:
h := t.Hour() - (t.Hour() % 6)
o = time.Date(t.Year(), t.Month(), t.Day(), h, 0, 0, 0, time.UTC)
case Interval8h:
h := t.Hour() - (t.Hour() % 8)
o = time.Date(t.Year(), t.Month(), t.Day(), h, 0, 0, 0, time.UTC)
case Interval12h:
h := t.Hour() - (t.Hour() % 12)
o = time.Date(t.Year(), t.Month(), t.Day(), h, 0, 0, 0, time.UTC)
case Interval1d:
o = time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, time.UTC)
case Interval3d:
// note: 3-day and weekly boundaries are approximated from the day of month
d := t.Day() - (t.Day() % 3)
o = time.Date(t.Year(), t.Month(), d, 0, 0, 0, 0, time.UTC)
case Interval1w:
d := t.Day() - (t.Day() % 7)
o = time.Date(t.Year(), t.Month(), d, 0, 0, 0, 0, time.UTC)
case Interval1M:
o = time.Date(t.Year(), t.Month(), 1, 0, 0, 0, 0, time.UTC)
}
c := o.Add(i.Duration())
return &o, &c, nil
} | interval.go | 0.632162 | 0.51013 | interval.go | starcoder
package resize
import (
"image"
"image/color"
)
// ycc is an in memory YCbCr image. The Y, Cb and Cr samples are held in a
// single slice to increase resizing performance.
type ycc struct {
// Pix holds the image's pixels, in Y, Cb, Cr order. The pixel at
// (x, y) starts at Pix[(y-Rect.Min.Y)*Stride + (x-Rect.Min.X)*3].
Pix []uint8
// Stride is the Pix stride (in bytes) between vertically adjacent pixels.
Stride int
// Rect is the image's bounds.
Rect image.Rectangle
// SubsampleRatio is the subsample ratio of the original YCbCr image.
SubsampleRatio image.YCbCrSubsampleRatio
}
// PixOffset returns the index of the first element of Pix that corresponds to
// the pixel at (x, y).
func (p *ycc) PixOffset(x, y int) int {
return (y-p.Rect.Min.Y)*p.Stride + (x-p.Rect.Min.X)*3
}
func (p *ycc) Bounds() image.Rectangle {
return p.Rect
}
func (p *ycc) ColorModel() color.Model {
return color.YCbCrModel
}
func (p *ycc) At(x, y int) color.Color {
if !(image.Point{x, y}.In(p.Rect)) {
return color.YCbCr{}
}
i := p.PixOffset(x, y)
return color.YCbCr{
p.Pix[i+0],
p.Pix[i+1],
p.Pix[i+2],
}
}
func (p *ycc) Opaque() bool {
return true
}
// SubImage returns an image representing the portion of the image p visible
// through r. The returned value shares pixels with the original image.
func (p *ycc) SubImage(r image.Rectangle) image.Image {
r = r.Intersect(p.Rect)
if r.Empty() {
return &ycc{SubsampleRatio: p.SubsampleRatio}
}
i := p.PixOffset(r.Min.X, r.Min.Y)
return &ycc{
Pix: p.Pix[i:],
Stride: p.Stride,
Rect: r,
SubsampleRatio: p.SubsampleRatio,
}
}
// newYCC returns a new ycc with the given bounds and subsample ratio.
func newYCC(r image.Rectangle, s image.YCbCrSubsampleRatio) *ycc {
w, h := r.Dx(), r.Dy()
buf := make([]uint8, 3*w*h)
return &ycc{Pix: buf, Stride: 3 * w, Rect: r, SubsampleRatio: s}
}
// YCbCr converts ycc to a YCbCr image with the same subsample ratio
// as the YCbCr image that ycc was generated from.
func (p *ycc) YCbCr() *image.YCbCr {
ycbcr := image.NewYCbCr(p.Rect, p.SubsampleRatio)
var off int
switch ycbcr.SubsampleRatio {
case image.YCbCrSubsampleRatio422:
for y := ycbcr.Rect.Min.Y; y < ycbcr.Rect.Max.Y; y++ {
yy := (y - ycbcr.Rect.Min.Y) * ycbcr.YStride
cy := (y - ycbcr.Rect.Min.Y) * ycbcr.CStride
for x := ycbcr.Rect.Min.X; x < ycbcr.Rect.Max.X; x++ {
xx := (x - ycbcr.Rect.Min.X)
yi := yy + xx
ci := cy + xx/2
ycbcr.Y[yi] = p.Pix[off+0]
ycbcr.Cb[ci] = p.Pix[off+1]
ycbcr.Cr[ci] = p.Pix[off+2]
off += 3
}
}
case image.YCbCrSubsampleRatio420:
for y := ycbcr.Rect.Min.Y; y < ycbcr.Rect.Max.Y; y++ {
yy := (y - ycbcr.Rect.Min.Y) * ycbcr.YStride
cy := (y/2 - ycbcr.Rect.Min.Y/2) * ycbcr.CStride
for x := ycbcr.Rect.Min.X; x < ycbcr.Rect.Max.X; x++ {
xx := (x - ycbcr.Rect.Min.X)
yi := yy + xx
ci := cy + xx/2
ycbcr.Y[yi] = p.Pix[off+0]
ycbcr.Cb[ci] = p.Pix[off+1]
ycbcr.Cr[ci] = p.Pix[off+2]
off += 3
}
}
case image.YCbCrSubsampleRatio440:
for y := ycbcr.Rect.Min.Y; y < ycbcr.Rect.Max.Y; y++ {
yy := (y - ycbcr.Rect.Min.Y) * ycbcr.YStride
cy := (y/2 - ycbcr.Rect.Min.Y/2) * ycbcr.CStride
for x := ycbcr.Rect.Min.X; x < ycbcr.Rect.Max.X; x++ {
xx := (x - ycbcr.Rect.Min.X)
yi := yy + xx
ci := cy + xx
ycbcr.Y[yi] = p.Pix[off+0]
ycbcr.Cb[ci] = p.Pix[off+1]
ycbcr.Cr[ci] = p.Pix[off+2]
off += 3
}
}
default:
// Default to 4:4:4 subsampling.
for y := ycbcr.Rect.Min.Y; y < ycbcr.Rect.Max.Y; y++ {
yy := (y - ycbcr.Rect.Min.Y) * ycbcr.YStride
cy := (y - ycbcr.Rect.Min.Y) * ycbcr.CStride
for x := ycbcr.Rect.Min.X; x < ycbcr.Rect.Max.X; x++ {
xx := (x - ycbcr.Rect.Min.X)
yi := yy + xx
ci := cy + xx
ycbcr.Y[yi] = p.Pix[off+0]
ycbcr.Cb[ci] = p.Pix[off+1]
ycbcr.Cr[ci] = p.Pix[off+2]
off += 3
}
}
}
return ycbcr
}
// imageYCbCrToYCC converts a YCbCr image to a ycc image for resizing.
func imageYCbCrToYCC(in *image.YCbCr) *ycc {
w, h := in.Rect.Dx(), in.Rect.Dy()
r := image.Rect(0, 0, w, h)
buf := make([]uint8, 3*w*h)
p := ycc{Pix: buf, Stride: 3 * w, Rect: r, SubsampleRatio: in.SubsampleRatio}
var off int
switch in.SubsampleRatio {
case image.YCbCrSubsampleRatio422:
for y := in.Rect.Min.Y; y < in.Rect.Max.Y; y++ {
yy := (y - in.Rect.Min.Y) * in.YStride
cy := (y - in.Rect.Min.Y) * in.CStride
for x := in.Rect.Min.X; x < in.Rect.Max.X; x++ {
xx := (x - in.Rect.Min.X)
yi := yy + xx
ci := cy + xx/2
p.Pix[off+0] = in.Y[yi]
p.Pix[off+1] = in.Cb[ci]
p.Pix[off+2] = in.Cr[ci]
off += 3
}
}
case image.YCbCrSubsampleRatio420:
for y := in.Rect.Min.Y; y < in.Rect.Max.Y; y++ {
yy := (y - in.Rect.Min.Y) * in.YStride
cy := (y/2 - in.Rect.Min.Y/2) * in.CStride
for x := in.Rect.Min.X; x < in.Rect.Max.X; x++ {
xx := (x - in.Rect.Min.X)
yi := yy + xx
ci := cy + xx/2
p.Pix[off+0] = in.Y[yi]
p.Pix[off+1] = in.Cb[ci]
p.Pix[off+2] = in.Cr[ci]
off += 3
}
}
case image.YCbCrSubsampleRatio440:
for y := in.Rect.Min.Y; y < in.Rect.Max.Y; y++ {
yy := (y - in.Rect.Min.Y) * in.YStride
cy := (y/2 - in.Rect.Min.Y/2) * in.CStride
for x := in.Rect.Min.X; x < in.Rect.Max.X; x++ {
xx := (x - in.Rect.Min.X)
yi := yy + xx
ci := cy + xx
p.Pix[off+0] = in.Y[yi]
p.Pix[off+1] = in.Cb[ci]
p.Pix[off+2] = in.Cr[ci]
off += 3
}
}
default:
// Default to 4:4:4 subsampling.
for y := in.Rect.Min.Y; y < in.Rect.Max.Y; y++ {
yy := (y - in.Rect.Min.Y) * in.YStride
cy := (y - in.Rect.Min.Y) * in.CStride
for x := in.Rect.Min.X; x < in.Rect.Max.X; x++ {
xx := (x - in.Rect.Min.X)
yi := yy + xx
ci := cy + xx
p.Pix[off+0] = in.Y[yi]
p.Pix[off+1] = in.Cb[ci]
p.Pix[off+2] = in.Cr[ci]
off += 3
}
}
}
return &p
} | ycc.go | 0.770724 | 0.635053 | ycc.go | starcoder |
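The interleaved Y/Cb/Cr layout above makes PixOffset simple index arithmetic: each pixel occupies three consecutive bytes, so a row takes 3*width bytes (the Stride). A standalone copy of that calculation:

```go
package main

import "fmt"

// pixOffset reproduces ycc.PixOffset: three bytes per pixel, one
// stride per row, offsets relative to the bounds' minimum corner.
func pixOffset(x, y, minX, minY, stride int) int {
	return (y-minY)*stride + (x-minX)*3
}

func main() {
	const width = 10
	stride := 3 * width // as set by newYCC
	fmt.Println(pixOffset(0, 0, 0, 0, stride)) // first pixel
	fmt.Println(pixOffset(4, 2, 0, 0, stride)) // row 2, column 4
}
```

Keeping all three planes in one slice trades the memory savings of chroma subsampling for contiguous per-pixel access, which is what makes the resize inner loops fast.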
package bufr
import (
"fmt"
"encoding/json"
)
// LookupFunction is for looking up meanings of code values on code tables
type LookupFunction = func(uint) string
var MISSING_VALUE_OF_NBITS = make(map[int]uint)
func init() {
// Store the missing values for corresponding number of bits for
// quick lookup.
for i := 1; i <= 64; i++ {
MISSING_VALUE_OF_NBITS[i] = 1<<uint(i) - 1
}
}
// IsMissing checks whether the given value is a missing value for the given number of
// bits. ONLY uint values can be missing. This is NOT checking whether the value is nil.
func IsMissing(value interface{}, nbits int) bool {
if _, ok := value.(uint); ok && nbits > 1 {
return value == MISSING_VALUE_OF_NBITS[nbits]
}
return false
}
// MissingValue returns the uint value that represents the missing value for the given number of bits.
func MissingValue(nbits int) (uint, error) {
v, ok := MISSING_VALUE_OF_NBITS[nbits]
if ok {
return v, nil
}
return 0, fmt.Errorf("invalid number of bits for retrieving missing value: %v", nbits)
}
// Field represents a field of a BUFR message's section. Its value can be anything,
// including primitive types and custom types for UnexpandedTemplate and Payload.
type Field struct {
// Name of the field
Name string
// Value of the field. The Value can be a custom type Payload which covers
// the entire payload (the meat of the data section) of a BUFR message.
Value interface{}
// Number of nbits taken by this field.
Nbits int
// Lookup function for code values
Lookup LookupFunction
// TODO: Maybe the Field can be an interface and have Hidden and Virtual as implementations?
// A hidden field does not show up in certain serialised format, e.g. Text.
Hidden bool
// A virtual field is something that does not actually have corresponding
// bits in the source file, e.g. a computed field
Virtual bool
}
func (f *Field) Accept(visitor Visitor) error {
return visitor.VisitField(f)
}
func (f *Field) IsMissing() bool {
return IsMissing(f.Value, f.Nbits)
}
func (f *Field) MarshalJSON() ([]byte, error) {
return json.Marshal(f.Value)
}
func NewField(name string, value interface{}, nbits int) *Field {
return &Field{Name: name, Value: value, Nbits: nbits, Virtual: false}
}
func NewHiddenField(name string, value interface{}, nbits int) *Field {
field := NewField(name, value, nbits)
field.Hidden = true
return field
}
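The missing-value convention above (an n-bit field whose bits are all ones is treated as "missing") can be sketched in a self-contained form; `missingValue` and `isMissing` are illustrative stand-ins for the package's `MissingValue`/`IsMissing`:

```go
package main

import "fmt"

// missingValue returns the all-ones pattern for an n-bit field, 2^n - 1,
// which the convention above reserves as the "missing" marker.
func missingValue(nbits int) uint {
	return 1<<uint(nbits) - 1
}

// isMissing mirrors IsMissing: only fields wider than one bit can be missing.
func isMissing(value uint, nbits int) bool {
	return nbits > 1 && value == missingValue(nbits)
}

func main() {
	fmt.Println(missingValue(8))   // → 255
	fmt.Println(isMissing(255, 8)) // → true
	fmt.Println(isMissing(1, 1))   // → false: 1-bit fields are never missing
}
```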
package edit
import (
"strings"
"github.com/elves/elvish/util"
)
// cell is an indivisible unit on the screen. It is not necessarily 1 column
// wide.
type cell struct {
rune
width byte
style string
}
// pos is the position within a buffer.
type pos struct {
line, col int
}
var invalidPos = pos{-1, -1}
func lineWidth(cs []cell) int {
w := 0
for _, c := range cs {
w += int(c.width)
}
return w
}
func makeSpacing(n int) []cell {
s := make([]cell, n)
for i := 0; i < n; i++ {
s[i].rune = ' '
s[i].width = 1
}
return s
}
// buffer reflects a continuous range of lines on the terminal. The Unix
// terminal API provides only awkward ways of querying the terminal buffer, so
// we keep an internal reflection and do one-way synchronizations (buffer ->
// terminal, and not the other way around). This requires us to exactly match
// the terminal's idea of the width of characters (wcwidth) and where to
// insert soft carriage returns, so there could be bugs.
type buffer struct {
width, col, indent int
newlineWhenFull bool
cells [][]cell // cells reflect len(cells) lines on the terminal.
dot pos // dot is what the user perceives as the cursor.
}
func newBuffer(width int) *buffer {
return &buffer{width: width, cells: [][]cell{make([]cell, 0, width)}}
}
func (b *buffer) appendCell(c cell) {
n := len(b.cells)
b.cells[n-1] = append(b.cells[n-1], c)
b.col += int(c.width)
}
func (b *buffer) appendLine() {
b.cells = append(b.cells, make([]cell, 0, b.width))
b.col = 0
}
func (b *buffer) newline() {
b.appendLine()
if b.indent > 0 {
for i := 0; i < b.indent; i++ {
b.appendCell(cell{rune: ' ', width: 1})
}
}
}
func (b *buffer) extend(b2 *buffer, moveDot bool) {
if b2 != nil && b2.cells != nil {
if moveDot {
b.dot.line = b2.dot.line + len(b.cells)
b.dot.col = b2.dot.col
}
b.cells = append(b.cells, b2.cells...)
b.col = b2.col
}
}
// extendHorizontal extends b horizontally. It pads each line in b to be at
// least of width w and appends the corresponding line in b2 to it, making new
// lines in b when b2 has more lines than b.
func (b *buffer) extendHorizontal(b2 *buffer, w int) {
i := 0
for ; i < len(b.cells) && i < len(b2.cells); i++ {
if w0 := lineWidth(b.cells[i]); w0 < w {
b.cells[i] = append(b.cells[i], makeSpacing(w-w0)...)
}
b.cells[i] = append(b.cells[i], b2.cells[i]...)
}
for ; i < len(b2.cells); i++ {
row := append(makeSpacing(w), b2.cells[i]...)
b.cells = append(b.cells, row)
}
}
// write appends a single rune to a buffer.
func (b *buffer) write(r rune, style string) {
if r == '\n' {
b.newline()
return
} else if r < 0x20 || r == 0x7f {
// BUG(xiaq): buffer.write drops ASCII control characters silently
return
}
wd := util.Wcwidth(r)
c := cell{r, byte(wd), style}
if b.col+wd > b.width {
b.newline()
b.appendCell(c)
} else {
b.appendCell(c)
if b.col == b.width && b.newlineWhenFull {
b.newline()
}
}
}
func (b *buffer) writes(s string, style string) {
for _, r := range s {
b.write(r, style)
}
}
func (b *buffer) writeStyled(s *styled) {
b.writes(s.text, s.style)
}
func (b *buffer) writeStyleds(ss []*styled) {
for _, s := range ss {
b.writeStyled(s)
}
}
func (b *buffer) writePadding(w int, style string) {
b.writes(strings.Repeat(" ", w), style)
}
func (b *buffer) line() int {
return len(b.cells) - 1
}
func (b *buffer) cursor() pos {
return pos{len(b.cells) - 1, b.col}
}
func (b *buffer) trimToLines(low, high int) {
for i := 0; i < low; i++ {
b.cells[i] = nil
}
for i := high; i < len(b.cells); i++ {
b.cells[i] = nil
}
b.cells = b.cells[low:high]
b.dot.line -= low
}
func widthOfCells(cells []cell) int {
w := 0
for _, c := range cells {
w += int(c.width)
}
return w
}
func lines(bufs ...*buffer) (l int) {
for _, buf := range bufs {
if buf != nil {
l += len(buf.cells)
}
}
return
}
func compareRows(r1, r2 []cell) (bool, int) {
for i, c := range r1 {
if i >= len(r2) || c != r2[i] {
return false, i
}
}
if len(r1) < len(r2) {
return false, len(r1)
}
return true, 0
}
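The soft-wrap rule in `buffer.write` — a cell that would overflow the current line starts a new one first — can be sketched standalone; the `layout` helper and the explicit widths slice (standing in for `util.Wcwidth`) are illustrative, not part of the package:

```go
package main

import "fmt"

// cell mirrors the cell type above: a rune plus its on-screen width.
type cell struct {
	r     rune
	width int
}

// layout sketches the soft-wrap decision in buffer.write: a cell whose
// width would push the column past maxWidth is placed on a fresh line.
func layout(runes []rune, widths []int, maxWidth int) [][]cell {
	lines := [][]cell{{}}
	col := 0
	for i, r := range runes {
		w := widths[i]
		if col+w > maxWidth {
			lines = append(lines, []cell{})
			col = 0
		}
		last := len(lines) - 1
		lines[last] = append(lines[last], cell{r, w})
		col += w
	}
	return lines
}

func main() {
	// Three double-width cells on a 5-column terminal wrap after the second.
	lines := layout([]rune("abc"), []int{2, 2, 2}, 5)
	fmt.Println(len(lines)) // → 2
}
```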
package main
type IntPair struct {
First, Second int
}
// O(w.h) time | O(w.h) space
// where W is the width of the matrix and H is the height
func MinimumPassesOfMatrix(matrix [][]int) int {
passes := convertNegatives(matrix)
if !containsNegative(matrix) {
return passes - 1
} else {
return -1
}
}
func containsNegative(matrix [][]int) bool {
for _, row := range matrix {
for _, value := range row {
if value < 0 {
return true
}
}
}
return false
}
// Note: dequeuing with queue[1:] re-slices in O(1) time, but it keeps the
// whole backing array reachable until the queue is dropped; a ring-buffer
// queue would avoid that. For our time complexity analysis we treat the
// dequeue as O(1).
// Alternatively, we could turn the queue into a stack and replace the
// dequeue with a stack pop, since visit order within a pass does not
// affect the answer.
func convertNegatives(matrix [][]int) int {
nextPassQueue := getAllPositivePositions(matrix)
var passes = 0
for len(nextPassQueue) > 0 {
currentPassQueue := nextPassQueue
nextPassQueue = make([]IntPair, 0)
for len(currentPassQueue) > 0 {
firstElement := currentPassQueue[0]
currentPassQueue = currentPassQueue[1:]
currentRow, currentCol := firstElement.First, firstElement.Second
adjacentPositions := getAdjacentPositions(currentRow, currentCol, matrix)
for _, position := range adjacentPositions {
row, col := position.First, position.Second
value := matrix[row][col]
if value < 0 {
matrix[row][col] *= -1
nextPassQueue = append(nextPassQueue, IntPair{row, col})
}
}
}
passes += 1
}
return passes
}
func getAllPositivePositions(matrix [][]int) []IntPair {
positivePositions := make([]IntPair, 0)
for row := range matrix {
for col := range matrix[row] {
value := matrix[row][col]
if value > 0 {
positivePositions = append(positivePositions, IntPair{row, col})
}
}
}
return positivePositions
}
func getAdjacentPositions(row, col int, matrix [][]int) []IntPair {
adjacentPositions := make([]IntPair, 0)
if row > 0 {
adjacentPositions = append(adjacentPositions, IntPair{row - 1, col})
}
if row < len(matrix)-1 {
adjacentPositions = append(adjacentPositions, IntPair{row + 1, col})
}
if col > 0 {
adjacentPositions = append(adjacentPositions, IntPair{row, col - 1})
}
if col < len(matrix[0])-1 {
adjacentPositions = append(adjacentPositions, IntPair{row, col + 1})
}
return adjacentPositions
}
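A condensed, self-contained version of the multi-source BFS above, runnable against a small sample matrix (the `minPasses` name and the sample input are illustrative):

```go
package main

import "fmt"

// minPasses condenses the algorithm above: seed the queue with all positive
// cells, then in each pass flip every negative cell adjacent to the current
// frontier. Returns -1 if some negative cell is unreachable.
func minPasses(m [][]int) int {
	type pt struct{ r, c int }
	queue := []pt{}
	for r := range m {
		for c := range m[r] {
			if m[r][c] > 0 {
				queue = append(queue, pt{r, c})
			}
		}
	}
	passes := 0
	dirs := []pt{{-1, 0}, {1, 0}, {0, -1}, {0, 1}}
	for len(queue) > 0 {
		next := []pt{}
		for _, p := range queue {
			for _, d := range dirs {
				r, c := p.r+d.r, p.c+d.c
				if r >= 0 && r < len(m) && c >= 0 && c < len(m[0]) && m[r][c] < 0 {
					m[r][c] *= -1
					next = append(next, pt{r, c})
				}
			}
		}
		queue = next
		passes++
	}
	for _, row := range m {
		for _, v := range row {
			if v < 0 {
				return -1 // some negative cell was never reached
			}
		}
	}
	return passes - 1 // the final pass converts nothing, so subtract it
}

func main() {
	m := [][]int{
		{0, -1, -3, 2, 0},
		{1, -2, -5, -1, -3},
		{3, 0, 0, -4, -1},
	}
	fmt.Println(minPasses(m)) // → 3
}
```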
/*
Package nestedpendingoperations is a modified implementation of
pkg/util/goroutinemap. It implements a data structure for managing go routines
by volume/pod name. It prevents the creation of new go routines if an existing
go routine for the volume already exists. It also allows multiple operations to
execute in parallel for the same volume as long as they are operating on
different pods.
*/
package nestedpendingoperations
import (
"fmt"
"sync"
v1 "k8s.io/api/core/v1"
"k8s.io/apimachinery/pkg/types"
k8sRuntime "k8s.io/apimachinery/pkg/util/runtime"
"k8s.io/klog/v2"
"k8s.io/kubernetes/pkg/util/goroutinemap/exponentialbackoff"
volumetypes "k8s.io/kubernetes/pkg/volume/util/types"
)
const (
// EmptyUniquePodName is a UniquePodName for empty string.
EmptyUniquePodName volumetypes.UniquePodName = volumetypes.UniquePodName("")
// EmptyUniqueVolumeName is a UniqueVolumeName for empty string
EmptyUniqueVolumeName v1.UniqueVolumeName = v1.UniqueVolumeName("")
// EmptyNodeName is a NodeName for empty string
EmptyNodeName types.NodeName = types.NodeName("")
)
// NestedPendingOperations defines the supported set of operations.
type NestedPendingOperations interface {
// Run adds the concatenation of volumeName, podName, and nodeName to the list
// of running operations and spawns a new go routine to run
// generatedOperations.
// volumeName, podName, and nodeName collectively form the operation key.
// The following forms of operation keys are supported (two keys are designed
// to be "matched" if we want to serialize their operations):
// - volumeName empty, podName and nodeName could be anything
// This key does not match with any keys.
// - volumeName exists, podName empty, nodeName empty
// This key matches all other keys with the same volumeName.
// - volumeName exists, podName exists, nodeName empty
// This key matches with:
// - the same volumeName and podName
// - the same volumeName, but empty podName
// - volumeName exists, podName empty, nodeName exists
// This key matches with:
// - the same volumeName and nodeName
// - the same volumeName but empty nodeName
// If there is no operation with a matching key, the operation is allowed to
// proceed.
// If an operation with a matching key exists and the previous operation is
// running, an AlreadyExists error is returned.
// If an operation with a matching key exists and the previous operation
// failed:
// - If the previous operation has the same
// generatedOperations.operationName:
// - If the full exponential backoff period is satisfied, the operation is
// allowed to proceed.
// - Otherwise, an ExponentialBackoff error is returned.
// - Otherwise, exponential backoff is reset and operation is allowed to
// proceed.
// Once the operation is complete, the go routine is terminated. If the
// operation succeeded, its corresponding key is removed from the list of
// executing operations, allowing a new operation to be started with the key
// without error. If it failed, the key remains and the exponential
// backoff status is updated.
Run(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName,
generatedOperations volumetypes.GeneratedOperations) error
// Wait blocks until all operations are completed. This is typically
// necessary during tests - the test should wait until all operations finish
// and evaluate results after that.
Wait()
// IsOperationPending returns true if an operation for the given volumeName
// and one of podName or nodeName is pending, otherwise it returns false
IsOperationPending(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName) bool
// IsOperationSafeToRetry returns false if an operation for the given volumeName
// and one of podName or nodeName is pending or in exponential backoff, otherwise it returns true
IsOperationSafeToRetry(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName, operationName string) bool
}
// NewNestedPendingOperations returns a new instance of NestedPendingOperations.
func NewNestedPendingOperations(exponentialBackOffOnError bool) NestedPendingOperations {
g := &nestedPendingOperations{
operations: []operation{},
exponentialBackOffOnError: exponentialBackOffOnError,
}
g.cond = sync.NewCond(&g.lock)
return g
}
type nestedPendingOperations struct {
operations []operation
exponentialBackOffOnError bool
cond *sync.Cond
lock sync.RWMutex
}
type operation struct {
key operationKey
operationName string
operationPending bool
expBackoff exponentialbackoff.ExponentialBackoff
}
func (grm *nestedPendingOperations) Run(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName,
generatedOperations volumetypes.GeneratedOperations) error {
grm.lock.Lock()
defer grm.lock.Unlock()
opKey := operationKey{volumeName, podName, nodeName}
opExists, previousOpIndex := grm.isOperationExists(opKey)
if opExists {
previousOp := grm.operations[previousOpIndex]
// Operation already exists
if previousOp.operationPending {
// Operation is pending
return NewAlreadyExistsError(opKey)
}
backOffErr := previousOp.expBackoff.SafeToRetry(fmt.Sprintf("%+v", opKey))
if backOffErr != nil {
if previousOp.operationName == generatedOperations.OperationName {
return backOffErr
}
// previous operation and new operation are different. reset op. name and exp. backoff
grm.operations[previousOpIndex].operationName = generatedOperations.OperationName
grm.operations[previousOpIndex].expBackoff = exponentialbackoff.ExponentialBackoff{}
}
// Update existing operation to mark as pending.
grm.operations[previousOpIndex].operationPending = true
grm.operations[previousOpIndex].key = opKey
} else {
// Create a new operation
grm.operations = append(grm.operations,
operation{
key: opKey,
operationPending: true,
operationName: generatedOperations.OperationName,
expBackoff: exponentialbackoff.ExponentialBackoff{},
})
}
go func() (eventErr, detailedErr error) {
// Handle unhandled panics (very unlikely)
defer k8sRuntime.HandleCrash()
// Handle completion of and error, if any, from operationFunc()
defer grm.operationComplete(opKey, &detailedErr)
return generatedOperations.Run()
}()
return nil
}
func (grm *nestedPendingOperations) IsOperationSafeToRetry(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName,
operationName string) bool {
grm.lock.RLock()
defer grm.lock.RUnlock()
opKey := operationKey{volumeName, podName, nodeName}
exist, previousOpIndex := grm.isOperationExists(opKey)
if !exist {
return true
}
previousOp := grm.operations[previousOpIndex]
if previousOp.operationPending {
return false
}
backOffErr := previousOp.expBackoff.SafeToRetry(fmt.Sprintf("%+v", opKey))
if backOffErr != nil {
if previousOp.operationName == operationName {
return false
}
}
return true
}
func (grm *nestedPendingOperations) IsOperationPending(
volumeName v1.UniqueVolumeName,
podName volumetypes.UniquePodName,
nodeName types.NodeName) bool {
grm.lock.RLock()
defer grm.lock.RUnlock()
opKey := operationKey{volumeName, podName, nodeName}
exist, previousOpIndex := grm.isOperationExists(opKey)
if exist && grm.operations[previousOpIndex].operationPending {
return true
}
return false
}
// isOperationExists is an internal helper; the caller must hold the lock.
func (grm *nestedPendingOperations) isOperationExists(key operationKey) (bool, int) {
// If volumeName is empty, operation can be executed concurrently
if key.volumeName == EmptyUniqueVolumeName {
return false, -1
}
for previousOpIndex, previousOp := range grm.operations {
volumeNameMatch := previousOp.key.volumeName == key.volumeName
podNameMatch := previousOp.key.podName == EmptyUniquePodName ||
key.podName == EmptyUniquePodName ||
previousOp.key.podName == key.podName
nodeNameMatch := previousOp.key.nodeName == EmptyNodeName ||
key.nodeName == EmptyNodeName ||
previousOp.key.nodeName == key.nodeName
if volumeNameMatch && podNameMatch && nodeNameMatch {
return true, previousOpIndex
}
}
return false, -1
}
func (grm *nestedPendingOperations) getOperation(key operationKey) (uint, error) {
// Assumes lock has been acquired by caller.
for i, op := range grm.operations {
if op.key.volumeName == key.volumeName &&
op.key.podName == key.podName &&
op.key.nodeName == key.nodeName {
return uint(i), nil
}
}
return 0, fmt.Errorf("operation %+v not found", key)
}
func (grm *nestedPendingOperations) deleteOperation(key operationKey) {
// Assumes lock has been acquired by caller.
opIndex := -1
for i, op := range grm.operations {
if op.key.volumeName == key.volumeName &&
op.key.podName == key.podName &&
op.key.nodeName == key.nodeName {
opIndex = i
break
}
}
if opIndex < 0 {
return
}
// Delete index without preserving order
grm.operations[opIndex] = grm.operations[len(grm.operations)-1]
grm.operations = grm.operations[:len(grm.operations)-1]
}
func (grm *nestedPendingOperations) operationComplete(key operationKey, err *error) {
// Deferred calls are executed in last-in, first-out order. Here the lock
// is acquired when operationComplete begins and released when the method
// finishes; only after the lock is released is cond signaled to wake a
// waiting goroutine.
defer grm.cond.Signal()
grm.lock.Lock()
defer grm.lock.Unlock()
if *err == nil || !grm.exponentialBackOffOnError {
// Operation completed without error, or exponentialBackOffOnError disabled
grm.deleteOperation(key)
if *err != nil {
// Log error
klog.Errorf("operation %+v failed with: %v", key, *err)
}
return
}
// Operation completed with error and exponentialBackOffOnError Enabled
existingOpIndex, getOpErr := grm.getOperation(key)
if getOpErr != nil {
// Failed to find existing operation
klog.Errorf("Operation %+v completed. error: %v. exponentialBackOffOnError is enabled, but failed to get operation to update.",
key,
*err)
return
}
grm.operations[existingOpIndex].expBackoff.Update(err)
grm.operations[existingOpIndex].operationPending = false
// Log error
klog.Errorf("%v", grm.operations[existingOpIndex].expBackoff.
GenerateNoRetriesPermittedMsg(fmt.Sprintf("%+v", key)))
}
func (grm *nestedPendingOperations) Wait() {
grm.lock.Lock()
defer grm.lock.Unlock()
for len(grm.operations) > 0 {
grm.cond.Wait()
}
}
type operationKey struct {
volumeName v1.UniqueVolumeName
podName volumetypes.UniquePodName
nodeName types.NodeName
}
// NewAlreadyExistsError returns a new instance of AlreadyExists error.
func NewAlreadyExistsError(key operationKey) error {
return alreadyExistsError{key}
}
// IsAlreadyExists returns true if an error returned from
// NestedPendingOperations indicates a new operation can not be started because
// an operation with the same operation name is already executing.
func IsAlreadyExists(err error) bool {
switch err.(type) {
case alreadyExistsError:
return true
default:
return false
}
}
// alreadyExistsError is the error returned by NestedPendingOperations when a
// new operation can not be started because an operation with the same operation
// name is already executing.
type alreadyExistsError struct {
operationKey operationKey
}
var _ error = alreadyExistsError{}
func (err alreadyExistsError) Error() string {
return fmt.Sprintf(
"Failed to create operation with name %+v. An operation with that name is already executing.",
err.operationKey)
}
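The key-matching rules documented on `Run` can be sketched as a standalone predicate; the plain-string `key` struct and `keysMatch` name are illustrative (the real code compares `v1.UniqueVolumeName` and friends against the `Empty*` sentinels):

```go
package main

import "fmt"

// key mirrors operationKey with plain strings; "" plays the role of the
// Empty* sentinels above.
type key struct{ volume, pod, node string }

// keysMatch sketches the matching rules documented on Run: an empty volume
// name never matches anything, and an empty pod or node name on either
// side acts as a wildcard for that component.
func keysMatch(a, b key) bool {
	if a.volume == "" || b.volume == "" {
		return false
	}
	if a.volume != b.volume {
		return false
	}
	podMatch := a.pod == "" || b.pod == "" || a.pod == b.pod
	nodeMatch := a.node == "" || b.node == "" || a.node == b.node
	return podMatch && nodeMatch
}

func main() {
	// A volume-wide operation blocks a per-pod operation on the same volume.
	fmt.Println(keysMatch(key{"vol1", "pod1", ""}, key{"vol1", "", ""})) // → true
	// Same volume but different pods may run in parallel.
	fmt.Println(keysMatch(key{"vol1", "pod1", ""}, key{"vol1", "pod2", ""})) // → false
}
```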
package riak
import (
"errors"
"fmt"
"reflect"
"strings"
"github.com/tpjg/goriakpbc/json"
)
/*
Make structs work like a Document Model, similar to how the Ruby based "ripple"
gem works. This is done by parsing the JSON data and mapping it to the struct's
fields. To enable easy integration with Ruby/ripple projects the struct "tag"
feature of Go is used to possibly get around the naming convention differences
between Go and Ruby (Uppercase starting letter required for export and
typically CamelCase versus underscores). Also it stores the model/struct name
as _type in Riak.
For example the following Ruby/Ripple class:
class Device
include Ripple::Document
property :ip, String
property :description, String
property :download_enabled, Boolean
end
can be mapped to the following Go class:
type Device struct {
riak.Model `riak:devices`
Download_enabled bool `riak:"download_enabled"`
Ip string `riak:"ip"`
Description string `riak:"description"`
}
Note that it is required to have a riak.Model field.
Also, if the Go field name matches the Ripple property name, the extra tag is
not needed (e.g. if the Ripple class above had "property :Ip, String").
*/
type Model struct {
robject *RObject
parent Resolver // Pointer to the parent struct (Device in example above)
}
type Resolver interface {
Resolve(int) error
}
// Error definitions
var (
ResolveNotImplemented = errors.New("Resolve not implemented")
DestinationError = errors.New("Destination is not a pointer (to a struct)")
DestinationIsNotModel = errors.New("Destination has no riak.Model field")
DestinationIsNotSlice = errors.New("Must supply a slice to GetSiblings")
DestinationLengthError = errors.New("Length of slice does not match number of siblings")
DestinationNotInitialized = errors.New("Destination struct is not initialized (correctly) using riak.New or riak.Load")
ModelDoesNotMatch = errors.New("Warning: struct name does not match _type in Riak")
ModelNotNew = errors.New("Destination struct already has an instantiated riak.Model (this struct is probably not new)")
NoSiblingData = errors.New("No non-empty sibling data")
)
/*
IsWarning reports whether an error is really just a warning, e.g. a common
json error, or ModelDoesNotMatch.
*/
func IsWarning(err error) bool {
if err != nil {
if strings.HasPrefix(err.Error(), "<nil> - json: cannot unmarshal") && strings.Contains(err.Error(), "into Go value of type") {
return true
} else if strings.HasPrefix(err.Error(), "<nil> - parsing") && strings.Contains(err.Error(), ": cannot parse") {
return true
} else if err == ModelDoesNotMatch {
return true
}
} else {
// In case there is no error reply true anyway since this is probably
// what is expected - a check whether it is safe to continue.
return true
}
return false
}
func (*Model) Resolve(count int) (err error) {
return ResolveNotImplemented
}
// Link to one other model
type One struct {
model interface{}
link Link
client *Client
}
// Link to many other models
type Many []One
func (c *Client) setOneLink(source Link, dest reflect.Value) {
if dest.Kind() == reflect.Struct && dest.Type() == reflect.TypeOf(One{}) {
one := One{link: source, client: c}
mv := reflect.ValueOf(one)
dest.Set(mv)
return
}
}
func (c *Client) addOneLink(source Link, dest reflect.Value) {
if dest.Kind() == reflect.Slice && dest.Type() == reflect.TypeOf(Many{}) {
one := One{link: source, client: c}
mv := reflect.ValueOf(one)
// Add this One link to the slice
dest.Set(reflect.Append(dest, mv))
}
}
// Sets up a riak.Model structure and assigns it to the given resolver field.
func setup_model(obj *RObject, dest Resolver, rm reflect.Value) {
model := Model{robject: obj, parent: dest}
mv := reflect.ValueOf(model)
rm.Set(mv)
}
// Check if the passed destination is a pointer to a struct with riak.Model field
// Returns the destination Value and Type (dv, dt) as well as the riak.Model field (rm)
// and the bucketname (bn), which is derived from the tag of the riak.Model field.
func check_dest(dest interface{}) (dv reflect.Value, dt reflect.Type, rm reflect.Value, bn string, err error) {
dv = reflect.ValueOf(dest)
if dv.Kind() != reflect.Ptr || dv.IsNil() {
err = DestinationError
return
}
dv = dv.Elem()
dt = reflect.TypeOf(dest)
dt = dt.Elem()
if dt.Kind() != reflect.Struct {
err = DestinationError
return
}
for i := 0; i < dt.NumField(); i++ {
dobj := dt.Field(i)
if dobj.Type.Kind() == reflect.Struct && dobj.Type == reflect.TypeOf(Model{}) {
bn = dt.Field(i).Tag.Get("riak")
rm = dv.Field(i) // Return the Model field value
return
}
}
err = DestinationIsNotModel
return
}
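The reflection scan in `check_dest` — find the field of type `riak.Model` and read the bucket name from its struct tag — can be sketched in isolation; the `bucketTag` helper and the local `Model`/`Device` types are stand-ins:

```go
package main

import (
	"fmt"
	"reflect"
)

// Model is a local stand-in for riak.Model.
type Model struct{}

// Device mirrors the example in the doc comment, using the conventional
// key:"value" tag form (the package also accepts a deprecated bare tag).
type Device struct {
	Model `riak:"devices"`
	Ip    string `riak:"ip"`
}

// bucketTag sketches the scan in check_dest: walk the struct's fields,
// find the one whose type is Model, and read the bucket name from its tag.
func bucketTag(dest interface{}) string {
	t := reflect.TypeOf(dest)
	if t.Kind() == reflect.Ptr {
		t = t.Elem()
	}
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		if f.Type == reflect.TypeOf(Model{}) {
			return f.Tag.Get("riak")
		}
	}
	return ""
}

func main() {
	fmt.Println(bucketTag(&Device{})) // → devices
}
```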
type modelName struct {
Type string `_type`
}
/*
mapData, maps the decoded JSON data onto the right struct fields, including
decoding of links.
*/
func (c *Client) mapData(dv reflect.Value, dt reflect.Type, data []byte, links []Link, dest interface{}) (err error) {
// Double check there is a "_type" field that is the same as the struct
// name, this is only a warning though.
var mn modelName
err = json.Unmarshal(data, &mn)
if err != nil || dt.Name() != mn.Type {
err = fmt.Errorf("Warning: struct name does not match _type in Riak (%v)", err)
}
// Unmarshal the destination model
jserr := json.Unmarshal(data, dest)
if jserr != nil {
err = fmt.Errorf("%v - %v", err, jserr) // Add error
}
// For all the links in the struct, find the correct mapping
for i := 0; i < dt.NumField(); i++ {
ft := dt.Field(i)
fv := dv.Field(i)
if ft.Type == reflect.TypeOf(One{}) {
var name string
if ft.Tag.Get("riak") != "" {
name = ft.Tag.Get("riak")
} else if string(ft.Tag) != "" && !strings.Contains(string(ft.Tag), `:"`) {
//DEPRECATED: use tag directly if it appears not to have key/values.
name = string(ft.Tag)
} else {
name = ft.Name
}
// Search in Links
for _, v := range links {
if v.Tag == name {
c.setOneLink(v, fv)
}
}
} else if ft.Type == reflect.TypeOf(Many{}) {
var name string
if ft.Tag.Get("riak") != "" {
name = ft.Tag.Get("riak")
} else if string(ft.Tag) != "" && !strings.Contains(string(ft.Tag), `:"`) {
//DEPRECATED: use tag directly if it appears not to have key/values.
name = string(ft.Tag)
} else {
name = ft.Name
}
// Search in Links
for _, v := range links {
if v.Tag == name {
c.addOneLink(v, fv)
}
}
}
}
return
}
func (m *Model) GetSiblings(dest interface{}) (err error) {
client, err := m.getClient()
if err != nil {
return err
}
// Check if a slice is supplied
v := reflect.ValueOf(dest)
if v.Kind() == reflect.Ptr && !v.IsNil() {
v = v.Elem()
}
if v.Kind() != reflect.Slice {
return DestinationIsNotSlice
}
// Check the length of the supplied slice against the number of non-empty siblings
count := 0
for _, s := range m.robject.Siblings {
if len(s.Data) != 0 {
count += 1
}
}
if count == 0 {
return NoSiblingData
}
if v.Len() != count {
return DestinationLengthError
}
// Check the type of the slice elements
if reflect.TypeOf(v.Index(0).Interface()) != reflect.TypeOf(m.parent).Elem() {
return fmt.Errorf("Slice elements incorrect, must be %v", reflect.TypeOf(m.parent).Elem())
}
count = 0
// Double check the parent and get the Value and Type
_, dt, _, _, err := check_dest(m.parent)
if err != nil {
return err
}
// Walk over the slice and map the data for each sibling
for _, sibling := range m.robject.Siblings {
if len(sibling.Data) != 0 {
// Map the data onto a temporary struct
tmp := reflect.New(dt)
err = client.mapData(tmp.Elem(), dt, sibling.Data, sibling.Links, tmp.Interface())
// Copy the temporary struct to the slice element
v.Index(count).Set(tmp.Elem())
count += 1
}
}
return
}
/*
The LoadModelFrom function retrieves the data from Riak and stores it in the struct
that is passed as destination. It stores some necessary information in the
riak.Model field so it can be used later in other (Save) operations.
If the bucketname is empty ("") it'll be the default bucket, based on the
riak.Model tag.
Using the "Device" struct as an example:
dev := &Device{}
err := client.Load("devices", "12345", dev)
*/
func (c *Client) LoadModelFrom(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
// Check destination
dv, dt, rm, bn, err := check_dest(dest)
if err != nil {
return err
}
// Use default bucket name if empty
if bucketname == "" {
bucketname = bn
}
// Fetch the object from Riak.
bucket, err := c.Bucket(bucketname)
if bucket == nil || err != nil {
err = fmt.Errorf("Can't get bucket for %v - %v", dt.Name(), err)
return
}
obj, err := bucket.Get(key, options...)
if err != nil {
if obj != nil {
// Set the values in the riak.Model field
setup_model(obj, dest, rm)
}
return err
}
if obj == nil {
return NotFound
}
if obj.Conflict() {
// Count number of non-empty siblings for which a conflict must be resolved
count := 0
for _, s := range obj.Siblings {
if len(s.Data) != 0 {
count += 1
}
}
// Set the RObject in the destination struct so it can be used for resolving the conflict
setup_model(obj, dest, rm)
// Resolve the conflict and return the errorcode
return dest.Resolve(count)
}
// Map the data onto the struct.
err = c.mapData(dv, dt, obj.Data, obj.Links, dest)
// Set the values in the riak.Model field
setup_model(obj, dest, rm)
return
}
// Load data into model. DEPRECATED, use LoadModelFrom instead.
func (c *Client) Load(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
return c.LoadModelFrom(bucketname, key, dest, options...)
}
// Load data into the model using the default bucket (from the Model's struct definition)
func (c *Client) LoadModel(key string, dest Resolver, options ...map[string]uint32) (err error) {
return c.LoadModelFrom("", key, dest, options...)
}
/*
Create a new Document Model, passing in the bucketname and key. The key can be
empty in which case Riak will pick a key. The destination must be a pointer to
a struct that has the riak.Model field.
If the bucketname is empty the default bucketname, based on the riak.Model tag
will be used.
*/
func (c *Client) NewModelIn(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
// Check destination
_, dt, rm, bn, err := check_dest(dest)
if err != nil {
return err
}
// Use default bucket name if empty
if bucketname == "" {
bucketname = bn
}
// Fetch the object from Riak.
bucket, err := c.Bucket(bucketname)
if bucket == nil || err != nil {
err = fmt.Errorf("Can't get bucket for %v - %v", dt.Name(), err)
return
}
// Check if the RObject field within riak.Model is still nill, otherwise
// this destination (dest) is probably an already fully instantiated
// struct.
model := &Model{}
mv := reflect.ValueOf(model)
mv = mv.Elem()
mv.Set(rm)
if model.robject != nil {
return ModelNotNew
}
// For the riak.Model field within the struct, set the Client and Bucket
// and fields and set the RObject field to nil.
model.robject = &RObject{Bucket: bucket, Key: key, ContentType: "application/json", Options: options}
model.parent = dest
rm.Set(mv)
return
}
// Instantiate a new model, setting the necessary fields, like the client.
// If key is not empty that key will be used, otherwise Riak will choose a key.
func (c *Client) NewModel(key string, dest Resolver, options ...map[string]uint32) (err error) {
return c.NewModelIn("", key, dest, options...)
}
// Create a new model. DEPRECATED, use NewModelIn instead.
func (c *Client) New(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
return c.NewModelIn(bucketname, key, dest, options...)
}
// Creates a link to a given model
func (c *Client) linkToModel(dest interface{}) (link Link, err error) {
// Check destination
_, _, rm, _, err := check_dest(dest)
if err != nil {
return
}
// Get the Model field
model := &Model{}
mv := reflect.ValueOf(model)
mv = mv.Elem()
mv.Set(rm)
// Now check if there is an RObject, otherwise probably not correctly instantiated with .New (or Load).
if model.robject == nil {
err = DestinationNotInitialized
return
}
link = Link{Bucket: model.robject.Bucket.name, Key: model.robject.Key}
return
}
// SaveAs saves a Document Model to Riak under a new key; if the key is empty, Riak will choose one
func (c *Client) SaveAs(newKey string, dest Resolver) (err error) {
// Check destination
dv, dt, rm, _, err := check_dest(dest)
if err != nil {
return err
}
// Get the Model field
model := &Model{}
mv := reflect.ValueOf(model)
mv = mv.Elem()
mv.Set(rm)
// Now check if there is an RObject, otherwise probably not correctly instantiated with .New (or Load).
if model.robject == nil {
return DestinationNotInitialized
}
// JSON encode the entire struct
data, err := json.Marshal(dest)
if err != nil {
return err
}
// Clear the old links
model.robject.Links = []Link{}
// Now add the Links
for i := 0; i < dt.NumField(); i++ {
ft := dt.Field(i)
fv := dv.Field(i)
var fieldname string
if ft.Tag.Get("riak") != "" {
fieldname = ft.Tag.Get("riak")
} else if string(ft.Tag) != "" && !strings.Contains(string(ft.Tag), `:"`) {
//DEPRECATED: use tag directly if it appears not to have key/values.
fieldname = string(ft.Tag)
} else {
fieldname = ft.Name
}
if ft.Type == reflect.TypeOf(One{}) {
// Save a link, set the One struct first
lmodel := &One{}
lmv := reflect.ValueOf(lmodel)
lmv = lmv.Elem()
lmv.Set(fv)
// If the link is not set, create it now.
if lmodel.link.Bucket == "" || lmodel.link.Key == "" {
lmodel.link, _ = c.linkToModel(lmodel.model)
}
// Add the link (if not already in the object's links)
model.robject.AddLink(Link{lmodel.link.Bucket, lmodel.link.Key, fieldname})
}
if ft.Type == reflect.TypeOf(Many{}) {
// Save the links, create a Many struct first
lmodels := &Many{}
lmv := reflect.ValueOf(lmodels)
lmv = lmv.Elem()
lmv.Set(fv)
// Now walk over those links...
for _, lmodel := range *lmodels {
// If the link is not set, create it now.
if lmodel.link.Bucket == "" || lmodel.link.Key == "" {
lmodel.link, _ = c.linkToModel(lmodel.model)
}
// Add the link (if not already in the object's links)
model.robject.AddLink(Link{lmodel.link.Bucket, lmodel.link.Key, fieldname})
}
}
}
//fmt.Printf("Saving data for %v as %v\n", dt.Name(), string(data))
model.robject.Data = data
model.robject.ContentType = "application/json"
if newKey != "" {
model.robject.Key = newKey
}
// Store the RObject in Riak
err = model.robject.Store()
return
}
// Save a Document Model to Riak
func (c *Client) Save(dest Resolver) (err error) {
return c.SaveAs("", dest)
}
// Get the client from a given model
func (m *Model) getClient() (c *Client, err error) {
if m.robject == nil {
return nil, DestinationNotInitialized
}
if m.robject.Bucket == nil {
return nil, DestinationNotInitialized
}
if m.robject.Bucket.client == nil {
return nil, DestinationNotInitialized
}
return m.robject.Bucket.client, nil
}
// Save a Document Model to Riak under a new key; if newKey is empty, a key will be chosen by Riak
func (m *Model) SaveAs(newKey string) (err error) {
client, err := m.getClient()
if err != nil {
return err
}
return client.SaveAs(newKey, m.parent)
}
// Save a Document Model to Riak
func (m *Model) Save() (err error) {
return m.SaveAs("")
}
// Delete a Document Model
func (m *Model) Delete() (err error) {
return m.robject.Destroy()
}
// Reload a Document Model
func (m *Model) Reload() (err error) {
vclock := string(m.robject.Vclock)
err = m.robject.Reload()
if err != nil {
return err
}
// Check if there was any change
if string(m.robject.Vclock) != vclock {
// Set the content again
if m.robject.Conflict() {
// Count number of non-empty siblings for which a conflict must be resolved
count := 0
for _, s := range m.robject.Siblings {
if len(s.Data) != 0 {
count += 1
}
}
// Resolve the conflict and return the errorcode
return m.parent.Resolve(count)
}
// Map the data onto the struct.
dv, dt, _, _, err := check_dest(m.parent)
if err != nil {
return err
}
c, err := m.getClient()
if err != nil {
return err
}
err = c.mapData(dv, dt, m.robject.Data, m.robject.Links, m.parent)
if err != nil {
return err
}
}
return
}
// Return the object Vclock - this allows an application to detect whether Reload()
// loaded a newer version of the object
func (m *Model) Vclock() (vclock []byte) {
return m.robject.Vclock
}
// Return the object's indexes. This allows an application to set custom secondary
// indexes on the object for later querying.
func (m *Model) Indexes() map[string][]string {
if m.robject.Indexes == nil {
m.robject.Indexes = make(map[string][]string)
}
return m.robject.Indexes
}
// Get a model's Key, e.g. needed when Riak has picked it
func (c *Client) Key(dest interface{}) (key string, err error) {
// Check destination
_, _, rm, _, err := check_dest(dest)
if err != nil {
return
}
// Get the Model field
model := &Model{}
mv := reflect.ValueOf(model)
mv = mv.Elem()
mv.Set(rm)
// Now check if there is an RObject, otherwise probably not correctly instantiated with .New (or Load).
if model.robject == nil {
err = errors.New("Destination struct is not instantiated using riak.New or riak.Load")
return
}
return model.robject.Key, nil
}
// Get a model's Key, e.g. needed when Riak has picked it
func (m Model) Key() (key string) {
if m.robject == nil {
return ""
}
return m.robject.Key
}
// Set the Key value; note that this does not save the model, it only changes the data structure
func (c *Client) SetKey(newKey string, dest interface{}) (err error) {
// Check destination
_, _, rm, _, err := check_dest(dest)
if err != nil {
return
}
// Get the Model field
model := &Model{}
mv := reflect.ValueOf(model)
mv = mv.Elem()
mv.Set(rm)
// Now check if there is an RObject, otherwise probably not correctly instantiated with .New (or Load).
if model.robject == nil {
return DestinationNotInitialized
}
model.robject.Key = newKey
return
}
// Set the Key value; note that this does not save the model, it only changes the data structure
func (m Model) SetKey(newKey string) (err error) {
if m.robject == nil {
return DestinationNotInitialized
}
m.robject.Key = newKey
return
}
func (o One) Link() (link Link) {
return o.link
}
// Set the link to a given Model (dest)
func (o *One) Set(dest Resolver) (err error) {
_, _, _, _, err = check_dest(dest)
if err == nil {
o.model = dest
o.link, err = o.client.linkToModel(dest)
}
return
}
// Set a link directly
func (o *One) SetLink(a One) {
o.link = a.link
}
// Test for equality (bucket and key are equal)
func (o *One) Equal(e One) bool {
return o.link.Bucket == e.link.Bucket && o.link.Key == e.link.Key
}
// Test if the link is empty (not set)
func (o *One) Empty() bool {
return o.link.Bucket == "" && o.link.Key == ""
}
func (o *One) Get(dest Resolver) (err error) {
if o.client == nil {
return DestinationNotInitialized
}
return o.client.Load(o.link.Bucket, o.link.Key, dest)
}
// Add a Link to the given Model (dest)
func (m *Many) Add(dest Resolver) (err error) {
_, _, _, _, err = check_dest(dest)
if err == nil {
o := One{model: dest}
o.link, err = o.client.linkToModel(dest)
*m = append(*m, o)
}
return err
}
// Add a given Link (One) directly
func (m *Many) AddLink(o One) {
*m = append(*m, o)
}
// Remove a Link to the given Model (dest)
func (m *Many) Remove(dest Resolver) (err error) {
_, _, _, _, err = check_dest(dest)
if err == nil {
o := One{model: dest}
o.link, err = o.client.linkToModel(dest)
for i, v := range *m {
if v.link.Bucket == o.link.Bucket && v.link.Key == o.link.Key {
// Remove this element from the list
*m = append((*m)[:i], (*m)[i+1:]...)
return err
}
}
} else {
return err
}
return NotFound
}
// Remove a given Link (One) directly, e.g. so it can be used when iterating over a riak.Many slice
func (m *Many) RemoveLink(o One) (err error) {
for i, v := range *m {
if v.link.Bucket == o.link.Bucket && v.link.Key == o.link.Key {
// Remove this element from the list
*m = append((*m)[:i], (*m)[i+1:]...)
return err
}
}
return NotFound
}
// Return the number of Links
func (m *Many) Len() int {
return len(*m)
}
// Return whether a given Link is in the riak.Many slice
func (m *Many) Contains(o One) bool {
for _, v := range *m {
if v.Equal(o) {
return true
}
}
return false
}
// Instantiate a new model (using the default client), setting the necessary fields, like the client.
// If key is not empty that key will be used, otherwise Riak will choose a key.
func NewModel(key string, dest Resolver, options ...map[string]uint32) (err error) {
if defaultClient == nil {
return NoDefaultClientConnection
}
return defaultClient.NewModelIn("", key, dest, options...)
}
// Create a new Document Model (using the default client), passing in the bucketname and key.
func NewModelIn(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
if defaultClient == nil {
return NoDefaultClientConnection
}
return defaultClient.NewModelIn(bucketname, key, dest, options...)
}
// Load data into the model using the default bucket (from the Model's struct definition)
func LoadModel(key string, dest Resolver, options ...map[string]uint32) (err error) {
if defaultClient == nil {
return NoDefaultClientConnection
}
return defaultClient.LoadModel(key, dest, options...)
}
// The LoadModelFrom function retrieves the data from Riak using the default client and stores it in the struct
// that is passed as destination.
func LoadModelFrom(bucketname string, key string, dest Resolver, options ...map[string]uint32) (err error) {
if defaultClient == nil {
return NoDefaultClientConnection
}
return defaultClient.LoadModelFrom(bucketname, key, dest, options...)
}
func init() {
json.SkipTypes[reflect.TypeOf(Model{})] = true
json.SkipTypes[reflect.TypeOf(One{})] = true
json.SkipTypes[reflect.TypeOf(Many{})] = true
}
// End of model.go
package ca_ES_VALENCIA
import "github.com/MaxSlyugrov/cldr"
var currencies = []cldr.Currency{
{Currency: "AFA", DisplayName: "afgani afganés (1927–2002)", Symbol: ""},
{Currency: "AFN", DisplayName: "afgani afganés", Symbol: ""},
{Currency: "ALK", DisplayName: "lek albanés (1946–1965)", Symbol: ""},
{Currency: "ALL", DisplayName: "lek albanés", Symbol: ""},
{Currency: "AZM", DisplayName: "manat azerbaidjanés (1993–2006)", Symbol: ""},
{Currency: "AZN", DisplayName: "manat azerbaidjanés", Symbol: ""},
{Currency: "CLE", DisplayName: "escut xilé", Symbol: ""},
{Currency: "CLP", DisplayName: "peso xilé", Symbol: ""},
{Currency: "CNX", DisplayName: "dòlar del Banc Popular Xinés", Symbol: ""},
{Currency: "CNY", DisplayName: "iuan xinés", Symbol: ""},
{Currency: "FIM", DisplayName: "marc finlandés", Symbol: ""},
{Currency: "FRF", DisplayName: "franc francés", Symbol: ""},
{Currency: "HUF", DisplayName: "fòrint hongarés", Symbol: ""},
{Currency: "JPY", DisplayName: "ien japonés", Symbol: ""},
{Currency: "LUC", DisplayName: "franc convertible luxemburgués", Symbol: ""},
{Currency: "LUF", DisplayName: "franc luxemburgués", Symbol: ""},
{Currency: "LUL", DisplayName: "franc financer luxemburgués", Symbol: ""},
{Currency: "MZE", DisplayName: "escut moçambiqués", Symbol: ""},
{Currency: "MZM", DisplayName: "antic metical moçambiqués", Symbol: ""},
{Currency: "MZN", DisplayName: "metical moçambiqués", Symbol: ""},
{Currency: "NLG", DisplayName: "florí neerlandés", Symbol: ""},
{Currency: "NZD", DisplayName: "dòlar neozelandés", Symbol: ""},
{Currency: "PLN", DisplayName: "zloty polonés", Symbol: ""},
{Currency: "PLZ", DisplayName: "zloty polonés (1950–1995)", Symbol: ""},
{Currency: "PTE", DisplayName: "escut portugués", Symbol: ""},
{Currency: "ROL", DisplayName: "antic leu romanés", Symbol: ""},
{Currency: "RON", DisplayName: "leu romanés", Symbol: ""},
{Currency: "RWF", DisplayName: "franc rwandés", Symbol: ""},
{Currency: "SIT", DisplayName: "tolar eslové", Symbol: ""},
{Currency: "THB", DisplayName: "baht tailandés", Symbol: ""},
{Currency: "UAK", DisplayName: "karbóvanets ucraïnés", Symbol: ""},
{Currency: "UGS", DisplayName: "xíling ugandés (1966–1987)", Symbol: ""},
{Currency: "UGX", DisplayName: "xíling ugandés", Symbol: ""},
{Currency: "XFO", DisplayName: "franc or francés", Symbol: ""},
{Currency: "XFU", DisplayName: "franc UIC francés", Symbol: ""},
{Currency: "ZWD", DisplayName: "dòlar zimbabués (1980–2008)", Symbol: ""},
{Currency: "ZWL", DisplayName: "dòlar zimbabués (2009)", Symbol: ""},
{Currency: "ZWR", DisplayName: "dòlar zimbabués (2008)", Symbol: ""},
}
// End of resources/locales/ca_ES_VALENCIA/currency.go
package stats
import "math"
// Series is a container for a series of data
type Series []Coordinate
// Coordinate holds the data in a series
type Coordinate struct {
X, Y float64
}
// LinearRegression finds the least squares linear regression on data series
func LinearRegression(s Series) (regressions Series, err error) {
if len(s) == 0 {
return nil, EmptyInputErr
}
// Placeholder for the math to be done
var sum [5]float64
// Loop over data keeping index in place
i := 0
for ; i < len(s); i++ {
sum[0] += s[i].X
sum[1] += s[i].Y
sum[2] += s[i].X * s[i].X
sum[3] += s[i].X * s[i].Y
sum[4] += s[i].Y * s[i].Y
}
// Find gradient and intercept
f := float64(i)
gradient := (f*sum[3] - sum[0]*sum[1]) / (f*sum[2] - sum[0]*sum[0])
intercept := (sum[1] / f) - (gradient * sum[0] / f)
// Create the new regression series
for j := 0; j < len(s); j++ {
regressions = append(regressions, Coordinate{
X: s[j].X,
Y: s[j].X*gradient + intercept,
})
}
return regressions, nil
}
// ExponentialRegression returns an exponential regression on data series
func ExponentialRegression(s Series) (regressions Series, err error) {
if len(s) == 0 {
return nil, EmptyInputErr
}
var sum [6]float64
for i := 0; i < len(s); i++ {
if s[i].Y < 0 {
return nil, YCoordErr
}
sum[0] += s[i].X
sum[1] += s[i].Y
sum[2] += s[i].X * s[i].X * s[i].Y
sum[3] += s[i].Y * math.Log(s[i].Y)
sum[4] += s[i].X * s[i].Y * math.Log(s[i].Y)
sum[5] += s[i].X * s[i].Y
}
denominator := (sum[1]*sum[2] - sum[5]*sum[5])
a := math.Pow(math.E, (sum[2]*sum[3]-sum[5]*sum[4])/denominator)
b := (sum[1]*sum[4] - sum[5]*sum[3]) / denominator
for j := 0; j < len(s); j++ {
regressions = append(regressions, Coordinate{
X: s[j].X,
Y: a * math.Exp(b*s[j].X),
})
}
return regressions, nil
}
// LogarithmicRegression returns a logarithmic regression on data series
func LogarithmicRegression(s Series) (regressions Series, err error) {
if len(s) == 0 {
return nil, EmptyInputErr
}
var sum [4]float64
i := 0
for ; i < len(s); i++ {
sum[0] += math.Log(s[i].X)
sum[1] += s[i].Y * math.Log(s[i].X)
sum[2] += s[i].Y
sum[3] += math.Pow(math.Log(s[i].X), 2)
}
f := float64(i)
a := (f*sum[1] - sum[2]*sum[0]) / (f*sum[3] - sum[0]*sum[0])
b := (sum[2] - a*sum[0]) / f
for j := 0; j < len(s); j++ {
regressions = append(regressions, Coordinate{
X: s[j].X,
Y: b + a*math.Log(s[j].X),
})
}
return regressions, nil
}
// End of tools/vendor/github.com/montanaflynn/stats/regression.go
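LinearRegression above derives the gradient and intercept from four running sums; for points that lie exactly on a line the fit should reproduce that line. A self-contained sketch of the same least-squares sums, independent of the stats package (the `fitLine` helper is illustrative):

```go
package main

import "fmt"

// fitLine returns the least-squares gradient and intercept using the
// same accumulated sums as LinearRegression: Σx, Σy, Σx², Σxy.
func fitLine(xs, ys []float64) (gradient, intercept float64) {
	var sx, sy, sxx, sxy float64
	for i := range xs {
		sx += xs[i]
		sy += ys[i]
		sxx += xs[i] * xs[i]
		sxy += xs[i] * ys[i]
	}
	n := float64(len(xs))
	gradient = (n*sxy - sx*sy) / (n*sxx - sx*sx)
	intercept = sy/n - gradient*sx/n
	return
}

func main() {
	// Points on y = 2x + 1: the fit should recover gradient 2, intercept 1.
	g, b := fitLine([]float64{1, 2, 3}, []float64{3, 5, 7})
	fmt.Printf("gradient=%.1f intercept=%.1f\n", g, b)
}
```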
package backend
import (
"context"
"github.com/baudtime/baudtime/msg"
"github.com/prometheus/prometheus/pkg/labels"
)
type Backend interface {
Queryable
// StartTime returns the oldest timestamp stored in the storage.
StartTime() (int64, error)
// Appender returns a new appender against the storage.
Appender() (Appender, error)
// Close closes the storage and all its underlying resources.
Close() error
}
// A Queryable handles queries against a storage.
type Queryable interface {
// Querier returns a new Querier on the storage.
Querier(ctx context.Context, mint, maxt int64) (Querier, error)
}
// Querier provides reading access to time series data.
type Querier interface {
// Select returns a set of series that matches the given label matchers.
Select(*SelectHints, ...*labels.Matcher) (SeriesSet, error)
// LabelValues returns all potential values for a label name.
LabelValues(string, ...*labels.Matcher) ([]string, error)
LabelNames(...*labels.Matcher) ([]string, error)
// Close releases the resources of the Querier.
Close() error
}
// Appender provides batched appends against a storage.
type Appender interface {
Add(l []msg.Label, t int64, v float64, hash uint64) error
Flush() error
}
// SelectHints specifies hints passed for data selections.
// This is used only as an option for implementation to use.
type SelectHints struct {
Start int64 // Start time in milliseconds for this select.
End int64 // End time in milliseconds for this select.
Step int64 // Query step size in milliseconds.
Func string // String representation of surrounding function or aggregation.
Grouping []string // List of label names used in aggregation.
By bool // Indicate whether it is without or by.
Range int64 // Range vector selector range in milliseconds.
OnlyLabels bool // Query meta data only, no samples, only labels
}
// SeriesSet contains a set of series.
type SeriesSet interface {
Next() bool
At() Series
Err() error
}
// Series represents a single time series.
type Series interface {
// Labels returns the complete set of labels identifying the series.
Labels() labels.Labels
// Iterator returns a new iterator of the data of the series.
Iterator() SeriesIterator
}
// SeriesIterator iterates over the data of a time series.
type SeriesIterator interface {
// Seek advances the iterator forward to the value at or after
// the given timestamp.
Seek(t int64) bool
// At returns the current timestamp/value pair.
At() (t int64, v float64)
// Next advances the iterator by one.
Next() bool
// Err returns the current error.
Err() error
}
// QueryableFunc is an adapter to allow the use of ordinary functions as
// Queryables. It follows the idea of http.HandlerFunc.
type QueryableFunc func(ctx context.Context, mint, maxt int64) (Querier, error)
// Querier calls f() with the given parameters.
func (f QueryableFunc) Querier(ctx context.Context, mint, maxt int64) (Querier, error) {
return f(ctx, mint, maxt)
}
// errSeriesSet implements SeriesSet, just returning an error.
type errSeriesSet struct {
err error
}
func (errSeriesSet) Next() bool {
return false
}
func (errSeriesSet) At() Series {
return nil
}
func (e errSeriesSet) Err() error {
return e.err
}
var emptySeriesSet = errSeriesSet{}
// EmptySeriesSet returns a series set that's always empty.
func EmptySeriesSet() SeriesSet {
return emptySeriesSet
}
// End of backend/interface.go