GUI and Desktop Applications | A_Id | Networking and APIs | Python Basics and Environment | Other | Database and SQL | Available Count | is_accepted | Q_Score | CreationDate | Users Score | AnswerCount | System Administration and DevOps | Title | Q_Id | Score | Tags | Answer | Question | Web Development | Data Science and Machine Learning | ViewCount |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 3,188,680 | 0 | 0 | 0 | 0 | 2 | false | 6 | 2010-07-02T16:58:00.000 | 0 | 3 | 1 | Picking a front-end/interpreter for a scientific code | 3,167,661 | 0 | c++,python,matlab,tcl,interpreter | Well, unless there are any other suggestions, the final answer I have arrived at is to go with Python.
I seriously considered matlab/octave, but when reading the octave API and matlab API, they are different enough that I'd need to build separate interfaces for each (or get very creative with macros). With python I en... | The simulation tool I have developed over the past couple of years, is written in C++ and currently has a tcl interpreted front-end. It was written such that it can be run either in an interactive shell, or by passing an input file. Either way, the input file is written in tcl (with many additional simulation-specifi... | 0 | 1 | 453 |
0 | 3,168,060 | 0 | 0 | 0 | 0 | 2 | false | 6 | 2010-07-02T16:58:00.000 | 3 | 3 | 1 | Picking a front-end/interpreter for a scientific code | 3,167,661 | 0.197375 | c++,python,matlab,tcl,interpreter | Have you considered using Octave? From what I gather, it is nearly a drop-in replacement for much of matlab. This might allow you to support matlab for those who have it, and a free alternative for those who don't. Since the "meat" of your program appears to be written in another language, the performance considera... | The simulation tool I have developed over the past couple of years, is written in C++ and currently has a tcl interpreted front-end. It was written such that it can be run either in an interactive shell, or by passing an input file. Either way, the input file is written in tcl (with many additional simulation-specifi... | 0 | 1 | 453 |
0 | 3,177,037 | 0 | 0 | 0 | 0 | 3 | false | 14 | 2010-07-05T02:25:00.000 | 1 | 5 | 0 | What is a good first-implementation for learning machine learning? | 3,176,967 | 0.039979 | python,computer-science,artificial-intelligence,machine-learning | Decision tree. It is frequently used in classification tasks and has a lot of variants. Tom Mitchell's book is a good reference to implement it. | I find learning new topics comes best with an easy implementation to code to get the idea. This is how I learned genetic algorithms and genetic programming. What would be some good introductory programs to write to get started with machine learning?
Preferably, let any referenced resources be accessible online so the... | 0 | 1 | 5,717 |
0 | 3,182,779 | 0 | 0 | 0 | 0 | 3 | false | 14 | 2010-07-05T02:25:00.000 | 1 | 5 | 0 | What is a good first-implementation for learning machine learning? | 3,176,967 | 0.039979 | python,computer-science,artificial-intelligence,machine-learning | Neural nets may be the easiest thing to implement first, and they're fairly thoroughly covered throughout literature. | I find learning new topics comes best with an easy implementation to code to get the idea. This is how I learned genetic algorithms and genetic programming. What would be some good introductory programs to write to get started with machine learning?
Preferably, let any referenced resources be accessible online so the... | 0 | 1 | 5,717 |
0 | 3,177,077 | 0 | 0 | 0 | 0 | 3 | false | 14 | 2010-07-05T02:25:00.000 | -8 | 5 | 0 | What is a good first-implementation for learning machine learning? | 3,176,967 | -1 | python,computer-science,artificial-intelligence,machine-learning | There is something called books; are you familiar with those? When I was exploring AI two decades ago, there were many books. I guess now that the internet exists, books are archaic, but you can probably find some in an ancient library. | I find learning new topics comes best with an easy implementation to code to get the idea. This is how I learned genetic algorithms and genetic programming. What would be some good introductory programs to write to get started with machine learning?
Preferably, let any referenced resources be accessible online so the... | 0 | 1 | 5,717 |
0 | 3,207,845 | 0 | 0 | 0 | 0 | 1 | true | 5 | 2010-07-07T10:07:00.000 | 2 | 2 | 0 | AdaBoost ML algorithm python implementation | 3,193,756 | 1.2 | python,machine-learning,adaboost | Thanks a million Steve! In fact, your suggestion had some compatibility issues with MacOSX (a particular library was incompatible with the system) BUT it helped me find out a more interesting package : icsi.boost.macosx. I am just denoting that in case any Mac-eter finds it interesting!
Thank you again!
Tim | Is there anyone that has some ideas on how to implement the AdaBoost (Boostexter) algorithm in python?
Cheers! | 0 | 1 | 4,289 |
0 | 3,227,905 | 0 | 0 | 0 | 0 | 2 | false | 25 | 2010-07-12T10:57:00.000 | 2 | 5 | 0 | Determine height of Coffee in the pot using Python imaging | 3,227,843 | 0.07983 | python,image-processing | First do thresholding, then segmentation. Then you can more easily detect edges. | We have a web-cam in our office kitchenette focused at our coffee maker. The coffee pot is clearly visible. Both the location of the coffee pot and the camera are static. Is it possible to calculate the height of coffee in the pot using image recognition? I've seen image recognition used for quite complex stuff like fa... | 0 | 1 | 1,297 |
0 | 3,231,205 | 0 | 0 | 0 | 0 | 2 | false | 25 | 2010-07-12T10:57:00.000 | 0 | 5 | 0 | Determine height of Coffee in the pot using Python imaging | 3,227,843 | 0 | python,image-processing | Take pictures of the pot with different levels of coffee in it.
Downsample the image to maybe 4*10 pixels.
Do the same in a loop for each new live picture.
Calculate the difference of each pixel's value compared to the reference images.
Take the reference image with the smallest difference sum and you get the state of you... | We have a web-cam in our office kitchenette focused at our coffee maker. The coffee pot is clearly visible. Both the location of the coffee pot and the camera are static. Is it possible to calculate the height of coffee in the pot using image recognition? I've seen image recognition used for quite complex stuff like fa... | 0 | 1 | 1,297 |
0 | 3,242,538 | 0 | 0 | 0 | 0 | 1 | false | 27 | 2010-07-13T23:37:00.000 | 1 | 6 | 0 | Interpolation over an irregular grid | 3,242,382 | 0.033321 | python,numpy,scipy,interpolation | There's a bunch of options here; which one is best will depend on your data...
However I don't know of an out-of-the-box solution for you
You say your input data is from tripolar data. There are three main cases for how this data could be structured.
Sampled from a 3d grid in tripolar space, projected back to 2d LAT, ... | So, I have three numpy arrays which store latitude, longitude, and some property value on a grid -- that is, I have LAT(y,x), LON(y,x), and, say temperature T(y,x), for some limits of x and y. The grid isn't necessarily regular -- in fact, it's tripolar.
I then want to interpolate these property (temperature) values o... | 0 | 1 | 28,537 |
0 | 32,098,823 | 0 | 0 | 0 | 0 | 1 | false | 184 | 2010-07-26T17:25:00.000 | 2 | 10 | 0 | Numpy matrix to array | 3,337,301 | 0.039979 | python,arrays,matrix,numpy | First, Mv = numpy.asarray(M.T), which gives you a 4x1 but 2D array.
Then, perform A = Mv[0,:], which gives you what you want. You could put them together, as numpy.asarray(M.T)[0,:]. | I am using numpy. I have a matrix with 1 column and N rows and I want to get an array from with N elements.
For example, if I have M = matrix([[1], [2], [3], [4]]), I want to get A = array([1,2,3,4]).
To achieve it, I use A = np.array(M.T)[0]. Does anyone know a more elegant way to get the same result?
Thanks! | 0 | 1 | 319,127 |
0 | 3,352,172 | 0 | 0 | 0 | 0 | 1 | false | 6 | 2010-07-28T10:28:00.000 | 1 | 3 | 0 | Good framework for live charting in Python? | 3,351,963 | 0.066568 | python,live,charts | I haven't worked with Matplotlib, but I've always found gnuplot to be adequate for all my charting needs.
You have the option of calling gnuplot from python or using gnuplot.py
(gnuplot-py.sourceforge.net) to interface to gnuplot. | I am working on a Python application that involves running regression analysis on live data, and charting both. That is, the application gets fed with live data, and the regression models re-calculates as the data updates. Please note that I want to plot both the input (the data) and output (the regression analysis) in... | 0 | 1 | 5,782 |
0 | 3,355,927 | 0 | 0 | 1 | 0 | 2 | false | 4 | 2010-07-28T17:52:00.000 | 2 | 6 | 0 | Suggestions for passing large table between Python and C# | 3,355,832 | 0.066568 | c#,python,file | You may consider running IronPython - then you can pass values back and forth across C#/Python | I have a C# application that needs to be run several thousand times. Currently it precomputes a large table of constant values at the start of the run for reference. As these values will be the same from run to run I would like to compute them independently in a simple python script and then just have the C# app impo... | 0 | 1 | 1,046 |
0 | 3,356,036 | 0 | 0 | 1 | 0 | 2 | false | 4 | 2010-07-28T17:52:00.000 | 1 | 6 | 0 | Suggestions for passing large table between Python and C# | 3,355,832 | 0.033321 | c#,python,file | CSV is a fine suggestion, but it may be clumsy with values being int and double. Generally tab or semicolon are the best separators. | I have a C# application that needs to be run several thousand times. Currently it precomputes a large table of constant values at the start of the run for reference. As these values will be the same from run to run I would like to compute them independently in a simple python script and then just have the C# app impo... | 0 | 1 | 1,046 |
0 | 3,357,139 | 0 | 0 | 1 | 0 | 1 | false | 2 | 2010-07-28T18:06:00.000 | 0 | 2 | 0 | What is the best way to load a CCITT T.3 compressed tiff using python? | 3,355,962 | 0 | python,compression,tiff,imaging,image-formats | How about running tiffcp with subprocess to convert to LZW (-c lzw switch), then process normally with pylibtiff? There are Windows builds of tiffcp lying around on the web. Not exactly Python-native solution, but still... | I am trying to load a CCITT T.3 compressed tiff into python, and get the pixel matrix from it. It should just be a logical matrix.
I have tried using pylibtiff and PIL, but when I load it with them, the matrix it returns is empty. I have read in a lot of places that these two tools support loading CCITT but not access... | 0 | 1 | 1,297 |
0 | 3,368,847 | 0 | 0 | 0 | 0 | 1 | true | 7 | 2010-07-30T04:43:00.000 | 6 | 2 | 0 | Zooming With Python Image Library | 3,368,740 | 1.2 | python,image-processing,python-imaging-library | You would be much better off using the EXTENT rather than the AFFINE method. You only need to calculate two things: what part of the input you want to see, and how large it should be. For example, if you want to see the whole image scaled down to half size (i.e. zooming out by 2), you'd pass the data (0, 0, im.size[0],... | I'm writing a simple application in Python which displays images.I need to implement Zoom In and Zoom Out by scaling the image.
I think the Image.transform method will be able to do this, but I'm not sure how to use it, since it's asking for an affine matrix or something like that :P
Here's the quote from the docs:
im... | 0 | 1 | 7,941 |
0 | 3,374,079 | 0 | 0 | 0 | 0 | 1 | false | 27 | 2010-07-30T14:30:00.000 | 0 | 5 | 0 | Profile Memory Allocation in Python (with support for Numpy arrays) | 3,372,444 | 0 | python,numpy,memory-management,profile | Can you just save/pickle some of the arrays to disk in tmp files when not using them? That's what I've had to do in the past with large arrays. Of course this will slow the program down, but at least it'll finish. Unless you need them all at once? | I have a program that contains a large number of objects, many of them Numpy arrays. My program is swapping miserably, and I'm trying to reduce the memory usage, because it actually can't finish on my system with the current memory requirements.
I am looking for a nice profiler that would allow me to check the amount of... | 0 | 1 | 4,821 |
0 | 20,171,517 | 0 | 0 | 0 | 0 | 1 | false | 104 | 2010-08-05T10:39:00.000 | 4 | 7 | 0 | How to create an empty R vector to add new items | 3,413,879 | 0.113791 | python,r,vector,rpy2 | As pointed out by Brani, vector() is a solution, e.g.
newVector <- vector(mode = "numeric", length = 50)
will return a vector named "newVector" with 50 "0"'s as initial values. It is also fairly common to just add the new scalar to an existing vector to arrive at an expanded vector, e.g.
aVector <- c(aVector, newScalar... | I want to use R in Python, as provided by the module Rpy2. I notice that R has very convenient [] operations by which you can extract the specific columns or lines. How could I achieve such a function by Python scripts?
My idea is to create an R vector and add those wanted elements into this vector so that the final ve... | 0 | 1 | 291,651 |
0 | 4,092,003 | 0 | 0 | 0 | 0 | 1 | false | 12 | 2010-08-10T11:23:00.000 | -5 | 4 | 0 | PDF image in PDF document using ReportLab (Python) | 3,448,365 | -1 | python,image,pdf,reportlab | Use from reportlab.graphics import renderPDF | I saved some plots from matplotlib into a pdf format because it seems to offer a better quality. How do I include the PDF image into a PDF document using ReportLab? The convenience method Image(filepath) does not work for this format.
Thank you. | 0 | 1 | 7,751 |
0 | 3,494,982 | 0 | 1 | 0 | 0 | 2 | true | 31 | 2010-08-16T12:29:00.000 | 33 | 6 | 0 | Convert image to a matrix in python | 3,493,092 | 1.2 | python,image-processing,numpy,python-imaging-library | scipy.misc.imread() will return a Numpy array, which is handy for lots of things. | I want to do some image processing using Python.
Is there a simple way to import .png image as a matrix of greyscale/RGB values (possibly using PIL)? | 0 | 1 | 77,563 |
0 | 50,263,426 | 0 | 1 | 0 | 0 | 2 | false | 31 | 2010-08-16T12:29:00.000 | 7 | 6 | 0 | Convert image to a matrix in python | 3,493,092 | 1 | python,image-processing,numpy,python-imaging-library | scipy.misc.imread() is deprecated now. We can use imageio.imread instead of that to read it as a Numpy array | I want to do some image processing using Python.
Is there a simple way to import .png image as a matrix of greyscale/RGB values (possibly using PIL)? | 0 | 1 | 77,563 |
0 | 3,520,715 | 0 | 0 | 0 | 0 | 2 | false | 1 | 2010-08-19T10:07:00.000 | 1 | 2 | 0 | Can scipy calculate (double) integrals with complex-valued integrands (real and imaginary parts in integrand)? | 3,520,672 | 0.099668 | python,scipy | Yes. Those integrals (I'll assume they're area integrals over a region in 2D space) can be calculated using an appropriate quadrature rule.
You can also use Green's theorem to convert them into contour integrals and use Gaussian quadrature to integrate along the path. | (Couldn't upload the picture showing the integral as I'm a new user.) | 0 | 1 | 571 |
0 | 3,526,740 | 0 | 0 | 0 | 0 | 2 | false | 1 | 2010-08-19T10:07:00.000 | 0 | 2 | 0 | Can scipy calculate (double) integrals with complex-valued integrands (real and imaginary parts in integrand)? | 3,520,672 | 0 | python,scipy | Thanks duffymo!
I am calculating Huygens-Fresnel diffraction integrals: plane and other wave diffraction through circular (2D) apertures in polar coordinates.
As far as the programming goes: Currently a lot of my code is in Mathematica. I am considering changing to one of: scipy, java + flanagan math library, java + ap... | (Couldn't upload the picture showing the integral as I'm a new user.) | 0 | 1 | 571 |
0 | 3,605,012 | 0 | 0 | 0 | 0 | 2 | true | 0 | 2010-08-26T15:00:00.000 | 1 | 2 | 0 | Python/Numpy error: NULL result without error in PyObject_Call | 3,576,430 | 1.2 | python,numpy | It appears that this may have been an error from using the 32-bit version of NumPy and not the 64 bit. For whatever reason, though the program has no problem keeping the array in memory, it trips up when writing the array to a file if the number of elements in the array is greater than 2^32. | I've never seen this error before, and none of the hits on Google seem to apply. I've got a very large NumPy array that holds Boolean values. When I try writing the array using numpy.dump(), I get the following error:
SystemError: NULL result without error in PyObject_Call
The array is initialized with all False values... | 0 | 1 | 3,309 |
0 | 3,576,712 | 0 | 0 | 0 | 0 | 2 | false | 0 | 2010-08-26T15:00:00.000 | 1 | 2 | 0 | Python/Numpy error: NULL result without error in PyObject_Call | 3,576,430 | 0.099668 | python,numpy | That message comes directly from the CPython interpreter (see abstract.c method PyObject_Call). You may get a better response on a Python or NumPy mailing list regarding that error message because it looks like a problem in C code.
Write a simple example demonstrating the problem and you should be able to narrow th... | I've never seen this error before, and none of the hits on Google seem to apply. I've got a very large NumPy array that holds Boolean values. When I try writing the array using numpy.dump(), I get the following error:
SystemError: NULL result without error in PyObject_Call
The array is initialized with all False values... | 0 | 1 | 3,309 |
0 | 21,390,123 | 0 | 0 | 0 | 0 | 3 | false | 6 | 2010-09-03T21:18:00.000 | 3 | 7 | 0 | Importing SPSS dataset into Python | 3,639,639 | 0.085505 | python,import,dataset,spss | Option 1
As rkbarney pointed out, there is the Python savReaderWriter available via pypi. I've run into two issues:
It relies on a lot of extra libraries beyond the seemingly pure-python implementation. SPSS files are read and written in nearly every case by the IBM provided SPSS I/O modules. These modules differ b... | Is there any way to import SPSS dataset into Python, preferably NumPy recarray format?
I have looked around but could not find any answer.
Joon | 0 | 1 | 9,605 |
0 | 3,691,267 | 0 | 0 | 0 | 0 | 3 | false | 6 | 2010-09-03T21:18:00.000 | 1 | 7 | 0 | Importing SPSS dataset into Python | 3,639,639 | 0.028564 | python,import,dataset,spss | To be clear, the SPSS ODBC driver does not require an SPSS installation. | Is there any way to import SPSS dataset into Python, preferably NumPy recarray format?
I have looked around but could not find any answer.
Joon | 0 | 1 | 9,605 |
0 | 3,640,019 | 0 | 0 | 0 | 0 | 3 | false | 6 | 2010-09-03T21:18:00.000 | 3 | 7 | 0 | Importing SPSS dataset into Python | 3,639,639 | 0.085505 | python,import,dataset,spss | SPSS has an extensive integration with Python, but that is meant to be used with SPSS (now known as IBM SPSS Statistics). There is an SPSS ODBC driver that could be used with Python ODBC support to read a sav file. | Is there any way to import SPSS dataset into Python, preferably NumPy recarray format?
I have looked around but could not find any answer.
Joon | 0 | 1 | 9,605 |
0 | 3,650,761 | 0 | 0 | 1 | 0 | 1 | false | 67 | 2010-09-06T09:04:00.000 | 25 | 3 | 0 | Are NumPy's math functions faster than Python's? | 3,650,194 | 1 | python,performance,numpy | You should use numpy functions to deal with numpy's types and regular python functions to deal with regular python types.
Worst performance usually occurs when mixing python builtins with numpy, because of type conversions. Those type conversions have been optimized lately, but it's still often better not to use them.... | I have a function defined by a combination of basic math functions (abs, cosh, sinh, exp, ...).
I was wondering if it makes a difference (in speed) to use, for example,
numpy.abs() instead of abs()? | 0 | 1 | 39,385 |
0 | 3,686,359 | 0 | 0 | 0 | 0 | 1 | false | 4 | 2010-09-09T21:49:00.000 | 1 | 1 | 0 | Compensate for Auto White Balance with OpenCV | 3,680,829 | 0.197375 | python,opencv,webcam,touchscreen,background-subtraction | You could try interfacing your camera through DirectShow and turning off Auto White Balance in your code, or you could first try the camera software deployed with it. It often gives you the ability to make certain adjustments, such as white balance and similar settings. | I'm working on an app that takes in webcam data, applies various transformations, blurs and then does a background subtraction and threshold filter. It's a type of optical touch screen retrofitting system (the design is so different that tbeta/touchlib can't be used).
The camera's white balance is screwing up the thres... | 0 | 1 | 4,888 |
0 | 3,688,556 | 0 | 1 | 0 | 0 | 2 | false | 5 | 2010-09-10T19:29:00.000 | 0 | 6 | 0 | Storing an inverted index | 3,687,715 | 0 | python,information-retrieval,inverted-index | You could store the repr() of the dictionary and use that to re-create it. | I am working on a project on Info Retrieval.
I have made a Full Inverted Index using Hadoop/Python.
Hadoop outputs the index as (word,documentlist) pairs which are written on the file.
For a quick access, I have created a dictionary(hashtable) using the above file.
My question is, how do I store such an index on disk ... | 0 | 1 | 3,666 |
0 | 5,341,353 | 0 | 1 | 0 | 0 | 2 | false | 5 | 2010-09-10T19:29:00.000 | 0 | 6 | 0 | Storing an inverted index | 3,687,715 | 0 | python,information-retrieval,inverted-index | I am using anydbm for that purpose. Anydbm provides the same dictionary-like interface, except it allows only strings as keys and values. But this is not a constraint, since you can use cPickle's loads/dumps to store more complex structures in the index. | I am working on a project on Info Retrieval.
I have made a Full Inverted Index using Hadoop/Python.
Hadoop outputs the index as (word,documentlist) pairs which are written on the file.
For a quick access, I have created a dictionary(hashtable) using the above file.
My question is, how do I store such an index on disk ... | 0 | 1 | 3,666 |
0 | 19,403,571 | 0 | 0 | 0 | 0 | 1 | false | 2 | 2010-09-11T22:55:00.000 | 0 | 4 | 0 | How to find the average of multiple columns in a file using python | 3,692,996 | 0 | python | Less of an answer than it is an alternative understanding of the problem:
You could think of each line being a vector. In this way, the average done column-by-column is just the average of each of these vectors. All you need in order to do this is
A way to read a line into a vector object,
A vector addition operatio... | Hi I have a file that consists of too many columns to open in excel. Each column has 10 rows of numerical values 0-2 and has a row saying the title of the column. I would like the output to be the name of the column and the average value of the 10 rows. The file is too large to open in excel 2000 so I have to try using... | 0 | 1 | 5,272 |
0 | 3,704,637 | 0 | 0 | 0 | 0 | 1 | false | 24 | 2010-09-13T21:29:00.000 | 20 | 4 | 0 | In Python small floats tending to zero | 3,704,570 | 1 | python,floating-point,numerical-stability | Would it be possible to do your work in a logarithmic space? (For example, instead of storing 1e-320, just store -320, and use addition instead of multiplication) | I have a Bayesian Classifier programmed in Python, the problem is that when I multiply the features probabilities I get VERY small float values like 2.5e-320 or something like that, and suddenly it turns into 0.0. The 0.0 is obviously of no use to me since I must find the "best" class based on which class returns the M... | 0 | 1 | 17,743 |
0 | 3,762,217 | 0 | 0 | 1 | 0 | 2 | false | 4 | 2010-09-21T15:45:00.000 | 2 | 5 | 0 | Python vs. C++ for an application that does sparse linear algebra | 3,761,994 | 0.07983 | c++,python,linear-algebra | I don't have directly applicable experience, but the scipy/numpy operations are almost all implemented in C. As long as most of what you need to do is expressed in terms of scipy/numpy functions, then your code shouldn't be much slower than equivalent C/C++. | I'm writing an application where quite a bit of the computational time will be devoted to performing basic linear algebra operations (add, multiply, multiply by vector, multiply by scalar, etc.) on sparse matrices and vectors. Up to this point, we've built a prototype using C++ and the Boost matrix library.
I'm consid... | 0 | 1 | 3,064 |
0 | 3,762,759 | 0 | 0 | 1 | 0 | 2 | false | 4 | 2010-09-21T15:45:00.000 | 1 | 5 | 0 | Python vs. C++ for an application that does sparse linear algebra | 3,761,994 | 0.039979 | c++,python,linear-algebra | Speed nowadays is no longer an issue for python, since ctypes and cython emerged. What's brilliant about cython is that you write python code and it generates C code, without requiring you to know a single line of C, and then compiles it to a library; you could even create a standalone. Ctypes is also similar, though a... | I'm writing an application where quite a bit of the computational time will be devoted to performing basic linear algebra operations (add, multiply, multiply by vector, multiply by scalar, etc.) on sparse matrices and vectors. Up to this point, we've built a prototype using C++ and the Boost matrix library.
I'm consid... | 0 | 1 | 3,064 |
0 | 3,792,494 | 0 | 0 | 1 | 0 | 2 | false | 8 | 2010-09-25T04:12:00.000 | 2 | 5 | 0 | Among MATLAB and Python, which one is good for statistical analysis? | 3,792,465 | 0.07983 | python,matlab,statistics,analysis | SciPy, NumPy and Matplotlib. | Which one among the two languages is good for statistical analysis? What are the pros and cons, other than accessibility, for each? | 0 | 1 | 1,096 |
0 | 3,792,582 | 0 | 0 | 1 | 0 | 2 | false | 8 | 2010-09-25T04:12:00.000 | 3 | 5 | 0 | Among MATLAB and Python, which one is good for statistical analysis? | 3,792,465 | 0.119427 | python,matlab,statistics,analysis | I would pick Python because it can be as powerful as Matlab but is free. Also, you can distribute your applications for free, with no licensing chains.
Matlab is awesome and expensive (it has a great statistical package) and it will go smoother than Python in the beginning, but not so in the long run.
Now, if you real... | Which one among the two languages is good for statistical analysis? What are the pros and cons, other than accessibility, for each? | 0 | 1 | 1,096 |
0 | 3,796,237 | 0 | 1 | 0 | 0 | 1 | false | 20 | 2010-09-25T17:25:00.000 | 2 | 3 | 0 | Ruby generators vs Python generators | 3,794,762 | 0.132549 | python,ruby,generator,enumerator | Generators are stack based; Ruby's Enumerators are often specialised (at the interpreter level) and not stack based. | I've been researching the similarities/differences between Ruby and Python generators (known as Enumerators in Ruby), and as far as I can tell they're pretty much equivalent.
However one difference i've noticed is that Python Generators support a close() method whereas Ruby Generators do not. From the Python docs the ... | 0 | 1 | 4,732 |
0 | 3,819,063 | 0 | 0 | 0 | 0 | 1 | false | 1 | 2010-09-29T05:10:00.000 | 0 | 2 | 0 | using file/db as the buffer for very big numpy array to yield data prevent overflow? | 3,818,881 | 0 | python,memory-management,numpy | If you have matrices with lots of zeros, use scipy.sparse.csc_matrix.
It's possible to write everything; for example, you can override the numarray array class. | In using the numpy.ndarray, I met a memory overflow problem due to the size of the data, for example:
Suppose I have a 100000000 * 100000000 * 100000000 float64 array data source; when I want to read the data and process it in memory with np, it will raise a MemoryError because it uses up all the memory for storing such a big ar... | 0 | 1 | 337 |
1 | 3,855,400 | 0 | 0 | 0 | 0 | 1 | false | 1 | 2010-10-04T00:29:00.000 | 0 | 3 | 0 | How to draw polygons with Point2D in wxPython? | 3,852,146 | 0 | python,wxpython | DC's only use integers. Try using Cairo or wx.GraphicsContext. | I have input values of x, y, z coordinates in the following format:
[-11.235865 5.866001 -4.604924]
[-11.262565 5.414276 -4.842384]
[-11.291885 5.418229 -4.849229]
[-11.235865 5.866001 -4.604924]
I want to draw polygons and succeeded with making a list of wx.point objects. But I need to plot floating point coordinates ... | 0 | 1 | 1,320 |
0 | 3,859,736 | 0 | 1 | 1 | 0 | 1 | false | 20 | 2010-10-04T13:12:00.000 | 4 | 10 | 0 | Fastest way to sort in Python | 3,855,537 | 0.07983 | python,arrays,performance,sorting | Radix sort theoretically runs in linear time (sort time grows roughly in direct proportion to array size), but in practice Quicksort is probably more suited, unless you're sorting absolutely massive arrays.
If you want to make quicksort a bit faster, you can use insertion sort when the array size becomes small.
It wo... | What is the fastest way to sort an array of whole integers bigger than 0 and less than 100000 in Python? But not using the built-in functions like sort.
I'm looking at the possibility of combining 2 sort functions depending on input size. | 0 | 1 | 57,933 |
0 | 25,062,330 | 0 | 1 | 0 | 0 | 1 | false | 23 | 2010-10-07T22:19:00.000 | 3 | 3 | 0 | Display array as raster image in python | 3,886,281 | 0.197375 | python,image,image-processing,numpy | Quick addition: for displaying with matplotlib, if you want the image to appear "raster", i.e., pixelized without smoothing, then you should include the option interpolation='nearest' in the call to imshow. | I've got a numpy array in Python and I'd like to display it on-screen as a raster image. What is the simplest way to do this? It doesn't need to be particularly fancy or have a nice interface, all I need to do is to display the contents of the array as a greyscale raster image.
I'm trying to transition some of my IDL c... | 0 | 1 | 41,992 |
0 | 18,177,268 | 0 | 0 | 0 | 0 | 1 | false | 13 | 2010-10-08T13:48:00.000 | 1 | 6 | 0 | Select cells randomly from NumPy array - without replacement | 3,891,180 | 0.033321 | python,random,numpy,shuffle,sampling | People using numpy version 1.7 or later can also use the built-in function numpy.random.choice. | I'm writing some modelling routines in NumPy that need to select cells randomly from a NumPy array and do some processing on them. All cells must be selected without replacement (as in, once a cell has been selected it can't be selected again, but all cells must be selected by the end).
I'm transitioning from IDL where... | 0 | 1 | 15,426 |
0 | 3,893,461 | 0 | 0 | 0 | 0 | 1 | false | 4 | 2010-10-08T14:40:00.000 | 0 | 3 | 0 | Clustering problem | 3,891,645 | 0 | python,algorithm,cluster-analysis,classification,nearest-neighbor | If your number of clusters is fixed and you only want to maximize the number of points that are in these clusters, then I think a greedy solution would be good:
find the rectangle that can contain the maximum number of points,
remove these points,
find the next rectangle
...
So how to find the rectangle of maximum ... | I've been tasked to find N clusters containing the most points for a certain data set given that the clusters are bounded by a certain size. Currently, I am attempting to do this by plugging in my data into a kd-tree, iterating over the data and finding its nearest neighbor, and then merging the points if the cluster t... | 0 | 1 | 1,423 |
0 | 3,934,387 | 0 | 1 | 0 | 0 | 1 | false | 0 | 2010-10-14T13:56:00.000 | 2 | 2 | 0 | build recent numpy on recent ubuntu | 3,933,923 | 0.197375 | python,ubuntu,numpy | One way to try, which isn't guaranteed to work but is worth a shot, is to see if uupdate can successfully update the package. Get a tarball of numpy 1.5. Run "apt-get source numpy", which should fetch and unpack the current source from ubuntu. cd into this source directory and run "uupdate ../numpytarballname". This sho... | How do I build numpy 1.5 on ubuntu 10.10?
The instructions I found seem outdated or unclear.
Thanks | 0 | 1 | 1,563 |
0 | 4,572,785 | 0 | 0 | 0 | 0 | 2 | false | 9 | 2010-10-15T21:16:00.000 | 2 | 5 | 0 | Migrating from Stata to Python | 3,946,219 | 0.07983 | python,statistics,numpy,stata | Use Rpy2 and call the R var package. | Some coworkers who have been struggling with Stata 11 are asking for my help to try to automate their laborious work. They mainly use 3 commands in Stata:
tsset (sets a time series analysis)
as in: tsset year_column, yearly
varsoc (Obtain lag-order selection statistics for VARs)
as in: varsoc column_a column_b
ve... | 0 | 1 | 1,936 |
0 | 3,946,648 | 0 | 0 | 0 | 0 | 2 | false | 9 | 2010-10-15T21:16:00.000 | 0 | 5 | 0 | Migrating from Stata to Python | 3,946,219 | 0 | python,statistics,numpy,stata | I have absolutely no clue what any of those do, but NumPy and SciPy. Maybe Sage or SymPy. | Some coworkers who have been struggling with Stata 11 are asking for my help to try to automate their laborious work. They mainly use 3 commands in Stata:
tsset (sets a time series analysis)
as in: tsset year_column, yearly
varsoc (Obtain lag-order selection statistics for VARs)
as in: varsoc column_a column_b
ve... | 0 | 1 | 1,936 |
0 | 3,964,945 | 0 | 0 | 0 | 0 | 1 | false | 15 | 2010-10-17T20:12:00.000 | 1 | 5 | 0 | Image analysis in R | 3,955,077 | 0.039979 | python,image,r,analysis | Try the rgdal package. You will be able to read (import) and write (export) GeoTiff image files from/to R.
Marcio Pupin Mello | I would like to know how I would go about performing image analysis in R. My goal is to convert images into matrices (pixel-wise information), extract/quantify color, estimate the presence of shapes and compare images based on such metrics/patterns.
I am aware of relevant packages available in Python (suggestions relev... | 0 | 1 | 4,953 |
0 | 3,993,156 | 0 | 0 | 0 | 0 | 1 | true | 7 | 2010-10-22T00:50:00.000 | 7 | 3 | 0 | What does ... mean in numpy code? | 3,993,125 | 1.2 | python,numpy | Yes, you're right. It fills in as many : as required. The only difference occurs when you use multiple ellipses. In that case, the first ellipsis acts in the same way, but each remaining one is converted to a single :. | And what is it called? I don't know how to search for it; I tried calling it ellipsis with the Google. I don't mean in interactive output when dots are used to indicate that the full array is not being shown, but as in the code I'm looking at,
xTensor0[...] = xVTensor[..., 0]
From my experimentation, it appears to fu... | 0 | 1 | 1,449 |
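The `...` literal is Python's built-in Ellipsis object; NumPy's indexing machinery expands it into as many `:` slices as needed. A stdlib-only sketch (the KeyEcho class is mine) shows what the subscript actually passes along:

```python
class KeyEcho:
    """Tiny stand-in that just reports what __getitem__ receives."""
    def __getitem__(self, key):
        return key

a = KeyEcho()

# ... is the builtin Ellipsis object; multiple subscripts arrive as a tuple.
key = a[..., 0]

# NumPy expands the Ellipsis into as many ':' slices as needed,
# so for a 3-D array x, x[..., 0] means x[:, :, 0].
```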
0 | 53,562,948 | 0 | 0 | 0 | 0 | 1 | false | 1,721 | 2010-10-22T12:48:00.000 | 3 | 22 | 0 | Generate random integers between 0 and 9 | 3,996,904 | 0.027266 | python,random,integer | This is more of a mathematical approach but it works 100% of the time:
Let's say you want to use the random.random() function to generate a number between a and b. To achieve this, just do the following:
num = (b-a)*random.random() + a;
Of course, you can generate more numbers. | How can I generate random integers between 0 and 9 (inclusive) in Python?
For example, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 | 0 | 1 | 2,497,091 |
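Note that (b-a)*random.random() + a yields floats, while the question asks for integers with both endpoints included; the standard library's random.randint and random.randrange are the direct tools for that:

```python
import random

# Inclusive bounds: randint(0, 9) can return both 0 and 9.
draws = [random.randint(0, 9) for _ in range(1000)]

# randrange uses an exclusive upper bound, so this is equivalent:
more = [random.randrange(10) for _ in range(1000)]
```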
0 | 4,004,384 | 0 | 0 | 0 | 0 | 3 | false | 5 | 2010-10-23T11:58:00.000 | 0 | 4 | 0 | How to build a conceptual search engine? | 4,003,840 | 0 | python,search,lucene,nlp,lsa | First, write a piece of Python code that returns pineapple, orange, papaya when you input apple, by focusing on the "is-a" relation of a semantic network. Then continue with the "has-a" relationship, and so on.
I think at the end, you might get a sufficient piece of code for a school project. | I would like to build an internal search engine (I have a very large collection of thousands of XML files) that is able to map queries to concepts. For example, if I search for "big cats", I would want highly ranked results to return documents with "large cats" as well. But I may also be interested in having it retur... | 0 | 1 | 1,997 |
0 | 4,004,024 | 0 | 0 | 0 | 0 | 3 | false | 5 | 2010-10-23T11:58:00.000 | 1 | 4 | 0 | How to build a conceptual search engine? | 4,003,840 | 0.049958 | python,search,lucene,nlp,lsa | This is an incredibly hard problem and it can't be solved in a way that would always produce adequate results. I'd suggest to stick to some very simple principles instead so that the results are at least predictable. I think you need 2 things: some basic morphology engine plus a dictionary of synonyms.
Whenever a searc... | I would like to build an internal search engine (I have a very large collection of thousands of XML files) that is able to map queries to concepts. For example, if I search for "big cats", I would want highly ranked results to return documents with "large cats" as well. But I may also be interested in having it retur... | 0 | 1 | 1,997 |
0 | 4,004,314 | 0 | 0 | 0 | 0 | 3 | true | 5 | 2010-10-23T11:58:00.000 | 9 | 4 | 0 | How to build a conceptual search engine? | 4,003,840 | 1.2 | python,search,lucene,nlp,lsa | I'm not sure how to integrate that into a search engine. Could I use Lucene to do this? How?
Step 1. Stop.
Step 2. Get something to work.
Step 3. By then, you'll understand more about Python and Lucene and other tools and ways you might integrate them.
Don't start by trying to solve integration problems. Software ... | I would like to build an internal search engine (I have a very large collection of thousands of XML files) that is able to map queries to concepts. For example, if I search for "big cats", I would want highly ranked results to return documents with "large cats" as well. But I may also be interested in having it retur... | 0 | 1 | 1,997 |
0 | 4,023,046 | 0 | 0 | 0 | 0 | 3 | false | 8 | 2010-10-26T10:42:00.000 | 0 | 6 | 0 | Pytables vs. CSV for files that are not very large | 4,022,887 | 0 | python,csv,pytables | These are not "exclusive" choices.
You need both.
CSV is just a data exchange format. If you use pytables, you still need to import and export in CSV format. | I recently came across Pytables and find it to be very cool. It is clear that they are superior to a csv format for very large data sets. I am running some simulations using python. The output is not so large, say 200 columns and 2000 rows.
If someone has experience with both, can you suggest which format would be mor... | 0 | 1 | 3,418 |
0 | 7,753,331 | 0 | 0 | 0 | 0 | 3 | false | 8 | 2010-10-26T10:42:00.000 | 2 | 6 | 0 | Pytables vs. CSV for files that are not very large | 4,022,887 | 0.066568 | python,csv,pytables | One big plus for PyTables is the storage of metadata, like variables etc.
If you run the simulations repeatedly with different parameters, you then store the results as an array entry in the h5 file.
We use it to store measurement data + experiment scripts to get the data so it is all self contained.
BTW: If you need to ... | I recently came across Pytables and find it to be very cool. It is clear that they are superior to a csv format for very large data sets. I am running some simulations using python. The output is not so large, say 200 columns and 2000 rows.
If someone has experience with both, can you suggest which format would be mor... | 0 | 1 | 3,418 |
0 | 4,024,016 | 0 | 0 | 0 | 0 | 3 | false | 8 | 2010-10-26T10:42:00.000 | 1 | 6 | 0 | Pytables vs. CSV for files that are not very large | 4,022,887 | 0.033321 | python,csv,pytables | I think it's very hard to compare PyTables and CSV: PyTables is a data structure, while CSV is an exchange format for data. | I recently came across Pytables and find it to be very cool. It is clear that they are superior to a csv format for very large data sets. I am running some simulations using python. The output is not so large, say 200 columns and 2000 rows.
If someone has experience with both, can you suggest which format would be mor... | 0 | 1 | 3,418 |
0 | 4,066,155 | 0 | 0 | 0 | 0 | 1 | false | 6 | 2010-11-01T01:16:00.000 | 2 | 4 | 0 | Proper data structure to represent a Sudoku puzzle? | 4,066,075 | 0.099668 | python,data-structures,graph,sudoku | Others have reasonably suggested simply using a 2D array.
I note that a 2D array in most language implementations (anything implemented as "array of array of X") suffers from additional access-time overhead (one access to the top-level array, a second to the subarray).
I suggest you implement the data s... | What would be a smart data structure to use to represent a Sudoku puzzle? I.e. a 9X9 square where each "cell" contains either a number or a blank.
Special considerations include:
Ability to compare across row, column, and in 3X3 "group
Ease of implementation (specifically in Python)
Efficiency (not paramount)
I suppo... | 0 | 1 | 7,767 |
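The truncated suggestion points at storing the board as a single flat array instead of nested arrays; a sketch of the index math that makes row, column, and 3×3-group comparison easy (the helper names row/col/group are mine):

```python
def row(grid, r):
    """Cells of row r from a flat 81-element list (None = blank)."""
    return grid[9 * r : 9 * r + 9]

def col(grid, c):
    """Cells of column c: every 9th element, starting at c."""
    return grid[c::9]

def group(grid, g):
    """Cells of the g-th 3x3 box, boxes numbered 0..8 left-to-right."""
    top = (g // 3) * 27 + (g % 3) * 3   # flat index of the box's top-left cell
    return [grid[top + 9 * i + j] for i in range(3) for j in range(3)]

grid = list(range(81))  # dummy contents, just to exercise the index math
```

A flat list needs only one indexing operation per access, which is the point the answer makes about avoiding array-of-array overhead.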
0 | 4,072,921 | 0 | 0 | 0 | 0 | 1 | false | 7 | 2010-11-01T20:37:00.000 | 0 | 3 | 0 | Add more sample points to data | 4,072,844 | 0 | python,numpy,scipy | If your application is not sensitive to precision or you just want a quick overview, you could just fill the unknown data points with averages from neighbouring known data points (in other words, do naive linear interpolation). | Given some data of shape 20x45, where each row is a separate data set, say 20 different sine curves with 45 data points each, how would I go about getting the same data, but with shape 20x100?
In other words, I have some data A of shape 20x45, and some data B of length 20x100, and I would like to have A be of shape 20x... | 0 | 1 | 5,913 |
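The naive linear interpolation the answer suggests can be sketched without NumPy (with NumPy, numpy.interp does the per-row work directly); the resample function below is my illustration, not code from the answer:

```python
def resample(row, new_len):
    """Linearly interpolate a list of samples onto new_len points
    spanning the same interval (new_len must be >= 2)."""
    old_len = len(row)
    out = []
    for i in range(new_len):
        # Position of the new sample in the old index space.
        x = i * (old_len - 1) / (new_len - 1)
        lo = int(x)
        hi = min(lo + 1, old_len - 1)
        frac = x - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

data = [[float(v) for v in range(45)] for _ in range(20)]  # 20x45 toy input
wide = [resample(r, 100) for r in data]                    # now 20x100
```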
0 | 4,098,941 | 0 | 0 | 1 | 0 | 4 | false | 2 | 2010-11-04T15:51:00.000 | 1 | 6 | 0 | Collecting, storing, and retrieving large amounts of numeric data | 4,098,509 | 0.033321 | java,c++,python,storage,simulation | Using D-Bus format to send the information may be to your advantage. The format is standard, binary, and D-Bus is implemented in multiple languages, and can be used to send both over the network and inter-process on the same machine. | I am about to start collecting large amounts of numeric data in real-time (for those interested, the bid/ask/last or 'tape' for various stocks and futures). The data will later be retrieved for analysis and simulation. That's not hard at all, but I would like to do it efficiently and that brings up a lot of questions... | 0 | 1 | 2,032 |
0 | 4,098,550 | 0 | 0 | 1 | 0 | 4 | false | 2 | 2010-11-04T15:51:00.000 | 0 | 6 | 0 | Collecting, storing, and retrieving large amounts of numeric data | 4,098,509 | 0 | java,c++,python,storage,simulation | If you are just storing, then use system tools. Don't write your own. If you need to do some real-time processing of the data before it is stored, then that's something completely different. | I am about to start collecting large amounts of numeric data in real-time (for those interested, the bid/ask/last or 'tape' for various stocks and futures). The data will later be retrieved for analysis and simulation. That's not hard at all, but I would like to do it efficiently and that brings up a lot of questions... | 0 | 1 | 2,032 |
0 | 4,098,613 | 0 | 0 | 1 | 0 | 4 | false | 2 | 2010-11-04T15:51:00.000 | 1 | 6 | 0 | Collecting, storing, and retrieving large amounts of numeric data | 4,098,509 | 0.033321 | java,c++,python,storage,simulation | Actually, this is quite similar to what I'm doing, which is monitoring changes players make to the world in a game. I'm currently using an sqlite database with python.
At the start of the program, I load the disk database into memory, for fast writing procedures. Each change is put into two lists. These lists are for ... | I am about to start collecting large amounts of numeric data in real-time (for those interested, the bid/ask/last or 'tape' for various stocks and futures). The data will later be retrieved for analysis and simulation. That's not hard at all, but I would like to do it efficiently and that brings up a lot of questions... | 0 | 1 | 2,032 |
0 | 4,098,582 | 0 | 0 | 1 | 0 | 4 | false | 2 | 2010-11-04T15:51:00.000 | 3 | 6 | 0 | Collecting, storing, and retrieving large amounts of numeric data | 4,098,509 | 0.099668 | java,c++,python,storage,simulation | Optimizing for disk space and IO speed is the same thing - these days, CPUs are so fast compared to IO that it's often overall faster to compress data before storing it (you may actually want to do that). I don't really see memory playing a big role (though you should probably use a reasonably-sized buffer to ensure yo... | I am about to start collecting large amounts of numeric data in real-time (for those interested, the bid/ask/last or 'tape' for various stocks and futures). The data will later be retrieved for analysis and simulation. That's not hard at all, but I would like to do it efficiently and that brings up a lot of questions... | 0 | 1 | 2,032 |
0 | 4,101,917 | 0 | 0 | 0 | 0 | 1 | false | 1 | 2010-11-04T22:01:00.000 | 0 | 2 | 1 | Hadoop/Elastic Map Reduce with binary executable? | 4,101,815 | 0 | python,matlab,amazon-web-services,hadoop,mapreduce | The following is not exactly an answer to your Hadoop question, but I couldn't resist not asking why you don't execute your processing jobs on the Grid resources? There are proven solutions for executing compute intensive workflows on the Grid. And as far as I know matlab runtime environment is usually available on the... | I am writing and distributed image processing application using hadoop streaming, python, matlab, and elastic map reduce. I have compiled a binary executable of my matlab code using the matlab compiler. I am wondering how I can incorporate this into my workflow so the binary is part of the processing on Amazon's elasti... | 0 | 1 | 1,138 |
0 | 4,122,980 | 0 | 0 | 0 | 1 | 2 | true | 15 | 2010-11-08T10:00:00.000 | 34 | 2 | 0 | Csv blank rows problem with Excel | 4,122,794 | 1.2 | python,excel,csv | You're using open('file.csv', 'w')--try open('file.csv', 'wb').
The Python csv module requires output files be opened in binary mode. | I have a csv file which contains rows from a sqlite3 database. I wrote the rows to the csv file using python.
When I open the csv file with Ms Excel, a blank row appears below every row, but the file on notepad is fine(without any blanks).
Does anyone know why this is happenning and how I can fix it?
Edit: I used the s... | 0 | 1 | 7,209 |
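The 'wb' fix applies to the Python 2 csv module; on Python 3 the equivalent fix is opening the file in text mode with newline="" (otherwise the writer's "\r\n" row terminator gets an extra "\r" added on Windows, which Excel renders as a blank row between records):

```python
import csv
import os
import tempfile

rows = [["id", "name"], ["1", "alice"], ["2", "bob"]]

path = os.path.join(tempfile.mkdtemp(), "out.csv")
with open(path, "w", newline="") as f:   # newline="" stops extra \r insertion
    csv.writer(f).writerows(rows)

with open(path, newline="") as f:        # same flag when reading back
    back = list(csv.reader(f))
```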
0 | 4,122,816 | 0 | 0 | 0 | 1 | 2 | false | 15 | 2010-11-08T10:00:00.000 | 0 | 2 | 0 | Csv blank rows problem with Excel | 4,122,794 | 0 | python,excel,csv | the first that comes into my mind (just an idea) is that you might have used "\r\n" as row delimiter (which is shown as one linebrak in notepad) but excel expects to get only "\n" or only "\r" and so it interprets this as two line-breaks. | I have a csv file which contains rows from a sqlite3 database. I wrote the rows to the csv file using python.
When I open the csv file with Ms Excel, a blank row appears below every row, but the file on notepad is fine(without any blanks).
Does anyone know why this is happenning and how I can fix it?
Edit: I used the s... | 0 | 1 | 7,209 |
0 | 53,662,588 | 0 | 0 | 0 | 0 | 3 | false | 62 | 2010-11-09T03:49:00.000 | 0 | 11 | 0 | python matplotlib framework under macosx? | 4,130,355 | 0 | python,macos,matplotlib,fink | Simply aliasing a new command to launch python in ~/.bash_profile will do the trick.
alias vpython3=/Library/Frameworks/Python.framework/Versions/3.6(replace with your own python version)/bin/python3
then 'source ~/.bash_profile' and use vpython3 to launch python3.
Explanation: Python is actually by default installed a... | I am getting this error:
/sw/lib/python2.7/site-packages/matplotlib/backends/backend_macosx.py:235:
UserWarning: Python is not installed as a framework. The MacOSX
backend may not work correctly if Python is not installed as a
framework. Please see the Python documentation for more information on
installing Py... | 0 | 1 | 25,060 |
0 | 4,131,726 | 0 | 0 | 0 | 0 | 3 | true | 62 | 2010-11-09T03:49:00.000 | 18 | 11 | 0 | python matplotlib framework under macosx? | 4,130,355 | 1.2 | python,macos,matplotlib,fink | There are two ways Python can be built and installed on Mac OS X. One is as a traditional flat Unix-y shared library. The other is known as a framework install, a file layout similar to other frameworks on OS X where all of the component directories (include, lib, bin) for the product are installed as subdirectories ... | I am getting this error:
/sw/lib/python2.7/site-packages/matplotlib/backends/backend_macosx.py:235:
UserWarning: Python is not installed as a framework. The MacOSX
backend may not work correctly if Python is not installed as a
framework. Please see the Python documentation for more information on
installing Py... | 0 | 1 | 25,060 |
0 | 33,873,802 | 0 | 0 | 0 | 0 | 3 | false | 62 | 2010-11-09T03:49:00.000 | 31 | 11 | 0 | python matplotlib framework under macosx? | 4,130,355 | 1 | python,macos,matplotlib,fink | Optionally you could use the Agg backend which requires no extra installation of anything. Just put backend : Agg into ~/.matplotlib/matplotlibrc | I am getting this error:
/sw/lib/python2.7/site-packages/matplotlib/backends/backend_macosx.py:235:
UserWarning: Python is not installed as a framework. The MacOSX
backend may not work correctly if Python is not installed as a
framework. Please see the Python documentation for more information on
installing Py... | 0 | 1 | 25,060 |
0 | 30,009,455 | 0 | 0 | 0 | 0 | 1 | false | 2 | 2010-11-09T11:51:00.000 | 0 | 3 | 0 | HDF5 : storing NumPy data | 4,133,327 | 0 | python,c,numpy,hdf5,pytables | HDF5 takes care of binary compatibility of structures for you. You simply have to tell it what your structs consist of (dtype) and you'll have no problems saving/reading record arrays - this is because the type system is basically 1:1 between numpy and HDF5. If you use H5py I'm confident to say the IO should be fast ... | when I used NumPy I stored it's data in the native format *.npy. It's very fast and gave me some benefits, like this one
I could read *.npy from C code as
simple binary data(I mean *.npy are
binary-compatibly with C structures)
Now I'm dealing with HDF5 (PyTables at this moment). As I read in the tutorial, they are u... | 0 | 1 | 6,260 |
0 | 4,156,445 | 0 | 1 | 0 | 0 | 2 | false | 0 | 2010-11-11T00:32:00.000 | 2 | 3 | 0 | Extract different POS words for a given word in python nltk | 4,150,443 | 0.132549 | python,nltk | There are two options i can think of off the top of my head:
Option one is to iterate over the sample POS-tagged corpora and simply build this mapping yourself. This gives you the POS tags that are associated with a particular word in the corpora.
Option two is to build a hidden markov model POS tagger on the corpora, ... | Is there any package in python nltk that can produce all different parts of speech words for a given word. For example if i give add(verb) then it must produce addition(noun),additive(adj) and so on. Can anyone let me know? | 0 | 1 | 430 |
0 | 4,155,066 | 0 | 1 | 0 | 0 | 2 | false | 0 | 2010-11-11T00:32:00.000 | 0 | 3 | 0 | Extract different POS words for a given word in python nltk | 4,150,443 | 0 | python,nltk | NLTK has a lot of clever things hiding away, so there might be a direct way of doing it. However, I think you may have to write your own code to work with the WordNet database. | Is there any package in python nltk that can produce all different parts of speech words for a given word. For example if i give add(verb) then it must produce addition(noun),additive(adj) and so on. Can anyone let me know? | 0 | 1 | 430 |
0 | 4,151,409 | 0 | 0 | 0 | 0 | 2 | false | 6 | 2010-11-11T02:26:00.000 | 0 | 3 | 0 | Where do two 2-D arrays begin to overlap each other? | 4,150,909 | 0 | python,multidimensional-array,numpy,subdomain,overlap | Can you say more? What model are you using? What are you modelling? How is it computed?
Can you make the dimensions match to avoid the fit? (i.e. if B doesn't depend on all of A, only plug in the part of A that B models, or compute boring values for the parts of B that wouldn't overlap A and drop those values later) | I'm working with model output at the moment, and I can't seem to come up with a nice way of combining two arrays of data. Arrays A and B store different data, and the entries in each correspond to some spatial (x,y) point -- A holds some parameter, and B holds model output. The problem is that B is a spatial subsecti... | 0 | 1 | 1,367 |
0 | 4,191,918 | 0 | 0 | 0 | 0 | 2 | false | 6 | 2010-11-11T02:26:00.000 | 0 | 3 | 0 | Where do two 2-D arrays begin to overlap each other? | 4,150,909 | 0 | python,multidimensional-array,numpy,subdomain,overlap | I need to find the indexes at which they start to overlap
So are you looking for indexes from A or from B? And is B strictly rectangular?
Finding the bounding box or convex hull of B is really cheap. | I'm working with model output at the moment, and I can't seem to come up with a nice way of combining two arrays of data. Arrays A and B store different data, and the entries in each correspond to some spatial (x,y) point -- A holds some parameter, and B holds model output. The problem is that B is a spatial subsecti... | 0 | 1 | 1,367 |
0 | 4,156,169 | 0 | 1 | 0 | 0 | 2 | false | 1 | 2010-11-11T15:19:00.000 | 1 | 4 | 0 | moving on from python | 4,155,955 | 0.049958 | python,programming-languages | If you just want to learn a new language you could take a look at scala. The language is influenced by languages like ruby, python and erlang, but is staticaly typed and runs on the JVM. The speed is comparable to Java. And you can use all the java libraries, plus reuse a lot of your python code through jython. | I use python heavily for manipulating data and then packaging it for statistical modeling (R through RPy2).
Feeling a little restless, I would like to branch out into other languages where
Faster than python
It's free
There's good books, documentations and tutorials
Very suitable for data manipulation
Lots of librar... | 0 | 1 | 210 |
0 | 4,157,423 | 0 | 1 | 0 | 0 | 2 | false | 1 | 2010-11-11T15:19:00.000 | 1 | 4 | 0 | moving on from python | 4,155,955 | 0.049958 | python,programming-languages | I didn't see you mention SciPy on your list... I tend to like R syntax better, but they cover much of the same ground. SciPy has faster matrix and array structures than the general purpose Python ones. In most places where I have wanted to use Cython, SciPy has been just as easy / fast.
GNU/Octave is an open/free versi... | I use python heavily for manipulating data and then packaging it for statistical modeling (R through RPy2).
Feeling a little restless, I would like to branch out into other languages where
Faster than python
It's free
There's good books, documentations and tutorials
Very suitable for data manipulation
Lots of librar... | 0 | 1 | 210 |
0 | 4,158,455 | 0 | 0 | 0 | 0 | 1 | true | 51 | 2010-11-11T19:19:00.000 | 72 | 3 | 0 | more than 9 subplots in matplotlib | 4,158,367 | 1.2 | python,charts,matplotlib | It was easier than I expected, I just did: pylab.subplot(4,4,10) and it worked. | Is it possible to get more than 9 subplots in matplotlib?
I am on the subplots command pylab.subplot(449); how can I get a 4410 to work?
Thank you very much. | 0 | 1 | 28,836 |
0 | 4,215,056 | 0 | 0 | 0 | 0 | 2 | true | 25 | 2010-11-18T12:44:00.000 | 13 | 8 | 0 | An example using python bindings for SVM library, LIBSVM | 4,214,868 | 1.2 | python,machine-learning,svm,libsvm | LIBSVM reads the data from a tuple containing two lists. The first list contains the classes and the second list contains the input data. create simple dataset with two possible classes
you also need to specify which kernel you want to use by creating svm_parameter.
>> from libsvm import *
>> prob = svm_problem([1,-1... | I am in dire need of a classification task example using LibSVM in python. I don't know how the Input should look like and which function is responsible for training and which one for testing
Thanks | 0 | 1 | 50,030 |
0 | 8,302,624 | 0 | 0 | 0 | 0 | 2 | false | 25 | 2010-11-18T12:44:00.000 | 3 | 8 | 0 | An example using python bindings for SVM library, LIBSVM | 4,214,868 | 0.07486 | python,machine-learning,svm,libsvm | Adding to @shinNoNoir :
param.kernel_type represents the type of kernel function you want to use,
0: Linear
1: polynomial
2: RBF
3: Sigmoid
Also have in mind that, svm_problem(y,x) : here y is the class labels and x is the class instances and x and y can only be lists,tuples and dictionaries.(no numpy array) | I am in dire need of a classification task example using LibSVM in python. I don't know how the Input should look like and which function is responsible for training and which one for testing
Thanks | 0 | 1 | 50,030 |
0 | 4,263,022 | 0 | 0 | 1 | 0 | 2 | false | 12 | 2010-11-24T02:31:00.000 | 0 | 5 | 0 | Data analysis using R/python and SSDs | 4,262,984 | 0 | python,r,data-analysis,solid-state-drive | The read and write speeds for SSDs are significantly higher than those of standard 7200 RPM disks (it's still worth it with a 10k RPM disk, not sure how much of an improvement it is over a 15k). So, yes, you'd get much faster times on data access.
The performance improvement is undeniable. Then, it's a question of economics. 2TB... | Does anyone have any experience using r/python with data stored in Solid State Drives. If you are doing mostly reads, in theory this should significantly improve the load times of large datasets. I want to find out if this is true and if it is worth investing in SSDs for improving the IO rates in data intensive applica... | 0 | 1 | 4,485 |
0 | 4,264,161 | 0 | 0 | 1 | 0 | 2 | false | 12 | 2010-11-24T02:31:00.000 | 2 | 5 | 0 | Data analysis using R/python and SSDs | 4,262,984 | 0.07983 | python,r,data-analysis,solid-state-drive | I have to second John's suggestion to profile your application. My experience is that it isn't the actual data reads that are the slow part, it's the overhead of creating the programming objects to contain the data, casting from strings, memory allocation, etc.
I would strongly suggest you profile your code first, and... | Does anyone have any experience using r/python with data stored in Solid State Drives. If you are doing mostly reads, in theory this should significantly improve the load times of large datasets. I want to find out if this is true and if it is worth investing in SSDs for improving the IO rates in data intensive applica... | 0 | 1 | 4,485 |
0 | 5,586,430 | 0 | 0 | 0 | 0 | 1 | false | 8 | 2010-11-25T02:03:00.000 | 2 | 2 | 0 | Using SlopeOne algorithm to predict if a gamer can complete a level in a Game? | 4,273,169 | 0.197375 | python,algorithm,filtering,prediction,collaborative | I think it might work, but I would apply log to the number of tries (you can't do log(0) so retries won't work) first. If someone found a level easy they would try it once or twice, whereas people who found it hard would generally have to do it over and over again. The difference between did it in 1 go vs 2 goes is muc... | I am planning to use SlopeOne algorithm to predict if a gamer can complete a given level in a Game or not?
Here is the scenario:
Lots of Gamers play and try to complete 100 levels in the game.
Each gamer can play a level as many times as they want until they cross the level.
The system keeps track of the level and the... | 0 | 1 | 431 |
0 | 4,273,543 | 0 | 0 | 0 | 0 | 2 | false | 50 | 2010-11-25T03:18:00.000 | 4 | 5 | 0 | Reversible hash function? | 4,273,466 | 0.158649 | python,hash | Why not just XOR with a nice long number?
Easy. Fast. Reversible.
Or, if this doesn't need to be terribly secure, you could convert from base 10 to some smaller base (like base 8 or base 4, depending on how long you want the numbers to be). | I need a reversible hash function (obviously the input will be much smaller in size than the output) that maps the input to the output in a random-looking way. Basically, I want a way to transform a number like "123" to a larger number like "9874362483910978", but not in a way that will preserve comparisons, so it must... | 0 | 1 | 47,252 |
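The XOR suggestion is reversible because x ^ k ^ k == x; as the answer notes, this is only suitable when security doesn't matter (the key below is an arbitrary constant made up for this sketch):

```python
KEY = 0x5DEECE66D15C0E6F  # arbitrary fixed key, invented for this example

def scramble(n):
    """Map n to a random-looking number."""
    return n ^ KEY

def unscramble(n):
    """XOR is its own inverse, so the same operation reverses it."""
    return n ^ KEY
```

This preserves neither ordering nor magnitude for numbers smaller than the key, which is exactly the "random-looking" behaviour the question asks for.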
0 | 4,274,259 | 0 | 0 | 0 | 0 | 2 | false | 50 | 2010-11-25T03:18:00.000 | 19 | 5 | 0 | Reversible hash function? | 4,273,466 | 1 | python,hash | What you are asking for is encryption. A block cipher in its basic mode of operation, ECB, reversibly maps a input block onto an output block of the same size. The input and output blocks can be interpreted as numbers.
For example, AES is a 128 bit block cipher, so it maps an input 128 bit number onto an output 128 b... | I need a reversible hash function (obviously the input will be much smaller in size than the output) that maps the input to the output in a random-looking way. Basically, I want a way to transform a number like "123" to a larger number like "9874362483910978", but not in a way that will preserve comparisons, so it must... | 0 | 1 | 47,252 |
0 | 4,323,638 | 0 | 0 | 0 | 0 | 1 | true | 41 | 2010-12-01T08:45:00.000 | 29 | 2 | 0 | How to get started with Big Data Analysis | 4,322,559 | 1.2 | python,r,hadoop,bigdata | Using the Python Disco project for example.
Good. Play with that.
Using the RHIPE package and finding toy datasets and problem areas.
Fine. Play with that, too.
Don't sweat finding "big" datasets. Even small datasets present very interesting problems. Indeed, any dataset is a starting-off point.
I once built a s... | I've been a long time user of R and have recently started working with Python. Using conventional RDBMS systems for data warehousing, and R/Python for number-crunching, I feel the need now to get my hands dirty with Big Data Analysis.
I'd like to know how to get started with Big Data crunching.
- How to start simple wi... | 0 | 1 | 18,227 |
0 | 19,205,464 | 0 | 0 | 0 | 0 | 2 | false | 7 | 2010-12-01T20:33:00.000 | 1 | 4 | 0 | Generating a graph with certain degree distribution? | 4,328,837 | 0.049958 | python,algorithm,r,graph,networkx | I know this is very late, but you can do the same thing, albeit a little more straightforward, with mathematica.
RandomGraph[DegreeGraphDistribution[{3, 3, 3, 3, 3, 3, 3, 3}], 4]
This will generate 4 random graphs, with each node having a prescribed degree. | I am trying to generate a random graph that has small-world properties (exhibits a power law distribution). I just started using the networkx package and discovered that it offers a variety of random graph generation. Can someone tell me if it possible to generate a graph where a given node's degree follows a gamma dis... | 0 | 1 | 9,242 |
0 | 4,329,072 | 0 | 0 | 0 | 0 | 2 | false | 7 | 2010-12-01T20:33:00.000 | 2 | 4 | 0 | Generating a graph with certain degree distribution? | 4,328,837 | 0.099668 | python,algorithm,r,graph,networkx | I did this a while ago in base Python... IIRC, I used the following method. From memory, so this may not be entirely accurate, but hopefully it's worth something:
Chose the number of nodes, N, in your graph, and the density (existing edges over possible edges), D. This implies the number of edges, E.
For each node, as... | I am trying to generate a random graph that has small-world properties (exhibits a power law distribution). I just started using the networkx package and discovered that it offers a variety of random graph generation. Can someone tell me if it possible to generate a graph where a given node's degree follows a gamma dis... | 0 | 1 | 9,242 |
0 | 4,345,485 | 0 | 0 | 0 | 0 | 1 | true | 4 | 2010-12-03T12:04:00.000 | 3 | 1 | 0 | Python Imaging, how to quantize an image to 16bit depth? | 4,345,337 | 1.2 | python,python-imaging-library,imaging | You might want to look into converting your image to a numpy array, performing your quantisation, then converting back to PIL.
There are modules in numpy to convert to/from PIL images. | I would like to quantize a 24bit image to 16bit color depth using Python Imaging.
PIL used to provide a method im.quantize(colors, **options) however this has been deprecated for out = im.convert("P", palette=Image.ADAPTIVE, colors=256)
Unfortunately 256 is the MAXIMUM number of colors that im.convert() will quantize t... | 0 | 1 | 4,447 |
0 | 4,348,902 | 0 | 0 | 0 | 0 | 1 | false | 144 | 2010-12-03T18:41:00.000 | 2 | 7 | 0 | Saving interactive Matplotlib figures | 4,348,733 | 0.057081 | python,matplotlib | Good question. Here is the doc text from pylab.save:
pylab no longer provides a save function, though the old pylab function is still available as matplotlib.mlab.save (you can still refer to it in pylab as "mlab.save"). However, for plain text files, we recommend numpy.savetxt. For saving numpy ar... | Is there a way to save a Matplotlib figure such that it can be re-opened and have typical interaction restored? (Like the .fig format in MATLAB?)
I find myself running the same scripts many times to generate these interactive figures. Or I'm sending my colleagues multiple static PNG files to show different aspects of a... | 0 | 1 | 98,342 |
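One approach that comes close to MATLAB's .fig: pickle the Figure object itself, and unpickle it later to get a live figure back rather than a flat PNG. A minimal sketch (the Agg backend is forced here only so the example runs headless; note that pickled figures are generally only portable across the same matplotlib version that created them):

```python
import pickle
import matplotlib
matplotlib.use("Agg")  # headless backend, just so this sketch runs anywhere
import matplotlib.pyplot as plt

# Build a figure and serialise the live Figure object, not a rendered image
fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4], label="data")
ax.legend()

blob = pickle.dumps(fig)       # in practice you would write this to a file

restored = pickle.loads(blob)  # later: a full Figure with axes, lines, legend
print(len(restored.axes))      # 1
```

On an interactive backend, showing the restored figure brings back the usual pan/zoom tools, which is the interactivity the question asks about.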
0 | 4,368,488 | 0 | 1 | 0 | 0 | 1 | false | 4 | 2010-12-06T16:11:00.000 | 1 | 7 | 0 | Symmetric dictionary where d[a][b] == d[b][a] | 4,368,423 | 0.028564 | python,inheritance,dictionary | An obvious alternative is to use a (v1,v2) tuple as the key into a single standard dict, and insert both (v1,v2) and (v2,v1) into the dictionary, making them refer to the same object on the right-hand side. | I have an algorithm in python which creates measures for pairs of values, where m(v1, v2) == m(v2, v1) (i.e. it is symmetric). I had the idea to write a dictionary of dictionaries where these values are stored in a memory-efficient way, so that they can easily be retrieved with keys in any order. I like to inherit from... | 0 | 1 | 1,589 |
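A variant of the tuple-key idea that stores each pair only once: normalise the key to a frozenset, so d["a", "b"] and d["b", "a"] hit the same entry. A minimal sketch (the class name is illustrative, and only item access is overridden; a real implementation would also override get(), __contains__(), etc., or subclass collections.abc.MutableMapping):

```python
class SymmetricDict(dict):
    """Dict whose two-element keys are order-insensitive: d[a, b] == d[b, a]."""

    @staticmethod
    def _key(pair):
        # frozenset is hashable and ignores order; for a == b it collapses
        # to a one-element set, which is still a valid, consistent key
        a, b = pair
        return frozenset((a, b))

    def __setitem__(self, pair, value):
        super().__setitem__(self._key(pair), value)

    def __getitem__(self, pair):
        return super().__getitem__(self._key(pair))

m = SymmetricDict()
m["v1", "v2"] = 0.5
print(m["v2", "v1"])  # 0.5 -- same entry, retrieved with keys reversed
```

Compared with inserting both (v1, v2) and (v2, v1), this halves the number of stored keys, which matters for the memory-efficiency goal in the question.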
0 | 4,854,162 | 0 | 0 | 0 | 0 | 2 | true | 15 | 2010-12-08T17:01:00.000 | 4 | 3 | 0 | Does PyPy work with NLTK? | 4,390,129 | 1.2 | python,nltk,pypy | I got a response via email (Seo, please feel free to respond here) that said:
The main issues are:
PyPy implements Python 2.5. This means adding "from __future__ import with_statement" here and there, rewriting usages of property.setter, and fixing up new-in-2.6 library calls like os.walk.
NLTK needs PyYAML. Simply symlink... | Does PyPy work with NLTK, and if so, is there an appreciable performance improvement, say for the Bayesian classifier?
While we're at it, do any of the other Python environments (Shed Skin, etc.) offer better NLTK performance than CPython? | 0 | 1 | 2,318 |
0 | 4,549,093 | 0 | 0 | 0 | 0 | 2 | false | 15 | 2010-12-08T17:01:00.000 | 5 | 3 | 0 | Does PyPy work with NLTK? | 4,390,129 | 0.321513 | python,nltk,pypy | At least some of NLTK does work with PyPy and there is some performance gain, according to someone on #pypy on freenode. Have you run any tests? Just download PyPy from pypy.org/download.html and instead of "time python yourscript.py data.txt" type "time pypy yourscript.py data.txt". | Does PyPy work with NLTK, and if so, is there an appreciable performance improvement, say for the bayesian classifier?
While we're at it, do any of the other Python environments (Shed Skin, etc.) offer better NLTK performance than CPython? | 0 | 1 | 2,318 |
0 | 4,450,277 | 0 | 1 | 0 | 0 | 1 | false | 14 | 2010-12-15T13:02:00.000 | 8 | 7 | 0 | easy save/load of data in python | 4,450,144 | 1 | python,io | If it should be human-readable, I'd
also go with JSON. Unless you need to exchange it with enterprise-type people; they like XML better. :-)
If it should be human-editable and isn't too complex, I'd probably go with some sort of INI-like format, like for example configparser.
If it is complex, and doesn't need to be ex... | What is the easiest way to save and load data in Python, preferably in a human-readable output format?
The data I am saving/loading consists of two vectors of floats. Ideally, these vectors would be named in the file (e.g. X and Y).
My current save() and load() functions use file.readline(), file.write() and string-to-... | 0 | 1 | 62,035 |
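Following the JSON suggestion, two named float vectors can be saved and loaded in a few stdlib lines, replacing the hand-rolled readline()/write() parsing. A sketch (the file name and the X/Y key names are illustrative, taken from the question's example):

```python
import json

def save(filename, x, y):
    # Store both vectors under named keys; indent=2 keeps it human-readable
    with open(filename, "w") as f:
        json.dump({"X": x, "Y": y}, f, indent=2)

def load(filename):
    with open(filename) as f:
        data = json.load(f)
    return data["X"], data["Y"]

save("vectors.json", [1.0, 2.5], [3.0, 4.5])
x, y = load("vectors.json")
print(x, y)  # [1.0, 2.5] [3.0, 4.5]
```

JSON round-trips Python lists of floats losslessly for typical values, and the file can be opened and edited in any text editor.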
0 | 4,460,959 | 0 | 1 | 0 | 0 | 1 | false | 42 | 2010-12-16T12:49:00.000 | 0 | 10 | 0 | Extract the first paragraph from a Wikipedia article (Python) | 4,460,921 | 0 | python,wikipedia | Try a combination of urllib to fetch the site and BeautifulSoup or lxml to parse the data. | How can I extract the first paragraph from a Wikipedia article, using Python?
For example, for Albert Einstein, that would be:
Albert Einstein (pronounced /ˈælbərt ˈaɪnstaɪn/; German: [ˈalbɐt ˈaɪnʃtaɪn]; 14 March 1879 – 18 April 1955) was a theoretical physicist, philosopher and author who is widely ... | 0 | 1 | 49,932 |
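As a rough illustration of the parse step only, using just the stdlib: extract the text of the first &lt;p&gt; element from an HTML string. (A hand-rolled parser like this is far less robust than BeautifulSoup or lxml on real Wikipedia markup, and the HTML string below is a toy stand-in for a page fetched with urllib.)

```python
from html.parser import HTMLParser

class FirstParagraph(HTMLParser):
    """Collect the text content of the first <p> element encountered."""

    def __init__(self):
        super().__init__()
        self.in_p = False    # currently inside the first <p>?
        self.done = False    # already finished the first <p>?
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "p" and not self.done:
            self.in_p = True

    def handle_endtag(self, tag):
        if tag == "p" and self.in_p:
            self.in_p = False
            self.done = True

    def handle_data(self, data):
        if self.in_p:
            self.text.append(data)

html = ("<html><body><h1>Albert Einstein</h1>"
        "<p>Albert Einstein was a theoretical physicist.</p>"
        "<p>More.</p></body></html>")
parser = FirstParagraph()
parser.feed(html)
print("".join(parser.text))  # Albert Einstein was a theoretical physicist.
```

In practice the page body would come from urllib (or, better, from the Wikipedia API, which can return extracts directly).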
0 | 6,335,027 | 0 | 0 | 0 | 0 | 1 | false | 2 | 2010-12-20T10:32:00.000 | 0 | 2 | 0 | Python usage of breadth-first search on social graph | 4,488,783 | 0 | python,algorithm,social-networking,traversal,breadth-first-search | I have around 300 friends on Facebook, and each of my friends also has around 300 friends on average. If you build a graph out of that, it's going to be huge. Correct me if I am wrong, but won't a BFS be quite demanding in this scenario?
Thanks,
J | I've been reading a lot of Stack Overflow questions about how to use breadth-first search, DFS, A*, etc. The question is what the optimal usage is, and how to implement it on real versus simulated graphs. E.g.
Suppose you have the social graph of Twitter/Facebook/some social networking site; to me it seems a search ... | 0 | 1 | 1,271 |
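For scale intuition: BFS itself is only O(V + E), so 300 friends each with ~300 friends (~90,000 nodes at two hops) is small in memory; on a live social graph the real cost is usually the API call needed to fetch each adjacency list. A standard adjacency-list BFS sketch (the graph data is made up for illustration):

```python
from collections import deque

def bfs_distances(graph, start):
    """Breadth-first search over an adjacency-list graph.

    Returns a dict mapping each reachable node to its hop count from start.
    Each node and edge is visited at most once: O(V + E).
    """
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

# Toy friendship graph: in a real crawl, graph[node] would be fetched
# lazily from the social network's API instead of stored up front
friends = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a"], "d": ["b"]}
print(bfs_distances(friends, "a"))  # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```

The `dist` dict doubles as the visited set, which is what keeps the traversal from revisiting mutual friends.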
0 | 4,516,073 | 0 | 0 | 0 | 0 | 3 | false | 7 | 2010-12-23T05:06:00.000 | -1 | 4 | 0 | CvSize does not exist? | 4,516,007 | -0.049958 | python,opencv,computer-vision | Perhaps the documentation is wrong and you have to use cv.cvSize instead of cv.CvSize ?
Also, do a dir(cv) to find out the methods available to you. | I have installed the official python bindings for OpenCv and I am implementing some standard textbook functions just to get used to the python syntax. I have run into the problem, however, that CvSize does not actually exist, even though it is documented on the site...
The simple function: blah = cv.CvSize(inp.width... | 0 | 1 | 7,873 |
0 | 6,534,684 | 0 | 0 | 0 | 0 | 3 | false | 7 | 2010-12-23T05:06:00.000 | 8 | 4 | 0 | CvSize does not exist? | 4,516,007 | 1 | python,opencv,computer-vision | It seems that they opted to eventually avoid this structure altogether. Instead, it just uses a python tuple (width, height). | I have installed the official python bindings for OpenCv and I am implementing some standard textbook functions just to get used to the python syntax. I have run into the problem, however, that CvSize does not actually exist, even though it is documented on the site...
The simple function: blah = cv.CvSize(inp.width... | 0 | 1 | 7,873 |
0 | 5,974,122 | 0 | 0 | 0 | 0 | 3 | false | 7 | 2010-12-23T05:06:00.000 | 0 | 4 | 0 | CvSize does not exist? | 4,516,007 | 0 | python,opencv,computer-vision | The right call is cv.cvSize(inp.width/2, inp.height/2).
All functions in the python opencv bindings start with a lowercased c even in the highgui module. | I have installed the official python bindings for OpenCv and I am implementing some standard textbook functions just to get used to the python syntax. I have run into the problem, however, that CvSize does not actually exist, even though it is documented on the site...
The simple function: blah = cv.CvSize(inp.width... | 0 | 1 | 7,873 |
0 | 4,523,953 | 0 | 0 | 0 | 0 | 1 | true | 1 | 2010-12-23T23:38:00.000 | 1 | 1 | 0 | NumPy Under Xen Client System | 4,523,267 | 1.2 | python,ubuntu,numpy,virtualization,xen | Yes. The optimizations run in userland and so shouldn't cause any PV traps. | I am working on a project built on NumPy, and I would like to take advantage of some of NumPy's optional architecture-specific optimizations. If I install NumPy on a paravirtualized Xen client OS (Ubuntu, in this case - a Linode), can I take advantage of those optimizations? | 0 | 1 | 99 |
0 | 4,535,370 | 0 | 0 | 0 | 0 | 1 | true | 4 | 2010-12-26T20:47:00.000 | 5 | 1 | 0 | Colon difference in Matlab and Python | 4,535,359 | 1.2 | python,arrays,matlab,syntax | someArray[:,0,0] is the Python NumPy equivalent of MATLAB's someArray(:,1,1). I've never figured out how to do it in pure Python; the colon slice operation is a total mystery to me with lists-of-lists. | What is the Python equivalent of someArray(:,1,1) from MATLAB?
In Python, someArray[:][0][0] produces a different value | 0 | 1 | 793 |
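To make the mapping concrete: MATLAB is 1-based and its colon lives inside a single subscript, so someArray(:,1,1) corresponds to someArray[:, 0, 0] in NumPy, while chained [:][0][0] indexes step by step and lands somewhere else entirely (the array contents here are arbitrary):

```python
import numpy as np

a = np.arange(24).reshape(2, 3, 4)

# NumPy equivalent of MATLAB's someArray(:,1,1): one subscript, 0-based
print(a[:, 0, 0])    # [ 0 12]

# Chained indexing evaluates left to right: a[:] is just a view of the
# whole array, so a[:][0][0] == a[0][0], the first row of the first slice
print(a[:][0][0])    # [0 1 2 3]
```

This is why the question sees "a different value": `[:][0][0]` never expresses "first element of the last two axes, for every index along the first axis".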
0 | 11,489,099 | 0 | 0 | 0 | 0 | 1 | false | 3 | 2010-12-27T08:18:00.000 | 1 | 4 | 1 | Iterative MapReduce | 4,537,422 | 0.049958 | python,streaming,hadoop,mapreduce,iteration | You needn't write another job. You can put the same job in a loop ( a while loop) and just keep changing the parameters of the job, so that when the mapper and reducer complete their processing, the control starts with creating a new configuration, and then you just automatically have an input file that is the output o... | I've written a simple k-means clustering code for Hadoop (two separate programs - mapper and reducer). The code is working over a small dataset of 2d points on my local box. It's written in Python and I plan to use Streaming API.
I would like suggestions on how best to run this program on Hadoop.
After each run of mapp... | 0 | 1 | 3,152 |
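The "same job in a loop" idea can be sketched locally: treat point-to-centroid assignment as the map phase, centroid averaging as the reduce phase, and let a driver loop feed each iteration's output back in as the next iteration's input until the centroids stop moving. A 1-D toy version (all data made up; on Hadoop each loop body would be a Streaming job submission with a fresh output path):

```python
def kmeans_iteration(points, centroids):
    # "Map" phase: assign each point to its nearest centroid
    clusters = {}
    for p in points:
        nearest = min(centroids, key=lambda c: abs(p - c))
        clusters.setdefault(nearest, []).append(p)
    # "Reduce" phase: recompute each centroid as the mean of its cluster
    return sorted(sum(ps) / len(ps) for ps in clusters.values())

points = [1.0, 2.0, 10.0, 11.0]
centroids = [0.0, 5.0]

# Driver loop: re-run the same "job" until the centroids converge
while True:
    new_centroids = kmeans_iteration(points, centroids)
    if new_centroids == centroids:
        break
    centroids = new_centroids

print(centroids)  # [1.5, 10.5]
```

On a real cluster the driver would also need a convergence check that reads the previous job's output (and typically an exact-equality test is replaced by a small tolerance).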