Dataset schema (column: type, observed range):

- Question: string, length 25 to 7.47k
- Q_Score: int64, 0 to 1.24k
- Users Score: int64, -10 to 494
- Score: float64, -1 to 1.2
- Data Science and Machine Learning: int64, 0 or 1
- is_accepted: bool (2 classes)
- A_Id: int64, 39.3k to 72.5M
- Web Development: int64, 0 or 1
- ViewCount: int64, 15 to 1.37M
- Available Count: int64, 1 to 9
- System Administration and DevOps: int64, 0 or 1
- Networking and APIs: int64, 0 or 1
- Q_Id: int64, 39.1k to 48M
- Answer: string, length 16 to 5.07k
- Database and SQL: int64, always 1
- GUI and Desktop Applications: int64, 0 or 1
- Python Basics and Environment: int64, 0 or 1
- Title: string, length 15 to 148
- AnswerCount: int64, 1 to 32
- Tags: string, length 6 to 90
- Other: int64, 0 or 1
- CreationDate: string, length 23
Title: Python and database
Tags: python,sql,database | Created: 2010-05-20T03:24:00.000 | Q_Id: 2,870,815
Categories: Database and SQL
Question (Q_Score 1, 291 views, 3 answers, 3 available): I am working on a personal project where I need to manipulate values in a database-like format. Up until now I have been using dictionaries, tuples, and lists to store and consult those values. I am thinking about starting to use SQL to manipulate those values, but I don't know if it's worth the effort, because I don't...
Answer 2,871,090 (score 2, normalized 0.132549, not accepted): SQL is useful in many applications. But it is overkill in this case. You can easily store your data in CSV, pickle or JSON format. Get this job done in 5 minutes and then learn SQL when you have time.
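The answer above suggests flat-file formats before reaching for SQL. A minimal sketch of that approach with the standard library's json module (the file name and record layout are invented for illustration):

```python
import json
import os
import tempfile

# Hypothetical records kept in plain Python structures, as in the question.
records = [
    {"name": "widget", "price": 2.5},
    {"name": "gadget", "price": 7.0},
]

path = os.path.join(tempfile.gettempdir(), "records.json")

# Persist the structures to disk...
with open(path, "w") as f:
    json.dump(records, f)

# ...and load them back later, no database required.
with open(path) as f:
    restored = json.load(f)

print(restored == records)  # True
```

pickle works the same way for objects that JSON cannot represent, at the cost of producing a Python-only binary format.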
Title: psycopg2 on cygwin: no such process
Tags: python,django,postgresql,psycopg2 | Created: 2010-05-21T02:34:00.000 | Q_Id: 2,879,246
Categories: Database and SQL
Question (Q_Score 0, 1,204 views, 4 answers, 2 available): I am trying to install PostgreSQL under Cygwin on a Windows 7 machine and want it to work with Django. After building and installing PostgreSQL in Cygwin, I built and installed psycopg2 in Cygwin as well and got no error, but when I use it in Python under Cygwin, I get the "no such process" error: import psycopg2 Tra...
Answer 14,780,956 (score 1, normalized 0.049958, not accepted): In my case, I had to reinstall libpq5.
Answer 2,885,759 (score 0, normalized 0, not accepted): Why? There is a native psycopg2 for Windows.
Title: How to generate graphs and statistics from SQLAlchemy tables?
Tags: python,matplotlib,sqlalchemy | Created: 2010-05-23T03:10:00.000 | Q_Id: 2,890,564
Categories: Database and SQL
Question (Q_Score 1, 1,415 views, 1 answer, 1 available): After running a bunch of simulations I'm going to be outputting the results into a table created using SQLAlchemy. I plan to use this data to generate statistics - mean and variance being key. These, in turn, will be used to generate some graphs - histograms/line graphs, pie-charts and box-and-whisker plots specificall...
Answer 2,891,001 (score 1, normalized 1.2, accepted): It looks like matplotlib takes simple Python data types -- lists of numbers, etc. -- so you'll need to write custom code to massage what you pull out of mysql/sqlalchemy for input into the graphing functions...
Title: Python and Postgresql
Tags: python,postgresql | Created: 2010-05-25T13:35:00.000 | Q_Id: 2,905,097
Categories: Database and SQL
Question (Q_Score 4, 1,582 views, 7 answers, 1 available): If you wanted to manipulate the data in a table in a postgresql database using some python (maybe running a little analysis on the result set using scipy) and then wanted to export that data back into another table in the same database, how would you go about the implementation? Is the only/best way to do this to simpl...
Answer 2,906,866 (score 0, normalized 0, not accepted): I agree with the SQLAlchemy suggestions or using Django's ORM. Your needs seem too simple for PL/Python to be used.
Title: Python/Sqlite program, write as browser app or desktop app?
Tags: python,sqlite,browser | Created: 2010-05-27T19:28:00.000 | Q_Id: 2,924,231
Categories: Database and SQL
Question (Q_Score 5, 5,638 views, 8 answers, 1 available): I am in the planning stages of rewriting an Access db I wrote several years ago in a full fledged program. I have very slight experience coding, but not enough to call myself a programmer by far. I'll definitely be learning as I go, so I'd like to keep everything as simple as possible. I've decided on Python and SQL...
Answer 2,979,467 (score 0, normalized 0, not accepted): Your question is a little broad. I'll try to cover as much as I can. First, what I understood and my assumptions: in your situation, the sqlite database is just a data store. Only one process (unless your application is multiprocess) will be accessing it, so you won't need to worry about locking issues. The application ...
Title: SQL Alchemy MVC and cross controller joins
Tags: python,model-view-controller,sqlalchemy,dns,controllers | Created: 2010-05-29T04:25:00.000 | Q_Id: 2,933,796
Categories: Web Development, Database and SQL
Question (Q_Score 1, 430 views, 1 answer, 1 available): When using SQLAlchemy for abstracting your data access layer and using controllers as the way to access objects from that abstraction layer, how should joins be handled? So for example, say you have an Orders controller class that manages Order objects such that it provides getOrder, saveOrder, etc methods and likewis...
Answer 2,934,084 (score 2, normalized 1.2, accepted): Controllers are meant to encapsulate features for your convenience, not to bind your hands. If you want to join, simply join. Use the controller that you think is logically the best fit to make the query.
Title: large amount of data in many text files - how to process?
Tags: python,sql,r,large-files,large-data-volumes | Created: 2010-05-30T05:06:00.000 | Q_Id: 2,937,619
Categories: Database and SQL
Question (Q_Score 32, 5,360 views, 8 answers, 4 available): I have large amounts of data (a few terabytes) and accumulating... They are contained in many tab-delimited flat text files (each about 30MB). Most of the task involves reading the data and aggregating (summing/averaging + additional transformations) over observations/rows based on a series of predicate statements, and...
Answer 2,942,419 (score 2, normalized 0.049958, not accepted): When you say "accumulating", solution (2) looks most suitable to the problem. After the initial load into the database, you only update the database with new files (daily, weekly? depends how often you need this). In cases (1) and (3) you need to process the files each time (which was stated earlier as the most time/resource-consuming part), u...
Answer 2,937,664 (score 4, normalized 0.099668, not accepted): With terabytes, you will want to parallelize your reads over many disks anyway, so you might as well go straight into Hadoop. Use Pig or Hive to query the data; both have extensive support for user-defined transformations, so you should be able to implement what you need to do using custom code.
Answer 2,937,660 (score 1, normalized 0.024995, not accepted): Yes, you are right! I/O would cost most of your processing time. I don't suggest you use distributed systems, like Hadoop, for this task. Your task could be done on a modest workstation. I am not a Python expert; I think it has support for asynchronous programming. In F#/.Net, the platform has good support for tha...
Answer 2,937,630 (score 14, normalized 1.2, accepted): (3) is not necessarily a bad idea -- Python makes it easy to process "CSV" files (and despite the C standing for Comma, tab as a separator is just as easy to handle) and of course gets just about as much bandwidth in I/O ops as any other language. As for other recommendations: numpy, besides fast computation (which you...
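The accepted answer notes that Python's csv module handles tab-separated files as easily as comma-separated ones. A minimal sketch (the two-column field layout is invented to stand in for one of the 30 MB files):

```python
import csv
import io

# A small tab-delimited sample standing in for one of the flat text files.
raw = "id\tvalue\n1\t10.0\n2\t32.5\n"

# delimiter="\t" is the only change needed versus a comma-separated file.
reader = csv.reader(io.StringIO(raw), delimiter="\t")
header = next(reader)

# A toy aggregation over the rows, as in the question's summing/averaging task.
total = sum(float(value) for _, value in reader)

print(header, total)  # ['id', 'value'] 42.5
```

In practice you would pass an open file object instead of io.StringIO and stream rows without loading the whole file.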
Title: Transitioning from php to python/pylons/SQLAlchemy -- Are ORMs the standard now?
Tags: python,sql,orm,sqlalchemy | Created: 2010-06-01T03:49:00.000 | Q_Id: 2,947,172
Categories: Database and SQL
Question (Q_Score 1, 1,032 views, 2 answers, 2 available): Should I invest a lot of time trying to figure out an ORM style implementation, or is it still common to just stick with standard SQL queries in python/pylons/sqlalchemy?
Answer 2,947,182 (score 8, normalized 1.2, accepted): ORMs are very popular, for several reasons -- e.g.: some people would rather not learn SQL, ORMs can ease porting among different SQL dialects, they may fit in more smoothly with the mostly-OOP style of applications, indeed might even ease some porting to non-SQL implementations (e.g., moving a Django app to Google App ...
Answer 2,947,191 (score 1, normalized 0.099668, not accepted): If you have never used an ORM like SQLAlchemy before, I would suggest that you learn it - as long as you are learning the Python way. If nothing else, you will be better able to decide where/when to use it vs plain SQL. I don't think you should have to invest a lot of time on it. Documentation for SQLAlchemy is decent, ...
Title: Get the previous date in Mysql
Tags: python,mysql | Created: 2010-06-02T18:35:00.000 | Q_Id: 2,960,481
Categories: Database and SQL
Question (Q_Score 0, 530 views, 2 answers, 1 available): I have a table formatted similar to this: Date | ID | Value | Difference. I need to get the difference between a record's value column and the previous record's value column, based on the date. E.g.: 2 days ago | cow | 1 | Null; Yesterday | cow | 2 | Null; Today | cow | 3 | Null. Yesterday's difference would be 1, and tod...
Answer 2,960,589 (score 0, normalized 1.2, accepted): Use a SELECT ... WHERE date <= NOW() AND date >= (NOW() - 90000) (90,000 seconds is 25 hours, giving you a little leeway with the insert time), and then take the difference between the rows in Python.
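The answer fetches the raw rows and leaves the subtraction to Python. A sketch of that last step, assuming the rows arrive ordered by date (the sample data mirrors the cow example in the question; in real code the rows would come from cursor.fetchall()):

```python
# Rows as (date, id, value), already ordered by date ascending.
rows = [
    ("2010-05-31", "cow", 1),
    ("2010-06-01", "cow", 2),
    ("2010-06-02", "cow", 3),
]

# Difference of each row's value against the previous row's value;
# the first row has no predecessor, so its difference stays None (Null).
diffs = [None] + [curr[2] - prev[2] for prev, curr in zip(rows, rows[1:])]

print(diffs)  # [None, 1, 1]
```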
Title: How to deal with "partial" dates (2010-00-00) from MySQL in Django?
Tags: python,mysql,database,django,date | Created: 2010-06-04T02:49:00.000 | Q_Id: 2,971,198
Categories: Web Development, Database and SQL
Question (Q_Score 8, 4,080 views, 5 answers, 1 available): In one of my Django projects that use MySQL as the database, I need to have a date field that also accepts "partial" dates: only a year (YYYY), or a year and month (YYYY-MM), plus normal dates (YYYY-MM-DD). The date field in MySQL can deal with that by accepting 00 for the month and the day, so 2010-00-00 is valid in MyS...
Answer 3,027,410 (score 7, normalized 1.2, accepted): First, thanks for all your answers. None of them, as is, was a good solution for my problem, but, in your defense, I should add that I didn't give all the requirements. Still, each one helped me think about my problem, and some of your ideas are part of my final solution. So my final solution, on the DB side, is to use a va...
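MySQL-style partial dates such as 2010-00-00 don't map onto Python's datetime.date, which rejects zero months and days. One workable approach on the Python side (an illustration only, not the author's final varchar-based solution) is to parse them into a tuple with None for the unknown parts:

```python
def parse_partial_date(text):
    """Parse 'YYYY', 'YYYY-MM' or 'YYYY-MM-DD'; 00 means 'unknown'."""
    parts = [int(p) for p in text.split("-")]
    parts += [0] * (3 - len(parts))            # pad missing month/day with 0
    year, month, day = parts
    return (year, month or None, day or None)  # map 0 -> None

print(parse_partial_date("2010-00-00"))  # (2010, None, None)
print(parse_partial_date("2010-06"))     # (2010, 6, None)
print(parse_partial_date("2010-06-04"))  # (2010, 6, 4)
```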
Title: Efficient way to access a mapping of identifiers in Python
Tags: python,database,sqlite,dictionary,csv | Created: 2010-06-05T12:08:00.000 | Q_Id: 2,980,257
Categories: Data Science and Machine Learning, Database and SQL
Question (Q_Score 3, 109 views, 1 answer, 1 available): I am writing an app to do a file conversion, and part of that is replacing old account numbers with new account numbers. Right now I have a CSV file mapping the old and new account numbers with around 30K records. I read this in and store it as a dict, and when writing the new file I grab the new account from the dict by k...
Answer 2,980,269 (score 1, normalized 1.2, accepted): As long as they will all fit in memory, a dict will be the most efficient solution. It's also a lot easier to code. 100k records should be no problem on a modern computer. You are right that switching to an SQLite database is a good choice when the number of records gets very large.
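The accepted answer's dict-based lookup can be sketched as follows (the CSV layout with old/new column headers is an assumption; the question doesn't give the file format):

```python
import csv
import io

# Stand-in for the ~30K-row mapping file: old account number, new account number.
mapping_csv = "old,new\nA100,B900\nA101,B901\n"

# Build the lookup table once at startup.
reader = csv.DictReader(io.StringIO(mapping_csv))
mapping = {row["old"]: row["new"] for row in reader}

# Constant-time lookups while rewriting the file being converted.
print(mapping["A101"])  # B901
```

Each lookup is O(1) on average, which is why the dict wins until the table no longer fits in memory.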
Title: How to save big "database-like" class in python
Tags: python,serialization,pickle,object-persistence | Created: 2010-06-07T15:52:00.000 | Q_Id: 2,990,995
Categories: Data Science and Machine Learning, Database and SQL
Question (Q_Score 3, 306 views, 3 answers, 1 available): I'm doing a project with a reasonably big database. It's not a proper DB file, but a class with a format as follows: DataBase.Nodes.Data=[[] for i in range(1,1000)]. All together this database is something like a few thousand rows. First question - is the way I'm doing it efficient, or is it better to use SQL, or any othe...
Answer 2,991,030 (score 3, normalized 0.197375, not accepted): Pickle (cPickle) can handle any (picklable) Python object. So as long as you're not trying to pickle a thread or a file handle or something like that, you're OK.
Title: Web application architecture, and application servers?
Tags: php,python,model-view-controller,cakephp,application-server | Created: 2010-06-11T10:12:00.000 | Q_Id: 3,021,921
Categories: Web Development, Database and SQL
Question (Q_Score 3, 1,161 views, 2 answers, 1 available): I'm building a web application, and I need to use an architecture that allows me to run it over two servers. The application scrapes information from other sites periodically, and on input from the end user. To do this I'm using PHP+curl to scrape the information, and PHP or Python to parse it and store the results in a M...
Answer 3,022,395 (score 2, normalized 1.2, accepted): "How do I go about implementing this?" is too big a question for an answer here. Certainly you don't want two sets of code for the scraping (one for scheduled runs, one for on-demand); in addition to the added complication, you really don't want to be running a job which will take an indefinite time to complete within the thread generated b...
Title: What's a better choice for SQL-backed number crunching - Ruby 1.9, Python 2, Python 3, or PHP 5.3?
Tags: php,python,ruby,performance,math | Created: 2010-06-11T11:13:00.000 | Q_Id: 3,022,232
Categories: Database and SQL
Question (Q_Score 4, 451 views, 2 answers, 2 available): Criteria for 'better': fast in math and simple (few fields, many records) db transactions; convenient to develop/read/extend; flexible; connectible. The task is to use a common web development scripting language to process and calculate long time series and multidimensional surfaces (mostly selecting/inserting sets of ...
Answer 3,022,304 (score 4, normalized 0.379949, not accepted): The best option is probably the language you're most familiar with. My second consideration would be whether you need to use any special maths libraries and whether they're supported in each of the languages.
Answer 3,022,242 (score 10, normalized 1.2, accepted): I would suggest Python with its great scientific/mathematical libraries (SciPy, NumPy). Otherwise the languages do not differ so much, although I doubt that Ruby, PHP or JS can keep up with the speed of Python or Perl. And as the comments below say: at this moment, go for the latest Python 2 (which is Pyth...
Title: in-memory database in Python
Tags: python,sql,database,in-memory-database | Created: 2010-06-15T17:13:00.000 | Q_Id: 3,047,412
Categories: Database and SQL, Python Basics and Environment
Question (Q_Score 32, 44,308 views, 6 answers, 1 available): I'm doing some queries in Python on a large database to get some stats out of the database. I want these stats to be in-memory so other programs can use them without going to a database. I was thinking of how to structure them, and after trying to set up some complicated nested dictionaries, I realized that a good rep...
Answer 65,153,849 (score 1, normalized 0.033321, not accepted): In-memory databases usually do not support a memory paging option (for the whole database or certain tables), i.e. the total size of the database should be smaller than the available physical memory or maximum shared memory size. Depending on your application, data-access pattern, size of database and available system memory...
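For stats that must stay resident, the standard library's sqlite3 module offers an in-memory database with full SQL querying. A minimal sketch (the table layout and values are invented for illustration):

```python
import sqlite3

# ":memory:" creates a database that lives only in RAM and vanishes on close.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stats (metric TEXT, value REAL)")
conn.executemany(
    "INSERT INTO stats VALUES (?, ?)",
    [("mean", 4.2), ("variance", 1.1)],
)

# Query the in-memory table like any other SQLite database.
(value,) = conn.execute(
    "SELECT value FROM stats WHERE metric = ?", ("mean",)
).fetchone()
print(value)  # 4.2
conn.close()
```

Note that a plain ":memory:" database is private to the connection that created it, so "other programs" would still need some sharing mechanism; it mainly replaces the complicated nested dictionaries within one process.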
Title: Django ORM and PostgreSQL connection limits
Tags: python,database,django,postgresql,django-orm | Created: 2010-06-15T22:45:00.000 | Q_Id: 3,049,625
Categories: Web Development, Database and SQL
Question (Q_Score 3, 2,568 views, 1 answer, 1 available): I'm running a Django project on Postgresql 8.1.21 (using Django 1.1.1, Python2.5, psycopg2, Apache2 with mod_wsgi 3.2). We've recently encountered this lovely error: OperationalError: FATAL: connection limit exceeded for non-superusers. I'm not the first person to run up against this. There's a lot of discussion about ...
Answer 3,049,796 (score 1, normalized 1.2, accepted): This could be caused by other things. For example, configuring Apache/mod_wsgi in a way that theoretically it could accept more concurrent requests than what the database itself may be able to accept at the same time. Have you reviewed your Apache/mod_wsgi configuration and compared the limit on maximum clients to that of ...
Title: Query crashes MS Access
Tags: python,sql,ms-access,crm | Created: 2010-06-17T19:13:00.000 | Q_Id: 3,064,830
Categories: Database and SQL
Question (Q_Score 2, 4,410 views, 4 answers, 2 available): THE TASK: I am in the process of migrating a DB from MS Access to Maximizer. In order to do this I must take 64 tables in MS Access and merge them into one. The output must be in the form of a TAB or CSV file, which will then be imported into Maximizer. THE PROBLEM: Access is unable to perform a query that is so comple...
Answer 3,073,339 (score 0, normalized 0, not accepted): I'm not even clear on what you're trying to do. I assume your problem is that Jet/ACE can't handle a UNION with that many SELECT statements. If you have 64 identically-structured tables and you want them in a single CSV, I'd create a temp table in Access, append each table in turn, then export from the temp table to C...
Answer 3,064,852 (score 1, normalized 0.049958, not accepted): I would recommend #2 if the merge is fairly simple and straightforward and doesn't need the power of an RDBMS. I'd go with #1 if the merge is more complex and you will need to write some actual queries to get the data merged properly.
Title: Does Python Django support custom SQL and denormalized databases with no Foreign Key relationships?
Tags: python,mysql,django | Created: 2010-06-17T23:07:00.000 | Q_Id: 3,066,255
Categories: Web Development, Database and SQL
Question (Q_Score 2, 640 views, 5 answers, 3 available): I've just started learning Python Django and have a lot of experience building high traffic websites using PHP and MySQL. What worries me so far is Python's overly optimistic approach that you will never need to write custom SQL and that it automatically creates all these Foreign Key relationships in your database. The...
Answer 3,320,441 (score 0, normalized 0, not accepted): I concur with the 'no foreign keys' advice (with the disclaimer: I also work for Percona). The reason it is recommended is concurrency / reducing locking internally. It can be a difficult "optimization" to sell, but if you consider that the database has transactions (and is more or less ACID compliant) then ...
Answer 3,066,274 (score 0, normalized 0, not accepted): django-admin inspectdb allows you to reverse-engineer a models file from existing tables. That is only a very partial response to your question ;)
Answer 3,066,360 (score 0, normalized 0, not accepted): You can just create the model.py and avoid having the ORM automatically create the tables, leaving it up to you to define the actual tables as you please. So although there are foreign key relationships in the model.py, this does not mean that they must exist in the actual tables. This is a very good thing consider...
Title: Generate table schema inspecting Excel(CSV) and import data
Tags: python,mysql,excel,csv,import-from-excel | Created: 2010-06-18T13:40:00.000 | Q_Id: 3,070,094
Categories: Database and SQL
Question (Q_Score 6, 7,011 views, 5 answers, 3 available): How would I go about creating a MySQL table schema by inspecting an Excel (or CSV) file? Are there any ready Python libraries for the task? Column headers would be sanitized to column names. Datatypes would be estimated based on the contents of the spreadsheet column. When done, data would be loaded into the table. I have an...
Answer 3,072,109 (score 1, normalized 0.039979, not accepted): As far as I know, there is no tool that can automate this process (I would love for someone to prove me wrong, as I've had this exact problem before). When I did this, I came up with two options: (1) manually create the columns in the db with the appropriate types and then import, or (2) write some kind of filter that...
Answer 3,071,074 (score 1, normalized 0.039979, not accepted): Quick and dirty workaround with phpMyAdmin: create a table with the right number of columns; make sure the data fits the columns; import the CSV into the table; use the proposed table structure.
Answer 3,169,710 (score 1, normalized 1.2, accepted): Just for (my) reference, I documented below what I did: XLRD is practical, however I just saved the Excel data as CSV so I could use LOAD DATA INFILE. I copied the header row and started writing the import and normalization script. The script does: CREATE TABLE with all columns as TEXT, except for the primary key; query mys...
Title: Does GQL automatically add an "ID" Property
Tags: python,google-app-engine,gql | Created: 2010-06-19T20:38:00.000 | Q_Id: 3,077,156
Categories: Web Development, System Administration and DevOps, Database and SQL
Question (Q_Score 1, 282 views, 3 answers, 2 available): I currently work with Google's AppEngine and I could not find out whether a Google datastore object entry has an ID by default, and if not, how I can add such a field and let it increase automatically. Regards,
Answer 3,078,018 (score 4, normalized 0.26052, not accepted): An object has a Key, part of which is either an automatically-generated numeric ID or an assigned key name. IDs are not guaranteed to be increasing, and they're almost never going to be consecutive, because they're allocated to an instance in big chunks, and IDs unused by the instance to which they're allocated will n...
Answer 3,077,170 (score 3, normalized 1.2, accepted): Yes, they have IDs by default, and the property is named ID as you mentioned.
Title: SimpleDB query performance improvement using boto
Tags: python,amazon-simpledb,boto | Created: 2010-06-23T15:38:00.000 | Q_Id: 3,103,145
Categories: Web Development, Database and SQL
Question (Q_Score 2, 1,743 views, 3 answers, 1 available): I am trying to use SimpleDB in the following way. I want to keep 48 hours' worth of data in SimpleDB at any time and query it for different purposes. Each domain holds 1 hour's worth of data, so at any time there are 48 domains present in SimpleDB. As new data is constantly uploaded, I delete the oldest domain and create a...
Answer 9,012,699 (score 0, normalized 0, not accepted): I have had the same issue as you, Charlie. After profiling the code, I narrowed the performance problem down to SSL. It seems like that is where it is spending most of its time, and hence CPU cycles. I have read of a problem in the httplib library (which boto uses for SSL) where the performance doesn't increase unl...
Title: Remote execution of commands using the Django ORM
Tags: python,django,orm | Created: 2010-06-26T12:05:00.000 | Q_Id: 3,123,801
Categories: Web Development, Database and SQL
Question (Q_Score 0, 121 views, 1 answer, 1 available): Can I somehow work with remote databases (if they support it) through the Django ORM? The settings currently spell out only the local database; I would like to periodically connect to various external databases and perform commands such as loading a dump.
Answer 3,125,012 (score 1, normalized 0.197375, not accepted): If you can connect to the database remotely, then you can simply specify its host/port in settings.py exactly as you would a local one.
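The answer above refers to the host/port fields in Django's settings.py. A hedged sketch of what that looks like (names and credentials are placeholders, and this assumes the modern DATABASES-style configuration rather than the older single-database settings of 2010-era Django):

```python
# settings.py -- pointing Django's ORM at a remote PostgreSQL server.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",            # placeholder database name
        "USER": "myuser",          # placeholder credentials
        "PASSWORD": "secret",
        "HOST": "db.example.com",  # remote host instead of localhost
        "PORT": "5432",
    }
}
```

Multiple entries in DATABASES (e.g. a second alias per external database) let the ORM address several remote servers from one project.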
Title: What should I use for the backend of a 'social' website?
Tags: python,sql,mysql,database | Created: 2010-06-27T02:19:00.000 | Q_Id: 3,126,155
Categories: Web Development, Database and SQL
Question (Q_Score 4, 749 views, 4 answers, 1 available): My two main requirements for the site are related to degrees of separation and graph matching (given two graphs, return some kind of similarity score). My first thought was to use MySql to do it, which would probably work out okay for storing how I want to manage 'friends' (similar to Twitter), but I'm thinking if I wa...
Answer 3,126,208 (score 2, normalized 0.099668, not accepted): MySQL is really your best choice for the database unless you want to go proprietary. As for the actual language, pick whatever you are familiar with. While YouTube and Reddit are written in Python, many of the other large sites use Ruby (Hulu, Twitter, Techcrunch), C++ (Google) or PHP (Facebook, Yahoo, etc.).
Title: python sql interval
Tags: python,sql,postgresql,pygresql | Created: 2010-06-28T17:33:00.000 | Q_Id: 3,134,699
Categories: Database and SQL
Question (Q_Score 0, 1,424 views, 1 answer, 1 available): With PostgreSQL, one of my tables has an 'interval' column, values of which I would like to extract as something I can manipulate (datetime.timedelta?); however I am using PyGreSQL, which seems to be returning intervals as strings, which is less than helpful. Where should I be looking to either parse the interval or mak...
Answer 3,137,124 (score 3, normalized 1.2, accepted): Use Psycopg 2. It correctly converts between Postgres's interval data type and Python's timedelta.
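While the accepted answer recommends switching drivers, the interval strings PyGreSQL returns can also be parsed by hand. A sketch that handles the common 'HH:MM:SS' form (an illustration only: real Postgres intervals can also carry days, months, and fractional seconds, which this deliberately ignores):

```python
from datetime import timedelta

def interval_to_timedelta(text):
    """Parse a Postgres 'HH:MM:SS' interval string into a timedelta."""
    hours, minutes, seconds = (int(part) for part in text.split(":"))
    return timedelta(hours=hours, minutes=minutes, seconds=seconds)

print(interval_to_timedelta("01:30:00"))  # 1:30:00
```

Psycopg 2's automatic conversion remains the robust choice, since it covers the interval forms this sketch does not.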
Title: Netbeans + sqlite3 = Fail?
Tags: python,sqlite | Created: 2010-06-30T12:50:00.000 | Q_Id: 3,149,370
Categories: Database and SQL
Question (Q_Score 0, 184 views, 1 answer, 1 available): I've decided to give Python a try on NetBeans. The problem so far is when I try to run a program I know works, i.e. if I ran it through the terminal. For the project I selected the correct Python version (2.6.5), and received the following error: Traceback (most recent call last): File "/Users/XXX/NetBeansProjects/New...
Answer 3,151,119 (score 0, normalized 0, not accepted): Search for PYTHONPATH. You probably have different settings in your OS and NetBeans.
Title: Python MySQL Performance: Runs fast in mysql command line, but slow with cursor.execute
Tags: python,mysql,performance | Created: 2010-07-06T16:39:00.000 | Q_Id: 3,188,289
Categories: Database and SQL
Question (Q_Score 0, 705 views, 1 answer, 1 available): I'm writing a script for exporting some data. Some details about the environment: the project is Django based; I'm using raw/custom SQL for the export; the database engine is MySQL; the database and code are on the same box. Details about the SQL: a bunch of inner joins; a bunch of columns selected, some with a basic ...
Answer 3,188,555 (score 0, normalized 0, not accepted): Two ideas: MySQL may have query caching enabled, which makes it difficult to get accurate timing when you run the same query repeatedly - try changing the ID in your query to make sure that it really does run in 3-4 seconds consistently. Also try using strace on the Python process to see what it is doing during this time.
Title: Google application engine Datastore - any alternatives to aggregate functions and group by?
Tags: python,google-app-engine | Created: 2010-07-09T07:21:00.000 | Q_Id: 3,210,577
Categories: Web Development, System Administration and DevOps, Database and SQL
Question (Q_Score 0, 322 views, 1 answer, 1 available): As is mentioned in the docs for Google App Engine, it does not support GROUP BY and other aggregation functions. Are there any alternatives to implement the same functionality? I am working on a project where I need it on an urgent basis; being a large database, it's not efficient to iterate the result set and then perform th...
Answer 3,211,471 (score 1, normalized 1.2, accepted): The best way is to populate the summaries (aggregates) at the time of the write. This way your reads will be faster, since they just read - at the cost of writes, which will have to update the summaries if they are likely to be affected by the write. Hopefully you will be reading more often than writing/updating summaries.
Title: Many-to-many relationships in Google AppEngine - efficient?
Tags: python,google-app-engine,performance,many-to-many | Created: 2010-07-09T08:21:00.000 | Q_Id: 3,210,994
Categories: Web Development, System Administration and DevOps, Database and SQL
Question (Q_Score 1, 280 views, 1 answer, 1 available): I'm using Google Appengine to store a list of favorites, linking a Facebook UserID to one or more IDs from Bing. I need function calls returning the number of users who have favorited an item, and the number of times an item has been favorited (and by whom). My question is, should I resolve this relationship into two t...
Answer 3,213,988 (score 0, normalized 1.2, accepted): I don't think there's a hard and fast answer to questions like this. "Is this optimization worth it?" always depends on many variables, such as: is the lack of optimization actually a problem to start with? How much of a problem is it? What's the cost in terms of extra time and effort and risk of bugs of a more complex o...
Title: MySQLdb Handle Row Lock
Tags: python,mysql | Created: 2010-07-09T19:47:00.000 | Q_Id: 3,216,027
Categories: Database and SQL
Question (Q_Score 0, 544 views, 1 answer, 1 available): I'm using MySQLdb, and when I perform an UPDATE to a table row I sometimes get an infinite process hang. At first I thought, maybe it's COMMIT, since the table is InnoDB, but even with autocommit(True) and db.commit() after each update I still get the hang. Is it possible there is a row lock and the query just fails to ca...
Answer 3,216,500 (score 1, normalized 1.2, accepted): Depending on your user privileges, you can execute SHOW PROCESSLIST or SELECT from information_schema.processlist while the UPDATE hangs to see if there is a contention issue with another query. Also do an EXPLAIN on a SELECT of the WHERE clause used in the UPDATE to see if you need to change the statement. If it's ...
Title: Guide to install xampp with zope-plone on the same linux machine?
Tags: python,apache,xampp,plone,zope | Created: 2010-07-13T00:09:00.000 | Q_Id: 3,233,246
Categories: Web Development, Database and SQL
Question (Q_Score 0, 486 views, 2 answers, 1 available): Is there a good step-by-step online guide to install XAMPP (Apache server, MySQL server) together with Zope/Plone on the same Linux machine and make them play nicely, or do I have to go through their confusing documentation? Or how can I install this configuration in the best way? I can install and use both separately, but...
Answer 3,247,954 (score 0, normalized 1.2, accepted): Sorry for the wrong site, but I just figured out that it was not a problem at all. I installed XAMPP (a snap) and downloaded and ran the Plone install script. Both sites - XAMPP on port 80 and Zope/Plone on 8080 - are working without problems. Just to let everyone know. I don't know why I got nervous about this :)
Title: Quick and dirty reports based on a SQL query
Tags: c#,javascript,python,sql,database | Created: 2010-07-13T23:59:00.000 | Q_Id: 3,242,448
Categories: Database and SQL
Question (Q_Score 0, 325 views, 2 answers, 1 available): I never thought I'd ever say this, but I'd like to have something like the report generator in Microsoft Access. Very simple: just list data from a SQL query. I don't really care what language is used as long as I can get it done fast. C#, C++, Python, Javascript... I want to know the quickest (development sense) way to d...
Answer 3,249,163 (score 0, normalized 0, not accepted): Some suggestions: 1) ASP.NET GridView - use the free Visual Studio to create an asp.net page (can do VB, C#, etc.); drag/drop a GridView control onto your page, then connect it to your data and display fields, all via wizard (you did say quick and dirty, correct?). No coding required if you can live with...
Installed Django from source (python setup.py install and such), installed MySQLdb from source (python setup.py build, python setup.py install). Using Python 2.4 which came installed on the OS (CentOS 5.5). Getting the following error message after launching the server: Error loading MySQLdb module: No module named MyS...
19
1
0.019997
0
false
23,076,238
1
38,547
1
0
0
3,243,073
Try this. If you are using Linux: sudo apt-get install python-mysqldb. On Windows: pip install python-mysqldb or easy_install python-mysqldb. Hope this works.
1
0
0
Django unable to find MySQLdb python module
10
python,django,mysql
0
2010-07-14T02:49:00.000
I am a PHP guy. In PHP I mainly use the Doctrine ORM to deal with database issues. I have recently been considering a move to Python + Django. I know Python but don't have experience with Django. Can anyone who has good knowledge of both Doctrine and Django's ORM give me a comparison of the features of these two ORM implementations?
3
1
0.049958
0
false
8,543,708
1
5,103
3
0
0
3,249,977
I've used Doctrine on a 2-year project that ended 1.5 years ago; since then I've been doing mostly Django. I prefer Django's ORM over Doctrine any day: more features, more consistency, faster and shinier.
1
0
0
ORM in Django vs. PHP Doctrine
4
php,python,django,orm,doctrine
0
2010-07-14T20:01:00.000
I am a PHP guy. In PHP I mainly use the Doctrine ORM to deal with database issues. I have recently been considering a move to Python + Django. I know Python but don't have experience with Django. Can anyone who has good knowledge of both Doctrine and Django's ORM give me a comparison of the features of these two ORM implementations?
3
5
0.244919
0
false
12,267,439
1
5,103
3
0
0
3,249,977
I am a rare person who had to switch from Django 1.4 to Symfony 2.1, so I had to use Doctrine 2 instead of the current Django ORM. Maybe Doctrine can do many things, but let me tell you that coming from Django it is a nightmare for me to use. I'm tired of the verbosity of PHP/Symfony/Doctrine ... Also I never needed som...
1
0
0
ORM in Django vs. PHP Doctrine
4
php,python,django,orm,doctrine
0
2010-07-14T20:01:00.000
I am a PHP guy. In PHP I mainly use the Doctrine ORM to deal with database issues. I have recently been considering a move to Python + Django. I know Python but don't have experience with Django. Can anyone who has good knowledge of both Doctrine and Django's ORM give me a comparison of the features of these two ORM implementations?
3
-1
-0.049958
0
false
3,250,203
1
5,103
3
0
0
3,249,977
Django isn't just an ORM; it is a web framework like Symfony. The form framework in Symfony is modeled on Django's, for example. Its ORM part is more like Doctrine 2, I think, but I haven't played with either much.
1
0
0
ORM in Django vs. PHP Doctrine
4
php,python,django,orm,doctrine
0
2010-07-14T20:01:00.000
We need to be able to inform a Delphi application in case there are changes to some of our tables in MySQL. Delphi clients are in the Internet behind a firewall, and they have to be authenticated before connecting to the notification server we need to implement. The server can be programmed using for example Java, PHP ...
4
0
0
0
false
3,255,344
0
1,596
2
0
0
3,255,330
Why not use the XMPP protocol (aka Jabber)?
1
0
0
How to create a notification server which informs Delphi application when database changes?
3
java,php,python,mysql,delphi
0
2010-07-15T12:02:00.000
We need to be able to inform a Delphi application in case there are changes to some of our tables in MySQL. Delphi clients are in the Internet behind a firewall, and they have to be authenticated before connecting to the notification server we need to implement. The server can be programmed using for example Java, PHP ...
4
1
0.066568
0
false
3,255,395
0
1,596
2
0
0
3,255,330
There are Apache Camel and Spring Integration; both provide ways to send messages across.
1
0
0
How to create a notification server which informs Delphi application when database changes?
3
java,php,python,mysql,delphi
0
2010-07-15T12:02:00.000
Due to the nature of my application, I need to support fast inserts of large volumes of data into the database. Using executemany() increases performance, but there's a caveat. For example, MySQL has a configuration parameter called max_allowed_packet, and if the total size of my insert queries exceeds its value, MySQL...
1
2
1.2
0
true
3,271,153
0
2,103
1
0
0
3,267,580
I had a similar problem recently and used the - not very elegant - workaround: First I parse my.cnf for a value for max_allowed_packet; if I can't find it, the maximum is set to a default value. All data items are stored in a list. Next, for each data item I count the approximate byte length (with strings, it's the l...
1
0
0
SQLAlchemy and max_allowed_packet problem
1
python,mysql,sqlalchemy,large-query
0
2010-07-16T17:54:00.000
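The workaround described in the answer above can be sketched roughly as follows. This is a minimal, hypothetical version: the byte-size estimate is deliberately crude, and sqlite3 stands in for MySQL just to keep the demo self-contained (the same batching applies to any DB-API executemany).

```python
import sqlite3

def chunk_rows(rows, max_bytes):
    """Split rows into batches whose approximate encoded size stays under max_bytes."""
    batch, size = [], 0
    for row in rows:
        # Rough per-row size estimate: string lengths plus a small per-field overhead.
        row_size = sum(len(str(v)) + 8 for v in row)
        if batch and size + row_size > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(row)
        size += row_size
    if batch:
        yield batch

# Demo against sqlite3 (stand-in for MySQL and its max_allowed_packet limit).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b TEXT)")
rows = [("x" * 50, "y" * 50) for _ in range(100)]
for batch in chunk_rows(rows, max_bytes=1000):
    conn.executemany("INSERT INTO t VALUES (?, ?)", batch)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 100
```

With real MySQL you would read max_allowed_packet from the server (or my.cnf) instead of hard-coding 1000.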
I am trying to setup a website in django which allows the user to send queries to a database containing information about their representatives in the European Parliament. I have the data in a comma seperated .txt file with the following format: Parliament, Name, Country, Party_Group, National_Party, Position 7, Mart...
13
2
0.066568
0
false
3,275,298
1
7,034
1
0
0
3,270,952
You asked what the create(**dict(zip(fields, row))) line does. I don't know how to reply directly to your comment, so I'll try to answer it here. zip takes multiple lists as args and returns a list of their corresponding elements as tuples: zip(list1, list2) => [(list1[0], list2[0]), (list1[1], list2[1]), ...]. dict take...
1
0
0
Populating a SQLite3 database from a .txt file with Python
6
python,django,sqlite
0
2010-07-17T09:38:00.000
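The zip/dict idiom explained in the answer above can be seen in isolation. The field names and sample row below are hypothetical stand-ins for the MEP data in the question:

```python
fields = ["parliament", "name", "country", "party_group", "national_party", "position"]
row = ["7", "Martin", "Sweden", "PPE", "Moderaterna", "Member"]

# zip pairs corresponding elements; dict turns the pairs into a mapping.
pairs = list(zip(fields, row))
record = dict(pairs)

print(record["name"])     # Martin
print(record["country"])  # Sweden

# In the Django loader this mapping is splatted into the model constructor:
# Mep.objects.create(**record)   # hypothetical model name
```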
I'm having all sorts of trouble trying to instal MySQLdb (1.2.2) on snow leopard. I am running python 2.5.1 and MySQL 5.1 32bit. Python and MySQL are running just fine. I've also installed django 1.2.1, although I don't think thats all that important, but wanted to give an idea of the stack i'm trying to install. I am ...
2
0
0
0
false
3,285,926
0
461
1
0
0
3,285,631
I tried to solve this one for days myself and finally gave up; I switched to Postgres. It works pretty well with Django on Snow Leopard, with one minor problem: for some reason auto-increment pk ids don't get assigned to some models. I solved the problem by randomly assigning an id from a large random range and relying on...
1
0
0
Installing MySQLdb on Snow Leopard
3
python,mysql,django
0
2010-07-19T22:34:00.000
I'm playing around with a little web app in web.py, and am setting up a url to return a JSON object. What's the best way to convert a SQL table to JSON using python?
54
1
0.014285
0
false
55,329,857
0
157,377
1
0
0
3,286,525
If you are using SQL Server 2016 or above, you can make your SELECT query return JSON by using the FOR JSON AUTO clause, e.g. SELECT name, surname FROM users FOR JSON AUTO will return JSON as [{"name": "Jane","surname": "Doe" }, {"name": "Foo","surname": "Samantha" }, ..., {"name": "John", "surname": "boo" }...
1
0
1
return SQL table as JSON in python
14
python,sql,json
0
2010-07-20T02:16:00.000
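The answer above is MSSQL-specific; for the Python side of the question, a portable sketch builds the same JSON from any DB-API cursor. sqlite3 and the users table here are stand-ins chosen only to keep the example self-contained:

```python
import json
import sqlite3

def table_to_json(conn, query, params=()):
    """Run a SELECT and return the result set as a JSON array of objects."""
    cur = conn.execute(query, params)
    cols = [d[0] for d in cur.description]  # column names from the cursor
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, surname TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("Jane", "Doe"), ("Foo", "Samantha")])
print(table_to_json(conn, "SELECT name, surname FROM users"))
# [{"name": "Jane", "surname": "Doe"}, {"name": "Foo", "surname": "Samantha"}]
```

In a web.py handler you would return this string with a Content-Type of application/json.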
I have 5 python cgi pages. I can navigate from one page to another. All pages get their data from the same database table just that they use different queries. The problem is that the application as a whole is slow. Though they connect to the same database, each page creates a new handle every time I visit it and handl...
1
0
0
0
false
3,289,546
1
179
1
0
0
3,289,330
Django and Pylons are both frameworks that solve this problem quite nicely, namely by abstracting the DB-frontend integration. They are worth considering.
1
0
0
Improving performance of cgi
2
python,cgi
1
2010-07-20T11:15:00.000
I need to save an image file into sqlite database in python. I could not find a solution. How can I do it? Thanks in advance.
8
0
0
0
false
3,310,034
0
15,288
2
0
0
3,309,957
It's generally not a good idea to store raw binary data in a database. Couldn't you just save the file on the filesystem and record the path to it in the database?
1
0
0
pysqlite - how to save images
4
python,image,sqlite,blob,pysqlite
0
2010-07-22T14:29:00.000
I need to save an image file into sqlite database in python. I could not find a solution. How can I do it? Thanks in advance.
8
11
1.2
0
true
3,310,995
0
15,288
2
0
0
3,309,957
Write: cursor.execute('insert into File (id, name, bin) values (?,?,?)', (id, name, sqlite3.Binary(file.read()))). Read: row = cursor.execute('select bin from File where id=?', (id,)).fetchone(); the blob is then row[0]. If you need to return the binary data in a web app, return cStringIO.StringIO(row[0]).
1
0
0
pysqlite - how to save images
4
python,image,sqlite,blob,pysqlite
0
2010-07-22T14:29:00.000
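A runnable version of the accepted answer's write/read, with an in-memory database and a fake byte payload standing in for file.read() so the round trip can be checked end to end:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE File (id INTEGER PRIMARY KEY, name TEXT, bin BLOB)")

# Stand-in for file.read(): any bytes payload (e.g. real PNG data) works the same way.
image_bytes = b"\x89PNG\r\n\x1a\nfake image payload"

# Write: wrap the raw bytes in sqlite3.Binary so they are stored as a BLOB.
conn.execute("INSERT INTO File (id, name, bin) VALUES (?, ?, ?)",
             (1, "photo.png", sqlite3.Binary(image_bytes)))
conn.commit()

# Read: the BLOB comes back as bytes, unchanged.
row = conn.execute("SELECT bin FROM File WHERE id = ?", (1,)).fetchone()
print(row[0] == image_bytes)  # True
```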
This is a tricky question, we've been talking about this for a while (days) and haven't found a convincingly good solution. This is the situation: We have users and groups. A user can belong to many groups (many to many relation) There are certain parts of the site that need access control, but: There are certain ROWS...
2
2
0.099668
0
false
3,327,313
1
214
3
0
0
3,327,279
This problem is not really new; it's basically the general problem of authorization and access rights/control. In order to avoid having to model and maintain a complete graph of exactly what objects each user can access in each possible way, you have to make decisions (based on what your application does) about how to ...
1
0
0
Control access to parts of a system, but also to certain pieces of information
4
python,access-control
0
2010-07-24T23:11:00.000
This is a tricky question, we've been talking about this for a while (days) and haven't found a convincingly good solution. This is the situation: We have users and groups. A user can belong to many groups (many to many relation) There are certain parts of the site that need access control, but: There are certain ROWS...
2
0
0
0
false
3,327,325
1
214
3
0
0
3,327,279
It's hard to be specific without knowing more about your setup and about why exactly you need different users to have different permissions on different rows. But generally, I would say that whenever you access any data in the database in your code, you should precede it by an authorization check, which examines the cu...
1
0
0
Control access to parts of a system, but also to certain pieces of information
4
python,access-control
0
2010-07-24T23:11:00.000
This is a tricky question, we've been talking about this for a while (days) and haven't found a convincingly good solution. This is the situation: We have users and groups. A user can belong to many groups (many to many relation) There are certain parts of the site that need access control, but: There are certain ROWS...
2
0
0
0
false
3,327,726
1
214
3
0
0
3,327,279
Add an additional column, "category" or "type", to the table(s) that will categorize the rows (or, if you will, group/cluster them), and then create a pivot table that defines the access control between (rowCategory, userGroup). So for each row, by its category you can pull which userGroups have access (and what kind of ac...
1
0
0
Control access to parts of a system, but also to certain pieces of information
4
python,access-control
0
2010-07-24T23:11:00.000
I'm using MongoDB, a NoSQL database. Basically, as a result of a query I have a list of dicts which themselves contain lists of dictionaries... which I need to work with. Unfortunately, dealing with all this data within Python can be brought to a crawl when there is too much data. I have never had to deal with this probl...
0
1
0.066568
0
false
3,333,193
0
406
2
0
0
3,330,668
Are you loading all the data into memory at once? If so, you could be causing the OS to swap memory to disk, which can bring any system to a crawl. Dictionaries are hashtables, so even an empty dict uses up a lot of memory, and from what you say you are creating a lot of them at once. I don't know the MongoDB API,...
1
0
1
Speeding up parsing of HUGE lists of dictionaries - Python
3
python,parsing,list,sorting,dictionary
0
2010-07-25T19:23:00.000
I'm using MongoDB, a NoSQL database. Basically, as a result of a query I have a list of dicts which themselves contain lists of dictionaries... which I need to work with. Unfortunately, dealing with all this data within Python can be brought to a crawl when there is too much data. I have never had to deal with this probl...
0
3
1.2
0
true
3,333,236
0
406
2
0
0
3,330,668
Do you really want all of that data back in your Python program? If so fetch it back a little at a time, but if all you want to do is summarise the data then use mapreduce in MongoDB to distribute the processing and just return the summarised data. After all, the point about using a NoSQL database that cleanly shards a...
1
0
1
Speeding up parsing of HUGE lists of dictionaries - Python
3
python,parsing,list,sorting,dictionary
0
2010-07-25T19:23:00.000
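The accepted answer's advice (fetch a little at a time, or summarise instead of materialising everything) can be sketched with a generator standing in for a MongoDB cursor. The documents and the aggregation are hypothetical; the point is that only one document is ever held in memory at a time:

```python
def fetch_documents():
    """Stand-in for a MongoDB cursor: yields documents one at a time
    instead of materialising the whole result set as a list of dicts."""
    for i in range(10000):
        yield {"_id": i, "value": i % 7}

def summarise(docs):
    """Aggregate incrementally, like a client-side map/reduce."""
    counts = {}
    for doc in docs:
        counts[doc["value"]] = counts.get(doc["value"], 0) + 1
    return counts

counts = summarise(fetch_documents())
print(sum(counts.values()))  # 10000
```

With real pymongo the cursor returned by find() already behaves this way; the trap is calling list() on it (or building huge intermediate dicts) before summarising.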
I was using Python 2.6.5 to build my application, which came with sqlite3 3.5.9. Apparently though, as I found out in another question of mine, foreign key support wasn't introduced in sqlite3 until version 3.6.19. However, Python 2.7 comes with sqlite3 3.6.21, so this works -- I decided I wanted to use foreign keys in ...
9
6
1
0
false
3,341,117
0
4,281
2
0
0
3,333,095
Download the latest version of sqlite3.dll from the SQLite website and replace the sqlite3.dll in the Python directory.
1
0
1
How can I upgrade the sqlite3 package in Python 2.6?
3
python,build,linker,sqlite
0
2010-07-26T07:54:00.000
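Whichever route you take (replacing the DLL or upgrading Python), it's worth verifying which SQLite library Python actually links against before relying on foreign keys. A small check, assuming a modern interpreter:

```python
import sqlite3

# Version of the SQLite C library Python is linked against; this is what
# determines foreign-key support (available from 3.6.19 onward).
print(sqlite3.sqlite_version)

def version_tuple(v):
    """Compare numerically, not as strings, so '3.10.0' sorts after '3.9.0'."""
    return tuple(int(part) for part in v.split("."))

has_fk_support = version_tuple(sqlite3.sqlite_version) >= (3, 6, 19)
print(has_fk_support)

if has_fk_support:
    conn = sqlite3.connect(":memory:")
    # Foreign keys are off by default and must be enabled per connection.
    conn.execute("PRAGMA foreign_keys = ON")
    print(conn.execute("PRAGMA foreign_keys").fetchone()[0])  # 1
```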
I was using Python 2.6.5 to build my application, which came with sqlite3 3.5.9. Apparently though, as I found out in another question of mine, foreign key support wasn't introduced in sqlite3 until version 3.6.19. However, Python 2.7 comes with sqlite3 3.6.21, so this works -- I decided I wanted to use foreign keys in ...
9
1
0.066568
0
false
3,333,348
0
4,281
2
0
0
3,333,095
I decided I'd just give this a shot when I realized that every library I've ever installed in python 2.6 resided in my site-packages folder. I just... copied site-packages to my 2.7 installation, and it works so far. This is by far the easiest route for me if this works -- I'll look further into it but at least I can c...
1
0
1
How can I upgrade the sqlite3 package in Python 2.6?
3
python,build,linker,sqlite
0
2010-07-26T07:54:00.000
I have some MySQL database server information that needs to be shared between a Python backend and a PHP frontend. What is the best way to go about storing the information in a manner wherein it can be read easily by Python and PHP? I can always brute force it with a bunch of str.replace() calls in Python and hope it w...
0
4
1.2
0
true
3,349,485
0
200
1
0
0
3,349,445
Store the shared configuration in a plain text file, preferably in a standard format. You might consider yaml, ini, or json. I'm pretty sure both PHP and python can very trivially read and parse all three of those formats.
1
0
0
Python - PHP Shared MySQL server connection info?
1
php,python,mysql,variables,share
0
2010-07-28T02:23:00.000
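The shared-JSON-file approach from the accepted answer, sketched on the Python side. The config keys and the temp-file location are hypothetical; in practice both the Python backend and the PHP frontend would read one fixed path:

```python
import json
import tempfile

# Hypothetical shared settings; PHP reads the same file with json_decode().
config = {"host": "localhost", "port": 3306, "user": "appuser", "db": "appdb"}

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(config, f)
    path = f.name

# Python side: load the shared file.
with open(path) as f:
    loaded = json.load(f)

print(loaded["host"])  # localhost
# PHP side, for comparison:
#   $cfg = json_decode(file_get_contents($path), true);
```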
I store groups of entities in the google app engine Data Store with the same ancestor/parent/entityGroup. This is so that the entities can be updated in one atomic datastore transaction. The problem is as follows: I start a db transaction I update entityX by setting entityX.flag = True I save entityX I query for entit...
2
0
0
0
false
3,350,082
1
215
1
1
0
3,350,068
Looks like you are not doing a commit on the transaction before querying start a db transaction update entityX by setting entityX.flag = True save entityX COMMIT TRANSACTION query for entity where flag == True. BUT, here is the problem. This query does NOT return any results. It should have returned entityX, but it d...
1
0
0
On the google app engine, why do updates not reflect in a transaction?
2
python,google-app-engine
0
2010-07-28T05:06:00.000
We need to bulk load many long strings (>4000 Bytes, but <10,000 Bytes) using cx_Oracle. The data type in the table is CLOB. We will need to load >100 million of these strings. Doing this one by one would suck. Doing it in a bulk fashion, ie using cursor.arrayvar() would be ideal. However, CLOB does not support arrays....
0
0
0
0
false
3,373,296
0
1,252
1
0
0
3,358,666
In the interest of getting shit done that is good enough, we went with the abuse of the CLOB I mentioned in my comment. It took less than 30 minutes to get coded up, runs fast, and works.
1
0
0
Passing an array of long strings ( >4000 bytes) to an Oracle (11gR2) stored procedure using cx_Oracle
2
python,oracle,cx-oracle
0
2010-07-29T00:41:00.000
I am connecting to an MS SQL Server db from Python in Linux. I am connecting via pyodbc using the FreeTDS driver. When I return a money field from MSSQL it comes through as a float, rather than a Python Decimal. The problem is with FreeTDS. If I run the exact same Python code from Windows (where I do not need to use...
0
1
0.099668
0
false
3,372,035
0
1,284
1
0
0
3,371,795
You could always just convert it to Decimal when it comes back...
1
0
0
FreeTDS translating MS SQL money type to python float, not Decimal
2
python,sql-server,pyodbc,freetds
0
2010-07-30T13:19:00.000
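The conversion suggested in the answer can be done safely by going through str() first, which avoids baking the float's binary representation error into the Decimal. A small helper (the 4-decimal-place quantize matches MSSQL's money type; the helper name is made up):

```python
from decimal import Decimal

def money_from_float(value, places=4):
    """Convert a float coming back from FreeTDS into an exact Decimal.

    str() gives the float's shortest round-tripping form, so the Decimal
    gets '19.99' rather than the full binary expansion.
    """
    return Decimal(str(value)).quantize(Decimal(1).scaleb(-places))

print(money_from_float(19.99))  # 19.9900
print(Decimal(19.99))           # constructing from the float directly shows the binary error
```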
I set up Mysql5, mysql5-server and py26-mysql using Macports. I then started the mysql server and was able to start the prompt with mysql5 In my settings.py i changed database_engine to "mysql" and put "dev.db" in database_name. I left the username and password blank as the database doesnt exist yet. When I ran python ...
0
1
0.099668
0
false
3,377,350
1
5,039
1
0
0
3,376,673
syncdb will not create a database for you -- it only creates tables that don't already exist in your schema. You need to: Create a user to 'own' the database (root is a bad choice). Create the database with that user. Update the Django database settings with the correct database name, user, and password.
1
0
0
Django MySql setup
2
python,mysql,django
0
2010-07-31T03:34:00.000
I had my sqlalchemy related code in my main() method in my script. But then when I created a function, I wasn't able to reference my 'products' mapper because it was in the main() method. Should I be putting the sqlalchemy related code (session, mapper, and classes) in global scope so all functions in my single file sc...
1
2
1.2
0
true
3,382,810
0
87
1
0
0
3,382,739
The typical approach is to define all mappings in a separate model module, with one file per class/table. Then you just import the needed classes wherever you need them.
1
0
0
Where to put my sqlalchemy code in my script?
1
python,sqlalchemy
0
2010-08-01T16:20:00.000
Lets say I have a database table which consists of three columns: id, field1 and field2. This table may have anywhere between 100 and 100,000 rows in it. I have a python script that should insert 10-1,000 new rows into this table. However, if the new field1 already exists in the table, it should do an UPDATE, not an...
2
0
0
0
false
3,536,835
0
2,505
2
0
0
3,404,556
You appear to be comparing apples with oranges. A python list is only useful if your data fit into the address-space of the process. Once the data get big, this won't work any more. Moreover, a python list is not indexed - for that you should use a dictionary. Finally, a python list is non-persistent - it is forgotten ...
1
0
0
Python performance: search large list vs sqlite
4
python,performance,sqlite
0
2010-08-04T10:18:00.000
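As the answer above suggests, the insert-vs-update decision can be pushed into the database instead of searched for in a Python list. A sketch using SQLite's INSERT OR REPLACE over a UNIQUE constraint (MySQL has the analogous INSERT ... ON DUPLICATE KEY UPDATE); the table and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, field1 TEXT UNIQUE, field2 TEXT)")

rows = [("a", "1"), ("b", "2"), ("a", "3")]  # "a" appears twice: the second wins

# With a UNIQUE constraint on field1, the database resolves insert-vs-update
# itself; no Python-side membership test over a list (or dict) is needed.
conn.executemany("INSERT OR REPLACE INTO t (field1, field2) VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])                   # 2
print(conn.execute("SELECT field2 FROM t WHERE field1 = 'a'").fetchone()[0])  # 3
```

Note that OR REPLACE deletes and re-inserts the conflicting row, so its autoincrement id changes; if that matters, use an explicit UPDATE path instead.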
Lets say I have a database table which consists of three columns: id, field1 and field2. This table may have anywhere between 100 and 100,000 rows in it. I have a python script that should insert 10-1,000 new rows into this table. However, if the new field1 already exists in the table, it should do an UPDATE, not an...
2
0
0
0
false
3,404,589
0
2,505
2
0
0
3,404,556
I imagine using a python dictionary would allow for much faster searching than using a python list. (Just set the values to 0, you won't need them, and hopefully a '0' stores compactly.) As for the larger question, I'm curious too. :)
1
0
0
Python performance: search large list vs sqlite
4
python,performance,sqlite
0
2010-08-04T10:18:00.000
I am reading a CSV file into a list of lists in Python. It is around 100MB right now. In a couple of years that file will grow to 2-5GB. I am doing lots of log calculations on the data. The 100MB file takes the script around 1 minute to process. After the script does a lot of fiddling with the data, it creates URLs th...
4
4
1.2
0
true
3,419,835
0
1,357
5
0
0
3,419,624
I don't know exactly what you are doing, but a database will just change how the data is stored - and in fact it might take longer, since most reasonable databases may have constraints put on columns and additional processing for the checks. In many cases, having the whole file local, going through it, and doing calculatio...
1
0
1
python or database?
5
python,sql
0
2010-08-05T22:13:00.000
I am reading a CSV file into a list of lists in Python. It is around 100MB right now. In a couple of years that file will grow to 2-5GB. I am doing lots of log calculations on the data. The 100MB file takes the script around 1 minute to process. After the script does a lot of fiddling with the data, it creates URLs th...
4
2
0.07983
0
false
3,419,871
0
1,357
5
0
0
3,419,624
I always reach for a database for larger datasets. A database gives me some stuff for "free"; that is, I don't have to code it: searching, sorting, indexing, and language-independent connections. Something like SQLite might be the answer for you. Also, you should investigate the "nosql" databases; it sounds like your prob...
1
0
1
python or database?
5
python,sql
0
2010-08-05T22:13:00.000
I am reading a CSV file into a list of lists in Python. It is around 100MB right now. In a couple of years that file will grow to 2-5GB. I am doing lots of log calculations on the data. The 100MB file takes the script around 1 minute to process. After the script does a lot of fiddling with the data, it creates URLs th...
4
4
0.158649
0
false
3,419,726
0
1,357
5
0
0
3,419,624
If you need to go through all lines each time you perform the "fiddling", it wouldn't really make much difference, assuming the actual "fiddling" is what's eating your cycles. Perhaps you could store the results of your calculations somehow; then a database would probably be nice. Also, databases have methods for ensurin...
1
0
1
python or database?
5
python,sql
0
2010-08-05T22:13:00.000
I am reading a CSV file into a list of lists in Python. It is around 100MB right now. In a couple of years that file will grow to 2-5GB. I am doing lots of log calculations on the data. The 100MB file takes the script around 1 minute to process. After the script does a lot of fiddling with the data, it creates URLs th...
4
4
0.158649
0
false
3,419,718
0
1,357
5
0
0
3,419,624
I'd only put it into a relational database if: The data is actually relational and expressing it that way helps shrink the size of the data set by normalizing it. You can take advantage of triggers and stored procedures to offload some of the calculations that your Python code is performing now. You can take advantage...
1
0
1
python or database?
5
python,sql
0
2010-08-05T22:13:00.000
I am reading a CSV file into a list of lists in Python. It is around 100MB right now. In a couple of years that file will grow to 2-5GB. I am doing lots of log calculations on the data. The 100MB file takes the script around 1 minute to process. After the script does a lot of fiddling with the data, it creates URLs th...
4
1
0.039979
0
false
3,419,687
0
1,357
5
0
0
3,419,624
At 2 gigs you may start running up against speed issues. I work with model simulations that call hundreds of CSV files, and it takes about an hour to go through 3 iterations, or about 20 minutes per loop. This is a matter of personal preference, but I would go with something like PostgreSQL because it integr...
1
0
1
python or database?
5
python,sql
0
2010-08-05T22:13:00.000
I have an sqlite database whose data I need to transfer over the network, the server needs to modify the data, and then I need to get the db back and either update my local version or overwrite it with the new db. How should I do this? My coworker at first wanted to scrap the db and just use an .ini file, but this is g...
0
3
1.2
0
true
3,451,733
0
471
1
0
0
3,451,708
Use the copy command in your OS. No reason to overthink this.
1
0
0
Sending sqlite db over network
1
python,sqlite,embedded,binary-data
0
2010-08-10T17:29:00.000
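The accepted answer's point, that an SQLite database is a single ordinary file, so copying it is the whole transfer, can be shown end to end. Paths and table contents below are made up for the demo:

```python
import os
import shutil
import sqlite3
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "local.db")
dst = os.path.join(workdir, "from_server.db")

# Build a small database to stand in for the real file.
conn = sqlite3.connect(src)
conn.execute("CREATE TABLE settings (key TEXT, value TEXT)")
conn.execute("INSERT INTO settings VALUES ('version', '1')")
conn.commit()
conn.close()  # close before copying so the file is fully flushed to disk

# The "network transfer": for SQLite, a plain file copy is all it takes.
shutil.copy(src, dst)

conn = sqlite3.connect(dst)
print(conn.execute("SELECT value FROM settings WHERE key = 'version'").fetchone()[0])  # 1
```

Over a real network the copy step becomes scp, HTTP upload, etc., but the principle is the same: treat the .db file as an opaque binary blob and never copy it while a writer has it open mid-transaction.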
I'm trying to copy an excel sheet with python, but I keep getting "access denied" error message. The file is closed and is not shared. It has macros though. Is their anyway I can copy the file forcefully with python? thanks.
0
0
1.2
0
true
3,466,751
0
94
1
0
0
3,465,231
If you do not have sufficient file permissions you will not be able to access the file. In that case you will have to execute your Python program as an user with sufficient permissions. If on the other hand the file is locked using other means specific to Excel then I am not sure what exactly is the solution. You might...
1
0
0
Copying a file with access locks, forcefully with python
1
python,excel-2003
0
2010-08-12T06:32:00.000
is there a python ORM (object relational mapper) that has a tool for automatically creating python classes (as code so I can expand them) from a given database schema? I'm frequently faced with small tasks involving different databases (like importing/exporting from various sources etc.) and I thought python together w...
2
3
0.197375
0
false
3,481,115
0
1,411
1
0
0
3,478,780
You do not need to produce a source code representation of your classes to be able to expand them. The only trick is that you need the ORM to generate the classes BEFORE importing the module that defines the derived classes. Even better, don't use derivation, but use __getattr__ and __setattr__ to implement transparent...
1
0
0
Python ORM that automatically creates classes from DB schema
3
python,orm,code-generation
0
2010-08-13T16:18:00.000
I have a couple of sqlite dbs (i'd say about 15GBs), with about 1m rows in total - so not super big. I was looking at mongodb, and it looks pretty easy to work with, especially if I want to try and do some basic natural language processing on the documents which make up the databases. I've never worked with Mongo in t...
12
10
1
0
false
3,491,117
0
3,302
3
0
0
3,487,456
As others have said, MongoDB does not have single-server durability right now. Fortunately, it's dead easy to set up multi-node replication. You can even set up a second machine in another data center and have data automatically replicated to it live! If a write must succeed, you can cause Mongo to not return from an i...
1
0
0
Mongodb - are reliability issues significant still?
5
python,sqlite,mongodb
0
2010-08-15T13:00:00.000
I have a couple of sqlite dbs (i'd say about 15GBs), with about 1m rows in total - so not super big. I was looking at mongodb, and it looks pretty easy to work with, especially if I want to try and do some basic natural language processing on the documents which make up the databases. I've never worked with Mongo in t...
12
3
0.119427
0
false
3,488,244
0
3,302
3
0
0
3,487,456
Mongo does not have ACID properties, specifically durability. So you can face issues if the process does not shut down cleanly or the machine loses power. You are supposed to implement backups and redundancy to handle that.
1
0
0
Mongodb - are reliability issues significant still?
5
python,sqlite,mongodb
0
2010-08-15T13:00:00.000
I have a couple of sqlite dbs (i'd say about 15GBs), with about 1m rows in total - so not super big. I was looking at mongodb, and it looks pretty easy to work with, especially if I want to try and do some basic natural language processing on the documents which make up the databases. I've never worked with Mongo in t...
12
2
0.07983
0
false
3,490,547
0
3,302
3
0
0
3,487,456
I don't see the problem if you have the same data also in the sqlite backups. You can always refill your MongoDb databases. Refilling will only take a few minutes.
1
0
0
Mongodb - are reliability issues significant still?
5
python,sqlite,mongodb
0
2010-08-15T13:00:00.000
I want to encrypt a string using RSA algorithm and then store that string into postgres database using SQLAlchemy in python. Then Retrieve the encrypted string and decrypt it using the same key. My problem is that the value gets stored in the database is not same as the actual encrypted string. The datatype of column w...
1
1
0.099668
0
false
3,507,558
0
3,897
1
0
0
3,507,543
By "same key" you mean "the other key", right? RSA gives you a keypair; if you encrypt with one, you decrypt with the other. Other than that, it sounds like an encoding problem. Try storing the data as binary, or encode the string with your database's collation. Basically, encryption gives you bytes but you store them as...
1
0
0
Inserting Encrypted Data in Postgres via SQLALchemy
2
python,postgresql,sqlalchemy,rsa,pycrypto
0
2010-08-17T22:39:00.000
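One common way to follow the answer's advice without changing the column type is to encode the ciphertext bytes to ASCII before storing them in a String column, and decode after fetching. A round-trip sketch with base64 (the ciphertext here is fake bytes standing in for real RSA output):

```python
import base64

# Stand-in for RSA ciphertext: arbitrary bytes, including NULs and high bytes
# that a text column or its collation may mangle.
ciphertext = bytes(range(256))

# Encode to ASCII before handing the value to SQLAlchemy's String column ...
stored = base64.b64encode(ciphertext).decode("ascii")

# ... and decode after fetching, before passing the bytes to the decryptor.
fetched = base64.b64decode(stored)

print(fetched == ciphertext)  # True
```

The cleaner fix is to use a binary column type (LargeBinary in SQLAlchemy, bytea in Postgres) so no encoding step is needed at all.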
I'd like to query the database and get read-only objects with session object. I need to save the objects in my server and use them through the user session. If I use a object outside of the function that calls the database, I get this error: "DetachedInstanceError: Parent instance is not bound to a Session; lazy load o...
0
0
0
0
false
3,513,490
1
244
1
0
0
3,513,433
You must load the parent object again.
1
0
0
How to get read-only objects from database?
1
python,sqlalchemy
0
2010-08-18T14:57:00.000
I'm in the settings.py module, and I'm supposed to add the directory to the sqlite database. How do I know where the database is and what the full directory is? I'm using Windows 7.
5
1
0.099668
0
false
3,524,305
1
3,575
1
0
0
3,524,236
If you don't provide a full path, it will use the current directory of settings.py. If you wish to specify a static path you can specify it like c:/projects/project1/my_proj.db; in case you want to make it dynamic you can use the os.path module, so os.path.dirname(__file__) will give you the path of settings.py and acco...
1
0
0
Trouble setting up sqlite3 with django! :/
2
python,database,django,sqlite
0
2010-08-19T17:02:00.000
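The os.path trick from the answer above, wrapped in a small helper so it can be tested outside settings.py. The paths and the my_proj.db name are hypothetical:

```python
import os

def database_path(settings_file, db_name="my_proj.db"):
    """Return an absolute DB path anchored at the directory of settings.py,
    independent of the current working directory."""
    return os.path.join(os.path.dirname(os.path.abspath(settings_file)), db_name)

# In settings.py you would pass __file__:
#   DATABASE_NAME = database_path(__file__)
print(database_path("/projects/project1/settings.py"))
# /projects/project1/my_proj.db
```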
I'm working with two databases, a local version and the version on the server. The server is the most up to date version and instead of recopying all values on all tables from the server to my local version, I would like to enter each table and only insert/update the values that have changed, from server, and copy th...
1
0
0
0
false
3,527,732
0
1,035
1
0
0
3,526,629
If all of your tables' records had timestamps, you could identify "the values that have changed in the server" -- otherwise, it's not clear how you plan to do that part (which has nothing to do with insert or update, it's a question of "selecting things right"). Once you have all the important values, somecursor.execut...
1
0
0
Python + MySQLDB Batch Insert/Update command for two of the same databases
1
python,mysql,batch-file
0
2010-08-19T21:59:00.000
Sometimes, when fetching data from the database either through the python shell or through a python script, the python process dies, and one single word is printed to the terminal: Killed That's literally all it says. It only happens with certain scripts, but it always happens for those scripts. It consistently happens...
7
6
1.2
0
true
3,529,637
1
1,944
1
1
0
3,526,748
The only thing I can think of that will automatically kill a process on Linux is the OOM killer. What's in the system logs?
1
0
0
Why do some Django ORM queries end abruptly with the message "Killed"?
2
python,django,postgresql
0
2010-08-19T22:19:00.000
I've been diving into MongoDB with kind help of MongoKit and MongoEngine, but then I started thinking whether the data mappers are necessary here. Both mappers I mentioned enable one to do simple things without any effort. But is any effort required to do simple CRUD? It appears to me that in case of NoSQL the mappers ...
2
1
1.2
0
true
3,553,262
0
366
1
1
0
3,533,064
We are running a production site using Mongodb for the backend (no direct queries to Mongo, we have a search layer in between). We wrote our own business / object layer, i suppose it just seemed natural enough for the programmers to write in the custom logic. We did separate the database and business layers, but they...
1
0
0
Do you use data mappers with MongoDB?
1
python,orm,mongodb,mongoengine,mongokit
0
2010-08-20T16:54:00.000
I have an SQL database and am wondering what command you use to just get a list of the table names within that database.
35
10
1.2
0
true
3,556,313
0
62,222
1
0
0
3,556,305
SHOW TABLES
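With MySQLdb this is just `cursor.execute("SHOW TABLES")` followed by `fetchall()`. Since a live MySQL server can't be assumed here, the same cursor/fetch pattern is sketched against sqlite3's catalog instead; the table names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER)")
conn.execute("CREATE TABLE orders (id INTEGER)")

cur = conn.cursor()
# With a MySQLdb connection this line would be: cur.execute("SHOW TABLES")
cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
tables = [row[0] for row in cur.fetchall()]
print(sorted(tables))  # ['orders', 'users']
```

Either way, each row of the result contains one table name in its first column.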
1
0
0
How to retrieve table names in a mysql database with Python and MySQLdb?
4
python,mysql,mysql-python
0
2010-08-24T12:18:00.000
While I see a bunch of links/binaries for a MySQL connector for Python 2.6, I don't see one for 2.7. To use Django, should I just revert to 2.6, or is there a way out? I'm using Windows 7 64-bit, Django 1.1, and MySQL 5.1.50. Any pointers would be great.
1
1
0.066568
0
false
58,359,370
1
2,224
1
0
0
3,562,406
For Python 2.7 on specific programs: sudo chown -R $USER /Library/Python/2.7 brew install mysql@5.7 brew install mysql-connector-c brew link --overwrite mysql@5.7 echo 'export PATH="/usr/local/opt/mysql@5.7/bin:$PATH"' >> ~/.bash_profile sed -i -e 's/libs="$libs -l "/libs="$libs -lmysqlclient -lssl -lcrypto"/g' /usr/l...
1
0
0
Is there no mysql connector for python 2.7 on windows
3
mysql,python-2.7
0
2010-08-25T02:15:00.000
I wanted to get the community's feedback on a language choice our team is looking to make in the near future. We are a software developer, and I work in a team of Oracle and SQL Server DBAs supporting a cross platform Java application which runs on Oracle Application Server. We have SQL Server and Oracle code bases, ...
6
1
0.033321
0
false
3,564,251
0
3,213
5
1
0
3,564,177
Although I prefer working on the JVM, one thing that turns me off is having to spin up a JVM to run a script. If you can work in a REPL this is not such a big deal, but it really slows you down when doing edit-run-debug scripting. Now of course Oracle has a lot of Java stuff where interaction might be needed, but that...
1
0
0
Which cross platform scripting language should we adopt for a group of DBAs?
6
python,scala,groovy,shell,jython
0
2010-08-25T08:47:00.000
I wanted to get the community's feedback on a language choice our team is looking to make in the near future. We are a software developer, and I work in a team of Oracle and SQL Server DBAs supporting a cross platform Java application which runs on Oracle Application Server. We have SQL Server and Oracle code bases, ...
6
0
0
0
false
3,564,285
0
3,213
5
1
0
3,564,177
I've been in a similar situation, though on a small scale. The previous situation was that any automation on the SQL Server DBs was done with VBScript, which I did start out using. As I wanted something cross-platform (and less annoying than VBScript) I went with Python. What I learnt is: Obviously you want a languag...
1
0
0
Which cross platform scripting language should we adopt for a group of DBAs?
6
python,scala,groovy,shell,jython
0
2010-08-25T08:47:00.000
I wanted to get the community's feedback on a language choice our team is looking to make in the near future. We are a software developer, and I work in a team of Oracle and SQL Server DBAs supporting a cross platform Java application which runs on Oracle Application Server. We have SQL Server and Oracle code bases, ...
6
4
0.132549
0
false
3,565,446
0
3,213
5
1
0
3,564,177
The XML thing almost calls for Scala. Now, I love Scala, but I suggest Python here.
1
0
0
Which cross platform scripting language should we adopt for a group of DBAs?
6
python,scala,groovy,shell,jython
0
2010-08-25T08:47:00.000
I wanted to get the community's feedback on a language choice our team is looking to make in the near future. We are a software developer, and I work in a team of Oracle and SQL Server DBAs supporting a cross platform Java application which runs on Oracle Application Server. We have SQL Server and Oracle code bases, ...
6
5
0.16514
0
false
3,568,609
0
3,213
5
1
0
3,564,177
I think your best three options are Groovy, Python, and Scala. All three let you write code at a high level (compared to C/Java). Python has its own perfectly adequate DB bindings, and Groovy and Scala can use ones made for Java. The advantages of Python are that it is widely used already, so there are tons of tools,...
1
0
0
Which cross platform scripting language should we adopt for a group of DBAs?
6
python,scala,groovy,shell,jython
0
2010-08-25T08:47:00.000
I wanted to get the community's feedback on a language choice our team is looking to make in the near future. We are a software developer, and I work in a team of Oracle and SQL Server DBAs supporting a cross platform Java application which runs on Oracle Application Server. We have SQL Server and Oracle code bases, ...
6
6
1
0
false
3,564,413
0
3,213
5
1
0
3,564,177
You can opt for Python. It is dynamic (interpreted), is available on Windows/Linux/Solaris, and has easy-to-read syntax, which makes code maintenance easy. There are modules/libraries for Oracle interaction and various other database servers as well, and there is also library support for XML. All 7 points are covered.
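For instance, the XML requirement is covered by the standard library alone (`xml.etree.ElementTree`), with nothing extra to install. A tiny sketch; the element and attribute names below are made up for illustration:

```python
import xml.etree.ElementTree as ET

# Parse a small config fragment of the kind a DBA script might consume.
doc = ET.fromstring(
    "<databases>"
    "<db name='prod' vendor='oracle'/>"
    "<db name='reporting' vendor='sqlserver'/>"
    "</databases>"
)
names = [db.get("name") for db in doc.findall("db")]
print(names)  # ['prod', 'reporting']
```

The same script runs unchanged on Windows, Linux, and Solaris, which is the cross-platform point being made.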
1
0
0
Which cross platform scripting language should we adopt for a group of DBAs?
6
python,scala,groovy,shell,jython
0
2010-08-25T08:47:00.000
I have been developing under Python/Snow Leopard happily for the past 6 months. I just upgraded Python to 2.6.5 and a whole bunch of libraries, including psycopg2 and TurboGears. I can start up tg-admin and run some queries with no problems. Similarly, I can run my web site from the command line with no problems. Howev...
0
0
1.2
0
true
3,571,749
0
296
1
0
0
3,571,495
Problem solved (to a point). I was running 64 bit python from Aptana Studio and 32 bit python on the command line. By forcing Aptana to use 32 bit python, the libraries work again and all is happy.
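A quick way to check which interpreter a tool is actually launching (the 32-bit vs 64-bit mismatch described above) is to print the pointer size from inside each environment:

```python
import struct

# 8-byte pointers mean a 64-bit interpreter, 4-byte pointers a 32-bit one.
bits = struct.calcsize("P") * 8
print(bits)  # 32 or 64
```

Running this from both the command line and the IDE's configured interpreter makes the mismatch obvious.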
1
0
0
Psycopg2 under osx works on commandline but fails in Aptana studio
1
python,turbogears,psycopg
1
2010-08-26T01:41:00.000
I'm using a Linux machine to make a little Python program that needs to insert its results into a SQL Server 2000 DB. I'm new to Python, so I'm struggling quite a bit to find the best solution for connecting to the DB using Python 3, since most of the libs I looked at only work in Python 2. As an added bonus question, the f...
1
0
0
0
false
4,062,244
0
1,786
2
0
0
3,571,819
If you want a portable MS SQL Server library, you can try the module from www.pytds.com. It works with 2.5+ and 3.1 and has good stored procedure support. Its API is more "functional", and it has some good features you won't find anywhere else.
1
0
0
How to access a MS SQL Server using Python 3?
3
python,sql-server,python-3.x,py2exe
0
2010-08-26T03:18:00.000
I'm using a Linux machine to make a little Python program that needs to insert its results into a SQL Server 2000 DB. I'm new to Python, so I'm struggling quite a bit to find the best solution for connecting to the DB using Python 3, since most of the libs I looked at only work in Python 2. As an added bonus question, the f...
1
0
0
0
false
3,573,005
0
1,786
2
0
0
3,571,819
I can't answer your question directly, but given that many popular Python packages and frameworks are not yet fully supported on Python 3, you might consider just using Python 2.x. Unless there are features you absolutely cannot live without in Python 3, of course. And it isn't clear from your post if you plan to deplo...
1
0
0
How to access a MS SQL Server using Python 3?
3
python,sql-server,python-3.x,py2exe
0
2010-08-26T03:18:00.000
Since mongo doesn't have a schema, does that mean that we won't have to do migrations when we change the models? What does the migration process look like with a non-relational db?
18
1
0.066568
0
false
3,605,615
1
5,660
2
0
0
3,604,565
What does the migration process look like with a non-relational db? Depends on if you need to update all the existing data or not. In many cases, you may not need to touch the old data, such as when adding a new optional field. If that field also has a default value, you may also not need to update the old documents,...
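The "default at read time" approach the answer describes can be sketched without a live MongoDB instance; the field names below are hypothetical, and the dicts stand in for Mongo documents:

```python
# Old documents written before the schema change lack "status";
# applying a default on read means no migration pass over old data.
old_doc = {"_id": 1, "title": "hello"}
new_doc = {"_id": 2, "title": "world", "status": "draft"}

def status_of(doc):
    # Equivalent to tolerating a missing field in a Mongo document.
    return doc.get("status", "published")

print(status_of(old_doc), status_of(new_doc))  # published draft
```

Only if the new field must be queryable or indexed on every document does a one-off backfill script become necessary.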
1
0
0
Does django with mongodb make migrations a thing of the past?
3
python,django,mongodb
0
2010-08-30T22:01:00.000
Since mongo doesn't have a schema, does that mean that we won't have to do migrations when we change the models? What does the migration process look like with a non-relational db?
18
2
0.132549
0
false
3,604,687
1
5,660
2
0
0
3,604,565
There is no silver bullet. Adding or removing fields is easier with a non-relational DB (just stop using unneeded fields, or start using new ones), renaming a field is easier with a traditional DB (you'll usually have to change a lot of data to rename a field in a schemaless DB), and data migration is on par, depending on the task.
1
0
0
Does django with mongodb make migrations a thing of the past?
3
python,django,mongodb
0
2010-08-30T22:01:00.000