Dataset schema (column: dtype, observed range or string lengths):

- Question: string, lengths 25 to 7.47k
- Q_Score: int64, 0 to 1.24k
- Users Score: int64, -10 to 494
- Score: float64, -1 to 1.2
- Data Science and Machine Learning: int64, 0 to 1
- is_accepted: bool, 2 classes
- A_Id: int64, 39.3k to 72.5M
- Web Development: int64, 0 to 1
- ViewCount: int64, 15 to 1.37M
- Available Count: int64, 1 to 9
- System Administration and DevOps: int64, 0 to 1
- Networking and APIs: int64, 0 to 1
- Q_Id: int64, 39.1k to 48M
- Answer: string, lengths 16 to 5.07k
- Database and SQL: int64, 1 to 1
- GUI and Desktop Applications: int64, 0 to 1
- Python Basics and Environment: int64, 0 to 1
- Title: string, lengths 15 to 148
- AnswerCount: int64, 1 to 32
- Tags: string, lengths 6 to 90
- Other: int64, 0 to 1
- CreationDate: string, lengths 23 to 23
I had a PostgreSQL query where I need to take a column defined as character from a table and then pass this value to a function which only accepts integers. In this case, how can I solve the problem? Can anyone help?
0
0
0
0
false
5,148,795
0
126
1
0
0
5,148,790
ord(val) will give you the integer value of a character. int(val) will cast a value into an integer.
1
0
1
how to convert value of column defined as character into integer in python
2
python,casting
0
2011-02-28T23:23:00.000
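The answer above can be sketched in a short, self-contained example; which call applies depends on what the character column actually holds:

```python
# int() casts a numeric string to an integer; ord() gives the integer
# code point of a single character.
val = "7"
print(int(val))   # 7
print(ord("a"))   # 97
```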
As part of artifacts delivery, our developers give the data and structure scripts in .sql files. I usually "double click" on these files to open in "Microsoft SQL Server Management Studio". Management studio will prompt me for entering database server and user/pwd. I enter them manually and click on Execute button to e...
1
4
0.379949
0
false
5,174,307
0
3,938
1
0
0
5,174,269
You could just run them using sqlcmd. Sqlcmd is a command line utility that will let you run .sql scripts from the command line, which I'm sure you can kick off through python.
1
0
0
Execute .sql files that are used to run in SQL Management Studio in python
2
python,sql,sql-server
0
2011-03-02T22:22:00.000
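A minimal sketch of kicking off sqlcmd from Python, as the answer suggests. The server, database, credentials, and script path below are placeholders, and the command is only built here, not executed:

```python
import subprocess  # used when the command is actually run

# All names below (server, db, user, password, script path) are placeholders.
cmd = [
    "sqlcmd",
    "-S", "myserver",      # server
    "-d", "mydb",          # database
    "-U", "user", "-P", "password",
    "-i", "artifact.sql",  # the delivered .sql file
]
# To actually run it: subprocess.run(cmd, check=True)
print(" ".join(cmd))
```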
I'm using SQLAlchemy 0.6.6 against a Postgres 8.3 DB on Windows 7 an PY 2.6. I am leaving the defaults for configuring pooling when I create my engine, which is pool_size=5, max_overflow=10. For some reason, the connections keep piling up and I intermittently get "Too many clients" from PG. I am positive that connect...
1
0
1.2
0
true
5,195,465
0
1,198
1
0
0
5,185,438
It turns out a ScopedSession was being used outside normal application usage, and the close() wasn't in a finally block.
1
0
0
SQLAlchemy Connection Pooling Problems - Postgres on Windows
1
python,postgresql,sqlalchemy,connection-pooling,cherrypy
0
2011-03-03T19:21:00.000
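The fix described (always close the session in a finally block) can be sketched generically; FakeSession below is a stand-in for a real SQLAlchemy session or ScopedSession:

```python
# FakeSession stands in for a real SQLAlchemy session/ScopedSession.
class FakeSession:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

session = FakeSession()
try:
    pass  # ... do database work; may raise ...
finally:
    session.close()  # runs even if the work above raised

print(session.closed)  # True
```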
I have about 30 MB of textual data that is core to the algorithms I use in my web application. On the one hand, the data is part of the algorithm and changes to the data can cause an entire algorithm to fail. This is why I keep the data in text files in my source control, and all changes are auto-tested (pre-commit). I...
4
1
0.099668
0
false
5,210,383
1
127
1
0
0
5,210,318
Okay, here's an idea: Ship all the data as is done now. Have the installation script install it in the appropriate databases. Let users modify this database and give them a button "restore to original" that simply reinstalls from the text file. Alternatively, this route may be easier, esp. when upgrading an installat...
1
0
0
What's the best way to handle source-like data files in a web application?
2
python
0
2011-03-06T11:56:00.000
I'm not sure if this is an issue specific to sqlite databases but after adding some properties I executed syncdb successfully but still the the columns were not added to the database and when I try the access the model in admin I get no such column error. Why is this happening and how do I overcome this issue? Details:...
4
3
0.197375
0
false
5,211,417
1
5,639
1
0
0
5,211,340
As always, syncdb does not migrate the existing schema.
1
0
0
Django manage.py syncdb doing nothing when used with sqlite3
3
python,django,sqlite
0
2011-03-06T15:26:00.000
I am trying to modify the guestbook example webapp to reduce the amount of database writes. What I am trying to achieve is to load all the guestbook entries into memcache which I have done. However I want to be able to directly update the memcache with new guestbook entries and then write all changes to the database a...
2
6
1.2
0
true
5,222,081
1
1,124
1
1
0
5,221,977
This is a recipe for lost data. I have a hard time believing that a guest book is causing enough write activity to be an issue. Also, the bookkeeping involved in this would be tricky, since memcache isn't searchable.
1
0
0
Limit amount of writes to database using memcache
3
python,google-app-engine,caching,memcached,google-cloud-datastore
0
2011-03-07T16:12:00.000
I'm using sqlalchemy with reflection, a couple of partial indices in my DB make it dump warnings like this: SAWarning: Predicate of partial index i_some_index ignored during reflection into my logs and keep cluttering. It does not hinder my application behavior. I would like to keep these warnings while developing, but...
35
12
1
0
false
5,331,129
0
14,820
1
0
0
5,225,780
The warning means you did a table or metadata reflection, and it's reading in PostgreSQL indexes that have some complex condition which the SQLAlchemy reflection code doesn't know what to do with. This is a harmless warning, as whether or not indexes are reflected doesn't affect the operation of the application, unle...
1
0
0
Turn off a warning in sqlalchemy
2
python,postgresql,sqlalchemy
0
2011-03-07T22:00:00.000
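Suppressing a specific warning category like this can be sketched with the standard warnings module. The SAWarning class below is a stand-in; with SQLAlchemy installed you would import the real sqlalchemy.exc.SAWarning instead of defining one:

```python
import warnings

# Stand-in for sqlalchemy.exc.SAWarning; import the real class when available.
class SAWarning(Warning):
    pass

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", SAWarning)  # suppress just this category
    warnings.warn("Predicate of partial index ignored", SAWarning)

print(len(caught))  # 0: the warning was filtered out
```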
I'm working on a client machine running SUSE Linux and Python 2.4.2. I'm not allowed to download anything from the net, including any external libraries. So, is there any way I can connect to a database (Oracle) using only the default libraries?
0
1
1.2
0
true
5,228,737
0
360
1
0
0
5,228,728
No. There is nothing in the standard library for connecting to database servers.
1
0
0
Python: Connecting to db without any external libraries
2
python,database,oracle
0
2011-03-08T05:32:00.000
So I have a Django web application and I need to add a payment module to it. Basically a user will prepay for a certain amount of service and this will slowly reduce over as the user uses the service. I'm wondering what is the best practice to facilitate this? I can process payments using Satchmo, but then just storing...
3
1
0.099668
0
false
5,236,907
1
453
2
0
0
5,236,855
My language agnostic recommendation would be to make sure that the database that communicates with the web app is read only; at least for the table(s) that deal with these account balances. So, you process payments, and manage the reduction of account balances in a database that is not accessible to anyone other than y...
1
0
0
Securely storing account balances in a database?
2
python,database,django,web-applications
0
2011-03-08T18:45:00.000
So I have a Django web application and I need to add a payment module to it. Basically a user will prepay for a certain amount of service and this will slowly reduce over as the user uses the service. I'm wondering what is the best practice to facilitate this? I can process payments using Satchmo, but then just storing...
3
6
1.2
0
true
5,236,901
1
453
2
0
0
5,236,855
I don't know about a "well-tested solution" as you put it, but I would strongly caution against just storing a dollar value in the database and increasing or decreasing that dollar value. Instead, I would advise storing transactions that can be audited if anything goes wrong. Calculate the amount available from the c...
1
0
0
Securely storing account balances in a database?
2
python,database,django,web-applications
0
2011-03-08T18:45:00.000
Is there a way in cx_Oracle to capture the stdout output from an oracle stored procedure? These show up when using Oracle's SQL Developer or SQL Plus, but there does not seem to be a way to fetch it using the database drivers.
3
4
1.2
0
true
5,247,755
0
2,934
1
0
0
5,244,517
You can retrieve dbms_output with DBMS_OUTPUT.GET_LINE(buffer, status). Status is 0 on success and 1 when there's no more data. You can also use get_lines(lines, numlines). numlines is input-output. You set it to the max number of lines and it is set to the actual number on output. You can call this in a loop and exit ...
1
0
0
Capturing stdout output from stored procedures with cx_Oracle
4
python,oracle10g,cx-oracle
0
2011-03-09T10:36:00.000
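The loop-until-status pattern the answer describes can be simulated without a database. In real code, cursor.callproc("dbms_output.get_line", [text_var, status_var]) with cx_Oracle variables would replace the hypothetical fake_get_line() below:

```python
# Simulated sketch: status 0 means a line was returned, 1 means no more data.
buffered = ["line one", "line two"]

def fake_get_line():
    if buffered:
        return buffered.pop(0), 0
    return None, 1

lines = []
while True:
    text, status = fake_get_line()
    if status != 0:
        break
    lines.append(text)

print(lines)  # ['line one', 'line two']
```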
I'm trying to serialize an array in Python to insert it into a MySQL database. I tried the pickle.dumps() method but it returns bytes. What can I use? Thanks! (I'm working in Python 3.)
3
1
0.049958
0
false
5,259,400
0
1,312
1
0
0
5,259,329
Pickle is a binary serialization; that's why you get a byte string. Pros: more compact, can express most of Python's objects. Cons: bytes can be harder to handle, and it's Python-only. JSON is more universal, so you're not tied to reading the data with Python. It's also mostly ASCII, so it's easier to handle. The con is that i...
1
0
1
Serialize an array in Python
4
python,mysql
0
2011-03-10T12:01:00.000
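The pickle-vs-JSON trade-off above can be shown in a few lines:

```python
import json
import pickle

data = [1, 2, "three"]

b = pickle.dumps(data)  # bytes: compact, Python-only
s = json.dumps(data)    # str: portable across languages, easy to store as TEXT

print(type(b) is bytes, type(s) is str)  # True True
print(json.loads(s) == data)             # True: round-trips cleanly
```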
I need to use a Python ORM with an MS Access database (on Windows). My first searches are not really successful: SQLAlchemy: no MS Access support in the last two versions. DAL from Web2Py: no Access (??). Storm: no MS Access. sqlobject: no MS Access. dejavu: seems OK for MS Access, but is the project alive? Any ide...
1
1
1.2
0
true
5,262,564
1
1,062
1
0
0
5,262,387
Web2py recently updated their DAL making it much easier to add support for new db engines. I don't believe there is currently native Jet (MS Access) support, but the existing SQL Server support could probably be modified without much effort to provide MS Access support. The latest version of the web2py DAL is a singl...
1
0
0
Python ORM for MS-Access
2
python,ms-access,orm
0
2011-03-10T16:08:00.000
Is opening/closing db cursor costly operation? What is the best practice, to use a different cursor or to reuse the same cursor between different sql executions? Does it matter if a transaction consists of executions performed on same or different cursors belonging to same connection? Thanks.
2
1
1.2
0
true
5,275,401
0
659
1
0
0
5,275,236
This will depend a lot on your database as well as your chosen Python implementation. Have you tried profiling a few short test operations?
1
0
0
db cursor - transaction in python
1
python,database,transactions,cursor
0
2011-03-11T15:58:00.000
I have a very large dataset - millions of records - that I want to store in Python. I might be running on 32-bit machines so I want to keep the dataset down in the hundreds-of-MB range and not ballooning much larger than that. These records - represent a M:M relationship - two IDs (foo and bar) and some simple metadat...
1
2
0.099668
0
false
5,303,400
0
240
1
0
0
5,302,816
What you describe sounds like a sparse matrix, where the foos are along one axis and the bars along the other one. Each non-empty cell represents a relationship between one foo and one bar, and contains the "simple metadata" you describe. There are efficient sparse matrix packages for Python (scipy.sparse, PySparse) y...
1
0
1
Efficient large dicts of dicts to represent M:M relationships in Python
4
python,data-structures
0
2011-03-14T18:34:00.000
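The sparse idea the answer describes (only non-empty foo/bar cells are stored) can be sketched in pure Python as a dict of dicts, without pulling in scipy.sparse:

```python
# Store only the non-empty (foo, bar) cells, keyed by the two IDs.
relations = {}  # foo_id -> {bar_id: metadata}

def link(foo, bar, meta):
    relations.setdefault(foo, {})[bar] = meta

link(1, 10, "a")
link(1, 20, "b")
link(2, 10, "c")

print(relations[1][20])      # "b"
print(sorted(relations[1]))  # [10, 20] -- all bars linked to foo 1
```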
When I put my database file (which is a .sdb) into a directory and try to access it from that directory, I receive an error. The error reads "unable to open database file". For example, let's say my .sdb file is in the "data" directory and I use the command "con = lite.connect('data\noktalar.sdb')", this error occurs. ...
3
1
0.099668
0
false
5,321,757
0
324
1
0
0
5,321,699
Where is your Python process running from? Try pointing to the absolute path of the file, and when writing the path use a raw string: r'C:\mypath\data\noktalar.sdb'.
1
0
0
Python Database Error
2
python,sqlite
0
2011-03-16T06:14:00.000
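The path in the question illustrates exactly why the answer recommends raw strings: in a plain literal, the backslash before "n" forms an escape sequence:

```python
import os.path

plain = 'data\noktalar.sdb'   # '\n' becomes a real newline -- path is corrupted
raw = r'data\noktalar.sdb'    # raw string keeps the backslash literally

print('\n' in plain)          # True
print('\n' in raw)            # False
print(os.path.isabs(raw))     # False -- an absolute path avoids cwd surprises too
```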
I'm using the Python version of Google App Engine and Datastore. What is a good way to load a table that will contain lookup data? By look up data I mean that after the initial load no rows will need to be inserted, deleted, or updated Blowing away all rows and reloading the table is not acceptable if it destroys refe...
2
2
1.2
0
true
5,331,814
1
319
1
0
0
5,328,112
If it's really created once and never changes within the lifetime of a deployment, and it's relatively small (a few megs or less), store it with your app as data files. Have the app load the data into memory initially, and cache it there.
1
0
0
Need Pattern for lookup tables in Google App Engine
2
python,google-app-engine,google-cloud-datastore
0
2011-03-16T16:05:00.000
I'm designing a Python application which works with a database. I'm planning to use sqlite. There are 15000 objects, and each object has a few attributes. Every day I need to add some data for each object (maybe create a column with the date as its name). However, I would like to easily delete the data which is too old...
0
0
0
0
false
5,339,473
0
113
2
0
0
5,335,330
If your database is pretty much a collection of almost-homogeneous data, you could just as well go for a simpler key-value database. If the main action you perform on the data is scanning through everything, it would perform significantly better. The Python standard library has bindings for popular ones via "anydbm". There is also a dict-i...
1
0
0
Please help me design a database schema for this:
3
python,sqlite,data-modeling
0
2011-03-17T05:42:00.000
I'm designing a Python application which works with a database. I'm planning to use sqlite. There are 15000 objects, and each object has a few attributes. Every day I need to add some data for each object (maybe create a column with the date as its name). However, I would like to easily delete the data which is too old...
0
0
0
0
false
5,335,386
0
113
2
0
0
5,335,330
For that size of a db, I would use something else. I've used sqlite once for a media library with about 10k objects and it was slow: about 5 minutes to query it all and display, and searches were painful. Switching to postgres made life so much easier. This is just on the performance issue. It also might be better to creat...
1
0
0
Please help me design a database schema for this:
3
python,sqlite,data-modeling
0
2011-03-17T05:42:00.000
is it possible to Insert a python tuple in a postgresql database
0
1
0.049958
0
false
5,342,409
0
4,015
2
0
0
5,342,359
Really we need more information. What data is inside the tuple? Is it just integers? Just strings? Is it megabytes of images? If you had a Python tuple like (4,6,2,"Hello",7) you could insert the string '(4,6,2,"Hello",7)' into a Postgres database, but that's probably not the answer you're looking for. You really need ...
1
0
0
is it possible to Insert a python tuple in a postgresql database
4
python,database,postgresql
0
2011-03-17T16:47:00.000
is it possible to Insert a python tuple in a postgresql database
0
1
0.049958
0
false
5,342,419
0
4,015
2
0
0
5,342,359
This question does not make any sense. You can insert using SQL whatever is supported by your database model. If you need a fancy mapper: look at an ORM like SQLAlchemy.
1
0
0
is it possible to Insert a python tuple in a postgresql database
4
python,database,postgresql
0
2011-03-17T16:47:00.000
I am attempting to install OpsCenter for Cassandra, using the standard RHEL image. I can't figure out how to get this to work. Another version of EPEL perhaps? yum install opscenter.... Error: Package: python26-rrdtool-1.2.27-1.i386 (opscenter) Requires: librrd.so.2
2
0
0
0
false
5,344,716
0
410
1
1
0
5,344,641
Try installing rrdtool via yum, that should contain librrd.so.2 and correct your issue.
1
0
0
Amazon Linux AMI EC2 - librrd.so.2 dependency issue
2
python,linux,centos,cassandra,yum
0
2011-03-17T20:10:00.000
I have a webserver running in Python. It gets some data from some apps and needs to store it in MongoDB. My MongoDB is sharded. Now I want my webserver to know how many shards MongoDB has. At the moment it reads this from a cfg file. There is a statement in MongoDB named printShardingStatus where you can see ...
0
0
0
0
false
5,377,084
0
1,576
1
0
0
5,350,599
You can simply get the config database and execute find() on the shards collection just like a normal collection.
1
0
0
Execute MongoDb Statements in Python
3
python,mongodb,pymongo
0
2011-03-18T10:21:00.000
I'm creating a server with Apache2 + mod_python + Django for development and would like to know how to use Mercurial to manage application development. My idea is to make the folder where Mercurial stores the project the same folder from which Django is deployed. Thank you for your attention!
0
0
1.2
0
true
5,397,870
1
627
1
0
0
5,397,528
I thought about this; it's a good idea for development. Use Mercurial in the usual way, and of course you need to deploy a Mercurial server first. If you update your Django project, it will be compiled on the fly. My workflow: set up a Mercurial server or use Bitbucket; init the repo locally; push the repo to the central repo; on the server, pull the repo...
1
0
0
How to use Mercurial to deploy Django applications?
1
python,django,mercurial,apache2,mod-python
0
2011-03-22T20:37:00.000
I have my database in MS Access 2000 .mdb format, which I downloaded from the net, and now I want to access that database from my program, which is a Python script. Can I query tables from my program? I would be very grateful if anyone could suggest what to do.
2
0
0
0
false
5,402,549
0
5,861
1
0
0
5,402,463
Create an ODBC DSN with this MDB. Python can access ODBC data sources.
1
0
0
How do I access a .mdb file from python?
3
python,ms-access
0
2011-03-23T08:16:00.000
I am using the function open_workbook() to open an excel file. But I cannot find any function to close the file later in the xlrd module. Is there a way to close the xls file using xlrd? Or is not required at all?
23
6
1
0
false
5,404,018
0
22,147
1
0
0
5,403,781
open_workbook() calls release_resources() (which closes the mmapped file) before returning.
1
0
0
Is there a way to close a workbook using xlrd
2
python,xlrd
0
2011-03-23T10:24:00.000
Scenario: Entity1 (id, itmname), Entity2 (id, itmname, price), Entity3 (id, itmname, profit); profit and price are both IntegerProperty. I want to count all the items with price more than 500 and profit more than 10. I know it's a join operation and is not supported by Google. I tried my best to find a way other than executi...
1
0
0
0
false
5,415,555
1
372
1
1
0
5,415,342
The standard solution to this problem is denormalization. Try storing a copy of price and profit in Entity1 and then you can answer your question with a single, simple query on Entity1.
1
0
0
Optimizing join query performance in google app engine
2
python,google-app-engine
0
2011-03-24T05:58:00.000
I have to re-design an existing application which uses Pylons (Python) on the backend and GWT on the frontend. In the course of this re-design I can also change the backend system. I tried to read up on the advantages and disadvantages of various backend systems (Java, Python, etc) but I would be thankful for some feed...
4
1
1.2
0
true
5,421,810
1
1,559
1
1
0
5,417,372
We had the same dilemma in the past. I was involved in designing and building a system that had a GWT frontend and Java (Spring, Hibernate) backend. Some of our other (related) systems were built in Python and Ruby, so the expertise was there, and a question just like yours came up. We decided on Java mainly so we cou...
1
0
0
Feedback on different backends for GWT
1
java,python,gwt,architecture,web-frameworks
0
2011-03-24T09:56:00.000
Is there a way to reduce the I/O's associated with either mysql or a python script? I am thinking of using EC2 and the costs seem okay except I can't really predict my I/O usage and I am worried it might blindside me with costs. I basically develop a python script to parse data and upload it into mysql. Once its in...
2
0
0
0
false
5,426,527
0
202
1
1
0
5,425,289
You didn't really specify whether it was writes or reads. My guess is that you can do it all in a mysql instance in a ramdisc (tmpfs under Linux). Operations such as ALTER TABLE and copying big data around end up creating a lot of IO requests because they move a lot of data. This is not the same as if you've just got a...
1
0
0
reducing I/O on application and database
2
python,mysql,amazon-ec2,mysql-management
1
2011-03-24T20:44:00.000
I'm using MySQLdb to access a MySQL database from Python. I need to know if the connection with the database is still alive. Is there an attribute or method to do this? Thanks!
0
0
0
0
false
5,430,722
0
146
1
0
0
5,430,652
To be honest, I haven't used mysqldb in python in a very long time. That being said, I would suggest using an execute("now()") (or "select 1", any other "dummy" SQL command) and handle any errors. edit: That should also probably be part of a class you're using. Don't fill your entire project with .execute("now()") on ...
1
0
0
Verify the connection with MySQL database
1
python-3.x,mysql-python
0
2011-03-25T09:31:00.000
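The "dummy query" liveness check the answer recommends can be sketched with any DB-API connection; sqlite3 stands in for MySQLdb here:

```python
import sqlite3

# Run a cheap query; if it raises, treat the connection as dead.
def connection_alive(conn):
    try:
        conn.execute("SELECT 1")
        return True
    except Exception:
        return False

conn = sqlite3.connect(":memory:")
print(connection_alive(conn))  # True
conn.close()
print(connection_alive(conn))  # False -- executing on a closed connection raises
```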
save_or_update has been removed in 0.6. Are there alternatives to use them in 0.6 and above? I noticed the existence of the method _save_or_update_state for session objects, but there are no docs on this method.
2
1
0.066568
0
false
5,469,880
0
4,986
2
0
0
5,442,825
Session.merge() works fine for both new and existing objects. But you have to remember that merge() returns an object bound to the session, as opposed to add() (and save_or_update() in old versions), which puts the object passed as an argument into the session. This behavior is required to ensure there is a single object for each ...
1
0
0
save_or_update using SQLalchemy 0.6
3
python,sql,sqlalchemy
0
2011-03-26T14:05:00.000
save_or_update has been removed in 0.6. Are there alternatives to use them in 0.6 and above? I noticed the existence of the method _save_or_update_state for session objects, but there are no docs on this method.
2
-1
-0.066568
0
false
11,861,997
0
4,986
2
0
0
5,442,825
session.merge() will not work if you have your db setup as a master-slave, where you typically want to query from the slave, but write to the master. I have such a setup, and ended up re-querying from the master just before the writing, then using a session.add() if the data is indeed not there on the master.
1
0
0
save_or_update using SQLalchemy 0.6
3
python,sql,sqlalchemy
0
2011-03-26T14:05:00.000
I'm actually working on a search engine project. We are working with Python + MongoDB. I have a pymongo cursor after executing a find() command on the Mongo db. The pymongo cursor has around 20k results. I have noticed that the iteration over the pymongo cursor is really slow compared with a normal iteration over for ...
10
1
0.049958
0
false
7,828,897
0
14,362
2
0
0
5,480,340
The default cursor size is 4 MB, and the maximum it can go to is 16 MB. You can try to increase your cursor size until that limit is reached and see if you get an improvement, but it also depends on what your network can handle.
1
0
1
Python + MongoDB - Cursor iteration too slow
4
python,mongodb,performance,iteration,database-cursor
0
2011-03-29T23:52:00.000
I'm actually working on a search engine project. We are working with Python + MongoDB. I have a pymongo cursor after executing a find() command on the Mongo db. The pymongo cursor has around 20k results. I have noticed that the iteration over the pymongo cursor is really slow compared with a normal iteration over for ...
10
-4
-1
0
false
5,480,531
0
14,362
2
0
0
5,480,340
You don't provide any information about the overall document sizes. Fetching such an amount of documents requires both network traffic and I/O on the database server. Is the performance still bad even in a "hot" state with warm caches? You can use "mongosniff" in order to inspect the "wire" activity and system tools lik...
1
0
1
Python + MongoDB - Cursor iteration too slow
4
python,mongodb,performance,iteration,database-cursor
0
2011-03-29T23:52:00.000
I'm in the planning phase of an Android app which synchronizes to a web app. The web side will be written in Python with probably Django or Pyramid while the Android app will be straightforward java. My goal is to have the Android app work while there is no data connection, excluding the social/web aspects of the app...
4
1
0.197375
0
false
11,871,778
1
1,618
1
0
0
5,544,689
1) Looks like this is pretty good way to manage your local & remote changes + support offline work. I don't think this is overkill 2) I think, you should cache user's changes locally with local timestamp until synchronizing is finished. Then server should manage all processing: track current version, commit and rollbac...
1
0
0
Android app database syncing with remote database
1
python,android
0
2011-04-04T21:42:00.000
when I launch my application with apache2+modwsgi I catch Exception Type: ImportError Exception Value: DLL load failed: The specified module could not be found. in line from lxml import etree with Django dev server all works fine Visual C++ Redistributable 2008 installed Dependency walker told that msvcrt90.dll...
2
2
1.2
0
true
5,559,988
1
1,054
1
0
0
5,552,162
It is indeed because of 'msvcrt90.dll'. From some micro (patch) revision of Python 2.6 onwards, they stopped building automatic dependencies on the DLL into extension modules and relied on the Python executable doing it. When embedded in other systems, however, you are then dependent on that executable linking to the DLL, and in ...
1
0
0
problem with soaplib (lxml) with apache2 + mod_wsgi
1
python,apache2,mingw,lxml,cx-oracle
0
2011-04-05T12:55:00.000
I'm currently working on a proof of concept application using Python 3.2 via SQLAlchemy with a MS SQL Server back end. Thus far, I'm hitting a brick wall looking for ways to actually do the connection. Most discussions point to using pyODBC, however it does not support Python 3.x yet. Does anyone have any connection ...
0
0
1.2
0
true
5,559,890
0
867
1
0
0
5,559,645
At this moment none of the known Python drivers for connecting to SQL Server has a Python 3 compatible version: PyODBC, mxODBC, pymssql, zxjdbc, AdoDBAPI.
1
0
0
SQLAlchemy 3.2 and MS SQL Connectivity
1
python,sql-server,sqlalchemy
0
2011-04-05T23:10:00.000
I'm working on a python server which concurrently handles transactions on a number of databases, each storing performance data about a different application. Concurrency is accomplished via the Multiprocessing module, so each transaction thread starts in a new process, and shared-memory data protection schemes are not ...
0
0
0
0
false
5,559,724
0
327
2
0
0
5,559,660
You could capture the error when trying to create the file in your code and in your exception handler, check if the file exists and use the existing file instead of creating it.
1
0
1
Prevent a file from being created in python
5
python,sqlite
0
2011-04-05T23:13:00.000
I'm working on a python server which concurrently handles transactions on a number of databases, each storing performance data about a different application. Concurrency is accomplished via the Multiprocessing module, so each transaction thread starts in a new process, and shared-memory data protection schemes are not ...
0
0
0
0
false
5,559,768
0
327
2
0
0
5,559,660
You didn't mention the platform, but on linux open(), or os.open() in python, takes a flags parameter which you can use. The O_CREAT flag creates a file if it does not exist, and the O_EXCL flag gives you an error if the file already exists. You'll also be needing O_RDONLY, O_WRONLY or O_RDWR for specifying the access ...
1
0
1
Prevent a file from being created in python
5
python,sqlite
0
2011-04-05T23:13:00.000
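The O_CREAT | O_EXCL combination the answer describes can be shown directly; the atomic create-or-fail is exactly the guard against two processes creating the same database file:

```python
import os
import tempfile

# O_CREAT | O_EXCL atomically creates the file, failing if it already exists.
path = os.path.join(tempfile.mkdtemp(), "perf.db")

fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
os.close(fd)

try:
    os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    created_twice = True
except FileExistsError:
    created_twice = False

print(created_twice)  # False
```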
Any idea how I could run a bunch of .sql files that contain lots of functions from within sqlalchemy, after I create the schema? I've tried using DDL(), engine.text(<text>).execute(), and engine.execute(<text>). None of them works; they fail either because of improper escaping or some other weird errors. I am using s...
2
1
0.197375
0
false
5,564,716
0
1,729
1
0
0
5,563,437
You can't do that. You must parse the file and split it into individual SQL commands, and then execute each one separately in a transaction.
1
0
0
run .sql files from within sqlalchemy
1
python,sqlalchemy
0
2011-04-06T08:25:00.000
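The split-and-execute approach the answer describes can be sketched naively. Real scripts need smarter parsing (semicolons can appear inside string literals or function bodies), and sqlite3 stands in here for the SQLAlchemy engine, but this shows the shape of the loop:

```python
import sqlite3

script = "CREATE TABLE t (id INTEGER); INSERT INTO t VALUES (1);"

# Naive split: one SQL command per semicolon.
statements = [s.strip() for s in script.split(";") if s.strip()]
print(statements)

con = sqlite3.connect(":memory:")
for stmt in statements:
    con.execute(stmt)  # one statement per execute() call
print(con.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1
```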
I have been trying to generate data in Excel. I generated .CSV file. So up to that point it's easy. But generating graph is quite hard in Excel... I am wondering, is python able to generate data AND graph in excel? If there are examples or code snippets, feel free to post it :) Or a workaround can be use python to gene...
6
2
0.07983
1
false
5,568,485
0
48,351
1
0
0
5,568,319
I suggest you try gnuplot for drawing graphs from data files.
1
0
0
use python to generate graph in excel
5
python,excel,charts,export-to-excel
0
2011-04-06T14:47:00.000
I have a Python program which makes use of a MySQL database. I am getting the following error; I would be very grateful if someone could help me find a solution. Traceback (most recent call last): File "version2_1.py", line 105, in refine(wr,w)#function for replacement File "version2_1.py", line 49, in refine wrds=db_connect....
1
0
0
0
false
5,606,690
0
3,356
1
0
0
5,606,665
Looks like you have an incorrect username/password for MySQL. Try creating a user in MySQL and use that to connect.
1
0
0
Error when trying to execute a Python program that uses MySQL
3
python,mysql,mysql-error-1045
0
2011-04-09T17:37:00.000
from the interpreter i can issue >>> from MySQLdb just fine. so, I'm assuming the module did actually load. My source looks as follows: from Tkinter import * from MySQLdb import * """ Inventory control for Affordable Towing Functions: connection() - Controls database connection delete() - Remove item fro...
0
1
0.099668
0
false
5,609,341
0
7,873
1
0
0
5,609,322
from MySQLdb import * and import MySQLdb do very different things.
1
0
0
python2.6 with MySQLdb, NameError 'MySQLdb' not defined
2
python,mysql,programming-languages,network-programming
0
2011-04-10T02:12:00.000
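The difference the answer points at can be demonstrated with a stdlib module standing in for MySQLdb: "import m" binds the module name, while "from m import *" binds the module's contents but not the name itself.

```python
# Run "from math import *" in a fresh namespace and inspect what got bound.
ns = {}
exec("from math import *", ns)

print("sqrt" in ns)  # True  -- the module's contents were imported
print("math" in ns)  # False -- the name "math" itself was not bound
```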
Is there some module to allow for easy DB provider configuration via connection string, similar to PHP's PDO where I can nicely say "psql://" or "mysql://" or, in this python project, am I just going to have to code some factory classes that use MySQLdb, psycopg2, etc?
0
0
0
0
false
5,617,901
0
413
1
0
0
5,617,246
There's something not quite as nice in logilab.database, but which works quite well (http://www.logilab.org/project/logilab-database). Supports sqlite, mysql, postgresql and some versions of mssql, and some abstraction mechanisms on the SQL understood by the different backend engines.
1
0
0
python and DB connection abstraction?
2
python
0
2011-04-11T05:49:00.000
Python hangs on lxml.etree.XMLSchema(tree) when I use it on apache server + mod_wsgi (Windows) When I use Django dev server - all works fine if you know about other nice XML validation solution against XSD, tell me pls Update: I'm using soaplib, which uses lxml logger.debug("building schema...") self.schema = etree.X...
5
1
0.066568
0
false
6,176,299
1
1,123
2
0
0
5,617,599
I had a similar problem on a Linux system. Try installing a more recent version of libxml2 and reinstalling lxml, at least that's what did it for me.
1
0
0
Python hangs on lxml.etree.XMLSchema(tree) with apache + mod_wsgi
3
python,apache,mod-wsgi,lxml,xml-validation
0
2011-04-11T06:34:00.000
Python hangs on lxml.etree.XMLSchema(tree) when I use it on apache server + mod_wsgi (Windows) When I use Django dev server - all works fine if you know about other nice XML validation solution against XSD, tell me pls Update: I'm using soaplib, which uses lxml logger.debug("building schema...") self.schema = etree.X...
5
2
0.132549
0
false
6,685,198
1
1,123
2
0
0
5,617,599
I had the same problem (lxml 2.2.6, mod_wsgi 3.2). A work around for this is to pass a file or filename to the constructor: XMLSchema(file=).
1
0
0
Python hangs on lxml.etree.XMLSchema(tree) with apache + mod_wsgi
3
python,apache,mod-wsgi,lxml,xml-validation
0
2011-04-11T06:34:00.000
I have a sentence like "the cat sat on the mat" stored as a single sql field. I want to periodically search for keywords which are not in a stop list, in this case: cat sat mat. What's the best way to store them in an SQL table for quick searching? As far as I can see it, I see the following options: Up to [n] additio...
1
1
0.066568
0
false
5,627,582
0
290
2
0
0
5,627,140
I do something similar with SQLite too. In my experience it's not as fast as other db's in this type of situation so it pays to make your schema as simple as possible. Up to [n] additional columns per row, one for each word. Store all of the interesting words in a single, comma separated field. A new table, linked to ...
1
0
0
Storing interesting words from a sentence
3
python,sql,sqlite
0
2011-04-11T20:26:00.000
I have a sentence like the cat sat on the mat stored as a single sql field. I want to periodically search for keywords which are not not in a stop list, in this case cat sat mat What's the best way to store them in an SQL table for quick searching? As far as I can see it I see the following options Up to [n] additio...
1
1
1.2
0
true
5,627,243
0
290
2
0
0
5,627,140
I would suggest giving your sentences a key, likely IDENTITY. I would then create a second table linking to your sentence table, with a row for each interesting word. If you'd like to search for say, words starting with ca- if you stored these words in a comma delimited you'd have to wildcard the start and end, wherea...
1
0
0
Storing interesting words from a sentence
3
python,sql,sqlite
0
2011-04-11T20:26:00.000
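The accepted answer's schema (a keyed sentence table plus a linked word table) can be sketched with sqlite3; the table and column names here are illustrative, not from the original post:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One row per sentence, one row per interesting word, linked by sentence_id.
cur.execute("CREATE TABLE sentences (id INTEGER PRIMARY KEY, text TEXT)")
cur.execute("CREATE TABLE words (sentence_id INTEGER REFERENCES sentences(id), word TEXT)")
cur.execute("CREATE INDEX idx_words_word ON words(word)")

cur.execute("INSERT INTO sentences (id, text) VALUES (1, 'the cat sat on the mat')")
cur.executemany("INSERT INTO words VALUES (1, ?)", [("cat",), ("sat",), ("mat",)])

# Prefix search: 'ca%' can use the index; with a comma-separated field
# you would need '%ca%', which cannot.
cur.execute(
    "SELECT DISTINCT s.text FROM sentences s "
    "JOIN words w ON w.sentence_id = s.id WHERE w.word LIKE 'ca%'")
rows = cur.fetchall()
print(rows)  # [('the cat sat on the mat',)]
```

This is exactly why the answer recommends the separate table over a delimited column: the prefix pattern stays index-friendly.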
Beginner question- what is the difference between sqlite and sqlalchemy?
37
66
1
0
false
5,632,745
0
24,446
1
0
0
5,632,677
They're apples and oranges. Sqlite is a database storage engine, which can be better compared with things such as MySQL, PostgreSQL, Oracle, MSSQL, etc. It is used to store and retrieve structured data from files. SQLAlchemy is a Python library that provides an object relational mapper (ORM). It does what it suggests: ...
1
0
0
What is the difference between sqlite3 and sqlalchemy?
2
python,sqlite,sqlalchemy
0
2011-04-12T08:54:00.000
I have a really large excel file and i need to delete about 20,000 rows, contingent on meeting a simple condition and excel won't let me delete such a complex range when using a filter. The condition is: If the first column contains the value, X, then I need to be able to delete the entire row. I'm trying to automate t...
5
12
1.2
0
true
5,635,203
0
51,378
1
0
0
5,635,054
Don't delete. Just copy what you need: read the original file, open a new file, iterate over rows of the original file (if the first column of the row does not contain the value X, add the row to the new file), close both files, and rename the new file to the original file's name.
1
0
0
Python to delete a row in excel spreadsheet
6
python,excel,xlwt
0
2011-04-12T12:19:00.000
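The copy-don't-delete pattern from the accepted answer, sketched with the stdlib csv module (for a real .xls file, xlrd and xlwt would play the reader and writer roles; the in-memory buffers here just stand in for the two files):

```python
import csv
import io

def filter_rows(reader, bad_value):
    """Yield only the rows whose first column is not bad_value."""
    for row in reader:
        if row and row[0] == bad_value:
            continue
        yield row

# Stand-ins for the original spreadsheet and the new copy.
src = io.StringIO("X,drop me\nkeep,1\nX,drop too\nkeep,2\n")
dst = io.StringIO()

csv.writer(dst).writerows(filter_rows(csv.reader(src), "X"))
print(dst.getvalue())
```

Because nothing is deleted in place, the loop is a single linear pass regardless of how many of the 20,000 rows match.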
After much study and investigation, I've decided to do my Python development with pyQT4 using Eric5 as the editor. However, I've run into a brick wall with trying to get MySQL to work. It appears that there's an issue with the QMySQL driver. From the discussions that I've seen so far, the only fix is to install the ...
1
2
0.132549
0
false
5,643,057
0
6,327
1
0
0
5,642,537
yes that will work, I do the same thing. I like a programming API, like what SQLAlchemy provides over the Raw SQL version of Qt's QtSql module. It works fine and nice, just populate a subclassed QAbstractTableModel with data from your sqlalchemy queries, like you would with data from any other python object. This tho...
1
0
0
pyQT and MySQL or MSSQL Connectivity
3
python,mysql,sql-server,pyqt
0
2011-04-12T22:50:00.000
I am very new to python and Django, was actually thrown in to finish off some coding for my company since our coder left for overseas. When I run python manage.py syncdb I receive the following error psycopg2.OperationalError: FATAL: password authentication failed for user "winepad" I'm not sure why I am being prompte...
0
1
0.099668
0
false
5,643,247
1
3,164
1
0
0
5,643,201
Check your settings.py file. The most likely reason for this issue is that the username for the database is set to "winepad". Change that to the appropriate value and rerun python manage.py syncdb That should fix the issue.
1
0
0
python manage.py syncdb
2
python,django,postgresql
0
2011-04-13T00:37:00.000
I have just begun learning Python. Eventually I will learn Django, as my goal is to able to do web development (video sharing/social networking). At which point should I begin learning MySQL? Do I need to know it before I even begin Django? If so, how much should I look to know before diving into Django? Thank you.
1
0
0
0
false
5,643,494
1
1,826
2
0
0
5,643,400
Django uses its own ORM, so I guess it's not completely necessary to learn MySQL first, but I suspect it would help a fair bit to know what's going on behind the scenes, and it will help you think in the correct way to formulate your queries. I would start learning MySQL (or any other SQL), after you've got a pretty go...
1
0
0
Beginning MySQL/Python
4
python,mysql,django,new-operator
0
2011-04-13T01:16:00.000
I have just begun learning Python. Eventually I will learn Django, as my goal is to able to do web development (video sharing/social networking). At which point should I begin learning MySQL? Do I need to know it before I even begin Django? If so, how much should I look to know before diving into Django? Thank you.
1
0
0
0
false
5,654,701
1
1,826
2
0
0
5,643,400
As the Django documentation somewhat recommends, it is better to learn PostgreSQL. PostgreSQL works nicely with Django; I have never had any problem with Django/PostgreSQL. All I know is that I sometimes get weird errors when working with MySQL.
1
0
0
Beginning MySQL/Python
4
python,mysql,django,new-operator
0
2011-04-13T01:16:00.000
I am new to Python and having some rudimentary problems getting MySQLdb up and running. I'm hoping somebody out there can help me. When I first tried to install the module using setup.py, the setup terminated because it was unable to find mysql_config. This is because I didn't realize the module expected MySQL to be ...
0
0
0
0
false
5,644,390
0
125
1
0
0
5,644,374
Install the MySQL client libraries. Install the MySQL client library development files, and build again.
1
0
0
Help with MySQLdb module: corrupted installation and connecting to remote servers
1
python,mysql,python-module,mysql-python,setup.py
0
2011-04-13T04:23:00.000
I am facing a problem where I am trying to add data from a python script to mysql database with InnonDB engine, it works fine with myisam engine of the mysql database. But the problem with the myisam engine is that it doesn't support foreign keys so I'll have to add extra code each place where I want to insert/delete r...
3
6
1.2
0
true
5,654,733
0
1,182
1
0
0
5,654,107
InnoDB is transactional. You need to call connection.commit() after inserts/deletes/updates. Edit: you can call connection.autocommit(True) to turn on autocommit.
1
0
0
Problem in insertion from python script in mysql database with innondb engine
2
python,mysql,innodb,myisam
0
2011-04-13T18:55:00.000
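The commit requirement in the accepted answer is not MySQL-specific; the same transactional behavior can be demonstrated with sqlite3 standing in for the InnoDB connection (with MyISAM there is no transaction, which is why the script "worked" there):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.commit()

conn.execute("INSERT INTO t VALUES (1)")
conn.close()  # closed WITHOUT commit -> the insert is rolled back

conn = sqlite3.connect(path)
rows_without_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]

conn.execute("INSERT INTO t VALUES (1)")
conn.commit()  # this time the insert is durable
conn.close()

conn = sqlite3.connect(path)
rows_with_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(rows_without_commit, rows_with_commit)  # 0 1
```

With MySQLdb the fix is the same shape: call connection.commit() after the inserts/deletes, or enable autocommit as the answer notes.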
I have a query set of approximately 1500 records from a Django ORM query. I have used the select_related() and only() methods to make sure the query is tight. I have also used connection.queries to make sure there is only this one query. That is, I have made sure no extra queries are getting called on each iteration...
7
3
0.148885
0
false
5,656,734
1
4,372
2
0
0
5,656,238
1500 records is far from being a large dataset, and seven seconds is really too much. There is probably some problem in your models. You can easily check it by getting (as Brandon says) the values() query, and then explicitly creating the 1500 objects by iterating over the dictionaries. Just convert the ValuesQuerySet into a lis...
1
0
0
How do I speed up iteration of large datasets in Django
4
python,django
0
2011-04-13T22:03:00.000
I have a query set of approximately 1500 records from a Django ORM query. I have used the select_related() and only() methods to make sure the query is tight. I have also used connection.queries to make sure there is only this one query. That is, I have made sure no extra queries are getting called on each iteration...
7
1
0.049958
0
false
5,657,066
1
4,372
2
0
0
5,656,238
Does your model's Meta declaration tell it to "order by" a field that is stored off in some other related table? If so, your attempt to iterate might be triggering 1,500 queries as Django runs off and grabs that field for each item, and then sorts them. Showing us your code would help us unravel the problem!
1
0
0
How do I speed up iteration of large datasets in Django
4
python,django
0
2011-04-13T22:03:00.000
I have a group of related companies that share items they own with one-another. Each item has a company that owns it and a company that has possession of it. Obviously, the company that owns the item can also have possession of it. Also, companies sometimes permanently transfer ownership of items instead of just lendin...
2
0
0
0
false
5,656,695
1
198
1
0
0
5,656,345
Option #1 is probably the cleanest choice. An Item has only one owner company and is possessed by only one possessing company. Put two FK to Company in Item, and remember to explicitly define the related_name of the two inverses to be different each other. As you want to avoid touching the Item model, either add the ...
1
0
0
How to model lending items between a group of companies
3
django,design-patterns,database-design,django-models,python
0
2011-04-13T22:17:00.000
I am working on a realtime data website that has a data-mining backend side to it. I am highly experienced in both Python and C++/C#, and wondering which one would be preferable for the backend development. I am strongly leaning towards Python for its available libraries and ease of use. But am I wrong? If so, why? As ...
1
1
0.197375
0
false
5,658,631
0
1,302
1
0
0
5,658,529
We have done backend development based on Zope, Python and other Python-related stuff for almost 15 years. Python gives you great flexibility with batteries included (likely also true for C#, not sure about C++). If you do RDBMS development with Python: SQLAlchemy is the way to go. It provides huge functionality and saved m...
1
0
0
Designing a Website Backend - Python or C++/C#?
1
c#,c++,python,backend
1
2011-04-14T04:27:00.000
I am trying to insert a query that contains é - or \xe9 (INSERT INTO tbl1 (text) VALUES ("fiancé")) into a MySQL table in Python using the _mysql module. My query is in unicode, and when I call _mysql.connect(...).query(query) I get a UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in position X : ordi...
0
0
0
0
false
5,658,972
0
522
1
0
0
5,658,737
I know this doesn't directly answer your question, but why aren't you using prepared statements? That will do two things: probably fix your problem, and almost certainly fix the SQLi bug you've almost certainly got. If you won't do that, are you absolutely certain your string itself is unicode? If you're just naively u...
1
0
0
Python MySQL Unicode Error
1
python,mysql,unicode
0
2011-04-14T04:55:00.000
I'm building a WSGI web app and I have a MySQL database. I'm using MySQLdb, which provides cursors for executing statements and getting results. What is the standard practice for getting and closing cursors? In particular, how long should my cursors last? Should I get a new cursor for each transaction? I believe you ne...
95
-6
-1
0
false
5,670,056
0
99,138
1
0
0
5,669,878
I suggest doing it like PHP and MySQL: open the connection at the beginning of your code, before printing the first data. That way, if you get a connect error you can display a 50x error message (I don't remember which internal error it is). Keep the connection open for the whole session and close it when you know you won't need it anymore.
1
0
0
When to close cursors using MySQLdb
5
python,mysql,mysql-python
0
2011-04-14T21:23:00.000
I am now working on a big backend system for a real-time and history tracking web service. I am highly experienced in Python and intend to use it with sqlalchemy (MySQL) to develop the backend. I don't have any major experience developing robust and sustainable backend systems and I was wondering if you guys could poin...
0
0
0
0
false
5,671,966
1
4,070
1
0
0
5,670,639
Use Apache, Django and Piston. Use REST as the protocol. Write as little code as possible. Django models, forms, and admin interface. Piston wrappers for your resources.
1
0
0
Python Backend Design Patterns
2
python,backend
0
2011-04-14T22:55:00.000
I want to generate compound charts (e.g: Bar+line) from my database using python. How can i do this ? Thanks in Advance
0
1
0.049958
1
false
6,272,840
0
277
1
0
0
5,693,151
Pretty easy to do with pygooglechart - You can basically follow the bar chart examples that ship with the software and then use the add_data_line method to make the lines on top of the bar chart
1
0
0
Compoud charts with python
4
python,charts
0
2011-04-17T11:09:00.000
I'm running django site with MySQL as DB back-end. Finally i've got 3 millions rows in django_session table. Most of them are expired, thus i want to remove them. But if i manually run delete from django_session where expire_date < "2011-04-18" whole site seems to be hanged - it cannot be accessed via browser. Why suc...
1
1
0.049958
0
false
5,703,375
1
513
2
0
0
5,703,308
I am not a MySQL expert, but I guess MySQL locks the table for the delete, and this might be related to the MySQL transaction handling/backend. While the delete is in progress, MySQL blocks access to the table from other connections. MyISAM and InnoDB backend behavior might differ. I suggest you study the MySQL manual related to this: the ...
1
0
0
MySQL&django hangs on huge session delete
4
python,mysql,django
0
2011-04-18T13:03:00.000
I'm running django site with MySQL as DB back-end. Finally i've got 3 millions rows in django_session table. Most of them are expired, thus i want to remove them. But if i manually run delete from django_session where expire_date < "2011-04-18" whole site seems to be hanged - it cannot be accessed via browser. Why suc...
1
5
1.2
0
true
5,703,378
1
513
2
0
0
5,703,308
If your table is MyISAM, DELETE operations lock the table and it is not accessible by the concurrent queries. If there are many records to delete, the table is locked for too long. Split your DELETE statement into several shorter batches.
1
0
0
MySQL&django hangs on huge session delete
4
python,mysql,django
0
2011-04-18T13:03:00.000
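The batching the accepted answer recommends can be sketched like this; sqlite3 stands in for MySQL here (on MySQL you could use DELETE ... LIMIT directly), and the batch size is an illustrative choice:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE django_session (session_key INTEGER PRIMARY KEY, expire_date TEXT)")
# Half the sessions are expired, half are not.
conn.executemany(
    "INSERT INTO django_session VALUES (?, ?)",
    [(i, "2011-01-01" if i % 2 else "2011-12-31") for i in range(10000)])

BATCH = 500
while True:
    # One short batch per transaction, so the table is never locked for long.
    cur = conn.execute(
        "DELETE FROM django_session WHERE session_key IN ("
        "  SELECT session_key FROM django_session"
        "  WHERE expire_date < '2011-04-18' LIMIT ?)",
        (BATCH,))
    conn.commit()
    if cur.rowcount == 0:
        break

remaining = conn.execute("SELECT COUNT(*) FROM django_session").fetchone()[0]
print(remaining)  # 5000
```

Each short statement releases its locks quickly, so concurrent requests to the site get served between batches instead of hanging behind one multi-million-row delete.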
I have had a virtualenv for Trunk up and running for a while, but now I am trying to branch, and get things setup on another virtualenv for my 'refactor' branch. Everything looks to be setup correctly, but when I try to run any manage.py commands, I get this error: _mysql_exceptions.OperationalError: (1045, "Access den...
1
1
1.2
0
true
8,742,834
1
1,450
1
0
0
5,726,440
I found the problem I was having. Django was importing a different settings.py file. I had another django project inside my django project, like myproject/myproject/. Instead of importing myproject/settings.py, it was importing myproject/myproject/settings.py. I assume that Aptana Studio created that project there. If ...
1
0
0
Can't access MySQL database in Django VirtualEnv on localhost
1
python,mysql,django,mysql-error-1045
0
2011-04-20T06:39:00.000
In Python, is there a way to get notified that a specific table in a MySQL database has changed?
10
1
0.039979
0
false
5,771,943
0
30,571
2
0
0
5,771,925
Not possible with standard SQL functionality.
1
0
0
python: how to get notifications for mysql database changes?
5
python,mysql
0
2011-04-24T17:03:00.000
In Python, is there a way to get notified that a specific table in a MySQL database has changed?
10
10
1
0
false
5,771,988
0
30,571
2
0
0
5,771,925
It's theoretically possible but I wouldn't recommend it: Essentially you have a trigger on the table that calls a UDF which communicates with your Python app in some way. Pitfalls include what happens if there's an error? What if it blocks? Anything that happens inside a trigger should ideally be near-instant. Wh...
1
0
0
python: how to get notifications for mysql database changes?
5
python,mysql
0
2011-04-24T17:03:00.000
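A less fragile alternative to the trigger-plus-UDF route discussed above is to have Python poll a last-modified marker. The sketch below uses sqlite3 and a hypothetical table_versions table purely to illustrate the idea; the trigger here only touches the marker, so it stays near-instant:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_versions (tbl TEXT PRIMARY KEY, version INTEGER)")
conn.execute("INSERT INTO table_versions VALUES ('orders', 0)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")

# Every write bumps the version; a real app could do this in a trigger or in code.
conn.execute(
    "CREATE TRIGGER bump AFTER INSERT ON orders BEGIN "
    "UPDATE table_versions SET version = version + 1 WHERE tbl = 'orders'; END")

def current_version(conn):
    return conn.execute(
        "SELECT version FROM table_versions WHERE tbl = 'orders'").fetchone()[0]

seen = current_version(conn)
conn.execute("INSERT INTO orders VALUES (1)")   # some other part of the app writes
changed = current_version(conn) != seen         # the poller notices on its next pass
print(changed)  # True
```

The Python side then just sleeps and re-checks the version at whatever interval is acceptable, instead of blocking inside the database.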
I am developing a database based django application and I have installed apache, python and django using macport on a snow leopard machine. I ran into issues installing MySQL with macport. But I was able to successfully install a standalone MySQL server (from MySQL.com). Is it possible to remove the MysQL package insta...
0
2
1.2
0
true
5,782,960
1
182
1
0
0
5,782,875
To use py26-mysql you don't need the entire server distribution for MySQL. You do need the client libs, at the very least. If you remove the server, you need to make sure you re-install the base libraries needed by the Python module to function.
1
0
0
Is it possible to install py26-mysql without installing mysql5 package?
1
python,mysql,macports
0
2011-04-25T20:26:00.000
I have some database structure; as most of it is irrelevant for us, i'll describe just some relevant pieces. Let's lake Item object as example: items_table = Table("invtypes", gdata_meta, Column("typeID", Integer, primary_key = True), Column("typeName", String, index=True), ...
7
7
1.2
0
true
5,819,858
0
3,854
1
0
0
5,795,492
To force loading of lazy attributes, just access them. This is the simplest way and it works fine for relations, but is not as efficient for Columns (you will get a separate SQL query for each column in the same table). You can get a list of all unloaded properties (both relations and columns) from sqlalchemy.orm.attributes.ins...
1
0
0
Completing object with its relations and avoiding unnecessary queries in sqlalchemy
2
python,sqlalchemy,eager-loading
0
2011-04-26T19:35:00.000
We're currently in the process of implementing a CRM-like solution internally for a professional firm. Due to the nature of the information stored, and the varying values and keys for the information we decided to use a document storage database, as it suited the purposes perfectly (In this case we chose MongoDB). As p...
16
1
0.049958
0
false
5,821,550
0
2,546
2
0
0
5,817,182
Stay with MongoDB. Two reasons: 1. it's better to stay in the same domain if you can, to reduce complexity, and 2. MongoDB is excellent for querying and requires less work than Redis, for example.
1
0
0
Using MongoDB as our master database, should I use a separate graph database to implement relationships between entities?
4
python,django,mongodb,redis,neo4j
0
2011-04-28T10:28:00.000
We're currently in the process of implementing a CRM-like solution internally for a professional firm. Due to the nature of the information stored, and the varying values and keys for the information we decided to use a document storage database, as it suited the purposes perfectly (In this case we chose MongoDB). As p...
16
6
1
0
false
5,836,158
0
2,546
2
0
0
5,817,182
The documents in MongoDB very much resemble nodes in Neo4j, minus the relationships. They both hold key-value properties. If you've already made the choice to go with MongoDB, then you can use Neo4j to store the relationships and then bridge the stores in your application. If you're choosing new technology, you can go ...
1
0
0
Using MongoDB as our master database, should I use a separate graph database to implement relationships between entities?
4
python,django,mongodb,redis,neo4j
0
2011-04-28T10:28:00.000
I built a previous program that took client info and stored it in a folder of txt files (impractical much) but now I want to upgrade the program to be more efficient and put the info into a database of some sort... How can I take the info from the text files and add them to the new database without having to manually d...
2
0
0
0
false
15,442,076
0
5,085
1
0
0
5,823,236
The main reason a DB has SQL is to keep it separate and generic from the application you are developing. To have your own DB built you need a storage mechanism (it could be files on the hard disk) with search options, so that you can access data immediately with the keywords you are interested in. on t...
1
0
0
If I want to build a custom database, how could I?
5
python,database
0
2011-04-28T18:21:00.000
Is it possible to save my in-memory sqlite database to hard disk? If it is possible, some python code would be awesome. Thanks in advance. EDIT: I succeeded this task by using apsw . It works like a charm. Thanks for your contribution.
19
6
1
0
false
5,832,180
0
13,227
3
0
0
5,831,548
Yes. When you create the connection to the database, replace :memory: with the path where you want to save the DB. sqlite uses caches for file based DBs, so this shouldn't be (much) slower.
1
0
0
python save in memory sqlite
7
python,sqlite
0
2011-04-29T11:43:00.000
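If you do start with :memory: and only later want a disk copy, the standard sqlite3 module (Python 3.7+) also exposes SQLite's backup API, so the answer's replace-the-path advice is no longer the only stdlib option:

```python
import os
import sqlite3
import tempfile

# Build an in-memory database.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE t (x INTEGER)")
mem.execute("INSERT INTO t VALUES (42)")
mem.commit()

# Copy the whole in-memory database into a file on disk.
path = os.path.join(tempfile.mkdtemp(), "saved.db")
disk = sqlite3.connect(path)
mem.backup(disk)
disk.close()

# Reopen the file and confirm the data survived.
check = sqlite3.connect(path)
value = check.execute("SELECT x FROM t").fetchone()[0]
print(value)  # 42
```

Connection.backup handles locking and ordering correctly, which is the same guarantee the APSW-based solutions in this thread rely on.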
Is it possible to save my in-memory sqlite database to hard disk? If it is possible, some python code would be awesome. Thanks in advance. EDIT: I succeeded this task by using apsw . It works like a charm. Thanks for your contribution.
19
13
1
0
false
5,925,061
0
13,227
3
0
0
5,831,548
(Disclosure: I am the APSW author) The only safe way to make a binary copy of a database is to use the backup API that is part of SQLite and is exposed by APSW. This does the right thing with ordering, locking and concurrency. To make a SQL (text) copy of a database, use the APSW shell, which includes a .dump ...
1
0
0
python save in memory sqlite
7
python,sqlite
0
2011-04-29T11:43:00.000
Is it possible to save my in-memory sqlite database to hard disk? If it is possible, some python code would be awesome. Thanks in advance. EDIT: I succeeded this task by using apsw . It works like a charm. Thanks for your contribution.
19
1
0.028564
0
false
5,831,644
0
13,227
3
0
0
5,831,548
Open a disk based database and just copy everything from one to the other.
1
0
0
python save in memory sqlite
7
python,sqlite
0
2011-04-29T11:43:00.000
In MySQL, I have two different databases -- let's call them A and B. Database A resides on server server1, while database B resides on server server2. Both servers {A, B} are physically close to each other, but are on different machines and have different connection parameters (different username, different password et...
29
4
0.26052
0
false
5,832,825
0
32,061
2
0
0
5,832,787
It is very simple - select data from one server, select data from another server and aggregate using Python. If you would like to have SQL query with JOIN - put result from both servers into separate tables in local SQLite database and write SELECT with JOIN.
1
0
0
MySQL -- Joins Between Databases On Different Servers Using Python?
3
python,mysql
0
2011-04-29T13:36:00.000
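The local-SQLite approach from the answer above might look like this sketch; the two hard-coded result sets stand in for rows fetched over MySQLdb from server1 and server2:

```python
import sqlite3

# Pretend these came from MySQLdb cursors on the two servers.
rows_a = [(1, "alice"), (2, "bob")]      # from database A on server1
rows_b = [(1, "admin"), (3, "guest")]    # from database B on server2

# Load both result sets into a scratch SQLite database and JOIN locally.
local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE a (id INTEGER, name TEXT)")
local.execute("CREATE TABLE b (id INTEGER, role TEXT)")
local.executemany("INSERT INTO a VALUES (?, ?)", rows_a)
local.executemany("INSERT INTO b VALUES (?, ?)", rows_b)

joined = local.execute(
    "SELECT a.name, b.role FROM a JOIN b ON a.id = b.id").fetchall()
print(joined)  # [('alice', 'admin')]
```

This keeps the JOIN in SQL rather than in hand-written Python aggregation code, at the cost of pulling both result sets over the network first.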
In MySQL, I have two different databases -- let's call them A and B. Database A resides on server server1, while database B resides on server server2. Both servers {A, B} are physically close to each other, but are on different machines and have different connection parameters (different username, different password et...
29
3
0.197375
0
false
5,832,954
0
32,061
2
0
0
5,832,787
No. It is not possible to do the join as you would like. But you may be able to sort something out by replicating one of the servers to the other for the individual database. One data set is under the control of one copy of MySQL and the other dataset is under the control of the other copy of MySQL. The query can only...
1
0
0
MySQL -- Joins Between Databases On Different Servers Using Python?
3
python,mysql
0
2011-04-29T13:36:00.000
Here's what I want to do. Develop a Django project on a development server with a development database. Run the south migrations as necessary when I change the model. Save the SQL from each migration, and apply those to the production server when I'm ready to deploy. Is such a thing possible with South? (I'd also be ...
28
50
1
0
false
5,897,509
1
10,745
2
0
0
5,833,418
You can at least inspect the sql generated by doing manage.py migrate --db-dry-run --verbosity=2. This will not do anything to the database and will show all the sql. I would still make a backup though, better safe than sorry.
1
0
0
Django - South - Is There a way to view the SQL it runs?
5
python,database,migration,django-south
0
2011-04-29T14:32:00.000
Here's what I want to do. Develop a Django project on a development server with a development database. Run the south migrations as necessary when I change the model. Save the SQL from each migration, and apply those to the production server when I'm ready to deploy. Is such a thing possible with South? (I'd also be ...
28
2
0.07983
0
false
5,932,967
1
10,745
2
0
0
5,833,418
I'd either do what Lutger suggested (and maybe write a log parser to strip out just the SQL), or I'd run my migration against a test database with logging enabled on the test DB. Of course, if you can run it against the test database, you're just a few steps away from validating the migration. If it passes, run it aga...
1
0
0
Django - South - Is There a way to view the SQL it runs?
5
python,database,migration,django-south
0
2011-04-29T14:32:00.000
In my server process, it looks like this: Main backend processes: Processes Huge list of files and , record them inside MySQL. On every 500 files done, it writes "Progress Report" to a separate file /var/run/progress.log like this "200/5000 files done" It is multi-processed with 4 children, each made sure to run on a ...
1
0
0
0
false
12,211,059
1
773
1
1
0
5,848,184
Quick advice: make sure (like, super sure) that you do close your file. So ALWAYS use a try-except-finally block for this. Remember that the contents of a finally block will ALWAYS be executed; that will save you a lot of head pain :)
1
0
0
If I open and read the file which is periodically written, can I/O deadlock occur?
2
python,linux,performance,io,deadlock
0
2011-05-01T11:55:00.000
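The try/finally advice above, sketched; note that a with block gives the same guarantee more idiomatically (the file path here is a temp stand-in for /var/run/progress.log):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "progress.log")

# try/finally guarantees the close even if the body raises.
f = open(path, "w")
try:
    f.write("200/5000 files done")
finally:
    f.close()

# Equivalent, and preferred: the context manager closes for us.
with open(path) as f:
    line = f.read()
print(f.closed, line)  # True 200/5000 files done
```

Either form ensures the reader never keeps a stale handle open on the progress file while the backend processes keep rewriting it.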
I'm working with Tornado and MongoDB and I would like to send a confirmation email to the user when he creates an account in my application. For the moment, I use a simple XHTML page with a form and I send information to my MongoDB database using Tornado. I would like to have an intermediate step which sends an email t...
5
6
1
0
false
7,483,440
1
3,734
1
1
0
5,862,238
I wonder why you would handle registration like that. The usual way to handle registration is: Write the user info to the database, but with an 'inactive' label attached to the user. Send an email to the user. If the user confirms the registration, then switch the user to 'active'. If you don't want to write to the d...
1
0
0
How can I send a user registration confirmation email using Tornado and MongoDB?
2
python,email,tornado
0
2011-05-02T20:41:00.000
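The inactive-until-confirmed flow described in the answer can be sketched storage-agnostically; the users dict below stands in for the MongoDB collection, and the token/confirm helpers are illustrative names, not Tornado or pymongo API:

```python
import secrets

users = {}  # stands in for a MongoDB collection


def register(email):
    """Store the user as inactive and return the confirmation token."""
    token = secrets.token_urlsafe(16)
    users[email] = {"active": False, "token": token}
    # Here you would email the user a link like /confirm?token=<token>.
    return token


def confirm(email, token):
    """Activate the user if the token from the emailed link matches."""
    user = users.get(email)
    if user and user["token"] == token:
        user["active"] = True
        return True
    return False


t = register("[email protected]")
ok = confirm("[email protected]", t)
print(ok, users["[email protected]"]["active"])  # True True
```

The Tornado handler for the confirmation URL then just calls the equivalent of confirm() and flips the flag in MongoDB.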
The identity map and unit of work patterns are part of the reasons sqlalchemy is much more attractive than django.db. However, I am not sure how the identity map would work, or if it works when an application is configured as wsgi and the orm is accessed directly through api calls, instead of a shared service. I would ...
6
0
0
0
false
5,869,588
1
2,298
1
0
0
5,869,514
So this all depends on how you set up your sqlalchemy connection. Normally you arrange for each WSGI request to have its own thread-local session. This session will know about all of its own goings-on, items added/changed/etc. However, each thread is not aware of the others. In this way the loading/preconfigu...
1
0
0
sqlalchemy identity map question
2
python,sqlalchemy,identity-map
0
2011-05-03T12:35:00.000
I have a data model called Game. In the Game model, I have two properties called player1 and player2 which are their names. I want to find a player in gamebut I don't know how to buil the query because gql does not support OR clause and then I can't use select * from Game where player1 = 'tom' or player2 = 'tom' statem...
3
0
0
0
false
10,265,451
1
631
1
0
0
5,875,881
Note that there is no performance gain in using Drew's schema, because queries on list properties must check for equality against all the elements of the list.
1
0
0
Google app engine gql query two properties with same string
3
python,google-app-engine
0
2011-05-03T21:24:00.000
Can i use Berkeley DB python classes in mobile phone directly , i mean Do DB python classes and methods are ready to be used in any common mobile phone like Nokia,Samsong (windows mobile)..etc. If a phone system supports python language, does that mean that it is easy and straightforward to use Berkeley DB on it...
0
1
1.2
0
true
5,888,966
0
159
1
0
0
5,888,854
Berkeley DB is a library that needs to be available. What you may have is Python bindings to Berkeley DB. If the library is not present, having Python will not help. Look for SQLite, which may be present (it is for iPhone) as it has SQL support and its library size is smaller than Berkeley DB, which makes it better sui...
1
0
0
Can use Berkeley DB in mobile phone
1
python,database,mobile,windows-mobile,berkeley-db
0
2011-05-04T19:30:00.000
A primary goal of a project I plan to bid on involves creating a Microsoft Access database using python. The main DB backend will be postgres, but the plan is to export an Access image. This will be a web app that'll take input from the user and go through a black box and output the results as an access db. The web app...
7
2
0.049958
0
false
5,925,032
0
3,427
4
0
0
5,891,359
The various answers to the duplicate question suggest that your "primary goal" of creating an MS Access database on a linux server is not attainable. Of course, such a goal is of itself not worthwhile at all. If you tell us what the users/consumers of the Access db are expected to do with it, maybe we can help you. Pos...
1
0
0
Building an MS Access database using python
8
python,linux,ms-access
0
2011-05-05T00:26:00.000
A primary goal of a project I plan to bid on involves creating a Microsoft Access database using python. The main DB backend will be postgres, but the plan is to export an Access image. This will be a web app that'll take input from the user and go through a black box and output the results as an access db. The web app...
7
0
0
0
false
5,954,299
0
3,427
4
0
0
5,891,359
Could you create a self-extracting file to send to the Windows user who has Microsoft Access installed? Include a blank .mdb file. dynamically build xml documents with tables, schema and data Include an import executable that will take all of the xml docs and import into the Access .mdb file. It's an extra step for t...
1
0
0
Building an MS Access database using python
8
python,linux,ms-access
0
2011-05-05T00:26:00.000
A primary goal of a project I plan to bid on involves creating a Microsoft Access database using python. The main DB backend will be postgres, but the plan is to export an Access image. This will be a web app that'll take input from the user and go through a black box and output the results as an access db. The web app...
7
2
0.049958
0
false
5,964,496
0
3,427
4
0
0
5,891,359
If you know this well enough: Python, its database modules, and ODBC configuration, then you should know how to do this: open a database, read some data, insert it into a different database. If so, then you are very close to your required solution. The trick is, you can open an MDB file as an ODBC data source. Now:...
1
0
0
Building an MS Access database using python
8
python,linux,ms-access
0
2011-05-05T00:26:00.000
A primary goal of a project I plan to bid on involves creating a Microsoft Access database using python. The main DB backend will be postgres, but the plan is to export an Access image. This will be a web app that'll take input from the user and go through a black box and output the results as an access db. The web app...
7
0
0
0
false
5,972,450
0
3,427
4
0
0
5,891,359
Well, looks to me like you need a copy of vmware server on the linux box running windows, a web service in the vm to write to access, and communications to it from the main linux box. You aren't going to find a means of creating an access db on Linux. Calling it a requirement isn't going to make it technically possible...
1
0
0
Building an MS Access database using python
8
python,linux,ms-access
0
2011-05-05T00:26:00.000
I'm working with the BeautifulSoup python library. I used the urllib2 library to download the HTML code from a page, and then I have parsed it with BeautifulSoup. I want to save some of the HTML content into a MySql table, but I'm having some problems with the encoding. The MySql table is encoded with 'utf-8' charset. ...
2
2
0.197375
0
false
5,903,100
1
693
1
0
0
5,902,914
BeautifulSoup returns all data as unicode strings. First, triple-check that the unicode strings are correct. If they are not, then there is some issue with the encoding of the input data.
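A minimal sketch of the check described above: verify the parsed text is valid Unicode, then encode it to UTF-8 before handing it to the database driver. The sample string here merely stands in for text extracted with BeautifulSoup.

```python
# What soup.get_text() might return: a Unicode string with non-ASCII chars.
text = "Caf\u00e9 \u2013 men\u00fa"

# If this is already correct Unicode, encoding to UTF-8 is lossless and
# produces bytes suitable for a utf-8 MySQL column.
payload = text.encode("utf-8")

# Round-trips without mojibake; if this assertion fails, the problem is
# upstream, in how the HTML was decoded, not in the database.
assert payload.decode("utf-8") == text
print(payload)
```

If the assertion fails, inspect the HTTP `Content-Type` charset of the page you downloaded before blaming the table encoding.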
1
0
0
Wrong encoding with Python BeautifulSoup + MySql
2
python,mysql,encoding,urllib2,beautifulsoup
0
2011-05-05T19:12:00.000
I'm using PyCrypto to store some files inside a SQLite database. I'm using 4 fields: the name of the file, the length of the file (in bytes), the SHA512 hash of the file, and the encrypted file (with AES and then base64 to ASCII). I need all the fields to show some info about the file without decrypting it. The question is ...
3
3
0.148885
0
false
5,919,875
0
484
3
0
0
5,919,819
Data encrypted with AES has the same length as the plain data (give or take some block padding), so giving the original length away doesn't harm security. SHA-512 is a strong cryptographic hash designed to reveal minimal information about the original content, so I don't see a problem here either. Therefore, I think your s...
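A small sketch of the arithmetic behind this answer, assuming AES in a padded block mode (PKCS#7): the ciphertext length is just the plaintext length rounded up to the 16-byte block boundary, so the stored length field leaks nothing an attacker couldn't infer, and the SHA-512 field is a fixed-size digest regardless of content.

```python
import hashlib

data = b"example file contents"   # stands in for the real file bytes

# PKCS#7 always pads to the next multiple of the 16-byte AES block size
# (adding at least one byte), so ciphertext length follows directly from
# plaintext length.
block = 16
padded_len = (len(data) // block + 1) * block

digest = hashlib.sha512(data).hexdigest()  # the stored SHA-512 field

print(len(data), padded_len, len(digest))  # 21 32 128
```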
1
0
1
Storing encrypted files inside a database
4
python,database,security,encryption
0
2011-05-07T07:56:00.000
I'm using PyCrypto to store some files inside a SQLite database. I'm using 4 fields: the name of the file, the length of the file (in bytes), the SHA512 hash of the file, and the encrypted file (with AES and then base64 to ASCII). I need all the fields to show some info about the file without decrypting it. The question is ...
3
1
1.2
0
true
5,920,346
0
484
3
0
0
5,919,819
To avoid any problems with the first few bytes being the same, you should use AES in a block cipher mode such as CBC with a random IV. This ensures that even if the first block (16 bytes for AES, regardless of key size) of two files is exactly the same, the ciphertext will be different. If you do that, I see no problem with y...
1
0
1
Storing encrypted files inside a database
4
python,database,security,encryption
0
2011-05-07T07:56:00.000
I'm using PyCrypto to store some files inside a SQLite database. I'm using 4 fields: the name of the file, the length of the file (in bytes), the SHA512 hash of the file, and the encrypted file (with AES and then base64 to ASCII). I need all the fields to show some info about the file without decrypting it. The question is ...
3
0
0
0
false
5,933,351
0
484
3
0
0
5,919,819
You really need to think about what attacks you want to protect against, and the resources of the possible attackers. In general, storing some data encrypted is only useful if it satisfies your exact requirements. In particular, if there is a way an attacker could compromise the key at the same time as the data, then t...
1
0
1
Storing encrypted files inside a database
4
python,database,security,encryption
0
2011-05-07T07:56:00.000
I have large text files upon which all kinds of operations need to be performed, mostly involving row by row validations. The data are generally of a sales / transaction nature, and thus tend to contain a huge amount of redundant information across rows, such as customer names. Iterating and manipulating this data ha...
3
1
1.2
0
true
5,931,175
0
501
1
0
0
5,931,151
Hash table resizing isn't a concern unless you have a requirement that each insert into the table should take the same amount of time. As long as you always expand the hash table size by a constant factor (e.g. always increasing the size by 50%), the computational cost of adding an extra element is amortized O(1). This...
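The amortized-O(1) claim above can be checked directly: if the table grows by a constant factor (here 2x), the total number of element copies performed across n inserts stays below 2n, so the average cost per insert is constant. This is a self-contained sketch of that argument, not any particular hash table implementation.

```python
def total_copies(n, factor=2):
    """Count element copies done by resizes while inserting n items."""
    copies, capacity, size = 0, 1, 0
    for _ in range(n):
        if size == capacity:
            copies += size          # rehash every element into a new table
            capacity *= factor
        size += 1
    return copies

n = 1_000_000
print(total_copies(n), 2 * n)       # total copies stays below 2n
```

The geometric series 1 + 2 + 4 + ... sums to less than 2n, which is exactly why doubling (or growing by 50%) keeps inserts amortized O(1).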
1
0
0
BST or Hash Table?
3
python,c,data-structures,file-io
0
2011-05-08T23:41:00.000
When I try to install python-mysql today, I get a number of compilation errors, or complaints that /Developer/SDKs/MacOSX10.4u.sdk was not found, like the following: running build running build_py copying MySQLdb/release.py -> build/lib.macosx-10.3-i386-2.6/MySQLdb running build_ext building '_mysql' extension Compiling with ...
0
0
1.2
0
true
5,936,425
0
358
1
1
0
5,935,910
Check your environment for CFLAGS or LDFLAGS. Both of these can include the -isysroot argument that influences SDK selection. The other place to start is the output of python2.6-config --cflags --ldflags, since (I believe) this influences the Makefile generation. Make sure to run easy_install w...
1
0
0
mac snow leopard setuptools stick to MacOSX10.4u.sdk when trying to install python-mysql
1
python,mysql,macos,osx-snow-leopard,compilation
0
2011-05-09T10:58:00.000
The Facts: I am working on a NoteBook with an Intel Core 2 Duo at 2.26 GHz and 4 GB of RAM. It has an Apache server and a MySQL server running. My server (I did lshw | less) shows a 64-bit CPU at 2.65 GHz and 4 GB of RAM, too. It has the XAMPP package running on it. The database structures (tables, indices, ...) a...
2
2
1.2
0
true
5,944,478
0
101
2
0
0
5,944,433
What kind of server? If you're renting a VPS or similar, you're contending with other users for CPU time. What platform is running on both? Tell us more about your situation!
1
0
0
How do I find why a python scripts runs in significantly different running times on different machines?
2
python,mysql,runtime
0
2011-05-10T02:07:00.000
The Facts: I am working on a NoteBook with an Intel Core 2 Duo at 2.26 GHz and 4 GB of RAM. It has an Apache server and a MySQL server running. My server (I did lshw | less) shows a 64-bit CPU at 2.65 GHz and 4 GB of RAM, too. It has the XAMPP package running on it. The database structures (tables, indices, ...) a...
2
0
0
0
false
5,956,131
0
101
2
0
0
5,944,433
I would check that the databases in question are of similar scope. You say they're the same structure, but are they sized similarly? If your test case only has 100 entries when production has 100000000, that's one huge potential area for performance problems.
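Before comparing run times, it helps to compare the data volume the script actually touches on each machine. This is a hedged sketch of that sanity check: two throwaway in-memory SQLite databases stand in for the test and production MySQL instances, and the table name `sales` is illustrative only.

```python
import sqlite3

def row_count(conn, table):
    """Return the number of rows in the given table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Simulate a small dev database and a much larger production one.
small, big = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn, n in ((small, 100), (big, 100_000)):
    conn.execute("CREATE TABLE sales (id INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?)", ((i,) for i in range(n)))

# A 1000x difference in row counts easily dwarfs any CPU-speed difference.
print(row_count(small, "sales"), row_count(big, "sales"))
```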
1
0
0
How do I find why a python scripts runs in significantly different running times on different machines?
2
python,mysql,runtime
0
2011-05-10T02:07:00.000
Sorry for my English in advance. I am a beginner with Cassandra and its data model. I am trying to insert one million rows into a Cassandra database locally on one node. Each row has 10 columns and I insert them into only one column family. With one thread, that operation took around 3 min. But I would like to do the same ...
0
0
0
1
false
5,950,881
0
1,686
4
0
0
5,950,427
It's possible you're hitting the python GIL but more likely you're doing something wrong. For instance, putting 2M rows in a single batch would be Doing It Wrong.
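The "right-sized batches" point above can be sketched as plain chunking: split the million rows into many modest batches rather than one enormous mutation. The batch size of 100 is an assumption, a common starting point to tune per cluster, not a Cassandra-mandated value.

```python
def batches(rows, size=100):
    """Yield successive fixed-size slices of rows; the last may be shorter."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

rows = list(range(1_000_000))
chunks = list(batches(rows, 100))
# Each chunk would become one batch insert instead of a single 1M-row batch.
print(len(chunks), len(chunks[0]))      # 10000 batches of 100 rows
```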
1
0
0
Insert performance with Cassandra
4
python,multithreading,insert,cassandra
0
2011-05-10T13:02:00.000
Sorry for my English in advance. I am a beginner with Cassandra and its data model. I am trying to insert one million rows into a Cassandra database locally on one node. Each row has 10 columns and I insert them into only one column family. With one thread, that operation took around 3 min. But I would like to do the same ...
0
0
0
1
false
5,956,519
0
1,686
4
0
0
5,950,427
Try running multiple clients in multiple processes, NOT threads. Then experiment with different insert sizes. 1M inserts in 3 mins is about 5500 inserts/sec, which is pretty good for a single local client. On a multi-core machine you should be able to get several times this amount provided that you use multiple client...
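The multi-process layout suggested above might look like the sketch below. `insert_chunk` is a hypothetical stand-in for the real client call; in practice each worker process would open its own Cassandra connection and insert its slice, sidestepping the GIL entirely.

```python
import multiprocessing

def insert_chunk(chunk):
    # Hypothetical: real code would connect to Cassandra here and
    # batch-insert `chunk` through its own connection.
    return len(chunk)                   # report rows "inserted"

if __name__ == "__main__":
    rows = list(range(100_000))
    # Carve the workload into one slice per batch of work.
    chunks = [rows[i:i + 10_000] for i in range(0, len(rows), 10_000)]
    with multiprocessing.Pool(processes=4) as pool:
        done = sum(pool.map(insert_chunk, chunks))
    print(done)                          # all 100000 rows accounted for
```

Processes, not threads, are what let a CPython client saturate a multi-core machine here.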
1
0
0
Insert performance with Cassandra
4
python,multithreading,insert,cassandra
0
2011-05-10T13:02:00.000