Dataset schema (each record below lists its values in this column order):

Column                             Type            Min / classes   Max
Question                           string length   25              7.47k
Q_Score                            int64           0               1.24k
Users Score                        int64           -10             494
Score                              float64         -1              1.2
Data Science and Machine Learning  int64           0               1
is_accepted                        bool            2 classes
A_Id                               int64           39.3k           72.5M
Web Development                    int64           0               1
ViewCount                          int64           15              1.37M
Available Count                    int64           1               9
System Administration and DevOps   int64           0               1
Networking and APIs                int64           0               1
Q_Id                               int64           39.1k           48M
Answer                             string length   16              5.07k
Database and SQL                   int64           1               1
GUI and Desktop Applications       int64           0               1
Python Basics and Environment      int64           0               1
Title                              string length   15              148
AnswerCount                        int64           1               32
Tags                               string length   6               90
Other                              int64           0               1
CreationDate                       string length   23              23
Is it even possible to create an abstraction layer that can accommodate relational and non-relational databases? The purpose of this layer is to minimize repetition and allows a web application to use any kind of database by just changing/modifying the code in one place (ie, the abstraction layer). The part that sits o...
2
0
1.2
0
true
3,649,176
1
1,721
3
0
0
3,606,215
Thank you for all the answers. To summarize the answers, currently only web2py and Django supports this kind of abstraction. It is not about a SQL-NoSQL holy grail, using abstraction can make the apps more flexible. Lets assume that you started a project using NoSQL, and then later on you need to switch over to SQL. I...
1
0
0
Is there any python web app framework that provides database abstraction layer for SQL and NoSQL?
5
python,sql,database,google-app-engine,nosql
0
2010-08-31T05:18:00.000
Is it even possible to create an abstraction layer that can accommodate relational and non-relational databases? The purpose of this layer is to minimize repetition and allows a web application to use any kind of database by just changing/modifying the code in one place (ie, the abstraction layer). The part that sits o...
2
1
0.039979
0
false
3,609,648
1
1,721
3
0
0
3,606,215
Regarding App Engine, all existing attempts limit you in some way (web2py doesn't support transactions or namespaces and probably many other stuff, for example). If you plan to work with GAE, use what GAE provides and forget looking for a SQL-NoSQL holy grail. Existing solutions are inevitably limited and affect perfor...
1
0
0
Is there any python web app framework that provides database abstraction layer for SQL and NoSQL?
5
python,sql,database,google-app-engine,nosql
0
2010-08-31T05:18:00.000
Is it even possible to create an abstraction layer that can accommodate relational and non-relational databases? The purpose of this layer is to minimize repetition and allows a web application to use any kind of database by just changing/modifying the code in one place (ie, the abstraction layer). The part that sits o...
2
1
0.039979
0
false
3,606,610
1
1,721
3
0
0
3,606,215
You may also check web2py; it supports relational databases and GAE in its core.
1
0
0
Is there any python web app framework that provides database abstraction layer for SQL and NoSQL?
5
python,sql,database,google-app-engine,nosql
0
2010-08-31T05:18:00.000
Can you recommend a high-performance, thread-safe and stable ORM for Python? The data I need to work with isn't complex, so SQLAlchemy is probably an overkill.
3
6
1.2
0
true
3,609,616
0
3,631
1
0
0
3,607,285
If you are looking for something thats high performance, and based on one of your comments "something that can handle >5k queries per second". You need to keep in mind that an ORM is not built specifically for speed and performance, it is built for maintainability and ease of use. If the data is so basic that even SqlA...
1
0
1
Fast, thread-safe Python ORM?
4
python,orm
0
2010-08-31T08:25:00.000
i'm using IronPython 2.6 for .Net4 to build an GUI logging application. This application received data via serialport and stores these data in an sqlite3 database while showing the last 100 received items in an listview. The listview gathers it's data via an SQL SELECT from the database every 100ms. It only querys data...
0
0
0
0
false
3,616,111
1
495
1
0
0
3,616,078
That is highly subjective without far more detailed requirements. You should be able to use any database with .NET support, whether out of the box (notably SQL Server Express and Compact) or installed separately (SQL Server's other editions, DB2, MySQL, Oracle, ...). Ten select commands per second should be easily...
1
0
0
IronPython - What kind of database is useable
2
database,ironpython
0
2010-09-01T08:04:00.000
I use Python and MySQLdb to download web pages and store them into database. The problem I have is that I can't save complicated strings in the database because they are not properly escaped. Is there a function in Python that I can use to escape a string for MySQL? I tried with ''' (triple simple quotes) and """, but ...
77
0
0
0
false
61,042,304
0
144,313
1
0
0
3,617,052
One other way to work around this when using mysqlclient in Python: suppose the data you want to enter looks like <ol><li><strong style="background-color: rgb(255, 255, 0);">Saurav\'s List</strong></li></ol>. It contains both double quotes and single quotes. You can use the following method...
1
0
0
Escape string Python for MySQL
7
python,mysql,escaping
0
2010-09-01T10:23:00.000
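Rather than escaping strings by hand, the usual fix is to let the DB-API driver bind parameters itself. A minimal sketch using the stdlib sqlite3 module; MySQLdb/mysqlclient follow the same pattern but use %s placeholders instead of ?:

```python
import sqlite3

# Parameter binding: the driver escapes the value itself, so quotes and
# backslashes in scraped page content cannot break the statement.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pages (url TEXT, body TEXT)")

tricky = 'Saurav\'s "List" with both quote types'
con.execute("INSERT INTO pages VALUES (?, ?)", ("http://example.com", tricky))

(body,) = con.execute("SELECT body FROM pages").fetchone()
```

The value round-trips unchanged, with no manual quoting anywhere.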
I'm having a right old nightmare with JPype. I have got my dev env on Windows and so tried installing it there with no luck. I then tried on Ubunto also with no luck. I'm getting a bit desperate now. I am using Mingw32 since I tried installing VS2008 but it told me I had to install XP SP2 but I am on Vista. I tried VS2...
3
1
0.066568
0
false
6,258,169
1
3,736
1
0
0
3,649,577
Edit the Setup.py and remove the /EHsc option.
1
1
0
JPype compile problems
3
java,python
0
2010-09-06T06:54:00.000
I have a Twisted application that runs in an x86 64bit machine with Win 2008 server. It needs to be connected to a SQL Server database that runs in another machine (in a cloud actually but I have IP, port, db name, credentials). Do I need to install anything more that Twisted to my machine? And which API should be use...
0
1
0.099668
0
false
4,059,366
0
1,128
1
1
0
3,657,271
If you want a portable MSSQL server library, you can try the module from www.pytds.com. It works with 2.5+ and 3.1 and has good stored procedure support. Its API is more "functional", and it has some good features you won't find anywhere else.
1
0
0
Twisted and connection to SQL Server
2
python,sql-server,twisted
0
2010-09-07T09:07:00.000
I'm creating a basic database utility class in Python. I'm refactoring an old module into a class. I'm now working on an executeQuery() function, and I'm unsure of whether to keep the old design or change it. Here are the 2 options: (The old design:) Have one generic executeQuery method that takes the query to execute...
3
4
0.26052
0
false
3,662,258
0
174
1
0
0
3,662,134
It's probably just me and my FP fetish, but I think a function executed solely for side effects is very different from a non-destructive function that fetches some data, and the two should therefore have different names. Especially if the generic function would do something different depending on exactly that (the part on the commit ...
1
0
0
Design question in Python: should this be one generic function or two specific ones?
3
python,oop
0
2010-09-07T19:47:00.000
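The split the answer advocates can be sketched as two small helpers, one for side-effecting statements and one for reads. The function names are illustrative, not from the original module:

```python
import sqlite3

def execute_command(con, sql, params=()):
    """Run a side-effecting statement (INSERT/UPDATE/DELETE/DDL) and commit."""
    con.execute(sql, params)
    con.commit()

def fetch_query(con, sql, params=()):
    """Run a SELECT and return all rows; never commits anything."""
    return con.execute(sql, params).fetchall()

con = sqlite3.connect(":memory:")
execute_command(con, "CREATE TABLE t (x INTEGER)")
execute_command(con, "INSERT INTO t VALUES (?)", (1,))
rows = fetch_query(con, "SELECT x FROM t")
```

The caller can now tell at a glance whether a call mutates the database.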
AFAIK SQLite returns unicode objects for TEXT in Python. Is it possible to get SQLite to return string objects instead?
3
0
0
0
false
25,273,292
0
7,275
2
0
0
3,666,328
Use Python 3.2+. It automatically returns str objects instead of the separate unicode type that Python 2.7 used.
1
0
0
Can I get SQLite to string instead of unicode for TEXT in Python?
3
python,string,sqlite,unicode
0
2010-09-08T09:31:00.000
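In Python 3's sqlite3 module, TEXT columns come back as str by default; setting the connection's text_factory is the documented way to get raw bytes instead. A minimal sketch:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (body TEXT)")
con.execute("INSERT INTO t VALUES ('hello')")

# Default: TEXT is returned as str.
(as_str,) = con.execute("SELECT body FROM t").fetchone()

# Ask the connection for raw bytes instead.
con.text_factory = bytes
(as_bytes,) = con.execute("SELECT body FROM t").fetchone()
```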
AFAIK SQLite returns unicode objects for TEXT in Python. Is it possible to get SQLite to return string objects instead?
3
4
0.26052
0
false
3,666,433
0
7,275
2
0
0
3,666,328
TEXT is intended to store text. Use BLOB if you want to store bytes.
1
0
0
Can I get SQLite to string instead of unicode for TEXT in Python?
3
python,string,sqlite,unicode
0
2010-09-08T09:31:00.000
I created a new Pylons project, and would like to use Cassandra as my database server. I plan on using Pycassa to be able to use cassandra 0.7beta. Unfortunately, I don't know where to instantiate the connection to make it available in my application. The goal would be to : Create a pool when the application is launc...
10
2
1.2
0
true
3,687,133
0
885
1
1
0
3,671,535
Well. I worked a little more. In fact, using a connection manager was probably not a good idea as this should be the template context. Additionally, opening a connection for each thread is not really a big deal. Opening a connection per request would be. I ended up with just pycassa.connect_thread_local() in app_global...
1
0
0
How to connect to Cassandra inside a Pylons app?
2
python,pylons,cassandra
0
2010-09-08T20:14:00.000
We have a Django project which runs on Google App Engine and used db.UserProperty in several models. We don't have an own User model. My boss would like to use RPXNow (Janrain) for authentication, but after I integrated it, the users.get_current_user() method returned None. It makes sense, because not Google authentica...
0
1
1.2
0
true
3,707,639
1
389
1
1
0
3,699,751
You can only get a User object if you're using one of the built-in authentication methods. User objects provide an interface to the Users API, which is handled by the App Engine infrastructure. If you're using your own authentication library, regardless of what protocol it uses, you will have to store user information ...
1
0
0
Google App Engine's db.UserProperty with rpxnow
2
python,google-app-engine,rpxnow
0
2010-09-13T10:55:00.000
Short story I have a technical problem with a third-party library at my hands that I seem to be unable to easily solve in a way other than creating a surrogate key (despite the fact that I'll never need it). I've read a number of articles on the Net discouraging the use of surrogate keys, and I'm a bit at a loss if it ...
2
2
1.2
0
true
3,713,061
0
698
3
0
0
3,712,949
I always make surrogate keys when using ORMs (or rather, I let the ORMs make them for me). They solve a number of problems, and don't introduce any (major) problems. So, you've done your job by acknowledging that there are "papers on the net" with valid reasons to avoid surrogate keys, and that there's probably a bette...
1
0
0
How badly should I avoid surrogate primary keys in SQL?
3
python,sqlalchemy,primary-key
0
2010-09-14T21:11:00.000
Short story I have a technical problem with a third-party library at my hands that I seem to be unable to easily solve in a way other than creating a surrogate key (despite the fact that I'll never need it). I've read a number of articles on the Net discouraging the use of surrogate keys, and I'm a bit at a loss if it ...
2
0
0
0
false
4,160,811
0
698
3
0
0
3,712,949
I use surrogate keys in a db that I use reflection on with sqlalchemy. The pro is that you can more easily manage the foreign keys / relationships that exists in your tables / models. Also, the rdbms is managing the data more efficiently. The con is the data inconsistency: duplicates. To avoid this - always use the uni...
1
0
0
How badly should I avoid surrogate primary keys in SQL?
3
python,sqlalchemy,primary-key
0
2010-09-14T21:11:00.000
Short story I have a technical problem with a third-party library at my hands that I seem to be unable to easily solve in a way other than creating a surrogate key (despite the fact that I'll never need it). I've read a number of articles on the Net discouraging the use of surrogate keys, and I'm a bit at a loss if it ...
2
0
0
0
false
3,713,270
0
698
3
0
0
3,712,949
"Using a surrogate key allows duplicates to be created when using a natural key would have prevented such problems" Exactly, so you should have both keys, not just a surrogate. The error you seem to be making is not that you are using a surrogate, it's that you are assuming the table only needs one key. Make sure you c...
1
0
0
How badly should I avoid surrogate primary keys in SQL?
3
python,sqlalchemy,primary-key
0
2010-09-14T21:11:00.000
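The "have both keys" advice above can be sketched in SQLite DDL: a surrogate primary key for the ORM plus a UNIQUE constraint on the natural key, so duplicates remain impossible:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,    -- surrogate key for the ORM
        email TEXT NOT NULL UNIQUE    -- natural key, still enforced
    )
""")
con.execute("INSERT INTO users (email) VALUES ('a@example.com')")
try:
    # A second row with the same natural key is rejected even though
    # the surrogate id would have been different.
    con.execute("INSERT INTO users (email) VALUES ('a@example.com')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False
```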
I have created a database in PostgreSQL, let's call it testdb. I have a generic set of tables inside this database, xxx_table_one, xxx_table_two and xxx_table_three. Now, I have Python code where I want to dynamically create and remove "sets" of these 3 tables to my database with a unique identifier in the table name d...
7
3
0.197375
0
false
3,715,621
0
5,102
2
0
0
3,715,456
PostgreSQL doesn't impose a direct limit on this, your OS does (it depends on maximum directory size) This may depend on your OS as well. Some filesystems get slower with large directories. PostgreSQL won't be able to optimize queries if they're across different tables. So using less tables (or a single table) should b...
1
0
0
Is there a limitation on the number of tables a PostgreSQL database can have?
3
python,mysql,database,database-design,postgresql
0
2010-09-15T07:15:00.000
I have created a database in PostgreSQL, let's call it testdb. I have a generic set of tables inside this database, xxx_table_one, xxx_table_two and xxx_table_three. Now, I have Python code where I want to dynamically create and remove "sets" of these 3 tables to my database with a unique identifier in the table name d...
7
0
0
0
false
5,603,789
0
5,102
2
0
0
3,715,456
If your data were not related, I think your tables could be in different schema, and then you would use SET search_path TO schema1, public for example, this way you wouldn't have to dynamically generate table names in your queries. I am planning to try this structure on a large database which stores logs and other trac...
1
0
0
Is there a limitation on the number of tables a PostgreSQL database can have?
3
python,mysql,database,database-design,postgresql
0
2010-09-15T07:15:00.000
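The schema-per-set layout suggested above can be sketched with a small helper that only builds the SQL strings; the helper name and the set_ naming scheme are hypothetical, and you would run the statements through psycopg2 (or any DB-API driver) against a real server. Note that schema identifiers cannot be bound as query parameters, so set_id should be validated against an allowlist before formatting:

```python
def schema_statements(set_id):
    """Build the DDL/session SQL for one 'set' of tables in its own schema.

    set_id must be validated (e.g. against an allowlist) by the caller,
    since identifiers cannot be passed as bound parameters.
    """
    schema = "set_%s" % set_id
    return [
        "CREATE SCHEMA IF NOT EXISTS %s" % schema,
        "SET search_path TO %s, public" % schema,
    ]

stmts = schema_statements("client42")
```

With search_path set, queries can keep using the plain table names instead of dynamically generated ones.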
I am into a project where zope web server is used. With this PostgreSQL database is used. But I am not able to add a new PostgreSQL connection via zope. Actually, I am not aware of what else I need to install so that I can use PostgreSQL dB with zope. From whatever I have explored about this I have come to know that I ...
3
0
0
0
false
3,719,408
0
110
1
0
0
3,719,145
Look at psycopg, it ships with a Zope Database Adapter.
1
0
0
What are the essentials I need to install if I want to use PostgreSQL DB with zope? for eg: Zope Database Adapter?
2
python,zope
0
2010-09-15T15:24:00.000
I'm running a web crawler that gets called as a separate thread via Django. When it tries to store the scraped information I get this error: File "/usr/lib/pymodules/python2.6/MySQLdb/cursors.py", line 147, in execute charset = db.character_set_name() InterfaceError: (0, '') If I manually run the script from the co...
0
0
1.2
0
true
3,722,799
1
1,854
1
0
0
3,722,120
Since it mentions the character set, my gut says you are running a different Django/Python/something from the command line than you are from the webserver. In your settings file, turn on DEBUG=True, restart the server, and then run this again. In particular, look at the list of paths shown. If they are not exactly what...
1
0
0
MySQL interface error only occurring if run in Django
2
python,mysql,django,multithreading
0
2010-09-15T21:49:00.000
I have zope 2.11 installed. Now i want to use Posgresql 7.4.13 DB with it. So i know i need to install psycopg2 Database Adapter. Can any one tell me Is psycopg2 compatible with zope2??
1
1
0.197375
0
false
4,018,666
0
142
1
0
0
3,725,699
Yes, you can use psycopg2 with Zope2. Just install it in your Python with easy_install or setup.py. You will also need a matching ZPsycopgDA Product in Zope. You find the ZPsycopgDA folder in the psycopg2 source distribution tarball.
1
0
0
Is Zpsycopg2 compatible with zope 2?
1
python,database,zope
0
2010-09-16T10:19:00.000
I am building an application with objects which have their data stored in mysql tables (across multiple tables). When I need to work with the object (retrieve object attributes / change the attributes) I am querying the sql database using mysqldb (select / update). However, since the application is quite computation in...
2
5
0.462117
0
false
3,770,439
0
1,287
1
0
0
3,770,394
25Mb is tiny. Microscopic. SQL is slow. Glacial. Do not waste time on SQL unless you have transactions (with locking and multiple users). If you're doing "analysis", especially computationally-intensive analysis, load all the data into memory. In the unlikely event that data doesn't fit into memory, then do this. Q...
1
0
0
Optimizing Python Code for Database Access
2
python,mysql,optimization
0
2010-09-22T14:43:00.000
We've worked hard to work up a full dimensional database model of our problem, and now it's time to start coding. Our previous projects have used hand-crafted queries constructed by string manipulation. Is there any best/standard practice for interfacing between python and a complex database layout? I've briefly evalua...
10
3
0.197375
0
false
3,782,627
0
3,376
3
0
0
3,782,386
I'm using SQLAlchemy with a pretty big datawarehouse and I'm using it for the full ETL process with success. Specially in certain sources where I have some complex transformation rules or with some heterogeneous sources (such as web services). I'm not using the Sqlalchemy ORM but rather using its SQL Expression Languag...
1
0
0
Python: interact with complex data warehouse
3
python,django-models,sqlalchemy,data-warehouse,olap
0
2010-09-23T20:40:00.000
We've worked hard to work up a full dimensional database model of our problem, and now it's time to start coding. Our previous projects have used hand-crafted queries constructed by string manipulation. Is there any best/standard practice for interfacing between python and a complex database layout? I've briefly evalua...
10
2
0.132549
0
false
3,782,432
0
3,376
3
0
0
3,782,386
SQLAlchemy, definitely. Compared to SQLAlchemy, all other ORMs look like children's toys, especially the Django ORM. What Hibernate is to Java, SQLAlchemy is to Python.
1
0
0
Python: interact with complex data warehouse
3
python,django-models,sqlalchemy,data-warehouse,olap
0
2010-09-23T20:40:00.000
We've worked hard to work up a full dimensional database model of our problem, and now it's time to start coding. Our previous projects have used hand-crafted queries constructed by string manipulation. Is there any best/standard practice for interfacing between python and a complex database layout? I've briefly evalua...
10
6
1
0
false
3,782,509
0
3,376
3
0
0
3,782,386
Don't get confused by your requirements. One size does not fit all. load large amounts of data relatively quickly Why not use the databases's native loaders for this? Use Python to prepare files, but use database tools to load. You'll find that this is amazingly fast. update/insert small amounts of data quickly...
1
0
0
Python: interact with complex data warehouse
3
python,django-models,sqlalchemy,data-warehouse,olap
0
2010-09-23T20:40:00.000
Howdie stackoverflow people! So I've been doing some digging regarding these NoSQL databases, MongoDB, CouchDB etc. Though I am still not sure about real time-ish stuff therefore I thought i'd ask around to see if someone have any practical experience. Let's think about web stuff, let's say we've got a very dynamic sup...
2
0
0
0
false
3,799,207
1
1,738
2
0
0
3,798,728
It depends heavily on the server running said NoSQL solution, amount of data etc... I have played around with Mongo a bit and it is very easy to setup multiple servers to run simultaneously and you would most likely be able to accomplish high concurrency by starting multiple instances on the same box and having them ac...
1
0
0
MongoDB for realtime ajax stuff?
2
php,python,ajax,mongodb,real-time
0
2010-09-26T16:31:00.000
Howdie stackoverflow people! So I've been doing some digging regarding these NoSQL databases, MongoDB, CouchDB etc. Though I am still not sure about real time-ish stuff therefore I thought i'd ask around to see if someone have any practical experience. Let's think about web stuff, let's say we've got a very dynamic sup...
2
2
1.2
0
true
3,801,074
1
1,738
2
0
0
3,798,728
Let's say we have 5000 users at the same time, every 5, 10 or 20 seconds ajax requests that updates various interfaces. OK, so to get this right, you're talking about 250 to 1000 writes per second? Yeah, MongoDB can handle that. The real key on performance is going to be whether or not these are queries, updates...
1
0
0
MongoDB for realtime ajax stuff?
2
php,python,ajax,mongodb,real-time
0
2010-09-26T16:31:00.000
I've heard of redis-cache but how exactly does it work? Is it used as a layer between django and my rdbms, by caching the rdbms queries somehow? Or is it supposed to be used directly as the database? Which I doubt, since that github page doesn't cover any login details, no setup.. just tells you to set some config pro...
107
61
1
0
false
7,722,260
1
75,541
1
0
0
3,801,379
Just because Redis stores things in-memory does not mean that it is meant to be a cache. I have seen people using it as a persistent store for data. That it can be used as a cache is a hint that it is useful as high-performance storage. If your Redis system goes down, though, you might lose data that had not been writ...
1
0
0
How can I use redis with Django?
5
python,django,redis
0
2010-09-27T05:48:00.000
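As a cache layer between Django and the RDBMS, the wiring is a settings.py fragment. This sketch uses the Redis cache backend that ships with Django 4.0+; on older versions the third-party django-redis package fills the same role, and the LOCATION URL is an assumption about where your Redis instance runs:

```python
# settings.py fragment: point Django's cache framework at Redis.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",  # assumed local instance
    }
}
```

Views and the per-site/per-view cache middleware then use Redis transparently; the RDBMS remains the source of truth.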
FTS3/FTS4 doesn't work in python by default (up to 2.7). I get the error: sqlite3.OperationalError: no such module: fts3 or sqlite3.OperationalError: no such module: fts4 How can this be resolved?
13
0
0
0
false
12,372,189
0
6,571
2
0
0
3,823,659
What Naveen said, but => For Windows installations: while running setup.py for package installations, Python 2.7 searches for an installed Visual Studio 2008. You can trick Python into using a newer Visual Studio by setting SET VS90COMNTOOLS=%VS100COMNTOOLS% before calling setup.py.
1
0
0
How to setup FTS3/FTS4 with python2.7 on Windows
4
python,sqlite,full-text-search,fts3,fts4
0
2010-09-29T16:22:00.000
FTS3/FTS4 doesn't work in python by default (up to 2.7). I get the error: sqlite3.OperationalError: no such module: fts3 or sqlite3.OperationalError: no such module: fts4 How can this be resolved?
13
2
0.099668
0
false
3,826,412
0
6,571
2
0
0
3,823,659
Never mind; installing pysqlite from source was easy and sufficient: python setup.py build_static install. FTS3 is enabled by default when installing from source.
1
0
0
How to setup FTS3/FTS4 with python2.7 on Windows
4
python,sqlite,full-text-search,fts3,fts4
0
2010-09-29T16:22:00.000
I have a 100 mega bytes sqlite db file that I would like to load to memory before performing sql queries. Is it possible to do that in python? Thanks
8
2
0.099668
0
false
25,521,707
0
12,928
1
0
0
3,826,552
If you are using Linux, you can try tmpfs, which is a memory-based file system. It's very easy to use: mount tmpfs on a directory, copy the sqlite db file to that directory, and open it as a normal sqlite db file. Remember, anything in tmpfs will be lost after a reboot, so you may want to copy the db file back to disk if it changed.
1
0
0
In python, how can I load a sqlite db completely to memory before connecting to it?
4
python,sql,memory,sqlite
0
2010-09-29T23:10:00.000
I have been working on developing this analytical tool to help interpret and analyze a database that is bundled within the package. It is very important for us to secure the database in a way that can only be accessed with our software. What is the best way of achieving it in Python? I am aware that there may not be a...
4
3
1.2
0
true
3,850,560
0
4,184
1
0
0
3,848,658
This question comes up on the SQLite users mailing list about once a month. No matter how much encryption etc you do, if the database is on the client machine then the key to decrypt will also be on the machine at some point. An attacker will be able to get that key since it is their machine. A better way of looking a...
1
0
1
Encrypting a Sqlite db file that will be bundled in a pyexe file
2
python,database,sqlite,encryption
0
2010-10-03T04:55:00.000
I have an existing sqlite3 db file, on which I need to make some extensive calculations. Doing the calculations from the file is painfully slow, and as the file is not large (~10 MB), so there should be no problem to load it into memory. Is there a Pythonic way to load the existing file into memory in order to speed up...
72
-1
-0.019997
0
false
3,850,164
0
46,619
2
0
0
3,850,022
sqlite supports in-memory databases. In python, you would use a :memory: database name for that. Perhaps you could open two databases (one from the file, an empty one in-memory), migrate everything from the file database into memory, then use the in-memory database further to do calculations.
1
0
0
How to load existing db file to memory in Python sqlite3?
10
python,performance,sqlite
0
2010-10-03T13:55:00.000
I have an existing sqlite3 db file, on which I need to make some extensive calculations. Doing the calculations from the file is painfully slow, and as the file is not large (~10 MB), so there should be no problem to load it into memory. Is there a Pythonic way to load the existing file into memory in order to speed up...
72
0
0
0
false
57,569,063
0
46,619
2
0
0
3,850,022
With Cenk Alti's solution I always got a MemoryError with Python 3.7 when the process reached 500MB. Only by using the backup functionality of sqlite3 (mentioned by thinwybk) was I able to load and save bigger SQLite databases. You can also do the same with just 3 lines of code, both ways.
1
0
0
How to load existing db file to memory in Python sqlite3?
10
python,performance,sqlite
0
2010-10-03T13:55:00.000
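The sqlite3 backup functionality referred to above (available since Python 3.7) copies a file-backed database into :memory: in one call. A minimal sketch; the file path here is a throwaway temp database created just for the demo:

```python
import os
import sqlite3
import tempfile

# Build a small file-backed database to stand in for the existing 10 MB file.
path = os.path.join(tempfile.mkdtemp(), "existing.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE t (x INTEGER)")
disk.execute("INSERT INTO t VALUES (42)")
disk.commit()

# Copy the whole database into memory, then work against the copy.
mem = sqlite3.connect(":memory:")
disk.backup(mem)
disk.close()

(x,) = mem.execute("SELECT x FROM t").fetchone()
```

Running mem.backup(disk_connection) in the other direction saves results back to the file afterwards.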
Looking around for a noSQL database implementation that has an ORM syntax (pref. like Django's), lets me store and retrieve nested dictionary attributes but written entirely in Python to ease deployment and avoids Javascript syntax for map/reduce. Even better if it has a context-aware (menus), python-based console, as ...
4
2
0.099668
0
false
3,865,523
0
1,830
1
0
0
3,865,283
I don't know about a noSQL solution, but sqlite+sqlalchemy's ORM works pretty well for me. As long as it gives you the interface and features you need, I don't see a reason to care whether it uses sql internally.
1
0
1
Pure Python implementation of MongoDB?
4
python,mongodb,nosql
0
2010-10-05T15:39:00.000
I'm using Python and SQLAlchemy to query a SQLite FTS3 (full-text) store and I would like to prevent my users from using the - as an operator. How should I escape the - so users can search for a term containing the - (enabled by changing the default tokenizer) instead of it signifying "does not contain the term followi...
2
1
0.099668
0
false
3,942,449
0
1,200
1
0
0
3,865,733
From elsewhere on the internet it seems it may be possible to surround each search term with double quotes "some-term". Since we do not need the subtraction operation, my solution was to replace hyphens - with underscores _ when populating the search index and when performing searches.
1
0
0
How do I escape the - character in SQLite FTS3 queries?
2
python,sqlite,sqlalchemy,fts3
0
2010-10-05T16:32:00.000
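The quoting workaround mentioned above can be sketched as a small query-sanitizing helper (the function name is illustrative): wrapping each term in double quotes makes FTS treat a leading "-" as part of the token rather than as the NOT operator:

```python
def quote_fts_terms(user_query):
    """Wrap each whitespace-separated term in double quotes for FTS MATCH.

    Embedded double quotes are dropped so a term cannot break out of its
    quoting and smuggle in operators.
    """
    terms = user_query.split()
    return " ".join('"%s"' % t.replace('"', "") for t in terms)

q = quote_fts_terms("full-text -search")
```

The returned string is what you would pass as the MATCH expression.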
I'm trying to restore the current working database to the data stored in a .sql file from within Django. Whats the best way to do this? Does django have an good way to do this or do I need to grab the connection string from the settings.py file and send command line mysql commands to do this? Thanks for your help.
0
1
1.2
0
true
3,868,544
1
182
1
0
0
3,866,989
You can't import SQL dumps through Django; import them through MySQL directly. If you run MySQL locally, you can find various graphical MySQL clients that can help you with doing so; if you need to do it remotely, find out if your server has any web interfaces installed for that!
1
0
0
How do I replace the current working MySQL database with a .sql file?
2
python,mysql,django
0
2010-10-05T19:28:00.000
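Driving the mysql command-line client from Python is one way to follow the advice above. A sketch with a hypothetical helper that only builds the argument list; you would feed the dump on stdin, e.g. subprocess.run(cmd, stdin=open("backup.sql"), env=env), and pass the password via the MYSQL_PWD environment variable (honored by the mysql client) rather than on the command line:

```python
import os

def restore_command(db, user, host="127.0.0.1"):
    """Build the mysql client invocation for restoring a dump via stdin."""
    return ["mysql", "--user", user, "--host", host, db]

cmd = restore_command("mydb", "webapp")
env = dict(os.environ, MYSQL_PWD="secret")  # placeholder credentials
```

In a Django project, the database name and user would come from settings.DATABASES instead of literals.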
I just want to use an entity and modify it to show something, but I don't want the changes written to the db. However, after I use it, a session.commit() somewhere else adds this entity to the db. I don't want this to happen; can anyone help me?
0
1
1.2
0
true
3,896,280
1
85
1
0
0
3,881,364
You can expunge the object from the session before modifying it; these changes then won't be applied on subsequent commits unless you add the object back to the session. Just call session.expunge(obj).
1
0
0
use sqlalchemy entity isolately
1
python,sqlalchemy,entity
0
2010-10-07T11:56:00.000
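The expunge pattern can be sketched end to end. This assumes SQLAlchemy 1.4+ (for session.get and the sqlalchemy.orm location of declarative_base); model and attribute names are illustrative:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
session.add(Item(id=1, name="original"))
session.commit()

obj = session.get(Item, 1)
session.expunge(obj)          # detach the object before modifying it
obj.name = "display-only"     # edits to a detached object are never flushed
session.commit()

fresh = session.get(Item, 1)  # re-query: the db row is untouched
```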
I am trying out Sphinx search in my Django project. All setup done & it works but need some clarification from someone who has actually used this setup. In my Sphinx search while indexing, I have used 'name' as the field in my MySQL to be searchable & all other fields in sql_query to be as attributes (according to Sphi...
0
1
1.2
0
true
4,121,651
1
395
1
0
0
3,897,650
Here's a shot in the dark: try making the name of your index in sphinx.conf the same as the table_name you are trying to index. This is a quirk which is missed by a lot of people.
1
0
0
Django Sphinx Text Search
1
python,django,search,full-text-search,django-sphinx
0
2010-10-09T20:02:00.000
I'm currently busy making a Python ORM which gets all of its information from a RDBMS via introspection (I would go with XRecord if I was happy with it in other respects) — meaning, the end-user only tells which tables/views to look at, and the ORM does everything else automatically (if it makes you actually write some...
2
0
1.2
0
true
3,902,410
0
241
3
0
0
3,901,961
So far, I see the only one technique covering more than two tables in relation. A table X is assumed related to table Y, if and only if X is referenced to Y no more than one table away. That is: "Zero tables away" means X contains the foreign key to Y. No big deal, that's how we detect many-to-ones. "One table away" me...
1
0
0
What are methods of programmatically detecting many-to-many relationships in a RDMBS?
3
python,orm,metaprogramming,introspection,relationships
0
2010-10-10T19:59:00.000
I'm currently busy making a Python ORM which gets all of its information from a RDBMS via introspection (I would go with XRecord if I was happy with it in other respects) — meaning, the end-user only tells which tables/views to look at, and the ORM does everything else automatically (if it makes you actually write some...
2
1
0.066568
0
false
3,902,041
0
241
3
0
0
3,901,961
If you have to ask, you shouldn't be doing this. I'm not saying that to be cruel, but Python already has several excellent ORMs that are well-tested and widely used. For example, SQLAlchemy supports the autoload=True attribute when defining tables that makes it read the table definition - including all the stuff you're...
1
0
0
What are methods of programmatically detecting many-to-many relationships in a RDMBS?
3
python,orm,metaprogramming,introspection,relationships
0
2010-10-10T19:59:00.000
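The reflection feature the answer points at can be sketched briefly. autoload_with is the SQLAlchemy 1.4+ spelling of the older autoload=True; the table names here are illustrative. The definition, including foreign keys, is read from the live database instead of being declared by hand:

```python
from sqlalchemy import MetaData, Table, create_engine, text

# Stand-in schema: book references author via a foreign key.
engine = create_engine("sqlite://")
with engine.begin() as con:
    con.execute(text("CREATE TABLE author (id INTEGER PRIMARY KEY)"))
    con.execute(text(
        "CREATE TABLE book (id INTEGER PRIMARY KEY,"
        " author_id INTEGER REFERENCES author(id))"))

# Reflect: column and FK information is introspected from the database.
meta = MetaData()
book = Table("book", meta, autoload_with=engine)
cols = [c.name for c in book.columns]
fks = [fk.column.table.name for fk in book.foreign_keys]
```

Detecting relationships then becomes a matter of walking the reflected foreign keys rather than reimplementing introspection.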
I'm currently busy making a Python ORM which gets all of its information from a RDBMS via introspection (I would go with XRecord if I was happy with it in other respects) — meaning, the end-user only tells which tables/views to look at, and the ORM does everything else automatically (if it makes you actually write some...
2
0
0
0
false
3,902,030
0
241
3
0
0
3,901,961
Theoretically, any table with multiple foreign keys is in essence a many-to-many relation, which makes your question trivial. I suspect that what you need is a heuristic of when to use MTM patterns (rather than standard classes) in the object model. In that case, examine what are the limitations of the patterns you cho...
1
0
0
What are methods of programmatically detecting many-to-many relationships in a RDMBS?
3
python,orm,metaprogramming,introspection,relationships
0
2010-10-10T19:59:00.000
Well, the question pretty much summarises it. My db activity is very update intensive, and I want to programmatically issue a Vacuum Analyze. However I get an error that says that the query cannot be executed within a transaction. Is there some other way to do it?
9
14
1.2
0
true
3,932,055
0
5,030
1
0
0
3,931,951
This is a flaw in the Python DB-API: it starts a transaction for you. It shouldn't do that; whether and when to start a transaction should be up to the programmer. Low-level, core APIs like this shouldn't babysit the developer and do things like starting transactions behind our backs. We're big boys--we can start tr...
1
0
0
Is it possible to issue a "VACUUM ANALYZE " from psycopg2 or sqlalchemy for PostgreSQL?
2
python,postgresql,sqlalchemy,psycopg2,vacuum
0
2010-10-14T09:49:00.000
Hi, so this is what I understand about how OpenID works: the user enters his OpenID URL on the site, say "hii.com". The app does a redirect to the OpenID provider, which either does the login or denies it and sends the response back to the site, i.e. "hii.com". If authentication was successful then the response object provided by the ...

3
1
1.2
0
true
3,937,506
0
488
1
0
0
3,937,456
upd.: my previous answer was wrong The store you are referring to is where your app stores the data during auth. Storing it in a shared memcached instance should be the best option (faster than db and reliable enough).
1
0
0
what is the concept of store in OpenID
1
python,openid,store
0
2010-10-14T20:51:00.000
... vs declarative sqlalchemy ?
7
1
0.066568
0
false
3,975,114
0
4,024
1
0
0
3,957,938
The Elixir syntax is something I find useful when building a database for a given app from scratch and everything is all figured out beforehand. I have had my best luck with SQLAlchemy when using it on legacy databases (and on other similarly logistically immutable schemas). Particularly useful is the plugin SQLSoup, f...
1
0
0
What are the benefits of using Elixir
3
python,sqlalchemy,python-elixir
0
2010-10-18T09:36:00.000
I will be writing a little Python script tomorrow, to retrieve all the data from an old MS Access database into a CSV file first, and then after some data cleansing, munging etc, I will import the data into a mySQL database on Linux. I intend to use pyodbc to make a connection to the MS Access db. I will be running the...
1
5
1.2
0
true
3,964,635
0
8,978
3
0
0
3,964,378
Memory usage for csv.reader and csv.writer isn't proportional to the number of records, as long as you iterate correctly and don't try to load the whole file into memory. That's one reason the iterator protocol exists. Similarly, csv.writer writes directly to disk; it's not limited by available memory. Y...
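The row-at-a-time pattern this answer describes can be sketched like this (StringIO objects stand in for the real export and import files so the snippet is self-contained):

```python
import csv
import io

# Stand-ins for the Access-export file and the cleaned output file.
src = io.StringIO("id,name\n1,Alice\n2,Bob\n")
dst = io.StringIO()

reader = csv.reader(src)
writer = csv.writer(dst)
for row in reader:          # only one row is ever held in memory
    writer.writerow(row)    # written out immediately, not buffered whole
```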
1
0
0
is there a limit to the (CSV) filesize that a Python script can read/write?
4
python,ms-access,csv,odbc
0
2010-10-18T23:49:00.000
I will be writing a little Python script tomorrow, to retrieve all the data from an old MS Access database into a CSV file first, and then after some data cleansing, munging etc, I will import the data into a mySQL database on Linux. I intend to use pyodbc to make a connection to the MS Access db. I will be running the...
1
1
0.049958
0
false
3,964,404
0
8,978
3
0
0
3,964,378
I wouldn't bother using an intermediate format. Pulling from Access via ADO and inserting right into MySQL really shouldn't be an issue.
1
0
0
is there a limit to the (CSV) filesize that a Python script can read/write?
4
python,ms-access,csv,odbc
0
2010-10-18T23:49:00.000
I will be writing a little Python script tomorrow, to retrieve all the data from an old MS Access database into a CSV file first, and then after some data cleansing, munging etc, I will import the data into a mySQL database on Linux. I intend to use pyodbc to make a connection to the MS Access db. I will be running the...
1
0
0
0
false
3,964,398
0
8,978
3
0
0
3,964,378
The only limit should be operating system file size. That said, make sure when you send the data to the new database, you're writing it a few records at a time; I've seen people do things where they try to load the entire file first, then write it.
1
0
0
is there a limit to the (CSV) filesize that a Python script can read/write?
4
python,ms-access,csv,odbc
0
2010-10-18T23:49:00.000
As the title says, what is the equivalent of Python's '%s %s' % (first_string, second_string) in SQLite? I know I can do concatenation like first_string || " " || second_string, but it looks very ugly.
0
0
1.2
0
true
3,976,347
0
2,632
2
0
0
3,976,313
There isn't one.
1
0
0
SQLite equivalent of Python's "'%s %s' % (first_string, second_string)"
5
python,sqlite,string
0
2010-10-20T09:16:00.000
As the title says, what is the equivalent of Python's '%s %s' % (first_string, second_string) in SQLite? I know I can do concatenation like first_string || " " || second_string, but it looks very ugly.
0
2
0.07983
0
false
3,976,353
0
2,632
2
0
0
3,976,313
I can understand not liking first_string || ' ' || second_string, but that's the equivalent. Standard SQL (which SQLite speaks in this area) just isn't the world's prettiest string manipulation language. You could try getting the results of the query back into some other language (e.g., Python which you appear to like)...
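Doing the formatting back in Python, as this answer suggests, is straightforward because a fetched row is a tuple that unpacks directly into a format string (the table and column names here are made up for the sketch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (first TEXT, last TEXT)")
conn.execute("INSERT INTO people VALUES ('Ada', 'Lovelace')")

row = conn.execute("SELECT first, last FROM people").fetchone()
full_name = "%s %s" % row      # the row tuple feeds %-formatting directly

# The pure-SQL route remains the concatenation the question disliked:
sql_name = conn.execute(
    "SELECT first || ' ' || last FROM people").fetchone()[0]
```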
1
0
0
SQLite equivalent of Python's "'%s %s' % (first_string, second_string)"
5
python,sqlite,string
0
2010-10-20T09:16:00.000
I've been asked to encrypt various db fields within the db. Problem is that these fields need be decrypted after being read. I'm using Django and SQL Server 2005. Any good ideas?
17
2
0.099668
0
false
3,979,447
1
13,590
2
0
0
3,979,385
If you are storing things like passwords, you can do this: store users' passwords as their SHA-256 hashes; when a user logs in, get the password they supply, hash it, and check it against the stored hash. You can create a SHA-256 hash in Python by using the hashlib module. Hope this helps
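A minimal sketch of the hashing step this answer describes, using hashlib as it mentions. Note that plain unsalted SHA-256 is exactly what the answer proposes; a real deployment would add a per-user salt or a slow KDF such as `hashlib.pbkdf2_hmac`:

```python
import hashlib

def hash_password(password):
    # Unsalted SHA-256, as the answer sketches it.
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

stored = hash_password("s3cret")            # this is what goes in the DB

def check_password(candidate, stored_hash):
    # Hash the supplied password and compare against the stored hash.
    return hash_password(candidate) == stored_hash
```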
1
0
0
A good way to encrypt database fields?
4
python,sql,sql-server,django,encryption
0
2010-10-20T15:12:00.000
I've been asked to encrypt various db fields within the db. Problem is that these fields need be decrypted after being read. I'm using Django and SQL Server 2005. Any good ideas?
17
6
1.2
0
true
3,979,446
1
13,590
2
0
0
3,979,385
Yeah. Tell whoever told you to get real. Makes no / little sense. If it is about the stored values - enterprise edition 2008 can store encrypted DB files. Otherwise, if you really need to (with all disadvantages) just encrypt them and store them as byte fields.
1
0
0
A good way to encrypt database fields?
4
python,sql,sql-server,django,encryption
0
2010-10-20T15:12:00.000
I'd like to build a "feed" for recent activity related to a specific section of my site. I haven't used memcache before, but I'm thinking of something like this: When a new piece of information is submitted to the site, assign a unique key to it and also add it to memcache. Add this key to the end of an existing list...
0
0
0
0
false
4,006,612
1
351
1
0
0
3,999,496
If the list of keys is bounded in size then it should be ok. memcache by default has a 1MB item size limit. Sounds like memcache is the only storage for the data, is it a good idea?
1
0
0
Best way to keep an activity log in memcached
1
python,memcached,feed
0
2010-10-22T17:43:00.000
I am trying to implement a python script which writes and reads to a database to track changes within a 3d game (Minecraft) These changes are done by various clients and can be represented by player name, coordinates (x,y,z), and a description. I am storing a high volume of changes and would like to know what would be ...
0
0
0
0
false
4,000,101
0
144
1
0
0
4,000,072
Any kind. A NoSQL option like MongoDB might be especially interesting.
1
0
0
Suitable kind of database to track a high volume of changes
2
python,database,change-tracking
0
2010-10-22T19:02:00.000
I have some things that do not need to be indexed or searched (game configurations) so I was thinking of storing JSON on a BLOB. Is this a good idea at all? Or are there alternatives?
1
2
0.099668
0
false
4,001,358
0
1,335
4
0
0
4,001,314
I don't see why not. As a related real-world example, WordPress stores serialized PHP arrays as a single value in many instances.
1
0
1
Storing JSON in MySQL?
4
python,mysql,json
0
2010-10-22T22:05:00.000
I have some things that do not need to be indexed or searched (game configurations) so I was thinking of storing JSON on a BLOB. Is this a good idea at all? Or are there alternatives?
1
0
0
0
false
4,008,102
0
1,335
4
0
0
4,001,314
I think it's better to serialize your data. If you are using Python, cPickle is a good choice.
1
0
1
Storing JSON in MySQL?
4
python,mysql,json
0
2010-10-22T22:05:00.000
I have some things that do not need to be indexed or searched (game configurations) so I was thinking of storing JSON on a BLOB. Is this a good idea at all? Or are there alternatives?
1
5
1.2
0
true
4,001,338
0
1,335
4
0
0
4,001,314
If you need to query based on the values within the JSON, it would be better to store the values separately. If you are just loading a set of configurations like you say you are doing, storing the JSON directly in the database works great and is a very easy solution.
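The "just loading a set of configurations" case from this answer reduces to serializing on the way in and parsing on the way out; the BLOB/TEXT column simply stores the string:

```python
import json

config = {"difficulty": "hard", "fullscreen": True, "volume": 7}

blob = json.dumps(config)       # store this single value in the BLOB column
restored = json.loads(blob)     # parse it back after SELECTing the row
```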
1
0
1
Storing JSON in MySQL?
4
python,mysql,json
0
2010-10-22T22:05:00.000
I have some things that do not need to be indexed or searched (game configurations) so I was thinking of storing JSON on a BLOB. Is this a good idea at all? Or are there alternatives?
1
2
0.099668
0
false
4,001,334
0
1,335
4
0
0
4,001,314
No different than people storing XML snippets in a database (that doesn't have XML support). Don't see any harm in it, if it really doesn't need to be searched at the DB level. And the great thing about JSON is how parseable it is.
1
0
1
Storing JSON in MySQL?
4
python,mysql,json
0
2010-10-22T22:05:00.000
A new requirement has come down from the top: implement 'proprietary business tech' with the awesome, resilient Elixir database I have set up. I've tried a lot of different things, such as creating an implib from the provided interop DLL (which apparently doesn't work like COM dlls) which didn't work at all. CPython do...
0
0
1.2
0
true
4,025,154
0
502
1
0
0
4,017,164
After a day or so of deliberation, I'm attempting to load the new business module in IronPython. Although I don't really want to introduce two Python interpreters into my environment, I think that this will be the glue I need to get this done efficiently.
1
0
0
Loading Elixir/SQLAlchemy models in .NET?
1
sqlalchemy,python-elixir
0
2010-10-25T17:23:00.000
Django: If I added new tables to database, how can I query them? Do I need to create the relevant models first? Or django creates it by itself? More specifically, I installed another django app, it created several database tables in database, and now I want to get some specific data from them? What are the correct app...
0
0
0
0
false
4,042,305
1
78
2
0
0
4,042,286
Django doesn't follow the convention-over-configuration philosophy. You have to explicitly create the backing model for the table and, in its Meta class, tell it about the table name...
1
0
0
Django: If I added new tables to database, how can I query them?
2
python,django,django-models,django-admin
0
2010-10-28T11:11:00.000
Django: If I added new tables to database, how can I query them? Do I need to create the relevant models first? Or django creates it by itself? More specifically, I installed another django app, it created several database tables in database, and now I want to get some specific data from them? What are the correct app...
0
1
1.2
0
true
4,042,337
1
78
2
0
0
4,042,286
I suppose another django app has all model files needed to access those tables, you should just try importing those packages and use this app's models.
1
0
0
Django: If I added new tables to database, how can I query them?
2
python,django,django-models,django-admin
0
2010-10-28T11:11:00.000
I'm having a problem with file uploading. I'm using FastCGI on Apache2 (unix) to run a WSGI-compliant application. File uploads, in the form of images, are being saved in a MySQL database. However, larger images are being truncated at 65535 bytes. As far as I can tell, nothing should be limiting the size of the files a...
3
2
1.2
0
true
4,047,955
1
1,279
1
0
0
4,047,899
If the web server/gateway layer were truncating incoming form submissions I'd expect an error from FieldStorage, since the truncation would not just interrupt the file upload but also the whole multipart/form-data structure. Even if cgi.py tolerated this, it would be very unlikely to have truncated the multipart at jus...
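The 65,535-byte cutoff is exactly the maximum size of a MySQL BLOB (or TEXT) column, which points at the column definition rather than Apache or FastCGI. A hedged sketch of the fix; the table and column names here are made up:

```sql
-- BLOB/TEXT tops out at 65,535 bytes; MEDIUMBLOB raises the cap to 16 MB.
ALTER TABLE uploads MODIFY image MEDIUMBLOB;
```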
1
0
0
Does FastCGI or Apache2 limit upload sizes?
1
python,mysql,file-upload,apache2,fastcgi
0
2010-10-28T23:16:00.000
It seems as if MySQLdb is restricting the maximum transfer size for SQL statements. I have set the max_allowed_packet to 128M for mysqld. MySQL documentation says that this needs to be done for the client as well.
3
1
1.2
0
true
4,051,531
0
1,923
1
0
0
4,050,257
You need to put max_allowed_packet into the [client] section of my.cnf on the machine where the client runs. If you want to, you can specify a different file or group in mysqldb.connect.
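A sketch of the client-side setting this answer describes (the file location varies; a per-user `~/.my.cnf` is typical):

```ini
# my.cnf / .my.cnf on the machine running the Python client
[client]
max_allowed_packet = 128M
```

As the answer notes, `MySQLdb.connect()` accepts `read_default_file` and `read_default_group` arguments if you want to point the client at a different file or option group.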
1
0
0
How do I set max_allowed_packet or equivalent for MySQLdb in python?
1
python,mysql
0
2010-10-29T08:36:00.000
When connected to a postgresql database using psycopg and I pull the network cable I get no errors. How can I detect this in code to notify the user?
2
0
0
0
false
4,061,641
0
214
2
0
0
4,061,635
You will definitely get an error the next time you try to execute a query, so I wouldn't worry if you can't alert the user at the exact instant they lose their network connection.
1
0
0
Python and psycopg detect network error
2
python,postgresql,psycopg
0
2010-10-31T02:54:00.000
When connected to a postgresql database using psycopg and I pull the network cable I get no errors. How can I detect this in code to notify the user?
2
0
1.2
0
true
4,069,833
0
214
2
0
0
4,061,635
psycopg can't detect what happens with the network. For example, if you unplug your ethernet cable, replug it and execute a query, everything will work OK. You should definitely get an exception when psycopg tries to send some SQL to the backend and there is no network connection, but depending on the exact network probl...
1
0
0
Python and psycopg detect network error
2
python,postgresql,psycopg
0
2010-10-31T02:54:00.000
I am using Redis database where we store the navigational information. These data must be persistent and should be fetched faster. I don't have more than 200 MB data in this data set. I face problem when writing admin modules for redis db and I really missing the sql schema and power of django style admin modules. Now ...
1
1
1.2
0
true
4,061,902
1
1,165
2
0
0
4,061,828
I would create a read only slave to your mysql database and force its database engines to memory. You'd have to handle failures by re-initializing the read only database, but that can be scripted rather easily. This way you still have your persistence in the regular mysql database and your read speed in the read only ...
1
0
0
fit mysql db in memory
3
python,mysql,sqlalchemy,performance
0
2010-10-31T04:18:00.000
I am using Redis database where we store the navigational information. These data must be persistent and should be fetched faster. I don't have more than 200 MB data in this data set. I face problem when writing admin modules for redis db and I really missing the sql schema and power of django style admin modules. Now ...
1
0
0
0
false
4,061,848
1
1,165
2
0
0
4,061,828
I would think you could have a persistent table, copy all of the data into a MEMORY engine table whenever the server starts, and have triggers on the memory db for INSERT, UPDATE and DELETE that write to the persistent table, so it is hidden from the user. Correct me if I'm wrong though; it's just the approach I would first t...
1
0
0
fit mysql db in memory
3
python,mysql,sqlalchemy,performance
0
2010-10-31T04:18:00.000
I have a csv file which contains rows from a sqlite3 database. I wrote the rows to the csv file using python. When I open the csv file with Ms Excel, a blank row appears below every row, but the file on notepad is fine(without any blanks). Does anyone know why this is happenning and how I can fix it? Edit: I used the s...
15
34
1.2
1
true
4,122,980
0
7,209
2
0
0
4,122,794
You're using open('file.csv', 'w')--try open('file.csv', 'wb'). The Python csv module requires output files be opened in binary mode.
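The 'wb' fix is Python 2-specific; on Python 3 the equivalent is opening with `newline=""`, which stops the text layer from re-translating the `\r\n` the csv module writes into `\r\r\n` (that extra `\r` is what Excel renders as a blank row):

```python
import csv
import os
import tempfile

rows = [["id", "name"], ["1", "Alice"]]
path = os.path.join(tempfile.mkdtemp(), "out.csv")

# newline="" is the Python 3 counterpart of Python 2's 'wb' mode.
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Read the raw bytes back to inspect the line endings actually on disk.
with open(path, "rb") as f:
    raw = f.read()
```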
1
0
0
Csv blank rows problem with Excel
2
python,excel,csv
0
2010-11-08T10:00:00.000
I have a csv file which contains rows from a sqlite3 database. I wrote the rows to the csv file using python. When I open the csv file with Ms Excel, a blank row appears below every row, but the file on notepad is fine(without any blanks). Does anyone know why this is happenning and how I can fix it? Edit: I used the s...
15
0
0
1
false
4,122,816
0
7,209
2
0
0
4,122,794
The first thing that comes to my mind (just an idea) is that you might have used "\r\n" as the row delimiter (which is shown as one line break in Notepad) but Excel expects to get only "\n" or only "\r", and so it interprets this as two line breaks.
1
0
0
Csv blank rows problem with Excel
2
python,excel,csv
0
2010-11-08T10:00:00.000
I have a simple question. I'm doing some light crawling so new content arrives every few days. I've written a tokenizer and would like to use it for some text mining purposes. Specifically, I'm using Mallet's topic modeling tool and one of the pipe is to tokenize the text into tokens before further processing can be do...
2
1
0.099668
0
false
4,151,273
0
894
1
0
0
4,122,940
I store tokenized text in a MySQL database. While I don't always like the overhead of communication with the database, I've found that there are lots of processing tasks that I can ask the database to do for me (like search the dependency parse tree for complex syntactic patterns).
1
0
0
Storing tokenized text in the db?
2
python,caching,postgresql,nlp,tokenize
0
2010-11-08T10:17:00.000
I noticed that a significant part of my (pure Python) code deals with tables. Of course, I have class Table which supports the basic functionality, but I end up adding more and more features to it, such as queries, validation, sorting, indexing, etc. I started to wonder if it's a good idea to remove my class Table, and refacto...
4
5
1.2
0
true
4,136,841
0
903
3
0
0
4,136,800
SQLite does not run in a separate process. So you don't actually have any extra overhead from IPC. But IPC overhead isn't that big, anyway, especially over e.g., UNIX sockets. If you need multiple writers (more than one process/thread writing to the database simultaneously), the locking overhead is probably worse, and ...
1
0
0
Pros and cons of using sqlite3 vs custom table implementation
3
python,performance,sqlite
0
2010-11-09T17:49:00.000
I noticed that a significant part of my (pure Python) code deals with tables. Of course, I have class Table which supports the basic functionality, but I end up adding more and more features to it, such as queries, validation, sorting, indexing, etc. I started to wonder if it's a good idea to remove my class Table, and refacto...
4
4
0.26052
0
false
4,136,876
0
903
3
0
0
4,136,800
You could try to make an sqlite wrapper with the same interface as your class Table, so that you keep your code clean and you get the sqlite performance.
1
0
0
Pros and cons of using sqlite3 vs custom table implementation
3
python,performance,sqlite
0
2010-11-09T17:49:00.000
I noticed that a significant part of my (pure Python) code deals with tables. Of course, I have class Table which supports the basic functionality, but I end up adding more and more features to it, such as queries, validation, sorting, indexing, etc. I started to wonder if it's a good idea to remove my class Table, and refacto...
4
0
0
0
false
4,136,862
0
903
3
0
0
4,136,800
If you're doing database work, use a database; if you're not, then don't. Using tables, it sounds like you are. I'd recommend using an ORM to make it more Pythonic. SQLAlchemy is the most flexible (though it's not strictly just an ORM).
1
0
0
Pros and cons of using sqlite3 vs custom table implementation
3
python,performance,sqlite
0
2010-11-09T17:49:00.000
The context: I'm working on some Python scripts on an Ubuntu server. I need to use some code written in Python 2.7 but our server has Python 2.5. We installed 2.7 as a second instance of Python so we wouldn't break anything reliant on 2.5. Now I need to install the MySQLdb package. I assume I can't do this the easy way...
3
0
0
0
false
4,139,191
0
429
2
1
0
4,138,504
Are you sure that file isn't hardcoded in some other portion of the build process? Why not just add it to you $PATH for the duration of the build? Does the script need to write that file for some reason? Does the build script use su or sudo to attempt to become some other user? Are you absolutely sure about both the pe...
1
0
0
Trouble installing MySQLdb for second version of Python
2
python,mysql,permissions,configuration-files
0
2010-11-09T20:52:00.000
The context: I'm working on some Python scripts on an Ubuntu server. I need to use some code written in Python 2.7 but our server has Python 2.5. We installed 2.7 as a second instance of Python so we wouldn't break anything reliant on 2.5. Now I need to install the MySQLdb package. I assume I can't do this the easy way...
3
2
0.197375
0
false
4,139,563
0
429
2
1
0
4,138,504
As far as I'm aware, there is a very significant difference between "mysql_config" and "my.cnf". "mysql_config" is usually located in the "bin" folder of your MySQL install and when executed, spits out various filesystem location information about your install. "my.cnf" is a configuration script used by MySQL itself. ...
1
0
0
Trouble installing MySQLdb for second version of Python
2
python,mysql,permissions,configuration-files
0
2010-11-09T20:52:00.000
I'd like to develop a small/medium-size cross-platform application (including GUI). My background: mostly web applications with MVC architectures, both Python (Pylons + SqlAlchemy) and Java (know the language well, but don't like it that much). I also know some C#. So far, I have no GUI programming experience (neither ...
11
5
0.761594
0
false
4,145,581
0
3,111
1
0
0
4,145,350
I'm a Python guy and use PyQt myself, and I can wholly recommend it. Concerning your cons: compilation, distribution and deployment more difficult? No, not really. For many projects, a full setup.py for e.g. cx_Freeze can be less than 30 lines that rarely need to change (most import dependencies are detected automati...
1
1
0
Python + QT, Windows Forms or Swing for a cross-platform application?
1
c#,java,python,user-interface,cross-platform
0
2010-11-10T14:11:00.000
I'm programming a web application using sqlalchemy. Everything was smooth during the first phase of development when the site was not in production. I could easily change the database schema by simply deleting the old sqlite database and creating a new one from scratch. Now the site is in production and I need to pres...
63
16
1
0
false
4,165,496
1
32,073
1
0
0
4,165,452
What we do: use "major version"."minor version" identification of your applications. The major version is the schema version number. The major number is not some random "enough new functionality" kind of thing; it's a formal declaration of compatibility with the database schema. Release 2.3 and 2.4 both use schema version ...
1
0
0
How to efficiently manage frequent schema changes using sqlalchemy?
4
python,sqlalchemy,pylons,data-migration,migrate
0
2010-11-12T14:08:00.000
Can anyone help me install Apache with mod_wsgi to run Python for implementation of RESTful Web services. We're trying to get rid of our existing Java REST services with Apache Tomcat. The installation platform is SUSE Linux Enterprise. Please provide a step by step installation procedure with required modules, as I tr...
0
0
0
0
false
4,168,054
1
2,568
1
0
0
4,167,684
Check that mod_wsgi is loaded as a module in httpd.conf. Add an Apache vhost that points to a Python/WSGI module containing the 'def application' definition for your web service. Resolve any path issues that may arise from your import handling. If this doesn't work, drop some error dump here and we'll check.
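The 'def application' the answer refers to is just a WSGI callable; here is a minimal sketch of one REST-style endpoint (the JSON payload and the file/alias names are made up for illustration):

```python
# service.wsgi -- the file a WSGIScriptAlias directive would point at.
def application(environ, start_response):
    # A trivial handler; a real service would dispatch on
    # environ["PATH_INFO"] and environ["REQUEST_METHOD"].
    body = b'{"status": "ok"}'
    start_response("200 OK", [
        ("Content-Type", "application/json"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```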
1
0
0
Install Apache with mod_wsgi to use Python for RESTful web services and Apache for web pages
2
python,apache,rest,mod-wsgi,mod-python
0
2010-11-12T18:07:00.000
I want to create an application on Windows. I need to use a database; which would be best for a PyQt application (SQLAlchemy, MySQL, etc.)?
0
0
0
0
false
4,208,750
0
535
3
0
0
4,168,020
I guess it's totally up to you, but as far as I am concerned, I personally use SQLite because it is easy to use and has amazingly simple syntax, whereas MySQL you can use for complex apps and it has options for performance tuning. But in the end it's totally up to you and what your app requires.
1
1
0
which databases can be used better for pyqt application
4
python,database,pyqt
0
2010-11-12T18:49:00.000
I want to create an application on Windows. I need to use a database; which would be best for a PyQt application (SQLAlchemy, MySQL, etc.)?
0
1
0.049958
0
false
4,294,636
0
535
3
0
0
4,168,020
SQLite is fine for a single user. If you are going over a network to talk to a central database, then you need a database with a decent Python library. Take a serious look at MySQL if you need/want SQL. Otherwise, there is CouchDB in the NoSQL camp, which is great if you are storing documents, and can express sear...
1
1
0
which databases can be used better for pyqt application
4
python,database,pyqt
0
2010-11-12T18:49:00.000
I want to create an application on Windows. I need to use a database; which would be best for a PyQt application (SQLAlchemy, MySQL, etc.)?
0
1
0.049958
0
false
4,512,428
0
535
3
0
0
4,168,020
If you want a relational database I'd recommend you use SQLAlchemy, as you then get a choice as well as an ORM. By default go with SQLite, as per other recommendations here. If you don't need a relational database, take a look at ZODB. It's an awesome Python-only object-oriented database.
1
1
0
which databases can be used better for pyqt application
4
python,database,pyqt
0
2010-11-12T18:49:00.000
I have my own unit testing suite based on the unittest library. I would like to track the history of each test case being run. I would also like to identify after each run tests which flipped from PASS to FAIL or vice versa. I have very little knowledge about databases, but it seems that I could utilize sqlite3 for thi...
1
0
1.2
0
true
4,170,458
0
280
1
0
0
4,170,442
Technically, yes. The only thing that you need is some kind of scripting language or shell script that can talk to sqlite. You should think of a database like a file in a file system where you don't have to care about the file format. You just say, here are tables of data, with columns. And each row of that is one reco...
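A sketch of the kind of schema and "flip" query the answer gestures at; the table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE results (test_name TEXT, run_id INTEGER, outcome TEXT)")

history = [
    ("test_login",  1, "PASS"), ("test_login",  2, "FAIL"),  # flipped
    ("test_signup", 1, "PASS"), ("test_signup", 2, "PASS"),  # stable
]
conn.executemany("INSERT INTO results VALUES (?, ?, ?)", history)

# Tests whose outcome changed between two consecutive runs.
flips = conn.execute("""
    SELECT a.test_name
    FROM results a JOIN results b ON a.test_name = b.test_name
    WHERE a.run_id = 1 AND b.run_id = 2 AND a.outcome <> b.outcome
""").fetchall()
```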
1
0
0
Using sqlite3 to track unit test results
1
python,unit-testing,sqlite
1
2010-11-13T01:33:00.000
I'm trying to use a MongoDB Database from a Google App Engine service is that possible? How do I install the PyMongo driver on Google App Engine? Thanks
4
1
0.066568
0
false
4,179,091
1
1,355
1
1
0
4,178,742
It's not possible because you don't have access to network sockets in App Engine. As long as you cannot access the database via HTTP, it's impossible.
1
0
0
is it possible to use PyMongo in Google App Engine?
3
python,google-app-engine,mongodb,pymongo
0
2010-11-14T17:42:00.000
We're rewriting a website used by one of our clients. The user traffic on it is very low, less than 100 unique visitors a week. It's basically just a nice interface to their data in our databases. It allows them to query and filter on different sets of data of theirs. We're rewriting the site in Python, re-using the sa...
7
1
0.033321
0
false
4,186,505
1
3,442
1
0
0
4,186,384
Most people, in this case, would use a framework. The best documented and most popular framework in Python is Django. It has good database support (including Oracle), and you'll have the easiest time getting help using it since there's such an active Django community. You can try some other frameworks, but if you're ti...
1
0
0
How to display database query results of 100,000 rows or more with HTML?
6
python,html,oracle,coldfusion
0
2010-11-15T16:18:00.000
I am implementing a class that resembles a typical database table: has named columns and unnamed rows has a primary key by which I can refer to the rows supports retrieval and assignment by primary key and column title can be asked to add unique or non-unique index for any of the columns, allowing fast retrieval of a...
6
2
0.132549
0
false
4,188,260
0
1,361
2
0
0
4,188,202
I would consider building a dictionary with keys that are tuples or lists. E.g.: my_dict[("col_2", "row_24")] would get you this element. Starting from there, it would be pretty easy (if not extremely fast for very large databases) to write 'get_col' and 'get_row' methods, as well as 'get_row_slice' and 'get_col_slice' f...
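The tuple-keyed dictionary idea can be sketched like this (row keys, column names, and the helper names are illustrative):

```python
# One dict, keyed on (row_key, column_name) pairs.
table = {
    ("row_1", "name"): "Ada",
    ("row_1", "age"): 36,
    ("row_2", "name"): "Alan",
    ("row_2", "age"): 41,
}

def get_row(table, row_key):
    # Linear scan -- simple, but O(n), which is the answer's caveat
    # about very large databases.
    return {col: v for (r, col), v in table.items() if r == row_key}

def get_col(table, col_name):
    return {r: v for (r, col), v in table.items() if col == col_name}
```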
1
0
0
How to implement database-style table in Python
3
python,performance,data-structures,implementation
0
2010-11-15T19:48:00.000
I am implementing a class that resembles a typical database table: has named columns and unnamed rows has a primary key by which I can refer to the rows supports retrieval and assignment by primary key and column title can be asked to add unique or non-unique index for any of the columns, allowing fast retrieval of a...
6
0
0
0
false
4,231,416
0
1,361
2
0
0
4,188,202
You really should use SQLite. For your first reason (tracking deletion reasons) you can easily implement this by having a second table that you "move" rows to on deletion. The reason can be tracked in additional column in that table or another table you can join. If a deletion reason isn't always required then you ca...
1
0
0
How to implement database-style table in Python
3
python,performance,data-structures,implementation
0
2010-11-15T19:48:00.000
I am using the mysql connector (https://launchpad.net/myconnpy) with SQLAlchemy and, though the table is definitely UTF8, any string columns returned are just normal strings not unicode. The documentation doesn't list any specific parameters for UTF8/unicode support for the mysql connector driver so I borrowed from the...
1
-3
-0.291313
0
false
4,192,633
0
1,239
1
0
0
4,191,370
Sorry, I don't know about the connector; I use MySQLdb and it is working quite nicely. I work in UTF-8 as well and I didn't have any problems.
1
0
0
MySql Connector (python) and SQLAlchemy Unicode problem
2
python,mysql,unicode,sqlalchemy
0
2010-11-16T05:06:00.000
What the difference is between flush() and commit() in SQLAlchemy? I've read the docs, but am none the wiser - they seem to assume a pre-understanding that I don't have. I'm particularly interested in their impact on memory usage. I'm loading some data into a database from a series of files (around 5 million rows in to...
569
0
0
0
false
65,843,088
0
180,674
1
0
0
4,201,455
commit() records these changes in the database. flush() is always called as part of the commit() call. When you use a Session object to query the database, the query returns results both from the database and from the flushed parts of the not-yet-committed transaction it holds.
1
0
0
SQLAlchemy: What's the difference between flush() and commit()?
6
python,sqlalchemy
0
2010-11-17T04:20:00.000
I'm using sqlite with python. I'm implementing the POP3 protocol. I have a table msg_id text date text from_sender text subject text body text hashkey text Now I need to check for duplicate messages by checking the message id of the message retrieved against the existing msg_id's in the table. I encrypted the msg_id...
0
0
0
0
false
4,208,359
0
481
1
0
0
4,208,146
The main issue is that you're trying to compare a Python string (m.hexdigest()) with a tuple. Additionally, another poster's suggestion that you use SQL for the comparison is probably good advice. Another SQL suggestion would be to fix your columns -- TEXT for everything probably isn't what you want; an index on your h...
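Both fixes in one sketch: compare against `row[0]` rather than the tuple that fetchone() returns, and let SQL do the duplicate check. The schema here is trimmed to the relevant column and the hash is illustrative:

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (msg_id TEXT)")

digest = hashlib.md5(b"<abc123@example.com>").hexdigest()
conn.execute("INSERT INTO messages VALUES (?)", (digest,))

# Let the database do the comparison instead of looping in Python.
row = conn.execute(
    "SELECT msg_id FROM messages WHERE msg_id = ?", (digest,)).fetchone()

# fetchone() gives a tuple (or None) -- compare row[0], not row itself.
is_duplicate = row is not None and row[0] == digest
```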
1
0
0
Comparing sql values
3
python,sql,sqlite
0
2010-11-17T19:08:00.000
Which is more expensive in terms of resources and efficiency: a file read/write operation or a database read/write operation? I'm using MongoDB, with Python. I'll be performing about 100k requests on the db/file per minute. Also, there are about 15,000 documents in the database / file. Which would be faster? Thanks in a...
4
6
1
0
false
4,210,090
0
3,529
5
0
0
4,210,057
It depends. If you need to read sequential data, a file might be faster; if you need to read random data, a database has better chances of being optimized to your needs. (After all, a database reads its records from a file as well, but it has an internal structure and algorithms to enhance performance; it can use the memory i...
1
0
1
Is a file read faster than reading data from the database?
5
python,performance,mongodb
0
2010-11-17T22:58:00.000
Which is more expensive in terms of resources and efficiency: file read/write operations or database read/write operations? I'm using MongoDB with Python. I'll be performing about 100k requests on the db/file per minute. Also, there are about 15000 documents in the database / file. Which would be faster? thanks in a...
4
1
0.039979
0
false
49,248,435
0
3,529
5
0
0
4,210,057
Reading from a database can be more efficient, because you can access records directly and make use of indexes etc. With normal flat files you basically have to read them sequentially. (Mainframes support direct access files, but these are sort of halfway between flat files and databases). If you are in a multi-user en...
1
0
1
Is a file read faster than reading data from the database?
5
python,performance,mongodb
0
2010-11-17T22:58:00.000
Which is more expensive in terms of resources and efficiency: file read/write operations or database read/write operations? I'm using MongoDB with Python. I'll be performing about 100k requests on the db/file per minute. Also, there are about 15000 documents in the database / file. Which would be faster? thanks in a...
4
3
0.119427
0
false
4,210,106
0
3,529
5
0
0
4,210,057
There are too many factors to offer a concrete answer, but here's a list for you to consider: disk bandwidth; disk latency; disk cache; network bandwidth; MongoDB cluster size; volume of MongoDB client activity (the disk only has one "client" unless your machine is busy with other workloads).
1
0
1
Is a file read faster than reading data from the database?
5
python,performance,mongodb
0
2010-11-17T22:58:00.000
Which is more expensive in terms of resources and efficiency: file read/write operations or database read/write operations? I'm using MongoDB with Python. I'll be performing about 100k requests on the db/file per minute. Also, there are about 15000 documents in the database / file. Which would be faster? thanks in a...
4
0
0
0
false
4,210,113
0
3,529
5
0
0
4,210,057
If caching is not used, sequential IO operations are faster with files by definition. Databases ultimately use files too, but the data passes through more layers before it hits the file. If you want to query the data, though, a database is more efficient, because with plain files you would have to implement the querying yourself. For your ...
1
0
1
Is a file read faster than reading data from the database?
5
python,performance,mongodb
0
2010-11-17T22:58:00.000
Which is more expensive in terms of resources and efficiency: file read/write operations or database read/write operations? I'm using MongoDB with Python. I'll be performing about 100k requests on the db/file per minute. Also, there are about 15000 documents in the database / file. Which would be faster? thanks in a...
4
4
0.158649
0
false
4,210,368
0
3,529
5
0
0
4,210,057
Try it and tell us the answer.
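In the spirit of "try it": a crude stdlib-only timing sketch comparing a linear scan over a flat file with an indexed sqlite lookup for the question's ~15000 documents. It stands in for MongoDB (no pymongo assumed); the file name and data are invented, and real numbers will vary with hardware and caching.

```python
import os
import sqlite3
import tempfile
import time

# Build a small dataset both as a flat file and as an indexed sqlite table.
rows = [f"doc{i}" for i in range(15000)]
path = os.path.join(tempfile.mkdtemp(), "docs.txt")
with open(path, "w") as f:
    f.write("\n".join(rows))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO docs (body) VALUES (?)", [(r,) for r in rows])

# Random access to one record: scan the file line by line...
t0 = time.perf_counter()
with open(path) as f:
    hit = next(line for line in f if line.strip() == "doc14000")
file_secs = time.perf_counter() - t0

# ...versus a primary-key lookup (ids start at 1, so doc14000 is id 14001).
t0 = time.perf_counter()
row = conn.execute("SELECT body FROM docs WHERE id = ?", (14001,)).fetchone()
db_secs = time.perf_counter() - t0

print(f"file scan: {file_secs:.6f}s, indexed lookup: {db_secs:.6f}s")
```

At 100k requests per minute the per-lookup cost dominates, which is exactly where an index pays off.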
1
0
1
Is a file read faster than reading data from the database?
5
python,performance,mongodb
0
2010-11-17T22:58:00.000
I have a noob question. I have a record in a table that looks like '\1abc'. I then use this string as the replacement in re.sub("([0-9])", thereplacement, "2"). I'm a little confused by the backslashes: the string I got back was "\\1abc".
0
2
0.197375
0
false
4,226,375
0
191
1
0
0
4,224,400
Note that you can make \ stop being an escape character by setting standard_conforming_strings to on.
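Beyond the PostgreSQL escaping setting, the confusion in the question also involves how re.sub treats backslashes on the Python side: in a replacement string, \1 is a backreference to group 1, not a literal backslash-one. A short sketch of both behaviors:

```python
import re

# A replacement containing a backslash: re.sub expands "\1" to group 1.
replacement = r"\1abc"  # five characters: backslash, 1, a, b, c
print(re.sub(r"([0-9])", replacement, "2"))  # "2abc": \1 became the digit

# To insert the backslash literally, escape it for the replacement engine.
literal = replacement.replace("\\", "\\\\")
print(re.sub(r"([0-9])", literal, "2"))  # "\1abc", taken literally
```

The "\\1abc" the asker saw is most likely just the repr of the five-character string \1abc: Python doubles the backslash when displaying it.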
1
0
0
regarding backslash from postgresql
2
python,postgresql
0
2010-11-19T11:11:00.000
Short Question: Is there any NoSQL flat-file database available, like sqlite? Explanation: A flat-file database can be opened by several processes for reading while keeping one process for writing. I think it's perfect as a read cache if no strict consistency is needed. Say writes go to the file (or even a memory block) every 1-2 secs and the r...
49
0
0
0
false
15,588,028
0
28,380
1
0
0
4,245,438
Something trivial but workable: if you are looking for a storage-backed key-value data structure, use a pickled dictionary. Use cPickle for better performance if needed.
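The suggestion above can be sketched as a tiny file-backed key-value store: one pickled dict per file, rewritten on every write. The class and file names are invented; note that on Python 3 the plain pickle module already uses the fast C implementation that cPickle provided on Python 2.

```python
import os
import pickle
import tempfile

class PickleStore:
    """A trivial file-backed key-value store: one pickled dict per file."""

    def __init__(self, path):
        self.path = path
        try:
            with open(path, "rb") as f:
                self.data = pickle.load(f)
        except FileNotFoundError:
            self.data = {}

    def set(self, key, value):
        self.data[key] = value
        # Rewrite the whole file on each write (fine for a single writer).
        with open(self.path, "wb") as f:
            pickle.dump(self.data, f, protocol=pickle.HIGHEST_PROTOCOL)

    def get(self, key, default=None):
        return self.data.get(key, default)

path = os.path.join(tempfile.mkdtemp(), "cache.pkl")
store = PickleStore(path)
store.set("user:1", {"name": "alice"})
# A second "process" re-reads the file and sees the write.
print(PickleStore(path).get("user:1"))  # {'name': 'alice'}
```

This matches the question's one-writer/many-readers cache pattern, with readers simply re-loading the file on their own schedule.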
1
0
0
Is there any nosql flat file database just as sqlite?
3
python,database,caching,sqlite,nosql
0
2010-11-22T12:36:00.000
Is there a Python module that writes Excel 2007+ files? I'm interested in writing a file longer than 65535 lines and only Excel 2007+ supports it.
14
1
0.024995
0
false
4,258,896
0
21,620
1
0
0
4,257,771
If you are on Windows and have Excel 2007+ installed, you should be able to use pywin32 and COM to write XLSX files using almost the same code as you would use to write XLS files ... just change the "save as ...." part at the end. Probably, you can also write XLSX files using Excel 2003 with the freely downloadable a...
1
0
0
Python: Writing to Excel 2007+ files (.xlsx files)
8
python,excel,excel-2007,openpyxl
0
2010-11-23T15:36:00.000
I've built a number of python driven sites that utilize mongodb as a database backend and am very happy with its ObjectId system; however, I'd love to be able to encode the ids in a shorter fashion without building a mapping collection or utilizing a url-shortener service. Suggestions? Success stories?
14
0
0
0
false
8,654,689
0
4,165
2
0
0
4,261,129
If you can generate auto-incrementing unique numbers, there's absolutely no need to use ObjectId for _id. Doing this in a distributed environment will most likely be more expensive than using ObjectId. That's your tradeoff.
1
0
0
How can one shorten mongo ids for better use in URLs?
5
python,mongodb
0
2010-11-23T21:26:00.000
I've built a number of python driven sites that utilize mongodb as a database backend and am very happy with its ObjectId system; however, I'd love to be able to encode the ids in a shorter fashion without building a mapping collection or utilizing a url-shortener service. Suggestions? Success stories?
14
1
0.039979
0
false
4,261,319
0
4,165
2
0
0
4,261,129
If you are attempting to retain the original value then there really is not a good way. You could encode it, but the likeliness of it being smaller is minimal. You could hash it, but then it's not reversible. If this is a REQUIREMENT, I'd probably recommend creating a lookup table or collection where a small incrementa...
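One reversible encoding that does make ObjectIds somewhat shorter (though not dramatically so, as the answer notes) is re-basing the 24-character hex string into base-62, which yields roughly 16 URL-safe characters. A stdlib-only sketch with an invented example id:

```python
import string

ALPHABET = string.digits + string.ascii_letters  # base-62, URL-safe

def encode_id(hex_id: str) -> str:
    """Shorten a 24-char hex ObjectId to base-62 (about 16 chars)."""
    n = int(hex_id, 16)
    out = []
    while n:
        n, r = divmod(n, 62)
        out.append(ALPHABET[r])
    return "".join(reversed(out)) or "0"

def decode_id(short: str) -> str:
    """Reverse encode_id, padding back to 24 hex digits."""
    n = 0
    for ch in short:
        n = n * 62 + ALPHABET.index(ch)
    return format(n, "024x")

oid = "4f1c2a3b4d5e6f7a8b9c0d1e"  # hypothetical ObjectId hex string
short = encode_id(oid)
print(short, decode_id(short) == oid)
```

Because it is a pure re-encoding, no lookup table is needed and the mapping is reversible; a lookup collection with small incremental keys is still the only route to genuinely tiny URLs.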
1
0
0
How can one shorten mongo ids for better use in URLs?
5
python,mongodb
0
2010-11-23T21:26:00.000
When creating a virtual environment with --no-site-packages, do I need to install mysql and the mysqldb adapter, which are in my global site-packages, in order to use them in my virtual project environment?
4
5
1.2
0
true
4,273,823
0
920
1
0
0
4,273,729
You can also (on UNIX) symlink specific packages from the Python site-packages into your virtualenv's site-packages.
1
0
1
Python Virtualenv
2
python,virtualenv
0
2010-11-25T04:29:00.000