Stack Overflow question/answer dataset excerpt. Original fields per record: Question, Title, Tags, Answer, AnswerCount, Q_Score, Users Score, ViewCount, CreationDate, is_accepted, plus topic flags (Database and SQL, Web Development, and so on). Each record below is a question paired with one answer.
Title: Changing the type of a field in a collection in a mongodb database
Tags: python, mongodb, pymongo (asked 2011-07-22)
Q: The type of a field in a collection in my MongoDB database is a Unicode string. This field currently has no data associated with it in any of the documents in the collection. I don't want the type to be string, because I want to add subfields to it from my Python code using PyMongo. The collection already has ma...
A: Sure, simply create a script that iterates over your current collection, reads the existing value, and overwrites it with the new value (an embedded document in your case). You change the type of a field simply by setting a new value for it, e.g. setting a string field to an integer field:
db.test.update({field...
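The migration the answer describes can be sketched as follows. A plain list of dicts stands in for a PyMongo collection so the example is self-contained, and the field name `extra` and the embedded-document shape are assumptions; with a real collection you would loop over `collection.find()` and write each document back with the collection's update method.

```python
# Stand-in for a MongoDB collection: a list of documents (dicts).
docs = [
    {"_id": 1, "extra": None},
    {"_id": 2, "extra": None},
]

def migrate(doc):
    # Overwrite the old (empty) string field with an embedded document;
    # setting a new value is all it takes to change the field's type.
    doc["extra"] = {"subfield_a": "", "subfield_b": 0}
    return doc

docs = [migrate(d) for d in docs]
```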
Title: automating excel with win32com on linux with wine
Tags: python, linux, excel, win32com, wine (asked 2011-07-27)
Q: I have a rather complex Excel 2010 file that I automate using Python and win32com. For this I run Windows in VirtualBox on an Ubuntu machine. However, that same Excel file solves/runs fine on Ubuntu Maverick directly using Wine 1.3. Any hope of automating Excel on Wine so I can drop the VM? Or is that just crazy tal...
A (accepted): You'd need a Windows version of Python, not a Linux version; that is, you'd have to run Python under Wine as well. Have you tried just a normal Windows install of Python on Wine? I don't see any reason why this wouldn't work. There are numerous pages in a Google search that show Windows Python (32-bit) work...
Title: In memory database with fallback to disk on OOM
Tags: python (asked 2011-08-04)
Q: I have lots of data to operate on (write, sort, read). This data can potentially be larger than main memory and doesn't need to be stored permanently. Is there any kind of library/database that can store this data for me in memory, and that automatically falls back to disk if the system runs into an OOM situat...
A (accepted): I will go for the in-memory solution and let the OS swap. I can still replace the storage component if this really becomes a problem. Thanks agf.
Title: Table updates using daily data from other tables Postgres/Python
Tags: python, postgresql (asked 2011-08-04)
Q: I have a database and a CSV file that gets updated once a day. I managed to update table1 from this file by creating a separate log file with a record of the last insert. Now I have to create a new table, table2, where I keep calculations from table1. My issue is that those calculations are based on 10, 20 and 9...
A: The answer is "as well as one could possibly expect." Without seeing your tables, data, and queries, and the stats of your machine, it is hard to be too specific. However, in general an update basically does three steps. This is a bit of an oversimplification, but it allows you to estimate performance. First it selects t...
Title: SQLite3 and Multiprocessing
Tags: python, sqlite, multiprocessing (asked 2011-08-07)
Q: I noticed that sqlite3 isn't really capable or reliable when I use it inside a multiprocessing environment. Each process tries to write some data into the same database, so that a connection is used by multiple threads. I tried it with the check_same_thread=False option, but the number of insertions is pretty random: S...
A: I've actually just been working on something very similar:
- multiple processes (for me a processing pool of 4 to 32 workers)
- each process worker does some stuff that includes getting information from the web (a call to the Alchemy API for mine)
- each process opens its own sqlite3 connection, all to a single file, and e...
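The one-connection-per-worker pattern from this answer can be sketched with the standard library. Threads stand in for the answer's processes so the example stays self-contained, but the key points are the same: every worker opens its own connection to the shared file (connections are never shared), and a generous `timeout` lets each writer wait out lock contention instead of failing.

```python
import os
import sqlite3
import tempfile
import threading

# One-time schema setup in a temporary database file.
fd, db_path = tempfile.mkstemp(suffix=".db")
os.close(fd)
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE results (worker INTEGER, value TEXT)")
conn.commit()
conn.close()

def worker(worker_id, n_rows):
    # Each worker opens its OWN connection to the single shared file.
    local = sqlite3.connect(db_path, timeout=30)  # wait out lock contention
    for i in range(n_rows):
        local.execute("INSERT INTO results VALUES (?, ?)", (worker_id, f"row{i}"))
    local.commit()
    local.close()

threads = [threading.Thread(target=worker, args=(w, 100)) for w in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

conn = sqlite3.connect(db_path)
count = conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
conn.close()
os.remove(db_path)
print(count)  # 800: no inserts lost
```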
Title: PyPy + PHP on a single webserver
Tags: php, python, apache, wsgi, pypy (asked 2011-08-07)
Q: I'm in the process of setting up a webserver from scratch, mainly for writing webapps with Python. On looking at alternatives to Apache+mod_wsgi, it appears that PyPy plays very nicely indeed with pretty much everything I intend to use for my own apps. Not really having had a chance to play with PyPy properly, I feel t...
A (score -1): I know that mod_wsgi doesn't work with mod_php. I strongly advise running PHP and Python applications at the CGI level. PHP 5.x runs on CGI; for Python there is flup, which makes it possible to run WSGI applications on CGI. (Tamer)
Title: Safeguarding MySQL password when developing in Python?
Tags: python, mysql (asked 2011-08-08)
Q: I'm writing a Python script which uses a MySQL database, which is locally hosted. The program will be delivered as source code. As a result, the MySQL password will be visible to the naked eye. Is there a good way to protect this? The idea is to prevent some naughty people from looking at the source code, gaining direct a...
A (accepted): Short answer: you can't. If the password is stored in the artifact that's shipped to the end user, you must consider it compromised. Even if the artifact is a compiled binary, there are always (more or less complicated) ways to get at the password. The only way to protect your resources is by exposing only a limited API ...
Title: Safeguarding MySQL password when developing in Python? (another answer to the same question)
A (score -5): Either use a simple password like root, or don't use a password at all.
Title: Packaging a Python Program with custom built libraries
Tags: python, distutils (asked 2011-08-08)
Q: I am trying to find a solution for a problem I am working on. I have a Python program which uses a custom-built sqlite3 install (which allows more than 10 simultaneous connections) and in addition requires the use of Tix (which does not come as a standard install with the Python package for the group I am distributing to...
A (accepted): One possible solution: create a custom package for that program containing the custom sqlite3/etc. stuff, and use relative imports to refer to those custom subpackages from a main module in your package, which you'd hook into with a simple importing script that would execute a your_package.run() function or something. Y...
Title: Print the actual query MySQLdb runs?
Tags: python, mysql, mysql-python (asked 2011-08-15)
Q: I'm looking for a way to debug queries as they are executed, and I was wondering if there is a way to have MySQLdb print out the actual query that it runs, after it has finished inserting the parameters and all that. From the documentation, it seems as if there is supposed to be a Cursor.info() call that will give info...
A (accepted): We found an attribute on the cursor object called cursor._last_executed that holds the last query string run, even when an exception occurs. This was easier and better for us in production than using profiling all the time or MySQL query logging, as both of those have a performance impact and involve more code or mor...
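If you would rather not rely on a private attribute like `cursor._last_executed`, a rough stand-in is to render the parameterized query yourself before executing it. The quoting below is deliberately naive and for debugging output only (an assumption: simple scalar parameters); never execute a string built this way.

```python
def preview_query(sql, params):
    # Debugging aid only: roughly mimic MySQLdb's client-side parameter
    # interpolation so the final SQL can be logged. Not injection-safe.
    def quote(v):
        if v is None:
            return "NULL"
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"
    return sql % tuple(quote(v) for v in params)

print(preview_query("INSERT INTO users (name, age) VALUES (%s, %s)", ["alice", 30]))
# INSERT INTO users (name, age) VALUES ('alice', 30)
```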
Title: Excel Combobox in Python xlwt module
Tags: python, excel, combobox, xlwt, xlrd (asked 2011-08-17)
Q: Is this possible: generation of an Excel combobox in a cell using xlwt or a similar module? When I load the xls using xlrd, then copy and save it using xlwt, the combobox from the original xls is lost.
A: No, it's not possible. xlrd doesn't pick up the combo box and suchlike.
Title: Best way of storing incremental numbers?
Tags: python, memcached, voting (asked 2011-08-18)
Q: I'm implementing a voting system for a relatively large website and I'm wondering where I should store the vote count. The main problem is that storing the counts in the main database would put a lot of strain on it, as MySQL isn't very good at handling lots and lots of simple queries. My best option so far is to use memcach...
A (accepted): Can you accept some degree of vote loss? If so, you can do a hybrid solution. Every modulo 100 (or 10, or some such), update the SQL database with the current memcache value. You can also have a periodic script scan and update if required.
Title: Best way of storing incremental numbers? (another answer to the same question)
A: MongoDB can work well, since it can be faster; or Google App Engine, which was designed to scale.
Title: Best way of storing incremental numbers? (another answer to the same question)
A: "MySQL isn't very good at handling lots and lots of simple queries": you may have something drastically misconfigured in your MySQL server. MySQL should easily be able to handle 4,000 queries per minute. There are benchmarks of MySQL handling over 25k INSERTs per second.
Title: Best way of storing incremental numbers? (another answer to the same question)
A: If you like memcached but don't like the fact that it doesn't persist data, then you should consider Membase. Membase is basically memcached with SQLite as the persistence layer. It is very easy to set up and supports the memcached protocol, so if you already have memcached set up you can use Membase as a drop-in r...
Title: How to Pickle a python dictionary into MySQL?
Tags: python, mysql, pickle (asked 2011-08-19)
Q: I looked through several SO questions on how to pickle a Python object and store it in a database. The information I collected is:
- import pickle or import cPickle (the latter if performance is an issue)
- assume dict is a Python dictionary (or whatever Python object): pickled = pickle.dumps(dict)
- store p...
A (accepted): It's really that easy... so long as you don't need your DB to know anything about the dictionary. If you need any sort of structured data access to the contents of the dictionary, then you're going to have to get more involved. Another gotcha might be what you intend to put in the dict. Python's pickle serialization ...
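The dumps/store/loads round trip looks like this. sqlite3 from the standard library stands in for MySQL so the example runs as-is (an assumption; with MySQLdb the `?` placeholders become `%s` and the table lives in your MySQL schema):

```python
import pickle
import sqlite3

data = {"name": "alice", "scores": [1, 2, 3]}

conn = sqlite3.connect(":memory:")  # sqlite stands in for MySQL here
conn.execute("CREATE TABLE blobs (id INTEGER PRIMARY KEY, payload BLOB)")

# Serialize the dict to bytes and store it as a BLOB column.
conn.execute("INSERT INTO blobs (payload) VALUES (?)", (pickle.dumps(data),))

# Read it back and deserialize.
payload = conn.execute("SELECT payload FROM blobs").fetchone()[0]
restored = pickle.loads(payload)
print(restored == data)  # True
```

Note the stored bytes are opaque to the database, which is exactly the accepted answer's caveat: SQL can no longer query inside the dictionary.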
Title: Django MySQLdb version doesn't match _mysql version Ubuntu
Tags: mysql, django, deployment, ubuntu, mysql-python (asked 2011-08-21)
Q: I'm trying to get a Django site deployed from a repository. I was almost there, then changed something (I'm not sure what!) and was back to square one. Now I'm trying to run ./manage.py syncdb and get the following error: django.core.exceptions.ImproperlyConfigured: Error loading MySQLdb module: this is MySQLdb v...
A (accepted): For those who come upon this question: it turned out that Ubuntu's _mysql version was different from the one in my venv. Uninstalling that and re-installing it in my venv did the trick.
Title: Apache server with mod_wsgi + Python as backend: how can I be notified of my connection status?
Tags: python, apache, webserver, mod-wsgi (asked 2011-08-22)
Q: I'm trying to build a web server using Apache as the HTTP server and mod_wsgi + Python as the logic handler. The server is supposed to handle long requests without returning, meaning I want to keep writing into the request. The problem is, when the link is broken, the socket is in a CLOSE_WAIT status, and Apache will ...
A (accepted): You can't. It is a limitation of the API defined by the WSGI specification, so it has nothing to do with Apache or mod_wsgi really; you will have the same issue with any WSGI server that follows the WSGI specification. If you search through the mod_wsgi mailing list on Google Groups you will find a number of discussions ab...
Title: Install both 32 bit and 64 bit versions of mysql on a same mac machine
Tags: mysql, osx-snow-leopard, 32bit-64bit, mysql-python, python-2.5 (asked 2011-08-23)
Q: I am currently working on two Python projects; one needs Python 2.5 and the other 2.7. The problem is that when I installed MySQL-python for 2.5 it required the 32-bit version of MySQL and did not work with the 64-bit version, so I installed the 32-bit version. That project is set up using virtualenv. Now I need to run it on ...
A (accepted): It is possible, but you'll need to compile them by hand. Start by creating separate folders for them to live in, then get the source and the dependencies they'll need and keep them separate. You'll need to alter the ./configure commands to point them to the correct places, and they should build fine.
Title: Exception binding variables with cx_Oracle in python
Tags: python, oracle, cx-oracle (asked 2011-08-24)
Q: Okay, so I'm connected to an Oracle database in Python 2.7 with cx_Oracle 5.1 compiled against Instant Client 11.2. I've got a cursor to the database, and running SQL is not an issue, except for this:
cursor.execute('ALTER TRIGGER :schema_trigger_name DISABLE',
               schema_trigger_name='test.test_tr...
A: You normally can't bind an object name in Oracle. For variables it'll work, but not for trigger names, table names, etc.
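Because bind variables only work for values, not identifiers, the usual workaround is to validate the identifier yourself and splice it into the statement text before calling `cursor.execute`. A sketch (the whitelist regex is an assumption; tighten it to your own naming rules):

```python
import re

# Conservative whitelist for schema-qualified Oracle identifiers.
IDENT = re.compile(r"^[A-Za-z][A-Za-z0-9_$#]*(\.[A-Za-z][A-Za-z0-9_$#]*)?$")

def alter_trigger_sql(trigger_name, action="DISABLE"):
    if not IDENT.match(trigger_name):
        raise ValueError("unsafe trigger name: %r" % trigger_name)
    # Identifiers cannot be bound, so the validated name goes into
    # the SQL text itself.
    return "ALTER TRIGGER %s %s" % (trigger_name, action)

print(alter_trigger_sql("test.test_trigger"))
# ALTER TRIGGER test.test_trigger DISABLE
```

The validation step matters: splicing an unchecked string into DDL is an injection risk, which is exactly why the database refuses to bind identifiers in the first place.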
Title: C# custom database engine, how to implement SQL
Tags: c#, java, python, sql, database (asked 2011-08-26)
Q: As a personal project, I have been developing my own database software in C#. Many current database systems can use SQL commands for queries. Could anyone point me in the right direction for implementing such a system in database software written completely from scratch? For example, a user familiar ...
A (accepted): A full-on database engine is a pretty serious undertaking. You're not going to sit down and have a complete engine next week, so I'd have thought you would want to write the SQL parser piecemeal, adding features to the parser as the features are supported in the engine. I'm guessing this is just something fun to do, r...
Title: Google App Engine python, GQL, select only one column from datastore
Tags: python, google-app-engine, gql, gqlquery (asked 2011-08-27)
Q: I'm trying to pull only one column from a datastore table. I have a Books model with id, key, title, author, isbn and price. everything = db.GqlQuery('SELECT * FROM Books') gives me everything, but say I only want the title: books = db.GqlQuery('SELECT title FROM Books'). I've tried everything people have suggested, but nothi...
A: You can't. GQL is not SQL, and the datastore is not a relational database. An entity is stored as a single serialized protocol buffer, and it's impossible to fetch part of an entity; the whole thing needs to be deserialized.
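Since the datastore always hands back whole entities, the practical pattern is to fetch them and project the wanted field in Python. Plain dicts stand in for fetched Books entities in this self-contained sketch (the sample titles are illustrative):

```python
# Stand-ins for fetched Books entities; each arrives fully deserialized,
# so the "projection" down to one column happens client-side.
books = [
    {"title": "Dune", "author": "Herbert", "price": 9.99},
    {"title": "Foundation", "author": "Asimov", "price": 7.99},
]

titles = [b["title"] for b in books]
print(titles)  # ['Dune', 'Foundation']
```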
Title: What is the easiest way to move data from one database to another backup database using SQLalchemy?
Tags: python, sqlalchemy (asked 2011-08-27)
Q: I am relatively new to SQLAlchemy and have done basic database creation, insert, update and delete; I have found it quite simple to use so far. My question is: I want to move records from one database to another backup database. What is the simplest way to do this in SQLAlchemy?
A: You would just go directly to the database utilities and back it up there. Nothing to do with SQLAlchemy.
Title: Should I use complex SQL queries or process results in the application?
Tags: python, sql, performance (asked 2011-09-02)
Q: I am dealing with an application with huge SQL queries. They are so complex that by the time I finish understanding one, I have already forgotten how it all started. I was wondering whether it would be good practice to pull more data from the database and build the final query in my code, let's say with Python. Am I nuts? Would it be ...
A: Let the DB figure out how best to retrieve the information you want, or else you'll have to duplicate the functionality of the RDBMS in your code, and that will be way more complex than your SQL queries. Plus, you'll waste time transferring all that unneeded information from the DB to your app, just so you can filte...
Title: Should I use complex SQL queries or process results in the application? (another answer to the same question; accepted)
A (accepted): I would keep the business logic in the application as much as possible. Complex business logic in queries is difficult to maintain ("when I finish understanding one I have already forgotten how it all started"). Complex logic in stored procedures is OK. But with a typical Python application, you would want your busines...
Title: Should I use complex SQL queries or process results in the application? (another answer to the same question)
A: @Nivas is generally correct. These are pretty common patterns:
- Division of labour: the DBAs have to return all the data the business needs, but they only have a database to work with. The developers could work with the DBAs to do it better, but departmental responsibilities make it nearly impossible. So SQL to do moret...
Title: I have an Oracle Stack trace file Python cx_Oracle
Tags: python, cx-oracle (asked 2011-09-02)
Q: My Python application is dying and this Oracle trace file is being generated. I am using cx_Oracle; how do I go about using this trace file to resolve the crash?
ora_18225_139690296567552.trc
kpedbg_dmp_stack()+360<-kpeDbgCrash()+192<-kpureq2()+3194<-OCIStmtPrepare2()+157<-Cursor_InternalPrepare()+298<-0000000000EA3010<-...
A (accepted): Do you have an Oracle support contract? If so, I would file an SR and upload the trace to Oracle and have them tell you what it is complaining about. Those code calls are deep in their codebase, from the looks of it.
Title: Memory model for apache/modwsgi application in python?
Tags: python, apache, memory-management, mod-wsgi (asked 2011-09-03)
Q: In a regular application (like on Windows), when objects/variables are created at a global level they are available to the entire program for the entire duration the program is running. In a web application written in PHP, for instance, all variables/objects are destroyed at the end of the script, so everything has to be...
A: All Python globals are created when the module is imported. When the module is re-imported, the same globals are used. Python web servers do not do threading, but pre-forked processes. Thus there are no threading issues with Apache. The lifecycle of Python processes under Apache depends; Apache has settings for how many child pr...
Title: Fast number of rows in Sqlite
Tags: python, sqlite (asked 2011-09-08)
Q: I have a single table in an SQLite DB with many rows. I need to get the number of rows (the total count of items in the table). I tried select count(*) from table, but that seems to access each row and is super slow. I also tried select max(rowid) from table. That's fast, but not really safe: ids can be re-used, the table c...
A: To follow up on Thilo's answer, as a data point: I have an sqlite table with 2.3 million rows. Using select count(*) from table, it took over 3 seconds to count the rows. I also tried using SELECT rowid FROM table (thinking that rowid is a default primary indexed key), but that was no faster. Then I made an index on one...
Title: Fast number of rows in Sqlite (another answer to the same question)
A: Do you have any kind of index on a not-null column (for example a primary key)? If yes, the index can be scanned (which hopefully does not take that long). If not, a full table scan is the only way to count all rows.
Title: Fast number of rows in Sqlite (another answer to the same question)
A: Another way to get the number of rows in a table is by using a trigger that stores the actual row count in another table (each insert operation increments a counter). This way inserting a new record is a little slower, but you can immediately get the number of rows.
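The trigger-maintained counter can be demonstrated with sqlite3 from the standard library (the table and trigger names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE row_count (n INTEGER);
INSERT INTO row_count VALUES (0);

-- Keep the counter in sync on every insert and delete.
CREATE TRIGGER items_ins AFTER INSERT ON items
BEGIN UPDATE row_count SET n = n + 1; END;
CREATE TRIGGER items_del AFTER DELETE ON items
BEGIN UPDATE row_count SET n = n - 1; END;
""")

for i in range(1000):
    conn.execute("INSERT INTO items (name) VALUES (?)", (f"item{i}",))
conn.execute("DELETE FROM items WHERE id <= 100")

# O(1) read of the counter instead of a full COUNT(*) scan.
print(conn.execute("SELECT n FROM row_count").fetchone()[0])  # 900
```

As the answer notes, each write now pays for the extra UPDATE, which is the trade-off for constant-time reads of the count.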
Title: Low memory and fastest querying database for a Python project
Tags: python, database, nosql, rdbms (asked 2011-09-10)
Q: I'm migrating a GAE/Java app to Python (non-GAE) due to the new pricing, so I'm getting a little server and I would like to find a database that fits the following requirements:
- low memory usage (or tuneable or predictable)
- fastest querying capability for simple document/tree-like data identified by key (I don't care a...
A (accepted): I would recommend PostgreSQL, only because it does what you want, can scale, is fast, is rather easy to work with, and is stable. It is exceptionally fast at the example queries given, and could be even faster with document querying.
Title: Cryptic Psycopg2 error message
Tags: python, postgresql, psycopg2 (asked 2011-09-11)
Q: I am using psycopg2 with PostgreSQL 8.4. While reading from a huge table, I suddenly get this cryptic error at the following line of code, after this same line has successfully fetched a few hundred thousand rows:
somerows = cursorToFetchData.fetchmany(30000)
psycopg2.DataError: invalid value "LÃ" for "DD"
DETA...
A (accepted): Can you paste in the data from the row that's causing the problem? At a guess I'd say it's a badly formatted date entry, but it's hard to say. (Can't comment, so this has to be an answer...)
Title: Cryptic Psycopg2 error message (another answer to the same question)
A: This is not a psycopg error; it is a postgres error. After the error is raised, take a look at cur.query to see the query generated. Copy and paste it into psql and you'll see the same error. Then debug it from there.
Title: Large temporary database to sanitize data in Python
Tags: python, xml, sanitization (asked 2011-09-12)
Q: I have a large amount of data that I am pulling from an XML file, all of which needs to be validated against each other (in excess of 500,000 records). It is location data, so it has information such as: county, street prefix, street suffix, street name, starting house number, ending number. There are duplicates, house numb...
A: Unless this data has already been sanitised against the PAF (the UK Post Office Address File: basically every address in the UK), you will have addresses in there that are the same actual house but spelt differently, with the wrong postcode, the postcode in the wrong field, etc. This will completely change your approach. Check out if thi...
Title: Is it possible to store data from Python in Access file?
Tags: python, database, ms-access (asked 2011-09-14)
Q: Well, I might be doing some work in Python that would end up with hundreds of thousands, maybe millions, of rows of data, each with entries in maybe 50 or more columns. I want a way to keep track of this data and work with it. Since I also want to learn Microsoft Access, I suggest putting the data in there. Is there ...
A (accepted): Yes, you can talk to any ODBC database from Python, and that should include Access. You'll want the "Windows" version of Python (which includes stuff like ODBC) from ActiveState. I'd be more worried about the "millions of rows" in Access; it can get a bit slow on retrieval if you're actually using it for relational t...
Title: Find long/lat's within 20 miles of user's long/lat
Tags: python, mysql, geolocation, latitude-longitude (asked 2011-09-14)
Q: I'm working on an application where a user can search for items near his location. When a user registers for my service, their long/lat coordinates are taken (these are actually derived from a zip/postcode, which is then looked up via Google for the long/lat). This also happens when a user adds an item; they are asked fo...
A: To be performant, you don't want to do a complete scan through the database and compute distances for each row; you want conditions that can be indexed. The simplest way to do this is to compute a box with minimum/maximum latitude and minimum/maximum longitude and use BETWEEN to exclude everything outside of those r...
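The indexed bounding-box prefilter plus exact-distance refinement can be sketched in pure Python (the 20-mile radius and the London test coordinates are illustrative). The box bounds feed an indexed BETWEEN query; the haversine distance then rejects the corner points the box lets through:

```python
import math

EARTH_RADIUS_MI = 3959.0

def bounding_box(lat, lon, radius_mi):
    # A degree of latitude spans a constant distance; a degree of
    # longitude shrinks by cos(latitude) away from the equator.
    dlat = math.degrees(radius_mi / EARTH_RADIUS_MI)
    dlon = math.degrees(radius_mi / (EARTH_RADIUS_MI * math.cos(math.radians(lat))))
    return lat - dlat, lat + dlat, lon - dlon, lon + dlon

def haversine_mi(lat1, lon1, lat2, lon2):
    # Great-circle distance in miles between two lat/lon points.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

min_lat, max_lat, min_lon, max_lon = bounding_box(51.5074, -0.1278, 20)
# SQL prefilter (indexed):
#   ... WHERE lat BETWEEN min_lat AND max_lat
#         AND lon BETWEEN min_lon AND max_lon
# then keep only rows where haversine_mi(user, row) <= 20.
```

Without the cos(latitude) correction the box would be too narrow at high latitudes, silently dropping items near the east and west edges.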
Title: How to interface with another database effectively using python
Tags: python, sql, orm, sqlalchemy (asked 2011-09-15)
Q: I have an application that needs to interface with another app's database. I have read access but not write. Currently I'm using SQL statements via pyodbc to grab the rows and using Python to manipulate the data. Since I don't cache anything, this can be quite costly. I'm thinking of using an ORM to solve my problem. The q...
A: I don't think an ORM is the solution to your performance problem. By default, ORMs tend to be less efficient than raw SQL because they might fetch data that you're not going to use (e.g. doing a SELECT * when you need only one field), although SQLAlchemy allows fine-grained control over the SQL generated. Now to imple...
Title: GAE Lookup Table Incompatible with Transactions?
Tags: python, google-app-engine, transactions, google-cloud-datastore, entity-groups (asked 2011-09-16)
Q: My Python High Replication Datastore application requires a large lookup table of between 100,000 and 1,000,000 entries. I need to be able to supply a code to some method that will return the value associated with that code (or None if there is no association). For example, if my table held acceptable English words, the...
A (accepted): If you can, try to fit the data into instance memory. If it won't fit in instance memory, you have a few options available to you. You can store the data in a resource file that you upload with the app, if it changes only infrequently, and access it off disk. This assumes you can build a data structure that permits ea...
Title: GAE Lookup Table Incompatible with Transactions? (another answer to the same question)
A: First, if you're under the belief that a namespace is going to help avoid key collisions, it's time to take a step back. A key consists of an entity kind, a namespace, a name or id, and any parents the entity might have. It's perfectly valid for two different entity kinds to have the same name or id. So if you hav...
I have a function where I save a large number of models, (thousands at a time), this takes several minutes so I have written a progress bar to display progress to the user. The progress bar works by polling a URL (from Javascript) and looking a request.session value to see the state of the first call (the one that is s... | 3 | 1 | 0.197375 | 0 | false | 7,473,401 | 1 | 839 | 1 | 0 | 0 | 7,472,348 | No, both your main saves and the status bar updates will be conducted using the same database connection so they will be part of the same transaction.
I can see two options to avoid this.
You can either create your own, separate database connection and save the status bar updates using that.
Don't save the status bar... | 1 | 0 | 0 | Force commit of nested save() within a transaction | 1 | python,sql,django | 0 | 2011-09-19T14:15:00.000 |
I use pymongo to test the performance of MongoDB.
I use 100 threads; every thread executes 5000 inserts, and everything works OK.
But when I execute 10000 inserts in every thread, I get this error:
"AutoReconnect: Connection reset by peer" | 3 | 1 | 0.099668 | 0 | false | 18,267,147 | 0 | 1,676 | 1 | 0 | 0 | 7,479,907 | The driver can't remove a dropped socket from the connection pool until your code tries to use it. | 1 | 0 | 1 | Mongodb : AutoReconnect, Connection reset by peer | 2 | python,mongodb,pymongo | 0 | 2011-09-20T03:48:00.000 |
I am currently writing a Python script to interact with an SQLite database but it kept returning that the database was "Encrypted or Corrupted".
The database is definitely not encrypted and so I tried to open it using the sqlite3 library at the command line (returned the same error) and with SQLite Manager add-on for F... | 0 | 0 | 0 | 0 | false | 7,512,015 | 0 | 1,207 | 1 | 0 | 0 | 7,511,965 | You should check the user privileges, the user on linux may not have enough privileges. | 1 | 0 | 0 | SQLite3 Database file - Corrupted/Encrypted only on Linux | 2 | python,sql,database,sqlite | 0 | 2011-09-22T08:40:00.000 |
I'm developing an web based application written in PHP5, which basically is an UI on top of a database. To give users a more flexible tool I want to embed a scripting language, so they can do more complex things like fire SQL queries, do loops and store data in variables and so on. In my business domain Python is widel... | 11 | 3 | 0.099668 | 0 | false | 7,660,613 | 0 | 570 | 2 | 0 | 0 | 7,528,360 | How about doing the scripting on the client. That will ensure maximum security and also save server resources.
In other words Javascript would be your scripting platform. What you do is expose the functionality of your backend as javascript functions. Depending on how your app is currently written that might require ba... | 1 | 0 | 0 | Embed python/dsl for scripting in an PHP web application | 6 | php,python,dsl,plpgsql | 1 | 2011-09-23T11:36:00.000 |
I'm developing an web based application written in PHP5, which basically is an UI on top of a database. To give users a more flexible tool I want to embed a scripting language, so they can do more complex things like fire SQL queries, do loops and store data in variables and so on. In my business domain Python is widel... | 11 | 0 | 0 | 0 | false | 7,605,372 | 0 | 570 | 2 | 0 | 0 | 7,528,360 | You could do it without Python, by ie. parsing the user input for pre-defined "tags" and returning the result. | 1 | 0 | 0 | Embed python/dsl for scripting in an PHP web application | 6 | php,python,dsl,plpgsql | 1 | 2011-09-23T11:36:00.000 |
I would like to store windows path in MySQL without escaping the backslashes. How can I do this in Python? I am using MySQLdb to insert records into the database. When I use MySQLdb.escape_string(), I notice that the backslashes are removed. | 0 | 0 | 0 | 0 | false | 7,553,317 | 0 | 1,040 | 1 | 0 | 0 | 7,553,200 | Have a look at os.path.normpath(thePath)
I can't remember if it's that one, but there IS a standard os.path formatting function that gives double backslashes, that can be stored in a db "as is" and reused later "as is". I no longer have a Windows machine and cannot test it anymore. | 1 | 0 | 0 | Storing windows path in MySQL without escaping backslashes | 2 | python,mysql | 0 | 2011-09-26T09:36:00.000 |
I'm trying to write a pop3 and imap clients in python using available libs, which will download email headers (and subsequently entire email bodies) from various servers and save them in a mongodb database. The problem I'm facing is that this client downloads emails in addition to a user's regular email client. So with... | 3 | 3 | 1.2 | 0 | true | 7,556,750 | 0 | 1,501 | 1 | 0 | 0 | 7,553,606 | Outlook logs in to a POP3 server and issues the STAT, LIST and UIDL commands; then if it decides the user has no new messages it logs out. I have observed Outlook doing this when tracing network traffic between a client and my DBMail POP3 server. I have seen Outlook fail to detect new messages on a POP3 server using ... | 1 | 0 | 0 | Download POP3 headers from a certain date (Python) | 1 | python,email,pop3 | 1 | 2011-09-26T10:14:00.000 |
I have been doing lots of searching and reading to solve this.
The main goal is to let a Django-based web management system connect to a device which runs an http server as well. Django will handle the user request, ask the device for the real data, and then feed it back to the user.
Now I have a "kinda-work-in-concept" solution:
Brows... | 0 | 0 | 0 | 0 | false | 7,567,682 | 1 | 320 | 1 | 0 | 0 | 7,565,812 | If you have control over what runs on the device side, consider using XML-RPC to talk from client to server. | 1 | 0 | 0 | How to control Apache via Django to connect to mongoose(another HTTP server)? | 2 | python,django,apache,mod-wsgi,mod-python | 0 | 2011-09-27T07:52:00.000 |
I know about the XLWT library, which I've used before on a Django project. XLWT is very neat but as far as I know, it doesn't support .xlsx which is the biggest obstacle in my case. I'm probably going to be dealing with more than 2**16 rows of information. Is there any other mature similar library? Or even better, is t... | 3 | 0 | 0 | 0 | false | 7,576,355 | 0 | 1,504 | 1 | 0 | 0 | 7,576,309 | Export a CSV don't use .xlsx.. | 1 | 0 | 0 | Exporting to Excel .xlsx from a Python Pyramid project | 3 | python,xls,xlsx,xlwt,openpyxl | 0 | 2011-09-27T22:14:00.000 |
I am researching a project that would require hundreds of database writes per minute. I have never dealt with this level of data writes before and I am looking for good scalable techniques and technologies.
I am a comfortable python developer with experience in django and sql alchemy. I am thinking I will build the d... | 0 | 0 | 0 | 0 | false | 7,587,624 | 1 | 324 | 2 | 0 | 0 | 7,586,999 | You should actually be okay with low hundreds of writes per minute through SQLAlchemy (thats only a couple a second); if you're talking more like a thousand a minute, yeah that might be problematic.
What kind of data do you have? If it's fairly flat (few tables, few relations), you might want to investigate a non-rela... | 1 | 0 | 0 | Setup for high volume of database writing | 3 | python,django,database-design,amazon-web-services | 0 | 2011-09-28T17:14:00.000 |
I am researching a project that would require hundreds of database writes per minute. I have never dealt with this level of data writes before and I am looking for good scalable techniques and technologies.
I am a comfortable python developer with experience in django and sql alchemy. I am thinking I will build the d... | 0 | 0 | 0 | 0 | false | 7,587,774 | 1 | 324 | 2 | 0 | 0 | 7,586,999 | If it's just a few hundred writes you still can do with a relational DB. I'd pick PostgreSQL (8.0+),
which has a separate background writer process. It also has tuneable serialization levels so you
can enable some tradeoffs between speed and strict ACID compliance, some even at transaction level.
Postgres is well docum... | 1 | 0 | 0 | Setup for high volume of database writing | 3 | python,django,database-design,amazon-web-services | 0 | 2011-09-28T17:14:00.000 |
Problem
I am writing a program that reads a set of documents from a corpus (each line is a document). Each document is processed using a function processdocument, assigned a unique ID, and then written to a database. Ideally, we want to do this using several processes. The logic is as follows:
The main routine creates... | 1 | 5 | 0.761594 | 0 | false | 7,603,832 | 0 | 1,473 | 1 | 0 | 0 | 7,603,790 | The MetaData and its collection of Table objects should be considered a fixed, immutable structure of your application, not unlike your function and class definitions. As you know with forking a child process, all of the module-level structures of your application remain present across process boundaries, and table de... | 1 | 0 | 1 | How to use simple sqlalchemy calls while using thread/multiprocessing | 1 | python,database,multithreading,sqlalchemy,multiprocessing | 0 | 2011-09-29T21:50:00.000 |
So I am pretty sure that I have managed to dork up my MySQLdb installation. I have all of the following installed correctly on a fresh install of OS X Lion:
phpMyAdmin
MySQL 5.5.16
Django 1.3.1
And yet when I try to run "from django.db import connection" in a django console, I get the following:
from django.db im... | 2 | 1 | 0.066568 | 0 | false | 7,605,229 | 1 | 3,834 | 2 | 0 | 0 | 7,605,212 | Install pip if you haven't already, and run
pip install MySQL-Python | 1 | 0 | 0 | Having an issue with setting up MySQLdb on Mac OS X Lion in order to support Django | 3 | python,mysql,django,macos,mysql-python | 0 | 2011-09-30T01:53:00.000 |
So I am pretty sure that I have managed to dork up my MySQLdb installation. I have all of the following installed correctly on a fresh install of OS X Lion:
phpMyAdmin
MySQL 5.5.16
Django 1.3.1
And yet when I try to run "from django.db import connection" in a django console, I get the following:
from django.db im... | 2 | 5 | 0.321513 | 0 | false | 12,027,574 | 1 | 3,834 | 2 | 0 | 0 | 7,605,212 | I found the following solution for this issue. It worked for me. I have encountered this problem when I was running python console from PyCharm.
sudo ln -s /usr/local/mysql/lib/libmysqlclient.18.dylib /usr/lib/libmysqlclient.18.dylib | 1 | 0 | 0 | Having an issue with setting up MySQLdb on Mac OS X Lion in order to support Django | 3 | python,mysql,django,macos,mysql-python | 0 | 2011-09-30T01:53:00.000 |
I've been spending the better part of the weekend trying to figure out the best way to transfer data from an MS Access table into an Excel sheet using Python. I've found a few modules that may help (execsql, python-excel), but with my limited knowledge and the modules I have to use to create certain data (I'm a GIS pro... | 2 | 1 | 0.039979 | 0 | false | 7,636,416 | 0 | 4,767 | 2 | 0 | 0 | 7,630,142 | Another idea - how important is the formatting part? If you can ditch the formatting, you can output your data as CSV. Excel can open CSV files, and the CSV format is much simpler then the Excel format - it's so simple you can write it directly from Python like a text file, and that way you won't need to mess with Offi... | 1 | 0 | 0 | Copy data from MS Access to MS Excel using Python | 5 | python,excel,ms-access | 0 | 2011-10-03T00:23:00.000 |
I've been spending the better part of the weekend trying to figure out the best way to transfer data from an MS Access table into an Excel sheet using Python. I've found a few modules that may help (execsql, python-excel), but with my limited knowledge and the modules I have to use to create certain data (I'm a GIS pro... | 2 | 1 | 0.039979 | 0 | false | 7,630,189 | 0 | 4,767 | 2 | 0 | 0 | 7,630,142 | The best approach might be to not use Python for this task.
You could use the macro recorder in Excel to record the import of the External data into Excel.
After starting the macro recorder click Data -> Get External Data -> New Database Query and enter your criteria. Once the data import is complete you can look at t... | 1 | 0 | 0 | Copy data from MS Access to MS Excel using Python | 5 | python,excel,ms-access | 0 | 2011-10-03T00:23:00.000 |
I am importing text files into excel using xlwt module. But it allows only 256 columns to be stored. Are there any ways to solve this problem? | 14 | 0 | 0 | 0 | false | 70,290,332 | 0 | 18,626 | 2 | 0 | 0 | 7,658,513 | If you are trying to write to the columns in a for loop and getting this error, re-initialize the column index to 0 while iterating. | 1 | 0 | 0 | Python - Xlwt more than 256 columns | 6 | python | 0 | 2011-10-05T08:21:00.000 |
I am importing text files into excel using xlwt module. But it allows only 256 columns to be stored. Are there any ways to solve this problem? | 14 | 1 | 0.033321 | 0 | false | 7,658,627 | 0 | 18,626 | 2 | 0 | 0 | 7,658,513 | Is that a statement of fact or should xlwt support more than 256 columns? What error do you get? What does your code look like?
If it truly does have a 256 column limit, just write your data in a csv-file using the appropriate python module and import the file into Excel. | 1 | 0 | 0 | Python - Xlwt more than 256 columns | 6 | python | 0 | 2011-10-05T08:21:00.000 |
I'm using Python with Celery and RabbitMQ to make a web spider to count the number of links on a page.
Can a database, such as MySQL, be written into asynchronously? Is it OK to commit the changes after every row added, or is it required to batch them (multi-add) and then commit after a certain number of rows/duration... | 0 | 1 | 1.2 | 0 | true | 7,780,116 | 0 | 728 | 1 | 0 | 0 | 7,659,246 | For write-intensive operations like counters and logs, NoSQL solutions are often the best choice. Personally I use MongoDB for this kind of task. | 1 | 0 | 0 | Python Celery Save Results in Database Asynchronously | 1 | python,database,asynchronous,rabbitmq,celery | 0 | 2011-10-05T09:31:00.000 |
I am trying to edit several excel files (.xls) without changing the rest of the sheet. The only thing close so far that I've found is the xlrd, xlwt, and xlutils modules. The problem with these is it seems that xlrd evaluates formulae when reading, then puts the answer as the value of the cell. Does anybody know of a w... | 0 | 1 | 0.049958 | 0 | false | 7,667,880 | 0 | 8,902 | 1 | 0 | 0 | 7,665,486 | As of now, xlrd doesn't read formulas. It's not that it evaluates them, it simply doesn't read them.
For now, your best bet is to programmatically control a running instance of Excel, either via pywin32 or Visual Basic or VBScript (or some other Microsoft-friendly language which has a COM interface). If you can't run... | 1 | 0 | 0 | Is there any way to edit an existing Excel file using Python preserving formulae? | 4 | python,excel,formula,xlwt,xlrd | 0 | 2011-10-05T17:52:00.000 |
I'm using PostgreSQL and Python and I need to store data grouped by week of the year. So, there are plenty of alternatives:
week and year in two separated fields
a date pointing to the start of the week (or a random day of the week)
And, the one I like: an interval type.
I've never used it, but reading the docs, it seems to fit. B... | 0 | 4 | 1.2 | 0 | true | 7,668,912 | 0 | 661 | 1 | 0 | 0 | 7,668,822 | The PostgreSQL interval type isn't really what you're looking for -- it's specifically intended for storing an arbitrary length of time, ranging anywhere from a microsecond to a few million years. An interval has no starting or ending point; it's just a measure of "how long".
If you're specifically after storing which ... | 1 | 0 | 0 | psycopg2: interval type for storing weeks | 1 | python,postgresql,psycopg2 | 0 | 2011-10-05T23:11:00.000 |
I'm writing a script to access data in an established database and unfortunately, I'm breaking the DB. I'm able to recreate the issue from the command line:
[user@box tmp]# python
Python 2.7.2 (default, Sep 19 2011, 15:02:41)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-48)] on linux2
Type "help", "copyright",... | 2 | 2 | 0.197375 | 0 | false | 7,670,330 | 0 | 1,471 | 2 | 0 | 0 | 7,669,434 | I suggest using psycopg2 instead of pgdb. pgdb uses the following semantics:
connect() -> open database connection, begin transaction
commit() -> commit, begin transaction
rollback() -> rollback, begin transaction
execute() -> execute statement
psycopg2, on the other hand, uses the following semantics:
connect() -> ... | 1 | 0 | 0 | python pgdb hanging database | 2 | python,postgresql,pgdb | 0 | 2011-10-06T01:15:00.000 |
I'm writing a script to access data in an established database and unfortunately, I'm breaking the DB. I'm able to recreate the issue from the command line:
[user@box tmp]# python
Python 2.7.2 (default, Sep 19 2011, 15:02:41)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-48)] on linux2
Type "help", "copyright",... | 2 | 2 | 1.2 | 0 | true | 7,669,476 | 0 | 1,471 | 2 | 0 | 0 | 7,669,434 | Try calling db.rollback() before you close the cursor (or if you're doing a write operation, db.commit()). | 1 | 0 | 0 | python pgdb hanging database | 2 | python,postgresql,pgdb | 0 | 2011-10-06T01:15:00.000 |
Fairly new (but not brand new) to Python, SQLAlchemy and PostgreSQL, and trying hard to understand inheritance.
As I am taking over another programmer's code, I need to understand what is necessary, and where, for the inheritance concept to work.
My questions are:
Is it possible to rely only on SQLAlchemy f... | 1 | 4 | 0.379949 | 0 | false | 7,675,115 | 0 | 1,451 | 1 | 0 | 0 | 7,672,569 | Welcome to Stack Overflow: in the future, if you have more than one question, you should provide a separate post for each. Feel free to link them together if it might help provide context.
Table inheritance in postgres is a very different thing and solves a different set of problems from class inheritance in python, ... | 1 | 0 | 0 | Python, SQLAlchemy and Postgresql: understanding inheritance | 2 | python,postgresql,inheritance,sqlalchemy | 0 | 2011-10-06T09:40:00.000 |
At work we want our next generation product to be based on a graph database. I'm looking for suggestions as to what database engine might be appropriate for our new project:
Out product is intended to keep track of a large number of prices for goods. Here's a simplistic example of what it does - supposing you wanted to... | 2 | 3 | 1.2 | 0 | true | 7,675,078 | 0 | 303 | 1 | 0 | 0 | 7,674,895 | Neo4J is the most mature graphDB I know of - and is java, with bindings for python too, or REST | 1 | 0 | 0 | I'm looking for a graph-database for a Java/Python centric organization | 1 | java,python,database,graph | 0 | 2011-10-06T13:27:00.000 |
We have a bunch of utility scripts in Visual FoxPro, which we use to interactively cleanse/format data. We'd like to start migrating this code to make use of other database platforms, like MySQL or SQLite.
For instance we have a script that we run which converts the name and/or address lines to proper upper/lower ca... | 1 | 0 | 0 | 0 | false | 7,681,237 | 0 | 789 | 1 | 0 | 0 | 7,681,017 | It seems like you're trying to do several things all at once. Could you take a step-by-step approach? Perhaps cleansing the data as they are right now using your normal, usual scripts. Then migrate the database to MySQL.
It is easy to migrate the database if VisualFoxPro offers a way to export the database to, say, CSV... | 1 | 0 | 0 | What's the best language/technique to perform advanced data cleansing and formatting on a SQL/MySQL/PostgreSQL table? | 1 | python,mysql,sqlalchemy,foxpro,data-cleaning | 0 | 2011-10-06T22:03:00.000 |
I have a Pylons application using SQLAlchemy with SQLite as backend. I would like to know if every read operation going to SQLite will always lead to a hard disk read (which is very slow compared to RAM) or some caching mechanisms are already involved.
does SQLite maintain a subset of the database in RAM for faster a... | 5 | 3 | 1.2 | 0 | true | 7,712,124 | 0 | 339 | 1 | 0 | 0 | 7,710,895 | Yes, SQLite has its own memory cache. Check PRAGMA cache_size for instance. Also, if you're looking for speedups, check PRAGMA temp_store. There is also API for implementing your own cache.
The SQLite database is just a file to the OS. Nothing is 'automatically' done for it. To ensure caching does happen, there are sql... | 1 | 0 | 0 | Are SQLite reads always hitting disk? | 2 | python,sqlite,sqlalchemy | 0 | 2011-10-10T09:35:00.000 |
Best practice question about setting Mongo indexes. Mongoengine, the Python ORM wrapper, allows you to set indexes in the Document meta class.
When is this meta class introspected and the index added? Can I build a collection via a mongoengine Document class and then add an index after the fact?
If I remove the inde... | 7 | 6 | 1.2 | 0 | true | 9,082,609 | 0 | 2,747 | 1 | 0 | 0 | 7,758,898 | You can add an index at any time and ensureIndex will be called behind the scenes so it will be added if it doesn't exist.
If you remove an index from the meta - you will have to use pymongo or the shell to remove the index. | 1 | 0 | 1 | How does MongoEngine handle Indexes (creation, update, removal)? | 2 | python,mongodb,indexing,mongoengine | 0 | 2011-10-13T18:43:00.000 |
Which is better for production with web2py? Please share more insights.
I'm very new to web2py and I am working on a small pharmacy management system.
Which is better for production, Postgres or MySQL? If Postgres, a step-by-step installation guide would help me work smoothly with web2py. Thanks.
PosgreSQL is my choice as there are much less irregular behaviours thus its easier to grasp... | 1 | 0 | 0 | which is better for production with web2py? | 4 | python,web2py | 0 | 2011-10-13T22:50:00.000 |
I am using memcached on a web site, and I am currently needing to open connections to a database and socket each time a function is called. In the case of the db connection, I am having to decide at runtime, which database to connect to.
Because of the (default) stateless nature of web apps, I am having to tear down (i... | 0 | 0 | 1.2 | 0 | true | 7,785,856 | 0 | 150 | 2 | 0 | 0 | 7,783,860 | Both languages support database connections which live beyond the lifetime of a single request. Don't use memcache for that! | 1 | 0 | 0 | Is it safe to store a connection (effectively a pointer) in memcache? | 2 | php,python,memcached | 0 | 2011-10-16T11:03:00.000 |
I am using memcached on a web site, and I am currently needing to open connections to a database and socket each time a function is called. In the case of the db connection, I am having to decide at runtime, which database to connect to.
Because of the (default) stateless nature of web apps, I am having to tear down (i... | 0 | 0 | 0 | 0 | false | 7,785,929 | 0 | 150 | 2 | 0 | 0 | 7,783,860 | I am wondering if it is possible to store (i.e. cache) the socket connection and the database connections in memcache
No. | 1 | 0 | 0 | Is it safe to store a connection (effectively a pointer) in memcache? | 2 | php,python,memcached | 0 | 2011-10-16T11:03:00.000 |
I am building a web application that allows a user to upload an image. When the image is uploaded, it needs to be resized to one or more sizes, each of which needs to be sent to Amazon s3 for storage. Metadata and urls for each size of the image are stored in a single database record on the web server. I'm using a mess... | 0 | 2 | 1.2 | 0 | true | 7,802,664 | 1 | 639 | 1 | 0 | 0 | 7,802,504 | It seems that your goal is not to decouple the database from the MQ, but rather from the workers. As such, you can create another queue that receives completion notifications, and have another single worker that picks up the notifications and updates the database appropriately. | 1 | 0 | 0 | Best practice for decoupling a database from a message queue | 1 | python,database,pylons,message-queue,celery | 0 | 2011-10-18T04:41:00.000 |
I use Berkeley DB (BDB) with nginx. When a request arrives, nginx passes the URI as a key to BDB and checks whether that key has a value in the BDB file.
I actually tried this in an example. I added some data to BDB and ran nginx; it worked, and I could access the data.
But when I add some data to the BDB while nginx is running (using Python), I can't get the...
A Single Process With One Thread
A Single Process With Multiple Threads
Groups of Cooperating Processes
Groups of Unrelated Processes | 1 | 0 | 0 | Does Berkeley DB only support one processor operation | 1 | python,nginx,berkeley-db | 0 | 2011-10-19T06:52:00.000 |
Is there a way to execute a DDL script from Python with the kinterbasdb library for a Firebird database?
Basically I'd like to replicate 'isql -i myscript.sql' command. | 1 | 2 | 0.379949 | 0 | false | 7,832,347 | 0 | 412 | 1 | 0 | 0 | 7,825,066 | It has been a while since I used kinterbasdb, but as far as I know you should be able to do this with any query command which can also be used for INSERT, UPDATE and DELETE (ie nothing that produces a resultset). So Connection.execute_immediate and Cursor.execute should work.
Did you actually try this?
BTW: With Firebi... | 1 | 0 | 0 | How to run DDL script with kinterbasdb | 1 | python,firebird,ddl,kinterbasdb | 0 | 2011-10-19T16:58:00.000 |
I'm very new to Python and I'm trying to write a sort of recipe organizer to get acquainted with the language. Basically, I am unsure how I should be storing the recipes.
For now, the information I want to store is:
Recipe name
Ingredient names
Ingredient quantities
Preparation
I've been thinking about how to d... | 0 | 2 | 0.099668 | 0 | false | 7,827,955 | 0 | 2,639 | 1 | 0 | 0 | 7,827,859 | Just a general data modeling concept: you never want to name anything "...NumberOne", "...NumberTwo". Data models designed in this way are very difficult to query. You'll ultimately need to visit each of N tables for 1 to N ingredients. Also, each table in the model would ultimately have the same fields making main... | 1 | 0 | 0 | How to store recipe information with Python | 4 | python,database,database-design | 0 | 2011-10-19T20:42:00.000 |
I created a simple bookmarking app using django which uses sqlite3 as the database backend.
Can I upload it to appengine and use it? What is "Django-nonrel"? | 3 | 5 | 0.761594 | 0 | false | 7,838,935 | 1 | 582 | 1 | 1 | 0 | 7,838,667 | Unfortunately, no you can't. Google App Engine does not allow you to write files, and that is needed by SQLite.
Until recently, it had no support of SQL at all, preferring a home-grown solution (see the "CAP theorem" as for why). This motivated the creation of projects like "Django-nonrel" which is a version of Django ... | 1 | 0 | 0 | Can I deploy a django app which uses sqlite3 as backend on google app engine? | 1 | python,django,google-app-engine,web-applications,sqlite | 0 | 2011-10-20T15:52:00.000 |
I am using GeoDjango with PostGIS, and I'm having trouble getting the nearest record to given coordinates from my Postgres table. | 13 | 2 | 0.066568 | 0 | false | 7,904,142 | 0 | 5,401 | 1 | 0 | 0 | 7,846,355 | I have no experience with GeoDjango, but on PostgreSQL/PostGIS you have the st_distance(..) function. So, you can order your results by st_distance(geom_column, your_coordinates) asc and see which rows are nearest.
If you have plain coordinates (no postgis geometry), you can convert your coordinates to a point with ... | 1 | 0 | 0 | How can I query the nearest record in a given coordinates(latitude and longitude of string type)? | 6 | python,postgresql,postgis,geodjango | 0 | 2011-10-21T07:36:00.000 |
What's the best combination of tools to import daily data feed (in .CSV format) to a MSSQL server table?
Environment and acceptable tools:
- Windows 2000/XP
- ruby or python
MS SQL Server is on a remote server, the importing process has to be done on a Windows client machine. | 0 | 0 | 0 | 0 | false | 7,847,885 | 0 | 3,142 | 1 | 0 | 0 | 7,847,818 | And what about DTS services? It's integral part of MS SQL server starting with early versions and it allows you to import text-based data to server tables | 1 | 0 | 0 | Import CSV to MS SQL Server programmatically | 4 | python,sql-server,ruby,windows,csv | 0 | 2011-10-21T10:00:00.000 |
I'd like to open the chromium site data (in ~/.config/chromium/Default) with python-sqlite3 but it gets locked whenever chromium is running, which is understandable since transactions may be made. Is there a way to open it in read-only mode, ensuring that I can't corrupt the integrity of the db while chromium is using ... | 15 | 6 | 1 | 0 | false | 7,857,866 | 0 | 7,130 | 1 | 0 | 0 | 7,857,755 | Chromium is holding a database lock for long periods of time? Yuck! That's really not a very good idea at all. Still, not your fault…
You could try just copying the database file (e.g., with the system utility cp) and using that snapshot for reading purposes; SQLite keeps all its committed state in a single file per da... | 1 | 0 | 0 | Is it possible to open a locked sqlite database in read only mode? | 3 | python,database,sqlite | 0 | 2011-10-22T06:05:00.000 |
This is what I have :-
Ubuntu 11.10.
Django 1.3
Python 2.7
What I want to do is build an app that is similar to top-coder and I have the skeletal version of the app sketched out. The basic requirements would be:-
1. Saving the code.
2. Saving the user name and ranks.(User-profile)
3. Should allow a teacher to creat... | 3 | 3 | 0.197375 | 0 | false | 10,204,764 | 1 | 2,560 | 1 | 0 | 0 | 7,859,775 | I've used mongo-engine with Django but you need to create a file specifically for Mongo documents eg. Mongo_models.py. In that file you define your Mongo documents. You then create forms to match each Mongo document. Each form has a save method which inserts or updates whats stored in Mongo. Django forms are designed t... | 1 | 0 | 0 | Mongo DB or Couch DB with django for building an app that is similar to top coder? | 3 | python,django,mongodb,couchdb | 0 | 2011-10-22T13:09:00.000 |
There seems to be many choices for Python to interface with SQLite (sqlite3, atpy) and HDF5 (h5py, pyTables) -- I wonder if anyone has experience using these together with numpy arrays or data tables (structured/record arrays), and which of these most seamlessly integrate with "scientific" modules (numpy, scipy) for ea... | 12 | 23 | 1.2 | 1 | true | 7,891,137 | 0 | 3,647 | 1 | 0 | 0 | 7,883,646 | Most of it depends on your use case.
I have a lot more experience dealing with the various HDF5-based methods than traditional relational databases, so I can't comment too much on SQLite libraries for python...
At least as far as h5py vs pyTables, they both offer very seamless access via numpy arrays, but they're orie... | 1 | 0 | 0 | exporting from/importing to numpy, scipy in SQLite and HDF5 formats | 1 | python,sqlite,numpy,scipy,hdf5 | 0 | 2011-10-25T01:06:00.000 |
I have two lists. The first list's elements are Name, Age, Sex and the second list's elements are test, 10, female. I want to insert this data into the database. The first list holds the MySQL column names and the second the column values. I'm trying to build this query: INSERT INTO (LIST1) VALUES (List2) => INSERT INTO table (name,age,sex) values (test,10,fem... | 0 | 0 | 0 | 0 | false | 7,886,073 | 0 | 77 | 1 | 0 | 0 | 7,886,024 | Try getting this to work using the MySQL GUI. Once that works properly, you can try to get it to work with Python using the SQL statements that worked in MySQL. | 2 | python | 0 | 2011-10-25T07:32:00.000 |
The most common SQLite interface I've seen in Python is sqlite3, but is there anything that works well with NumPy arrays or recarrays? By that I mean one that recognizes data types and does not require inserting row by row, and extracts into a NumPy (rec)array...? Kind of like R's SQL functions in the RDB or sqldf lib... | 6 | 1 | 0.049958 | 1 | false | 12,100,118 | 0 | 7,905 | 1 | 0 | 0 | 7,901,853 | This looks a bit older but is there any reason you cannot just do a fetchall() instead of iterating and then just initializing numpy on declaration? | 1 | 0 | 0 | NumPy arrays with SQLite | 4 | python,arrays,sqlite,numpy,scipy | 0 | 2011-10-26T11:15:00.000 |
I am running Ubuntu, Flask 0.8, mod_wsgi 3 and apache2. When an error occurs, I am unable to get Flask's custom 500 error page to trigger (and not the debug mode output either). It works fine when I run it without WSGI via app.run(debug=True).
I've tried setting WSGIErrorOverride to both On and Off in the Apache settings... | 2 | 1 | 0.197375 | 0 | false | 7,942,317 | 1 | 873 | 1 | 0 | 0 | 7,940,745 | Are you sure the error is actually coming from Flask if you are getting a generic Apache 500 error page? You should look in the Apache error log to see what error messages are in there first. The problem could be configuration or your WSGI script file being wrong or failing due to wrong sys.path etc.
I'm create a blog using django.
I'm getting an 'operational error: FATAL: role "[database user]" does not exist'.
But I have not created any database yet; all I have done is fill in the database details in settings.py.
Do I have to create a database using psycopg2? If so, how do I do it?
Is it:
python
import psycop... | 0 | 0 | 0 | 0 | false | 7,942,855 | 1 | 1,391 | 2 | 0 | 0 | 7,941,623 | Generally, you would create the database externally before trying to hook it up with Django.
Is this your private server? If so, there are command-line tools you can use to set up a PostgreSQL user and create a database.
If it is a shared hosting situation, you would use CPanel or whatever utility your host provides t... | 1 | 0 | 0 | how do i create a database in psycopg2 and do i need to? | 2 | python,database,django,psycopg2 | 0 | 2011-10-29T20:52:00.000 |
I'm create a blog using django.
I'm getting an 'operational error: FATAL: role "[database user]" does not exist'.
But I have not created any database yet; all I have done is fill in the database details in settings.py.
Do I have to create a database using psycopg2? If so, how do I do it?
Is it:
python
import psycop... | 0 | 0 | 0 | 0 | false | 7,941,712 | 1 | 1,391 | 2 | 0 | 0 | 7,941,623 | Before connecting to the database, you need to create the database, add a user, and set up access for the user you selected.
Refer to the installation/configuration guides for Postgres.
Does a canonical user id exist for a federated user created using STS? When using boto I need a canonical user id to grant permissions to a bucket.
Here's a quick tour through my code:
I've successfully created temporary credentials using boto's STS module (using a "master" account), and this gives me back:
federa... | 2 | 1 | 1.2 | 0 | true | 8,074,814 | 1 | 718 | 1 | 0 | 0 | 8,032,576 | Contacted the author of boto and learned of:
get_canonical_user_id() for the S3Connection class.
This will give you the canonical user ID for the credentials associated with the connection. The connection will have to have been used for some operation (e.g.: listing buckets).
Very awkward, but possible. | 1 | 0 | 0 | Do AWS Canonical UserIDs exist for AWS Federated Users (temporary security credentials)? | 1 | python,amazon-s3,amazon-web-services,boto,amazon-iam | 0 | 2011-11-07T03:57:00.000 |
I store several properties of objects in hashsets. Among other things, something like "creation date". There are several hashsets in the db.
So, my question is: how can I find all objects older than a week, for example? Can you suggest an algorithm faster than the naive O(n) implementation?
Thanks,
Oles | 1 | 2 | 1.2 | 0 | true | 8,039,797 | 0 | 462 | 1 | 0 | 0 | 8,039,566 | My initial thought would be to store the data elsewhere, like a relational database, or possibly using a zset.
If you had continuous data (meaning it was consistently set at N interval time periods), then you could store the hash key as the member and the date (as an int timestamp) as the value. Then you could do a zrank... | 1 | 0 | 0 | Redis: find all objects older than | 1 | python,redis | 0 | 2011-11-07T16:37:00.000
I am running several thousand python processes on multiple servers which go off, lookup a website, do some analysis and then write the results to a central MySQL database.
It all works fine for about 8 hours and then my scripts start to wait for a MySQL connection.
On checking top it's clear that the MySQL daemon is ov... | 1 | 0 | 0 | 0 | false | 9,198,763 | 0 | 197 | 1 | 0 | 0 | 8,048,742 | There are a lot of tweaks that can be done to improve the performance of MySQL. Given your workload, you would probably benefit a lot from mysql 5.5 and higher, which improved performance on multiprocessor machines. Is the machine in question hitting VM? if it is paging out, then the performance of mysql will be horrib... | 1 | 0 | 0 | Python processes and MySQL | 2 | python,mysql,linux,indexing | 0 | 2011-11-08T10:06:00.000 |
Sorry, but does this make sense? ORM means Object-Relational Mapper, and there is "Relational" in the name, yet NoSQL is not an RDBMS! So why use an ORM with a NoSQL solution? I ask because I see updates of ORMs for Python! | 8 | 3 | 0.197375 | 0 | false | 8,051,721 | 0 | 4,264 | 3 | 0 | 0 | 8,051,614 | Interesting question. Although NoSQL databases do not have a mechanism to identify relationships, it does not mean that there are no logical relationships between the data that you are storing. Most of the time, you are handling & enforcing those relationships in code manually if you're using a NoSQL database.
Hence, I... | 1 | 0 | 0 | why the use of an ORM with NoSql (like MongoDB) | 3 | python,orm,mongodb | 0 | 2011-11-08T14:03:00.000 |
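The manual relationship handling this answer describes can be sketched with plain dicts standing in for documents; all names here are hypothetical.

```python
# Plain dicts stand in for NoSQL documents. The "relationship" is just a
# stored id that application code must resolve and keep consistent itself.
users = {"u1": {"name": "alice"}}
posts = {"p1": {"title": "hello", "author_id": "u1"}}

def author_of(post_id):
    post = posts[post_id]
    # Manual "join": the database will not enforce that author_id exists,
    # so the application must handle dangling references on its own.
    return users.get(post["author_id"])

print(author_of("p1")["name"])  # → alice
```

An ODM layer takes this boilerplate (dereferencing ids, guarding against missing targets) out of every call site, which is the motivation for the question.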
Sorry, but does this make sense? ORM means Object-Relational Mapper, and there is "Relational" in the name, yet NoSQL is not an RDBMS! So why use an ORM with a NoSQL solution? I ask because I see updates of ORMs for Python! | 8 | 2 | 0.132549 | 0 | false | 8,051,652 | 0 | 4,264 | 3 | 0 | 0 | 8,051,614 | ORM is an abstraction layer. Switching to a different engine is much easier when the queries are abstracted away, and hidden behind a common interface (it doesn't always work that well in practice, but it's still easier than without). | 1 | 0 | 0 | why the use of an ORM with NoSql (like MongoDB) | 3 | python,orm,mongodb | 0 | 2011-11-08T14:03:00.000
Sorry, but does this make sense? ORM means Object-Relational Mapper, and there is "Relational" in the name, yet NoSQL is not an RDBMS! So why use an ORM with a NoSQL solution? I ask because I see updates of ORMs for Python! | 8 | 13 | 1.2 | 0 | true | 8,051,825 | 0 | 4,264 | 3 | 0 | 0 | 8,051,614 | Firstly, they are not an ORM (since there are no relations among documents); they are an ODM (Object-Document Mapper).
The main usage of these ODM frameworks is the same as some common features of an ORM: namely,
providing an abstraction over your data model. You can have your data modelled in your application irrespective of the... | 1 | 0 | 0 | why the use of an ORM with NoSql (like MongoDB) | 3 | python,orm,mongodb | 0 | 2011-11-08T14:03:00.000
I'm thinking of creating a web application with CakePHP but consuming a Python App Engine web service. But, to install CakePHP etc., I need to configure the database. App Engine uses another kind of data storage, which is different from MySQL, etc.
I was thinking of storing the data in App Engine, and using the Python web services,... | 0 | 0 | 0 | 0 | false | 8,070,747 | 1 | 378 | 1 | 1 | 0 | 8,069,649 | You cannot run PHP on GAE. If you run PHP somewhere, it is a bad architecture to go over the internet for your data. It will be slooooow and a nightmare to develop in.
You should store your data where you run your PHP, unless you must have a distributed, globally scaling architecture, which as far as I understand is not the case. | 1 | 0 | 0 | Connect appengine with cakephp | 5 | php,python,google-app-engine,cakephp | 0 | 2011-11-09T18:21:00.000
I am currently working on a pyramid system that uses sqlalchemy.
This system will include a model (let's call it Base) that is stored in a
database table. This model should be extensible by the user on runtime. Basically, the user
should be able to subclass the Base and create a new model (let's call this one 'Child').... | 3 | 4 | 0.197375 | 0 | false | 8,125,931 | 1 | 1,319 | 1 | 0 | 0 | 8,122,078 | This doesn't seem to have much to do with "database reflection", but rather dynamic table creation. This is a pretty dangerous operation and generally frowned upon.
You should try to think about how to model the possible structure your users would want to add to the Base and design your schema around that. Sometimes th... | 1 | 0 | 0 | Model Creation by SQLAlchemy database reflection | 4 | python,reflection,sqlalchemy,pyramid | 0 | 2011-11-14T13:09:00.000 |
How can we remove the database name and username that appear in the top left-hand corner of the OpenERP window, after the OpenERP logo? In which file do we need to make changes to remove that?
Thanks,
Sameer | 0 | 0 | 0 | 0 | false | 12,295,904 | 1 | 133 | 1 | 0 | 0 | 8,151,033 | It's in the openerp-web module. The location depends on your particular configuration. The relevant code can be found in the file addons/web/static/src/xml/base.xml. Search for header_title and edit the contents of the h1 tag of that class. | 1 | 0 | 0 | Removing Database name and username from top Left hand side corner. | 2 | python,openerp | 0 | 2011-11-16T11:37:00.000 |
I want to convert an xlsx file to xls format using Python. The reason is that I'm using the xlrd library to parse xls files, but xlrd is not able to parse xlsx files.
Switching to a different library is not feasible for me at this stage, as the entire project is using xlrd, so a lot of changes would be required.
So, is there an... | 0 | 0 | 0 | 0 | false | 21,996,139 | 0 | 1,806 | 1 | 0 | 0 | 8,151,243 | xlrd-0.9.2.tar.gz (md5) can extract data from Excel spreadsheets (.xls and .xlsx, versions 2.0 onwards) on any platform.
If I use cx_Oracle 5.0.4, I can connect from the Python console, and it works under Apache+Django+mod_wsgi.
But when I update to cx_Oracle 5.1.1, I can connect from the Python console, BUT the same code doesn't work under Apache+Django+mod_wsgi:
File "C:\Python27\lib\site-packages\django\db\backends\oracle\base.py", line 24, in
rais... | 1 | 1 | 0.197375 | 0 | false | 8,158,089 | 1 | 1,227 | 1 | 0 | 0 | 8,151,815 | Need a solution as well.
I have the same setup on WinXP (Apache 2.2.21/ mod_wsgi 3.3/ python 2.7.2/ cx_Oracle 5.x.x). I found that cx_Oracle 5.1 also fails with the same error. Only 5.0.4 works.
Here is the list of changes that were made from 5.0.4 to 5.1:
Remove support for UNICODE mode and permit Unicode to be... | 1 | 0 | 0 | cx_Oracle 5.1.1 under apache+mod_wsgi | 1 | python,apache,cx-oracle | 0 | 2011-11-16T12:36:00.000 |
I'm working with SQLAlchemy for the first time and was wondering... generally speaking, is it enough to rely on Python's default equality semantics when working with SQLAlchemy, vs. id (primary key) equality?
In other projects I've worked on in the past using ORM technologies like Java's Hibernate, we'd always override .eq... | 11 | 1 | 0.099668 | 0 | false | 8,179,370 | 0 | 4,581 | 1 | 0 | 0 | 8,179,068 | I had a few situations where my SQLAlchemy application would load multiple instances of the same object (multithreading / different SQLAlchemy sessions...). It was absolutely necessary to override __eq__() for those objects or I would get various problems. This could be a problem in my application design, but it probably ...
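The kind of __eq__ override this answer describes can be sketched independently of any ORM mapping: compare by (class, primary key) so two instances loaded in different sessions compare equal. The class and attribute names below are made up for illustration.

```python
class IdEquality:
    """Equality and hash based on (class, primary key), not object identity."""

    def __eq__(self, other):
        # Only compare instances of the same class with an assigned id;
        # transient (unsaved, id is None) objects never compare equal.
        return (type(self) is type(other)
                and self.id is not None
                and self.id == other.id)

    def __ne__(self, other):
        return not self.__eq__(other)

    def __hash__(self):
        return hash((type(self), self.id))

class Customer(IdEquality):
    def __init__(self, id, name):
        self.id = id
        self.name = name

a = Customer(1, "alice")
b = Customer(1, "alice loaded in another session")
print(a == b)  # → True
```

One caveat of hashing on the id: objects must receive their primary key before being placed in sets or dict keys, otherwise their hash would change once the key is assigned.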
Essentially I have a large database of transactions and I am writing a script that will take some personal information and match a person to all of their past transactions.
So I feed the script a name and it returns all of the transactions that it has decided belong to that customer.
The issue is that I have to do thi... | 2 | 2 | 1.2 | 0 | true | 8,230,713 | 0 | 107 | 1 | 1 | 0 | 8,230,617 | No, there wouldn't be any problem with multiple worker computers searching and writing to the same database, since MySQL is designed to handle this. Your approach is good. | 1 | 0 | 0 | relatively new programmer interested in using Celery, is this the right approach | 1 | python,mysql,celery | 0 | 2011-11-22T16:56:00.000
I'm currently trying to build and install the MySQLdb module for Python, but the command
python setup.py build
gives me the following error
running build
running build_py
copying MySQLdb/release.py -> build/lib.macosx-10.3-intel-2.7/MySQLdb
error: could not delete 'build/lib.macosx-10.3-intel-2.7/MySQLdb/release.py':... | 4 | 1 | 1.2 | 0 | true | 8,260,644 | 0 | 912 | 1 | 1 | 0 | 8,236,963 | Make sure that gcc-4.0 is in your PATH. Also, you can create an alias from gcc to gcc-4.0.
Take care about 32-bit and 64-bit versions. Mac OS X is a 64-bit operating system, and you should set the right flags to make sure you're compiling for a 64-bit architecture. | 1 | 0 | 0 | Errors When Installing MySQL-python module for Python 2.7 | 2 | python,mysql,django,mysql-python | 0 | 2011-11-23T03:28:00.000