Question | Q_Score | Users Score | Score | Data Science and Machine Learning | is_accepted | A_Id | Web Development | ViewCount | Available Count | System Administration and DevOps | Networking and APIs | Q_Id | Answer | Database and SQL | GUI and Desktop Applications | Python Basics and Environment | Title | AnswerCount | Tags | Other | CreationDate |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Looking for any advice I can get.
I have 16 virtual CPUs all writing to a single remote MongoDB server. The machine that's being written to is a 64-bit machine with 32GB RAM, running Windows Server 2008 R2. After a certain amount of time, all the CPUs stop cold (no gradual performance reduction), and any attempt to... | 0 | 0 | 0 | 0 | false | 10,157,192 | 0 | 97 | 1 | 0 | 0 | 10,114,431 | Try it with journaling turned off and see if the problem remains. | 1 | 0 | 0 | Distributed write job crashes remote machine with MongoDB server | 1 | mongodb,python-2.7,windows-server-2008-r2,pymongo,distributed-transactions | 0 | 2012-04-11T21:43:00.000 |
We moved our SQL Server 2005 database to a new physical server, and since then it has been terminating any connection that persists for 30 seconds.
We are experiencing this in Oracle SQL Developer and when connecting from Python using pyodbc.
Everything worked perfectly before, and now Python returns this error after 30 ... | 0 | 1 | 1.2 | 0 | true | 10,145,890 | 0 | 549 | 1 | 0 | 0 | 10,145,201 | First of all, you need to profile the SQL Server instance to see what activity is happening. Look for slow-running queries and for CPU and memory bottlenecks.
You can also include the timeout in the connection string, like this:
"Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=SSPI;Connection Timeout=30";
and ... | 1 | 0 | 0 | SQL Server 2005 terminating connections after 30 sec | 2 | python,sql,sql-server,sql-server-2005,oracle-sqldeveloper | 0 | 2012-04-13T17:06:00.000 |
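Since the asker connects from Python with pyodbc, here is a hedged sketch of setting both the login timeout and the per-query timeout from the client side; the driver name, server, and credentials are placeholders:

```python
# Hypothetical sketch: raise both timeouts from pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret",
    timeout=120,    # login timeout, in seconds
)
conn.timeout = 120  # per-query timeout on the open connection
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
conn.close()
```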
I've extended sorl-thumbnail's KVStoreBase class, and made a key-value backend that uses a single MongoDB collection.
This was done in order to avoid installing a discrete key-value store (e.g. Redis).
Should I clear the collection every once in a while?
What are the downsides? | 2 | 0 | 1.2 | 0 | true | 11,557,675 | 0 | 298 | 1 | 0 | 0 | 10,146,087 | Only clear the collection if low disk usage is more important to you than fast access times.
The downsides are that your users will all hit un-cached thumbs simultaneously (and simultaneously begin recomputing them).
Just run python manage.py thumbnail cleanup
This cleans up the Key Value Store from stale cache. It re... | 1 | 0 | 0 | Using sorl-thumbnail with MongoDB storage | 1 | django,mongodb,python-imaging-library,sorl-thumbnail | 0 | 2012-04-13T18:11:00.000 |
I am currently writing a script in Python which uploads data to a localhost MySQL DB. I am now looking to relocate this MySQL DB to a remote server with a static IP address. I have a web hosting facility, but this only allows clients to connect to the MySQL DB if I specify the domain / IP address from which clients will... | 0 | 2 | 0.132549 | 0 | false | 10,157,409 | 0 | 1,360 | 1 | 0 | 0 | 10,157,380 | Go to cPanel and add the wildcard % under the remote MySQL connection options (cPanel > Remote MySQL). | 1 | 0 | 0 | Remote Access to MySql DB (Hosting Options) | 3 | python,mysql,mysql-python | 0 | 2012-04-14T21:07:00.000 |
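For illustration, a minimal sketch of pointing the script at the relocated database once the host allows your client IP; the IP address, credentials, and database name are placeholders:

```python
import MySQLdb

# Point the client at the server's static IP instead of localhost.
conn = MySQLdb.connect(
    host="203.0.113.10",  # placeholder static IP of the remote server
    user="appuser",
    passwd="secret",
    db="mydata",
)
cursor = conn.cursor()
cursor.execute("SELECT VERSION()")
print(cursor.fetchone())
conn.close()
```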
I am trying to read a phone number field from an xls file using xlrd (Python), but I always get a float.
e.g. I get phone number as 8889997777.0
How can I get rid of the floating-point format and convert it to a string, so I can store it in my local MongoDB from Python as a regular phone number, e.g. 8889997777? | 2 | 0 | 0 | 0 | false | 10,169,963 | 0 | 2,371 | 2 | 0 | 0 | 10,169,949 | Did you try using int(phoneNumberVar), or in your case int(8889997777.0)? | 1 | 0 | 0 | python xlrd reading phone number from xls becomes float | 2 | python,string,floating-point,xls,xlrd | 0 | 2012-04-16T07:06:00.000 |
I am trying to read a phone number field from an xls file using xlrd (Python), but I always get a float.
e.g. I get phone number as 8889997777.0
How can I get rid of the floating-point format and convert it to a string, so I can store it in my local MongoDB from Python as a regular phone number, e.g. 8889997777? | 2 | 4 | 0.379949 | 0 | false | 10,170,261 | 0 | 2,371 | 2 | 0 | 0 | 10,169,949 | You say:
python xlrd reading phone number from xls becomes float
This is incorrect. It is already a float inside your xls file. xlrd reports exactly what it finds.
You can use str(int(some_float_value)) to do what you want. | 1 | 0 | 0 | python xlrd reading phone number from xls becomes float | 2 | python,string,floating-point,xls,xlrd | 0 | 2012-04-16T07:06:00.000 |
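A minimal sketch of the whole round trip, assuming the phone number sits in a known cell; the filename and cell position are placeholders:

```python
import xlrd

book = xlrd.open_workbook("contacts.xls")  # placeholder filename
sheet = book.sheet_by_index(0)
value = sheet.cell_value(0, 0)             # e.g. 8889997777.0
phone = str(int(value))                    # -> "8889997777"
print(phone)
```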
I would like to know if somebody knows a way to customize the CSV output in HTSQL, especially the delimiter and the encoding.
I would like to avoid iterating over each result and find a way through configuration and/or extensions.
Thanks in advance.
Anthony | 1 | 3 | 0.53705 | 1 | false | 10,210,348 | 0 | 170 | 1 | 0 | 0 | 10,205,990 | If you want TAB as a delimiter, use tsv format (e.g. /query/:tsv instead of /query/:csv).
There is no way to specify the encoding other than UTF-8. You can reencode the output manually on the client. | 1 | 0 | 0 | Customizing csv output in htsql | 1 | python,sql,htsql | 0 | 2012-04-18T08:52:00.000 |
I am new to OpenERP and have installed OpenERP v6. I want to know how I can insert data into the database. Which files do I have to modify to do the job (the files for the SQL code)? | 2 | 0 | 0 | 0 | false | 10,208,766 | 1 | 794 | 2 | 0 | 0 | 10,208,147 | OpenERP uses PostgreSQL as its back-end database.
PostgreSQL can be managed with pgAdmin III (a Postgres GUI); you can write SQL queries there and add/delete records.
It is not advisable to insert/remove data directly in the database!
I am new to OpenERP and have installed OpenERP v6. I want to know how I can insert data into the database. Which files do I have to modify to do the job (the files for the SQL code)? | 2 | 0 | 0 | 0 | false | 10,225,346 | 1 | 794 | 2 | 0 | 0 | 10,208,147 | Adding columns in the .py files of the modules you want to change will add the columns in pgAdmin III as well; likewise, defining classes will create tables... When the fields are displayed in an XML view and values are entered through the interface, the values get stored to the table, values to th... | 1 | 0 | 0 | OpenERP: insert Data code | 3 | python,postgresql,openerp | 0 | 2012-04-18T11:10:00.000 |
I have a web application that has been done using Cakephp with MySql as the DB. The webapp also exposes a set of web services that get and update data to the MySQL DB. I will like to extend the app to provide a fresh set of web services but will like to use a python based framework like web2py/django etc. Since both wi... | 0 | 0 | 0 | 0 | false | 10,233,231 | 1 | 73 | 1 | 0 | 0 | 10,233,187 | This is one of the reasons to use RDBMS to provide access for different users and applications to the same data. There should absolutely no problem with this. | 1 | 0 | 0 | Same MySql DB working with a php and a python framework | 2 | php,python,mysql,django,cakephp | 0 | 2012-04-19T17:08:00.000 |
I'm trying to drop a few tables with the "DROP TABLE" command, but for an unknown reason the program just "sits" and doesn't delete the table that I want it to in the database.
I have 3 tables in the database:
Product, Bill and Bill_Products which is used for referencing products in bills.
I managed to delete/drop Produ... | 42 | 5 | 0.124353 | 0 | false | 19,072,541 | 0 | 55,426 | 4 | 0 | 0 | 10,317,114 | Had the same problem.
There were not any locks on the table.
Reboot helped. | 1 | 0 | 0 | Postgresql DROP TABLE doesn't work | 8 | python,database,django,postgresql | 0 | 2012-04-25T13:50:00.000 |
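Before resorting to a reboot, it can help to look for sessions still holding the table, e.g. a connection left "idle in transaction". A hedged sketch (the DSN is a placeholder, and the pid/state/query columns apply to PostgreSQL 9.2+; older versions use procpid/current_query):

```python
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me")  # placeholder DSN
cur = conn.cursor()
cur.execute("SELECT pid, state, query FROM pg_stat_activity")
for pid, state, query in cur.fetchall():
    # Sessions stuck "idle in transaction" are the usual DROP TABLE blockers.
    print(pid, state, query)
conn.close()
```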
I'm trying to drop a few tables with the "DROP TABLE" command, but for an unknown reason the program just "sits" and doesn't delete the table that I want it to in the database.
I have 3 tables in the database:
Product, Bill and Bill_Products which is used for referencing products in bills.
I managed to delete/drop Produ... | 42 | 2 | 0.049958 | 0 | false | 40,749,694 | 0 | 55,426 | 4 | 0 | 0 | 10,317,114 | Old question, but I ran into a similar issue. I could not reboot the database, so I tested a few things until this sequence worked:
truncate table foo;
drop index concurrently foo_something; times 4-5x
alter table foo drop column whatever_foreign_key; times 3x
alter table foo drop column id;
drop table foo; | 1 | 0 | 0 | Postgresql DROP TABLE doesn't work | 8 | python,database,django,postgresql | 0 | 2012-04-25T13:50:00.000 |
I'm trying to drop a few tables with the "DROP TABLE" command, but for an unknown reason the program just "sits" and doesn't delete the table that I want it to in the database.
I have 3 tables in the database:
Product, Bill and Bill_Products which is used for referencing products in bills.
I managed to delete/drop Produ... | 42 | 0 | 0 | 0 | false | 69,412,889 | 0 | 55,426 | 4 | 0 | 0 | 10,317,114 | The same thing happened for me--except that it was because I forgot the semicolon. face palm | 1 | 0 | 0 | Postgresql DROP TABLE doesn't work | 8 | python,database,django,postgresql | 0 | 2012-04-25T13:50:00.000 |
I'm trying to drop a few tables with the "DROP TABLE" command, but for an unknown reason the program just "sits" and doesn't delete the table that I want it to in the database.
I have 3 tables in the database:
Product, Bill and Bill_Products which is used for referencing products in bills.
I managed to delete/drop Produ... | 42 | 4 | 0.099668 | 0 | false | 60,367,779 | 0 | 55,426 | 4 | 0 | 0 | 10,317,114 | I ran into this today, I was issuing a:
DROP TABLE TableNameHere
and getting ERROR: table "tablenamehere" does not exist. I realized that for case-sensitive tables (as was mine), you need to quote the table name:
DROP TABLE "TableNameHere" | 1 | 0 | 0 | Postgresql DROP TABLE doesn't work | 8 | python,database,django,postgresql | 0 | 2012-04-25T13:50:00.000 |
I've been reading the Django Book and it's great so far, except when something doesn't work properly. I have been trying for two days to install the psycopg2 plugin with no luck.
I navigate to the unzipped directory and run setup.py install, and it returns "You must have postgresql dev for building a serverside extension or li... | 6 | -1 | -0.049958 | 0 | false | 20,124,244 | 1 | 4,157 | 2 | 0 | 0 | 10,321,568 | sudo apt-get install python-psycopg2 should work fine, since it was the solution for me as well. | 1 | 0 | 0 | Django with psycopg2 plugin | 4 | python,django | 0 | 2012-04-25T18:25:00.000 |
I've been reading the Django Book and it's great so far, except when something doesn't work properly. I have been trying for two days to install the psycopg2 plugin with no luck.
I navigate to the unzipped directory and run setup.py install, and it returns "You must have postgresql dev for building a serverside extension or li... | 6 | 3 | 0.148885 | 0 | false | 22,528,687 | 1 | 4,157 | 2 | 0 | 0 | 10,321,568 | I'm working on Xubuntu (12.04) and encountered the same error when I wanted to install django-toolbelt. I solved it with the following operations:
sudo apt-get install python-dev
sudo apt-get install libpq-dev
sudo apt-get install python-psycopg2
I hope this information may be helpful for someone els... | 1 | 0 | 0 | Django with psycopg2 plugin | 4 | python,django | 0 | 2012-04-25T18:25:00.000 |
I have a daemon process which spawns child processes using multiprocessing to do some work; each child process opens its own connection handle to the DB (Postgres in my case). Jobs are passed to processes via a Queue, and if the queue is empty, processes sleep for some time and then recheck the queue.
How can I implement "graceful s... | 3 | 5 | 1.2 | 0 | true | 10,322,481 | 0 | 403 | 1 | 1 | 0 | 10,322,422 | Store all the open files/connections/etc. in a global structure, and close them all and exit in your SIGTERM handler. | 1 | 0 | 0 | Gracefull shutdown, close db connections, opened files, stop work on SIGTERM, in multiprocessing | 1 | python,database,multiprocessing,signals | 0 | 2012-04-25T19:27:00.000 |
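A minimal sketch of the idea from the accepted answer; the resource list and exit code are illustrative:

```python
import signal
import sys

open_resources = []  # db connections, file handles, etc.

def graceful_shutdown(signum, frame):
    for res in open_resources:
        try:
            res.close()
        except Exception:
            pass
    sys.exit(0)

# Each child process registers the handler for itself.
signal.signal(signal.SIGTERM, graceful_shutdown)
```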
I have done my homework in reading about protection against sql injection attacks: I know that I need to use parameter binding but:
I already do this, thank you.
I know that some of the db drivers my users use implement parameter binding in the most stupid possible way. i.e., they are prone to sql injection attacks. I... | 0 | 1 | 0.049958 | 0 | false | 10,329,694 | 0 | 621 | 4 | 0 | 0 | 10,329,486 | I don't know if this is in any way applicable, but I am just putting it up here for completeness, and experts can downvote me at will... not to mention I have concerns about its performance in some cases.
I was once tasked with protecting an aging web app written in classic asp against sql injection (they were getting h... | 1 | 0 | 0 | protecting against sql injection attacks beyond parameter binding | 4 | python,sql,sql-injection | 0 | 2012-04-26T08:09:00.000 |
I have done my homework in reading about protection against sql injection attacks: I know that I need to use parameter binding but:
I already do this, thank you.
I know that some of the db drivers my users use implement parameter binding in the most stupid possible way. i.e., they are prone to sql injection attacks. I... | 0 | 2 | 1.2 | 0 | true | 10,336,420 | 0 | 621 | 4 | 0 | 0 | 10,329,486 | I already do this, thank you.
Good; with just this, you can be totally sure (yes, totally sure) that user inputs are being interpreted only as values. You should direct your energies toward securing your site against other kinds of vulnerabilities (XSS and CSRF come to mind; make sure you're using SSL properly, et-... | 1 | 0 | 0 | protecting against sql injection attacks beyond parameter binding | 4 | python,sql,sql-injection | 0 | 2012-04-26T08:09:00.000 |
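For reference, a small demonstration of parameter binding doing its job: the hostile input travels separately from the SQL and is stored as inert data (sqlite3 is used here only because it ships with Python):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (name TEXT)")
user_input = "Robert'); DROP TABLE users;--"
cur.execute("INSERT INTO users (name) VALUES (?)", (user_input,))
cur.execute("SELECT name FROM users")
print(cur.fetchall())  # the hostile string comes back as plain data
conn.close()
```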
I have done my homework in reading about protection against sql injection attacks: I know that I need to use parameter binding but:
I already do this, thank you.
I know that some of the db drivers my users use implement parameter binding in the most stupid possible way. i.e., they are prone to sql injection attacks. I... | 0 | 0 | 0 | 0 | false | 10,336,013 | 0 | 621 | 4 | 0 | 0 | 10,329,486 | So, I would like to add an extra layer of protection by adding extra sanitization of http-facing user input.
This strategy is doomed to fail. | 1 | 0 | 0 | protecting against sql injection attacks beyond parameter binding | 4 | python,sql,sql-injection | 0 | 2012-04-26T08:09:00.000 |
I have done my homework in reading about protection against sql injection attacks: I know that I need to use parameter binding but:
I already do this, thank you.
I know that some of the db drivers my users use implement parameter binding in the most stupid possible way. i.e., they are prone to sql injection attacks. I... | 0 | -1 | -0.049958 | 0 | false | 10,329,550 | 0 | 621 | 4 | 0 | 0 | 10,329,486 | Well in php, I use preg_replace to protect my website from being attacked by sql injection. preg_match can also be used. Try searching an equivalent function of this in python. | 1 | 0 | 0 | protecting against sql injection attacks beyond parameter binding | 4 | python,sql,sql-injection | 0 | 2012-04-26T08:09:00.000 |
We have developed an application using Django 1.3.1 and Python 2.7.2, with SQL Server 2008 as the database. All of these are hosted on a Windows Server 2008 R2 operating system on a VM. The clients have Windows 7 as their OS.
We developed the application without a VM in mind; all of a sudden the client has come back saying they can only host the appl... | 0 | 0 | 1.2 | 0 | true | 10,331,810 | 1 | 422 | 1 | 1 | 0 | 10,331,518 | Maybe this could help you a bit, although my set-up is slightly different. I am running an ASP.NET web app developed on Windows 7 via VMware Fusion on OS X. I access the web app from outside the VM (a browser on the Mac or other computers/phones within the network).
Here are the needed settings:
Network adapter set to (Bridg... | 1 | 0 | 0 | Steps to access Django application hosted in VM from Windows 7 client | 2 | django,wxpython,sql-server-2008-r2,vmware,python-2.7 | 0 | 2012-04-26T10:21:00.000 |
What's the best way to create an intentionally empty query in SQLAlchemy?
For example, I've got a few functions which build up the query (adding WHERE clauses, for example), and at some points I know that the result will be empty.
What's the best way to create a query that won't return any rows? Something like Djan... | 36 | 34 | 1 | 0 | false | 12,837,029 | 0 | 9,094 | 1 | 0 | 0 | 10,345,327 | If you need the proper return type, just return session.query(MyObject).filter(sqlalchemy.sql.false()).
When evaluated, this will still hit the DB, but it should be fast.
If you don't have an ORM class to "query", you can use false() for that as well:
session.query(sqlalchemy.false()).filter(sqlalchemy.false()) | 1 | 0 | 0 | SQLAlchemy: create an intentionally empty query? | 4 | python,sqlalchemy | 0 | 2012-04-27T05:41:00.000 |
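A short sketch of how the pattern fits into query-building code; the model and session names are placeholders:

```python
from sqlalchemy.sql import false

def build_query(session, model, exclude_everything=False):
    query = session.query(model)
    if exclude_everything:
        # Guaranteed-empty result, but still a real Query object,
        # so callers can keep chaining .filter()/.order_by() on it.
        query = query.filter(false())
    return query
```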
I'm using Django 1.4, sqlite3, and django-facebookconnect.
I am following the instructions in the wiki to set it up.
"python manage.py syncdb" throws an error.
Creating tables ...
Creating table auth_permission
Creating table auth_group_permissions
Creating table auth_group
Creating table auth_user_user_permissions
Creating table auth_... | 0 | 1 | 1.2 | 0 | true | 10,486,708 | 1 | 282 | 1 | 0 | 0 | 10,356,581 | You should use django-facebook instead, it does that and more and it is actively supported :) | 1 | 0 | 0 | Getting db_type() error while using django-facebook connect for DjangoApp | 1 | python,django,facebook,sqlite | 0 | 2012-04-27T19:17:00.000 |
I have a large dataset of events in a Postgres database that is too large to analyze in memory. Therefore I would like to quantize the datetimes to a regular interval and perform group by operations within the database prior to returning results. I thought I would use SqlSoup to iterate through the records in the appro... | 0 | 1 | 0.197375 | 0 | false | 10,360,094 | 0 | 333 | 1 | 0 | 0 | 10,359,617 | After talking with some folks, it's pretty clear the better answer is to use Pig to process and aggregate my data locally. At the scale I'm operating at, it wasn't clear Hadoop was the appropriate tool to be reaching for. One person I talked to about this suggested Pig will be orders of magnitude faster than in-DB operatio... | 1 | 0 | 0 | Data Transformation in Postgres Using SqlSoup | 1 | python,postgresql,sqlsoup | 0 | 2012-04-28T00:57:00.000 |
I have a Python script that gets data from a USB weather station; it puts the data into MySQL whenever data is received from the station.
I have a MySQL class with an insert function. What I want is for the function to check if it has been run in the last 5 minutes, and if it has, quit.
Could not find any code on the int... | 1 | 0 | 0 | 0 | false | 10,366,467 | 0 | 489 | 2 | 0 | 0 | 10,366,424 | Just derive a new class and override the insert function. In the overriding function, check the last insert time and call the parent's insert method only if it has been more than five minutes; and of course update the most recent insert time. | 1 | 0 | 0 | Python, function quit if it has been run the last 5 minutes | 5 | python,python-2.7 | 0 | 2012-04-28T18:36:00.000 |
I have a Python script that gets data from a USB weather station; it puts the data into MySQL whenever data is received from the station.
I have a MySQL class with an insert function. What I want is for the function to check if it has been run in the last 5 minutes, and if it has, quit.
Could not find any code on the int... | 1 | 0 | 0 | 0 | false | 10,366,452 | 0 | 489 | 2 | 0 | 0 | 10,366,424 | Each time the function is run, save a file with the current time. When the function is run again, check the time stored in the file and make sure it is old enough. | 1 | 0 | 0 | Python, function quit if it has been run the last 5 minutes | 5 | python,python-2.7 | 0 | 2012-04-28T18:36:00.000 |
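A minimal sketch of the in-process variant (a module-level timestamp instead of a file), with the actual insert elided:

```python
import time

FIVE_MINUTES = 5 * 60
_last_insert = 0.0

def insert(data):
    global _last_insert
    now = time.time()
    if now - _last_insert < FIVE_MINUTES:
        return  # ran too recently; skip this round
    _last_insert = now
    # ... perform the actual MySQL insert here ...
```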
I'm building a social app in django, the architecture of the site will be very similar to facebook
There will be posts, posts will have comments
Both posts and comments will have meta data like date, author, tags, votes
I decided to go with nosql database because of the ease with which we can add new features.
I finali... | 0 | 2 | 0.099668 | 0 | false | 10,396,700 | 1 | 1,158 | 3 | 0 | 0 | 10,396,315 | There's a huge distinction to be made between Redis and MongoDB for your particular needs, in that Redis, unlike MongoDB, doesn't facilitate value queries.
You can use MongoDB to embed the comments within the post document, which means you get the post and the comments in a single query, yet you could also query for po... | 1 | 0 | 0 | mongo db or redis for a facebook like site? | 4 | python,django,database-design,mongodb,redis | 0 | 2012-05-01T10:16:00.000 |
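A hedged sketch of the embedding approach with pymongo; the database/collection names and document fields are assumptions:

```python
from pymongo import MongoClient

db = MongoClient().myapp  # placeholder database name
post_id = db.posts.insert_one(
    {"author": "alice", "text": "hello", "comments": []}
).inserted_id
db.posts.update_one(
    {"_id": post_id},
    {"$push": {"comments": {"author": "bob", "text": "hi!", "votes": 0}}},
)
# One query now returns the post together with its comments.
print(db.posts.find_one({"_id": post_id}))
```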
I'm building a social app in django, the architecture of the site will be very similar to facebook
There will be posts, posts will have comments
Both posts and comments will have meta data like date, author, tags, votes
I decided to go with nosql database because of the ease with which we can add new features.
I finali... | 0 | 0 | 0 | 0 | false | 10,403,789 | 1 | 1,158 | 3 | 0 | 0 | 10,396,315 | First, loosely couple your app and your persistence so that you can swap them out at a very granular level. For example, you want to be able to move one service from mongo to redis as your needs evolve. Be able to measure your services and appropriately respond to them individually.
Second, you are unlikely to find one... | 1 | 0 | 0 | mongo db or redis for a facebook like site? | 4 | python,django,database-design,mongodb,redis | 0 | 2012-05-01T10:16:00.000 |
I'm building a social app in django, the architecture of the site will be very similar to facebook
There will be posts, posts will have comments
Both posts and comments will have meta data like date, author, tags, votes
I decided to go with nosql database because of the ease with which we can add new features.
I finali... | 0 | 1 | 0.049958 | 0 | false | 10,396,466 | 1 | 1,158 | 3 | 0 | 0 | 10,396,315 | These things are subjective and can be looked at from different directions. But if you have already decided to go with a NoSQL solution and are trying to decide between MongoDB and Redis, I think it is better to go with MongoDB, as you should be able to save a big number of posts, and also MongoDB documents are be... | 1 | 0 | 0 | mongo db or redis for a facebook like site? | 4 | python,django,database-design,mongodb,redis | 0 | 2012-05-01T10:16:00.000 |
I am facing an issue with setting a value of Excel Cell.
I get data from a table cell in MS-Word Document(dcx) and print it on output console.
Problem is that the data of the cell is just a word, "Hour", with no apparent other leading or trailing printable character like white-spaces. But when I print it using python's... | 3 | 3 | 1.2 | 0 | true | 10,423,918 | 0 | 3,883 | 1 | 0 | 0 | 10,423,593 | Try using value.rstrip('\r\n') to remove any carriage returns (\r) or newlines (\n) at the end of your string value. | 1 | 0 | 1 | Unwanted character in Excel Cell In Python | 2 | python,excel,ms-word,character | 0 | 2012-05-03T00:30:00.000 |
Sometimes an application requires quite a few SQL queries before it can do anything useful. I was wondering if there is a way to send those as a batch to the database, to avoid the overhead of going back and forth between the client and the server?
If there is no standard way to do it, I'm using the python bindings of ... | 0 | 0 | 0 | 0 | false | 10,434,644 | 0 | 92 | 1 | 0 | 0 | 10,434,523 | This process works best on inserts
Make all you SQL queries into Stored Procedures. These eventually will become child stored procedures
Create Master Store procedure to run all other Stored Procedures.
Modify master Stored procedure to accept values required by child Stored Procedures
Modify master Stored procedure t... | 1 | 0 | 0 | Grouping SQL queries | 1 | mysql,sql,mysql-python | 0 | 2012-05-03T15:27:00.000 |
Let's say I have 100 servers, each running a daemon - let's call it server - and that server is responsible for spawning a thread for each user of this particular service (let's say 1000 threads per server). Every N seconds each thread does something and gets information for that particular user (this request/response model ca... | 1 | 1 | 1.2 | 0 | true | 10,440,880 | 1 | 221 | 1 | 1 | 0 | 10,440,277 | The general way to handle this is to have the threads report their status back to the server daemon. If you haven't seen a status update within the last 5N seconds, then you kill the thread and start another.
You can keep track of the current active threads that you've spun up in a list, then just loop through them occ... | 1 | 0 | 0 | Distributed server model | 1 | python,distributed-computing | 0 | 2012-05-03T22:39:00.000 |
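A minimal sketch of the heartbeat idea: threads record when they last did work, and a supervisor restarts any that go quiet for 5N seconds (restart_worker is a placeholder for your kill-and-respawn logic):

```python
import time

N = 15
heartbeats = {}  # user_id -> last report time

def worker_tick(user_id):
    # ... do the per-user work, then report in:
    heartbeats[user_id] = time.time()

def supervise(restart_worker):
    while True:
        now = time.time()
        for user_id, last_seen in list(heartbeats.items()):
            if now - last_seen > 5 * N:
                restart_worker(user_id)  # kill and respawn the stale thread
        time.sleep(N)
```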
I have come across a requirement to access a set of databases on a MongoDB server, using the TurboGears framework. I need to list the databases and allow the user to select one and move on. As far as I have looked, TurboGears does facilitate using multiple databases, but those need to be specified beforehan... | 2 | 2 | 1.2 | 0 | true | 10,650,606 | 0 | 305 | 1 | 0 | 0 | 10,495,324 | For SQLAlchemy you can achieve something like that using a smarter Session.
Just subclass the sqlalchemy.orm.Session class and override the get_bind(self, mapper=None, clause=None) method.
That method is called each time the session has to decide which engine to use and is expected to return the engine itself. You can ... | 1 | 0 | 0 | How to change the database on the fly in python using TurboGear framework? | 1 | mongodb,python-3.x,turbogears2 | 0 | 2012-05-08T08:42:00.000 |
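A rough sketch of that routing session; the engine URLs and the mechanism for picking the active database are placeholders:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engines = {
    "db_a": create_engine("postgresql://localhost/db_a"),
    "db_b": create_engine("postgresql://localhost/db_b"),
}
current_db = "db_a"  # e.g. set per request from the user's selection

class RoutingSession(Session):
    def get_bind(self, mapper=None, clause=None):
        # Called whenever the session needs an engine.
        return engines[current_db]
```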
I have one table with a Time32 column and a large number of rows. My problem is as follows.
When my table reaches a thousand million rows, I want to start archiving every row older than a specified value. To create the query I will use the Time32 column, which represents the timestamp for the collected data in a row. So, using this query, I want to delete... | 1 | 1 | 0.197375 | 0 | false | 10,497,547 | 0 | 1,114 | 1 | 0 | 0 | 10,496,821 | The general way to archive records from one table of a given database to another is to copy the records into the target table, and then delete the same records from the origin table.
That said, depending on your database engine and the capabilities of the language built on top of it, you can write atomic query comma... | 1 | 0 | 0 | Pytables - Delete rows from table by some criteria | 1 | python,database,python-2.7,hdf5,pytables | 0 | 2012-05-08T10:26:00.000 |
I looked at the sqlite.org docs, but I am new to this, so bear with me. (I have a tiny bit of experience with MySQL, and I think using it would be overkill for what I am trying to do with my application.)
From what I understand, I can initially create an SQLite db file locally on my Mac and add entries to it using a ... | 0 | 1 | 1.2 | 0 | true | 10,518,010 | 0 | 143 | 1 | 0 | 0 | 10,517,900 | I could upload the db file to any web hosting service to any directory
Supposing that the service has the libraries installed to handle sqlite, and that sqlite is installed.
Would I be able to run a Python script that writes to SQLite
Yes, well, maybe. As of Python 2.5, Python includes sqlite support as part of its... | 1 | 0 | 0 | Understanding SQLite conceptually | 1 | python,sqlite,web-hosting | 0 | 2012-05-09T14:13:00.000 |
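For orientation, the entire client/server question goes away with sqlite3: the standard library opens the .db file directly. A minimal sketch (the filename is a placeholder):

```python
import sqlite3

conn = sqlite3.connect("app.db")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS notes (body TEXT)")
cur.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
conn.commit()
print(cur.execute("SELECT body FROM notes").fetchall())
conn.close()
```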
I'm trying to install MYSQLdb on a windows client. The goal is, from the Windows client, run a python script that connects to a MySQL server on a LINUX client. Looking at the setup code (and based on the errors I am getting when I try to run setup.py for mysqldb, it appears that I have to have my own version of MySQL... | 0 | 1 | 0.066568 | 0 | false | 10,541,253 | 0 | 1,771 | 1 | 0 | 0 | 10,541,085 | You don't need the entire MySQL database server, only the MySQL client libraries. | 1 | 0 | 0 | Install MYSQLdb python module without MYSQL local install | 3 | python,mysql,windows | 0 | 2012-05-10T19:44:00.000 |
I know that Pyramid comes with a scaffold for SQLAlchemy. But what if I'm using the pyramid_jqm scaffold? How would you integrate or use SQLAlchemy then? When I create a model.py and import from sqlalchemy, I get an error that it couldn't find the module. | 0 | 2 | 1.2 | 0 | true | 10,555,714 | 1 | 101 | 1 | 0 | 0 | 10,551,042 | You have to set up your project in the same way that the alchemy scaffold is constructed. Put "sqlalchemy" in your setup.py requires field and run "python setup.py develop" to install the dependency. This is all just Python and unrelated to Pyramid. | 1 | 0 | 0 | Using sqlalchemy in pyramid_jqm | 1 | python,sqlalchemy,pyramid | 0 | 2012-05-11T12:08:00.000 |
I am absolute beginner using google app engine with python 2.7. I was successful with creating helloworld app, but then any changes I do to the original app doesn't show in localhost:8080. Is there a way to reset/refresh the localhost. I tried to create new projects/directories with different content but my localhost c... | 0 | 2 | 0.132549 | 0 | false | 10,575,238 | 1 | 437 | 3 | 1 | 0 | 10,575,184 | Those warnings shouldn't prevent you from seeing new 'content,' they simply mean that you are missing some libraries necessary to run local versions of CloudSQL (MySQL) and the Images API.
First to do is try to clear your browser cache. What changes did you make to your Hello World app? | 1 | 0 | 0 | Localhost is not refreshing/reseting | 3 | python,google-app-engine | 0 | 2012-05-13T20:57:00.000 |
I am absolute beginner using google app engine with python 2.7. I was successful with creating helloworld app, but then any changes I do to the original app doesn't show in localhost:8080. Is there a way to reset/refresh the localhost. I tried to create new projects/directories with different content but my localhost c... | 0 | 0 | 0 | 0 | false | 10,593,822 | 1 | 437 | 3 | 1 | 0 | 10,575,184 | Press CTRL-F5 in your browser, while on the page. Forces a cache refresh. | 1 | 0 | 0 | Localhost is not refreshing/reseting | 3 | python,google-app-engine | 0 | 2012-05-13T20:57:00.000 |
I am absolute beginner using google app engine with python 2.7. I was successful with creating helloworld app, but then any changes I do to the original app doesn't show in localhost:8080. Is there a way to reset/refresh the localhost. I tried to create new projects/directories with different content but my localhost c... | 0 | 0 | 0 | 0 | false | 41,388,817 | 1 | 437 | 3 | 1 | 0 | 10,575,184 | You can try opening up the DOM reader (Mac: alt+command+i, Windows: shift+control+i) the reload the page. It's weird, but it works for me. | 1 | 0 | 0 | Localhost is not refreshing/reseting | 3 | python,google-app-engine | 0 | 2012-05-13T20:57:00.000 |
I have found that ultramysql meets my requirement, but it has no documentation and no Windows binary package.
I have a program heavy on internet downloads and MySQL inserts, so I use gevent to solve the multi-download-task problem. After I have downloaded and parsed the web pages, I need to insert the data into m... | 4 | 1 | 1.2 | 0 | true | 12,335,813 | 1 | 1,197 | 2 | 1 | 0 | 10,580,835 | Postgres may be better suited due to its asynchronous capabilities | 1 | 0 | 0 | How to use mysql in gevent based programs in python? | 2 | python,mysql,gevent | 0 | 2012-05-14T09:41:00.000 |
I have found that ultramysql meets my requirement, but it has no documentation and no Windows binary package.
I have a program heavy on internet downloads and MySQL inserts, so I use gevent to solve the multi-download-task problem. After I have downloaded and parsed the web pages, I need to insert the data into m... | 4 | 1 | 0.099668 | 0 | false | 13,006,283 | 1 | 1,197 | 2 | 1 | 0 | 10,580,835 | I think one solution is to use pymysql. Since pymysql uses Python sockets, after monkey patching it should work with gevent. | 1 | 0 | 0 | How to use mysql in gevent based programs in python? | 2 | python,mysql,gevent | 0 | 2012-05-14T09:41:00.000 |
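A minimal sketch of the monkey-patch approach: patch sockets before importing pymysql so its network I/O yields to gevent's greenlets (connection details are placeholders):

```python
from gevent import monkey
monkey.patch_all()  # must happen before pymysql is imported

import pymysql  # pure-Python driver, so the patched sockets apply

def fetch_one(sql):
    conn = pymysql.connect(host="localhost", user="user",
                           password="secret", database="crawl")
    cur = conn.cursor()
    cur.execute(sql)
    row = cur.fetchone()
    conn.close()
    return row
```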
I'm currently developing an application which connects to a database using SQLAlchemy. The idea consists of having several instances of the application running on different computers using the same database. I want to be able to see changes in the database in all instances of the application once they are committed. I'm... | 2 | 0 | 1.2 | 0 | true | 10,602,194 | 0 | 1,036 | 1 | 0 | 0 | 10,601,947 | You said it: you are using SQLAlchemy's event interface, not that of the RDBMS, and SQLAlchemy does not communicate with the other instances connected to that DB.
I have a couchdb instance with database a and database b. They should contain identical sets of documents, except that the _rev property will be different, which, AIUI, means I can't use replication.
How do I verify that the two databases really do contain the same documents which are all otherwise 'equal'?
I've tried ... | 1 | 1 | 1.2 | 0 | true | 10,616,421 | 1 | 685 | 1 | 0 | 0 | 10,615,980 | If you want to make sure they're exactly the same, write a map job that emits the document path as the key, and the documents hash (generated any way you like) as the value. Do not include the _rev field in the hash generation.
You cannot reduce to a single hash because order is not guaranteed, but you can feed the res... | 1 | 0 | 0 | Compare two couchdb databases | 1 | couchdb,replication,couchdb-python | 0 | 2012-05-16T09:45:00.000 |
I have made a Python Ladon webservice and I run it on Ubuntu with Apache2 and mod_wsgi (I use Python 2.6).
The webservice connects to a PostgreSQL database with the psycopg2 Python module.
My problem is that the psycopg2.connection is closed (or destroyed) automatically after a little time (after about 1 or 2 minutes).
Th... | 0 | 1 | 0.197375 | 0 | false | 10,645,670 | 1 | 450 | 1 | 0 | 0 | 10,636,409 | If you are using mod_wsgi in embedded mode, especially with the prefork MPM for Apache, then it is likely that Apache is killing off the idle processes. Try using mod_wsgi daemon mode, which keeps processes persistent, and see if it makes a difference. | 1 | 0 | 0 | Python psycopg2 + mod_wsgi: connection is very slow and automatically close | 1 | python,web-services,apache2,mod-wsgi,psycopg2 | 1 | 2012-05-17T13:12:00.000 |
I wish to consume a .net webservice containing the results of SQL Server query using a Python client. I have used the Python Suds library to interface to the same web service but not with a set of results. How should I structure the data so it is efficiently transmitted and consumed by a Python client. There should be... | 0 | 1 | 0.197375 | 0 | false | 10,653,866 | 0 | 175 | 1 | 0 | 0 | 10,638,071 | Suds is a library to connect via SOAP, so you may already have blown "efficiently transmitted" out of the window, as this is a particularly verbose format over the wire. Your maximum data size is relatively small, and so should almost certainly be transmitted back in a single message so the SOAP overhead is incurred on... | 1 | 0 | 0 | SQL Query result via .net webservice to a non .net- Python client | 1 | .net,python,sql-server,web-services,suds | 0 | 2012-05-17T14:49:00.000 |
I'm seeing some unexpected behaviour with Flask-SQLAlchemy, and I don't understand what's going on:
If I make a change to a record using e.g. MySQL Workbench or Sequel Pro, the running app (whether running under WSGI on Apache, or from the command line) isn't picking up the change. If I reload the app by touching the W... | 1 | 1 | 0.197375 | 0 | false | 15,194,364 | 1 | 326 | 1 | 0 | 0 | 10,645,793 | Your app's SELECT is probably within its own transaction/session, so changes submitted by another session (e.g. a MySQL Workbench connection) are not yet visible to your SELECT. You can easily verify this by enabling the MySQL general log or by setting echo=True in your create_engine(...) call. Chances are you're st... | 1 | 0 | 0 | Flask SQLAlchemy not picking up changed records | 1 | python,flask-sqlalchemy | 0 | 2012-05-18T01:57:00.000 |
Background:
I'm trying to use a Google Map as an interface to mark out multiple polygons, that can be stored in a Postgres Database.
The Database will then be queried with a geocoded Longitude Latitude Point to determine which of the Drawn Polygons encompass the point.
Using Python and Django.
Question
How do I configu... | 0 | 1 | 0.099668 | 0 | false | 10,648,479 | 1 | 667 | 1 | 0 | 0 | 10,647,482 | "Using Python and Django" only, you're not going to do this. Obviously you're going to need Javascript.
So you may as well dump Google Maps and use an open-source web mapping framework. OpenLayers has a well-defined Javascript API which will let you do exactly what you want. Examples in the OpenLayers docs show how.
Yo... | 1 | 0 | 0 | Mark Out Multiple Delivery Zones on Google Map and Store in Database | 2 | python,django,postgresql,google-maps,postgis | 0 | 2012-05-18T06:01:00.000 |
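On the "query with a geocoded point" half of the question, a hedged sketch with psycopg2 + PostGIS; the delivery_zones table, geom column, and SRID are assumptions:

```python
import psycopg2

conn = psycopg2.connect("dbname=zones user=me")  # placeholder DSN
cur = conn.cursor()
lng, lat = -0.1276, 51.5072  # geocoded point, lon/lat order
cur.execute(
    """
    SELECT id, name
    FROM delivery_zones
    WHERE ST_Contains(geom, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
    """,
    (lng, lat),
)
print(cur.fetchall())  # every drawn polygon that contains the point
conn.close()
```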
Is there a way to avoid duplicate files in Mongo GridFS?
Or do I have to do that via application code (I am using pymongo)? | 5 | 1 | 0.099668 | 0 | false | 10,648,760 | 0 | 2,727 | 2 | 0 | 0 | 10,648,729 | You could use an MD5 hash and compare the new hash with the existing ones before saving the file. | 1 | 0 | 1 | Mongo: avoid duplicate files in gridfs | 2 | python,mongodb,gridfs | 0 | 2012-05-18T07:48:00.000 |
Is there a way to avoid duplicate files in Mongo GridFS?
Or do I have to do that via application code (I am using pymongo)? | 5 | 5 | 1.2 | 0 | true | 10,650,262 | 0 | 2,727 | 2 | 0 | 0 | 10,648,729 | The MD5 sum is already part of Mongo's GridFS metadata, so you could simply set a unique index on that field and the server will refuse to store the file. No need to compare on the client side. | 1 | 0 | 1 | Mongo: avoid duplicate files in gridfs | 2 | python,mongodb,gridfs | 0 | 2012-05-18T07:48:00.000 |
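A minimal sketch of that index, assuming the default GridFS prefix fs (database name is a placeholder):

```python
from pymongo import MongoClient

db = MongoClient().mydb  # placeholder database name
# With this in place, storing a file whose contents (md5) already
# exist raises a duplicate-key error instead of saving a copy.
db.fs.files.create_index("md5", unique=True)
```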
Are there any good example projects which use SQLAlchemy (with Python classes) that I can look into (with at least some basic database operations - CRUD)?
I believe that looking into someone else's code is a good way to learn any programming language.
Thanks! | 18 | 0 | 0 | 0 | false | 10,778,146 | 0 | 12,180 | 1 | 0 | 0 | 10,656,426 | What kind of environment are you looking to work with on top of SQLAlchemy?
Most likely, if you are using a popular web framework like django, Flask or Pylons, you can find many examples and tutorials specific to that framework that include SQLAlchemy.
This will boost your knowledge both with SQLAlchemy and whatever e... | 1 | 0 | 0 | SQLAlchemy Example Projects | 2 | python,sqlalchemy | 0 | 2012-05-18T16:32:00.000 |
I am confused about why Python needs a cursor object. I know JDBC, where the database connection is quite intuitive, but in Python I am confused by the cursor object. I am also unsure about the difference between the cursor.close() and connection.close() functions in terms of resource release. | 41 | 5 | 0.321513 | 0 | false | 10,660,537 | 0 | 17,423 | 1 | 0 | 0 | 10,660,411 | The connection object is your connection to the database; close that when you're done talking to the database altogether. The cursor object is an iterator over a result set from a query; close those when you're done with that result set. | 1 | 0 | 0 | difference between cursor and connection objects | 3 | python,python-db-api | 0 | 2012-05-18T22:16:00.000 |
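A small illustration of the two lifecycles (sqlite3 is used only because it ships with Python):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the connection: one per database

cur = conn.cursor()                 # a cursor: one per query/result set
cur.execute("SELECT 1")
print(cur.fetchone())
cur.close()                         # done with this result set

cur = conn.cursor()                 # a fresh cursor on the same connection
cur.execute("SELECT 2")
print(cur.fetchone())
cur.close()

conn.close()                        # done talking to the database entirely
```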
I am thinking about creating an open source data management web application for various types of data.
A privileged user must be able to
add new entity types (for example a 'user' or a 'family')
add new properties to entity types (for example 'gender' to 'user')
remove/modify entities and properties
These will be ... | 20 | 6 | 1 | 0 | false | 10,792,940 | 1 | 4,158 | 2 | 0 | 0 | 10,672,939 | What you're asking about is a common requirement in many systems -- how to extend a core data model to handle user-defined data. That's a popular requirement for packaged software (where it is typically handled one way) and open-source software (where it is handled another way).
The earlier advice to learn more about ... | 1 | 0 | 0 | Which database model should I use for dynamic modification of entities/properties during runtime? | 4 | python,database,dynamic,sqlalchemy,redis | 0 | 2012-05-20T11:16:00.000 |
I am thinking about creating an open source data management web application for various types of data.
A privileged user must be able to
add new entity types (for example a 'user' or a 'family')
add new properties to entity types (for example 'gender' to 'user')
remove/modify entities and properties
These will be ... | 20 | 3 | 1.2 | 0 | true | 10,707,420 | 1 | 4,158 | 2 | 0 | 0 | 10,672,939 | So, if you conceptualize your entities as "documents," then this whole problem maps onto a no-sql solution pretty well. As commented, you'll need to have some kind of model layer that sits on top of your document store and performs tasks like validation, and perhaps enforces (or encourages) some kind of schema, becaus... | 1 | 0 | 0 | Which database model should I use for dynamic modification of entities/properties during runtime? | 4 | python,database,dynamic,sqlalchemy,redis | 0 | 2012-05-20T11:16:00.000 |
I have a large SQLServer database on my current hosting site...
and
I would like to import it into Google BigData.
Is there a method for this? | 0 | 1 | 0.197375 | 0 | false | 10,713,425 | 0 | 108 | 1 | 0 | 0 | 10,705,572 | I think that the answer is that there is no general recipe for doing this. In fact, I don't even think it makes sense to have a general recipe ...
What you need to do is to analyse the SQL schemas and work out an appropriate mapping to BigData schemas. Then you figure out how to migrate the data. | 1 | 0 | 0 | Porting data from SQLServer to BigData | 1 | python,sql-server,bigdata | 0 | 2012-05-22T15:52:00.000 |
I need to copy an existing neo4j database in Python. I don't even need it for backup, just to play around with while keeping the original database untouched. However, there is nothing about copy/backup operations in the neo4j.py documentation (I am using the Python embedded binding).
Can I just copy the whole folder with the... | 1 | 2 | 1.2 | 0 | true | 10,736,999 | 0 | 310 | 1 | 0 | 0 | 10,724,345 | Yes,
you can copy the whole DB directory when you have cleanly shut down the DB for backup. | 1 | 0 | 0 | Copy neo4j database from python | 1 | python,copy,backup,neo4j | 0 | 2012-05-23T16:44:00.000 |
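With the embedded database cleanly shut down, the copy really is just a directory copy; a minimal sketch (paths are placeholders):

```python
import shutil

# graphdb.shutdown()  # make sure the embedded DB is stopped first
shutil.copytree("data/graph.db", "data/graph.db.playground")
```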
Let's say I get sales data every 15 minutes. The sales transactions are stored in a mysql database. I need to be able to graph this data, and allow the user to re-size the scale of time. The info would be graphed on a django website.
How would I go about doing this, and are there any open source tools that I could loo... | 0 | 1 | 0.066568 | 0 | false | 10,779,681 | 0 | 3,292 | 1 | 0 | 0 | 10,779,244 | Highcharts has awesome features, and you can also build pivot charts with it, but they will charge you. You can also look at PyChart. | 1 | 0 | 0 | How to graph mysql data in python? | 3 | python,mysql,sql | 0 | 2012-05-28T04:10:00.000 |
In google app engine, can I call "get_or_insert" from inside a transaction?
The reason I ask is because I'm not sure if there is some conflict with having this run its own transaction inside an already running transaction.
Thanks! | 2 | 2 | 0.197375 | 0 | false | 10,791,742 | 1 | 308 | 1 | 1 | 0 | 10,790,381 | No. get_or_insert is syntactic sugar for a transactional function that fetches or inserts a record. You can implement it yourself trivially, but that will only work if the record you're operating on is in the same entity group as the rest of the entities in the current transaction, or if you have cross-group transactio... | 1 | 0 | 0 | In app engine, can I call "get_or_insert" from inside a transaction? | 2 | python,google-app-engine | 0 | 2012-05-28T21:01:00.000 |
In db.py, I can use a function (func insert) to insert data into SQLite correctly.
Now I want to insert data into SQLite through python-fastcgi; in
fastcgi (just named post.py) I can get the request data correctly, but
when I call db.insert, it gives me an internal server error.
I already did chmod 777 sqlite.db. Anyo... | 2 | 4 | 1.2 | 0 | true | 10,796,243 | 0 | 1,289 | 1 | 0 | 0 | 10,793,042 | Finally I found the answer:
The sqlite3 library also needs write permissions on the directory that contains the database file, probably because it needs to create a lockfile.
Therefore, when I use SQL directly to insert data there is no problem, but when I do it through the web (cgi, fastcgi, etc.) there is an error.
Just add wri... | 1 | 0 | 0 | sqlite3 insert using python and python cgi | 1 | python,sqlite,fastcgi | 0 | 2012-05-29T04:23:00.000 |
I have run a few trials and there seems to be some improvement in speed if I set autocommit to False.
However, I am worried that if I do only one commit at the end of my code, the database rows will not be updated. So, for example, if I do several updates to the database and none are committed, does querying the database then give ... | 2 | 0 | 0 | 0 | false | 10,803,049 | 0 | 1,170 | 2 | 0 | 0 | 10,803,012 | As long as you use the same connection, the database should show you a consistent view on the data, e.g. with all changes made so far in this transaction.
Once you commit, the changes will be written to disk and be visible to other (new) transactions and connections. | 1 | 0 | 0 | Does setting autocommit to true take longer than batch committing? | 3 | python,mysql,odbc,pyodbc | 0 | 2012-05-29T16:22:00.000 |
I have run a few trials and there seems to be some improvement in speed if I set autocommit to False.
However, I am worried that doing one commit at the end of my code, the database rows will not be updated. So, for example, I do several updates to the database, none are committed, does querying the database then give ... | 2 | 1 | 0.066568 | 0 | false | 10,803,230 | 0 | 1,170 | 2 | 0 | 0 | 10,803,012 | The default transaction mode for InnoDB is REPEATABLE READ, all the read will be consistent within a transaction. If you insert rows and query them in the same transaction, you will not see the newly inserted row, but they will be stored when you commit the transaction. If you want to see the newly inserted row before ... | 1 | 0 | 0 | Does setting autocommit to true take longer than batch committing? | 3 | python,mysql,odbc,pyodbc | 0 | 2012-05-29T16:22:00.000 |
From someone who has a django application in a non-trivial production environment, how do you handle database migrations? I know there is south, but it seems like that would miss quite a lot if anything substantial is involved.
The other two options (that I can think of or have used) is doing the changes on a test dat... | 22 | 1 | 0.033321 | 0 | false | 10,872,504 | 1 | 11,397 | 2 | 0 | 0 | 10,826,266 | South isnt used everywhere. Like in my orgainzation we have 3 levels of code testing. One is local dev environment, one is staging dev enviroment, and third is that of a production .
Local Dev is on the developers hands where he can play according to his needs. Then comes staging dev which is kept identical to product... | 1 | 0 | 0 | Database migrations on django production | 6 | python,mysql,django,migration,django-south | 0 | 2012-05-31T01:12:00.000 |
From someone who has a django application in a non-trivial production environment, how do you handle database migrations? I know there is south, but it seems like that would miss quite a lot if anything substantial is involved.
The other two options (that I can think of or have used) is doing the changes on a test dat... | 22 | 0 | 0 | 0 | false | 70,559,647 | 1 | 11,397 | 2 | 0 | 0 | 10,826,266 | If its not trivial, you should have pre-prod database/ app that mimic the production one. To avoid downtime on production. | 1 | 0 | 0 | Database migrations on django production | 6 | python,mysql,django,migration,django-south | 0 | 2012-05-31T01:12:00.000 |
I am using Python (fastcgi), lighttpd, and sqlite3 for the server.
The sqlite3 data is updated every weekend.
That means every user gets the same data from the server before the weekend, and the server queries the database for every user's request.
My question is:
Is there any way to cache data for users, so the server uses the cached data to respond to all users before u... | 0 | 1 | 1.2 | 0 | true | 10,843,435 | 0 | 167 | 1 | 0 | 0 | 10,843,191 | You can use a cache such as memcached to store it once retrieved. | 1 | 0 | 0 | cache data in python and sqlite3 | 1 | python,sqlite,fastcgi,lighttpd | 0 | 2012-06-01T01:04:00.000 |
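A hedged sketch with python-memcached: serve the weekly result from the cache and fall back to SQLite only on a miss (key name, server address, and TTL are placeholders):

```python
import memcache

mc = memcache.Client(["127.0.0.1:11211"])

def get_weekly_data(query_db):
    data = mc.get("weekly_data")
    if data is None:
        data = query_db()  # hit SQLite only once per expiry
        mc.set("weekly_data", data, time=7 * 24 * 3600)
    return data
```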
I am trying to write a python program for appending live stock quotes from a csv file to an excel file (which is already open) using xlrd and xlwt.
The task is summarised below.
From my stock-broker's application, a csv file is continually being updated on my hard disk.
I wish to write a program which, when run, would ... | 1 | 1 | 0.197375 | 1 | false | 10,857,757 | 0 | 1,299 | 1 | 0 | 0 | 10,851,726 | Not directly. xlutils can use xlrd and xlwt to copy a spreadsheet, and appending to a "to be written" worksheet is straightforward. I don't think reading the open spreadsheet is a problem -- but xlwt will not write to the open book/sheet.
You might write an Excel VBA macro to draw the graphs. In principle, I think a... | 1 | 0 | 0 | xlrd - append data to already opened workbook | 1 | python,xlrd,xlwt | 0 | 2012-06-01T14:01:00.000 |
I am trying to save an array of dates. I am providing a list of date objects, yet psycopg2 is throwing the above error.
Any thoughts on how I can work around this? | 1 | 1 | 1.2 | 0 | true | 10,914,900 | 0 | 1,679 | 1 | 0 | 0 | 10,854,532 | This is a PostgreSQL error: you need an explicit cast. Add ::date[] after the value or the placeholder. | 1 | 0 | 1 | psycopg2 column is of type date[] but expression is of type text[] | 1 | python,django,psycopg2 | 0 | 2012-06-01T17:03:00.000 |
I have an open source PHP website and I intend to modify/translate (mostly constant strings) it so it can be used by Japanese users.
The original code is PHP+MySQL+Apache and written in English with charset=utf-8
I want to change, for example, the word "login" into Japanese counterpart "ログイン" etc
I am not sure whether ... | 2 | 2 | 0.132549 | 0 | false | 10,868,488 | 0 | 148 | 2 | 0 | 0 | 10,868,473 | If it's in the file, then yes, you will need to save the file as UTF-8.
If it's is in the database, you do not need to save the PHP file as UTF-8.
In PHP, strings are basically just binary blobs. You will need to save the file as UTF-8 so the correct bytes are read in. In theory, if you saved the raw bytes in an ANSI... | 1 | 0 | 1 | PHP for Python Programmers: UTF-8 Issues | 3 | php,python,mysql,apache,utf-8 | 1 | 2012-06-03T06:52:00.000 |
I have an open source PHP website and I intend to modify/translate (mostly constant strings) it so it can be used by Japanese users.
The original code is PHP+MySQL+Apache and written in English with charset=utf-8
I want to change, for example, the word "login" into Japanese counterpart "ログイン" etc
I am not sure whether ... | 2 | 0 | 0 | 0 | false | 10,868,497 | 0 | 148 | 2 | 0 | 0 | 10,868,473 | If the file contains UTF-8 characters then save it with UTF-8. Otherwise you can save it in any format. One thing you should be aware of is that the PHP interpreter does not support the UTF-8 byte order mark so make sure you save it without that. | 1 | 0 | 1 | PHP for Python Programmers: UTF-8 Issues | 3 | php,python,mysql,apache,utf-8 | 1 | 2012-06-03T06:52:00.000 |
Using CGI scripts, I can run single Python files on my server and then use their output on my website.
However, I have a more complicated program on my computer that I would like to run on the server. It involves several modules I have written myself, and the sqlite3 module built into Python. The program involves readin... | 0 | 0 | 0 | 0 | false | 10,900,387 | 0 | 88 | 1 | 0 | 0 | 10,900,319 | I suggest you look in the log of your server to find out what caused the 500 error. | 1 | 0 | 0 | Importing Python files into each other on a web server | 2 | python,sqlite,web | 0 | 2012-06-05T15:34:00.000 |
It looks like this is what e.g. MongoEngine does. The goal is to have model files be able to access the db without having to explicitly pass around the context. | 2 | 2 | 0.379949 | 0 | false | 10,907,158 | 1 | 877 | 1 | 0 | 0 | 10,906,477 | Pyramid has nothing to do with it. The global needs to handle whatever mechanism the WSGI server is using to serve your application.
For instance, most servers use a separate thread per request, so your global variable needs to be threadsafe. gunicorn and gevent are served using greenlets, which is a different mechanic... | 1 | 0 | 0 | In Pyramid, is it safe to have a python global variable that stores the db connection? | 1 | python,pyramid | 0 | 2012-06-05T23:41:00.000 |
Is there a difference if I use """...""" in the SQL of cursor.execute? Even if there is any slight difference, please tell. | 1 | 0 | 0 | 0 | false | 10,910,268 | 0 | 125 | 1 | 0 | 0 | 10,910,246 | No, other than that the string can contain newlines. | 2 | 0 | 0 | What is the use of """...""" in python instead of "..." or '...', especially in MySQLdb cursor.execute | 2 | python,sql,string,mysql-python | 0 | 2012-06-06T07:55:00.000 |
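An illustration of the only practical difference: triple quotes let the SQL span multiple lines.

```python
sql_single = "SELECT id, name FROM users WHERE active = %s"
sql_triple = """
    SELECT id, name
    FROM users
    WHERE active = %s
"""
# cursor.execute(sql_single, (1,)) and cursor.execute(sql_triple, (1,))
# behave identically.
```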
Just started to use Mercurial. Wow, nice application. I moved my database file out of the code directory, but I was wondering about the .pyc files. I didn't include them on the initial commit. The documentation about the .hgignore file includes an example to exclude *.pyc, so I think I'm on the right track.
I am wonde... | 7 | 5 | 0.244919 | 0 | false | 10,920,888 | 1 | 8,119 | 2 | 0 | 0 | 10,920,423 | Usually you are safe, because *.pyc are regenerated if the corresponding *.py changes its content.
It is problematic if you delete a *.py file and you are still importing from it in another file. In this case you are importing from the *.pyc file if it still exists. But this will be a bug in your code and is not really ... | 1 | 0 | 0 | What to do with pyc files when Django or python is used with Mercurial? | 4 | python,django,mercurial,pyc | 0 | 2012-06-06T18:58:00.000 |
Just started to use Mercurial. Wow, nice application. I moved my database file out of the code directory, but I was wondering about the .pyc files. I didn't include them on the initial commit. The documentation about the .hgignore file includes an example to exclude *.pyc, so I think I'm on the right track.
I am wonde... | 7 | 0 | 0 | 0 | false | 10,920,511 | 1 | 8,119 | 2 | 0 | 0 | 10,920,423 | Sure if you have a .pyc file from an older version of the same module python will use that. Many times I have wondered why my program wasn't reflecting the changes I made, and realized it was because I had old pyc files.
If this means that .pyc are not reflecting your current version then yes you will have to delete a... | 1 | 0 | 0 | What to do with pyc files when Django or python is used with Mercurial? | 4 | python,django,mercurial,pyc | 0 | 2012-06-06T18:58:00.000 |
I have a sqlite3 database that I created from Python (2.7) on a local machine, and am trying to copy it to a remote location. I ran "sqlite3 posts.db .backup posts.db.bak" to create a copy (I can use the original and this new copy just fine). But when I move the copied file to the remote location, suddenly every comman... | 0 | 0 | 0 | 0 | false | 10,922,927 | 0 | 624 | 1 | 0 | 0 | 10,922,394 | You did a .backup on the source system, but you don't mention doing a .restore on the target system. Please clarify.
You don't mention what versions of the sqlite3 executable you have on the source and target systems.
You don't mention how you transferred the .bak file from the source to the target.
Was the source db b... | 1 | 0 | 0 | How to safely move an SQLite3 database? | 1 | python,sqlite,copy | 0 | 2012-06-06T21:13:00.000 |
I have a desktop application that sends POST requests to a server where a Django app stores the results. The DB server and web server are not on the same machine, and it happens that sometimes the connectivity is lost for a very short time, which results in a connection error on some requests:
OperationalError: (2003, "Can't co... | 1 | 1 | 0.099668 | 0 | false | 10,935,789 | 1 | 2,536 | 1 | 0 | 0 | 10,930,459 | You could use a middleware with a process_view method and a try / except wrapping your call.
Or you could decorate your views and wrap the call there.
Or you could use class-based views with a base class that has a method decorator on its dispatch method, or an overridden dispatch.
Really, you have plenty of solutions.... | 1 | 0 | 0 | Django: how to properly handle a database connection error | 2 | python,mysql,django | 0 | 2012-06-07T11:00:00.000 |
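A sketch of the middleware option (old-style, pre-Django-1.10 middleware; the class name and response body are hypothetical):

    from MySQLdb import OperationalError
    from django.http import HttpResponse

    class DBErrorMiddleware(object):
        # Returning a response from process_view short-circuits Django's
        # normal view call, so we invoke the view ourselves in try/except.
        def process_view(self, request, view_func, view_args, view_kwargs):
            try:
                return view_func(request, *view_args, **view_kwargs)
            except OperationalError:
                resp = HttpResponse("Database temporarily unavailable.")
                resp.status_code = 503  # works on all Django versions
                return resp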
I can't find the "best" solution for a very simple problem (or maybe not so simple).
I have a classical set of data: posts attached to users, and comments attached to a post and to a user.
Now I can't decide how to build the schema/classes.
One way is to store the user_id inside the comments and inside the posts.
But what happens when I have 200 comments on a page?
... | 3 | 1 | 0.049958 | 0 | false | 10,932,004 | 1 | 919 | 1 | 0 | 0 | 10,931,889 | What I would do with mongodb would be to embed the user id into the comments (which are part of the structure of the "post" document).
Three simple hints for better performance (a sketch follows the list):
1) Make sure to create an index on the user_id
2) Use a comment pagination method to avoid querying the database 200 times
3) Caching is your f... | 1 | 0 | 0 | MongoDB: Embedded users into comments | 4 | python,mongodb,mongoalchemy,nosql | 0 | 2012-06-07T12:34:00.000 |
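A PyMongo sketch of hints 1 and 2 (the database name and collection layout are assumptions; use ensure_index instead of create_index on pymongo 2.x):

    from pymongo import MongoClient

    db = MongoClient()["blog"]  # hypothetical database name

    # Hint 1: index the embedded user_id inside each post's comments array.
    db.posts.create_index("comments.user_id")

    # Hint 2: paginate with $slice so a request never pulls all 200 comments.
    def comments_page(post_id, page, per_page=20):
        return db.posts.find_one(
            {"_id": post_id},
            {"comments": {"$slice": [page * per_page, per_page]}})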
I am making a little add-on for a game, and it needs to store information on a player:
username
ip-address
location in game
a list of alternate user names that have come from that IP, or alternate IP addresses that come from that user name
I read an article a while ago that said that unless I am storing a large amount... | 14 | 7 | 1 | 0 | false | 10,957,953 | 0 | 5,782 | 1 | 0 | 0 | 10,957,877 | Assuming by 'database' you mean 'relational database' - even the embedded databases like SQLite come with some overhead compared to a plain text file. But sometimes that overhead is worth it compared to rolling your own.
The biggest question you need to ask is whether you are storing relational data - whether things ... | 1 | 0 | 0 | When is it appropriate to use a database , in Python | 2 | python,database,flat-file | 0 | 2012-06-09T02:16:00.000 |
I am looking around trying to find out the max limit on the number of results I can get from a GQL query with NDB on Google App Engine. I am using an implementation with cursors, but it would be much faster if I retrieved them all at once. | 5 | 9 | 1 | 0 | false | 10,974,037 | 1 | 1,106 | 2 | 1 | 0 | 10,968,439 | This depends on lots of things, like the size of the entities and the number of values that need to be looked up in the index, so it's best to benchmark it for your specific application. Also beware that if you find that on a sunny day it takes e.g. 10 seconds to load all your items, that probably means that some small frac... | 1 | 0 | 0 | What is the Google Appengine Ndb GQL query max limit? | 2 | python,google-app-engine,gql,app-engine-ndb | 0 | 2012-06-10T11:51:00.000
I am looking around trying to find out the max limit on the number of results I can get from a GQL query with NDB on Google App Engine. I am using an implementation with cursors, but it would be much faster if I retrieved them all at once. | 5 | 7 | 1.2 | 0 | true | 10,969,575 | 1 | 1,106 | 2 | 1 | 0 | 10,968,439 | Basically you don't have the old limit of 1000 entities per query anymore, but consider using a reasonable limit, because you can hit a timeout error, and it's better to get them in batches so users won't wait during load time. | 1 | 0 | 0 | What is the Google Appengine Ndb GQL query max limit? | 2 | python,google-app-engine,gql,app-engine-ndb | 0 | 2012-06-10T11:51:00.000
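A hedged NDB sketch of the batched approach using fetch_page (the model is hypothetical):

    from google.appengine.ext import ndb

    class Item(ndb.Model):  # hypothetical model
        name = ndb.StringProperty()

    def fetch_batch(cursor=None, batch_size=100):
        # Returns one batch plus a cursor for the next call, instead of
        # loading every entity in a single long-running request.
        results, next_cursor, more = Item.query().fetch_page(
            batch_size, start_cursor=cursor)
        return results, next_cursor, more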
When I say 'equivalent', I mean an ORM that allows for the same work-style. That is:
Setting up a database
Dispensing and editing 'beans' (table rows) as if the table was already ready, while the table is being created behind the scenes
Reviewing, indexing and polishing the table structure before production
Thanks fo... | 1 | 0 | 1.2 | 0 | true | 13,714,374 | 0 | 700 | 1 | 0 | 0 | 10,987,162 | Short answer: there is a proof-of-concept called PyBean, as answered by Gabor de Mooij, but it barely offers any features and cannot be used. There are no other Python libraries that work like PyBean. | 1 | 0 | 0 | Is there a RedBeanPHP equivalent for Python? | 2 | php,python,mysql,orm,redbean | 0 | 2012-06-11T20:35:00.000
Here's the scenario:
I have a URL in a MySQL database that contains Unicode characters. The database uses the Latin-1 encoding. Now, when I read the record from MySQL using Python, it gets converted to Unicode because all strings follow the Unicode format in Python.
I want to write the URL into a text file -- to do so, it needs t... | 0 | 0 | 1.2 | 0 | true | 10,992,555 | 0 | 1,752 | 1 | 0 | 0 | 10,990,496 | You most probably need to set your mysql shell client to use utf8.
You can set it either in the mysql shell directly by running SET CHARACTER SET utf8.
Or by adding default-character-set=utf8 to your ~/.my.cnf. | 1 | 0 | 0 | Unicode to UTF-8 encoding issue when importing SQL text file into MySQL | 1 | python,mysql,unicode,encoding,utf-8 | 0 | 2012-06-12T04:22:00.000 |
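On the Python side, the equivalent fix is to request utf8 on the connection itself; a sketch with MySQLdb (credentials are placeholders):

    import MySQLdb

    # Ask for utf8 on the connection so text round-trips without mojibake,
    # regardless of the latin1 default on the server/table.
    conn = MySQLdb.connect(host="localhost", user="me", passwd="secret",
                           db="mydb", charset="utf8", use_unicode=True)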
I'm looking for a way to edit and save a specified cell in an Excel 2010 .xlsx file from Node.js. I realize that there may be no production-ready solutions for Node.js at this time. However, Node.js supports C++ libraries, so could you suggest any suitable lib compatible with Node?
Also, I had an idea to process t... | 0 | 0 | 1.2 | 0 | true | 11,008,175 | 0 | 3,505 | 1 | 0 | 0 | 11,007,460 | Basically you have 2 possibilities:
node.js does not support C++ libraries, but it is possible to write bindings for node.js that interact with a C/C++ library. So you need to get your feet wet writing a C++ addon for V8 (the JavaScript engine behind node.js)
find a command line program which does what you want ... | 1 | 0 | 0 | Node.JS/C++/Python - edit Excel .xlsx file | 1 | c++,python,excel,node.js,read-write | 0 | 2012-06-13T02:27:00.000 |
Okay. We have a Rails web app which stores data in a MySQL database. The table design was not read-efficient, so we resorted to creating a separate set of read-only tables in MySQL and made all our internal API calls use those tables for reads. We used callbacks to keep the data in sync between both sets of tables. Now... | 2 | 1 | 1.2 | 0 | true | 11,014,025 | 1 | 296 | 1 | 0 | 0 | 11,013,976 | Yes, refactor the code to put a data web service in front of the database and let the Ruby and Python apps talk to the service. Let it maintain all integrity and business rules.
"Don't Repeat Yourself" - it's a good rule. | 1 | 0 | 0 | Maintaining data integrity in mysql when different applications are accessing it | 1 | python,mysql,ruby-on-rails,database,triggers | 0 | 2012-06-13T11:31:00.000 |
I have an Excel spreadsheet (version 1997-2003) and another nonspecific database file (a .csy file; I am assuming it can be parsed as a text file, as that is what it appears to be). I need to take information from both sheets, match them up, put them on one line, and print it to a text file. I was going to use python fo... | 1 | -1 | -0.066568 | 0 | false | 11,020,968 | 0 | 2,962 | 1 | 0 | 0 | 11,020,919 | Python is beginner-friendly and is good with string manipulation, so it's a good choice. I have no idea how easy awk is to learn without programming experience, but I would consider it as it's more or less optimized for processing CSVs. | 1 | 0 | 1 | First time writing a script, not sure what language to use (parsing excel and other files) | 3 | python,excel,file-io,scripting | 0 | 2012-06-13T18:16:00.000
We are writing an inventory system and I have some questions about SQLAlchemy (PostgreSQL) and transactions/sessions. This is a web app using TG2; not sure this matters, but too much info is never bad.
How can I make sure that when changing inventory quantities I don't run into race conditions? If I understand it corre... | 2 | 3 | 1.2 | 0 | true | 11,034,199 | 0 | 2,935 | 1 | 0 | 0 | 11,033,892 | If two transactions try to set the same value at the same time, one of them will fail. The one that loses will need error handling. For your particular example, you will want to query for the number of parts and update the number of parts in the same transaction.
There is no race condition on sequence numbers. Save a re... | 1 | 0 | 0 | SQLAlchemy(Postgresql) - Race Conditions | 2 | python,postgresql,web-applications,sqlalchemy,turbogears2 | 0 | 2012-06-14T13:15:00.000 |
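One way to sidestep the read-then-write race entirely is a conditional atomic UPDATE; a hedged sketch (the table name and DSN are assumptions):

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://user:pw@localhost/inventory")

    def reserve_parts(part_id, n):
        # Decrement and check in one statement: the WHERE clause makes the
        # update a no-op if a concurrent transaction already took the stock.
        with engine.begin() as conn:
            result = conn.execute(
                text("UPDATE parts SET qty = qty - :n "
                     "WHERE id = :id AND qty >= :n"),
                {"n": n, "id": part_id})
            return result.rowcount == 1  # False -> insufficient stock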
I am using mongoexport to export MongoDB data, which also has image data in binary format.
The export is done in CSV format.
I tried to read the image data from the CSV file into Python and tried to store it as an image file in .jpg format on disk.
But it seems that the data is corrupt and the image is not getting stored.
Has anybody come ac... | 1 | -1 | -0.099668 | 0 | false | 11,058,611 | 0 | 930 | 2 | 0 | 0 | 11,055,921 | Depending on how you stored the data, it may be prefixed with 4 bytes of size. Are the corrupt exports 4 bytes/GridFS chunk longer than you'd expect? | 1 | 0 | 1 | Can mongoexport be used to export images stored in binary format in mongodb | 2 | python,image,mongodb,csv | 0 | 2012-06-15T18:01:00.000
I am using mongoexport to export MongoDB data, which also has image data in binary format.
The export is done in CSV format.
I tried to read the image data from the CSV file into Python and tried to store it as an image file in .jpg format on disk.
But it seems that the data is corrupt and the image is not getting stored.
Has anybody come ac... | 1 | 0 | 0 | 0 | false | 11,056,533 | 0 | 930 | 2 | 0 | 0 | 11,055,921 | One thing to watch out for is an arbitrary 2MB BSON object size limit in several of 10gen's implementations. You might have to denormalize your image data and store it across multiple objects. | 1 | 0 | 1 | Can mongoexport be used to export images stored in binary format in mongodb | 2 | python,image,mongodb,csv | 0 | 2012-06-15T18:01:00.000
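Both answers point toward keeping the bytes out of CSV altogether; a GridFS round-trip sketch (the database and filenames are assumptions):

    import gridfs
    from pymongo import MongoClient

    db = MongoClient()["mydb"]
    fs = gridfs.GridFS(db)

    # Store and fetch the raw bytes directly; no text/CSV round-trip
    # that can mangle binary image data.
    with open("photo.jpg", "rb") as f:
        file_id = fs.put(f, filename="photo.jpg")

    with open("copy.jpg", "wb") as out:
        out.write(fs.get(file_id).read())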
I need to manipulate a large amount of numerical/textual data, say a total of 10 billion entries, which can theoretically be organized as 1000 tables of 10000*1000.
Most calculations need to be performed on a small subset of data each time (specific rows or columns), such that I don't need all the data at once.
Therefore... | 4 | 3 | 0.197375 | 0 | false | 11,058,566 | 0 | 3,289 | 1 | 0 | 0 | 11,058,409 | IMO simply use the file system with a file format that you can read/write in both MATLAB and Python. Databases usually imply a relational model (excluding the NoSQL ones), which would only add complexity here.
Being more MATLAB-inclined, you can directly manipulate MAT-files in SciPy with scipy.io.loadmat/scipy.io.sav... | 1 | 0 | 0 | What the simplest database to use with both Python and Matlab? | 3 | python,database,matlab | 0 | 2012-06-15T21:27:00.000 |
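A minimal round-trip with the scipy.io functions the answer names:

    import numpy as np
    from scipy.io import loadmat, savemat

    # Write a dict of arrays to a MAT-file that MATLAB can open directly.
    savemat("block.mat", {"A": np.random.rand(100, 50)})

    # Read it back; MATLAB variable names become dictionary keys.
    data = loadmat("block.mat")
    print(data["A"].shape)  # (100, 50)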
I'm looking for a search engine that I can point to a column in my database that supports advanced functions like spelling correction and "close to" results.
Right now I'm just using
SELECT <column> from <table> where <colname> LIKE %<searchterm>%
and I'm missing some results, particularly when users misspell items... | 3 | 3 | 0.197375 | 0 | false | 11,088,110 | 0 | 153 | 2 | 0 | 0 | 11,082,229 | Apache Solr is a great search engine that provides (1) n-gram indexing (searching not just for complete strings but also for partial substrings; this helps greatly in getting similar results) and (2) an out-of-the-box spell corrector based on a distance metric/edit distance (which will help you in getting a "did you mean c... | 1 | 0 | 0 | Search Engine for a single DB column | 3 | python,mysql,database,search | 0 | 2012-06-18T11:52:00.000
I'm looking for a search engine that I can point to a column in my database that supports advanced functions like spelling correction and "close to" results.
Right now I'm just using
SELECT <column> from <table> where <colname> LIKE %<searchterm>%
and I'm missing some results, particularly when users misspell items... | 3 | 1 | 0.066568 | 0 | false | 11,087,295 | 0 | 153 | 2 | 0 | 0 | 11,082,229 | I would suggest looking into open-source technologies like Sphinx Search. | 1 | 0 | 0 | Search Engine for a single DB column | 3 | python,mysql,database,search | 0 | 2012-06-18T11:52:00.000
I have a table in a Django app where one of the fields is called Order (as in sort order) and is an integer. Every time a new record is entered, the field auto-increments itself to the next number. My issue is that when a record is deleted, I would like the other records to shift a number up, and I can't find anything that would ... | 1 | 0 | 0 | 0 | false | 11,101,114 | 1 | 2,701 | 4 | 0 | 0 | 11,100,997 | Instead of deleting orders, you should create a boolean field (call it whatever you like, for example deleted) and set this field to 1 for "deleted" orders.
Messing with a serial field (which is what your auto-increment field is called in postgres) will lead to problems later; especially if you have forei... | 1 | 0 | 0 | Auto Increment Field in Django/Python | 7 | python,django,postgresql | 0 | 2012-06-19T12:32:00.000 |
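A sketch of that soft-delete approach (the model and field names are illustrative):

    from django.db import models

    class Order(models.Model):  # hypothetical model
        # Flip a flag instead of removing the row, so the serial
        # primary-key sequence is never disturbed.
        deleted = models.BooleanField(default=False)

        def soft_delete(self):
            self.deleted = True
            self.save()

    # Everyday queries then simply exclude flagged rows:
    # Order.objects.filter(deleted=False)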
I have a table in a Django app where one of the fields is called Order (as in sort order) and is an integer. Every time a new record is entered, the field auto-increments itself to the next number. My issue is that when a record is deleted, I would like the other records to shift a number up, and I can't find anything that would ... | 1 | 4 | 0.113791 | 0 | false | 11,101,064 | 1 | 2,701 | 4 | 0 | 0 | 11,100,997 | You are going to have to implement that feature yourself; I doubt very much that a relational db will do that for you, and for good reason: it means updating a potentially large number of rows when one row is deleted.
Are you sure you need this? It could become expensive. | 1 | 0 | 0 | Auto Increment Field in Django/Python | 7 | python,django,postgresql | 0 | 2012-06-19T12:32:00.000 |
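If you do decide to implement the shift yourself, an F() expression keeps it to one UPDATE statement rather than one save per row; a hedged sketch assuming a model with an integer order field:

    from django.db.models import F

    def delete_and_shift(obj):
        # Close the gap for every following row in a single query.
        order = obj.order
        obj.delete()
        type(obj).objects.filter(order__gt=order).update(order=F("order") - 1)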
I have a table in a Django app where one of the fields is called Order (as in sort order) and is an integer. Every time a new record is entered, the field auto-increments itself to the next number. My issue is that when a record is deleted, I would like the other records to shift a number up, and I can't find anything that would ... | 1 | -1 | -0.028564 | 0 | false | 11,101,032 | 1 | 2,701 | 4 | 0 | 0 | 11,100,997 | Try to set the value with the sequence type in Postgres using pgAdmin.
I have a table in a Django app where one of the fields is called Order (as in sort order) and is an integer. Every time a new record is entered, the field auto-increments itself to the next number. My issue is that when a record is deleted, I would like the other records to shift a number up, and I can't find anything that would ... | 1 | 0 | 0 | 0 | false | 15,074,698 | 1 | 2,701 | 4 | 0 | 0 | 11,100,997 | I came across this looking for something else and wanted to point something out:
By storing the order in a field in the same table as your data, you lose data integrity, or, if you index it, things will get very complicated if you hit a conflict. In other words, it's very easy to have a bug (or something else) give you ... | 1 | 0 | 0 | Auto Increment Field in Django/Python | 7 | python,django,postgresql | 0 | 2012-06-19T12:32:00.000
I am new to SQL/Python.
I was wondering if there is a way for me to sort or categorize expense items into three primary categories.
That is, I have a 56,000-row list with 100+ different expense categories, varying from things like Payroll and Credit Card Pmt to telephone, etc.
I would like to put them into three ca... | 0 | 0 | 0 | 0 | false | 11,121,498 | 0 | 163 | 1 | 0 | 0 | 11,121,395 | You should create a table called something like ExpenseCategories, with the columns ExpenseCategory, PrimaryCategory.
This table would have one row for each expense category (which you can enforce with a constraint if you like). You would then join this table with your existing data in SQL.
By the way, in Excel, you c... | 1 | 0 | 0 | Method for Sorting a list of expense categories into specific categories | 1 | python,sql | 0 | 2012-06-20T14:05:00.000 |
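A self-contained sketch of that mapping-table join, using sqlite3 so it runs as-is (the table and column names are assumptions):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE Expenses (Category TEXT, Amount REAL);
        CREATE TABLE ExpenseCategories (
            ExpenseCategory TEXT PRIMARY KEY,
            PrimaryCategory TEXT NOT NULL
        );
        INSERT INTO ExpenseCategories VALUES ('Payroll', 'Operating');
        INSERT INTO Expenses VALUES ('Payroll', 1200.0);
    """)

    # Roll the ~100 detailed categories up to the three primary ones.
    rows = conn.execute("""
        SELECT c.PrimaryCategory, SUM(e.Amount)
        FROM Expenses e
        JOIN ExpenseCategories c ON c.ExpenseCategory = e.Category
        GROUP BY c.PrimaryCategory
    """).fetchall()
    print(rows)  # [('Operating', 1200.0)]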
As a relatively new programmer, I have several times encountered situations where it would be beneficial for me to read and assemble program data from an external source rather than have it written in the code. This is mostly the case when there are a large number of objects of the same type. In such scenarios, object ... | 5 | 0 | 0 | 0 | false | 11,130,087 | 0 | 4,619 | 3 | 0 | 0 | 11,129,844 | I would be tempted to research a little into some GUI that could output graphviz (DOT format) with annotations, so you could create the rooms and links between them (a sort of graph). Then later, you might want another format to support heftier info.
But it should make it easy to create maps, links between rooms (containi... | 1 | 0 | 1 | Optimal format for simple data storage in python | 8 | python | 0 | 2012-06-20T23:59:00.000
As a relatively new programmer, I have several times encountered situations where it would be beneficial for me to read and assemble program data from an external source rather than have it written in the code. This is mostly the case when there are a large number of objects of the same type. In such scenarios, object ... | 5 | 5 | 0.124353 | 0 | false | 11,129,974 | 0 | 4,619 | 3 | 0 | 0 | 11,129,844 | Though there are good answers here already, I would simply recommend JSON for your purposes, for the sole reason that, as a new programmer, you will find it the most straightforward to read and translate: it has the most direct mapping to native Python data types (lists [] and dictionaries {}). Readability goes a l... | 1 | 0 | 1 | Optimal format for simple data storage in python | 8 | python | 0 | 2012-06-20T23:59:00.000
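A tiny round-trip showing that mapping (the room data is made up):

    import json

    rooms = {"cell": {"exits": {"north": "hallway"},
                      "desc": "A cramped stone cell."}}

    with open("rooms.json", "w") as f:
        json.dump(rooms, f, indent=2)

    with open("rooms.json") as f:
        loaded = json.load(f)  # back to plain dicts and lists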
As a relatively new programmer, I have several times encountered situations where it would be beneficial for me to read and assemble program data from an external source rather than have it written in the code. This is mostly the case when there are a large number of objects of the same type. In such scenarios, object ... | 5 | 1 | 0.024995 | 0 | false | 11,129,853 | 0 | 4,619 | 3 | 0 | 0 | 11,129,844 | If you want editability, YAML is the best option of the ones you've named, because it doesn't require <> or {} delimiters. | 1 | 0 | 1 | Optimal format for simple data storage in python | 8 | python | 0 | 2012-06-20T23:59:00.000
I would like to get some understanding of a question that I was pretty sure was clear to me. Is there any way to create a table, using psycopg2 or any other Python Postgres database adapter, with the name corresponding to the .csv file and (probably most important) with the columns that are specified in the .csv file? | 3 | 1 | 1.2 | 0 | true | 11,130,568 | 0 | 2,067 | 1 | 0 | 0 | 11,130,261 | I'll leave you to look at the psycopg2 library properly - this is off the top of my head (I have not had to use it for a while, but IIRC the documentation is ample).
The steps are:
Read column names from CSV file
Create "CREATE TABLE whatever" ( ... )
Maybe INSERT data
import os.path
my_csv_file = '/home/somewhere/file.csv'
... | 1 | 0 | 0 | Dynamically creating table from csv file using psycopg2 | 1 | python,postgresql,psycopg2 | 0 | 2012-06-21T00:59:00.000 |
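A hedged completion of the truncated snippet above (the path is the answer's placeholder; the DSN and typing every column as TEXT are simplifying assumptions):

    import csv
    import os.path
    import psycopg2

    my_csv_file = '/home/somewhere/file.csv'

    with open(my_csv_file) as f:
        columns = next(csv.reader(f))  # header row

    table = os.path.splitext(os.path.basename(my_csv_file))[0]
    # Identifiers cannot be bound as query parameters, so quote them here.
    ddl = 'CREATE TABLE "%s" (%s)' % (
        table, ", ".join('"%s" TEXT' % c for c in columns))

    conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
    cur = conn.cursor()
    cur.execute(ddl)
    conn.commit()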