Dataset schema (22 columns; "string" columns list their length range, numeric columns their min and max):

Question: string, lengths 25 to 7.47k
Q_Score: int64, 0 to 1.24k
Users Score: int64, -10 to 494
Score: float64, -1 to 1.2
Data Science and Machine Learning: int64, 0 to 1
is_accepted: bool, 2 classes
A_Id: int64, 39.3k to 72.5M
Web Development: int64, 0 to 1
ViewCount: int64, 15 to 1.37M
Available Count: int64, 1 to 9
System Administration and DevOps: int64, 0 to 1
Networking and APIs: int64, 0 to 1
Q_Id: int64, 39.1k to 48M
Answer: string, lengths 16 to 5.07k
Database and SQL: int64, 1 to 1 (every row in this slice is tagged Database and SQL)
GUI and Desktop Applications: int64, 0 to 1
Python Basics and Environment: int64, 0 to 1
Title: string, lengths 15 to 148
AnswerCount: int64, 1 to 32
Tags: string, lengths 6 to 90
Other: int64, 0 to 1
CreationDate: string, lengths 23 to 23
Title: Effective implementation of one-to-many relationship with Python NDB
Tags: python,google-app-engine,app-engine-ndb
Q_Id: 14,739,044 | A_Id: 14,749,034 | CreationDate: 2013-02-06T21:22:00.000
Q_Score: 11 | Users Score: 6 | Score: 1 | is_accepted: false | ViewCount: 1,389 | AnswerCount: 2 | Available Count: 2
Topics: System Administration and DevOps; Database and SQL
Question: I would like to hear your opinion about the effective implementation of one-to-many relationship with Python NDB. (e.g. Person(one)-to-Tasks(many)) In my understanding, there are three ways to implement it. Use 'parent' argument Use 'repeated' Structured property Use 'repeated' Key property I choose a way based on th...
Answer: One thing that most GAE users will come to realize (sooner or later) is that the datastore does not encourage design according to the formal normalization principles that would be considered a good idea in relational databases. Instead it often seems to encourage design that is unintuitive and anathema to established n...
Title: Effective implementation of one-to-many relationship with Python NDB
Tags: python,google-app-engine,app-engine-ndb
Q_Id: 14,739,044 | A_Id: 14,740,062 | CreationDate: 2013-02-06T21:22:00.000
Q_Score: 11 | Users Score: 7 | Score: 1 | is_accepted: false | ViewCount: 1,389 | AnswerCount: 2 | Available Count: 2
Topics: System Administration and DevOps; Database and SQL
Question: I would like to hear your opinion about the effective implementation of one-to-many relationship with Python NDB. (e.g. Person(one)-to-Tasks(many)) In my understanding, there are three ways to implement it. Use 'parent' argument Use 'repeated' Structured property Use 'repeated' Key property I choose a way based on th...
Answer: A key thing you are missing: How are you reading the data? If you are displaying all the tasks for a given person on a request, 2 makes sense: you can query the person and show all his tasks. However, if you need to query say a list of all tasks say due at a certain time, querying for repeated structured properties is ...
Title: Reading scientific numbers in xlrd
Tags: python,xlrd
Q_Id: 14,751,806 | A_Id: 14,854,783 | CreationDate: 2013-02-07T13:05:00.000
Q_Score: 2 | Users Score: 1 | Score: 0.099668 | is_accepted: false | ViewCount: 1,158 | AnswerCount: 2 | Available Count: 1
Topics: Database and SQL
Question: Pretty simple question but haven't been able to find a good answer. In Excel, I am generating files that need to be automatically read. They are read by an ID number, but the format I get is setting it as text. When using xlrd, I get this format: 5.5112E+12 When I need it in this format: 5511195414392 What is the b...
Answer: I used the CSV module to figure this out, as it read the cells correctly.
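The answer above sidesteps the issue by switching to CSV, but the xlrd behavior itself is recoverable: xlrd hands numeric cells back as Python floats, so an ID displayed as 5.5112E+12 is really the exact float 5511195414392.0, and formatting it as an integer restores the digits (for IDs within a float's roughly 15 significant digits). A minimal sketch, with a hypothetical helper name:

```python
def cell_to_id(value):
    """Render a numeric cell value as a plain integer string.

    xlrd returns numbers as floats; scientific notation such as
    5.5112E+12 is only the display form of 5511195414392.0.
    """
    if isinstance(value, float) and value.is_integer():
        return str(int(value))
    return str(value)

print(cell_to_id(5511195414392.0))  # -> 5511195414392
```

This keeps text cells untouched while normalizing integral floats, which is usually all an ID column needs.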
Title: How to write big set of data to xls file?
Tags: python,excel,hdf5
Q_Id: 14,754,090 | A_Id: 31,982,266 | CreationDate: 2013-02-07T14:56:00.000
Q_Score: 8 | Users Score: 1 | Score: 0.066568 | is_accepted: false | ViewCount: 5,375 | AnswerCount: 3 | Available Count: 1
Topics: Database and SQL
Question: I have really big database which I want write to xlsx/xls file. I already tried to use xlwt, but it allows to write only 65536 rows (some of my tables have more than 72k rows). I also found openpyxl, but it works too slow, and use huge amount of memory for big spreadsheets. Are there any other possibilities to write ex...
Answer: XlsxWriter worked for me. I tried openpyxl but it errored, on a sheet of about 22k rows by 400 columns.
Title: How to connect to a different database in OpenERP?
Tags: python,xml-rpc,openerp
Q_Id: 14,756,365 | A_Id: 14,796,657 | CreationDate: 2013-02-07T16:45:00.000
Q_Score: 1 | Users Score: 1 | Score: 0.197375 | is_accepted: false | ViewCount: 1,877 | AnswerCount: 1 | Available Count: 1
Topics: Web Development; Database and SQL
Question: How would one go about connecting to a different database based on which module is being used? Our scenario is as follows: We have a standalone application with its own database on a certain server and OpenERP running on different server. We want to create a module in OpenERP which can utilise entities on the standalon...
Answer: One way to connect to an external application is to create a connector module. There are already several connector modules that you can take a look at: the thunderbird and outlook plugins the joomla and magento modules the 'event moodle' module For example, the joomla connector uses a joomla plugin to handle the comm...
Title: Psycopg missing module in Django
Tags: python,django,pip,psycopg2,psycopg
Q_Id: 14,758,024 | A_Id: 15,337,328 | CreationDate: 2013-02-07T18:10:00.000
Q_Score: 2 | Users Score: 1 | Score: 1.2 | is_accepted: true | ViewCount: 1,586 | AnswerCount: 1 | Available Count: 1
Topics: Web Development; Database and SQL
Question: I have pip installed psycopg2, but when I try to runserver or syncdb in my Django project, it raises an error saying there is "no module named _psycopg". EDIT: the "syncdb" command now raises: django.core.exceptions.ImproperlyConfigured: ImportError django.contrib.admin: No module named _psycopg Thanks for your help
Answer: This was solved by performing a clean reinstall of django. There were apparently some dependencies missing that the recursive pip install did not seem to be able to resolve.
Title: Put different "schemas" into same MongoDB collection
Tags: python,performance,mongodb
Q_Id: 14,780,381 | A_Id: 14,780,990 | CreationDate: 2013-02-08T19:58:00.000
Q_Score: 0 | Users Score: 1 | Score: 1.2 | is_accepted: true | ViewCount: 339 | AnswerCount: 2 | Available Count: 1
Topics: Database and SQL; Python Basics and Environment
Question: TLDR; Are there drawbacks to putting two different types of documents into the same collection to save a round-trip to the database? So I have documents with children, and a list of keys in the parent referencing the children, and almost whenever we want a parent, we also want the children to come along. The naive way ...
Answer: I'll address the three points separately. You should know that it absolutely depends on the situation on what works best. There is no "theoretically correct" answer as it depends on your data store/access patterns. It is always a fairly complex decision on how you store your data. I think the main rule should be "How ...
Title: Large text database: Convert to SQL or use as is
Tags: python,sql,database,text
Q_Id: 14,795,810 | A_Id: 14,797,390 | CreationDate: 2013-02-10T07:53:00.000
Q_Score: 2 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 336 | AnswerCount: 2 | Available Count: 2
Topics: Database and SQL
Question: My python project involves an externally provided database: A text file of approximately 100K lines. This file will be updated daily. Should I load it into an SQL database, and deal with the diff daily? Or is there an effective way to "query" this text file? ADDITIONAL INFO: Each "entry", or line, contains three field...
Answer: What I've done before is create SQLite databases from txt files which were created from database extracts, one SQLite db for each day. One can query across SQLite db to check the values etc and create additional tables of data. I added an additional column of data that was the SHA1 of the text line so that I could easi...
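The SHA1-per-line idea in this answer is easy to sketch with the standard library: hash each raw line as it is loaded, and diffs between daily extracts reduce to comparing hash columns. Table and column names below are illustrative, not from the answer:

```python
import hashlib
import sqlite3

def load_lines(db_path, txt_path):
    """Load a daily text extract into SQLite, one row per line.

    Each row carries a SHA1 of the raw line, so comparing two days'
    loads is a set comparison on the sha1 column; INSERT OR IGNORE
    makes reloads idempotent.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS entries (line TEXT, sha1 TEXT UNIQUE)"
    )
    with open(txt_path, encoding="utf-8") as fh:
        rows = [
            (line.rstrip("\n"), hashlib.sha1(line.encode("utf-8")).hexdigest())
            for line in fh
        ]
    conn.executemany("INSERT OR IGNORE INTO entries VALUES (?, ?)", rows)
    conn.commit()
    return conn
```

One SQLite file per day, as the answer suggests, then queries across days become joins on sha1.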
Title: Large text database: Convert to SQL or use as is
Tags: python,sql,database,text
Q_Id: 14,795,810 | A_Id: 14,795,870 | CreationDate: 2013-02-10T07:53:00.000
Q_Score: 2 | Users Score: 1 | Score: 0.099668 | is_accepted: false | ViewCount: 336 | AnswerCount: 2 | Available Count: 2
Topics: Database and SQL
Question: My python project involves an externally provided database: A text file of approximately 100K lines. This file will be updated daily. Should I load it into an SQL database, and deal with the diff daily? Or is there an effective way to "query" this text file? ADDITIONAL INFO: Each "entry", or line, contains three field...
Answer: How often will the data be queried? On the one extreme, if once per day, you might use a sequential search more efficiently than maintaining a database or index. For more queries and a daily update, you could build and maintain your own index for more efficient queries. Most likely, it would be worth a negligible (if a...
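The "build and maintain your own index" option from this answer can be sketched in a few lines: map each line's key (here, assumed to be the first whitespace-separated field) to its byte offset, so lookups seek directly instead of scanning 100K lines. The helper names are hypothetical:

```python
def build_index(path):
    """Map the first field of each line to that line's byte offset."""
    index = {}
    with open(path, "rb") as fh:
        offset = 0
        for line in fh:
            key = line.split(None, 1)[0].decode()
            index[key] = offset
            offset += len(line)
    return index

def lookup(path, index, key):
    """Seek straight to the indexed line and return it."""
    with open(path, "rb") as fh:
        fh.seek(index[key])
        return fh.readline().decode().rstrip("\n")
```

Rebuilding the dict after the daily update takes one linear pass; every query after that is a dict hit plus one seek.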
Title: postgres : relation there even after dropping the database
Tags: python,django,postgresql
Q_Id: 14,869,718 | A_Id: 14,880,796 | CreationDate: 2013-02-14T07:23:00.000
Q_Score: 0 | Users Score: 1 | Score: 0.099668 | is_accepted: false | ViewCount: 88 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; Database and SQL
Question: I dropped my database that I had previously created for django using : dropdb <database> but when I go to the psql prompt and say \d, I still see the relations there : How do I remove everything from postgres so that I can do everything from scratch ?
Answer: Most likely somewhere along the line, you created your objects in the template1 database (or in older versions the postgres database) and every time you create a new db it has all those objects in it. You can either drop the template1 / postgres database and recreate it or connect to it and drop all those objects by h...
Title: postgres : relation there even after dropping the database
Tags: python,django,postgresql
Q_Id: 14,869,718 | A_Id: 14,870,374 | CreationDate: 2013-02-14T07:23:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 88 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; Database and SQL
Question: I dropped my database that I had previously created for django using : dropdb <database> but when I go to the psql prompt and say \d, I still see the relations there : How do I remove everything from postgres so that I can do everything from scratch ?
Answer: Chances are that you never created the tables in the correct schema in the first place. Either that or your dropdb failed to complete. Try to drop the database again and see what it says. If that appears to work then go in to postgres and type \l, putting the output here.
Title: How should I establish and manage database connections in a multi-module Python app?
Tags: python,mysql
Q_Id: 14,883,346 | A_Id: 14,883,719 | CreationDate: 2013-02-14T20:20:00.000
Q_Score: 17 | Users Score: 4 | Score: 0.379949 | is_accepted: false | ViewCount: 6,022 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; Database and SQL
Question: We have a Python application with over twenty modules, most of which are shared by several web and console applications. I've never had a clear understanding of the best practice for establishing and managing database connection in multi module Python apps. Consider this example: I have a module defining an object cla...
Answer: MySQL connections are relatively fast, so this might not be a problem (i.e. you should measure). Most other databases take much more resources to create a connection. Creating a new connection when you need one is always the safest, and is a good first choice. Some db libraries, e.g. SqlAlchemy, have connection pools ...
Title: How should I establish and manage database connections in a multi-module Python app?
Tags: python,mysql
Q_Id: 14,883,346 | A_Id: 14,883,590 | CreationDate: 2013-02-14T20:20:00.000
Q_Score: 17 | Users Score: 16 | Score: 1.2 | is_accepted: true | ViewCount: 6,022 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; Database and SQL
Question: We have a Python application with over twenty modules, most of which are shared by several web and console applications. I've never had a clear understanding of the best practice for establishing and managing database connection in multi module Python apps. Consider this example: I have a module defining an object cla...
Answer: The best method is to open a connection when you need to do some operations (like getting and/or updating data); manipulate the data; write it back to the database in one query (very important for performance), and then close the connection. Opening a connection is a fairly light process. Some pitfalls for performan...
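The accepted answer's open-use-commit-close pattern is easy to package once and import from every module. A minimal sketch using a context manager; sqlite3 stands in for MySQLdb here so the example is self-contained, but the shape is the same for any DB-API driver:

```python
import sqlite3
from contextlib import contextmanager

DB_PATH = "app.db"  # illustrative; each module would import this helper

@contextmanager
def db_connection(path=DB_PATH):
    """Open on demand, commit on success, roll back on error, always close."""
    conn = sqlite3.connect(path)
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()
```

Usage in any of the twenty modules is then just `with db_connection() as conn: conn.execute(...)`, and no module needs to know who else holds connections.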
Title: Search engine from scratch
Tags: python,mysql,search,search-engine
Q_Id: 14,889,206 | A_Id: 14,889,522 | CreationDate: 2013-02-15T06:10:00.000
Q_Score: 0 | Users Score: 3 | Score: 0.291313 | is_accepted: false | ViewCount: 643 | AnswerCount: 2 | Available Count: 1
Topics: Web Development; Database and SQL
Question: I have a MySQL database with around 10,000 articles in it, but that number will probably go up with time. I want to be able to search through these articles and pull out the most relevent results based on some keywords. I know there are a number of projects that I can plug into that can essentially do this for me. Howe...
Answer: The best bet for you to do "Search Engine" for the 10,000 Articles is to read "Programming Collective Intelligence" by Toby Segaran. Wonderful read and to save your time go to Chapter 4 of August 2007 issue.
Title: File writing in python
Tags: python,file,postgresql,csv
Q_Id: 14,890,211 | A_Id: 14,890,240 | CreationDate: 2013-02-15T07:45:00.000
Q_Score: 0 | Users Score: 1 | Score: 1.2 | is_accepted: true | ViewCount: 90 | AnswerCount: 1 | Available Count: 1
Topics: Database and SQL
Question: I have a situation where my script parse approx 20000 entries and save them to db. I have used transaction which takes around 35 seconds to save and also consume high memory since until committed queries are saved in memory. I have Found another way to write CSV then load into postgres using "copy_from" which is very f...
Answer: Reduce the size of your transactions?
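The one-line answer ("reduce the size of your transactions") amounts to committing in batches so an open transaction never holds all 20,000 rows at once. A sketch of that idea; sqlite3 is used so the example runs anywhere, though for Postgres specifically the copy_from approach the asker mentions is faster still:

```python
import sqlite3

def insert_in_batches(conn, rows, batch_size=1000):
    """Commit every batch_size rows so the open transaction stays bounded."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        cur.executemany(
            "INSERT INTO entries (v) VALUES (?)",
            rows[start:start + batch_size],
        )
        conn.commit()  # releases memory held by the pending transaction

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (v INTEGER)")
insert_in_batches(conn, [(i,) for i in range(2500)], batch_size=500)
```

Batch size trades commit overhead against peak memory; a few hundred to a few thousand rows per commit is a common starting point.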
Title: how will Python DB-API read json format data into an existing database?
Tags: python,database,json,sqlalchemy,python-db-api
Q_Id: 14,942,462 | A_Id: 14,951,638 | CreationDate: 2013-02-18T17:57:00.000
Q_Score: 0 | Users Score: 1 | Score: 0.197375 | is_accepted: false | ViewCount: 454 | AnswerCount: 1 | Available Count: 1
Topics: Database and SQL; Python Basics and Environment
Question: If we have a json format data file which stores all of our database data content, such as table name, row, and column, etc content, how can we use DB-API object to insert/update/delete data from json file into database, such as sqlite, mysql, etc. Or please share if you have better idea to handle it. People said it is...
Answer: There's no magic way, you'll have to write a Python program to load your JSON data in a database. SQLAlchemy is a good tool to make it easier.
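The "write a Python program" the answer describes can be quite short for the simple case of a JSON array of flat objects. A sketch with the standard library only (the answer's SQLAlchemy suggestion would replace the hand-built SQL); note that interpolating table and column names into SQL like this is only safe for trusted input, since DB-API placeholders cover values, not identifiers:

```python
import json
import sqlite3

def load_json_rows(conn, json_text, table):
    """Load a JSON array of flat objects (all sharing one key set)."""
    rows = json.loads(json_text)
    cols = list(rows[0])
    placeholders = ", ".join("?" for _ in cols)
    # Identifiers cannot be parameterized; trusted input only.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS %s (%s)" % (table, ", ".join(cols))
    )
    conn.executemany(
        "INSERT INTO %s VALUES (%s)" % (table, placeholders),
        [tuple(r[c] for c in cols) for r in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_json_rows(conn, '[{"name": "a", "n": 1}, {"name": "b", "n": 2}]', "people")
```

With SQLAlchemy, `table.insert()` with a list of dicts does the same job without string formatting.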
Title: Why django and python MySQLdb have one cursor per database?
Tags: python,mysql,django,mysql-python
Q_Id: 14,986,129 | A_Id: 15,328,753 | CreationDate: 2013-02-20T17:27:00.000
Q_Score: 6 | Users Score: 10 | Score: 1.2 | is_accepted: true | ViewCount: 1,351 | AnswerCount: 3 | Available Count: 3
Topics: Web Development; Database and SQL
Question: Example scenario: MySQL running a single server -> HOSTNAME Two MySQL databases on that server -> USERS , GAMES . Task -> Fetch 10 newest games from GAMES.my_games_table , and fetch users playing those games from USERS.my_users_table ( assume no joins ) In Django as well as Python MySQLdb , why is having one cursor f...
Answer: A shorter answer would be, "MySQL doesn't support that type of cursor", so neither does Python-MySQL, so the reason one connection command is preferred is because that's the way MySQL works. Which is sort of a tautology. However, the longer answer is: A 'cursor', by your definition, would be some type of object access...
Title: Why django and python MySQLdb have one cursor per database?
Tags: python,mysql,django,mysql-python
Q_Id: 14,986,129 | A_Id: 15,302,237 | CreationDate: 2013-02-20T17:27:00.000
Q_Score: 6 | Users Score: 2 | Score: 0.132549 | is_accepted: false | ViewCount: 1,351 | AnswerCount: 3 | Available Count: 3
Topics: Web Development; Database and SQL
Question: Example scenario: MySQL running a single server -> HOSTNAME Two MySQL databases on that server -> USERS , GAMES . Task -> Fetch 10 newest games from GAMES.my_games_table , and fetch users playing those games from USERS.my_users_table ( assume no joins ) In Django as well as Python MySQLdb , why is having one cursor f...
Answer: As you say, MySQL connections are cheap, so for your case, I'm not sure there is a technical advantage either way, outside of code organization and flow. It might be easier to manage two cursors than to keep track of which database a single cursor is currently talking to by painstakingly tracking SQL 'USE' statements. ...
Title: Why django and python MySQLdb have one cursor per database?
Tags: python,mysql,django,mysql-python
Q_Id: 14,986,129 | A_Id: 15,421,235 | CreationDate: 2013-02-20T17:27:00.000
Q_Score: 6 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 1,351 | AnswerCount: 3 | Available Count: 3
Topics: Web Development; Database and SQL
Question: Example scenario: MySQL running a single server -> HOSTNAME Two MySQL databases on that server -> USERS , GAMES . Task -> Fetch 10 newest games from GAMES.my_games_table , and fetch users playing those games from USERS.my_users_table ( assume no joins ) In Django as well as Python MySQLdb , why is having one cursor f...
Answer: One cursor per database is not necessarily preferable, it's just the default behavior. The rationale is that different databases are more often than not on different servers, use different engines, and/or need different initialization options. (Otherwise, why should you be using different "databases" in the first place...
Title: Django as a mysql proxy server?
Tags: c++,python,mysql,c,django
Q_Id: 14,991,783 | A_Id: 14,992,070 | CreationDate: 2013-02-20T23:21:00.000
Q_Score: 3 | Users Score: 1 | Score: 1.2 | is_accepted: true | ViewCount: 377 | AnswerCount: 1 | Available Count: 1
Topics: Web Development; Database and SQL
Question: I'm in the process of building a Django powered site that is backed by a MySQL server. This MySQL server is going to be accessed from additional sources, other than the website, to read and write table data; such as a program that users run locally which connects to the database. Currently the program running locally i...
Answer: This is a completely valid concern and a very common problem. You have described creating a RESTful API. I guess it could be considered a proxy to a database but is not usually referred to as a proxy. Django is a great tool to use to accomplish this. Django even has a couple packages that will assist in speed...
Title: Converting postgresql timestamp to JavaScript timestamp in Python
Tags: javascript,python,postgresql,flot
Q_Id: 15,031,856 | A_Id: 15,032,100 | CreationDate: 2013-02-22T19:33:00.000
Q_Score: 8 | Users Score: 3 | Score: 0.291313 | is_accepted: false | ViewCount: 4,296 | AnswerCount: 2 | Available Count: 1
Topics: Web Development; Database and SQL
Question: I have a postgre database with a timestamp column and I have a REST service in Python that executes a query in the database and returns data to a JavaScript front-end to plot a graph using flot. Now the problem I have is that flot can automatically handle the date using JavaScript's TIMESTAMP, but I don't know how to c...
Answer: You can't send a Python or Javascript "datetime" object over JSON. JSON only accepts more basic data types like Strings, Ints, and Floats. The way I usually do it is send it as text, using Python's datetime.isoformat() then parse it on the Javascript side.
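The isoformat() round trip the answer describes looks like this on the Python side; epoch milliseconds are also shown, since that is the form flot and JavaScript's Date consume directly (the field names in the payload are illustrative):

```python
import json
from datetime import datetime, timezone

# Serialize: send the timestamp as an ISO-8601 string, or as epoch
# milliseconds (what JavaScript's Date and flot expect natively).
ts = datetime(2013, 2, 22, 19, 33, tzinfo=timezone.utc)
payload = json.dumps({
    "iso": ts.isoformat(),
    "epoch_ms": int(ts.timestamp() * 1000),
})

decoded = json.loads(payload)
# On the JS side: new Date(decoded.epoch_ms) or Date.parse(decoded.iso)
```

Keeping timestamps timezone-aware (UTC here) before serializing avoids the classic off-by-local-offset bug when the browser parses them.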
Title: Why does MySQL count from 1 and not 0?
Tags: php,python,mysql,ruby
Q_Id: 15,055,175 | A_Id: 15,056,205 | CreationDate: 2013-02-24T18:40:00.000
Q_Score: 5 | Users Score: 4 | Score: 0.26052 | is_accepted: false | ViewCount: 2,295 | AnswerCount: 3 | Available Count: 2
Topics: Database and SQL
Question: The first element of arrays (in most programming languages) has an id (index) of 0. The first element (row) of MySQL tables has an (auto incremented) id of 1. The latter seems to be the exception.
Answer: The better question to ask is "why are arrays zero-indexed?" The reason has to do with pointer arithmetic. The index of an array is an offset relative to the pointer address. In C++, given array char x[5], the expressions x[1] and *(x + 1) are equivalent, given that sizeof(char) == 1. So auto increment fields starti...
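The C++ identity the answer cites, that with `char x[5]` the expressions `x[1]` and `*(x + 1)` name the same byte, can even be demonstrated from Python with ctypes (a sketch on CPython; the addresses involved are real machine addresses):

```python
import ctypes

# An array's index is just an offset from its base address.
x = (ctypes.c_char * 5)(b"h", b"e", b"l", b"l", b"o")
base = ctypes.addressof(x)
item = ctypes.sizeof(ctypes.c_char)  # == 1, as in the answer

# x[1] and the byte at (base + 1 * sizeof(char)) are the same thing.
assert ctypes.c_char.from_address(base + 1 * item).value == x[1] == b"e"
```

Element zero sits at offset zero, which is exactly why zero-based indexing is natural for arrays, while an auto-increment key is an arbitrary label with no such arithmetic meaning.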
Title: Why does MySQL count from 1 and not 0?
Tags: php,python,mysql,ruby
Q_Id: 15,055,175 | A_Id: 15,055,977 | CreationDate: 2013-02-24T18:40:00.000
Q_Score: 5 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 2,295 | AnswerCount: 3 | Available Count: 2
Topics: Database and SQL
Question: The first element of arrays (in most programming languages) has an id (index) of 0. The first element (row) of MySQL tables has an (auto incremented) id of 1. The latter seems to be the exception.
Answer: The main reason I suppose is that a row in a database isn't an array and the autoincrement value isn't an index in the sense that an array index is. The primary key id can be any value and to a great extent it is simply essential it is unique and is not guaranteed to be anything else (for example you can delete a row an...
Title: What's the best way to routinely import DBase (dbf) files into MySQL tables?
Tags: python,mysql,dbf,dbase
Q_Id: 15,059,749 | A_Id: 16,302,184 | CreationDate: 2013-02-25T03:45:00.000
Q_Score: 0 | Users Score: -1 | Score: -0.099668 | is_accepted: false | ViewCount: 2,974 | AnswerCount: 2 | Available Count: 1
Topics: Database and SQL
Question: The DBF files are updated every few hours. We need to import new records into MySQL and skip duplicates. I don't have any experience with DBF files but as far as I can tell a handful of the one's we're working with don't have unique IDs. I plan to use Python if there are no ready-made utilities that do this.
Answer: When you say you are using dBase, I presume you have access to the (.) dot prompt. At dot prompt convert the .dbf file into a delimited text file. Reconvert the delimited text file into a MySql data file with the necessary command in MySql. I do not know the actual command for it. All DBMS will have commands to do th...
Title: Google App Engine development server random (?) slowdowns
Tags: python,google-app-engine
Q_Id: 15,098,051 | A_Id: 15,098,634 | CreationDate: 2013-02-26T19:54:00.000
Q_Score: 2 | Users Score: 2 | Score: 0.197375 | is_accepted: false | ViewCount: 237 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: I'm doing a small web application which might need to eventually scale somewhat, and am curious about Google App Engine. However, I am experiencing a problem with the development server (dev_appserver.py): At seemingly random, requests will take 20-30 seconds to complete, even if there is no hard computation or data u...
Answer: Don't worry about it. It (IIRC) keeps the whole DB (datastore) in memory using a "emulation" of the real thing. There are lots of other issues that you won't see when deployed. I'd suggest that your hard drive is spinning down and the delay you see is it taking a few seconds to wake back up. If this becomes a problem...
Title: Google App Engine development server random (?) slowdowns
Tags: python,google-app-engine
Q_Id: 15,098,051 | A_Id: 15,106,246 | CreationDate: 2013-02-26T19:54:00.000
Q_Score: 2 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 237 | AnswerCount: 2 | Available Count: 2
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: I'm doing a small web application which might need to eventually scale somewhat, and am curious about Google App Engine. However, I am experiencing a problem with the development server (dev_appserver.py): At seemingly random, requests will take 20-30 seconds to complete, even if there is no hard computation or data u...
Answer: Does this happen in all web browsers? I had issues like this when viewing a local app engine dev site in several browsers at the same time for cross-browser testing. IE would then struggle, with requests taking about as long as you describe. If this is the issue, I found the problems didn't occur with IETester. Sorry i...
Title: Save open excel file using python
Tags: python,excel,xlrd
Q_Id: 15,114,329 | A_Id: 15,114,556 | CreationDate: 2013-02-27T14:17:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 758 | AnswerCount: 2 | Available Count: 1
Topics: Database and SQL
Question: How do I save an open excel file using python? I currently read the excel workbook using XLRD but I need to save the excel file so any changes the user inputs are read. I have done this using a VBA script from within excel which saves the workbook every x seconds, but this is not ideal.
Answer: It looks like XLRD is used for reading the data, not interfacing with excel. So no, unless you use a different library using python is not the best way to do this, what is wrong with the VBA script?
Title: How can I use MySQL with Python 3.3 and Django 1.5?
Tags: python,mysql,django
Q_Id: 15,202,503 | A_Id: 15,203,056 | CreationDate: 2013-03-04T13:18:00.000
Q_Score: 1 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 713 | AnswerCount: 3 | Available Count: 1
Topics: Web Development; Database and SQL
Question: Django needs MySQL-python package to manipulate MySQL, but MySQL-python doesn't support Python 3.3. I have tried MySQL-for-Python-3, but it doesn't work. Please help! Thanks a lot!
Answer: As others have noted, Python 3 support in Django 1.5 is "experimental" and, as such, not everything should be expected to work. That being said, if you absolutely need to get this working, you may be able to run the 2to3 tool on a source version of MySQL-python to translate it to Python 3 (and build against Python 3 h...
Title: standard way to handle user session in tornado
Tags: python,tornado
Q_Id: 15,254,538 | A_Id: 15,265,556 | CreationDate: 2013-03-06T17:55:00.000
Q_Score: 14 | Users Score: 14 | Score: 1.2 | is_accepted: true | ViewCount: 14,722 | AnswerCount: 4 | Available Count: 3
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: So, in order to avoid the "no one best answer" problem, I'm going to ask, not for the best way, but the standard or most common way to handle sessions when using the Tornado framework. That is, if we're not using 3rd party authentication (OAuth, etc.), but rather we want to have our own Users table with secure co...
Answer: Tornado designed to be stateless and don't have session support out of the box. Use secure cookies to store sensitive information like user_id. Use standard cookies to store not critical information. For storing large objects - use standard scheme - MySQL + memcache.
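The "secure cookies" the accepted answer recommends are, under the hood, signed values: Tornado appends an HMAC computed with the application's cookie secret so a client cannot alter the cookie without detection. A stdlib sketch of that signing idea (this is an illustration of the concept, not Tornado's actual wire format):

```python
import hashlib
import hmac

SECRET = b"cookie-signing-key"  # plays the role of Tornado's cookie_secret

def sign(value: str) -> str:
    """Append an HMAC so tampering with the value is detectable."""
    mac = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return "%s|%s" % (value, mac)

def verify(signed: str):
    """Return the value if the signature checks out, else None."""
    value, _, mac = signed.rpartition("|")
    expected = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(mac, expected) else None

token = sign("user_id=42")
```

In Tornado itself this is simply `self.set_secure_cookie(...)` / `self.get_secure_cookie(...)`; signing protects integrity only, so anything truly secret still belongs server-side.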
Title: standard way to handle user session in tornado
Tags: python,tornado
Q_Id: 15,254,538 | A_Id: 16,320,593 | CreationDate: 2013-03-06T17:55:00.000
Q_Score: 14 | Users Score: 17 | Score: 1 | is_accepted: false | ViewCount: 14,722 | AnswerCount: 4 | Available Count: 3
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: So, in order to avoid the "no one best answer" problem, I'm going to ask, not for the best way, but the standard or most common way to handle sessions when using the Tornado framework. That is, if we're not using 3rd party authentication (OAuth, etc.), but rather we want to have our own Users table with secure co...
Answer: Here's how it seems other micro frameworks handle sessions (CherryPy, Flask for example): Create a table holding session_id and whatever other fields you'll want to track on a per session basis. Some frameworks will allow you to just store this info in a file on a per user basis, or will just store things directly in ...
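The sessions-table pattern this answer describes (a table keyed by a random session_id, with the same id handed to the client in a cookie) takes only a few lines with the standard library. A sketch; the table layout and helper names are illustrative:

```python
import secrets
import sqlite3

def create_session(conn, user_id):
    """Mint a random session id, persist it, and return it.

    The returned id is what would go into the user's (secure) cookie.
    """
    session_id = secrets.token_hex(16)  # 128 bits of randomness
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sessions "
        "(session_id TEXT PRIMARY KEY, user_id INTEGER)"
    )
    conn.execute("INSERT INTO sessions VALUES (?, ?)", (session_id, user_id))
    return session_id

def user_for_session(conn, session_id):
    """Resolve a presented session id back to a user, or None."""
    row = conn.execute(
        "SELECT user_id FROM sessions WHERE session_id = ?", (session_id,)
    ).fetchone()
    return row[0] if row else None
```

Per-session fields (expiry time, CSRF token, preferences) become extra columns on the same table, exactly as the answer suggests.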
Title: standard way to handle user session in tornado
Tags: python,tornado
Q_Id: 15,254,538 | A_Id: 16,346,968 | CreationDate: 2013-03-06T17:55:00.000
Q_Score: 14 | Users Score: 4 | Score: 0.197375 | is_accepted: false | ViewCount: 14,722 | AnswerCount: 4 | Available Count: 3
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: So, in order to avoid the "no one best answer" problem, I'm going to ask, not for the best way, but the standard or most common way to handle sessions when using the Tornado framework. That is, if we're not using 3rd party authentication (OAuth, etc.), but rather we want to have our own Users table with secure co...
Answer: The key issue with sessions is not where to store them, is to how to expire them intelligently. Regardless of where sessions are stored, as long as the number of stored sessions is reasonable (i.e. only active sessions plus some surplus are stored), all this data is going to fit in RAM and be served fast. If there is a...
Title: how to call python script in kettle
Tags: python,kettle
Q_Id: 15,263,196 | A_Id: 15,274,794 | CreationDate: 2013-03-07T04:39:00.000
Q_Score: 3 | Users Score: 2 | Score: 1.2 | is_accepted: true | ViewCount: 6,043 | AnswerCount: 1 | Available Count: 1
Topics: Database and SQL
Question: I have a problem in kettle connecting python. In kettle, I only find the js script module. Does kettle support python directly? I mean, can I call a python script in kettle without using js or others? By the way, I want to move data from Oracle to Mongo regularly. I choose to use python to implement the transformation....
Answer: It doesn't support it directly from what I've seen. However there is a mongodb input step. And a lot of work has been done on it recently (and still ongoing). So given there is a mongodb input step, if you're using an ETL tool already then why would you want to make it execute a python script to do the job?
Title: Use MongoDB with Django but also use relational database
Tags: python,django,mongodb
Q_Id: 15,314,025 | A_Id: 15,498,874 | CreationDate: 2013-03-09T18:00:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 258 | AnswerCount: 1 | Available Count: 1
Topics: Web Development; Database and SQL
Question: I'm working on a Django application that needs to interact with a mongoDB instance ( preferably through django's ORM) The meat of the application still uses a relational database - but I just need to interact with mongo for a single specific model. Which mongo driver/subdriver for python will suite my needs best ?
Answer: You could use django-nonrel which is a fork of Django and will let you use the same ORM. If you don't want a forked Django you could use MongoEngine which has a similar syntax otherwise just raw pymongo.
Title: Records getting deleted from Mysql table automatically
Tags: mysql,django,python-2.7,xeround
Q_Id: 15,332,618 | A_Id: 15,388,969 | CreationDate: 2013-03-11T06:40:00.000
Q_Score: 4 | Users Score: 1 | Score: 0.197375 | is_accepted: false | ViewCount: 637 | AnswerCount: 1 | Available Count: 1
Topics: Database and SQL
Question: I have created a cronjob in Python. The purpose is to insert data into a table from another one based on certain conditions. There is more than 65000 record to be inserted. I have executed the cronjob and has seen more than 25000 records inserted. But after that the record are getting automatically deleted from that t...
Answer: Run your django orm statement in the django shell and print the traceback. Look for delete statements in the django traceback sql.
Title: Multi developer environment python and sqlalchemy
Tags: python,database,development-environment
Q_Id: 15,345,864 | A_Id: 15,346,132 | CreationDate: 2013-03-11T18:28:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 195 | AnswerCount: 2 | Available Count: 1
Topics: Database and SQL
Question: I'm currently exploring using python to develop my server-side implementation. I've decided to use SQLAlchemy for database stuff. What I'm not currently to sure about is how it should be set up so that more than one developer can work on the project. For the code it is not a problem but how do I handle the database mod...
Answer: Make sure you have a python program or programs to fill databases with test data from scratch. It allows each developer to work from different starting points, but also test with the same environment.
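The seed-from-scratch script the answer recommends can be a single function every developer runs to get an identical database. A minimal sketch using sqlite3 (in a real SQLAlchemy project, `metadata.drop_all()` / `create_all()` plus fixture inserts plays the same role); the schema and fixture data here are illustrative:

```python
import sqlite3

SEED_USERS = [("alice",), ("bob",)]  # shared fixture data, checked into the repo

def fresh_database(path=":memory:"):
    """Rebuild schema and fixtures so every developer starts identically."""
    conn = sqlite3.connect(path)
    conn.executescript(
        """
        DROP TABLE IF EXISTS users;
        CREATE TABLE users (name TEXT);
        """
    )
    conn.executemany("INSERT INTO users VALUES (?)", SEED_USERS)
    conn.commit()
    return conn
```

Because the script drops and recreates everything, it doubles as a reset button before running the test suite.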
Title: Complicated Excel Issue with Google API and Python
Tags: python,excel,google-drive-api
Q_Id: 15,456,709 | A_Id: 15,505,507 | CreationDate: 2013-03-17T01:39:00.000
Q_Score: 0 | Users Score: 0 | Score: 1.2 | is_accepted: true | ViewCount: 95 | AnswerCount: 1 | Available Count: 1
Topics: Networking and APIs; Database and SQL
Question: So I know how to download Excel files from Google Drive in .csv format. However, since .csv files do not support multiple sheets, I have developed a system in a for loop to add the '&grid=tab_number' to the file download url so that I can download each sheet as its own .csv file. The problem I have run into is finding ...
Answer: Ended up just downloading with xlrd and using that. Thanks for the link Rob.
Title: How to use NZ Loader (Netezza Loader) through Python Script?
Tags: python,netezza
Q_Id: 15,592,980 | A_Id: 15,643,468 | CreationDate: 2013-03-23T22:45:00.000
Q_Score: 2 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 4,583 | AnswerCount: 3 | Available Count: 2
Topics: Data Science and Machine Learning; Database and SQL
Question: I have a huge csv file which contains millions of records and I want to load it into Netezza DB using python script I have tried simple insert query but it is very very slow. Can you point me to some example python script or some idea how can I do the same? Thank you
Answer: You need to get the nzcli installed on the machine that you want to run nzload from - your sysadmin should be able to put it on your unix/linux application server. There's a detailed process to setting it all up, caching the passwords, etc - the sysadmin should be able to do that too. Once it is set up, you can create N...
Title: How to use NZ Loader (Netezza Loader) through Python Script?
Tags: python,netezza
Q_Id: 15,592,980 | A_Id: 17,522,337 | CreationDate: 2013-03-23T22:45:00.000
Q_Score: 2 | Users Score: 1 | Score: 0.066568 | is_accepted: false | ViewCount: 4,583 | AnswerCount: 3 | Available Count: 2
Topics: Data Science and Machine Learning; Database and SQL
Question: I have a huge csv file which contains millions of records and I want to load it into Netezza DB using python script I have tried simple insert query but it is very very slow. Can you point me to some example python script or some idea how can I do the same? Thank you
Answer: You can use nz_load4 to load the data. This is the support utility in /nz/support/contrib/bin. The syntax is the same as nzload; by default nz_load4 will load the data using 4 threads and you can go up to 32 threads by using the -tread option. For more details use nz_load4 -h. This will create the log files based on the number of thr...
Title: Django + postgreSQL: find near points
Tags: python,django,postgresql,postgis,geodjango
Q_Id: 15,593,572 | A_Id: 15,593,621 | CreationDate: 2013-03-24T00:09:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 689 | AnswerCount: 3 | Available Count: 1
Topics: Web Development; Database and SQL
Question: For my app, I need to determine the nearest points to some other point and I am looking for a simple but relatively fast (in terms of performance) solution. I was thinking about using PostGIS and GeoDjango but I think my app is not really that "geographic" (I still don't really know what that means though). The geograp...
Answer: You're probably right, PostGIS/GeoDjango is probably overkill, but making your own Django app would not be too much trouble for your simple task. Django offers a lot in terms of templating, etc. and with the built in admin makes it pretty easy to enter single records. And GeoDjango is part of contrib, so you can always...
Title: Datastore vs spreadsheet for provisioning Google apps
Tags: python,google-app-engine,google-sheets,google-cloud-datastore
Q_Id: 15,671,591 | A_Id: 15,671,792 | CreationDate: 2013-03-27T23:37:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 248 | AnswerCount: 1 | Available Count: 1
Topics: Web Development; System Administration and DevOps; Database and SQL
Question: In my company we want to build an application in Google app engine which will manage user provisioning to Google apps. But we do not really know what data source to use? We made two propositions : spreadsheet which will contains users' data and we will use spreadsheet API to get this data and use it for user provision...
Answer: If you use the Datastore API, you will also need to build out a way to manage users data in the system. If you use Spreadsheets, that will serve as your way to manage users data, so in that way managing the data would be taken care of for you. The benefits to use the Datastore API would be if you'd like to have a seaml...
Title: SQLAlchemy ORM: modify WHERE clause
Tags: python,orm,sqlalchemy,where-clause
Q_Id: 15,705,511 | A_Id: 15,707,037 | CreationDate: 2013-03-29T14:41:00.000
Q_Score: 2 | Users Score: 2 | Score: 1.2 | is_accepted: true | ViewCount: 489 | AnswerCount: 1 | Available Count: 1
Topics: Database and SQL
Question: If I've been given a Query object that I didn't construct, is there a way to directly modify its WHERE clause? I'm really hoping to be able remove some AND statements or replace the whole FROM clause of a query instead of starting from scratch. I'm aware of the following methods to modify the SELECT clause: Query.wi...
Answer: You can modify query._whereclause directly, but I'd seek to find a way to not have this issue in the first place - wherever it is that the Query is generated should be factored out so that the non-whereclause version is made available.
Title: cx_oracle unable to find Oracle Client
Tags: python,cx-oracle
Q_Id: 15,740,464 | A_Id: 15,745,441 | CreationDate: 2013-04-01T09:00:00.000
Q_Score: 0 | Users Score: 0 | Score: 0 | is_accepted: false | ViewCount: 412 | AnswerCount: 1 | Available Count: 1
Topics: System Administration and DevOps; Database and SQL
Question: I have installed Python 2.7.3 on Linux 64 bit machine. I have Oracle 11g client(64bit) as well installed. And I set ORACLE_HOME, PATH, LD_LIBRARY_PATH, and installed cx_oracle 5.1.2 version for Python 2.7 & Oracle 11g. But ldd command on cx_oracle is unable to find libclntsh.so.11.1. I tried creating symlinks to libclnt...
Answer: The issue with me was that I installed python, cx_oracle as root but Oracle client installation was done by "oracle" user. I got my own oracle installation and that fixed the issue. Later I ran into PyUnicodeUCS4_DecodeUTF16 issues with Python and for that I had to install python with the --enable-unicode=ucs4 option
I try to connect to a remote oracle server by cx_Oracle: db = cx_Oracle.connect('username', 'password', dsn_tns) but it says databaseError: ORA-12541 tns no listener
6
1
0.066568
0
false
46,728,202
0
17,494
1
0
0
15,772,351
In my case it was due to the fact that my server port was wrong: ./install_database_new.sh localhost:1511 XE full I changed the port to "1521" and I could connect.
1
0
0
cx_Oracle ORA-12541 tns no listener
3
python,cx-oracle
0
2013-04-02T19:10:00.000
I'm using Sqlalchemy in a multitenant Flask application and need to create tables on the fly when a new tenant is added. I've been using Table.create to create individual tables within a new Postgres schema (along with search_path modifications) and this works quite well. The limitation I've found is that the Table.cre...
0
3
1.2
0
true
15,775,816
0
1,872
1
0
0
15,774,899
(Copy from comment) Assuming sess is the session, you can do sess.execute(CreateTable(tenantX_tableY)) instead. EDIT: CreateTable is only one of the things being done when calling table.create(). Use table.create(sess.connection()) instead.
1
0
0
How do you create a table with Sqlalchemy within a transaction in Postgres?
1
python,postgresql,sqlalchemy,ddl,flask-sqlalchemy
0
2013-04-02T21:38:00.000
I'm writing a python app that connects to perforce on a daily basis. The app gets the contents of an excel file on perforce, parses it, and copies some data to a database. The file is rather big, so I would like to keep track of which revision of the file the app last read on the database, this way I can check to see if...
0
2
1.2
0
true
15,806,216
0
1,797
1
0
0
15,795,038
Several options come to mind. The simplest approach would be to always let your program use the same client and let it sync the file. You could let your program call p4 sync and see if you get a new version or not. Let it continue if you get a new version. This approach has the advantage that you don't need to remembe...
1
0
0
How to get head revision number of a file, or the changelist number when it was checked in / changed
1
python,python-2.7,perforce
0
2013-04-03T18:23:00.000
I am searching for a persistent data storage solution that can handle heterogenous data stored on disk. PyTables seems like an obvious choice, but the only information I can find on how to append new columns is a tutorial example. The tutorial has the user create a new table with added column, copy the old table into...
5
5
1.2
0
true
19,470,951
0
2,350
1
0
0
15,797,163
Yes, you must create a new table and copy the original data. This is because Tables are a dense format. This gives it a huge performance benefits but one of the costs is that adding new columns is somewhat expensive.
1
0
0
Is the only way to add a column in PyTables to create a new table and copy?
2
python,pytables
0
2013-04-03T20:13:00.000
I'm writing a web application in Python (on Apache server on a Linux system) that needs to connect to a Postgres database. It therefore needs a valid password for the database server. It seems rather unsatisfactory to hard code the password in my Python files. I did wonder about using a .pgpass file, but it would need ...
3
1
0.099668
0
false
15,897,981
0
1,239
1
0
0
15,895,788
No matter what approach you use, other apps running as www-data will be able to read your password and log in as you to the database. Using peer auth won't help you out, it'll still trust all apps running under www-data. If you want your application to be able to isolate its data from other databases you'll need to run...
1
0
0
"Correct" way to store postgres password in python website
2
python,apache,postgresql,mod-wsgi
0
2013-04-09T07:23:00.000
I am having trouble finding this answer anywhere on the internet. I want to be able to monitor a row in a MySQL table for changes and when this occurs, run a Python function. This Python function I want to run has nothing to do with MySQL; it just enables a pin on a Raspberry Pi. I have tried looking into SQLAlchemy; h...
1
4
0.379949
0
false
15,904,750
0
6,558
1
0
0
15,903,357
What about a cron job instead of creating a loop? I think it's a bit nicer.
1
0
0
How to execute Python function when value in SQL table changes?
2
python,sql,sqlalchemy,raspberry-pi
0
2013-04-09T13:31:00.000
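Short of database triggers calling out to external programs, the cron-scheduled poll suggested above is the usual approach. Here is a minimal sketch of the polling step, using the stdlib sqlite3 module as a stand-in for MySQLdb; the table, column, and function names are illustrative, not from the original question:

```python
import sqlite3

def check_and_trigger(conn, last_seen, on_change):
    """Poll a single row's value; call on_change when it differs from last_seen."""
    row = conn.execute("SELECT value FROM sensor WHERE id = 1").fetchone()
    current = row[0] if row else None
    if current != last_seen:
        on_change(current)  # e.g. toggle a Raspberry Pi GPIO pin here
    return current

# Setup for demonstration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensor (id INTEGER PRIMARY KEY, value INTEGER)")
conn.execute("INSERT INTO sensor VALUES (1, 0)")

events = []
state = check_and_trigger(conn, None, events.append)   # first read fires
state = check_and_trigger(conn, state, events.append)  # unchanged: no fire
conn.execute("UPDATE sensor SET value = 7 WHERE id = 1")
state = check_and_trigger(conn, state, events.append)  # changed: fires
print(events)  # [0, 7]
```

Run from cron, last_seen would be persisted between invocations (in a file or a one-row table) instead of held in memory.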
I have a couple of python scripts which I plan to put up on a server and run them repeatedly once a day. This python script does some calculation and finally uploads the data to a central database. Of course to connect to the database a password and username is required. Is it safe to input this username and password o...
0
0
0
0
false
15,907,470
0
284
1
0
0
15,905,113
Create a DB user with limited access rights, for example, to only that table where it uploads data. Hardcode that user in your script or pass it as command line arguments. There is little else you can do for an automated script because it has to use some username and password to connect to the DB somehow. You could e...
1
0
0
Connecting to a database using python and running it as a cron job
1
python
0
2013-04-09T14:47:00.000
What's a reasonable default for pool_size in a ZODB.DB call in a multi-threaded web application? Leaving the actual default value 7 gives me some connection WARNINGs even when I'm the only one navigating through db-interacting handlers. Is it possible to set a number that's too high? What factors play into deciding wha...
2
4
1.2
0
true
15,919,692
0
485
1
0
0
15,914,198
The pool size is only a 'guideline'; the warning is logged when you exceed that size; if you were to use double the number of connections, a CRITICAL log message would be logged instead. These are there to indicate you may be using too many connections in your application. The pool will try to reduce the number of re...
1
0
0
Reasonable settings for ZODB pool_size
1
python,connection-pooling,zodb
0
2013-04-09T23:12:00.000
I have to implement nosetests for Python code using a MongoDB store. Is there any python library which permits me initializing a mock in-memory MongoDB server? I am using continuous integration. So, I want my tests to be independent of any MongoDB running server. Is there a way to mock mongoDM Server in memory to test...
17
4
0.197375
0
false
15,915,744
0
14,649
1
0
0
15,915,031
I don’t know about Python, but I had a similar concern with C#. I decided to just run a real instance of Mongo on my workstation pointed at an empty directory. It’s not great because the code isn’t isolated but it’s fast and easy. Only the data access layer actually calls Mongo during the test. The rest can rely on the...
1
0
1
Use mock MongoDB server for unit test
4
python,mongodb,python-2.7,pymongo
0
2013-04-10T00:42:00.000
We need to store a text field ( say 2000 characters) and its unique hash ( say SHA1 ) in a MySQL table. To test that text already exists in the MySQL table, we generate SHA1 of the text , and find whether it exists in the unique field hash . Now lets assume there are two texts: "This is the text which will be stored i...
1
1
1.2
0
true
15,919,118
0
496
1
0
0
15,919,063
I highly doubt anything you're looking for exists, so I propose a simpler solution: Come up with a simple algorithm for normalizing your text, e.g.: Normalize whitespace Remove punctuation Then, calculate the hash of that and store it in a separate column (normalizedHash) or store an ID to a table of normalized hashe...
1
0
0
Good hashing algorithm with proximity to original text input , less avalanche effect?
2
python,mysql,string-matching
0
2013-04-10T07:00:00.000
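The normalize-then-hash idea from the accepted answer needs nothing beyond the standard library. A sketch; the exact normalization rules here (lowercasing, stripping punctuation, collapsing whitespace) are an assumption and should match your own definition of "duplicate":

```python
import hashlib
import re

def normalized_sha1(text):
    """Hash a canonical form of the text so trivial formatting
    differences map to the same digest."""
    t = text.lower()
    t = re.sub(r"[^\w\s]", "", t)        # remove punctuation
    t = re.sub(r"\s+", " ", t).strip()   # collapse whitespace
    return hashlib.sha1(t.encode("utf-8")).hexdigest()

a = normalized_sha1("This is the text which will be stored in MySQL!")
b = normalized_sha1("  This is the  text, which will be stored in MySQL ")
print(a == b)  # True: both normalize to the same string
```

The normalized digest would go in its own unique column alongside the exact-match hash.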
I wrote a little script that copies files from bucket on one S3 account to the bucket in another S3 account. In this script I use bucket.copy_key() function to copy key from one bucket in another bucket. I tested it, it works fine, but the question is: do I get charged for copying files between S3 to S3 in same region?...
1
3
1.2
0
true
15,957,021
1
322
1
0
1
15,956,099
If you are using the copy_key method in boto then you are doing server-side copying. There is a very small per-request charge for COPY operations, just as there is for all S3 operations, but if you are copying between two buckets in the same region, there are no network transfer charges. This is true whether you run th...
1
0
0
Will I get charge for transfering files between S3 accounts using boto's bucket.copy_key() function?
1
python,amazon-web-services,amazon-s3,boto,data-transfer
0
2013-04-11T18:24:00.000
Things to note in advance: I am using wampserver 2.2. I've forwarded port 80. I added a rule to my firewall to accept traffic through port 3306. I have added "Allow from all" in the directory of "A file i forget". My friend can access my phpmyadmin server through his browser. I am quite the novice, so bear with me. I am tryin...
0
0
0
0
false
16,370,493
0
120
1
0
0
15,958,249
If your phpMyAdmin runs on the same machine as the MySQL server, 127.0.0.1 is enough (and safer if your MySQL server binds to 127.0.0.1 rather than 0.0.0.0) if you use TCP (rather than a Unix socket).
1
0
0
What do I use for HOST to connect to a remote server with mysqldb python?
1
python,sql,phpmyadmin,mysql-python,host
0
2013-04-11T20:27:00.000
So there has been a lot of hating on singletons in python. I generally see that having a singleton is usually no good, but what about stuff that has side effects, like using/querying a Database? Why would I make a new instance for every simple query, when I could reuse a present connection already setup again? What wou...
7
1
0.099668
0
false
15,960,691
0
6,703
2
0
0
15,958,678
If you're using an object oriented approach, then abamet's suggestion of attaching the database connection parameters as class attributes makes sense to me. The class can then establish a single database connection which all methods of the class refer to as self.db_connection, for example. If you're not using an object...
1
0
1
DB-Connections Class as a Singleton in Python
2
python,database,singleton
0
2013-04-11T20:53:00.000
So there has been a lot of hating on singletons in python. I generally see that having a singleton is usually no good, but what about stuff that has side effects, like using/querying a Database? Why would I make a new instance for every simple query, when I could reuse a present connection already setup again? What wou...
7
7
1.2
0
true
15,958,721
0
6,703
2
0
0
15,958,678
Normally, you have some kind of object representing the thing that uses a database (e.g., an instance of MyWebServer), and you make the database connection a member of that object. If you instead have all your logic inside some kind of function, make the connection local to that function. (This isn't too common in many...
1
0
1
DB-Connections Class as a Singleton in Python
2
python,database,singleton
0
2013-04-11T20:53:00.000
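A middle ground between a singleton class and a fresh connection per query, as the answers above suggest, is to hold one connection as a class attribute that all instances share. A sketch using sqlite3 in place of a real server driver; the class and method names are illustrative:

```python
import sqlite3

class Database:
    """All instances share one lazily-created connection, held as a
    class attribute, without forcing the class itself to be a singleton."""
    _connection = None

    @classmethod
    def connection(cls):
        if cls._connection is None:
            cls._connection = sqlite3.connect(":memory:")
        return cls._connection

    def query(self, sql):
        return self.connection().execute(sql).fetchall()

a, b = Database(), Database()
print(a.connection() is b.connection())  # True: one shared connection
```

This keeps the reuse benefit while still letting tests substitute a different connection by overriding the class attribute.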
I have an error no such table: mytable, even though it is defined in models/tables.py. I use sqlite. Interesting enough, if I go to admin panel -> my app -> database administration then I see a link mytable, however when I click on it then I get no such table: mytable. I don't know how to debug such error? Any ideas?
2
3
1.2
0
true
16,026,857
1
1,115
1
0
0
16,026,776
web2py keeps the structure it thinks the table has in a separate file. If someone has manually dropped the table, web2py will still think it exists, but of course you get an error when you try to actually use the table. Look for the *.mytable.table file in the databases directory.
1
0
0
web2py. no such table error
1
python,web2py
0
2013-04-16T00:21:00.000
I understand how to save a redis database using bgsave. However, once my database server restarts, how do I tell if a saved database is present and how do I load it into my application. I can tolerate a few minutes of lost data, so I don't need to worry about an AOF, but I cannot tolerate the loss of, say, an hour's ...
2
1
0.197375
0
false
16,069,631
0
1,375
1
0
0
16,068,644
You can stop redis and replace dump.rdb in /var/lib/redis (or whatever file is in the dbfilename variable in your redis.conf). Then start redis again.
1
0
0
How to load a redis database after
1
python,redis,persistence,reload
0
2013-04-17T19:33:00.000
I have a python script that retrieves the newest 5 records from a mysql database and sends email notification to a user containing this information. I would like the user to receive only new records and not old ones. I can retrieve data from mysql without problems... I've tried to store it in text files and compare the...
0
2
1.2
0
true
16,079,138
0
85
1
0
0
16,078,856
Are you assigning a unique incrementing ID to each record? If you are, you can create a separate table that holds just the ID of the last record fetched, that way you can only retrieve records with IDs greater than this ID. Each time you fetch, you could update this table with the new latest ID. Let me know if I misu...
1
0
1
How to check if data has already been previously used
1
python
0
2013-04-18T09:11:00.000
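The accepted answer's scheme, a separate marker holding the ID of the last record fetched, might look like this with sqlite3 standing in for MySQL (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("CREATE TABLE cursor_state (last_id INTEGER)")
conn.execute("INSERT INTO cursor_state VALUES (0)")

def fetch_new(conn):
    """Return only rows added since the previous call, then advance the marker."""
    last_id = conn.execute("SELECT last_id FROM cursor_state").fetchone()[0]
    rows = conn.execute(
        "SELECT id, body FROM records WHERE id > ? ORDER BY id", (last_id,)
    ).fetchall()
    if rows:
        conn.execute("UPDATE cursor_state SET last_id = ?", (rows[-1][0],))
    return rows

conn.executemany("INSERT INTO records (body) VALUES (?)", [("a",), ("b",)])
first = fetch_new(conn)   # both rows are new
second = fetch_new(conn)  # nothing new since the marker advanced
print(len(first), len(second))  # 2 0
```

Each email run then only ever sees rows with IDs greater than the stored marker.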
I need to scrape about 40 random webpages at the same time. These pages vary on each request. I have used rpcs in python to fetch the urls and scraped the data using BeautifulSoup. It takes about 25 seconds to scrape all the data and display on the screen. To increase the speed I stored the data in appengine datastore so...
0
0
0
0
false
16,131,039
1
402
1
1
0
16,098,570
Based on what I know about your app it would make sense to use memcache. It will be faster, and will automatically take care of things like expiring stale cache entries.
1
0
0
What is the fastest way to get scraped data from so many web pages?
1
python,mysql,google-app-engine,google-cloud-datastore,web-scraping
0
2013-04-19T06:29:00.000
I've had my python program removed from windows a while ago, and recently downloaded python2.7.4 from the main site, but when I type "python" in the Windows PowerShell(x86) prompt from C:, I get the message "'python' is not recognized as an internal or external command, operable program or batch file.", and I'd like to...
1
1
0.197375
0
false
16,108,206
0
1,533
1
1
0
16,107,658
Making the comments an answer for future reference: Have a ; at the end of the PATH and log out and log back in.
1
0
0
Can't open python.exe in Windows Powershell
1
python,windows,powershell,path,exe
0
2013-04-19T15:02:00.000
My little website has a table of comments and a table of votes. Each user of the website gets to vote once on each comment. When displaying comments to the user, I will select from the comments table and outerjoin a vote if one exists for the current user. Is there a way to make a query where the vote will be attached ...
0
0
1.2
0
true
17,140,662
0
149
1
0
0
16,114,939
In the end I decided that working with the tuple returned by the query wasn't a problem.
1
0
0
SqlAlchemy: Join onto another object
2
python,sqlalchemy
0
2013-04-19T23:35:00.000
Are there any SQL injection equivalents, or other vulnerabilities I should be aware of when using NoSQL? I'm using the Google App Engine DB in Python 2.7, and noticed there is not much documentation from Google about the security of the Datastore. Any help would be appreciated!
2
7
1.2
0
true
16,140,194
1
973
1
1
0
16,134,927
Standard SQL injection techniques rely on the fact that SQL has various statements to either query or modify data. The datastore has no such feature. The GQL (the query language for the datastore) can only be used to query, not modify. Inserts, updates, and deletes are done using a separate method that does not use a t...
1
0
0
NDB/DB NoSQL Injection Google Datastore
1
python,security,google-app-engine,nosql,google-cloud-datastore
0
2013-04-21T18:51:00.000
I am as totally fresh and noob as you can be on Twisted. I chose a database proxy as my final project. The idea is to have MySQL as a database. A Twisted proxy runs in between the client and the database. The proxy exposes methods like UPDATE, SELECT, INSERT through its XMLRPC to the client. And, the methods itself in the prox...
1
0
1.2
0
true
16,171,818
0
166
1
0
0
16,155,776
As you use XML-RPC, you will have to write a simple Twisted web application that handles XML-RPC calls. There are many possibilities for the cache: expiring entries, storing on disk, invalidating, etc. You may start from a simple dict for storing queries and find its limitations.
1
0
0
Database Proxy using Twisted
1
python,twisted
0
2013-04-22T20:07:00.000
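The "simple dict" cache the answer suggests as a starting point could be sketched like this; the proxy and backend here are toy stand-ins to show the caching logic, not Twisted code:

```python
class CachingProxy:
    """Cache query results in a dict; fall through to the backend on a miss."""

    def __init__(self, backend):
        self.backend = backend      # callable that actually hits the database
        self.cache = {}
        self.backend_calls = 0

    def select(self, query):
        if query not in self.cache:
            self.backend_calls += 1
            self.cache[query] = self.backend(query)
        return self.cache[query]

    def invalidate(self):
        """Call after UPDATE/INSERT so stale cached results are dropped."""
        self.cache.clear()

proxy = CachingProxy(lambda q: ["row for " + q])
proxy.select("SELECT 1")
proxy.select("SELECT 1")    # second call served from the cache
print(proxy.backend_calls)  # 1
```

The hard part, as the answer notes, is invalidation: the naive clear-everything approach above is where you would start discovering the limitations.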
I am a rails developer that is learning python and I am doing a project using the pyramid framework. I am used to having some sort of way of rolling back the database changes If I change the models in some sort of way. Is there some sort of database rollback that works similar to the initialize_project_db command?
0
2
1.2
0
true
16,159,421
1
139
1
0
0
16,157,144
initialize_db is not a migration script. It is for bootstrapping your model and that's that. If you want to tie in migrations with upgrade/rollback support, look at alembic for SQL schema migrations.
1
0
0
Is there some sort of way to roll back the initialize_project_db script in pyramid?
1
python,database,pyramid
0
2013-04-22T21:36:00.000
I have an old SQLite 2 database that I would like to read using Python 3 (on Windows). Unfortunately, it seems that Python's sqlite3 library does not support SQLite 2 databases. Is there any other convenient way to read this type of database in Python 3? Should I perhaps compile an older version of pysqlite? Will such ...
0
0
1.2
0
true
23,542,492
0
445
1
0
0
16,193,630
As the pysqlite author I am pretty sure nobody has ported pysqlite 1.x to Python 3 yet. The only solution that makes sense effort-wise is the one theomega suggested. If all you need is to access the data from Python for importing it elsewhere, but doing the sqlite2 dump/sqlite3 restore dance is not possible, there is an...
1
0
0
Read an SQLite 2 database using Python 3
1
sqlite,python-3.x
0
2013-04-24T13:43:00.000
What's the best way to automatically query several dozen MySQL databases with a script on a nightly basis? The script usually returns no results, so I'd ideally have it email or notify me if any are ever returned. I've looked into PHP, Ruby and Python for this, but I'm a little stumped as to how best to handle this.
0
1
0.066568
0
false
16,203,901
0
307
1
0
0
16,203,859
I believe the only one who can answer this question is you. All 3 examples you gave can do what you need to do, with cron to automate the job. But the best scripting language to use is the one you are most comfortable with.
1
0
0
What's the best way to automate running MySQL scripts on several databases on a daily basis?
3
php,python,mysql,sql,ruby
1
2013-04-24T23:19:00.000
I am quite new to heroku and I reached a bump in my dev... I am trying to write a server/client kind of application...on the server side I will have a DB(I installed postgresql for python) and I was hoping I could reach the server, for now, via a python client(for test purposes) and send data/queries and perform basic ...
0
3
0.53705
0
false
16,245,012
1
569
1
0
0
16,244,924
Heroku is for developing Web (HTTP, HTTPS) applications. You can't deploy code that uses socket to Heroku. If you want to run your app on Heroku, the easiest way is to use a web framework (Flask, CherryPy, Django...). They usually also come with useful libraries and abstractions for you to talk to your database.
1
0
0
how to write a client/server app in heroku
1
python,heroku
0
2013-04-26T20:44:00.000
I have a SQLAlchemy Session object and would like to know whether it is dirty or not. The exact question I would like to (metaphorically) ask the Session is: "If at this point I issue a commit() or a rollback(), is the effect on the database the same or not?". The rationale is this: I want to ask the user whether h...
18
0
0
0
false
16,257,019
0
12,854
1
0
0
16,256,777
Sessions have a private _is_clean() member which seems to return true if there is nothing to flush to the database. However, the fact that it is private may mean it's not suitable for external use. I'd stop short of personally recommending this, since any mistake here could obviously result in data loss for your users.
1
0
0
How to check whether SQLAlchemy session is dirty or not
4
python,sqlalchemy
0
2013-04-27T20:54:00.000
I'm building a web app in GAE that needs to make use of some simple relationships between the datastore entities. Additionally, I want to do what I can from the outset to make import and exportability easier, and to reduce development time to migrate the application to another platform. I can see two possible ways of h...
0
1
1.2
0
true
16,268,751
1
48
1
1
0
16,266,979
GAE's datastore just doesn't export well to SQL. There are often situations where data needs to be modeled very differently on GAE to support certain queries, i.e. many-to-many relationships. Denormalizing is also the right way to support some queries on GAE's datastore. Ancestor relationships are something that don't e...
1
0
0
GAE: planning for exportability and relational databases
1
google-app-engine,python-2.7,google-cloud-datastore
0
2013-04-28T19:40:00.000
I am using wx.Grid to build spreadsheetlike input interface. I want to lock the size of the cells so the user can not change them. I have successfully disabled the drag-sizing with grid.EnableDragGridSize(False) of the grid but user can still resize the cells by using borders between column and row labels. I am probabl...
1
0
1.2
0
true
16,279,016
0
426
1
0
0
16,278,613
I found the solution. To completely lock the user's ability to resize cells you need to use the .EnableDragGridSize(False), .DisableDragColSize() and .DisableDragRowSize() methods.
1
1
0
wx.Grid cell size lock
1
python,wxpython
0
2013-04-29T12:26:00.000
I am able to easily call a python script from php using system(), although there are several options. They all work fine, except they all fail. Through trial and error I have narrowed it down to it failing on import MySQLdb I am not too familiar with php, but I am using it in a pinch. I understand while there could be...
0
1
1.2
0
true
16,282,538
0
322
1
0
0
16,281,823
The Apache user (www-data in your case) has a somewhat restricted environment. Check where the Python MySQLdb package is installed and edit the Apache user's env (cf Apache manual and your distrib's one about this) so it has a usable Python environment with the right PYTHONPATH etc.
1
0
0
call python script from php that connects to MySQL
1
php,python,mysql,linux
1
2013-04-29T14:55:00.000
Let's say I have some free form entries for names, where some are in the format "Last Name, First Name" and others are in the format "First Name Last Name" (e.g. "Bob MacDonald" and "MacDonald, Bob" are both present). From what I understand, Lucene indexing does not allow for wildcards in the beginning of the sentence, s...
2
1
0.099668
0
false
16,290,406
1
194
1
0
0
16,290,237
Can you just use OR? "Hilary Clinton" OR "Clinton, Hilary"?
1
0
1
Lucene or Python: Select both "Hilary Clinton" and "Clinton, Hilary" name entries
2
python,regex,neo4j,lucene
0
2013-04-30T00:09:00.000
There are two possible cases where I am finding MySQL and RDBMS too slow. I need a recommendation for a better alternative in terms of NOSQL. 1) I have an application that's saving tons of emails for later analysis. Email content is saved in a simple table with a couple of relations to another two tables. Columns are s...
0
1
1.2
0
true
16,306,049
0
56
1
0
0
16,304,959
If search is your primary use case, I'd look into a search solution like ElasticSearch or Solr. Even if some databases support some sort of full text indexing, they're not optimized for this problem.
1
0
0
Possible NoSQL cases
1
python,nosql
0
2013-04-30T16:42:00.000
I have been running a Python octo.py script to do word counting/author on a series of files. The script works well -- I tried it on a limited set of data and am getting the correct results. But when I run it on the complete data set it takes forever. I am running on a windows XP laptop with dual core 2.33 GHz and 2 GB ...
0
0
1.2
0
true
16,378,262
0
196
1
0
0
16,376,374
As your application isn't very CPU intensive, the slow disk turns out to be the bottleneck. Old 5200 RPM laptop hard drives are very slow, which, in addition to fragmentation and low RAM (which impacts disk caching), make reading very slow. This in turns slows down processing and yields low CPU usage. You can try defra...
1
0
0
Octo.py only using between 0% and 3% of my CPUs
1
python-2.7,multiprocessing,cpu-usage
0
2013-05-04T16:20:00.000
I'm trying to write a function to do a bulk-save to a mongoDB using pymongo, is there a way of doing it? I've already tried using insert and it works for new records but it fails on duplicates. I need the same functionality that you get using save but with a collection of documents (it replaces an already added documen...
3
1
0.197375
0
false
16,380,066
0
311
1
0
0
16,379,254
You can use bulk insert with the option w=0 (formerly safe=False), but then you should check whether all documents were actually inserted, if this is important to you.
1
0
1
Is there a pymongo (or another Python library) bulk-save?
1
python,mongodb,pymongo
0
2013-05-04T21:42:00.000
I am executing update in mysqldb which is changing the values of part of a key and field. When I execute the query in python it triggers something in the database to cause it to add extra rows. When I execute the same exact query from mysql workbench it performs the update correctly without adding extra rows. What is...
0
0
0
0
false
16,943,780
0
78
1
0
0
16,420,461
There was a trigger activating that I did not know about. Thanks for the help
1
0
0
MySQLdb for python behaves differently for queries than the mysql workbench browser
1
python,mysql
0
2013-05-07T13:35:00.000
If I have text that is saved in a Postgresql database is there any way to execute that text as Python code and potentially have it update the same database?
0
0
0
0
false
16,470,721
0
162
1
0
0
16,470,079
let me see if I understand what you are trying to accomplish: store ad-hoc user code in a varchar field on a database read and execute said code allow said code to affect the database in question, say drop table ... Assuming that I've got it, you could write something that reads the table holding the code (use pyodb...
1
0
0
Execute text in Postgresql database as Python code
2
python,postgresql
0
2013-05-09T19:57:00.000
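A minimal sketch of the read-and-exec idea described above, using sqlite3 in place of PostgreSQL (the table layout is hypothetical). Note that exec on database-stored text is inherently dangerous unless you fully control who can write to that table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scripts (name TEXT, source TEXT)")
conn.execute("CREATE TABLE results (value INTEGER)")
conn.execute(
    "INSERT INTO scripts VALUES (?, ?)",
    ("demo", "conn.execute('INSERT INTO results VALUES (42)')"),
)

# Read the stored text back and execute it as Python, handing it the
# same connection so the code can update the database it came from.
source = conn.execute(
    "SELECT source FROM scripts WHERE name = ?", ("demo",)
).fetchone()[0]
exec(source, {"conn": conn})  # DANGEROUS with untrusted input

print(conn.execute("SELECT value FROM results").fetchone())  # (42,)
```

With psycopg2 the pattern would be the same: SELECT the source column, then exec it with whatever objects (connection, cursor) the stored code is allowed to use.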
I can connect to my local mysql database from python, and I can create, select from, and insert individual rows. My question is: can I directly instruct mysqldb to take an entire dataframe and insert it into an existing table, or do I need to iterate over the rows? In either case, what would the python script look lik...
60
-1
-0.022219
0
false
56,185,092
0
168,445
1
0
0
16,476,413
df.to_sql(name="owner", con=db_connection, schema="aws", if_exists="replace", index=True, index_label="id")
1
0
0
How to insert pandas dataframe via mysqldb into database?
9
python,mysql,pandas,mysql-python
0
2013-05-10T06:29:00.000
I want to do something like select * from table where name like '%name%'. Is there any way to do this in HBase? And if there is a way, how do I do that? PS: I use HappyBase to communicate with HBase.
0
1
0.197375
0
false
16,608,107
0
364
1
0
0
16,606,906
HBase provides a scanner interface that allows you to enumerate over a range of keys in an HTable. HappyBase has support for scans and this is documented pretty well in their API. So this would solve your question if you were asking for a "like 'name%'" type of query which searches for anything that begins with the pre...
1
0
0
Hbase wildcard support
1
python,hbase,thrift
0
2013-05-17T10:33:00.000
I have to read and write data into files with the .xlsx extension using Python. And I have to use cell formatting features like merging cells, bold, font size, color, etc. So which Python module is good to use?
0
1
0.099668
0
false
24,190,976
0
367
1
0
0
16,651,124
openpyxl is the only library I know of that can read and write xlsx files. Its downside is that when you edit an existing file it doesn't save the original formatting or charts. A problem I'm dealing with right now. If anyone knows a workaround please let me know.
1
0
1
Which module has more option to read and write xlsx extension files using Python?
2
python
0
2013-05-20T13:55:00.000
In MongoDB if we provide a coordinate and a distance, using $near operator will find us the documents nearby within the provided distance, and sorted by distance to the given point. Does Redis provide similar functions?
2
1
1.2
0
true
16,886,089
0
798
1
0
0
16,761,134
Noelkd was right. There is no inbuilt function in Redis. I found that the simplest solution is to use geohash to store the hashed lat/lng as keys. Geohash is able to store nearby locations with similar structure, e.g. if the hash of a certain location is ebc8ycq, then nearby locations can be queried with the wildcard eb...
1
0
0
How to find geographically near documents in Redis, like $near in MongoDB?
2
python,mongodb,redis,geospatial
0
2013-05-26T16:13:00.000
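For reference, the geohash encoding the accepted answer relies on can be implemented in a few lines of pure Python. This is a sketch of the standard algorithm (interleaved longitude/latitude bisection bits, base32-encoded), not Redis-specific code:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash(lat, lon, precision=7):
    """Interleave longitude/latitude bisection bits and base32-encode them."""
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    bits, even = [], True  # even-numbered bits refine longitude
    while len(bits) < precision * 5:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid
        else:
            bits.append(0)
            rng[1] = mid
        even = not even
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, precision * 5, 5)
    )

print(geohash(57.64911, 10.40744))  # 'u4pruyd'
```

Nearby points share a hash prefix, so storing the hash in the Redis key lets you query a neighborhood with a prefix pattern (e.g. SCAN with MATCH u4pru*), subject to the usual geohash caveats about points straddling cell boundaries.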
I am running my Django site on appengine. In the datastore, there is an entity kind / table X which is only updated once every 24 hours. X has around 15K entries and each entry is of form ("unique string of length <20", integer). In some context, a user request involves fetching an average of 200 entries from X, which...
1
1
1.2
0
true
16,775,062
1
53
1
1
0
16,773,961
Your total amount of data is very small and looks like a dict. Why not save it (this object) as a single entry in the database or the blobstore, so you can cache this entry.
1
0
0
A way to optimize reading from a datastore which updates once a day
1
python,django,google-app-engine
0
2013-05-27T13:08:00.000
I need to represent instances of Python "Long integer" in MySQL. I wonder what the most appropriate SQL data type I should use. The Python documentation (v2.7) says (for numbers.Integral): Long integers These represent numbers in an unlimited range, subject to available (virtual) memory only. For the purpose of shift...
2
3
0.148885
0
false
16,867,914
0
1,333
1
0
0
16,867,823
Yes, if you really need unlimited precision then you'll have to use a blob, because even strings are limited. But really, I can almost guarantee that you'll be fine with a NUMERIC/DECIMAL data type. 65 digits means that you can represent numbers in the range (-10^65, 10^65). How large is this? To give you some idea: The ...
1
0
0
What are the options for storing Python long integers in MySQL?
4
python,mysql,mysql-python
0
2013-06-01T00:26:00.000
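To put the 65-digit DECIMAL bound from the answer in perspective, a quick check of how many decimal digits some Python long integers need:

```python
# How big is a 65-digit DECIMAL? Compare against some Python long integers.
DECIMAL_MAX_DIGITS = 65

for value in (2 ** 64, 2 ** 200, 2 ** 256):
    digits = len(str(value))
    print(digits, digits <= DECIMAL_MAX_DIGITS)
# 20 True   (2**64, the usual BIGINT UNSIGNED ceiling)
# 61 True   (2**200 still fits in DECIMAL(65))
# 78 False  (2**256 would need a string/blob column)
```

So unless your values exceed roughly 2**215, DECIMAL is enough and keeps the column numerically comparable in SQL.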
Can someone advise on what database is better for storing textual information such as part of speech sequences, dependencies, sentences used in NLP project written in python. Now this information is stored in files and they need to be parsed every time in order to extract the mentioned blocks which are used as an input...
0
6
1.2
0
true
16,873,052
0
2,075
1
0
0
16,872,221
This really depends on what exactly you are storing and which operations you will perform on this data. SQL vs. NoSQL is a very fundamental decision and no one can give you a good advice here. If your data fits relational model well, then, SQL (PostgreSQL or MySQL) is your choice. If your data is more like documents, u...
1
0
0
Database for NLP project
1
python,mysql,mongodb,nlp,bigdata
0
2013-06-01T11:31:00.000
So I have my Django app running and I just added South. I performed some migrations which worked fine locally, but I am seeing some database errors on my Heroku version. I'd like to view the current schema for my database both locally and on Heroku so I can compare and see exactly what is different. Is there an easy wa...
2
3
1.2
0
true
16,942,831
1
3,348
1
0
0
16,942,317
From the command line you should be able to do heroku pg:psql to connect directly via PSQL to your database and from in there \dt will show you your tables and \d <tablename> will show you your table schema.
1
0
0
How to View My Postgres DB Schema from Command Line
3
python,django,postgresql,heroku,django-south
0
2013-06-05T14:15:00.000
I am executing an update query using MySQLdb and python 2.7. Is it possible to know which rows affected by retrieving all their ids?
1
2
1.2
0
true
16,961,869
0
67
1
0
0
16,961,438
You can get the number of affected rows by using cursor.rowcount. The information which rows are affected is not available since the mysql api does not support this.
1
0
0
Python, mySQLdb: Is it possible to retrieve updated keys, after update?
1
python,mysql-python
0
2013-06-06T11:53:00.000
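Since the MySQL API won't report which rows an UPDATE touched, the common workaround is to SELECT the matching ids first and then UPDATE by those ids, both inside one transaction so the two statements see the same rows. A sketch with sqlite3; the table layout is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO t (qty) VALUES (?)", [(1,), (5,), (9,)])

# SELECT the ids matching the WHERE clause, then UPDATE by those ids.
ids = [r[0] for r in conn.execute("SELECT id FROM t WHERE qty > 3")]
conn.executemany("UPDATE t SET qty = qty + 1 WHERE id = ?", [(i,) for i in ids])
conn.commit()

print(ids)  # [2, 3]: exactly the rows the update affected
```

With MySQLdb you would additionally want the SELECT to use FOR UPDATE so concurrent writers can't change the matching set between the two statements.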
MySQL is installed at /usr/local/mysql. In site.cfg the path for mysql_config is /usr/local/mysql/bin/mysql_config, but when I try to build in the terminal I'm getting this error: hammads-imac-2:MySQL-python-1.2.4b4 syedhammad$ sudo python setup.py build running build running build_py copying MySQLdb/release.py -> build/l...
1
2
1.2
0
true
16,985,650
0
141
1
1
0
16,985,604
You probably need Xcode's Command Line Tools. Download the latest version of Xcode, then go to "Preferences", select "Download" tab, then install Command Line Tools.
1
0
0
Configuring MySQL with python on OS X lion
1
python,mysql,macos
0
2013-06-07T13:41:00.000
We are currently developing an application that makes heavy use of PostgreSQL. For the most part we access the database using SQLAlchemy, and this works very well. For testing the relevant objects can be either mocked, or used without database access. But there are some parts of the system that run non-standard queries...
3
0
0
0
false
17,017,714
0
161
1
0
0
16,999,676
So after playing around with it some more I now have a solution that is halfway decent. I split the class in question up into three separate classes: a class that provides access to the required data; a context manager that supports the temporary table stuff; and the old class with all the logic (sans the database stu...
1
0
0
Design Pattern for complicated queries
1
python,sql,design-patterns
0
2013-06-08T12:48:00.000
I've been writing a Python web app (in Flask) for a while now, and I don't believe I fully grasp how database access should work across multiple request/response cycles. Prior to Python my web programming experience was in PHP (several years worth) and I'm afraid that my PHP experience is misleading some of my Python w...
4
4
1.2
0
true
17,012,369
1
351
1
0
0
17,012,349
Have you looked in to SQLAlchemy at all? It takes care of a lot of the dirty details - it maintains a pool of connections, and reuses/closes them as necessary.
1
0
0
Database access strategy for a Python web app
2
python,psycopg2
0
2013-06-09T17:36:00.000
Need to get one row from a table, and delete the same row. It does not matter which row it is. The function should be generic, so the column names are unknown, and there are no identifiers. (Rows as a whole can be assumed to be unique.) The resulting function would be like a pop() function for a stack, except that the ...
0
1
1.2
0
true
17,382,716
0
64
1
0
0
17,127,306
Well, every table in SQLite has a rowid. Select one and delete it?
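The rowid approach can be sketched with the standard-library sqlite3 module; the function name and table are illustrative, and `ORDER BY rowid` is added only to make the example deterministic (any arbitrary row would do, as the question says):

```python
import sqlite3

def row_pop(conn, table):
    """Fetch an arbitrary row and delete it, like stack.pop().

    Relies on SQLite's implicit rowid column, so it works even when
    the table's column names are unknown. Returns None when empty.
    """
    cur = conn.execute(f"SELECT rowid, * FROM {table} ORDER BY rowid LIMIT 1")
    row = cur.fetchone()
    if row is None:
        return None
    conn.execute(f"DELETE FROM {table} WHERE rowid = ?", (row[0],))
    return row[1:]  # drop the rowid, keep the actual columns

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stack (a TEXT, b INTEGER)")
conn.executemany("INSERT INTO stack VALUES (?, ?)", [("x", 1), ("y", 2)])

print(row_pop(conn, "stack"))  # ('x', 1)
```

Two caveats: the table name is interpolated into the SQL, so it must come from trusted code, and tables declared WITHOUT ROWID have no rowid column.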
1
0
0
row_pop() function in pysqlite?
1
python,database,pysqlite
0
2013-06-15T19:47:00.000
I have the user registration form made in Django. I want to know the city from which the user is registering. Is there any way that I can get the IP address of the user and then somehow get the city for that IP, using some API or something?
0
0
0
0
false
17,159,679
1
163
1
0
0
17,159,576
Not in any reliable way, or at least not in Django. The problem is that user IPs are usually dynamic, hence the address changes every couple of days. Also some ISPs will soon start to use a single IP for big blocks of users (this is called carrier-grade NAT) since they are running out of IPv4 addresses... In other wor...
1
0
0
Is there any simple way to store the user location while registering in database
4
python,django,ip
0
2013-06-18T02:12:00.000
I am trying to put the items, scraped by my spider, in a MySQL db via a MySQL pipeline. Everything is working but I see some odd behaviour. I see that the filling of the database is not in the same order as the website itself. There is like a random order. Probably of the dictionary like list of the items scraped i gue...
2
0
1.2
0
true
17,213,740
1
201
2
0
0
17,213,515
Rows in a database have no particular order unless you impose one. So you should add a timestamp to your table in the database, keep it up to date (MySQL has a special flag to mark a field as auto-now) and use ORDER BY in your queries.
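The point that the query, not the insertion order, defines the order you get back can be sketched with sqlite3 (in MySQL the `created` column would typically be `TIMESTAMP DEFAULT CURRENT_TIMESTAMP`; explicit integer timestamps are used here only so the example is deterministic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, created INTEGER)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?)",
    [("third", 30), ("first", 10), ("second", 20)],  # inserted out of order
)

# ORDER BY restores the intended order regardless of insertion order.
rows = conn.execute("SELECT name FROM items ORDER BY created").fetchall()
print([r[0] for r in rows])  # ['first', 'second', 'third']
```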
1
0
0
Scrapy reversed item ordering for preparing in db
3
python,scrapy
0
2013-06-20T12:20:00.000
I am trying to put the items, scraped by my spider, in a MySQL db via a MySQL pipeline. Everything is working but I see some odd behaviour. I see that the filling of the database is not in the same order as the website itself. There is like a random order. Probably of the dictionary like list of the items scraped i gue...
2
1
0.066568
0
false
17,221,923
1
201
2
0
0
17,213,515
It's hard to say without the actual code, but in theory: Scrapy is completely asynchronous, so you cannot know the order in which items will be parsed and processed through the pipeline. But you can control the behavior by "marking" each item with a priority key. Add a field priority to your Item class, in the parse_item method o...
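Independent of Scrapy's actual API, the marking idea can be sketched in plain Python: number each item at parse time, then sort on that number before writing out (the function names and dict-based items here are illustrative, not Scrapy classes):

```python
# Sketch of the "priority key" idea: tag each item with its position
# on the page at parse time, then restore that order when writing out.

def parse_page(raw_rows):
    """Parse-time side: record each item's position on the page."""
    for position, row in enumerate(raw_rows):
        yield {"priority": position, "data": row}

def write_in_site_order(items):
    """Pipeline side: sort by the priority field before inserting."""
    return [item["data"] for item in sorted(items, key=lambda i: i["priority"])]

scraped = list(parse_page(["a", "b", "c"]))
scraped.reverse()  # pretend the async pipeline received them backwards
print(write_in_site_order(scraped))  # ['a', 'b', 'c']
```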
1
0
0
Scrapy reversed item ordering for preparing in db
3
python,scrapy
0
2013-06-20T12:20:00.000
Let's take SQLAlchemy as an example. Why should I use the Flask SQLAlchemy extension instead of the normal SQLAlchemy module? What is the difference between those two? Isn't it perfectly possible to just use the normal module in your Flask app?
1
4
1.2
0
true
17,223,377
1
99
1
0
0
17,222,824
The extensions exist to extend the functionality of Flask, and reduce the amount of code you need to write for common usage patterns, like integrating your application with SQLAlchemy in the case of flask-sqlalchemy, or login handling with flask-login. Basically just clean, reusable ways to do common things with a web ...
1
0
0
Why do Flask Extensions exist?
1
python,sqlalchemy,flask,flask-sqlalchemy
0
2013-06-20T20:06:00.000
I would like to save data to sqlite3 databases which will be fetched from the remote system by FTP. Each database would be given a name that is an encoding of the time and date with a resolution of 1 hour (i.e. a new database every hour). From the Python 3 sqlite3 library, would any problems be encountered if two threa...
1
0
0
0
false
17,275,138
0
426
1
0
0
17,274,626
This will work just fine. When two threads are trying to create the same file, one will fail to do so, but it will continue to try to lock the file.
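A small sketch of the race, using only the standard library: two threads connect to the same (not yet existing) database file, and SQLite's file locking serializes their writes; the `timeout` argument tells the blocked connection how long to keep retrying the lock. The path and table here are made up for the demonstration.

```python
import os
import sqlite3
import tempfile
import threading

# Two threads race to create and write to the same database file.
path = os.path.join(tempfile.mkdtemp(), "shared.db")

def writer(value):
    conn = sqlite3.connect(path, timeout=10)  # wait on locks up to 10s
    conn.execute("CREATE TABLE IF NOT EXISTS log (v INTEGER)")
    conn.execute("INSERT INTO log VALUES (?)", (value,))
    conn.commit()
    conn.close()

threads = [threading.Thread(target=writer, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

conn = sqlite3.connect(path)
print(conn.execute("SELECT COUNT(*) FROM log").fetchone()[0])  # 2
```

Note that each thread opens its own connection; sharing a single sqlite3 connection object across threads is a separate issue governed by the `check_same_thread` argument.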
1
0
0
Can sqlite3 databases be created in a thread-safe way?
1
python,python-3.x,sqlite
0
2013-06-24T11:41:00.000
I have a flask application which use three types of databases - MySQL, Mongo and Redis. Now, if it had been simple MySQL I could have use SQLAlchemy or something on that line for database modelling. Now, in the current scenario where I am using many different types of database in a single application, I think I will ha...
3
0
0
0
false
17,289,054
1
79
1
0
0
17,276,970
It's not an efficient model, but this would work: You can write three different APIs (RESTful pattern is a good idea). Each will be an independent Flask application, listening on a different port (likely over localhost, not the public IP interface). A fourth Flask application is your main application that external clien...
1
0
0
How to create models if I am using various types of database simultaneously?
2
python,database,flask,flask-sqlalchemy
0
2013-06-24T13:41:00.000
I'm looking for the best approach for inserting a row into a spreadsheet using openpyxl. Effectively, I have a spreadsheet (Excel 2007) which has a header row, followed by (at most) a few thousand rows of data. I'm looking to insert the row as the first row of actual data, so after the header. My understanding is that ...
19
-1
-0.016665
0
false
17,305,443
0
90,928
1
0
0
17,299,364
Unfortunately there isn't really a better way to do it than to read in the file and use a library like xlwt to write out a new Excel file (with your new row inserted at the top). Excel doesn't work like a database that you can read and append to. You unfortunately just have to read in the information and manipulate...
1
0
0
Insert row into Excel spreadsheet using openpyxl in Python
12
python,excel,xlrd,xlwt,openpyxl
0
2013-06-25T14:00:00.000
So I have a password protected XLS file which I've forgotten the password for... I'm aware it's a date within a certain range so I'm trying to write a brute forcer to try various dates of the year. However, I can't find how to use Python/Java to enter the password for the file. It's protected such that I can't open the ...
0
0
0
0
false
17,344,366
0
268
1
0
0
17,344,335
If you search, there are a number of applications you can download that will unlock the workbook.
1
0
0
How to enter password in XLS files with python?
1
java,python,excel,passwords,xls
0
2013-06-27T13:20:00.000
In my python/django based web application I want to export some (not all!) data from the app's SQLite database to a new SQLite database file and, in a web request, return that second SQLite file as a downloadable file. In other words: The user visits some view and, internally, a new SQLite DB file is created, populate...
0
1
1.2
0
true
17,382,483
1
169
1
0
0
17,382,053
I'm not sure you can get at the contents of a :memory: database to treat it as a file; a quick look through the SQLite documentation suggests that its API doesn't expose the :memory: database to you as a binary string, or a memory-mapped file, or any other way you could access it as a series of bytes. The only way to a...
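The file-based workaround the answer describes can be sketched with the standard library alone: build the export database in a temporary file on disk, read its bytes back, and hand those bytes to the response (the helper name, table, and columns here are illustrative; in Django the bytes would go into an HttpResponse with an appropriate content type).

```python
import os
import sqlite3
import tempfile

def export_db(rows):
    """Build a fresh SQLite file on disk and return its raw bytes,
    ready to be attached to an HTTP response as a download."""
    fd, path = tempfile.mkstemp(suffix=".sqlite")
    os.close(fd)
    try:
        conn = sqlite3.connect(path)  # file-based, NOT :memory:
        conn.execute("CREATE TABLE export (name TEXT, value INTEGER)")
        conn.executemany("INSERT INTO export VALUES (?, ?)", rows)
        conn.commit()
        conn.close()
        with open(path, "rb") as f:
            return f.read()
    finally:
        os.remove(path)

blob = export_db([("a", 1), ("b", 2)])
print(blob[:16])  # every SQLite file starts with b'SQLite format 3\x00'
```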
1
0
0
Python: Create and return an SQLite DB as a web request result
2
python,django,sqlite
0
2013-06-29T16:01:00.000