Curious to see if anyone else has experienced this -- in the past, I have used Postgres for projects with large data sets. In the last few months I have become interested in using SQLite for data management, because it relieves me of needing to run a full RDBMS (either locally or on a separate machine).
Recently I got a project which seemed perfect for SQLite. I needed to run some regressions against several years' worth of data files, all saved in SPSS's .sav format. So I imported them into R (each file is 300+ columns wide at ~75,000 rows) and made each file into a separate table. Sadly, the data structure changed with each year, and it's a PITA managing all of the changes -- that's one of the reasons I wanted it in a DB. I have defined a set of views which make my life much easier and let R work with just the subsets of the data that I'm actually interested in.
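For anyone curious about the setup, here's a simplified sketch of what the import side amounts to, using haven plus DBI/RSQLite; the file, table, and column names are just placeholders, not my real ones:

library(haven)    # read_sav() for SPSS .sav files
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "survey.sqlite")

# one table per year's file; zap_labels() drops the SPSS value labels
# so the columns write cleanly as plain numerics/strings
dat_2010 <- haven::zap_labels(haven::read_sav("data_2010.sav"))
dbWriteTable(con, "raw_2010", as.data.frame(dat_2010), overwrite = TRUE)

# a view exposing just the columns the regressions need
dbExecute(con, "CREATE VIEW IF NOT EXISTS reg_2010 AS
                SELECT caseid, year, income, age FROM raw_2010")

dbDisconnect(con)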
But I have noticed that R + SQLite seems a little flaky. I have gotten some weird "table is locked" errors when trying to write my data into the SQLite file. To get the table write functions to work, I sometimes have to exit my R process, make sure I have closed SQLiteman, and start all over. Once or twice I have even succeeded in freezing Emacs.
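For what it's worth, a stripped-down example of the kind of write that hits the lock errors, with the explicit result clearing and disconnect that is supposed to release SQLite's file-level locks (placeholder names again) -- since SQLite locks the whole file, an open result set in R or the database left open in SQLiteman seems to be enough to block a write:

library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "survey.sqlite")

# an open result set holds a read lock on the whole database file,
# so fetch and clear it before trying to write
res <- dbSendQuery(con, "SELECT * FROM reg_2010")
model_input <- dbFetch(res)
dbClearResult(res)

dbWriteTable(con, "model_input", model_input, overwrite = TRUE)

# disconnect so other tools (SQLiteman, etc.) aren't blocked
dbDisconnect(con)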
Has anyone else ever experienced anything like this? It doesn't seem to happen when I work with Postgres. It could be R + SQLite, or it could be me doing something dumb to SQLite -- I'm not really sure ATM.
Thoughts?