I have a very simple table 'logs' with just two fields: login (text) and log (text). There is no auto-increment id and no indexes yet. The database is used by one person, just to do searches. It's loaded with new information in big batches, so there aren't a lot of small writes, and write speed isn't a big concern.

The database is quite big: 522 million records and about 24 GB. Given the file size and the need to be portable, it's located on an external USB drive. I selected SQLite for its simplicity and because the database itself is so simple. Also, the docs say it should be able to handle databases of about a terabyte just fine, and speed wasn't my biggest concern... however...

Currently, running a query like SELECT * FROM logs WHERE login LIKE '%andy%'; takes about 4 minutes to complete. I have played around with indexes, but the most I seem to be able to shave off the queries is a few seconds, so I guess I'm doing something wrong...

The question is: what can I do to improve the speed of such queries? And what are realistic speeds to expect from a database this size?

All help is appreciated! Thanks in advance
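For reference, here is a minimal sketch of the setup in Python's sqlite3 module (tiny and in-memory, unlike the real 522M-row file on a USB drive; the sample rows are made up). It also runs EXPLAIN QUERY PLAN, which is one way to check whether a query is using an index at all; a leading-wildcard LIKE such as '%andy%' generally forces a full table scan, which may be why indexes barely helped:

```python
import sqlite3

# Tiny in-memory reproduction of the 'logs' table described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (login TEXT, log TEXT)")
conn.executemany(
    "INSERT INTO logs VALUES (?, ?)",
    [("andy123", "logged in"), ("bob", "logged out"), ("mandy", "error")],
)

# The slow query from the post: substring match on login.
rows = conn.execute("SELECT * FROM logs WHERE login LIKE '%andy%'").fetchall()
print(rows)  # matches both 'andy123' and 'mandy'

# Even with an index (hypothetical name idx_login), the plan still
# reports a SCAN for a leading-wildcard LIKE, not an index SEARCH.
conn.execute("CREATE INDEX idx_login ON logs(login)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM logs WHERE login LIKE '%andy%'"
).fetchall()
print(plan)
```

The exact wording of the query-plan output varies between SQLite versions, but the detail column should contain "SCAN" rather than "SEARCH ... USING INDEX" for this query.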