Examples of returning database query results as CSV

Have you ever needed a comma-separated list of node ids from your database to put into a query in another Drupal site? Or just for testing?

Before we get started, I should note, I was really happy with drupal.org today. I came back after a couple weeks of work stealing me away, and found 4 issues I was subscribed to or involved in that had patches reviewed and committed. Even some of my own! What a great feeling. So surely, I wanted to give back by reviewing some patches in my own queue.

So, I went about resolving a couple issues with the 7.x release of the Delete All module. At some point, I wanted to get a list of nids in a comma separated value form in order to put them in a sub-query. I'm a command line junkie (in the sense that I'm addicted to it, not necessarily that I know the right way to use it).

Now, if I write a little PHP and run it with drush so I get access to Drupal's database layer, I can simply do:

$> vim getnodes.php


&lt;?php
$result = db_select('node', 'n')->fields('n', array('nid'))->condition('type', 'page', '=')->execute()->fetchCol();
print implode(',', $result);

$> drush php-script getnodes.php

An even easier option is to skip the script and run the same query directly with drush sql-query:

$> drush sql-query "select nid from node where type='page';"

But then you still need to pass the output through some PHP or other processing to turn it into comma-separated values.
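That processing can be as small as one pipe. As a sketch, assuming the query prints a header row and then one nid per line (the nids below are made-up sample data, not real query output):

```shell
# Simulate the one-nid-per-line output of `drush sql-query` (sample data),
# drop the header row with tail, then join the remaining lines with commas.
printf 'nid\n12\n34\n56\n' | tail -n +2 | paste -sd, -
# → 12,34,56
```

With a live site, the `printf` would be replaced by the actual `drush sql-query` call.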

But what if I want to get this same info directly from the MySQL database. Well, one way is selecting into an OUTFILE as part of your query.


SELECT nid
INTO OUTFILE '/tmp/nodes.csv'
FROM drupal.node WHERE type = 'page';

There are several problems here. First, you need permission to write files on the database machine, which is not always the case; with AppArmor on Ubuntu, you sometimes can't do this at all without disabling it. Second, you have to go find the outfile and open it afterwards to get your data (perhaps that could be piped as well, though the permissions problem remains). Third, it requires writing your queries differently, and I can never get used to putting "INTO OUTFILE '/tmp/whatever.csv'" in the right place in the query. Fourth, you have to specify how to terminate and enclose all the fields so you don't end up with one value per line.
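To illustrate that fourth point, getting a single comma-separated line out of OUTFILE means spelling the terminators out yourself; something like the following sketch (the file path and terminator choices are just for illustration):

```sql
SELECT nid FROM drupal.node WHERE type = 'page'
INTO OUTFILE '/tmp/nodes.csv'
FIELDS TERMINATED BY ','
LINES TERMINATED BY ',';
```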

But hold on! There's a better solution. One can do this in a couple of piped commands directly from the command line. A lot of googling didn't turn up this solution in its entirety; it's pulled together from a couple of StackOverflow posts plus some advice from a blog.


mysql -uuser -p -h myhost.com -A drupal_cms -e "select nid from node where type='page'" | awk '{print $1","$2}' | tr -d '\n'

Or if you have drush available:

drush sql-query "select nid from node where type='page'" | awk '{print $1","$2}' | tr -d '\n'

This uses the ability to run a MySQL query from the command line, passes the results to awk to join the fields with commas, and finally hands them to tr, making sure to use -d, which deletes the newlines. And voila, we can start running these commands easily from the command line and get on with our business faster. Put this as a function in your .bash_profile:


function mycmscsv() {
  mysql -uuser -p -A drupal_cms -e "$*" | awk '{print $1","$2}' | tr -d '\n'
}

And now all you have to do is run mycmscsv "select nid from node where type='fellows_blog_post'"
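One caveat: as written, the pipeline keeps the header row and leaves a trailing comma. If those get in the way, a tail at the front and a sed at the end clean up both. Shown here on made-up sample output rather than a live database:

```shell
# Simulated query output: header row plus one nid per line (sample data).
printf 'nid\n12\n34\n56\n' \
  | tail -n +2 \
  | awk '{print $1","$2}' \
  | tr -d '\n' \
  | sed 's/,$//'
# → 12,34,56
```

The same tail and sed steps can be dropped into the mycmscsv function if you always want the clean form.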

Hope you found that useful, and if you have a better way to do it, let me know!

If you liked the above, then you may like the Bash Cookbook: Solutions and Examples for Bash Users (O'Reilly)
