It is possible to retrieve the pages via wget. The simplest approach is to copy the URL that Google Desktop produces and pass it to wget "URL HERE".
However, if you want to do this entirely from Terminal without searching in Google Desktop first, that depends on whether Google Desktop has to generate the results before giving you the link, and on whether the port it uses stays the same.
Try searching for "PIE" in Google Desktop, then, in the web page that opens, edit the URL so it says "CAKE" instead of "PIE" and check whether it returns results for "CAKE". If that works, you can copy the URL into Terminal and modify it for whatever results you want:
wget "http://localhost:33327/search?flags=8&hl=en_US&num=10&q=SEARCH+REQUEST+HERE&start=0&s=JD1G1cWkjb88GSZ1EPB3LVgcSwo"
That should work, and you're free to use it in a Bash script or whatever.
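As a rough sketch of what such a script could look like (the port number and the s= token are copied from the URL above and will almost certainly differ on your machine, so treat them as placeholders):

#!/bin/bash
# gdsearch.sh -- query Google Desktop from the command line (sketch only)
# Usage: ./gdsearch.sh "search terms here"
QUERY=$(echo "$1" | sed 's/ /+/g')   # replace spaces with + for the query string
wget -q -O results.html "http://localhost:33327/search?flags=8&hl=en_US&num=10&q=${QUERY}&start=0&s=JD1G1cWkjb88GSZ1EPB3LVgcSwo"

Running ./gdsearch.sh "cake recipe" would then save the results page to results.html in the current directory.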
However, viewing the results of the search in the Terminal is another matter. You could open the HTML file in a terminal editor like nano, but then you'll be staring at raw HTML, which isn't the easiest thing to read as search results. It is possible to write a script that scrapes the results into something legible in the Terminal, but getting that right takes a lot of tweaking and is beyond the scope of this question. If you want to use Google Desktop search, I recommend simply opening the web page in a browser, as it was designed to be used.
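If you just want a rough idea of what that scraping would involve, a crude first pass is to strip the tags and blank lines from the saved page (results.html here is the file name assumed in the sketch above; the sed expressions would need tuning to Google Desktop's actual markup):

sed -e 's/<[^>]*>//g' -e '/^[[:space:]]*$/d' results.html | less

That gives you plain text you can page through, but it won't look anything like nicely formatted search results without a lot more work.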