We've all seen this: examples that need libraries they don't mention, code fragments that don't make sense, full-blown examples that only illustrate a trivial case, and, the ultimate insult, examples that don't compile. I chased one of those for about an hour before I found the solution.
I did get a Python script working that updates more than one datastream on a brand-new feed I created on Xively. Yes, you have to learn a new set of terms to use this stuff. Armed with that tiny bit of success, I added code to gather XBee packets and update some global variables, then push them up to Xively, and ran it for an hour or so to watch what happened. It worked pretty well. It doesn't yet gather all the data I want, or store it in a way my new web server can present, but I've got a nice start.
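The Xively library's exact calls aren't shown here, so as an alternative sketch this uses only the standard library and Xively's REST endpoint directly (PUT to a feed's JSON URL with an X-ApiKey header); the feed ID and API key are placeholders:

```python
import json
import urllib.request

XIVELY_FEED_ID = "123456789"      # placeholder feed ID
XIVELY_API_KEY = "YOUR_API_KEY"   # placeholder API key

def build_payload(readings):
    """Build the JSON body Xively expects: one entry per datastream."""
    return {
        "version": "1.0.0",
        "datastreams": [
            {"id": name, "current_value": str(value)}
            for name, value in sorted(readings.items())
        ],
    }

def push_to_xively(readings):
    """PUT the readings to the feed (the actual network call)."""
    body = json.dumps(build_payload(readings)).encode("utf-8")
    req = urllib.request.Request(
        "https://api.xively.com/v2/feeds/%s.json" % XIVELY_FEED_ID,
        data=body,
        headers={"X-ApiKey": XIVELY_API_KEY,
                 "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Keeping the payload-building separate from the network call makes it easy to test the JSON format without touching the wire.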
Let's talk about some of the things I've discovered. The Python scheduler works really well and does everything I want. I can set a routine to run at any time, even six months from now at noon. It's not quite as good at tiny periods, but I don't need that right now. The Xively library is actually pretty extensive, but the documentation is terrible. They don't even have a list of the classes that are available; they rely on samples that suck instead. The XBee library doesn't work the way someone coming from the Arduino world expects it to; it forces you to use threads, which makes passing data around harder than expected. To make up for this, Python's Queue module can queue things up for you so another thread can grab them. This little queue is really nice and could be used in a lot of different ways.
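Assuming the scheduler here is Python's standard sched module and the queue is the standard queue module, a minimal sketch of the two working together looks like this (the delay and the readings are made up, and the event reschedules itself to get repeating behavior):

```python
import queue
import sched
import time

q = queue.Queue()          # thread-safe; another thread could be filling this
s = sched.scheduler(time.time, time.sleep)

def drain(period, runs_left, results):
    """Pull everything waiting on the queue, then reschedule ourselves."""
    while not q.empty():
        results.append(q.get())
    if runs_left > 1:
        s.enter(period, 1, drain, (period, runs_left - 1, results))

# Pretend some other thread already queued a few readings.
for reading in (70.1, 70.3, 70.2):
    q.put(reading)

results = []
s.enter(0.01, 1, drain, (0.01, 3, results))
s.run()   # blocks until no events remain
```

Because sched has no built-in "repeat every N seconds," the reschedule-at-the-end-of-the-handler pattern is the usual way to get a periodic task.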
The HUGE advantage is the Raspberry Pi's handling of the internet. It just works. There are no long delays while an Ethernet chip makes up its mind, you have tons of connections to play with, and processing the returned data is a snap since Python has an enormous string-handling library. There's so much that can be done there it's amazing.
Here's what I have so far:
This gives the following output on the Raspberry Pi:
What I do is create a thread to catch XBee packets and queue them up to be handled. In the main thread, I grab the packets off the queue and take them apart, saving a couple of important items in global variables. One scheduled event prints the value of the global variables every 15 seconds, and another runs every minute to send updates to Xively. Yes, this is an odd way of doing it, but it's what I already do on my current house controller. I've found that scheduling things to happen is a much simpler way of handling tasks than anything else I've tried.
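A stripped-down sketch of that arrangement: a fake producer thread stands in for the XBee receive thread (the real one would come from the XBee library and a serial port), the packet fields are invented, and the intervals are shortened so it finishes quickly:

```python
import queue
import sched
import threading
import time

packet_q = queue.Queue()
latest = {"temperature": None, "voltage": None}   # stand-ins for the globals

def fake_xbee_reader(n_packets):
    """Stand-in for the XBee receive thread: queues canned packets."""
    for i in range(n_packets):
        packet_q.put({"temperature": 70 + i, "voltage": 3.3})
        time.sleep(0.01)

def handle_packets():
    """Main-thread side: take packets apart and update the globals."""
    while not packet_q.empty():
        pkt = packet_q.get()
        latest["temperature"] = pkt["temperature"]
        latest["voltage"] = pkt["voltage"]

reader = threading.Thread(target=fake_xbee_reader, args=(3,))
reader.start()
reader.join()            # wait here so the sketch is deterministic

s = sched.scheduler(time.time, time.sleep)
log = []

def report(runs_left):
    """Scheduled event: the real code prints these every 15 seconds."""
    handle_packets()
    log.append(dict(latest))
    if runs_left > 1:
        s.enter(0.01, 1, report, (runs_left - 1,))

s.enter(0.01, 1, report, (5,))
s.run()
```

The key point is that only the reader thread touches the radio; everything else happens in the main thread, so the globals never need locking beyond what the queue already provides.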
There's no internet handling in this module at all; I'll do that next. As I mentioned before, I have two thermostats hooked to my local LAN that can take commands and respond; I'll add code to query them every so often and save the results. Since the internet handling in Python is so robust, that shouldn't be a problem at all.
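The polling could look something like this. The addresses, the status URL, and the key=value reply format are all hypothetical, since the thermostats' actual protocol isn't described here:

```python
import urllib.request

THERMOSTAT_URLS = [
    "http://192.168.1.50/status",   # made-up addresses and path
    "http://192.168.1.51/status",
]

def parse_reply(text):
    """Parse a reply like 'temp=71.5&setpoint=70&mode=heat' into a dict."""
    fields = {}
    for pair in text.strip().split("&"):
        key, _, value = pair.partition("=")
        fields[key] = value
    return fields

def poll_thermostats():
    """Query each thermostat and collect its parsed status (network call)."""
    results = {}
    for url in THERMOSTAT_URLS:
        with urllib.request.urlopen(url, timeout=5) as resp:
            results[url] = parse_reply(resp.read().decode("ascii"))
    return results
```

Wiring `poll_thermostats` into the scheduler as another periodic event would make it fit the same pattern as the Xively updates.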
The big problem is deciding how to store the house data in such a way that the web server I have running on the Pi can get at it. Everyone uses a database, but I'm not sure a big hunk of code like that is reasonable for a task like this. Gotta think about it and experiment a bit before I go that route.
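One lightweight alternative worth experimenting with before reaching for a database: dump the latest readings to a JSON file that the web server simply reads. The file name is an assumption; the rename trick keeps a reader from ever seeing a half-written file:

```python
import json
import os
import tempfile

STATUS_FILE = "house_status.json"   # hypothetical path the web server reads

def save_status(data, path=STATUS_FILE):
    """Write the latest readings atomically: temp file, then rename."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(data, f)
    os.replace(tmp, path)   # atomic on POSIX, so readers never see a partial file

def load_status(path=STATUS_FILE):
    with open(path) as f:
        return json.load(f)
```

This obviously doesn't keep history the way a database would, but for "what is the house doing right now" it may be all that's needed.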
Perseverance, or maybe bull-headedness, has gotten me this far, and I truly hope other people who are thinking about doing something like this stumble across this site. It just might save them some of the headaches I've had.
Part Three of this is here <link>