We talked briefly last year about an API. My company develops the Indigo home automation software for the Mac. We would love to see an Indigo plugin for the Twine. We would prefer some way to directly catch status updates whenever something changes on the Twine. Is that going to be possible?
Answers
You get back good JSON from the Supermechanical website, like:

{"gs_version": "Oct 3 2012-11:35:08", "age": 10.875224113464355, "ts": 1357246592.0,
 "values": [["00004999f257830700", "1.1.4"], ["00004999f257830701", 2600], ["00004999f257830703", "bottom"], ["00004999f257830705", 2615144], ["00004999f257830706", 0], ["00004999f257830707", 0]],
 "last_poll": 1357245824.0, "rssi": 178}

See twine.sh below:
#!/bin/sh
# Log in to twine.supermechanical.com and save the session cookie
wget -o log.txt --quiet -O temp.txt --keep-session-cookies --save-cookies cookies.txt --no-check-certificate --post-data="email=REPLACE_WITH_YOUR_LOGIN&password=REPLACE_WITH_YOUR_PASSWORD" https://twine.supermechanical.com/login
# Fetch the cached status JSON for your Twine
wget -o log.txt --quiet -O temp.txt --load-cookies cookies.txt --no-check-certificate "https://twine.supermechanical.com/rt/REPLACE_WITH_YOUR_TWINE_ID?cached=1"
DATE=$(date)
# Pull the raw temperature reading out of the JSON (7th comma-separated field)
TEMP=$(awk -F"," '{print $7}' temp.txt | awk -F"]" '{print $1}' | tr -d ' ')
echo "$DATE|$TEMP"
# Append the reading to a Google spreadsheet by submitting its form
GSHEET='https://docs.google.com/spreadsheet/formResponse?formkey=REPLACE_WITH_YOUR_FORM_KEY&REPLACE_WITH_YOUR_FORM_ELEMENT_NAME='$TEMP
wget --quiet -O temp.txt "$GSHEET"
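I just run twine.sh from cron so it logs a reading on a regular schedule. A minimal crontab entry that polls every five minutes (the paths are placeholders for wherever you keep the script and its log):

*/5 * * * * /path/to/twine.sh >> /path/to/twine.log 2>&1

If you have jq installed, it's also a sturdier way than counting awk fields to pull the temperature out of temp.txt; this assumes the same "values" layout as the example above, where the second entry holds the raw temperature:

TEMP=$(jq -r '.values[1][1]' temp.txt)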
I understand that shipping thousands of Twines is the priority right now, but empowering the developer community to build software/apps that run on top of the Twine platform has to be a close second, no? My guess is that the really interesting and innovative (read: marketable) Twine use cases will come from folks who have access to a robust API.
We don't make a humidity sensor at this point, but the moisture sensor detects the presence of liquid, yes. Technically, it's tuned for liquid conductivity but even oily skin will work. We've used it as a punch clock.
We'd really be interested in working with you guys to enable better integration with home automation software. It would be awesome if we could configure the Twine to push its values to another URL as often as the regular one-minute updates the Twine website gets. Just thinking out loud here.
Keep up the good work!
If you want to change the answer to accepted that's fine.
I've been able to trigger a recording manually by setting an orientation rule and flipping my Twine over, but I'd like to set my Twine to record at regular intervals. Once an hour is more than fine.
Would setting 'reset after' to 3600 seconds do the trick? Is the specific logic for these settings documented anywhere?
If I understand what Derek is saying about the reset time, wouldn't the rule trigger again right after the reset time if the condition is still true? So just set the rule to something like 'temperature is above 0' and the reset to 300 to get an update every 5 minutes?
Until then, I've set up a bunch of 'rises above' and 'falls below' rules between 65 and 80 deg F.
Based on some of the other comments, it sounds like some kind of 'regular interval' rule would be very popular.
https://www.sparkfun.com/categories/143?sort_by=price_asc&per_page=50
and I'm rigging it on the breakout board with two rules: on wiggle (on), send me the current temp; when the wiggle stops (off), send me the current temp. Then I'm going to place it precariously somewhere that has vibrations. It won't be consistent, but it'll get me through until they add it, or add movement begin/movement end rules for the accelerometer.
I have found myself running into the same issue. I am trying to set up my Twine to post data to Cosm, and my code is working fine, but there doesn't seem to be any good way to force an update. Perhaps I can use my breakout board with a 555 timer. Has there been any talk of being able to publish data to Cosm or similar services?
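For what it's worth, here's a rough sketch of how I'd push one reading to Cosm from the shell, assuming Cosm's v2 feed API; the feed ID, datastream ID, and API key below are placeholders, and $TEMP is the value pulled out of the Twine JSON as in twine.sh above:

# PUT the current value of the "temperature" datastream on your Cosm feed
curl --request PUT \
  --header "X-ApiKey: REPLACE_WITH_YOUR_COSM_API_KEY" \
  --data "{\"version\":\"1.0.0\",\"datastreams\":[{\"id\":\"temperature\",\"current_value\":\"$TEMP\"}]}" \
  https://api.cosm.com/v2/feeds/REPLACE_WITH_YOUR_FEED_ID

That still doesn't solve the triggering problem, of course; until there's a regular-interval rule, cron on the polling side is the workaround.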
The hardware is great, but freeing the data for use with any external service would really generate an explosion of innovative uses. Right now I feel the Twine is somewhat limited by what functionality is present on the hosted software side of the Super Mechanical site.
A built-in bridge to Cosm would be nice, as you could use the data from Cosm in a variety of ways. However, the problem is the triggering, i.e. it isn't happening the way we'd all like it to.