Sunday, November 30, 2014

Spark Core without Internet and Cloud

I recently got a Spark Core. I don't find a need to connect my device to the cloud; what's more, I don't want to. My home automation system is supposed to be cloud free. I want it to be a loose collection of Spark Cores (in the future Photon P0/P1) acting as controllers and data collectors, with a Raspberry Pi as the brain (exposing an API or syncing with a cloud of my choice).
First, one needs to remember that the Spark Core is an embedded device. The system is linked with one's application: every time one wants to update a program, one needs to recompile the firmware (compile one's application as a compilation unit and link it with the rest of the firmware).
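To make this concrete, here is a rough sketch of the update cycle. The paths are assumptions based on the core-firmware repository layout; the actual build and flash steps are covered in the sections below:
$EDITOR core-firmware/src/application.cpp   # edit the application compilation unit
cd core-firmware/build
make                                        # recompiles the whole firmware, application included
# then reflash the image over USB in DFU mode (see the DFU mode section below)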

Installing firmware

The firmware repository is located at https://github.com/spark/firmware. Following the instructions at https://github.com/spark/firmware#1-download-and-install-dependencies on Ubuntu means:
sudo apt-get install gcc-arm-none-eabi # wrong version
sudo apt-get install automake
sudo apt-get install dfu-util # wrong version
sudo apt-get install git
Do NOT do this at home. One needs to install gcc from the gcc-arm-embedded PPA instead:
sudo apt-get remove binutils-arm-none-eabi gcc-arm-none-eabi
sudo add-apt-repository ppa:terry.guo/gcc-arm-embedded
sudo apt-get update
sudo apt-get install gcc-arm-none-eabi
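A quick check that the PPA toolchain is the one being picked up. The nano.specs probe is just a convenient way to confirm the fix: gcc prints a full path when the file is found and echoes the bare name back when it is not:
arm-none-eabi-gcc --version
arm-none-eabi-gcc -print-file-name=nano.specs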
More about the problems with gcc for ARM can be read in the Reasons section below. dfu-util also has to be built from source (build manual: http://dfu-util.sourceforge.net/build.html):
sudo apt-get install libusb-1.0-0-dev
wget http://dfu-util.sourceforge.net/releases/dfu-util-0.8.tar.gz
tar -zxvf dfu-util-0.8.tar.gz
cd dfu-util-0.8
./autogen.sh
# you need to have autoreconf, if error, try: sudo aptitude install dh-autoreconf
./configure
make
sudo make install
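A quick sanity check that the freshly built dfu-util is the one on your PATH (a source build installs to /usr/local/bin by default) and not the old 0.5 from apt:
which dfu-util    # should print /usr/local/bin/dfu-util
dfu-util -V       # should report version 0.8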
Reasons:
  • The Ubuntu gcc-arm-none-eabi package is missing nano.specs and the <cctype> header file used in core-firmware/inc/spark_wiring_character.h. This will cause your compilation to fail with errors like:
    arm-none-eabi-g++: error: nano.specs: No such file or directory
    
  • Ubuntu has dfu-util version 0.5, while the most recent version is 0.8. With the old version, running dfu-util -l will give you:
    Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="UNDEFINED"
    Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="UNDEFINED"
    Notice "UNDEFINED". The new version prints something more like:
    Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="@Internal Flash  /0x08000000/20*001Ka,108*001Kg"
    Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="@SPI Flash : SST25x/0x00000000/512*04Kg"
    What's more, the old version is not able to flash the Spark Core at all. Flashing and listing devices requires root privileges (sudo).

Getting source

Note that the makefile for the firmware assumes that all 3 projects are downloaded/cloned into the same directory: core-firmware, core-common-lib, core-communication-lib. A minimal sketch of this layout follows.
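This assumes all three repositories live under github.com/spark and the build is driven from core-firmware/build; check the firmware manual in case the repository names have changed:
mkdir spark && cd spark
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
cd core-firmware/build
make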

DFU mode

The "Flash it" part of the firmware manual requires you to put the Core in DFU mode. If you wonder what DFU mode on the Core looks like, see: https://vine.co/v/MahhI1Fg7O6
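For reference, the flash step then boils down to something like the following. The 1d50:607f id matches the dfu-util -l output shown above, and 0x08005000 is the application start address given in the firmware manual; double-check both against your firmware version before flashing:
# enter DFU mode: hold MODE, tap RESET, keep holding MODE until the LED blinks yellow
sudo dfu-util -l    # the Core should show up as [1d50:607f]
sudo dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin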
