First, remember that the Spark Core is an embedded device: the system firmware is linked together with your application. Every time you want to update your program, you need to rebuild the firmware (your application is compiled as a compilation unit and linked with the rest of the firmware).
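To make that concrete, a rebuild typically looks something like the sketch below. The directory and image names are assumptions based on the core-firmware layout referenced later in this post; the repository README has the authoritative build steps.

cd core-firmware/build        # assumed location of the build directory
make                          # recompiles your application together with the system firmware
# the result is a single image (e.g. core-firmware.bin) that gets flashed to the device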
Installing firmware
The firmware repository is located here: https://github.com/spark/firmware
Following the instructions at https://github.com/spark/firmware#1-download-and-install-dependencies on Ubuntu would mean:

sudo apt-get install gcc-arm-none-eabi   # wrong version
sudo apt-get install automake
sudo apt-get install dfu-util            # wrong version
sudo apt-get install git

Do NOT do this at home. Install gcc like this instead:
sudo apt-get remove binutils-arm-none-eabi gcc-arm-none-eabi
sudo add-apt-repository ppa:terry.guo/gcc-arm-embedded
sudo apt-get update
sudo apt-get install gcc-arm-none-eabi

More about the problems with gcc for ARM can be found here (a quick way to verify the installed toolchain follows the links):
- https://answers.launchpad.net/gcc-arm-embedded/+question/253484
- https://bugs.launchpad.net/gcc-arm-embedded/+bug/1309060 (solution)
- https://community.spark.io/t/how-to-install-the-spark-toolchain-in-ubuntu-14-04/4139
- https://bugs.archlinux.org/task/39004
- http://askubuntu.com/questions/502815/should-arm-none-eabi-gcc-include-a-stdio-h
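After installing gcc from the PPA, it is worth checking which toolchain actually ends up on the PATH; the version reported should come from the gcc-arm-embedded PPA, not the stock Ubuntu package:

arm-none-eabi-gcc --version                     # should report the PPA build, not the Ubuntu one
which arm-none-eabi-gcc                         # confirm which binary is being picked up
dpkg -s gcc-arm-none-eabi | grep -i version     # version of the installed package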
dfu-util also needs to be built from source:

sudo apt-get install libusb-1.0-0-dev
wget http://dfu-util.sourceforge.net/releases/dfu-util-0.8.tar.gz
tar -zxvf dfu-util-0.8.tar.gz
cd dfu-util-0.8
./autogen.sh   # you need autoreconf; if this errors out, try: sudo aptitude install dh-autoreconf
./configure
make
sudo make install

Reasons:
- The Ubuntu gcc-arm-none-eabi package is missing nano.specs and the <cctype> header file used in core-firmware/inc/spark_wiring_character.h. This will cause your compilation to throw errors like this (with gcc 4.8):
  arm-none-eabi-g++: error: nano.specs: No such file or directory
- Ubuntu ships dfu-util version 0.5, while the most recent version is 0.8. With the old version, running
  dfu-util -l
  will give you:
  Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="UNDEFINED"
  Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="UNDEFINED"
  Notice the "UNDEFINED". The new version prints something more like:
  Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="@Internal Flash /0x08000000/20*001Ka,108*001Kg"
  Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="@SPI Flash : SST25x/0x00000000/512*04Kg"
  What's more, the old version is not able to flash the Spark Core at all. Both flashing and listing devices require root privileges (sudo).