Alex here from Looking Glass. I've been debugging some of the issues with Spark Pixels for the past couple weeks, and I have some findings that are relevant to this discussion:
tl;dr: the code is too big. Shrink it down so that the compiled binary is around 85Kb and it'll work.
I believe this problem is related to the size of the compiled firmware. According to the Particle engineers, the maximum size of the user firmware image is 128Kb, and the maximum size of statically allocated RAM (basically the maximum program size, not counting variables initialized to a specific value) is 79Kb. The latest version of Spark Pixels compiles down to 93Kb, with a program size of 83Kb.
When I first loaded Spark Pixels, I flashed it onto a cube, and once the Photon was breathing cyan, I opened up the Spark Pixels app and tried to connect, but I got the "device is not running spark pixels firmware" error. I restarted the Photon a couple of times, but kept getting that error.
The way that Spark Pixels checks whether the Photon is running the firmware is by reading variables that the Photon publishes to the Particle cloud API: a list of the available modes, plus other parameters you can set in the app. I checked the cloud API manually to see if the Photon was publishing those variables, and the result showed the Photon as online, but with no variables published. Hmm. The firmware is supposed to publish the variables once it connects, but it wasn't doing that.
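If you want to do this check yourself, pull the device record from the cloud API and look at its variables field. Here's a rough sketch in Python -- the JSON below is a hand-made illustration of the shape of the response, not a real capture, and the actual call is a GET to your device's URL with your access token:

```python
import json

# Illustrative (hand-written) response from
# GET https://api.particle.io/v1/devices/deviceID
# A healthy Spark Pixels device would have entries in "variables".
sample_response = json.loads("""
{
  "name": "my-cube",
  "connected": true,
  "variables": {},
  "functions": []
}
""")

def diagnose(device_info):
    """Return a human-readable diagnosis of the device's cloud state."""
    if not device_info.get("connected"):
        return "device is offline"
    if not device_info.get("variables"):
        return "online, but no variables published"
    return "online, variables: " + ", ".join(device_info["variables"])

print(diagnose(sample_response))
```

In my case the device looked exactly like the sample above: connected, but with an empty variables list.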
Next, I made a super-simple app that published the Spark Pixels variables to the cloud, but had really simple modes that just made the cube turn a solid color. This was a much smaller program, compiling down to around 15Kb, if I remember right. This worked reliably -- the variables went up to the cloud every time I rebooted it, and Spark Pixels was able to read them.
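For reference, my test app was a stripped-down sketch along these lines. The variable and function names here are illustrative -- the real Spark Pixels app expects its own specific names, so don't treat this as a drop-in replacement:

```cpp
// Minimal test firmware: publish Spark-Pixels-style cloud variables
// and one trivial solid-color mode. Names are illustrative only.
#include "Particle.h"

String modeList = "OFF;SOLID;";   // semicolon-separated list of modes
int brightness = 100;

int setMode(String command) {
    // e.g. "SOLID" -> fill the cube with one color (LED code omitted)
    return command.startsWith("SOLID") ? 1 : 0;
}

void setup() {
    // Register cloud variables/functions so the app can discover them
    Particle.variable("modeList", modeList);
    Particle.variable("brightness", brightness);
    Particle.function("setMode", setMode);
}

void loop() {
    // Drive the LEDs for the current mode here
}
```

The point of the test was just that a tiny binary registering the same kind of variables published them to the cloud every single time.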
I should mention that I keep running into a minor bug in the Particle API where the device list occasionally reports a breathing-cyan Photon as 'offline', even when the Photon is online and correctly updating variables through the Particle API. I only see this bug in calls to
https://api.particle.io/v1/devices/deviceID/ -- if I call
https://api.particle.io/v1/devices/deviceID/function/, I get correct, current results. This isn't a huge buzzkill in the context of Spark Pixels -- if it lists your Photon as online, you can just click on it, and when Spark Pixels tells you that it appears to be offline, click 'continue anyway' and it will correctly load the variables into the Spark Pixels app and work from there on.
One more frustrating factor with debugging Spark Pixels was that the same code worked on one Photon but not on another. Whaaa! Repeatedly, over multiple tests! I flashed Spark Pixels v3 from CubeTube onto one Photon, and it never published its variables to the cloud. I flashed it to another Photon, which worked reliably across many reboots. Super weird! Fortunately, this weirdness clued me in to the program-size issue -- when the program gets too big, the Photon will still try to run, but it runs non-deterministically. Like, if I put a Serial.println() call in setup() in the Spark Pixels program and run it, it doesn't print for like 5 minutes, even though the program is running and accepting commands from the Spark Pixels app. I ran into this same problem when I was building the first demo code for the original Spark Cores -- once the program size gets close to the maximum firmware size, it's like you're trying to run a program on a cat (i.e. the cat just ignores you and does what it wants).
I then did a final pass of just deleting huge chunks of code from Spark Pixels until it compiled down to something smaller. I'm not super familiar with Spark Pixels, and was struggling to grok the 6k lines of code, so I just started cutting the obviously big stuff, which was all the font definitions and text modes. After a bunch of cutting, I got it to compile down to 73Kb, and the Photon booted up immediately, reliably published its variables to the cloud, and worked with the app. Hooray!
I'm not super familiar with the Spark Pixels v3 firmware and the best modes to cut, and I'm a bit intimidated by the sheer amount of time it's taking me to grok the code, so I'll leave the slimming-down exercise to Werner and others with more experience with these giant programs.
A few takeaways -- this is a really hard bug to pin down, because the firmware can work perfectly on some devices but not others. Nobody is lying about this, but we need to fix it -- otherwise, someone with an unlucky Photon is going to flash a too-big app from CubeTube and get frustrated when it doesn't work for them. I suggest that we limit the max size of the firmware binary to 89Kb. If you're still getting complaints, bring it down a few more Kb. I know it's frustrating that there's no exact maximum size, but the correct answer depends on exactly what is taking up space in the binary -- local variables, global variables, static variables, code, etc.
My favorite way to see my exact firmware size is to work locally on my computer and compile with the Particle command-line interface. There's a detailed tutorial here that shows how to make it work with the L3D libraries. When I run the 'particle compile photon' command, I get a detailed statement of my code size, like this:
downloading binary from: /v1/binaries/5787589228634b445e11e6f0
saving to: photon_firmware_1468487810769.bin
   text    data    bss     dec     hex  filename
  83228    6784   3340   93352   16ca8
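For what it's worth, here's how I read those columns: flash usage is text + data (your code plus the initial values of initialized globals), and statically allocated RAM is data + bss. The text number alone is the "program size" I quoted above. A little Python sketch that checks a size line against the nominal limits -- the 1Kb = 1024 bytes conversion is my assumption, and remember that in practice things get flaky well before the nominal limits:

```python
# Limits quoted by the Particle engineers (assuming 1Kb = 1024 bytes)
MAX_FLASH = 128 * 1024
MAX_STATIC_RAM = 79 * 1024

def firmware_report(text, data, bss):
    """Interpret the text/data/bss columns from the compiler's size output."""
    flash = text + data          # stored in the user firmware flash region
    static_ram = data + bss      # claimed from RAM before setup() runs
    return {
        "flash_bytes": flash,
        "static_ram_bytes": static_ram,
        "flash_ok": flash <= MAX_FLASH,
        "ram_ok": static_ram <= MAX_STATIC_RAM,
    }

# The numbers from the size output above
print(firmware_report(text=83228, data=6784, bss=3340))
```

Note that these numbers pass the nominal checks -- which is exactly the frustrating part: the practical ceiling where the Photon starts misbehaving sits well below the documented 128Kb.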
For those of you who are working on giant programs that combine many different sub-programs, kudos! You're making awesome stuff! This is a particular hurdle that you'll have to navigate. Good luck, and godspeed.