I'd like to build a Boom-type user interface using the Wandboard and Community Squeeze OS.
The hardware will consist of a 256x64 OLED display along with an RF remote control.
Interfacing the display over the SPI port should be easy enough; the remote is USB. A software framebuffer driver still needs to be written, but I can port a similar existing driver.
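For the display side, here's a rough userspace sketch of the blit path I have in mind, assuming an SSD1322-style controller (the usual part behind 256x64 OLEDs) and Python's spidev bindings. The SPI bus numbers, the GPIO pin for the data/command line, and the column offsets are all placeholders that would vary by module and by the Wandboard's pinout:

    import spidev

    WIDTH, HEIGHT = 256, 64

    spi = spidev.SpiDev()
    spi.open(0, 0)                  # SPI bus/device numbers are board-specific guesses
    spi.max_speed_hz = 8_000_000

    def set_dc(level):
        # Drive the panel's data/command line via sysfs GPIO;
        # gpio24 is a placeholder pin and must be exported first
        with open("/sys/class/gpio/gpio24/value", "w") as f:
            f.write("1" if level else "0")

    def command(cmd, *args):
        set_dc(0)
        spi.xfer2([cmd])
        if args:
            set_dc(1)
            spi.xfer2(list(args))

    def blit(fb):
        # Push a full 4-bit grayscale frame (WIDTH*HEIGHT/2 bytes) to the panel
        command(0x15, 0x1C, 0x5B)   # set column address; offsets vary by module wiring
        command(0x75, 0x00, 0x3F)   # set row address (64 rows)
        command(0x5C)               # write RAM
        set_dc(1)
        for i in range(0, len(fb), 4096):   # spidev caps individual transfers
            spi.xfer2(list(fb[i : i + 4096]))

    blit(bytearray(WIDTH * HEIGHT // 2))    # clear the screen as a smoke test

The real thing would be a kernel framebuffer driver, of course, but a userspace loop like this is enough to validate the panel before porting one; the USB remote should just be a matter of consuming evdev events from /dev/input.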
My question is where the UI code should live. The old Boom had very little code on the player; everything was generated by the server. The newer devices work the opposite way, but they don't support a Boom-like interface. I could re-use much of the Boom code and adapt it for the different resolution; this is what I've done for my "empeg" player and it works very well. However, the OLED's resolution is nearly twice the Boom's, so it would need new fonts, bitmaps, and so on. After all that work it seems better to just write a client-side renderer like JiveLite, but more Boom-like.
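To make the client-side option concrete: the renderer could pull its data over the server's JSON-RPC interface, much as JiveLite does. A minimal sketch, assuming Logitech Media Server's standard jsonrpc.js endpoint on port 9000; the host address, player MAC, and the exact response fields here are illustrative:

    import json
    import urllib.request

    def lms_request(host, player, cmd):
        # Standard LMS JSON-RPC call shape: params = [player_id, command_array]
        payload = json.dumps({"id": 1, "method": "slim.request",
                              "params": [player, cmd]}).encode()
        req = urllib.request.Request(
            "http://%s:9000/jsonrpc.js" % host, data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["result"]

    # Fetch the current track for one player; host and MAC are placeholders
    status = lms_request("192.168.1.10", "00:04:20:ab:cd:ef",
                         ["status", "-", 1, "tags:a"])
    track = (status.get("playlist_loop") or [{}])[0]
    title = track.get("title", "")
    artist = track.get("artist", "")
    # A Boom-style renderer would lay these out into the 256x64 framebuffer

That keeps all the fonts and layout on the client, at the client's native resolution, while the server stays the single source of truth for playback state.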
What do you all think?