The growth of RAM is one of the things that boggles my mind. The available RAM of an average computer has increased by many orders of magnitude during my days as a computer nerd, and I now have far more RAM than I dared dream of having as hard disk space back in the '90s.
Over at computerhistory.org, they have a nice article on a memory circuit from 1966 with a whopping 16 bits of storage – two whole bytes! This really puts things in perspective!
We can learn a lot from the past – you may even experience aha moments. Reading the source code of the first edition of Unix is such an experience: the simplicity of it all, combined with the knowledge that this actually worked more than 40 years ago.
The GitHub repository (mirrored from Google Code) contains gems such as an early C compiler (described by Dennis Ritchie here, courtesy of the Wayback Machine) as well as a beautifully simple implementation of printf.
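The core trick of that early printf survives in every formatting routine since: walk the format string, copy plain characters, and dispatch on whatever follows a `%`. A minimal sketch in that spirit – handling only `%d`, `%s` and `%c`, and writing into a caller-supplied buffer instead of the terminal (my own change, to keep the sketch easy to test) – might look like this:

```c
#include <stdarg.h>

/* Render a non-negative or negative int as decimal digits into 'out'.
 * Digits are generated least-significant first into tmp, then reversed. */
static char *fmt_int(char *out, int v)
{
    char tmp[12];
    int i = 0;
    unsigned u;
    if (v < 0) { *out++ = '-'; u = 0u - (unsigned)v; }
    else       { u = (unsigned)v; }
    do { tmp[i++] = (char)('0' + u % 10); u /= 10; } while (u);
    while (i) *out++ = tmp[--i];
    return out;
}

/* Minimal printf-style formatter: copy ordinary characters, dispatch on
 * the character after '%'. Unknown specifiers are copied through as-is. */
void tiny_printf(char *out, const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    for (; *fmt; fmt++) {
        if (*fmt != '%') { *out++ = *fmt; continue; }
        switch (*++fmt) {
        case '\0': fmt--; break;  /* stray '%' at end of format */
        case 'd': out = fmt_int(out, va_arg(ap, int)); break;
        case 's': { const char *s = va_arg(ap, const char *);
                    while (*s) *out++ = *s++; break; }
        case 'c': *out++ = (char)va_arg(ap, int); break;
        default:  *out++ = *fmt; break;
        }
    }
    *out = '\0';
    va_end(ap);
}
```

The whole mechanism fits in a screenful, which is exactly the kind of simplicity the Unix v1 sources show off.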
The Raspberry Pi foundation has just released a long blog post announcing a Raspberry Pi touchscreen, available from $60 + local taxes.
The screen is connected using DSI (the Display Serial Interface), meaning that the last unused connector on the Pi finally has a use.
The blog post is an interesting read, showing all the design decisions that have gone into this – and all the issues that the team encountered trying to produce this at an attractive price.
The screen sports 24-bit colour depth at 800×480 pixels at 60 Hz. It also provides a ten-point capacitive touchscreen.
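To get a feel for what the DSI link carries at that mode, the raw pixel bandwidth is simple arithmetic. The figures below are my own back-of-the-envelope numbers, not something from the blog post:

```c
/* Raw framebuffer size: width * height * bytes per pixel
 * (24-bit colour means 3 bytes per pixel). */
static long frame_bytes(long w, long h, long bytes_per_px)
{
    return w * h * bytes_per_px;
}

/* Uncompressed pixel stream per second at a given refresh rate. */
static long stream_bytes_per_s(long w, long h, long bytes_per_px, long hz)
{
    return frame_bytes(w, h, bytes_per_px) * hz;
}
/* frame_bytes(800, 480, 3) -> 1152000, i.e. ~1.1 MB per frame,
 * and ~69 MB/s at 60 Hz - far beyond what a casual serial link could
 * carry, which is why a dedicated display interface like DSI is used. */
```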
From a software perspective, using the screen is simply a matter of updating the Linux system and rebooting. The screen can be used in combination with HDMI or by itself. I’m already getting ideas for a touchscreen controlled media center…
Mesa-Video is a low-cost, low-power, small open source hardware/software solution for providing graphics from embedded projects. It is, basically, a graphics card with a UART interface.
The graphics card side of the solution can display text and 24-bit color graphics at up to 800×600 pixels. It is built around the FT813 GPU, an FPGA and an HDMI output. The GPU itself has an API resembling the commands you used to draw graphics with in BASIC back in the eighties and early nineties. According to the Mesa-Video creator, a part of that API will be made available over the serial line.
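The actual serial protocol has not been published, so as a thought experiment only: if the BASIC-style calls end up exposed as short text commands, the host side reduces to string formatting. The command names (`COLOR`, `LINE`) below are entirely invented for illustration:

```c
#include <stdio.h>

/* Hypothetical command builders - the real Mesa-Video protocol is not
 * public yet. Each helper formats one BASIC-flavoured drawing command
 * into a buffer, which a driver would then push out over the UART. */
static int mv_color(char *out, size_t n, int r, int g, int b)
{
    return snprintf(out, n, "COLOR %d,%d,%d\n", r, g, b);
}

static int mv_line(char *out, size_t n, int x0, int y0, int x1, int y1)
{
    return snprintf(out, n, "LINE %d,%d,%d,%d\n", x0, y0, x1, y1);
}
```

Because the card does the rasterisation itself, the host only ever ships short strings like these – which is what makes a plain UART fast enough to drive a display.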
Right now, the project aims to go open source, but the sources are not available at this point in time. The sources will be released as soon as the practical arrangements for production have been sorted.
Having a graphics card available over UART opens the door to graphical interfaces for systems built using basic CPUs such as the ESP8266 or Arduinos.
Via Dangerous Prototypes.
The NodeIT by Sweet Pea’s is an extendable, minimal IoT thing being launched on Kickstarter right now. The project has already been funded, but is still open to additional backers.
Based on the ESP8266, the core module, called ESP210, provides an MCU with WiFi capabilities. In true Arduino style, it supports stackable modules called “+1” modules, which add sensors and other features to the system. Currently, there are five “+1” modules available, with more under development. The available boards are:
- BatOne, implementing a Li-Ion charger
- EnvironOne, providing sensors for measuring light, temperature, moisture and barometric pressure
- Io4One, adding 4 additional GPIO lines with interrupt capabilities
- Adc4One, adding 4 ADC channels
- RtcOne, providing a battery-backed real-time clock with a unique EUI64 address
In addition to these boards, the Workstation40 board adds 8 ADC channels, 8 GPIO pins and 6 PWM/timer/counter pins. This board breaks the form factor to fit all the pins, but the NodeIT is still stackable with it connected.
All boards communicate over an I2C bus, reducing the number of pins used up for this purpose at the cost of limiting the bandwidth. However, given the purpose of the NodeIT, the limited bandwidth is probably not a problem.
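To put that trade-off in numbers: I2C runs at 100 kHz in standard mode and 400 kHz in fast mode, and every byte on the wire costs nine clock cycles (eight data bits plus an ACK). A rough upper bound on throughput is therefore easy to compute:

```c
/* Upper bound on I2C payload throughput: each byte takes nine clock
 * cycles (8 data bits + ACK). Start/stop conditions and register
 * addressing push the real figure lower still. */
static long i2c_max_bytes_per_s(long bus_hz)
{
    return bus_hz / 9;
}
/* i2c_max_bytes_per_s(400000) -> 44444, i.e. ~44 kB/s in fast mode -
 * plenty of headroom for occasional temperature or pressure samples. */
```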
The system comes with an IDE based on the Arduino IDE. This might not be the prettiest IDE around, but it works. Additional software for the NodeIT can be found on the project’s github page.
As a follow-up to Raymond Chen’s in-depth look at Intel’s Itanium, he has posted a set of links to an introduction to ARM. Also an interesting read.
This falls close to the Baking Pi course that we’ve covered earlier.
Raymond Chen has written a multi-part series on the Intel Itanium processor architecture. It really helps you understand the CPU architecture from a software development and performance optimization perspective. To quote Raymond:
The Itanium may not have been much of a commercial success, but it is interesting as a processor architecture because it is different from anything else commonly seen today. It’s like learning a foreign language: It gives you an insight into how others view the world.
The next two weeks will be devoted to an introduction to the Itanium processor architecture, as employed by Win32.
It is a highly recommended read (as is his entire blog, The Old New Thing):
Taking on the most complex, miniaturized part of a computer and going in the other direction takes a special kind of mind. Doing it “because I want to” takes it to the next level.
James Newman is building a CPU out of individual transistors and LEDs. The goal is to build a 14-meter-long wall of blinkenlights performing data processing, 16 bits at a time. Check out his progress over at The Mega Processor blog.
The picoc interpreter, created by Zik Saleeba, is an embeddable, minimal C interpreter environment that you can integrate into your system. On x86, the stripped executable ends up at only 118 kB including the standard libraries, so it is very compact.
Browsing the code, the include module looks easy enough to work with, so you can extend the environment for your specific needs. The code is BSD licensed, allowing you many freedoms.
Looking at the code, it does suffer from some code rot – the last commit is a year old. However, the code looks well structured and builds cleanly, so it should not be very hard to maintain.
Based on the work from visual6502.org, Greg James has put together a transistor level simulation of the Atari 2600 console. The entire project is available from github.
The simulator runs slowly, but that is the point: everything is simulated at half-clock accuracy. On my i7 laptop, I get around 12–15 ms per simulated clock cycle, meaning that it runs at roughly 70 Hz – much slower than the 1.19 MHz of the original hardware.
Still, this is an interesting project, demonstrating how to do accurate hardware simulations and how much slower they are, compared to emulators.
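The slowdown quoted above works out as follows (using my laptop’s 12–15 ms per simulated clock and the 2600’s 1.19 MHz clock – the arithmetic is mine, not from the project):

```c
/* Milliseconds per simulated clock cycle gives the effective
 * simulation rate in Hz. */
static double sim_rate_hz(double ms_per_clk)
{
    return 1000.0 / ms_per_clk;
}

/* Dividing the original hardware clock by the simulated rate gives
 * the slowdown factor versus real hardware. */
static double slowdown(double real_hz, double ms_per_clk)
{
    return real_hz / sim_rate_hz(ms_per_clk);
}
/* At 13.5 ms/clk (midpoint of the 12-15 ms range): ~74 Hz simulated,
 * a slowdown of roughly 16000x against the 1.19 MHz original. */
```

A four-orders-of-magnitude gap is the price of modelling every transistor instead of emulating the instruction set.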