Wednesday, 5 April 2017

Final updates on driving stepper motors with Pololu A4988 boards and pigpio

Nearly a year ago I blogged about my experiences with driving stepper motors from a Raspberry Pi. This was very much just a step towards writing my own telescope mount driving software.

In the meantime I found that although I was getting 800 full steps a second, it tended to stall at random intervals - typically after a couple of minutes - so reliability was not great. It looked pretty likely that this was Linux failing to run my pigpio script on time, resulting in occasional timing glitches that were enough to stall the motor.

I now have a version of the driver using waves (specifically 'wave_send_using_mode'), and this delivers much better reliability at higher step rates.

I've put the demo code on GitHub here.

It took a bit of messing about. I initially intended to use wave chains, but as I wanted to properly control the ramp up (and ramp down) for smooth transitions, my waves were getting rather large. With the added requirement to control two motors with subtly different settings at the same time, I would have needed far too much wave data to handle it this way.

So I have written this to generate waves on the fly, with two loaded and one ready to go in the code. This runs one stepper in double step mode on a Pi 2B quite happily at 1000 full steps per second (so 4000 wave transitions per second). My motors don't quite want to go this fast - about 900 is the maximum rate they reliably run at.

As a wave finishes, it is deleted and a new one created and tacked on the end.
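
To make the approach concrete, here is a minimal sketch of the idea (not the code on GitHub, which handles two motors and proper ramping). It assumes a single A4988 STEP pin on GPIO 20 - a made-up pin number - and uses a crude per-wave ramp rather than the smooth per-step ramp the real driver uses:

#!/usr/bin/env python3
# sketch of 'waves on the fly' with pigpio - single motor, made-up pin number
import time
import pigpio

STEP_PIN = 20            # assumption: the A4988 STEP input
PULSES_PER_WAVE = 200    # steps carried by each wave

def make_wave(pi, step_us):
    # build one wave of PULSES_PER_WAVE steps at a fixed step interval (microseconds)
    pulses = []
    for _ in range(PULSES_PER_WAVE):
        pulses.append(pigpio.pulse(1 << STEP_PIN, 0, step_us // 2))   # STEP high
        pulses.append(pigpio.pulse(0, 1 << STEP_PIN, step_us // 2))   # STEP low
    pi.wave_add_generic(pulses)
    return pi.wave_create()

pi = pigpio.pi()
pi.set_mode(STEP_PIN, pigpio.OUTPUT)

queued = []          # wave ids queued or transmitting
step_us = 2000       # start at 500 steps per second and ramp up

for _ in range(50):                                  # 50 waves = 10000 steps
    while len(queued) >= 2:                          # keep one transmitting, one waiting
        if pi.wave_tx_at() != queued[0]:             # oldest wave has finished
            pi.wave_delete(queued.pop(0))            # delete it and build a new one
        else:
            time.sleep(0.01)
    wid = make_wave(pi, step_us)
    pi.wave_send_using_mode(wid, pigpio.WAVE_MODE_ONE_SHOT_SYNC)   # tack on the end
    queued.append(wid)
    step_us = max(1000, step_us - 100)               # crude ramp towards 1000 steps per second

while pi.wave_tx_busy():
    time.sleep(0.01)
for wid in queued:
    pi.wave_delete(wid)
pi.stop()

The important part is WAVE_MODE_ONE_SHOT_SYNC, which starts each new wave as the previous one ends, so there is no gap between waves as long as the next one is queued in time.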

Now, back to making goto work.........

Sunday, 4 December 2016

Really? Starting services properly is that hard?

I reaaaaally don't care whether it's Raspbian or Debian screwing up the startup of nfs-kernel-server - it should have been fixed long ago.

And there are endless posts on the Debian, Raspbian and other forums about this, many with fixes that only work in very specific (unstated) circumstances.

In the end I just got out me Glasgow screwdriver and stuck a new line in crontab to patch up the mess after boot.

sudo crontab -e

and add the line:
@reboot              sleep 5 ; /usr/sbin/service rpcbind start ; sleep 10 ; /usr/sbin/service nfs-kernel-server restart

Saturday, 3 December 2016

Astrophotography - the mount driver

I am using an old Vixen GP mount with the Vixen MT1 stepper motors.

I am using a Raspberry Pi with Pololu stepper drivers. I've described this in an earlier post.

Now comes the job of making this work - initially for guiding and hopefully later for 'goto' as well (albeit that will be a bit slow with these motors).

The ingredients are:
  1. a Raspberry Pi model B
  2. two Pololu drivers mounted on a little HAT card, as described here
  3. a 24V power supply to drive the stepper motors
  4. CAREFULLY set the current limit on the Pololu driver carriers
  5. write a test program to do some basic testing of the drivers / motors (a rough sketch is below)
  6. prepare a second Raspberry Pi to run this nice autoguiding software
  7. do a quick hack to enable Gordon's autoguider software to talk to my driver software.

I have done a quick video of the test program running the code to slew the scope as fast as the steppers can go here, and below are the gory details.....
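
For item 5, something along these lines is enough to prove the wiring and current limit before doing anything clever. It is only a rough sketch: the GPIO numbers are made up and need matching to the HAT, and it assumes the driver is left in full-step mode:

#!/usr/bin/env python3
# quick wiring / current-limit check for one A4988 - pin numbers are made up
import time
import pigpio

STEP, DIR, ENABLE = 20, 21, 16      # match these to the HAT wiring

pi = pigpio.pi()
for g in (STEP, DIR, ENABLE):
    pi.set_mode(g, pigpio.OUTPUT)

pi.write(ENABLE, 0)                 # A4988 enable is active low
for direction in (0, 1):            # a short run each way
    pi.write(DIR, direction)
    for _ in range(400):            # 400 full steps = two turns of a 200 step/rev motor
        pi.write(STEP, 1)
        time.sleep(0.001)
        pi.write(STEP, 0)
        time.sleep(0.004)           # roughly 200 steps per second
    time.sleep(0.5)
pi.write(ENABLE, 1)                 # let the motor free-wheel again
pi.stop()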

Sunday, 28 August 2016

Very simple web serving for timelapse with Python on Raspberry Pi

Why?

I messed about with a few different stacks for running a web server on the Pi, and most are complicated - in particular complicated to configure - as well as being large animals that provide a lot of functionality (and overhead) of no real use to me. I wanted something that was:
  1.  simple to install / set up
  2. suitable for use as an 'inside' home web service (i.e. not exposed to the nasty world outside)
  3. able to run reasonably fast
  4. really simple to use (from python in particular)
Some very simple ways to do this use cgi, but I soon found that method awkward to use, and it looked like there were significant performance overheads. I switched to http.server.HTTPServer, which I like and which provides a 'shape' of framework I am comfortable with.

What?

On a LAN connected raspberry pi 3 this approach will happily serve up to 200 requests per second - as long as the overall network bandwidth doesn't get too high.

The test I used serves up images on demand to create a virtual movie. It is driven from javascript in the web page. The individual images were just under 20k bytes on average.

I wanted to minimise the load on the Raspberry Pi and keep the user interface simple to use and responsive. To do this the python web server is very simple and - after serving up the initial page - just responds to requests from the client.

The web page implements the user interface control functions in javascript and fires off requests to the web server.

The web server runs happily on the Raspberry Pi (including the Zero) and on my Ubuntu laptop. It appears to work well with Firefox, Chrome and Internet Explorer on laptops / PCs. It does not work in Edge, but as I have little interest in Windoze, I'm not really bothered about the IE / Edge case.

It will run on reasonably fast phones / tablets, but not at high framerates; my old Galaxy S2 isn't much use, but a Hudl2 works well as long as the framerate is kept low.

This is just a proof of concept, so presentation and error handling are minimal / non-existent, and functionality is limited.

How?

There are 2 files:
  • a python3 file (the server) of around 100 lines
  • an html file (the client) of around 200 lines
The simple webserver builds an index of all the jpeg files in the given folder and serves a web page; javascript functions in that page then move around within the images and play them back at various speeds by controlling framerate and stride.
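
The server side is roughly along these lines - a cut-down sketch rather than the real file, assuming the images sit in an 'images' folder, the client page is 'timelapse.html' and the javascript asks for frames with requests like /frame?n=42 (the names, paths and port are all illustrative):

#!/usr/bin/env python3
# cut-down sketch of the timelapse server - folder and file names are illustrative
from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.parse import urlparse, parse_qs
from pathlib import Path

IMAGE_DIR = Path('images')
FRAMES = sorted(IMAGE_DIR.glob('*.jpg'))      # the index, built once at startup

class FrameHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path == '/':                                   # the client page
            body, ctype = Path('timelapse.html').read_bytes(), 'text/html'
        elif url.path == '/frame' and FRAMES:                 # e.g. /frame?n=42
            n = int(parse_qs(url.query).get('n', ['0'])[0]) % len(FRAMES)
            body, ctype = FRAMES[n].read_bytes(), 'image/jpeg'
        else:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header('Content-Type', ctype)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == '__main__':
    HTTPServer(('', 8000), FrameHandler).serve_forever()

Run it with python3 on the Pi and point a browser at port 8000; the page then just fires off /frame requests at whatever rate the chosen framerate and stride demand.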

Tuesday, 12 July 2016

PoE thoughts and findings with Raspberry Pi

I've used 2 splitters for this. One is specifically sold to power a Raspberry Pi; the other is a more generic one, but it is cheaper and more flexible (it can output at 5V, 9V or 12V - for direct power to a Pi it obviously stays on the 5V setting).

The switch in front is a Cisco SG200-08P, which is expensive, but provides close control and reporting on what is going on with each port. Much cheaper PoE switches (like the Netgear ProSafe range) are available, but typically don't have the reporting I used here.

The ASIX AX88179 is the main part of a USB3 to Gigabit ethernet adapter. The Pi can drive significantly more than 100Mb/s with a Gigabit adapter (even though it is USB2). All the Gigabit USB adapters seem to be USB3 (quite sensible if you want to get close to full utilisation). Also, being Gigabit means that potentially the green ethernet savings should kick in.

As a final test I had the Pi Zero running RPi cam control streaming live to a web browser on another PC, and with 2 servos waving the camera about. This took the power up to nearly 5 watts with everything running smoothly - apart from the live camera feed which was just a blur!

Conclusions

The TP-Link PoE adapter is the better solution - more flexible, more efficient AND cheaper.

The official RPi WiFi dongle seems to draw about half a watt when idle.

The USB3 to Ethernet adapter I got is VERY inefficient - about 1 watt doing nothing.

You can run a pi very reliably this way - even with the camera and little servos running.

poe splitter | load | poe class | power (mW) | current (mA) | voltage (V) | green ethernet | network
RocksolIT | pi 3 idling | 4 | 1900 | 42 | 47 | No | pi ethernet
RocksolIT | pi 3 halted | 4 | 800 | 17 | 47 | No | pi ethernet
TP-Link PoE | pi 3 idling | 0 | 1800 | 40 | 47 | No | pi ethernet
TP-Link PoE | pi 3 halted | 0 | 700 | 15 | 47 | No | pi ethernet
TP-Link PoE | pi 3 idling | 0 | 1600 | 34 | 47 | No | none
TP-Link PoE | pi 3 idling | 0 | 2500 | 54 | 47 | No | ASIX AX88179
TP-Link PoE | pi 0 idling | 0 | 1200 | 26 | 47 | No | RPi wifi dongle
TP-Link PoE | pi 0 halted | 0 | 400 | 9 | 47 | No | RPi wifi dongle
TP-Link PoE | pi 0 idling | 0 | 1000 | 22 | 47 | No | USB3 - Gigabit, no lan cable
TP-Link PoE | pi 0 idling | 0 | 1700 | 36 | 47 | Yes | USB3 - Gigabit, connected
TP-Link PoE | pi 0 idling | 0 | 700 | 15 | 47 | No | no network adapter
RocksolIT | pi 0 idling | 4 | 1300 | 28 | 47 | No | RPi wifi dongle
RocksolIT | pi 0 idling | 4 | 800 | 17 | 47 | No | no network adapter
TP-Link PoE | pi 0 busy + servos | 0 | 4600 | 90 | 47 | No | USB3 - Gigabit

Monday, 11 July 2016

Making google cardboard work properly (with a web browser) part 2

Now that I could set (and adjust) the lens spacing in my Google Cardboard, I needed a way to set up the spacing properly, but first.....

Rule 1: your display needs to have at least an 11cm wide viewable area - preferably a bit more. Without this width, significant areas of the image will only be visible to one eye - oh and I might need to adjust the app so it allows the panel to go off the edge of the screen.

To help with setup, I prepared a freestanding web page that enables settings to be tweaked, and saves those settings to local storage so they are persistent.

This web page can also be used as the base for display of stereogram images and similar things (like live streaming from stereo webcams), and can be run on devices that don't support the google cardboard app, so you can use google cardboard (the cardboard) without google cardboard (the app) - although not with any google api based apps obviously.

Now to getting the lens spacing properly setup. For me this means that:
  1. on first looking through the cardboard, the image should look right immediately - no waiting for the image to 'come together'
  2. shutting your eyes for a few seconds and opening them again should likewise mean the image 'works' immediately
  3. the image should remain consistent across the whole field of view - no queasy feelings as you look towards the edges and corners of the view

Web pages and other stuff.

I wanted to make the little app as widely usable as possible - not just Android - and to that end it is effectively a freestanding web page - here is a typical screenshot of the app in motion:

Below is the method I use to setup cardboard. The setup also prepares the settings for use in a stereo image viewer web page I am working on - more of that another day!

Note if you have a lazy eye or suffer from bouts of double vision or similar eye problems, you probably shouldn't do it this way!

Sunday, 10 July 2016

Making google cardboard work properly (with a web browser) part 1

First experiences with Google Cardboard were that things looked 3D, but it always felt a bit weird and uncomfortable. Usually the 'not rightness' got worse as I looked further from the centre of the image. To start with I messed around with the profile generator, but soon came to the conclusion I was starting from the wrong place.
Could I make it better?

Having worked out what I thought was the problem, the answer is yes, it can be made LOTS better (well, it is for me anyway).

I decided that there are 2 main problems:
  1. because the lens spacing doesn't match my ipd (inter-pupillary distance), and the lenses have a fair bit of distortion, there is a small sweet spot; as I look away from that sweet spot the two views diverge in ways my brain does not like, which makes things feel more and more 'not right' further from the centre.
  2. the lenses introduce a lot of pincushion distortion - I suspected that, while not ideal, things would look a whole lot better if the spacing were fixed even with this distortion.
Of course, having fixed item 1 and generated a new profile, the Google app (and others that use the underlying Cardboard API) should look a whole lot better as well.

Google Cardboard is set up for an ipd of around 64mm, and I measure my eyes at 67mm, but even this small difference seems to have a big effect.

So I set off a couple of days ago to:

  1. fix the ipd to lens-spacing mismatch.
  2. write an app (web page) that would allow me to view stereo pictures in a web browser.
and so our quest begins.... Part 1 is below, part 2 is here