A Long Distance Lamp (Version 01)


What I am intending to do and why.

Long distance relationships are tough. There are many technologies that help to alleviate some of the difficulties of distance, such as video calling, instant messaging, even connected devices that simulate physical touching. Yet long distances remain hard.

The solutions above all require ‘active’ participation: each person must be engaged in the video call, or on the message thread, or with the physical device.

With this project I posit there is a space for more passive presence, a subtle “I’m here, but not active”. My partner and I enjoy simply reading in bed next to each other, each minding their own business but aware of and comfortable with the other’s presence.

The ideal end goal of this project is a pair of connected reading lights, with each light giving subtle and noninterruptive cues that the other lamp is present. I also intend to offer the possibility of subtle interaction, similar to a light brush of the arm or momentary eye contact.

Present State

This section covers the current state of the light fixture, without any value judgement on the implementation. For more detailed evaluation of the implementation, see the Evaluation section further down.

Fig 00: demonstration of light in its current form.

Within the timeframe of the class project for Light and Interactivity, my goal is to create a single light fixture that simulates one end of the pair. The light must fulfill two requirements:

  1. Act as a useful (i.e. dimmable) reading light (task behaviour)
  2. Gently notify the reader of presence of and interaction with the “other light” (indicator behaviour)

As of presentation day, the task behaviour is implemented, and the indicator behaviour is ready to be implemented.

For the task light, the behaviour is complete aside from a slight jitter in the fade.

The indication behaviour still requires work. Presently the indicator lights exist within the physical device, and the framework to control them exists within the code. The work required involves cuing the indication (i.e. simulating a touch on the ‘other end’), and coding the particular fade profile.

There is one single interaction that controls all aspects of the light: touch. This light is definitely a prototype and the process of creating it has raised some important questions about the design (which I will discuss in detail later). At present touching the fixture activates a fade in/out cycle, offering rudimentary control of brightness. My original intention was for that interaction to trigger an indication on the connected lamp as well.
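To give a sense of what that fade in/out cycle looks like as logic, here is a minimal, hardware-free sketch. The names and constants are illustrative only, not taken from my actual project code (which is on GitHub):

```cpp
#include <cassert>

// Sketch of the touch-driven fade cycle: while the bar is touched,
// brightness ramps up towards max, then back down, repeating; releasing
// the touch freezes the current brightness. Names and step values here
// are hypothetical, chosen just to illustrate the behaviour.
struct FadeCycle {
    int brightness = 0;  // 0..255, current PWM level
    int direction = +1;  // +1 fading in, -1 fading out
    int step = 5;        // change per update while touched

    void update(bool touched) {
        if (!touched) return;  // releasing freezes the brightness
        brightness += direction * step;
        if (brightness >= 255) { brightness = 255; direction = -1; }
        if (brightness <= 0)   { brightness = 0;   direction = +1; }
    }
};
```

Holding the touch long enough cycles brightness up and down; letting go at the right moment is how you ‘choose’ a level, which is exactly the satisfying-but-potentially-frustrating interaction discussed below.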



The light is intended to have two core features: act as a useful reading light and as a noninvasive interface for two people to be present with each other without being actively engaged in an activity.

To satisfy the second part of that, I settled on ‘touch’ being the focus of my interface. The least interruptive/active control I could think of is a gentle press on the light anywhere on its surface. My hope is that the light requires the least amount of engagement/thought to control.

In terms of the ‘useful reading’ aspect, at home I have an Ikea strip light installed above my bed as a reading light. I did this quickly and out of necessity, and it works fine as a reading light. The main problem I have with it is that unless I’m sitting straight up in bed, the bright LEDs are in my vision creating distracting bright spots.

Given that, with my light I definitely wanted something that gives off enough light to read by, but that also is gently on the eyes. Tom suggested I play around with diffusion or bounce, which was a great starting point for my prototyping.

Rapid Prototype (Fabrication)

Given the intention to control the light by gently pressing it anywhere, capacitive touch seemed the most obvious place to start. And given capacitive touch as a mechanic, the fixture needed to be predominantly continuous metal. From the start I had in mind a strip light of some sort, so I got some relatively cheap extruded aluminium bars and started playing around with shapes.

The core questions of my prototyping process were:

  1. What shape should the bar be?
    • I experimented with elbow and C shaped bars in different configurations. More on this below
  2. Will a 12V warm white LED strip work as the Task Light ?
  3. Will an RGB neopixel strip work as an Indicator Light?
  4. What diffusion material should I use?

What I found was:

  1. At first (pictured above) I attempted a ‘zigzag’ arrangement, where the task light pointed away from the reader (into the wall) and the indicator light pointed toward the reader. I immediately discovered that regardless of diffusion material the NeoPixels read as discrete points, which I did not like the look of. In the end I settled on a T shape where both strips point towards the wall.
  2. 12V is plenty bright, but it looks best if I double up the strip so that one points directly into the wall and one points perpendicular to it (Fig 01)
  3. Yes, but as mentioned before it’s best to hide the individual pixels with more than just diffusion (Fig 02).
  4. At first I used paper but it quickly became greasy and damaged. I experimented with some photography lightbox diffusion material (Fig 01/ Fig 03), which worked okay. In the end, since I’m pointing the lights at the wall anyway, the fabric diffusion is good enough even though it isn’t perfect. It helps that the fabric is designed to diffuse light with as little brightness lost as possible (this particular fabric loses 1.5 stops).
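For the curious, a stop rating converts directly into a transmitted fraction of light: each stop halves it, so a 1.5-stop loss means the fabric passes roughly 35% of the light. A quick sketch of the arithmetic:

```cpp
#include <cmath>

// Each photographic stop halves the light, so a fabric rated at an
// n-stop loss transmits 2^-n of the incident light.
// A 1.5-stop loss passes 2^-1.5 ≈ 0.354, i.e. about 35%.
double transmittedFraction(double stops) {
    return std::pow(2.0, -stops);
}
```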

Rapid Prototype (Circuit)

Fig 04

The breadboarded circuit (partially seen in Fig 04) was, as usual, a mess of wires that came together by adding and testing without too much thinking. I tested three things individually and then together:

  1. Capacitive touch (using the CapacitiveSensor library and circuit)
  2. Addressable RGB LED strip control (using Adafruit’s NeoPixel library)
  3. 12V LED strip control (using TIP120 and PWM)

There was a brief moment of panic when putting everything together. I discovered that the 12V LEDs caused a bit of interference with the capacitive touch readings, which at first I mistook for a problem with the rudimentary example code I was using to test. Fortunately the solution was relatively simple: most of the noise could be filtered out, and the difference in reading that a human touch created was still large enough to read reliably as a touch.

In other words, the 12V strip did, on average, cause a higher reading on the capacitive sensor than no strip at all, but that elevated reading could simply become the new baseline against which to measure for human touch.
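That re-baselining idea can be sketched as a tiny bit of logic. This is an illustration of the approach rather than my actual project code, and the names and numbers are made up:

```cpp
// Sketch of re-baselining: the 12V strip raises the idle capacitive
// reading, so rather than a fixed threshold we treat a slow-moving
// average of untouched readings as the baseline, and flag a touch
// when a reading exceeds it by a margin. Values are illustrative.
struct TouchDetector {
    double baseline;  // slowly-tracked idle reading
    double margin;    // how far above baseline counts as a touch

    TouchDetector(double initialBaseline, double touchMargin)
        : baseline(initialBaseline), margin(touchMargin) {}

    bool isTouched(double reading) {
        if (reading < baseline + margin) {
            // Untouched: slowly track drift (e.g. the strip's offset)
            baseline = 0.99 * baseline + 0.01 * reading;
            return false;
        }
        return true;  // touched: leave the baseline alone
    }
};
```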

Here’s a diagram of the working circuit:

Fig 05. Circuit Diagram

Rapid Prototype (Code)

As mentioned briefly above, a lot of my problems went away when I started using filters. Tom demoed the SimpleKalmanFilter library in class, which worked really well at smoothing out the incoming capacitive sensor values.
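For illustration, the scalar update at the heart of a one-dimensional Kalman filter looks roughly like this. This is a simplified re-implementation for explanation’s sake, not the library’s exact code, and the parameter values are placeholders:

```cpp
#include <cmath>

// Simplified one-dimensional Kalman-style filter, in the spirit of a
// scalar Kalman update: blend each new measurement into the running
// estimate, weighted by how uncertain the estimate currently is.
struct ScalarKalman {
    double errMeasure;   // expected measurement noise
    double errEstimate;  // current estimate uncertainty
    double q;            // process noise (how fast the signal can move)
    double estimate = 0.0;

    ScalarKalman(double eMea, double eEst, double procNoise)
        : errMeasure(eMea), errEstimate(eEst), q(procNoise) {}

    double update(double measurement) {
        double gain = errEstimate / (errEstimate + errMeasure);
        double newEstimate = estimate + gain * (measurement - estimate);
        errEstimate = (1.0 - gain) * errEstimate
                      + std::fabs(estimate - newEstimate) * q;
        estimate = newEstimate;
        return estimate;
    }
};
```

Fed a noisy stream of readings, the estimate settles near the true value while single-sample spikes get heavily damped, which is exactly what the capacitive readings needed.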

Fig 06. Post-filtering normalised capacitive touch readings. The spikes indicate when I gently press the metal bar

The process for code was far more experimental and less directed than the fabrication and circuit, so instead of walking through the process I’ll point out some things I discovered/found interesting. For more detailed info on the code, the full commented code is on GitHub.

  1. CapacitiveSensor. Early on I discovered the CapSense library really slows down the loop, especially when the sensor is being touched. This is because the process is physical: the Arduino is waiting to see how long it takes for voltage to change on the sensor. The library offers an adjustable number of samples per reading; more samples means slower execution but higher accuracy. Since I’m only interested in a threshold value, and I’m using a filter, I can set the sample count very low to win back a little speed.
  2. SimpleKalmanFilter. Mentioned above, this library filters out noise. I used it to filter the capacitive sensor readings. Since the filter is a mathematical process rather than a physical one, taking few samples and then filtering is far more efficient than taking many samples and not filtering.
  3. DeltaTime. In retrospect there’s a much easier way to implement this, but I took the concept from game development and think the implementation is worth a mention. Since the CapSense library delays the loop by an indeterminate amount of time, getting a fade of consistent duration is a little more involved than the simplest implementation (using delay() and a fraction of the total fade time). What I do is keep track of how long the previous loop took to complete (by comparing its start and end millis() values), then use that duration as a multiplier for the fade increment in the next loop. If the previous loop took a long time, the multiplier is high and the increment is large; if the previous loop was quick, the next increment is small. It’s a quick and easy way to keep fades proportional, and I was excited to successfully implement it! In retrospect, though, I realised you could achieve the same thing by just using millis() as your fade offset…
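The delta-time idea boils down to a few lines: scale each fade increment by how long the previous loop took, so the fade runs at a constant real-time rate even when loop duration varies. The names here are illustrative; the real code lives on GitHub:

```cpp
// Delta-time fade: brightness advances at a fixed rate per millisecond,
// so a slow loop (e.g. a blocking capacitive read) produces one large
// increment and a fast loop produces a small one. Either way the fade
// covers the same distance in the same wall-clock time.
struct Fader {
    double brightness = 0.0;            // 0..255
    double ratePerMs = 255.0 / 2000.0;  // full fade in 2 seconds

    void update(unsigned long deltaMs) {
        brightness += ratePerMs * deltaMs;
        if (brightness > 255.0) brightness = 255.0;
    }
};
```

Whether the elapsed 1000 ms arrives as a hundred 10 ms loops or a handful of long ones, the fade ends up at the same point, which is the whole trick.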

Putting it together (version 001)

Though the light is yet to be anything near a connected pair of devices, I figured I’d put it in a neat package to demonstrate (and use!) as a simple reading light. The following is a pictographic overview of that process.

First I extended the aluminium bar to be about half a meter, or 50 individual 12V LED lights once doubled up. This length was a product of arbitrary decision, aesthetic best-guessing, and the amount of aluminium I had purchased.
I 3D printed some mounts for the new T shaped bar. It was a fun exercise to use my newfound 3D modelling chops (thanks to #100daysitp2019) on a physical project.
I glued the soldered LED strips to the T bar, then wrapped that in the diffusion fabric.
I found a broken power supply that I thought might be a nice enclosure.
It didn’t fit the breadboarded circuit 🙁
So it looked like I’d have to solder some stuff.
Soldering can be tedious or meditative, depending on how you look at it. This process was also a fun puzzle in pseudo-miniaturisation. There were some big components (heat sinks, a large capacitor, an MKR1000) that all had to fit.
I had to strip the power supply as there was no way that giant jack would fit inside with all the other stuff.
Don’t forget strain relief!
Extra strain relief in the form of a knot tied into the 12V LED power cable. Does it count if the strain relief looks pretty strained?
And then bring it all together.


Right now what I can confidently say I have is a reading light that is dimmable. The process of creating a finished working version has raised a few questions, however.

Is the current implementation the best dimming interaction for a reading light? The answer is probably not. I find it quite satisfying to hold the light and watch it fade, but if I was reading and trying to increase the brightness and it started decreasing first, I would likely become frustrated. This is an important consideration for the light’s usability as a reading light in the next iteration.

Given that there’s only one possible interaction (touch the light), how am I going to separate ‘sending a message to the other’ from ‘changing my own’? Although I haven’t focussed on the connectivity, my assumption throughout has been that it’s fine for a message to be sent whenever you change your own light. After all, isn’t the point that the other is present, and that each lamp occasionally (gently) reminds the other of its presence? What I didn’t fully consider until later in the process was: what if I want to send a message, but I don’t want my current light level to change? I think that when I begin version 002 (an actually connected version), I will have to separate the ‘change mine’ interaction from the ‘ping theirs’ interaction. This is unfortunate because I do like the simplicity of a single interaction, but I imagine this is a case of killing my darlings for the best result.

I could consider this project a failure in that I have not built any connection whatsoever, nor even implemented a successful simulation of one.

I could also consider this project a success in that I have created a not-too-bad looking reading light (in my opinion) that solves my frustration with my current reading light (glare from the LEDs), and sets up the possibility for a connection to a pair.

Instead, I will consider this project a great technical and creative learning journey, a wonderful prompt to think about the nature of and opportunities for long distance relationships, and perhaps the first physical computing project I’ve ‘completed’ that is both stable and useful enough for me to take home and use for a long time.

Light Observation Week 5

The first thing I notice here is how bright the clouds appear. This is certainly an artefact of the camera as I do not recall the clouds being this bright – even with glowy New York underneath.

This clip records 1 frame every 60 seconds. The timelapse accentuates some sort of flickering during the night on the street – I’m venturing that it is the lights of the ground floor apartments but that is a guess.

There’s a beautiful short moment around second 8 where the brightness of the sky descends towards the horizon: a soft line in the sky between pink and blue becomes a hard shadow on the ground a moment later.

The glittery window reflections make an interesting pattern on the building across at around the 30 second mark.

Three Battery/Cell Types

Last summer I tried my hand at building a drone. I learned a lot about drone battery use, and eventually settled on a 14.8V (4*3.7V), 4000mAh Lithium Polymer (LiPo) battery with a 10C rating
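As a side note, the C rating times the capacity gives a pack’s maximum continuous discharge current, so this battery could in theory supply 10 × 4 Ah = 40 A. A quick sketch of the arithmetic:

```cpp
// A pack's C rating multiplied by its capacity (in Ah) gives the
// maximum continuous discharge current in amps.
// e.g. a 10C, 4000 mAh (4 Ah) LiPo can supply up to 40 A continuously.
double maxDischargeAmps(double cRating, double capacityAh) {
    return cRating * capacityAh;
}
```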

In my last job we occasionally would use the lead acid batteries you can get at most hardware stores. The standard size we used was 12V, 7Ah, C5 capacity.

I also often use CR2032 coin cell batteries that provide 3V and around 0.2 Ah (200 mAh). I’m not sure of the C rating.

LDR Lamp Proposal


For the next few weeks I’ll be working on a light fixture for Light and Interactivity.

My light is inspired by the personal experience of being in a long term long distance relationship.

My plan is to create two reading lights to go above the headboards of two separate beds. They will function in two or three ways (the third is undecided – needs further thought)

  1. As a standalone reading light, perhaps with an auto-off feature after an hour or so (I sometimes fall asleep with the reading light on)
  2. As a subtle presence device for long distance partners. Currently my thought is that by touching one of the fixtures, the connected light will gently pulse; just enough to let the other person know you’re thinking of them, but not so much to interfere with their reading. Perhaps a reach goal here would be to have a ‘hold’ function, where if both lights are being touched the pulse intensifies.
  3. (This is the one I’m not sure about yet) Sometimes after I fall asleep with the light on my partner will turn it off. I wish I could say vice-versa but she’s too conscientious to fall asleep with a light on. So maybe there’s something in giving control of the other light to each fixture. But that could raise a few different questions: of agency, of whose job it is to turn the lights off, not to mention the technical difficulties.


I started throwing together concept sketches for what this lamp could look like. As of now I think a feasible option is a strip of LEDs in a square aluminium bar with one long side open. The LEDs face the open side, which has a thin layer of diffusion on it.

The ends of the bar could be capped with 3D printed caps, that could perhaps double as the housing for the electricals.

Light Observation Week 4

This wasn’t taken this week, but I’ve had the footage for a while and thought this was a good opportunity to do something with it.

The first thing that catches my eye in this timelapse is the vastly varying brightnesses as the clouds pass over the sun. It’s not something that we usually notice at normal speeds, but sped up it’s obvious.

Near the end, just before the sun dips out of sight, we can start to see the reddish tint that would become a beautiful orange sunset were it not for the clouds. This is most obvious on the bare brick wall on the left hand side of the image.

Once the sun disappears behind the clouds, the ambient light on everything immediately becomes blue, as the still-bright sky becomes the main, very diffuse source.

Lighting Observation Week 3

Very seldom do I leave the floor in time to witness the sun setting, which I lament greatly.

It’s interesting that the sky is still quite bright, yet the streetlamp is clearly a hotspot in the image.

I don’t think this is necessarily ‘Golden Hour,’ as the light is a little low for super romantic Hollywood scenes, yet it is still beautiful. The sky’s gradient transitions from pale blue above to a deep fiery red at the horizon.

Near the horizon, the orange/red/yellow (fiery colours) bleed through the trees, adding to the seemingly glowing effect.

Further up, the bottom of the clouds catch the redder light, and the top of the clouds seem bluer – closer to the colour of the sky. The reds create a lovely contrast to the clouds against the blue sky.

Light Observation Week 2

A street in New York on a rainy night

There’s something really compelling about headlights gleaming on a wet road. Usually tarmac is so matte, but add a splash of rain and it becomes very reflective.

The shape of the reflection is interesting; from the angle of the viewer the original source of each reflection is stretched out vertically.

There is a lovely texture to the reflections; the tiny bumps scatter the light so that you can see general pools of brightness and colour, but no detail.

The photo above has blown out the highlights, but in reality the pools of light reflecting from the headlights have a beautiful gradient of brightness.

Aside from the reflections, which draw so much of my attention, I love the way the light rain catches the bright beams of the headlights. Here you can see it best in front of the yellow cab. Just like dust or fog, this light rain gives the beams of light definition, volume.

Here’s a photo of the other direction, with similar scattered, long reflections.