core.async with aleph

Tue, 16 Dec 2014 07:34:29 +0000

(More sledge retrospective)

There was a point about three weeks ago when I thought I had a working audio player; then I tried using it on the phone and got awkward screeches through my stereo every thirty seconds when I told it to play Ziggy Stardust. No, I’m not talking about David Bowie’s voice here; this was genuine “a dog ate my CD” style digital audio corruption. The problem seemed to appear only over wifi: I could replicate it on my laptop, but it didn’t show up on localhost and it didn’t show up over an ssh tunnel. I suspect it was something related to buffering/backpressure, and, facing the prospect of debugging Java code with locks in it, I punted and decided to try switching HTTP servers instead.

Documentation on HTTP streaming from core.async channels with Aleph is kind of sparse, at least insofar as it lacks a simple example of the kind of thing that should work. So here is my simple example of the kind of thing that worked for me: wrap the channel in a call to manifold.stream/->source and make sure that the things received on it are byte arrays:

;; assumes the ns form requires [manifold.stream :as manifold]
(defn transcode-handler [request pathname]
  {:status 200
   :headers {"content-type" "audio/ogg"
             "x-hello" "goodbye"}
   :body (manifold/->source (transcode-chan pathname))})

(from server.clj)

I’m sure there are other things you could put on the channel that would also work, but I don’t know what they are. java.nio.ByteBuffer doesn’t seem to be one of them, but I’m only going on git commit history and a very fuzzy recollection of what I was doing that day; it might be that I did something else wrong.
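For completeness, here is a rough sketch of the shape transcode-chan might take: a channel fed byte arrays read from a transcoder subprocess. This is not sledge’s actual implementation; the ffmpeg invocation, buffer size, and (absent) error handling are made up for illustration.

(require '[clojure.core.async :as async])

(defn transcode-chan
  "Hypothetical sketch: run a transcoder and put its stdout on a
  channel as byte arrays, closing the channel at EOF."
  [pathname]
  (let [ch (async/chan 16)
        proc (.exec (Runtime/getRuntime)
                    (into-array ["ffmpeg" "-i" pathname "-f" "ogg" "-"]))]
    (async/thread
      (with-open [stdout (.getInputStream proc)]
        (loop []
          (let [buf (byte-array 4096)
                n (.read stdout buf)]
            (if (neg? n)
              (async/close! ch)
              ;; trim the buffer to what was actually read, so consumers
              ;; only ever see full byte arrays
              (do (async/>!! ch (java.util.Arrays/copyOf buf n))
                  (recur)))))))
    ch))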


Using the HTML5 audio element in Om

Mon, 15 Dec 2014 00:07:51 +0000

A quick one: if you want to render the HTML5 audio element with Om and do stuff with the events it raises, you will find that the obvious answer is not the right one. Specifically, this doesn’t work:

(dom/audio #js {:controls true
                :autoPlay true
                :ref "player"
                :src bits
                :onEnded #(do-something)})

This might be because React has to be taught about each event that each element can trigger and it doesn’t know about this one, or it might be because (it is alleged that) event handling in React is done by placing a single event handler on the top-level component and then expecting events on sub-elements to bubble up. According to Stack Overflow, audio events don’t bubble.

The workaround is to add the event listener explicitly in IDidMount, and to call addEventListener with its third parameter true, meaning that the event is captured by the parent before it even gets to the sub-element to be swallowed. Like this:
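Here is a minimal sketch of the shape of that workaround; the player-view component name, the do-something handler, and the :src app-state key are stand-ins rather than sledge’s actual code.

(ns example.player
  (:require [om.core :as om]
            [om.dom :as dom]))

(defn do-something []                       ; hypothetical handler
  (.log js/console "track ended"))

(defn player-view [app owner]
  (reify
    om/IDidMount
    (did-mount [_]
      ;; capture=true (third argument) means this listener fires on the
      ;; way *down* to the audio element, so the non-bubbling "ended"
      ;; event still reaches us
      (.addEventListener (om/get-node owner)
                         "ended"
                         (fn [_] (do-something))
                         true))
    om/IRender
    (render [_]
      (dom/div nil
        (dom/audio #js {:controls true
                        :autoPlay true
                        :ref "player"
                        :src (:src app)})))))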

