We will not be broadcasting at 1pm today, because someone in the house is having their last day of sixth class (last day of primary school in Ireland!).
But we do hope to broadcast at some point during the day, so keep your eyes on Twitter, because we’ll give a shout out there. Or you can just watch the live feed, all day long, like a hawk, watching for a tasty Swift mouse to eat.*
Wishes do come true
One of the things we’re most excited about is having all the session videos published on the Developer app, so we can see what’s going to be talked about.
Here’s a brief list of things we’d love to see at WWDC:
Swift Playgrounds and Development on the iPad
New Swift Playgrounds challenges
I’m not saying our students are bored, but the existing content is so excellent and so well done, it would be amazing to get some more playgrounds during the week. I am an especially big fan of playgrounds like the Cipher playground that tell a story.
While we’re at it, The Code Hub Playgrounds in the official list of “More Playgrounds”
We’re not biased at all, but I’d personally love it if our playground feed was added to the app.
Again, I’m a huge fan of Everyone Can Code Puzzles, heck, we based an entire series on it. And the App Dev with Swift books are an incredible free resource to have. But I’d love to see the long-awaited Everyone Can Code Adventures ship!
More dev on iPad
Whether it’s Xcode on iPad or simply Swift syntax highlighting for Advanced > View Auxiliary Source Files in Swift Playgrounds, prepping folks for the full Xcode experience in Swift Playgrounds would be incredible…
Easier Playground Book Authoring
There is the Swift Playgrounds Author Template (for various versions of Xcode), which is a HUGE help, and it gets better every time they ship a new one for a new version of Xcode. But I would love to see 1) it announced more publicly (or at least to me 🙂 ), so I can grab the latest one rather than keeping older versions of Xcode and Swift around just for playground book authoring, and 2) it integrated into Xcode itself, maybe as a template for new projects.
Reality Composer and ARKit
The ability to drag images as objects into Reality Composer on the iPad
Maybe you can do this somehow, but the only way I’ve found to drag images into a Reality Composer scene (as objects, not anchors) is to do it in Reality Composer on the Mac and then edit the Reality Composer project on iOS.
While we’re at it, movies as objects for Reality Composer
We can always jump back to Xcode to add an AVPlayerLayer and AVPlayerItem to our scene and layer it on top of an image anchor, à la the moving photos in Harry Potter’s Daily Prophet, but wouldn’t it be amazing if we could add a movie as an object in Reality Composer and tweak its layout right in Reality Composer’s interface?
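For anyone who wants to try the Xcode route in the meantime, here’s a rough sketch of the moving-photo effect using ARKit and SceneKit, which happily accepts an AVPlayer as a material’s contents. This is untested illustration code: the class name and `videoURL` are placeholders, and it assumes your ARSCNView’s session is running a configuration with `detectionImages` set.

```swift
import ARKit
import AVFoundation
import SceneKit

// A sketch of a "Daily Prophet" effect: play a movie on top of a detected image.
// Assumes an ARSCNView running an ARWorldTrackingConfiguration with
// detectionImages set, and that `videoURL` points at a bundled movie file.
class MovingPhotoDelegate: NSObject, ARSCNViewDelegate {
    let videoURL: URL

    init(videoURL: URL) {
        self.videoURL = videoURL
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        // Size the video plane to match the physical size of the detected image.
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)

        // SceneKit accepts an AVPlayer as material contents, so the video
        // plays directly on the plane's surface.
        let player = AVPlayer(url: videoURL)
        plane.firstMaterial?.diffuse.contents = player

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2 // lie flat against the detected image
        node.addChildNode(planeNode)
        player.play()
    }
}
```

It works, but it’s exactly the kind of boilerplate you’d love Reality Composer to handle for you.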
I can’t wait until my students are all on the latest iPads with full-on LiDAR cameras on the back for playing around with our AR sessions.
The Triumphant Return of WebObjects
Who wouldn’t like to see a new version of WebObjects ship, EOF, D2W and all? Enterprise Objects Framework for the desktop, anyone?
Like Christmas in June
The above is a short list of the things we’ve run into in the last few months. Of course, there are a few of them we can address ourselves.
Regardless of what we wind up getting, I’m excited for improvements to existing frameworks, and I’m excited for whatever new technology the gang adds this year.
The Swift Student Challenge was an impressive start to the event. I can’t wait to point my students at more inspiring content for them to consume and start dreaming up what they’ll do with it.
So if you’ve been following our live streams or watching the replays on Twitch, you may have noticed some struggles with getting our streaming right.
We started out streaming exclusively to Twitch. This wasn’t due to some fervent loyalty. Bernie had been streaming on Twitch and I thought it would be a great platform to broadcast our coding lessons to.
I have my crack QA team downstairs taking the coding lessons themselves, so they would often run upstairs and tell me if the stream froze, went blocky, or dropped out altogether. As we’re teaching coding live, if you can’t see what I’m doing on the screen it’s really hard to follow along. That happened a few times, and my crack QA team was reduced to tears. Not quite what I was looking for from my audience.
I was also struggling, off and on, with the video feed from my iPhone camera.
The Next Stage
My goal was to stream the lessons for a bunch of kids in the area, and some parents weren’t keen on letting their kid loose on Twitch (call it a rough first landing for a few of them).
After trying a few different apps, including a home-rolled one, I gave up on the iPhone camera and just use the built-in FaceTime camera on my laptop for anything needing my face. The quality isn’t quite as high, but at least I’m not way out of sync.
So while I tinkered with OBS settings for the output and numerous camera apps, I also set up a Mobcrush account. This let me start streaming live to YouTube, Facebook, and Twitch, all from the one broadcast… not too shabby!
I could set OBS up to stream to Mobcrush easily enough:
And I was away… but I noticed, during the stream, that I was dropping around half the frames I had intended to broadcast… ouch.
To fix the dropped frames I reduced the output resolution (from 1920 x 1080 to 1280 x 720) and dropped the frame rate from 60 to 30 frames per second. Halving both the resolution and the frame rate cuts the pixels the encoder has to push each second to less than a quarter.
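If you’re fighting the same thing, the relevant knobs live in OBS Studio under Settings → Video. The values below are just what worked for this setup, not a recommendation:

```
# OBS Studio: Settings → Video (what worked here)
Base (Canvas) Resolution:   1920x1080   # capture at the screen's native size
Output (Scaled) Resolution: 1280x720    # downscale before encoding
Common FPS Values:          30          # down from 60
```

Dropping the output resolution and frame rate eases the load on both the encoder and the upload connection.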
I also plugged into an ethernet jack in the wall, whereas before I was streaming over WiFi…
That seemed to help… until today. For some reason my favorite testers on Twitch weren’t seeing anything on screen. There were no dropped frames, everything looked okay. But no Twitch. YouTube was streaming fine.
We don’t have it quite perfect (yet), but we’re pretty happy with our live streaming setup for our live coding sessions (weekdays at 1pm, Irish time).
Since we toyed with a few variations on the tech, we figured we’d write up what we went with, in the end, in case anyone else found it useful.
The basic session is about half an hour of coding and working through the material in the Everyone Can Code Puzzles book from Apple in Swift Playgrounds on an iPad. I tend to give a brief intro, on camera, to the session, then we dive into the iPad for exercises and illustrations of that lesson’s concepts.
It’s designed for kids from age 8 or so to 108, so long as they have an iPad and an internet connection. Ideally you’d have another screen, like a TV or laptop or other iPad to watch the coding session on while you follow along on the iPad.
The course material is geared towards people who have a limited understanding of what coding is. It introduces computing concepts in as painless a way as possible and is a great intro to programming.
For the basics we have a MacBook Pro from 2015 running Catalina, and a couple of our 2016 iPads from our Code Hub classroom, used for Swift Playgrounds and the Books app (and occasionally a couple of Keynote slides). And I have an iPhone X that I use as my camera.
I bought a Shure MV5 a few years ago and it works like a champ as a good-quality external mic. The mic is just off camera, pointed at my face, and it’s very near where I work through the day’s code on the iPad.
I also have a 7-port powered USB hub for charging and maintaining The Code Hub iPads. It’s been pressed into service as my hub for connecting the iPads and iPhone, which has been a lifesaver when I need to switch between multiple iPads while keeping everything else connected.
Lastly, I have a USB-powered light/phone holder from TK Maxx (I still find it weird to say or type TK and not TJ). We bought it for a lark just before things shut down, but it’s been pretty great for lighting video calls and the sessions, as well as a camera mount.
I’ve floundered around most with the software for our setup, for sure. I tried a number of streaming apps for the Mac, including some that did multi-streaming, but in the end I really like OBS Studio’s feature set.
In the beginning, I just used OBS to stream to Twitch, using scenes to switch between the different elements: an image I captured of a Keynote slide for the intro; video from my phone, connected over USB, just previewing me in the normal camera app; and then the screen capture from the iPad.
Now I think I’ve got it a *little* better. I use OBS Camera, which has a plugin for OBS Studio and an app for the phone, which presents a much better video feed of me. My scene switching has gotten a little better, but I’m sure there are loads of tricks I’m missing.
OBS Studio is configured to save the recording to disk (in MKV format, which YouTube will accept) as well as stream it. After each session I upload the saved video to YouTube and chuck it in the kids.code() playlist.
I watched this video to help me use the software a little better (but I could probably do with watching it again):
Like I said, I’m not 100% sure I have my audio setup perfect, but it’s working well, for now.
I disabled my laptop’s mic and just use the external mic for my audio. I also mute the OBS Camera mic, as you can see in this screenshot:
My opening Scene is the intro screen, which is just an image that I put up when I start the stream. The Intro 2 group is my camera plus a link screen I usually prepare the morning of the session, with some tips for the day.
Once I’ve finished the intro I select Scene 3, which is the iPad(s), and I transition to that, and I’m off camera for the duration.
We’ve chucked a duvet up on an old clothes rack from Argos to help with the audio quality, and it does seem to damp the echo down a bit.
And that’s it! It’s not the most sophisticated stream in the world. It’s not the slickest. But I can run my classes online now, and I’ve had a ton of fun doing it.