DocC for Building Tutorials

One of my favorite things from WWDC21 was the DocC stuff. The funny thing is, I had NO IDEA it would be one of my favorites until Saturday, when I was catching up on some sessions.

There are the heavy hitters, of course, like the SwiftUI sessions and the announcement that you’ll be able to build and submit apps to the App Store from the next version of Swift Playgrounds. But this was a sneaky, wonderful surprise.

Go check out Building Interactive Tutorials with DocC. That’s an amazing session that I, as someone who tries to teach coding, found incredibly useful. You can build tutorials like the ones for SwiftUI and App Dev Training that Apple have shipped.

Build Your Own

That’s a good start, building rich tutorials that show up in the Xcode documentation window. Buuuuuuut, if you check out Host and automate your DocC documentation, you’ll see that you can put it up on your own site!

You need Xcode 13, the beta, to build documentation, but your students don’t. You can export the DocC archive, put it up on a website, and away you go!

Now, it was a bit more work than just “upload it to a website and you’re good to go.” I’ll go into the details of what you need to do, from a practical standpoint, in the future.

But for now, you can browse the start of a series of tutorials all centered around the QuestionBot.

It’s a Swift package based around a QuestionBot type. The tutorial series takes you through building the QuestionBot as a command-line tool on replit.com, then using UIKit to build the project you see in Develop in Swift Explorations, and then, finally, the SwiftUI implementation you may have seen (or will see) in our Teaching Develop in Swift Online class!
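To give you a sense of the shape of it, here’s a rough sketch of the kind of type the package is built around. The names are simplified for illustration, and the actual QuestionBot API in the package will differ (that’s what the documentation page is for):

```swift
// A rough sketch of a QuestionBot-style type, for illustration only.
// The real package's API may differ; check its documentation page.
struct QuestionBot {
    /// Produces an answer for the question it's asked.
    func response(to question: String) -> String {
        if question.hasSuffix("?") {
            return "That's a great question!"
        } else {
            return "I only respond to questions."
        }
    }
}

// The same type can sit behind a command-line tool, a UIKit app,
// or a SwiftUI app, which is the arc the tutorial series follows.
let bot = QuestionBot()
print(bot.response(to: "What time is it?"))
```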

At the moment there’s only the first tutorial; the others are coming soon. But I love the UI for the tutorials and how much easier it was to throw it together this way.

There’s also a really nice, comprehensive documentation page where you can check out the API for QuestionBot and still navigate to the tutorials to see how you might use that code in practice.

Like I said, I’ll detail how I built that tutorial in a future post, but in the meantime you can see the results. It’s rare that we get to ship stuff they just announced at WWDC, and I’m so psyched about writing some more tutorials and having them look excellent out of the gate!

Diversity in Swift

Just recently, the Swift team announced a new effort to promote diversity in Swift.

As someone who tries to make sure people who want to learn how to code get the best introduction they can, I love this effort. We, as a software engineering community in general, can always use more voices and perspectives.

They have a great group of inspirational folks on the working group helping showcase diversity and bring aspects of coding to light that you might not otherwise consider.

Swan’s Quest and Accessibility

A great example of this is the Swan’s Quest session at the Worldwide Developer Conference this year: https://developer.apple.com/wwdc20/10681.

In that session they introduced us to a puzzle we solved by implementing accessibility for those folks who use the screen reader technology built into their iPads. We covered this session during our Coding at Home series.
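If you haven’t watched it, the heart of “implementing accessibility” there is giving VoiceOver something meaningful to say about what’s on screen. This isn’t the Swan’s Quest puzzle code, just a minimal SwiftUI sketch of the idea (the button and note name are made up):

```swift
import SwiftUI

// A minimal sketch of making a visual control usable with VoiceOver.
// Not the Swan's Quest code; the button and note name are made up.
struct ToneButton: View {
    let noteName: String        // e.g. "Middle C"
    let play: () -> Void

    var body: some View {
        Button(action: play) {
            Image(systemName: "music.note")
        }
        // Without a label, VoiceOver can only announce "button".
        .accessibilityLabel("Play \(noteName)")
        .accessibilityHint("Plays this note so you can hear the melody.")
    }
}
```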

It’s a nice reminder to us that we’re building software for more than just ourselves; there’s a whole raft of people out there who might benefit from the code we write.

Diversity in Thought

The other aspect I love about this effort is that it makes us aware of our different backgrounds and how much that can contribute to really interesting problems and solutions.

Where you’re coming from is not 100% the same as where anyone else is coming from, so you always have something to contribute to the conversation… and can always learn from someone else’s experience.

This includes great groups like Women in Swift and Black in Swift, but also other diverse voices.

Heck, I’m not super diverse myself, but as a guy with an English degree in an engineering field I bring another perspective to the party (it’s not always useful, but I can tell you a story, anyway 😊).

Some of the best engineers I’ve ever worked with didn’t study computer science at university or college, but were art history majors, or geology Ph.Ds, or didn’t even go to university.

So I can’t wait to welcome you to our coding sessions, no matter your background, race, gender, or any other variation you can throw at us!

EU Code Week: Let’s Go, Teachers!

Here we go, EU Code Week begins early!

We’ll be live today at 4pm, Irish/UK time, talking about how to host these coding sessions in your own classroom.

You can still register, though it’s not necessary: https://ti.to/the-code-hub/quick-start-to-code-follow-along-how-to-use-these-sessions.

Smooth Sailing

The goal of today’s session is to answer any questions you might have about the coding sessions. We’ll cover how to teach them in class, what gotchas to look out for, and when we’ll be live.

Of course, we have the mini-site, which we’ll show off, but it’s always helpful to have a guide along with you.

No matter the level of coding experience you or your students have, we’ll explain what material we have to offer, and how to tailor it to your experience level. We have ways to challenge advanced students, and ‘unplugged’ sessions to bring computing concepts home in a tactile way.

Your perfect classroom setup

The Place and Time for Questions

We’ll have time for questions during our live follow-along coding sessions, but today’s session is just for the teachers. Do you have questions about running this in class? What do you need? How long should you budget? Will you be able to follow along?

We’ll address all of those questions today (the short answers are: iPads, 45 minutes or so, and yes, definitely), along with any others you might have.

So grab a cup of tea or coffee after school, hit up https://youtu.be/CxKrMvkpJbs, and we’ll give you the inside scoop on how you can get your class coding for EU Code Week.

Coding at Home: June 23rd, Grad Break & WWDC20

We will not be broadcasting at 1pm today, because someone in the house is having their last day of sixth class (last day of primary school in Ireland!).

But we do hope to broadcast at some point during the day, so keep your eyes on Twitter, because we’ll give a shout out there. Or you can just watch the live feed, all day long, like a hawk, watching for a tasty Swift mouse to eat.*

Wishes do come true

One of the things we’re most excited about is that all the sessions are now up on the Developer app, so we can see what’s going to be talked about.

And take a look!

There’s a whole new set of Swift Playgrounds AND a session on creating your own content, checking off a few big items on our wish list!

Today at 6pm, Irish time, is when all these sessions drop, and the first of the new content is presented: Swan’s Quest, Chapter 1: Voices in the dark.

Another session we might find relevant tonight is What’s New in RealityKit.

If you missed the Keynote yesterday, you can catch up with this short video recapping the announcements.

So we likely won’t see you today, but we can’t wait to get back to it tomorrow with you! Keep your eyes on those WWDC videos and Twitter and YouTube and we’ll see you on June 24th!

* Disclaimer: Do not eat Swift Playgrounds.

The Code Hub’s WWDC2020 Wishlist

At this stage, you can almost taste WWDC2020 and we’ve been talking with our students about what might come out of the sessions this week.

Since we’ve been immersed in Swift Playgrounds and Reality Composer for the last few months, I, for one, have a few things on my wish list for WWDC.

The Wish List

Here’s a brief list of things we’d love to see at WWDC:

Swift Playgrounds and Development on the iPad

  • New Swift Playgrounds challenges
    • I’m not saying our students are bored, but the existing content is so excellent and so well done that it would be amazing to get some more playgrounds during the week. I am an especially big fan of playgrounds like the Cipher playground that tell a story.
  • While we’re at it, The Code Hub Playgrounds in the official list of “More Playgrounds”
    • We’re not biased at all, but I’d personally love it if our playground feed was added to the app.
  • New Books
    • Again, I’m a huge fan of Everyone Can Code Puzzles; heck, we based an entire series on it. And the App Dev with Swift books are an incredible free resource to have. But I’d love to see the long-awaited Everyone Can Code Adventures ship!
  • More dev on iPad
    • Whether it’s Xcode on iPad or simply Swift syntax highlighting for Advanced > View Auxiliary Source Files in Swift Playgrounds, prepping folks for the full Xcode experience in Swift Playgrounds would be incredible…
  • Easier Playground Book Authoring
    • There is the Swift Playgrounds Author Template (for various versions of Xcode), which is a HUGE help. And it gets better every time they ship a new one for the new version of Xcode. But I would love to see 1) it announced more publicly (or at least to me 🙂 ) so I can go grab the latest one rather than keeping around older versions of Xcode and Swift to do my playground book authoring, and 2) it integrated into Xcode itself, maybe as a template for new projects.

Reality Composer and ARKit

  • The ability to drag images as objects into Reality Composer on the iPad
    • Maybe you can do this somehow, but the only way I’ve found to drag images into a Reality Composer scene (as objects, not anchors) is to do it in Reality Composer on the Mac and then edit the Reality Composer project on iOS.
  • While we’re at it, movies as objects for Reality Composer
    • We can always jump back to Xcode to add an AVPlayerLayer and AVPlayerItem to our scene and layer it on top of an image anchor, à la Harry Potter’s Daily Prophet moving photos (there’s a rough sketch of that workaround after this list), but wouldn’t it be amazing if we could add it as an object in Reality Composer and tweak the way it’s laid out in Reality Composer’s interface?
  • More LiDAR!
    • I can’t wait until my students are all on the latest iPads with full-on LiDAR cameras on the back for playing around with our AR sessions.
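Until then, here’s roughly what that jump-back-to-Xcode workaround can look like. This is just a hedged sketch: it uses RealityKit’s VideoMaterial rather than an AVPlayerLayer, and the image and movie names are made up. You’d add the returned anchor to your ARView’s scene once the session is running.

```swift
import RealityKit
import AVFoundation

// A hedged sketch of the "moving photo" workaround: play a movie on a
// plane anchored to a reference image. Uses RealityKit's VideoMaterial
// (iOS 14+) instead of AVPlayerLayer; "DailyProphet" and "prophet.mp4"
// are made-up resource names.
func makeMovingPhotoAnchor() -> AnchorEntity {
    // Anchor on a reference image from the AR Resources asset catalog.
    let anchor = AnchorEntity(.image(group: "AR Resources", name: "DailyProphet"))

    // Build a video material from a bundled movie file.
    let movieURL = Bundle.main.url(forResource: "prophet", withExtension: "mp4")!
    let player = AVPlayer(url: movieURL)
    let material = VideoMaterial(avPlayer: player)

    // Lay a plane over the image, sized to match it (20cm x 30cm here, assumed).
    let plane = ModelEntity(
        mesh: .generatePlane(width: 0.2, depth: 0.3),
        materials: [material]
    )
    anchor.addChild(plane)
    player.play()

    return anchor
}
```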

WebObjects!

  • The Triumphant Return of WebObjects
    • Who wouldn’t like to see a new version of WebObjects ship, EOF, D2W, and all? Enterprise Objects Framework for the desktop, anyone?

Like Christmas in June

The above is a short list of the things we’ve run into in the last few months. Of course, there are a few of them we can address ourselves.

Regardless of what we wind up getting, I’m excited for improvements to existing frameworks. I’m excited for the new technology the gang will add this year.

The Swift Student Challenge was an impressive start to the event. I can’t wait to point my students at more inspiring content for them to consume and start dreaming up what they’ll do with it.

See you at the virtual Jamba Juice stand!

Behind the Scenes: Live Streaming Lessons

So if you’ve been following our live streams or watching the replays on Twitch, you may have noticed some struggles with getting our streaming right.

The Beginning

We started out streaming exclusively to Twitch. This wasn’t due to some fervent loyalty. Bernie had been streaming on Twitch and I thought it would be a great platform to broadcast our coding lessons to.

I use OBS, as I talked about in another post, which made it super easy to broadcast to Twitch.

I have my crack QA team downstairs taking the coding lessons themselves, so they would often run upstairs and tell me if the stream froze, went blocky, or dropped out altogether. As we’re teaching coding live, if you can’t see what I’m doing on screen, it’s really hard to follow along. That happened a few times, and my crack QA team was reduced to tears. Not quite what I was looking for from my audience.

I was also struggling, off and on, with the video feed from my iPhone camera.

The Next Stage

My goal was to stream the lessons for a bunch of kids in the area, and some parents weren’t keen on letting their kid loose on Twitch (call it a rough first landing for a few of them).

Action!

After trying a few different apps, including a home-rolled one, I gave up on the iPhone camera and just use the built-in FaceTime camera on my laptop for anything needing my face. The quality isn’t quite as high, but at least I’m not way out of sync.

So while I tinkered with OBS settings for the output and numerous camera apps, I also set up a Mobcrush account. This let me start streaming live to YouTube, Facebook, and Twitch, all from the one broadcast… not too shabby!

I could set OBS up to stream to Mobcrush easily enough:

And I was away… but I noticed, during the stream, that I was dropping around half the frames I had intended to broadcast… ouch.

To fix the frame-dropping issue I reduced the resolution (from 1920 x 1080 to 1280 x 720) and dropped the frame rate from 60 to 30 frames per second.

I also plugged into an ethernet jack in the wall, whereas before I was streaming over WiFi…

Until Today

That seemed to help… until today. For some reason my favorite testers on Twitch weren’t seeing anything on screen. There were no dropped frames, and everything looked okay on my end. But no Twitch. YouTube was streaming fine.

So I’ve decided to switch it up a little. Instead of streaming to Twitch, YouTube, Facebook, and more, I’ll just be posting to YouTube from now on.

Maybe our next step will be to add back in the iPhone camera feed. But for now, I think we’re happy with this setup. The goal is just to do whatever lets folks follow along better at home.

Time will tell how that works out, but the saga continues!

See you online, weekdays at 1pm, Irish time!

Our Live Sessions Setup

We don’t have it quite perfect (yet), but we’re pretty happy with our live streaming setup for our live coding sessions (weekdays at 1pm, Irish time).

Since we toyed with a few variations on the tech, we figured we’d write up what we went with, in the end, in case anyone else finds it useful.

The Session

The basic session is about half an hour of coding, working through the material from Apple’s Everyone Can Code Puzzles book in Swift Playgrounds on an iPad. I tend to give a brief on-camera intro to the session, then we dive into the iPad for exercises and illustrations of that lesson’s concepts.

It’s designed for kids from age 8 or so to 108, so long as they have an iPad and an internet connection. Ideally you’d have another screen, like a TV, a laptop, or another iPad, to watch the coding session on while you follow along on your iPad.

The course material is geared towards people who have a limited understanding of what coding is. It introduces computing concepts in as painless a way as possible and is a great intro to programming.

The Gear

For the basics we have a MacBook Pro from 2015 running Catalina and a couple of our 2016 iPads that we use in our Code Hub classroom for Swift Playgrounds and the Books app (and occasionally a couple of Keynote slides). And I have an iPhone X that I use as my camera.

The recording setup

I bought a Shure MV5 a few years ago, and it works like a champ as a good-quality external mic. The mic is just off camera, pointed at my face, and it’s very near where I work through the day’s code on the iPad.

I also have a 7-port powered USB hub for charging and maintaining The Code Hub iPads, and it’s been pressed into service as my hub for connecting the iPads and iPhone. It’s been a lifesaver when I need to switch between multiple iPads and keep all the other stuff connected, too.

Lastly, I have a USB-powered light/phone holder from TK Maxx (I still find it weird to say or type TK and not TJ). We bought it for a lark just before things shut down, but it’s been pretty great for lighting video calls and the sessions, as well as a camera mount.

The Software

I’ve floundered around most with the software for our setup, for sure. I tried a number of streaming apps for the Mac, including some that did multi-streaming, but in the end I really like OBS Studio’s feature set.

In the beginning, I just used OBS to stream to Twitch, and I used scenes to switch between the different elements: an image I captured of a Keynote slide for the intro, video capture from my phone (connected by USB, just previewing me in the normal camera app), and then the screen capture from the iPad.

Now I think I’ve got it a *little* better. I use OBS Camera, which pairs an OBS Studio plugin with an app on the phone and presents a much better video feed of me. My scene switching has gotten a little better, but I’m sure there are loads of tricks I’m missing.

OBS Studio is configured to save the recording to disk (in MKV format, which YouTube will accept) as well as stream it. After each session I upload the saved video to YouTube and chuck it in the kids.code() playlist.

I watched this video to help me use the software a little better (but I could probably do with watching it again):

Audio Setup

Like I said, I’m not 100% sure I have my audio setup perfect, but it’s working well, for now.

I disabled my laptop’s mic and just use the external mic for my audio. I also mute the OBS Camera mic, as you can see in this screenshot:

My opening Scene is the intro screen, which is just an image that I put up when I start the stream. The Intro 2 group is my camera and a link screen I usually prepare the morning of the session, with some tips for the day.

Once I’ve finished the intro, I select Scene 3, which is the iPad(s), transition to that, and I’m off camera for the duration.

We’ve chucked a duvet up on an old clothes rack from Argos to help out with audio quality, and it does seem to damp the echo down a bit.

And that’s it! It’s not the most sophisticated stream in the world. It’s not the slickest. But I can run my classes online now, and I’ve had a ton of fun doing it.