Coding at Home: Augmented Reality in Swift Playgrounds!

Join us today at 1pm for a bunch more work with Swift Playgrounds and augmented reality!

Today’s Session

We’re back to Swift today!

We’ve done a lot with Reality Composer these last few sessions; now we’re going to dive back into the playgrounds and try our hand at writing some code to build an augmented reality scene!

This is an intermediate playground, so we’ll take our time and ease back into coding after a few days away!

I can’t wait to see what we create!

See you at 1pm, Irish time.

Coding at Home: June 16th, Improving Image Anchors for Reality Composer

Happy Bloomsday!

Join us today at 1pm for some more fun with Reality Composer.

Recap

Yesterday we went over adding an image anchor for a Reality Composer scene. We picked a new book and showed you how to take a picture of it and trim it into a good target image.

We also built a series of scenes so we wouldn’t see our assets while we fished around, looking for our image anchor in the real world.

Once we found it we showed a tab we could tap on to get more info about the main character.

This way we can export our experience and share it with friends.

From that character page we then added an arrow to navigate back to the previous scene.

Today’s session

Today we’ll look at what makes a good image anchor and what doesn’t. We’ll give you some tips and tricks for making sure your image gets recognized. And we’ll also look at what’s happening with our image anchor.

So catch up with us and we’ll play around with some more augmented reality!

Coding at Home: June 15th, Reality Composer and Image Anchors Again

Join us today at 1pm for more with Reality Composer and image anchors!

Last week

We had so much crammed into Friday’s session that it might have been a little overwhelming.

So we’re going to cover some of that same stuff again today.

Today’s session

On Friday we used a book cover as our anchor. That meant taking a photo of the front of our book and importing it into Reality Composer to use as our image anchor. We had to set the size we expected our image to be in real life so RealityKit had an easier time recognizing it.

We also built a small table of contents viewer that navigated to different scenes.

That’s a lot of moving pieces!

Like the book we picked says, “Don’t Panic.”

We’ll go over all of it again today, nice and slow, to make sure we get it.

See you at 1pm!

Coding at Home: June 12th, Reality Composer and Augmenting a Book Cover

Join us today at 1pm for our augmented reality session!

You’ll need a book handy, or something else you can get a good picture of and use as your image anchor.

Recap

For yesterday’s session we built some images in Keynote specifically to print out and use as augmented reality anchors.

We were able to create a mini school tour of the different departments, with a logo for each discipline.

Today’s Session

That may have been a bit unfair, especially to those of you following along at home, live. We built the images in Keynote with you, but we already had them printed out and ready to go!

So today, I want you to make sure you have a book lying around.

We’re going to build an AR experience off the cover of your favorite book. (Or whatever book is handy.)

So come join us at 1pm, Irish time!

Coding at Home: June 11th, Reality Composer, Add an Image Anchor

Join us today on YouTube Live for our stream at 1pm, Irish time!

(For a note on our new streaming location, you can read the somewhat excruciatingly boring story of our streaming woes here.)

Recap

It turns out our session yesterday was a little bit plagued by technical issues, if you were watching on Twitch.

But what we went through, besides saying, “We can’t see anything!” a lot, was the very beginning of creating an AR scene that will use an image as its anchor.

Today’s session

Because yesterday was such a shambles, today we’re going to repeat most of what we did.

Image anchors

We’ll demonstrate creating some good images for anchors to use in our AR scenes. I used Keynote yesterday, and I’ll show you a good way to get shapes. Take a screenshot of your slide, print it out, and that’s the image we’ll transpose a virtual object onto in our AR experience.

So join us today and we’ll get creating some amazing augmented reality scenes!

Behind the Scenes: Live Streaming Lessons

So if you’ve been following our live streams or watching the replays on Twitch, you may have noticed some struggles with getting our streaming right.

The Beginning

We started out streaming exclusively to Twitch. This wasn’t due to some fervent loyalty. Bernie had been streaming on Twitch and I thought it would be a great platform to broadcast our coding lessons to.

I use OBS, as I talked about in another post, which made it super easy to broadcast to Twitch.

I have my crack QA team downstairs taking the coding lessons themselves, so they would often run upstairs and tell me if the stream froze, went blocky, or went away altogether. As we’re teaching coding live, if you can’t see what I’m doing on the screen it’s really hard to follow along. That happened a few times, and my crack QA team was reduced to tears. Not quite what I was looking for from my audience.

I was also struggling, off and on, with the video feed from my iPhone camera.

The Next Stage

My goal was to stream the lessons for a bunch of kids in the area, and some parents weren’t keen on letting their kid loose on Twitch (call it a rough first landing for a few of them).

Action!

After trying a few different apps, including a home-rolled one, I gave up on the iPhone camera and just used the built-in FaceTime camera on my laptop for anything needing my face. The quality isn’t quite as high, but at least I’m not way out of sync.

So while I tinkered with OBS settings for the output and numerous camera apps, I also set up a Mobcrush account. This let me start streaming live to YouTube, Facebook, and Twitch, all from the one broadcast… not too shabby!

I could set OBS up to stream to Mobcrush easily enough.

And I was away… but I noticed, during the stream, that I was dropping around half the frames I had intended to broadcast… ouch.

To fix the frame dropping issue I reduced the resolution (to 1280 x 720 from 1920 x 1080) and dropped the frames per second from 60 to 30.
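The arithmetic behind that change is worth seeing: halving both dimensions and halving the frame rate dramatically cuts the number of pixels the encoder has to push out every second. A quick back-of-the-envelope check in Swift:

```swift
// Pixels per second the encoder has to process at each setting.
let before = 1920 * 1080 * 60  // 1080p at 60 fps
let after  = 1280 * 720 * 30   // 720p at 30 fps

// The ratio of the two is the reduction in encoding work.
print(Double(before) / Double(after)) // 4.5
```

So that one settings change asks the encoder (and the upload connection) to handle four and a half times less data per second.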

I also plugged into an ethernet jack in the wall, whereas before I was streaming over WiFi…

Until Today

That seemed to help… until today. For some reason my favorite testers on Twitch weren’t seeing anything on screen. There were no dropped frames, everything looked okay. But no Twitch. YouTube was streaming fine.

So I’ve decided to switch it up a little. Instead of streaming to Twitch, YouTube, Facebook, and more, I’ll just be posting to YouTube from now on.

Maybe our next step will be to add back in the iPhone camera feed. But for now, I think we’re happy with this setup. Just anything that lets folks follow along better at home.

Time will tell how that works out, but the saga continues!

See you online, weekdays at 1pm, Irish time!

Coding at Home: June 10th, Anchoring Images in Reality

Come hang out with us today on Twitch at 1pm!

Recap

We finished up our Lunar Lander game yesterday with a little demo of how you’d hook this game up in Xcode. We won’t be going any further (yet) with Xcode, but if you want to check it out, I put the source code for yesterday’s demo up at https://github.com/mhanlon/LunarLander.

If you don’t feel like you’re ready for it just yet, don’t worry, we’ll get to Xcode soon enough!

Today’s session

We’re going to try out something new today…

We’ll learn how to use an image anchor.

If you’ve watched the Harry Potter movies, or maybe you’ve read the books and imagined what the Daily Prophet must have looked like with its moving pictures, well, this is the sort of thing we can do with image anchors.

With an image anchor we can attach our augmented reality content to a particular pattern found in the real world.
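In code terms, RealityKit expresses this idea as an anchor entity that targets an image. This is just a sketch, not something from today’s playground; the resource group name “AR Resources” and the image name “bookCover” are placeholders for an image you’ve added to your own project:

```swift
import RealityKit

// Anchor our content to a recognized image in the real world.
// "AR Resources" and "bookCover" are placeholder names.
let anchor = AnchorEntity(.image(group: "AR Resources", name: "bookCover"))

// Anything parented to the anchor appears stuck to the image
// once ARKit spots it in the camera feed.
let label = ModelEntity(mesh: .generateText("Don't Panic!"))
anchor.addChild(label)

// arView.scene.addAnchor(anchor) // add to your ARView's scene
```

Reality Composer builds this sort of thing for us visually, but it’s nice to see how small the underlying idea is.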

For starters, we’ll talk about some strategies for making good anchor images.

I love this part of augmented reality and can’t wait to show you how to stick content to your own images!

See you at 1pm!

Coding at Home: June 9th, Lunar Lander, Where to Go from Here

Come on our Reality Composer adventure today at 1pm on Twitch!

Recap

Yesterday we had kind of a rocky feed, but we added a landing pad for our rocket and detected when our ship collided with the pad. Not too shabby.

You can re-watch yesterday’s session here: https://youtu.be/zIcCp8J6Grg

Today’s session

For today’s work we’re going to finish off our lunar lander game (for now) with a little bit of groundwork for what might come next.

In the image above I’ve added a Boost button, because maybe I’m a little frustrated, as a player, with the boost buttons being in augmented reality. I don’t want to have to hunt for them when I just want to keep my rocket from crashing.


We’ll hook up some actions and triggers that will allow us to write code to recognize things that happen in our AR scene. Like a fuel gauge, for example. Each time we boost the rocket we spend some fuel. In a real game we would want to keep track of how much fuel we have left.
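To make the fuel idea concrete, here’s a hypothetical sketch of the kind of state we’d track in our own code once those triggers reach us. The names (FuelTank, burn) are ours, not from the Reality Composer project:

```swift
// A toy model of the fuel we'd spend each time the Boost button fires.
struct FuelTank {
    private(set) var fuel: Double
    let costPerBoost: Double

    // Spend fuel for one boost; returns false when the tank is too empty.
    mutating func burn() -> Bool {
        guard fuel >= costPerBoost else { return false }
        fuel -= costPerBoost
        return true
    }
}

var tank = FuelTank(fuel: 10, costPerBoost: 2.5)
_ = tank.burn()  // each Boost tap spends 2.5 units
print(tank.fuel) // 7.5
```

Each trigger from the AR scene would call burn(), and when it returns false our game could disable the Boost button.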

We’ll spend the first part of the session working through some additions to the Reality Composer project on the iPad.

The second half (or so) will be a little demo of what’s possible. You can follow along, if you have a Mac and Xcode installed.

We’ll be running a session, soon, where we introduce you to Xcode and the next logical step in your development, as a coder. But this will be just the start of dipping our toes in.

See you at 1pm!

Coding at Home: June 8th, Reality Composer Goes Back to the Moon

Hang out with us today at 1pm to try and finish off our lunar lander-style game!

Recap

Last week we had a rocket ship on its way to land on the moon. We hooked up a button to apply force to the rocket to keep it from crashing straight down to the ground.
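Under the hood, the physics world is doing something like the following every frame: gravity pulls the rocket down, and each boost adds acceleration against it. This is a toy sketch with made-up numbers (only the lunar gravity constant is real), just to show the idea:

```swift
// A very simplified lunar lander physics step.
struct Rocket {
    var height: Double   // metres above the ground
    var velocity: Double // metres per second, positive = upward
}

let gravity = -1.62 // lunar surface gravity, m/s^2

// Advance the simulation by dt seconds, with optional boost acceleration.
func step(_ rocket: inout Rocket, dt: Double, boost: Double = 0) {
    rocket.velocity += (gravity + boost) * dt
    rocket.height = max(0, rocket.height + rocket.velocity * dt)
}

var lander = Rocket(height: 100, velocity: 0)
step(&lander, dt: 1)            // free fall for one second
step(&lander, dt: 1, boost: 3)  // fire the booster
```

The Boost button in our scene plays exactly the role of that boost parameter: a burst of upward acceleration fighting gravity.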

It wasn’t the most sophisticated lunar lander-style game, but it’s a start!

Today’s session

Today we’re going to add a couple other wrinkles, maybe some obstacles on the ground to avoid. This will mean adding more controls.

We’re also going to hook up certain triggers and actions that get sent so that, in the future, we can take advantage of these events in our code.

This is how you would build an app to react to things that happen in your augmented reality scene.

If you want to learn more about notification triggers and actions, and see some really nice examples of them in action, you should check out Building AR Experiences with Reality Composer.

This was a session at WWDC 2019 and it goes into a bit more detail (and a bit more code).

Let’s dive in and get this rocket landed!

See you today at 1pm!

Coding at Home: June 5th, Reality Composer Lands a Rocket

Tune in live with us today at 1pm, Irish time, for some more fun with Reality Composer!

Recap: We Landed a Rocket!


We landed a rocket!

It wasn’t the prettiest, but we managed to land our rocket on the giant pink planet floating above our scene yesterday.

We used a combination of behaviors: y-axis change, followed by a z-axis rotation and another y-axis change.

Stuck in the pink planet!

It got a little disturbing, perhaps, when our math was off and the rocket embedded itself in the planet, but not bad for our first effort.

Today’s session

The rocket launching (and landing) gave me an idea.

Let’s try and build a simple game with Reality Composer. One that takes advantage of behaviors and the physics world in our scene.

So we’re going to try and build a lunar lander-style game!

Picture of 1979 Lunar Lander game by Atari, image from https://en.wikipedia.org/wiki/Lunar_Lander_%281979_video_game%29

I imagine it will be pretty simplistic, but let’s start building and see what we can create.

See you at 1pm!