
Solving Problems the Swift Way

Recently, I was asked to speak at SwiftCrunch, the first ever Swift hackathon. I gave a talk on solving problems using idiomatic Swift; that is, how do we solve problems "the Swift way"?

What's really key – fundamental to both my presentation and my belief about Swift – is that we, as a developer community, are going to face problems in Swift that we are already familiar with. The first time you go to implement UITableViewDataSource in Swift, you're going to be solving what's likely a problem you've solved before in Objective-C. This time, you're using Swift. The naïve approach to solving this familiar problem would be to use a familiar solution, but that would be a missed opportunity. Swift presents many new language features and many new ways to solve existing, familiar problems. It would be a shame not to explore those new solutions to see if maybe some of them are better than the Objective-C ones.
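
To make that concrete, here's a toy illustration (my own, not from the talk, with a hypothetical Model type): building a list of titles.

struct Model { let title: String }
let models = [Model(title: "One"), Model(title: "Two")]

// The familiar solution, translated straight from Objective-C:
var titles: [String] = []
for model in models {
    titles.append(model.title)
}

// The same problem, solved with the tools Swift gives us:
let mappedTitles = models.map { $0.title }

Neither is wrong, but the second embraces what the new language offers.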

So here's my presentation. My slides are online, too. Please send any feedback you've got!


Sharing is Selfish

OK, OK, not all sharing is selfish, of course. A more accurate headline would have been Sharing Can Be Selfish, but I could have written Four Mind-Blowing Reasons Why Sharing Makes You Rich, so count your blessings.

So let's talk about the selfish benefits of sharing knowledge. To do so, we'll have to define what that actually means.

Sharing knowledge. Hmm.

I think that the benefits of sharing knowledge for a price are pretty clear: you get paid. This includes people who write books, professional scientists, and those creepy bastards at Experts Exchange. So for the sake of argument, let's limit our discussion to freely sharing knowledge. This would include, for example ...

  • Releasing software under an open source license.
  • Contributing to existing open source software.
  • Posting answers to Stack Overflow questions.
  • Writing blog posts, even if your blog doesn't have ads.

I'm sure there are others, but these are the big ones.

(Aside: what's really interesting to me is that the first two, probably the most important ones, are freely giving away the primary product of software development. To my knowledge, this is truly unique to the software development industry. Designers don't typically open source their PSDs, civil engineers don't open source their building designs, and lawyers don't open source their law research. So when we talk about freely sharing knowledge, I think that it's awesome that this is occurring in the software development industry at a rate that is unprecedented in human history.)

So what are the benefits of sharing? What's in it for you? I've narrowed it down to four key benefits.

Exposure

First up is the most obvious: exposure. When you share what you know, you put your name out there. You get Twitter followers. You get GitHub stars. You get a higher PageRank. Maybe some new RSS subscribers to your blog. Who knows. The point is that you get your name out there.

Why does this matter? Well, never underestimate the power of ego, but let's talk about tangible benefits. To do so, let's consider some examples.

My former employer, Teehan+Lax, gives away tools that they've developed – primarily, the design source files for iOS interfaces. These have been used by thousands of developers all over the world and have helped make Teehan+Lax a household name in iOS design. These templates are even integrated into Sketch 3. Now, when someone out there needs an amazing app design, they know exactly who to contact.

Too abstract for you? OK, well consider objc.io and NSHipster – two sites that were created in order to give away knowledge to the iOS developer community. Their organizers are now able to use their popular sources of information in order to promote books that they've written. By sharing some knowledge for free, they can use their sites to sell more of their books. Super-awesome!

Validation of Ideas

This is actually one of my favourite benefits of sharing knowledge. When you share an idea, there are precisely two scenarios that may unfold:

  • Your idea is awesome. You thought so already, but now you know for sure.
  • Your idea could be improved. People point this out, and now you've learned something.

Over time, by exposing ideas to the world, you end up with better ideas. If you open source a component of an app that you've built, and someone points out a flaw, then your app just got better. Nice.

There is a danger, of course, in sharing ideas like this. What if someone really hates your idea? You could end up being ridiculed. After all, the internet is full of terrible, terrible monsters.

This will always be a danger, but you don't have to grow thick skin in order to be confident in sharing your ideas. Just follow these three steps in order to create a bullet-proof idea:

  1. State your assumptions.
  2. Explain what you tried first, and why it didn't work.
  3. Explain what you ended up with, and why you think it's the best solution for your problem.

By explaining how you ended up at an idea, you make it very likely that other developers will offer constructive criticism. Maybe one of your assumptions is incorrect, or maybe your solution isn't the best because you were unaware of a helpful API. If you explain how you arrived at a solution, then others can explain where you went wrong. It's like showing your work on math homework – even if you end up at an incorrect answer, at least you get partial credit for using the correct process.

In any case, this three-step process brings us to our next benefit of sharing.

Research

Here is a key one, which heavily influences me when I teach. To illustrate how this benefit works, let me tell you a story.

Last year, in the lead-up to the iOS 7 launch, I wrote some blog posts for Teehan+Lax. One of them was about the new custom UIViewController transitions API. This was a topic that I had identified as a great opportunity to write about: there was no WWDC sample code demonstrating how to use the API and, frankly, the WWDC presentation was very confusing.

I spent time investigating how to use the API, to understand its design and to test its boundaries. We released the blog post and its accompanying GitHub project, both of which became important sources for someone learning this new and confusing API.

Importantly, I was now an expert in this API. Later, when I had to write a custom view controller transition for a project at work, I was able to draw upon that knowledge and complete the task quickly and accurately.

Often, when I begin to write about a subject, I don't really know what I am talking about. But in trying to explain the subject, I identify the gaps in my understanding, which makes it easy for me to fill them in. By sharing knowledge in well-informed blog posts, anyone can help teach themselves, with the benefit to others as a happy side-effect.

Reciprocity

This is the final benefit to sharing knowledge, and it's one that I used when writing this blog post. I had a few ideas about the benefits of sharing, but I wanted to verify those ideas and to get some more.

I was able to just ask Twitter what they thought and get people to give me their ideas, for free. Why would anyone answer some asshole on Twitter? Well, there is a concept in evolutionary biology called Reciprocal Altruism. The idea is simple: you scratch my back, I'll scratch yours.

People who know me know that I share knowledge, and are more likely to share knowledge with me. So the next time I need ideas for a blog post, or a Stack Overflow question answered, or a GitHub issue clarified, I can rely on that social support network. Cool.

Conclusion

I've laid out the four main selfish reasons that it makes sense for you to share what you learn. Of course, not only does it help you when you share, it helps everyone. And if everyone gets better at this software development thingy, you'll get better, too. Rising tides lift all boats, after all.

I've been a long-time advocate for sharing what we learn, while we learn it. The fact is that at the very moment you acquire some piece of knowledge, you have a unique state of mind. You are undergoing the mental process of transitioning from ignorance to understanding and, I believe, are uniquely qualified in that moment to teach others what you have just learned. You remember the exact state of mind you had before it "clicked" and can share the mental process that led to that revelation. Every developer out there should have a blog where they write about things that they – until very recently – did not understand.


The Redemption of the Almost-Gamer

A few years ago (Jesus – has it been that long?), I wrote a blog post about being an "almost gamer", in which I discussed something that John Siracusa brought up on Hypercritical. In his followup on the subsequent episode, he actually read my post, in part, and replied on air.

Basically, the idea is this: if you accept that video games are an art form, then it follows that they can be appreciated. However, because gaming demands skills that not everyone has (like good hand-eye coordination and a finely developed 3D visuospatial sketchpad), there are many people in the world who can never appreciate this form of art. With practice, one can become a connoisseur of wines or films, because the skills necessary to appreciate those things – taste, sight, and hearing – are very common.

Anyway, I wrote the following.

John touches on this in the second part of his discussion; he describes how some games, such as Zelda, have a soft ramp-up to get players familiar with the game mechanics. He considers this, in some ways, to be cruel because players of the games who don't have the skills necessary to beat them are tricked into believing they do. Eventually, the game gets too hard; he said he didn't know how someone would feel once they get to the part of the game where it's too hard because he's never had that experience. Well I have, and it kind of sucks. In fact, it's why I stopped playing video games.

I stopped playing video games in my early teens. I mean, I've played a few games in the years since then – Portal, Super Smash Bros, Guitar Hero, and Journey – but I would not call myself a gamer by any stretch.

Take one of my favourite childhood games, The Legend of Zelda: A Link to the Past. I must've started new files on that game more than a dozen times, always giving up once I got past the first three, easy dungeons. I still enjoyed playing, but would always dread the inevitable "this game is too hard and I hate playing it now" phase. Pretty anticlimactic.

Flash-forward to today. I, a 26-year old, university-educated adult, purchased a Nintendo 3DS and Pokémon Y (Rated 7 and up) so that I could play with my friends online. While I had beaten another game in the series as a kid, doing so had been a real struggle. Not so, this time. This time, I played the whole way through without breaking a sweat. Huh.

Then I heard about The Legend of Zelda: A Link Between Worlds, which is a sequel to A Link to the Past. Nostalgia took over and I had to have that game.

Well, tonight, I've beaten it, stumbling only at the final boss.

I don't know what's happened – two video games beaten in a row. Maybe John Siracusa was right when he insisted that I am a gamer. Maybe I had the skills all along, but my childhood self's short attention span or aversion to being defeated spoiled games for me at the time. Now as an adult, I'm finally able to keep going, even through the difficult parts, and complete them. On the other hand, though, maybe games have gotten easier. I retract that – games definitely have gotten easier. What I mean to say is that maybe I'm able to enjoy – and complete – video games now because they're easier. I don't know.

So that's it. Two video games beaten on a brand-new gameboy. I don't know what I'll play next. Maybe it'll be super-difficult and crush my spirits again, or maybe it'll be like Pokémon Y and A Link Between Worlds – not particularly difficult, but still enjoyable. We'll have to wait and see.


Plans for Swift Books

So WWDC happened, and we were all blown away by Swift. Yay Swift! And one of my reactions afterward was "I want to write a book on this", because that's apparently what I do now. So I created this page where you can register for updates when the book is launched.

Cool, cool. But why, people would ask me, am I writing a book when Apple already has two ebooks on Swift available for free? Well, I had to think really hard about that. To be honest, I just wanted to write something cool – largely as an excuse to get really good at Swift. But after reading the Apple books and getting my hands dirty, it came down to this: Apple's resources are really good at describing the language, but they're not written to teach practical knowledge. The books, while excellent, read like textbooks.

I believe this is because Apple – like everyone else in the world – still lacks practical experience writing production-ready code in Swift. Even the Swift engineers don't know what new patterns are going to emerge from the community over the coming months and years.

So that's what I want to write about: practical Swift. How do we, as iOS and OS X developers, solve familiar problems with new tools? It would be a shame to ignore this opportunity and just continue to write Objective-C, but with Swift syntax.

But there's a problem: I don't yet have that practical experience. Hmmm.

About the same time I came to this conclusion, a bunch of people on Twitter (bless you, Twitter people) asked if I was going to update my existing Your First iOS App book for Swift. Initially I said no, but then I had an idea.

What if I followed my old book's instructions and gained experience building a feature-complete app, in Swift? With Core Data and everything? That'd be really cool, I could write about it, and better yet, I'd get the practical experience I need for my general Swift book. Sweet!

One catch: if I update my book for Swift, then it won't be available to people who still want to use it to learn Objective-C. My solution is to create a second book on LeanPub; the Objective-C one will continue to be available, at least for now. Because the Your First iOS App book was actually the product of a successful crowdfunding campaign, I feel it would be wrong to make a new book based on the same material without compensating those who already bought the Objective-C version. So that's why, once the Swift version launches, I'll be passing out promo codes good for a free copy of the new book. If you don't want to use it, then pass it along to a friend!

One more thing: I've also gotten questions surrounding my ReactiveCocoa book. I will be updating it for Swift, but only once The Great Swiftening has been completed.

So that's it. My initial thought to write a book on Swift has led to updated versions of two existing books and a whole new one. I'm really, really excited about the next few months, if a little daunted by all the work I have to do.

Oh, and one more thing. My friends over at objc.io are launching a book on functional programming in Swift, which you should also consider. It looks super-promising. Also, if you're looking for another Swift resource right now, please do check out Daniel Steinberg's A Swift Kickstart on the iBooks Store.


SwiftCrunch Hackathon

After WWDC, once Swift had been announced, I was contacted by some developers in Poland who wanted to organize the first ever Swift hackathon. Really neat idea – and they wondered if I was interested in giving the keynote.

I accepted and am just finishing up the SwiftCrunch hackathon now. It's been a fantastic experience – I've met a lot of great developers and had a great time.

Here are my slides from the keynote. They don't give a tonne of context, but the talk was recorded, so I'll post the video once it's up.

There were some really cool projects that developers came up with – not just apps, but tools for developers to use to make their lives easier. One of the most interesting projects is SwiftInFlux, a community-based project for cataloguing the changes that Apple is probably going to make to Swift before 1.0 ships.

My project reproduces Facebook Slingshot's UI for presenting notifications, and it's available on GitHub.

It was a great learning experience – I filed some radars and learned more about writing idiomatic Swift. I'll put together a blog post with some of my findings later.


Lazy Property Setup in Swift

A few weeks ago, I was talking with my friend Robert about Swift. He had a problem. He wanted to create a property of a class that is not an optional, but depends on self for its creation.

The issue revolves around initializers in Swift. If a property is not optional, it must be set before the super's initializer is called. However, in order to refer to self, the super initializer must be called first. It's a chicken-and-egg problem: I need to set my properties before calling super.init(), but in order to set my properties, I need to refer to self, which I can't do until I've already called super.init().
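
Here's a contrived sketch of the bind, with hypothetical types (this won't compile, which is the point):

class Helper {
    init(owner: AnyObject) { /* ... */ }
}

class Base {
    init() { }
}

class Child: Base {
    var helper: Helper

    init() {
        // We must assign helper before calling super.init()...
        // ...but the compiler rejects this line:
        // error: 'self' used before super.init call
        helper = Helper(owner: self)
        super.init()
    }
}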

Hmmm.

I've come up with a pretty good solution. Consider a UIDynamicAnimator property on a view controller. I need to initialize it with a reference view of self.view, but I'm in the same situation as Robert was. My solution, which came from a talk with Dave Addey at the WWDC labs, was to use a @lazy property set to a self-evaluating closure. The closure returns the initialized dynamic animator, but since the property is lazy, it isn't evaluated until the first time it's referred to.

@lazy var animator: UIDynamicAnimator = {
    // This closure doesn't run until animator is first accessed,
    // by which point self (and self.view) are fully initialized.
    let animator = UIDynamicAnimator(referenceView: self.view)
    return animator
}()

The downside, as I can see it, is that @lazy properties must be var and not let, so you lose some Swift-ness there. Still, it's better than having an optional type.


Reflections on Art Basel 2014

My employer, Artsy, sent me to the world's largest art fair last week. It was my first art fair, and let me tell you, I was quite overwhelmed. Three hundred art galleries, each with many different artworks. Two stories of an exhibition centre, plus another floor for art that "transcended the limitations of typical art fairs", plus another warehouse for performance art, plus an entire design show.

Holy. Shit.

I don't have a sophisticated appreciation of art – I'm very new to the art world at large, but have been gaining an eye for good photography over the past few years. I thought I might be prepared.

It started easily enough. I was looking at art I like, turning it over in my head, trying to discern some appreciation from it. Then I started asking questions like, "what even is art?" Then things got trippy. "Can you ever not editorialize art?" Oh boy. Questions I wasn't prepared to answer myself.

The art fair handed out a handy book claiming to have questions and their respective answers. What is art? Well, we don't know. All we know for sure is that art is appreciated. And so on.

It didn't help as much as I had hoped. I was walking around in a haze. Confused by what I saw. Disturbed by some things, aroused by others. It was a confusing and daunting task, to just go and appreciate art. Not just any art, but art that everyone agrees is awesome.

And that's when I got it. That's when I realized what the fuck was going on. Here I was, surrounded by the best, coolest art in the entire world, and I had no idea where to even begin.

You know what else used to be unapproachable, accessible only to the upper crust of society? Music. It used to be that the only opportunity to hear and appreciate music was a live performance.

Technology changed that. We invented the phonograph, and then the 8-track, and the CD and the mp3 player and Spotify and holy shit now everyone loves and appreciates music. Wouldn't it be so cool if we could do that for art?

The art world is hella intimidating. It's unbelievably unapproachable for lay people like me. And that sucks. And that's why the work I'm doing at Artsy matters. I want to make it suck less, because I believe in the importance of art, even if I'm not a sophisticated appreciator yet.

When I set out in my job hunt earlier this year, my most important criterion was that the company had to do good in the world. I feel like I've really found that here. It's why I'm excited to go to work in the mornings, even in the darker days of my depression. I feel like I'm making a positive change in the world, and that's something that neither salary nor anti-depressants can give me. I feel motivated to work because it's intrinsically important.


Me on MacVoices

I was delighted to be interviewed by Chuck Joiner down in San Francisco to discuss Artsy, Swift, and other things. Check it out.


Objective-C is Not Easy to Learn

I read this blog post by Aaron Hillegass this morning and was immediately disappointed.

There are many things that I disagree with about this article, but there is one in particular that I took offence to.

Objective-C is easier to learn than Swift.

Really? Come on now. That's just silly.

Objective-C is a really simple little extension to C.

I'm disappointed by this statement, because it is simply not true. Objective-C is a massive pain in the ass to learn. It's a mix of language (with "weird" syntax), runtime (all that arcane knowledge), and frameworks (massive ones). Swift obviates the difficulty of the first two, which is awesome.

Let's consider a simple example.

NSLog(@"Hello, world!");

OK, so let's take a look at this. Why is NS there? Why not just log? And why is there an @ sign in front of the string? That's bizarre! Why doesn't NSLog conform to standard Objective-C syntax?
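
Compare the Swift spelling of the same thing (as of the current betas):

println("Hello, world!")

No prefix, no @, no semicolon required.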

Pedantic? Maybe. But I'm not the one claiming Objective-C is easier to learn than Swift. Let's take a look at another example. I want an array, called array, of the numbers 1 to 5. Let's contrast.

NSArray *array = @[@(1), @(2), @(3), @(4), @(5)];

Holy shit. Why is that asterisk there? (Yeah, explain pointers to a newcomer to programming. Have fun.) Again, what's with all these @ signs?! It makes no sense! Why doesn't this look more like the following?

var array = Array(1...5)

But then, that's Swift.

I'll tolerate people saying that "Swift is complex", either because it's unfamiliar or whatever reason you have. But come on. Objective-C being easier to learn? Give me a break.

As educators, it's our job to put ourselves in the shoes of a beginner and see things through a newcomer's eyes. I don't see that happening in this article.

Aaron Hillegass is an amazing developer and business person. I have admired and looked up to him, and the Big Nerd Ranch, for a long time now. However, this post feels like it was written out of fear. I think that it is a disservice to iOS newcomers.


Whelp.

Last week, I attended WWDC in San Francisco. I was in the Bay Area for two weeks in all and spent a lot of time meeting people I know from the Internet.

Since the last time I was at WWDC, I've published two books and a second edition of another, started contributing to a number of open source projects, written a tonne of iOS 7 tutorials, started another podcast, and doubled my number of Twitter followers. Basically, it was a busy year.

In hindsight, it shouldn't have come as such a surprise when I ran into a lot of people who knew who I was. The common refrain from the week was something like:

Hi, nice to meet you. I'm Ash.

Ash... Furrow?

(sheepishly) yeah?

(My friend and colleague Orta says I have a branding problem, but I just can't see myself introducing myself as "Ash Furrow".)

One person who recognized me asked for a photo, since she didn't think her friends back in Melbourne would believe that she had met me.

So yeah. Last week was a big adjustment. I'm not used to attention in the real world, and it's a little uncomfortable to be honest. From a personal perspective, it's something that I think I'm going to have to get used to. And not all attention is desirable – I pissed off a lot of people last week with some strong opinions on Apple's new programming language.

I think what I'm most concerned about is remaining authentic. I answer a lot of questions people have about collection views, ReactiveCocoa, and other topics I've written about, usually over email or Twitter. I always try to answer each question as best I can, but that obviously doesn't scale, and lately, I've been falling behind. It's going to be an interesting problem going forward, and I'm not quite sure what I'm going to do.
