How we develop

We can only expect to get better at what we do if we open it up to criticism. And in that spirit of openness I thought I would share a bit about Mokriya’s software development process. Maybe I can inspire some of you to try this at your workplace.

What’s Mokriya?

Mokriya is a software development company based in Cupertino, but we have more than 30 employees around the world. The large majority of us work 100% remotely.

I’ve been part of Mokriya for 8 months now, as an iOS developer.

How do we synchronize?

Each product has a dedicated team, so usually I only have to sync with three to six people. Finding time for a quick half-hour call every day is easy.

But syncing with the internal team is as important as syncing with the client, and for the past few months that has also been a daily call for me.

So I have at least two calls every day, 30 minutes each. That single hour saves us many more that would otherwise be spent reading and answering email threads.

In a remote environment, good communication is absolutely necessary to keep speed and quality high.

How do we distribute work?

At our daily syncs with the client we discuss new tasks, ideas, and changes in the nightly build. That gives us more context on the why behind certain tasks and helps surface technical problems early. It keeps the client in the loop, and the team gets time to ask questions.

The other daily sync is with the internal team. That’s when tasks are discussed and assigned.

The rest is Slack and some quick calls for rubber ducking.

Git-flow is perfect for the distributed workplace

By using git-flow we are able to keep conflicts to a minimum and keep each feature isolated on its own branch until it's ready to merge back into develop.
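As a rough sketch of the day-to-day flow (these are plain git commands, and the branch name is just an example), a git-flow feature cycle looks like this:

    # Branch off develop so the work stays isolated
    git checkout -b feature/chat-ui develop

    # ...commit as usual...

    # Merge back with an explicit merge commit, then clean up
    git checkout develop
    git merge --no-ff feature/chat-ui
    git branch -d feature/chat-ui

Because each feature lives on its own short-lived branch, two remote developers rarely end up editing the same code on the same branch, which is where most merge pain comes from.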

Apple Watch's Missing Features

It's painful to release the first version of a product, even if —as Reid Hoffman says— the absence of pain means you waited too long. Apple Watch, for example, is evidently being released before several sensors intended for inclusion could be perfected, reducing its impact as a health-aiding device [1].

But for many —especially those in industry circles— even potentially transformative health benefits are mysteriously unexciting, and Apple Watch has struggled to convey its utility to the technorati. For Apple, it's a typical new product launch: pundits decry a lack of utility or functionality while eyeing a broader market whose reactions they cannot anticipate. Indeed, it may well be that Apple doesn't care at all about how nerds react, since they target regular users instead.

That said, I'm surprised that Apple Watch didn't launch with a feature that would have made it indispensable to nerds, one described by former Apple employee and usability design legend Bruce Tognazzini in 2013 [2]:

The watch can and should, for most of us, eliminate passcodes and passwords altogether on iPhones, and Macs and, if Apple’s smart, PCs: As long as my watch is in range, let me in! That, to me, would be the single-most compelling feature a smartwatch could offer: If the watch did nothing but release me from having to enter my passcode/password 10 to 20 times a day, I would buy it. If the watch would just free me from having to enter pass codes, I would buy it even if it couldn’t tell the right time! I would happily strap it to my opposite wrist!

To clarify: when one uses one's fingerprint to unlock an iPhone, it can tell the Watch that "this user is authenticated"; then, so long as the Watch doesn't leave the wrist (which its sensors can detect), that authentication can persist, and the Watch can vouch for its wearer to any nearby trusted device.
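A minimal sketch of that trust chain, in Swift (every type, name, and threshold here is hypothetical; none of this is a real Apple API):

    struct WatchSession {
        var fingerprintAuthenticated: Bool  // wearer unlocked the paired iPhone via Touch ID
        var onWrist: Bool                   // wrist-contact sensor still reads true
    }

    enum UnlockDecision {
        case unlock
        case requirePasscode
    }

    // A nearby Mac or iPhone asks: can this Watch vouch for its wearer?
    func shouldUnlock(_ session: WatchSession, distanceInMeters: Double) -> UnlockDecision {
        let inRange = distanceInMeters < 3.0  // assumed proximity threshold
        guard session.fingerprintAuthenticated, session.onWrist, inRange else {
            return .requirePasscode
        }
        return .unlock
    }

Any real implementation would presumably live in secure hardware and the device-pairing protocol rather than in app-level code; the sketch only captures the logic Tognazzini describes.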

Are hashtags & @names falling out of favor on Twitter?

This was originally posted on Quora, where Mokriyans answer questions about technology, design, development, and more.

For years, Twitter itself has been struggling with the problems of hashtags and @names, which, while effectual, are kludgy and off-putting to new users. In 2014, a Twitter executive seemed to indicate their coming demise.

That executive is no longer at Twitter, and both hashtags and @names remain a constituent part of the service. Hashtags, however, have been rendered less useful by Twitter search. If one tweets "Having fun on vacation!" one's tweet will surface in searches for "vacation" in the same way as would a tweet that said "Having fun on #vacation!" A hashtag is clickable and can be a badge of sorts for tweets related to live events or conferences or narrow topics, but beyond that utility, it doesn't alter your tweet's distribution.

Beyond the evolution of Twitter's product functionality, hashtags and gratuitous @mentions do seem to be less popular than they were 5 years ago. If they're out of favor, it's worth noting that it's mostly among certain sets; as with all such conventions, different communities have different standards. But it does seem to be the case that they're gauche among the cool kids these days.

The reason hashtags and the @names of celebrities and such are out of favor among the "cool" is that they signal a possibly thirsty desire for attention, acknowledgment, or participation. Nothing is more anathema to cool people than seeming to want attention, however much they may want it. Visible alterations to a tweet that exist to get @Tim_Cook's attention or to distribute the tweet to people searching for "#vacation" seem desperate, socially needy.

And there's also a slightly rude quality to such tweets, as though you're having dinner with friends but spend the meal shouting across the restaurant to get a celebrity's attention.

Rooms to Go

Ideas —especially in consumer software— tend to be worthless. As is well-known, execution, distribution, and lots of other prosaic factors are decisive. Being realistic about competitive landscapes, model necessities, and market forces is hard when you're excited about a product idea, but as the man said: the better part of valor is discretion. That's why, earlier this year, we at Mokriya decided not to proceed with developing a chat app we called Rooms. Today, Facebook launched a chat app called Rooms, and it gave us a big kick, so we thought we'd share the story.

We had been interested in creating useful, persistent spaces in which people could gather and chat. Twitter has in some cultural senses supplanted IRC, but it's done a poor job of replacing it. Most of us still retreat to private chats in order to interact freely —as public performance necessarily brings public judgment— but these spaces often don't serve our needs. They're difficult to run: you have to invite members and manage membership, and the formalization of relationships is burdensome.

We wanted:

  • ad-hoc spaces that could be organized around a group (just me and my friends), a topic (SF Cyclists), or a place (a park, a bar, a store)
  • spaces that could be public or private
  • optionality on identity and anonymity for all users, but also easy blocking/reporting so no one has to deal with abuse
  • all of it optimized for mobile
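
As a hypothetical sketch in Swift (the names are ours, purely illustrative, not code from any shipped app), the model we were circling looked something like:

    enum RoomAnchor {
        case group(name: String)                            // just me and my friends
        case topic(String)                                  // "SF Cyclists"
        case place(name: String, lat: Double, lon: Double)  // a park, a bar, a store
    }

    enum RoomVisibility { case publicRoom, privateRoom }

    struct Participant {
        let displayName: String   // a chosen identity…
        let isAnonymous: Bool     // …or anonymity
        var isBlocked = false     // easy blocking/reporting
    }

    struct Room {
        let anchor: RoomAnchor
        let visibility: RoomVisibility
        var participants: [Participant] = []
    }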

Our model was the real world: how and where do people gather at their most happily social? Well: parties with friends; bars; events, which typically have topics (sports events, conventions, meetups, festivals); and certain locations.

We wanted to mirror the ease of choosing, navigating, and enjoying those spaces as much as possible, so we started playing around with some designs for an app we code-named… Rooms.

The iPad's Plateau

Some products drive strategy, and some products reflect strategy. The former are the raison d'être of strategies —their purpose, their point— while the latter are passengers who ride for free on strategic choices already made. Products that drive strategy tend to be unique, while products which ride strategy tend to be derivative.

With iPad, Apple has a curious dilemma. On the one hand, iPad was plainly felt by Apple to be the successor to Macintosh: a "bicycle for the mind" product whose aspirational marketing suggests a transformative, creativity-enabling tool for artists, scientists, explorers, and makers of all kinds [1]:

Fig. 1: Apple's high-profile "Your Verse" campaign seems to have been a failure, and has been cancelled.

On the other hand, the reality of iPad is much more prosaic: we use iPads in our homes more than on mountains; we watch videos and read on them more than we scuba dive and save coral reefs with them. While iPad has enabled lots of creativity, it isn't broadly perceived by consumers as a transformative cultural phenomenon, or perhaps, to be more precise: consumers aren't buying iPads to empower themselves creatively. They buy them for the couch.

Apple is marketing a message that doesn't resonate. It doesn't suggest to those who see it how iPad should matter to them, what it can bring to their lives. But worse, iPad itself no longer seems interesting: little if any organic excitement about iPad Air 2 and iPad mini 3 could be perceived after their announcement. Their new features didn't grab many ordinary folks.

And of course, iPad sales are weak:

Fig. 2: iPad sales are down, and iPad sales growth is not good. See: Tim Cook, in his own words, on why the iPad has a bright future.

While the usual excuses are accurate (tablet replacement cycles are long, and larger iPhones cannibalize some would-be iPad sales), they don't fully explain the plateau.

Strategy & the Apple Watch

I've seen many sound analyses of the Apple Watch —or rather, of what little we know about its features and functions— already. But until its launch, what's most fascinating about it isn't how much RAM it has or whether its apps will really be displayed in a grape-bunch of moving icons, but rather what it tells us about Apple's strategies, especially for creating, entering, and attempting to own new categories.

First, a refresher on what strategy really means for a company, from Michael E. Porter's invaluable essay What is Strategy?:

Competitive strategy is about being different. It means deliberately choosing a different set of activities to deliver a unique mix of value. [1]

The activities Apple chooses to undertake in order to deliver a unique mix of value are well-known:

  • Control of —but not ownership of— supply chain, based on capitalization of supplier operations and equipment in exchange for exclusivity of supplier output, as well as relentless supply-chain improvement for greater and greater operational efficiencies
  • "General consumer" UX-oriented software development, but with a heavy emphasis on perfecting and polishing details to provided a premium and delightful product that errs on the side of well-executed simplicity over decently-executed multiplicity
  • In-house chip design —they have over 1000 chip designers!— with chip development heavily tailored to the software and hardware uses specific to Apple products, leading to more efficient chips (which allows Apple to use less RAM, get more out of batteries, etc.).
  • Retail outlets where Apple can showcase its products as it sees fit and control the context in which they're demonstrated, purchased, and serviced
  • Activities related to the aestheticization of technology; although this is decreasingly unique, recent hires and the soon-to-be-discussed development of the various Apple Watch bands are illustrative of the lengths to which they go

In general, these activities reinforce one another; in Porter's terms, it is the fit among them, more than any single activity, that makes Apple's mix of value difficult to imitate.

Interfaces, Empowerment, and Voice Interaction

Toward what end do interfaces evolve? What are the trade-offs in the problem-solving process that drives their evolution? And what can we anticipate in future interfaces based on these patterns of evolutionary exchange?

A famous image comes from Steve Jobs’ first public presentation of the iPhone, months before its launch.

Because it is an Apple event, Jobs notes only those interfaces which Apple successfully markets and sells [1]; although it remains chronological, absent are many other steps and branches of general development. A slightly more complete accounting of interfaces for human-computer interaction (HCI) might include:

  • punchcards
  • screens and command-line interfaces
  • the mouse-driven graphical user interface (GUI)
  • stylus-input interfaces (Newton, Palm, etc.)
  • touch-based interfaces
  • gestural or kinetic interfaces
  • voice interfaces

But that would have made for an ungainly slide.

This list seems partially progressive, and there is a temptation among technologists to assume progress is made towards ends; there is also a mistaken tendency to assume that all progress is pure gain. But the latter assumption does not always hold: while one strains to imagine how punchcards could be preferable, the command-line, text-only interface has strengths and weaknesses different from the GUI's, and indeed it lives on in Terminal and many other places where its strengths are useful.

In design as in everything, every solution to a problem introduces new problems [2]. For much of the history of user interfaces, the problems introduced by these solutions have been familiar ones; with each new interface might come one or more of the following:

  • lower information density;
  • slower speed of operation, especially for experts;
  • higher computational costs in rendering, effects, structure, abstraction, the masking of the machine beneath the interface (indeed, computational constraints have often gated UI progress); and/or
  • an increase in overall interface ambiguity, which can amplify development costs and design difficulty.

In general, each new interface has accepted some of these costs in exchange for becoming easier to learn and accessible to a broader range of users.

Designer Duds: Losing Our Seat at the Table

If design hadn’t triumphed by 2012, it had by 2013. Three years after launching the iPad, Apple was the world’s most valuable company, and even second-order pundits knew why: design. Steve Jobs’ remark that design was “how it works” had achieved what seemed like widespread comprehension, and recruiting wars for top designers rivaled those for top engineers. Salaries escalated, but cachet escalated faster; entire funds emerged whose only purpose was to invest in designer founders, and with money and esteem came the fetishization of design, the reduction of designers into archetypes, the establishment of trade cliques and the ever-increasing popularity of trend-ecosystems.

There were valedictory encomia about the power of design to deliver better products and therefore better commercial outcomes for companies and better utilitarian outcomes for users. In his rather more sober but nevertheless remarkable talk at Build 2013, David Cole noted that thanks to Apple,

Taking a design-centric approach to product development is becoming the default, I’m sure it will be taught in business schools soon enough… This is a trend I’ve observed happening across our whole industry: design creeping into the tops of organizations, into the beginnings of processes. We’re almost to the point, or maybe we’re already there, that these are boring, obvious observations to be making. Designers, at last, have their seat at the table.

For those of us who believe in the power of design thinking to solve human problems, and to a lesser extent in the power of markets to reward solutions when the interests of consumers and businesses are correctly aligned, this was invigorating news. Parts of the technology industry spent much of the 1990s and even the 2000s misunderstanding what design was and how it could help improve products. There was a time, after all