Tuesday, May 11, 2021

REPOST: The Power and Pitfalls of Gamification / Katy Milkman Ideas 05.04.2021 08:00 AM

 When tech companies first adopted the technique, there was hardly any science supporting it. Now researchers know when gamelike features help—and when they hurt.

Illustration of gold trophy topped with bored office worker
Illustration: Sam Whitney; Getty Images

This story is adapted from How to Change: The Science of Getting from Where You Are to Where You Want to Be by Katy Milkman.

When you walk 10,000 steps in a day, your Fitbit rewards you with a jiggle and some virtual fireworks, giving you a reason to pause and smile with pride. When you practice a foreign language on Duolingo multiple days in a row, you earn a “streak” and are encouraged to maintain it, giving you an extra reason to strive for repetition. 

When companies, teachers, coaches, or apps add features such as symbolic rewards, competition, social connections, or even just fun sounds and colors to make something feel more like play, they’re relying on “gamification” to enhance an experience that might otherwise be dull. I’d wager that most of the apps on your phone use some element of gamification, but we also see gamification in our workplaces and from our health insurers.

Gamification first took off more than a decade ago. At the time, there wasn’t much evidence for its value; the concept just seemed to make sense. Business consultants promised organizations that gamifying work could more effectively motivate employees, not by changing their work itself, but by changing its packaging, and making goal achievement a bit more exciting as a result (“Yes! I earned a star!”). 

Technology companies like Cisco, Microsoft, and SAP, for instance, found ways to gamify everything from learning social media skills, to verifying language translations, to boosting sales performance.

Today, thanks to science, we know a lot more about when gamification really works, and what its boundaries seem to be. Beyond the gamified apps and software we use to learn new skills, companies like Amazon and Uber now deploy it to boost worker productivity. But to get the results we seek, in our own lives and in the workplace, it’s important to understand when gamification will work—and when it will only make matters worse.

In 2012, Jana Gallus, a brilliant young economist studying for her doctorate at the University of Zurich, learned of a problem plaguing Wikipedia—and saw an opportunity to run an early test of the value of gamification. Despite the popularity of the 50-million-entry online encyclopedia, available in over 280 languages, Gallus discovered that its top-performing editors were leaving in droves. 

And since the so-called Wikipedians who keep the site’s articles on everything from Game of Thrones to quantum mechanics accurate and up to date don’t get paid a dime, the organization needed to find a way to keep its top editors engaged with the sometimes-monotonous task of curating online content without offering them money.

In the hopes of reducing turnover, Wikipedia let Gallus run an experiment with 4,000 new volunteer editors. Based on the flip of a coin, she told some deserving Wikipedia newcomers that they had earned an accolade for their efforts, and their names were listed as award winners on a Wikipedia website. They also received either one, two, or three stars, which appeared next to their username, with more stars allocated to better performers. 

Other newcomers who had contributed equally valuable content to Wikipedia but came out on the other end of the coin flip got no symbolic awards (and weren’t told that such awards existed). Gallus thought the awards would make a monotonous task feel a bit more like a game by adding an element of fun and praise for a job well done.

She was right. The volunteers who received recognition for their efforts were 20 percent more likely than those who earned no praise to volunteer for Wikipedia again in the following month, and 13 percent more likely to still be active on Wikipedia a year later.
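Gallus's setup is a textbook randomized field experiment, and the logic is easy to sketch in code. The retention probabilities below are invented for illustration (loosely echoing the reported lifts); they are not the study's data:

```python
import random

random.seed(7)

# Invented retention probabilities -- illustrative only, loosely
# echoing the relative lift the study reported.
P_RETAIN = {"award": 0.48, "control": 0.40}

def run_experiment(n_editors=4000):
    """Coin-flip editors into award vs. control, then tally
    who is still active the following month."""
    counts = {"award": 0, "control": 0}
    retained = {"award": 0, "control": 0}
    for _ in range(n_editors):
        group = "award" if random.random() < 0.5 else "control"
        counts[group] += 1
        if random.random() < P_RETAIN[group]:
            retained[group] += 1
    return {g: retained[g] / counts[g] for g in counts}

rates = run_experiment()
print(rates)
```

Flipping a fair coin per editor is what licenses the causal comparison: any gap between the two groups' retention rates can be attributed to the symbolic award rather than to who the editors happened to be.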

Examples like this one might make gamification seem like a no-brainer: Why wouldn’t a corporation want to make work more fun? Despite Gallus’ exciting results, more recent research shows that as a top-down strategy for behavior change, gamification can easily backfire. Two of my Wharton colleagues—Ethan Mollick and Nancy Rothbard—ran an experiment that proved just that. 

It involved several hundred salespeople who had the somewhat boring job of reaching out to businesses and convincing them to offer coupons for discounted products or services that were then sold on their company’s website (think Groupon). The salespeople earned commissions for each coupon eventually sold online.

In an attempt to make this more exciting, Mollick and Rothbard worked with professional game designers to create a basketball-themed sales game. Salespeople could earn points by closing deals with customers, with more points awarded for bigger deals. Sales from warm leads were called “layups,” while cold calls were dubbed “jump shots.” 

Giant screens on the sales floor displayed the names of top performers and showed occasional basketball animations like a successful dunk. Regular emails updated the “players” on who was winning, and when the game was over, the winner got a bottle of champagne.

Employees on just one sales floor participated; the rest were left out. My colleagues then compared the trajectories of salespeople who played the game with those who didn’t. Though they’d had high hopes, Mollick and Rothbard were surprised to find that playing the game didn’t improve sales performance, and it also didn’t improve the way salespeople felt at work. But digging into their data further revealed a very interesting pattern.

The researchers had asked everyone in their game a set of questions: Did people follow the game? Did they understand the rules? Did they think it was fair? These questions were designed to measure which salespeople had “entered the magic circle,” meaning that they agreed to be bound by the game’s rules rather than the normal rules that ordinarily guide their work. After all, if people haven’t entered a game mentally, there’s no real point to it.

Sure enough, the salespeople who felt that the basketball game was a load of baloney actually felt worse about work after the game was introduced, and their sales performance declined slightly. The game benefited only the salespeople who had fully bought into it—they became significantly more upbeat at work.

My colleagues argue that their study highlights a common mistake companies make with gamification: Gamification is unhelpful and can even be harmful if people feel that their employer is forcing them to participate in “mandatory fun.” Another issue is that if a game is a dud, it doesn’t do anyone any good. Gamification can be a miraculous way to boost engagement with monotonous tasks at work and beyond, or an over-hyped strategy doomed to fail. What matters most is how the people playing the game feel about it.

Gamification may have worked so beautifully at Wikipedia in part because Wikipedians don’t get paid but instead come to the site as volunteers. And it’s relatively safe to say that volunteers for any organization want to be there and want to be productive, or else why would they be volunteering? 

Wikipedia editors devote time to the world’s largest online encyclopedia because they’re intrinsically motivated to help share knowledge widely, just as volunteers for the Nature Conservancy want to help the environment. So Wikipedians naturally have the goal that the site’s awards are designed to reinforce.

At its best, gamification seems to work when it helps people achieve the goals they want to reach anyway by making the process of goal achievement more exciting. When people fully buy into a game, the results can be impressive, durably improving volunteers’ productivity, boosting worker morale, and even, as seen in one recent study, robustly helping families increase their step counts. But gamification can tank when players don’t buy in. 

If a game is mandatory and designed to encourage people to do something they don’t particularly care to do (like achieving an outstanding record of attendance at school), or if it feels manipulative, it can backfire. Amazon seems to understand this: They’ve kept their gamification program entirely optional so employees who enjoy it can use it, but it isn’t imposed on anyone.

This latest science suggests it makes lots of sense for apps to continue gamifying our achievements, so long as they’re promoting goals we’re intrinsically eager to reach. But when it comes to using gamelike features to promote change we might not find so appealing, gamification doesn’t seem to be a workaround for more substantive solutions. 

While not every context is the right one, under certain conditions, gamification can make pursuing your aspirations feel more like play. And that is a powerful tool in any personal or professional quest for change.


Adapted from How to Change: The Science of Getting from Where You Are to Where You Want to Be by Katy Milkman; foreword by Angela Duckworth, in agreement with Portfolio, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © Katherine L. Milkman, 2021.



Thursday, April 8, 2021

American Cinematographer Magazine / Game On: Game-Engine Technology Expands Filmmaking Horizons / August 14, 2020 / Noah Kadner

https://ascmag.com/articles/game-on-game-engine-technology

--reposted--

“What directors want are iterations. They want to find all the possible challenges early, so the faster you can see a result, understand the issues and create another iteration, the better. Real-time engines make that process happen instantly. It’s just like playing a video game.”

https://cms-assets.theasc.com/Game-On-1_TTF_PATHFINDER_TR_01_PNG-copy.jpg?mtime=20200814090957

At top, for Season 8 of HBO’s Game of Thrones, The Third Floor offered real-time rendering during production — which, for example, helped represent a dragon (pictured here) as shots were composed, framed and performed. The company also provided virtual versions of story locations for preproduction “scouting.”

The Mandalorian image by François Duhamel, SMPSP, courtesy of Lucasfilm Ltd. Cine Tracer image courtesy of Matt Workman/LED volume photo by Mike Seymour. The Lion King unit photography by Michael Legato, courtesy of Disney Enterprises, Inc. Game of Thrones image courtesy of HBO and The Third Floor, Inc.

Game engines such as Unreal Engine and Unity were, as their umbrella term implies, originally designed for the development of real-time applications, aka video games. In recent years, advances in both hardware and software have ushered these engines into the purview of cinematographers. Game engines are now used to create many forms of visualization for filmmakers, including previs, techvis and postvis — and even final in-camera movement and imagery for such productions as The Lion King (AC Nov. ’19; see sidebar below) and The Mandalorian (AC Feb. ’20). 

For The Mandalorian, an LED-wall system known as “the Volume” allowed the filmmakers to capture actors in a photo-real (or nearly so) environment, in-camera and in real time, with the aid of Epic Games’ Unreal Engine.

Traditional 3D animation applications such as Autodesk’s Maya and 3ds Max have long played a significant role in visual effects. Their comparatively slower performance, however, has seen their benefit focused more on pre- and postproduction, where time is somewhat less critical than in live-action production. These 3D apps prioritize final image quality over performance; as a result, image render times can easily stretch into hours or even days per frame, depending on the complexity of the shot and the computational power of the hardware.

By contrast, game engines were initially optimized for speed first and image quality second, in order to support gameplay in real time, often at 60 frames per second or more. And in the past few years, major technical advancements in computers' graphics-processing units (GPUs) have enabled such engines to render production-quality imagery while maintaining their real-time speed. 

Commensurate with these hardware improvements, developers such as Epic Games (creators of Unreal Engine) and Unity Technologies (creators of Unity) have optimized their software for direct inclusion into the production pipelines of features and television. These changes are intended to support the crossover between traditional cinematography and computer-generated imagery — a dynamic that can serve filmmakers in a whole host of ways.

Matt Workman, a cinematographer and software developer, is working to erode even further the boundaries between filmmaking and CG with his creation of Cine Tracer, a real-time cinematography simulator built with Unreal Engine. The application — offered directly to filmmakers — enables the viewing of real-world camera and lighting equivalents in simulated, user-designed movie sets to produce highly accurate shot visualization.

Director of photography and previs artist Matt Workman’s real-time “cinematography simulator” Cine Tracer is powered by Unreal Engine. (Screen capture.)

“My background includes about 10 years of traditional cinematography, mostly in commercials around New York City,” Workman says. “During that time, I worked with a lot of visual-effects companies on effects-heavy commercials — so I started creating previs tools to communicate in 3D and plan with the teams. A couple of years ago, I started developing Cine Tracer to handle that workflow more efficiently. I [designed] it as a video game [that’s controlled similarly to playing a third-person shooter game], but it’s intended to help filmmakers quickly visualize their shots. 

“I went to school for computer science, but I’ve always been tinkering with 3D,” he says. “Luckily, it’s 2020 and there’s YouTube, so the amount of available free education is incredible, as long as you have the time and the patience to learn. 

“Most cinematographers who are on the technical side pick up [game-engine-powered previsualization] very quickly. If you want to add light coming through the window, the steps to get there are very quick. It’s the same way you do it in the real world.”

Workman at a demo of an LED Volume. The realistic background is actually an image that appears on the LED wall.

Regarding the primary advantage that real-time engines have over the more traditional computer-animation software, Workman notes, “The iteration time is much faster. If you want to see a camera move [for a specific shot] in order to determine, for example, what it would look like if you start close and then pull out wide — to see that change with Maya, you’re taking up to a couple of hours to render maybe 120 frames at high quality. In Unreal, that change happens instantly.” 

Visualization studios like The Third Floor have leveraged Unreal and other real-time engines for the past several years to provide previs, techvis and postvis services, along with other forms of animation used in various phases of the production chain. The Third Floor’s credits span multiple Marvel movies, The Rise of Skywalker (AC Feb. ’20) and other recent Star Wars films, as well as popular episodic series such as The Mandalorian and Game of Thrones (AC May ‘12 and July ‘19).

Casey Schatz, The Third Floor’s head of virtual production, has worked on such projects as Thor: Ragnarok (AC Dec. ‘17), Gemini Man (AC Nov. ‘19) and The Mandalorian, and has helped innovate everything from flame-throwing motion-control robots to real-time virtual eyelines. 

In addition to the obvious benefits of real-time rendering, Schatz sees these software and hardware innovations as facilitating greater direct collaboration between studios like The Third Floor and cinematographers. “Historically, as previs creators, we were brought in very early, often before the DP was even hired,” he says. “So a certain amount of work had already been done, and when the cinematographer finally came on, they often felt like they were just painting by numbers. No one in visual effects wants that approach. We’re all trying to blend in with the wheel of filmmaking that’s existed since the Lumière brothers.

“Just a few years ago, game engines weren’t as conducive to moviemaking. Unreal’s virtual camera didn’t have a focal length or a film back [aka aperture gate] — it just had a field of view. Now there’s focal length, film backs, depth of field, f-stops, ISO and shutter speed. Epic even added the ACES color workflow into the rendering pipeline.

“A respect for and acknowledgment of traditional filmmaking has made its way into the software,” Schatz adds. “So you can say, ‘I’m shooting anamorphic with Panavision Primos,’ and we’ll have a menu of those exact focal lengths so that you can’t previsualize a focal length that doesn’t exist in your real lens kit.

“The goal has always been that even someone that has never touched a computer before, but is a remarkable cinematographer, can sit down next to a computer artist and talk in the language that they’re comfortable with — f-stops, T-stops, shutter speeds, film ISOs, grain, bounce light, diffuse light — the traditional cinematography terms that have existed for more than 100 years.” — Casey Schatz, The Third Floor’s head of virtual production 

“I’m working on the Avatar sequels now using Gazebo, Weta Digital’s proprietary real-time engine,” he says. “Russell Carpenter [ASC], the movies’ cinematographer, sat down with the lighters before we did any of our live action in New Zealand. Together they set the tone, the mood, the general key-light direction, the key-to-fill ratio, et cetera; all of this was done using [cinema terminology] Russell is accustomed to. Thus, the line in the sand between traditional cinematography and computer graphics is disappearing more and more every day.”
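The camera parameters Schatz lists relate through simple trigonometry, and the "menu of exact focal lengths" constraint is a one-liner. A minimal Python sketch, where the 24.9mm film-back width is an assumed Super 35-style figure and the lens kit is hypothetical (not an actual Panavision Primo lineup):

```python
import math

def horizontal_fov_deg(focal_length_mm, film_back_width_mm=24.9):
    """Horizontal field of view implied by a focal length and a
    film-back width -- the conversion early engine cameras skipped
    by exposing only FOV. 24.9mm assumes a Super 35-style back."""
    return math.degrees(2 * math.atan(film_back_width_mm / (2 * focal_length_mm)))

# Hypothetical prime kit (focal lengths in mm) -- illustration only.
LENS_KIT = [18, 25, 35, 50, 75, 100]

def snap_to_kit(requested_mm, kit=LENS_KIT):
    """Clamp a previs lens choice to the nearest focal length that
    actually exists in the real lens kit."""
    return min(kit, key=lambda f: abs(f - requested_mm))

lens = snap_to_kit(42)           # nearest real lens to a 42mm request
fov = horizontal_fov_deg(lens)   # its implied horizontal FOV
print(f"{lens}mm -> {fov:.1f} degrees")
```

Snapping to the kit is the point Schatz makes: whatever a previs artist dials in gets clamped to a lens that exists on the truck, so nothing gets previsualized that can't be shot.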

Indeed, the rendering time of high-resolution interactive imagery has advanced to the point that it can actually appear onscreen as-is or with minimal adjustments in post — as employed, for example, on The Mandalorian. “That was our goal,” said Greig Fraser, ASC, ACS (who shot the Disney Plus Star Wars series along with Barry “Baz” Idoine), as reported in AC’s February 2020 issue. “We wanted to create an environment that was conducive not just to giving a composition line-up to the effects, but to actually capturing them in real time, photo-real and in-camera, so that the actors were in that environment in the right lighting — all at the moment of photography.”

When asked which specific advancements in real-time engines have pushed forward their synergy with traditional cinematography, Schatz explains that the ability to simulate bounce light, something so fundamental to traditional cinematography, is a game changer because only recently could this happen in real time or even close to it. “[During a previs session] a traditional cinematographer could be looking at a shot and say, ‘If we put a Kino Flo 6 feet away and add a bounce card, what would be the result?’ We can now show that result very quickly and accurately. Prior to these advancements, computer lighting was more analogous to theatrical lighting; you could aim a light and cast a shadow, but then you would have to cheat a bounce light by adding other lights to the sides at lower intensities.”

One of Schatz’s Third Floor colleagues, real-time developer Johnson Thomasson (The Mandalorian, Venom, Godzilla: King of the Monsters [AC June ‘19]), is quite directly involved with the intersection of live-action cinematography and real-time animation — specifically via motion capture and “virtual-camera sessions.” 

“One of the major benefits to real-time animation is ‘practice time’ for the filmmakers,” Thomasson says. “We worked on Christopher Robin, and director of photography Matthias Koenigswieser was able to use a virtual camera rig, playing back animation from the film and recording his camera motion so he could practice operating. He was able to directly experience the size difference between 12-inch-tall Piglet and [for scenes set in the title character’s early years] 4-foot-tall Christopher Robin.

“It was a real challenge framing both of them, and something he hadn’t considered before coming to our virtual-camera sessions,” Thomasson continues. “It allowed Matthias to design his compositions ahead of the actual production. He was shooting with an empty frame [aka, a clean plate] on the day, but having rehearsed virtually, he already knew what the right framing felt like. When directors and DPs go through a virtual-camera session, they discover new ideas, and they’re exploring, expanding and coalescing their creativity.

“Another benefit is the physical representation of depth of field, which has never been rendered well in previs in the past,” Thomasson adds. “Unreal’s depth-of-field camera model is based on real-world cameras. So a cinematographer can ask which stop we’re at in a virtual-camera session and get an answer that reflects a physically accurate visual model. In my experience, when DPs learn about that capability, they want to take advantage of it, because depth of field is one of the strongest tools in their toolset for communicating their choices in cinematic language early on in preproduction.

“For directors who are not veterans of giant visual-effects tentpole films, it’s a new experience when they first get to the set. But [prepping] in a low-pressure, small-audience situation, and exploring and practicing via [real-time interactive previs], prepares them for the set like nothing else could.”
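The physically based depth-of-field behavior Thomasson describes rests on standard thin-lens optics. A sketch of the textbook near/far focus limits; this is generic lens math with an assumed Super 35 circle of confusion, not Unreal's actual implementation:

```python
def dof_limits(focal_mm, f_stop, focus_dist_mm, coc_mm=0.025):
    """Near/far limits of acceptable focus under the textbook
    thin-lens model. coc_mm is the circle of confusion; 0.025mm
    is a common Super 35 assumption."""
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_dist_mm - 2 * focal_mm)
    if focus_dist_mm >= hyperfocal:
        far = float("inf")  # everything out to infinity is acceptably sharp
    else:
        far = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_dist_mm)
    return near, far

# A 50mm lens at T2.8 focused 3m away:
near, far = dof_limits(50, 2.8, 3000)
print(f"acceptably sharp from {near / 1000:.2f}m to {far / 1000:.2f}m")
```

Because the model takes a focal length, a stop and a focus distance, a DP can ask "which stop are we at?" in a virtual-camera session and get an answer that maps onto a real lens.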

Looking toward the future, Schatz sees game engines becoming further entwined with live-action cinematography. “The hardware and software are going to [continue to advance], and it might almost become indistinguishable in terms of which imagery is real-time and which isn’t,” he says. “This is in service of the story and not to show off the technology. The motto of the Previsualization Society is ‘fix it in pre.’ The more creative decisions you can interactively figure out before you get on set, the better.

“The goal has always been that even someone that has never touched a computer before, but is a remarkable cinematographer, can sit down next to a computer artist and talk in the language that they’re comfortable with — f-stops, T-stops, shutter speeds, film ISOs, grain, bounce light, diffuse light — the traditional cinematography terms that have existed for more than 100 years.”

Workman adds, “What directors want are iterations. They want to find all the possible challenges early, so the faster you can see a result, understand the issues and create another iteration, the better. Real-time engines make that process happen instantly. It’s just like playing a video game.”


Prep Becomes Production

Society members Caleb Deschanel (wearing goggles) and Robert Legato (right), alongside Magnopus virtual-production producer A.J. Sciutto, at work on the photo-real animated feature The Lion King, which was produced with the aid of the Unity game engine.

Game-engine technology can provide cinematographers with ways to previsualize in a simulated filmmaking environment — but it can also serve as a filmmaking medium in itself. With the aid of game-engine tech, veteran cinematographer Caleb Deschanel, ASC, leveraged his extensive traditional cinematography experience to capture The Lion King with entirely virtual characters and settings (save for a single shot). Making this possible were visual-effects supervisor Robert Legato, ASC, and a team of technicians and artists at Magnopus, who coupled the Unity game engine with various traditional dollies, cranes, tripods and other camera-movement tools to enable Deschanel to manually operate virtual cameras while directly interacting with live animation. 

“The essential thing for me in filming this way is having enough visual detail so I can make the same kind of decisions I always make on a set,” Deschanel tells AC. “That informs how you compose the shot and how you light it.” Though not yet photo-real — as the process to make them so would unfold later at MPC in London — the filmmakers’ subjects were imbued with enough detail by the real-time interactive system “that you can read their emotions and understand how close or how wide you need to be,” Deschanel says.

Legato — who was a key member of the virtual-camera crew in addition to his role in developing the technology — adds, “We’re essentially motion-capturing the Steadicam operator attached to the Steadicam or a dolly grip attached to the dolly. With the game engine, everything is live and under your control. You just walk over to the place that seems the most appropriate for you to film it. And because it’s live, you can say, ‘Let me get a little lower or a little higher, or let me try this same position with a 20mm instead of a 24.’ You’re tapping into your on-set intuition, which is ultimately years and years of experience, instead of overintellectualizing it.”

With these tools at their disposal, Deschanel and crew could not only capture the motion and composition of their virtual cinematography, but make visual design choices as well. “I would sit with [lighting director] Sam Maniscalco, who was my gaffer, with the files of all the sets,” Deschanel recalls. “We had a choice of 350 different skies to give us the right mood for every scene. It was just like being a cinematographer [in a traditional environment], but having far more control than you normally would. On a [traditional set], you don’t have control over the clouds and sky, so you have to follow the sun throughout the day. It was exciting and a lot of fun — I was really surprised.” — Noah Kadner

From left: The Lion King director Jon Favreau, Deschanel and production designer James Chinlund explore the previsualized world.

 

 

Saturday, February 27, 2021

REPOST: What Every Game Designer Should Know About Human Psychology [02.25.21] - Michael Moran

https://www.gamecareerguide.com/features/1987/what_every_game_designer_should_.php

From door handles to coffee mugs to fighter-jet cockpits, user experience shows up in everything you interact with.

Every human-made object has been designed based on either what will be easier for the user or what was easiest for the manufacturer. (Whenever using something feels frustrating or confusing, you can be sure it was the latter!)

But what about the concept of user experience in the world of video games?

When our brains process something as complex as a video game, there's a lot going on. Understanding the role that psychology plays within the science of game development is crucial to creating memorable and engaging games.

In this article, we'll summarize the importance of understanding psychology while crafting the user experience of a video game.

We'll touch on subjects such as the importance of play-testing, employing affordance in your games, how to use psychology to create usability and engageability, and the role of Gestalt theory in game design.

Why Design Should Always Focus on the User

When design is focused on the perspective of the user, a product becomes much more practical, desirable, and useful.

A great example dates back to fighter pilots during WWII. Exhausted and under pressure, these pilots had a high rate of human error and were at risk of accidentally pushing the wrong button on the dashboard of their aircraft.

WWII Spitfire cockpit

Unfortunately for them, dashboards were not consistent between aircraft, which meant the pilots had to learn a new setup every time they switched planes.

This made it even more likely that they would press the wrong button. Standardized cockpits therefore needed to be developed to improve the user experience for these pilots.

This type of thinking can also be applied to video games, ensuring that the design of the game is centered on the user experience. By understanding the psychology of the user, you'll be better able to make design decisions tailored to their needs.

It's also important to remember that there is no such thing as a neutral design. Everything we design will influence people to use it in one way or another. This is an important ethical issue to consider, especially when certain retention mechanics can create addictive behaviors and punish disengagement.

The Importance of Play-Testing

Every user is different, and our perspective depends on our experience, our history, and what is important to us. When designing video games for different types of users, it's not possible to know in advance what every user will bring to the experience. That's why it's essential to have a diverse team of designers with different backgrounds.

That's also why video game designers "play-test" their games. This tests how the game is perceived by the people who will actually be playing it. With this method, designing a game becomes a cycle of action and iteration.

Designers create a game, then test it to see if it is accomplishing what they wanted to achieve. If insight from audience testing finds the game lacking, it's back to the drawing board to refine with more information. Then the game is tested again, and the cycle continues.

The design team may have certain goals that they are trying to achieve within the game. However, they will need to iterate on all aspects of the gameplay, from dialogue to visuals to mechanics and more, in order to achieve those goals.

When it comes to play-testing, here's an important tip: the developer shouldn't be in the room with the play-testers. Not only will it make the players feel somewhat awkward and intimidated, it will also make the test less accurate.

Players tend to make more effort to understand a game when the developer is watching than they would if they were playing it at home (perhaps out of politeness to the person who has put their heart and soul into crafting the game). To get an accurate measure of how many players would simply give up on a game, play-testers should be free to play the game by themselves.
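One concrete metric an unmoderated session gives you is exactly that give-up rate. A toy Python sketch, where the session logs and field names are invented for illustration:

```python
# Invented session logs from unmoderated play tests.
sessions = [
    {"player": "p1", "minutes_played": 42, "completed_tutorial": True},
    {"player": "p2", "minutes_played": 3,  "completed_tutorial": False},
    {"player": "p3", "minutes_played": 8,  "completed_tutorial": False},
    {"player": "p4", "minutes_played": 55, "completed_tutorial": True},
]

def give_up_rate(sessions, min_minutes=10):
    """Fraction of testers who quit early without finishing the tutorial."""
    quitters = [s for s in sessions
                if s["minutes_played"] < min_minutes and not s["completed_tutorial"]]
    return len(quitters) / len(sessions)

print(give_up_rate(sessions))
```

Because nobody is watching, a tester who abandons the game at minute three shows up honestly in the data instead of politely soldiering on.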



 

Game Career Guide

https://www.gamecareerguide.com/

 

 

Thursday, March 26, 2020

Trying this out. Twine is an open-source tool for telling interactive, nonlinear stories.

Looks like an old program I used on Facebook back in the day for a project I worked on.
===============================
 
https://twinery.org/

You don't need to write any code to create a simple story with Twine, but you can extend your stories with variables, conditional logic, images, CSS, and JavaScript when you're ready. 
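To make "variables and conditional logic" concrete, here is a tiny sketch in Twee notation (the plain-text form of a Twine story) using the default Harlowe story format. The passage names and the $hasKey variable are invented for illustration, so treat this as a sketch rather than a tested story:

```
:: Locked Door
(set: $hasKey to false)
The cellar door is shut. [[Search the shelves->Shelves]] or [[Try the door->Door]]

:: Shelves
(set: $hasKey to true)
A rusty key! [[Back to the door->Door]]

:: Door
(if: $hasKey)[The key turns. [[Step inside->Cellar]]]
(else:)[It won't budge. [[Search the shelves->Shelves]]]

:: Cellar
Dusty barrels. The end.
```

In the Twine editor, each ":: Name" block would be its own passage box; (set:) and (if:)/(else:) are Harlowe's built-in macros for state and branching, and the [[text->Passage]] brackets are the links between passages.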

Twine publishes directly to HTML, so you can post your work nearly anywhere. Anything you create with it is completely free to use any way you like, including for commercial purposes. 

Twine was originally created by Chris Klimas in 2009 and is now maintained by a whole bunch of people across several different repositories.


A new tool has emerged that empowers just about anyone to create a game. It's called Twine. It's extremely easy to use, and it has already given rise to a lively and diverse development scene.
Carolyn Petit, Gamespot
 
Although plenty of independent games venture where mainstream games fear to tread, Twine represents something even more radical: the transformation of video games into something that is not only consumed by the masses but also created by them.
Laura Hudson, The New York Times Magazine
 
The simple beauty of Twine is this: if you can type words and occasionally put brackets around some of those words, you can make a Twine game.
Kitty Horrorshow
 
If you're interested in making interactive fiction then there's no better place to start than Twine. It's possibly the simplest game making tool available, it will take you mere minutes to get started, and it has a wonderfully simple visual editor.
Richard Perrin
 
And aside from being free, it's really not programming at all — if you can write a story, you can make a Twine game.
Anna Anthropy
 
Twine is the closest we've come to a blank page. It binds itself and it can bind itself along an infinite number of spines extending in any direction.
Porpentine

Tuesday, March 24, 2020

Video: Storytelling lessons learned in 14 years at BioWare / March 20, 2020 | By Gamasutra Staff

https://gamasutra.com/view/news/359918/Video_Storytelling_lessons_learned_in_14_years_at_BioWare.php

In this 2018 GDC talk, former BioWare creative director Mike Laidlaw details some of the most important lessons he's learned about how narrative fits within games.

Plus, Laidlaw explored how writers and narrative designers tasked with bringing stories to life can gel into cohesive storytellers, while still working well with the larger game team.

If you missed seeing it live at GDC, take advantage of the fact that you can now watch Laidlaw's talk for free via the official GDC YouTube channel!

And of course, all week the official GDC Twitch channel will be livestreaming speaker-recorded talks that were planned for GDC 2020, originally slated to take place this week. Tune in now to watch, or check back later to watch the archives on Twitch or the official GDC YouTube channel!

About the GDC Vault

In addition to this presentation, the GDC Vault and its accompanying YouTube channel offers numerous other free videos, audio recordings, and slides from many of the recent Game Developers Conference events, and the service offers even more members-only content for GDC Vault subscribers.

Those who purchased All Access passes to recent events like GDC or VRDC already have full access to GDC Vault, and interested parties can apply for the individual subscription via a GDC Vault subscription page. Group subscriptions are also available: game-related schools and development studios who sign up for GDC Vault Studio Subscriptions can receive access for their entire office or company by contacting staff via the GDC Vault group subscription page