Quirky Future Taskforce: Product Evaluation

Posted on 25-Dec-2014

Description: project goals and kickoff for the product evaluation task force @quirkyinc

TASK FORCE: PRODUCT EVALUATION

for the future of social product development

WHO

• Top community voters / vocal community members:

Matt Fleming, Clinton Fleenor, Judi Sigler, Michelle Brewster, Stacy Prince, Steven Kramer & Charles Bailey

• John Lott – COO / CFO

• Brian Kerr – UX

• Jessica Marati – Community

• Mitch Lowe – Strategy

• Gaz Brown – Product Design

• Nancy Chen & Anthony Del Plato – Data Analysis

• Nathan Smith & Mike Lacy – Technology

WHAT A SUCCESSFUL PRODUCT EVALUATION PROCESS LOOKS LIKE:

1. Surfaces the ‘best ideas’ every week.

2. Curation is a fun and engaging experience for voters and community members.

3. Data collected is useful in the product design, research, and marketing process.

4. Inventors who do not win walk away educated, with a ton of insight about where they fell short.

5. Requires a sustainable amount of Quirky staff interaction.

6. Rewards all who are involved without encouraging gaming.

7. Ends in winning ideas that both the community and Quirky are excited about.

8. Winning ideas enter the design process with a ton of research, knowledge, and demand behind them.

WHAT ARE THE ‘BEST IDEAS’?

1. The best ideas solve big problems (relating to usability, comfort, and convenience) in a new way.

2. The best ideas represent the biggest commercial opportunity for our community & brand.

3. The best ideas are manufacturable.

4. The best ideas don’t go out of fashion / relevance quickly – they are around for years (ideally 3+ year lifespan).

5. The best ideas are protectable.

6. The best ideas are often the ones people ignore or reject.

KNOWN ISSUES:

1. The number of ideas grows each week, making the curation process more fatiguing for community members.

2. We are, by design, saying ‘get lost’ to all but two of hundreds of inventors every week – how do we turn ‘get lost’ into something constructive and helpful, and how do we inspire losing inventors to stick around so they can learn, grow, and become successful inventors?

3. Evaluating an ‘idea’ this early in the process is risky, because no one (not even the inventor) fully understands ‘it’ – because ‘it’ doesn’t exist yet.

“Don’t worry about people stealing your ideas. If they are any good, you’ll have to ram them down people’s throats.” - Howard Aiken

4. Due to the relatively low number of active voters (about 2,000) and the heavy weight votes carry in defining the top ideas in a given week, the reach of an inventor’s social graph counts for more than the actual ‘quality’ of an idea (a toy scoring sketch follows this list).

5. People are voting for what they believe Quirky will like, versus what they actually like and will buy.

6. Seasonality and changes in culture can turn ideas that were ‘no’s’ a short time ago into big ‘yes’s’ shortly thereafter... do we lose that opportunity, and how do we make a mineable database of ideas?
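
One way to make issue 4 concrete: if each vote could be tagged as coming from inside or outside an inventor’s social graph, a score could discount in-network votes so that reach alone can’t dominate. The Python below is a toy sketch only – the tagging, the 0.25 discount, and the function name are assumptions for illustration, not anything Quirky has built.

    def idea_score(votes, discount=0.25):
        """Toy scoring sketch: votes from inside the inventor's social
        graph count at a discount, so a large follower base can't swamp
        independent signal. `votes` is a list of booleans, where True
        means the voter is in the inventor's network (an assumed tag)."""
        return sum(discount if in_network else 1.0 for in_network in votes)

    # 100 in-network votes would normally beat 30 independent ones;
    # with the (arbitrary) 0.25 discount, the independent idea wins.
    print(idea_score([True] * 100))   # 25.0
    print(idea_score([False] * 30))   # 30.0

Any real version would have to define ‘in-network’ carefully (followers? past collaborators?), which is exactly the open question this issue raises.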

QUESTIONS/THOUGHT STARTERS:

1. How do we create a mineable database of ideas that can somehow come back into relevance as times change / tastes shift / lines need expansion, etc.?

2. How do we avoid a culture (both staff and community) that gravitates toward easy wins, and instead encourage a culture that is drawn toward risk?

3. Is a weekly ‘class’ of ideas the best way to approach evaluation?

other thoughts have included:

• rolling list in each category (ideas must hit a certain threshold in order to get pushed into staff eval; see the sketch at the end of this section)

• volume-per-category-driven (collect 50 ideas in each category, then choose one)

• sudden death / elimination round (slowly kill off ideas throughout the curation process, so we are all focusing on going deeper into ideas we like)

4. How can the community do more research earlier in the process to make staff evaluation easier?
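
To make the first of those options concrete, here is a rough Python sketch of a rolling per-category list, assuming a hypothetical 250-vote threshold; the class name, the numbers, and the category are placeholders, not a real Quirky mechanism.

    from collections import defaultdict

    VOTE_THRESHOLD = 250  # hypothetical bar; the deck names no number

    class RollingEvalQueue:
        """Rolling list per category: an idea sits in its category's
        pool until its vote count crosses the threshold, at which point
        it is pushed into the staff-evaluation queue."""

        def __init__(self, threshold=VOTE_THRESHOLD):
            self.threshold = threshold
            self.pools = defaultdict(dict)   # category -> {idea_id: votes}
            self.staff_queue = []            # (category, idea_id, votes)

        def add_idea(self, category, idea_id):
            self.pools[category].setdefault(idea_id, 0)

        def vote(self, category, idea_id):
            pool = self.pools[category]
            pool[idea_id] = pool.get(idea_id, 0) + 1
            # Promote as soon as the idea clears the bar, rather than
            # waiting for a weekly cutoff.
            if pool[idea_id] >= self.threshold:
                self.staff_queue.append((category, idea_id, pool.pop(idea_id)))

    queue = RollingEvalQueue()
    queue.add_idea("kitchen", "idea-42")
    for _ in range(250):
        queue.vote("kitchen", "idea-42")
    print(queue.staff_queue)   # [('kitchen', 'idea-42', 250)]

The volume-per-category and elimination-round options would swap the promotion rule (a fixed pool of 50, or periodic culling) while keeping the same per-category bookkeeping.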

THE TASK FORCE PROCESS

March 1st – Task Force planning begins.

March 4th – Research survey sent to active voters (an effort to better understand the current climate).

March 10th – Plan complete, Task Force called to arms. Planning deck sent. Basecamp invites sent to all members. Brainstorm-style discussion begins within Basecamp.

March 15th – Kick-off / brainstorm conference call (full Task Force participation, time TBA).

Quirky-led conversation, resulting in three to five clear directions we can go in. Jess will take thorough notes and post them on Basecamp.

A ‘writeboard’ will be started for each of the directions – ‘bullet point / process style’ – and all Task Force members will work to refine the process in a ‘wiki’ environment.

March 24th – A ‘lead’ for each direction is chosen. Each lead begins to prepare a presentation of how the process could work.

March 28th – Conference call check-in (full Task Force participation, time TBA).

Each lead will discuss progress and the challenges that still exist within their concept – things they’re worried about, things they’re excited about.

April 7th – Full afternoon @ QHQ. Each ‘lead’ will present their concept to the entire task force; Ben will join. Final direction will be chosen.

April 8th – UX/UI team briefed on vision / objectives.

April 14th – UX/UI team will deliver preliminary wireframes of the new process and post them to Basecamp. Conference call (full Task Force participation, time TBA).

April 21st – UX/UI refinements.

April 27th – Town Meeting presentation of the new plan.

May – Technology / user testing.

June – Implementation of the new product evaluation process.