© 2007 IBM Corporation

DITA code reviews

Presenters: Megan Bock, Shannon Rouiller, Jenifer Schlotfeldt


July 2007

Agenda

What is a code review?

Why do we do code reviews?

What do we find in code reviews?

Using a CSS to identify incorrect markup

What is the process for code review?

Tracking code reviews

Code review demo


What is a code review?

It’s a learning process to fix coding errors early in the authoring process (or just after migration)

DITA source files are reviewed by a team (the author, a DITA advocate, and an editor)

Not a comprehensive review of every topic

– 3-10 representative topics (concept, task, reference)

– 30 minutes to 1 hour

– Writer is expected to implement review comments throughout their topic set


Why do we do code reviews?

To help each other learn correct DITA markup

To ensure consistent markup in your library, which results in:

– Consistent output

– Support for topic and map reuse

– Simpler maintenance

– Faster and smoother adoption of new tools and features

To identify possible requirements for DITA, our tooling, internal documentation, or our highlighting guidelines based on real usage


What do we find in code reviews?

Element abuse

– Unordered lists instead of parameter lists or definition lists

– Visual elements (typically bold or italic) instead of semantic elements

– Ordered or unordered lists instead of step markup (steps, substeps, and choices)

– Code phrase (codeph) and code block (codeblock) elements used merely to produce monospaced output

– Pseudo-heads and pseudo-notes

– Incorrect markup left by migration
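A minimal sketch of a typical abuse and its semantic fix, using standard DITA element names (the snippet is illustrative, not taken from the deck; step markup is valid only inside a task body):

```xml
<!-- Abuse: visual elements and a generic ordered list -->
<p>Click <b>OK</b>, then set the <i>timeout</i> parameter.</p>
<ol>
  <li>Open the configuration file.</li>
  <li>Save your changes.</li>
</ol>

<!-- Fix: semantic elements and task step markup -->
<p>Click <uicontrol>OK</uicontrol>, then set the
  <parmname>timeout</parmname> parameter.</p>
<steps>
  <step><cmd>Open the configuration file.</cmd></step>
  <step><cmd>Save your changes.</cmd></step>
</steps>
```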

Element neglect

– Missing short descriptions

– Scattered index entries

– Incorrect structure of step content

– Simulated menu cascades
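A hedged sketch of two of these neglect cases, using standard DITA elements (example text is invented for illustration):

```xml
<!-- Neglect: a simulated menu cascade typed as plain text -->
<p>Click File > Preferences > Editor.</p>

<!-- Fix: menucascade with uicontrol for each menu choice -->
<p>Click <menucascade>
  <uicontrol>File</uicontrol>
  <uicontrol>Preferences</uicontrol>
  <uicontrol>Editor</uicontrol>
</menucascade>.</p>

<!-- Fix: a short description right after the topic title,
     so link previews and search results are meaningful -->
<task id="create-project">
  <title>Creating a project</title>
  <shortdesc>Create a project to group related resources
    before you import source files.</shortdesc>
  <!-- ... -->
</task>
```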


Semantic elements that are new in DITA

apiname – Use for API names such as a Java class name or method name.

wintitle – Use for title text that appears at the top of a window or dialog, and applies to wizard titles, wizard page titles, and pane titles.

menucascade – Use to document a series of menu choices, or to show any choice on a menu from which the user needs to choose.

note type – Use to expand on or call attention to a particular point.

term – Use to identify words that represent extended definitions or explanations.

shortdesc – Use to represent the purpose or theme of the topic. This information is displayed as a link preview and for searching.

Message markup (msgblock, msgnum, msgph)

Step markup (step info, result, example)
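The elements above can be seen together in a small task topic. This is a sketch with invented content (the API name, window title, and message number are hypothetical), showing roughly where each semantic element fits:

```xml
<task id="start-server">
  <title>Starting the server</title>
  <shortdesc>Start the server before you deploy
    an application.</shortdesc>
  <taskbody>
    <steps>
      <step>
        <cmd>In the <wintitle>Console</wintitle> window, call
          <apiname>Server.start()</apiname>.</cmd>
        <info>The <term>listener port</term> must be free.</info>
        <stepresult>The server writes a message such as:
          <msgblock><msgnum>ADMU3000I</msgnum>: Server started</msgblock>
        </stepresult>
      </step>
    </steps>
    <result>The server is running.</result>
  </taskbody>
</task>
```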


findBadTags.css
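The original slide showed the stylesheet itself, which is not reproduced in this transcript. The idea is a CSS file applied in the XML editor's styled view so that suspect markup stands out visually. A minimal sketch of that approach (selectors use DITA element names; the colors and rules are illustrative assumptions, not the actual findBadTags.css):

```css
/* Flag visual elements that should usually be semantic markup */
b, i, u, tt {
  background-color: yellow;
  border: 1px solid red;
}

/* Flag generic lists inside a task body, where steps,
   substeps, or choices markup is usually intended */
taskbody ol, taskbody ul {
  background-color: pink;
}
```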


Process example

1. Optional: Writer applies a special CSS (findBadTags) to check for improper tags (mostly highlighting).

2. Writer has the topics edited:

a. Writer submits a selection of topics to the editor for an edit.

b. Editor does a technical or copy edit and checks the output for obvious incorrect formatting and tagging.

c. Writer incorporates the edits and submits the revised topics for a code review.

3. The DITA advocate, editor, and writer meet, and the DITA advocate and editor review tagging and provide input to the writer about best practices. Comments are documented.


How does the process vary?

Timing of code reviews: the earlier the better.

Voluntary code reviews vs. mandatory code reviews.

Informal vs. formal code reviews.

People in the code review: Whole writing team vs. just one writer.

Files to submit: DITA (including conref source files, DITA maps, and art files) and output files (for example, Eclipse plug-in).

Special CSS: (findBadTags) to check for improper tags (mostly highlighting).


Ways to track code reviews




Code review demo

Sample files