Friday, May 15, 2015

Blame chain or value chain? Value drives a project from strategy to test.

This series of blogs is based on a talk entitled "Is Your Programmer a Better Business Analyst Than You Are" given by the author to the Kansas City IIBA user group meeting in April of 2015.

It's always been a problem for me that programming students have no clue about working with existing project requirements. I've never seen any teaching or methodology for this come out of the Agile movement, let alone any other technical culture. If no one is concerned with how the project moves from the business analyst role to the programming role, then in practice the project really starts with the programmer, because that's where the project deliverables really begin. And so programmers have reverse-engineered analysis and design skill sets to cope on their own.

Of course, real world projects are the practical training ground for understanding how to work with project teams, but couldn't we make this a part of the skill set for a programmer before they get thrown into the war room?

Web programmers are used to this. They are a department of one. They gather information on customers' needs, write copy, design layouts, put the code together, and deploy and maintain a site. As the web application world expands, this puts immense pressure on the technical web employee to track every new development while keeping up the skills tied to the legacy apps they maintain.

Programmers I talk to have an implicit distrust of the project. They find a purpose but no real detail in it and muddle through the development as best as they can. In fact, most businesses without a solid project management process don't have much to work with. The management will throw stuff out into the market and claim that the market is crazy. The analysts are documenting whatever anyone says and then complaining that the users don't know what they want. Coders code whatever they think is right and try to make the project manager's deadlines.

And testers? Standing at the end of the line with a deadline rapidly approaching, no one tries to establish standards while knowing they are holding up the deployment. Testers just try to find as many bugs as possible in the time they have.

The programming students I see have not been trained in business so it's reasonable they don't know how good goals can drive their behavior. A couple of questions have always intrigued me when working with a software project:

  • What if all the people on the project had the same goal? 
  • What if they all had documents that built on the previous documents and were traceable back to the original value-driven goals? 
  • What if the project were measured on how well it realized that goal?

A project proposal should be estimated on value: not just return on investment, but an understanding of where the project sits in the enterprise's priorities. Value for internal projects can be seen as two basic parts, with more added if justified:
  • The number of people in the company that will use the project in their processes
  • The visibility of the project in the organizational chart of the business, so that if the project fails, the level of management involved determines the severity.

Value for external projects is estimated the same way, with the same two metrics involved:
  • The number of people in the market that will find value in using the product/service
  • The amount of money that the market is willing to spend to get that product/service

The two measures are both based on quantity of value. The project then can look back to this value of opportunity and see how to organize the project pieces. For instance, business goals, the basic business-driven goals that begin the requirements process, are based on what value the business gets. Requirements meetings should never ask the question "What do you want the project to do for you?" because it's not about the person. The better question is "What should the project do to produce the best value for the business?"
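As a sketch only, the two metrics above could be combined like this. The function names, the weighting, and the numbers are my own illustration, not a formula from any methodology:

```javascript
// Hypothetical sketch of the two value metrics described above.
// The weighting scheme and numbers are illustrative, not a real formula.
function internalProjectValue(userCount, managementLevel) {
  // managementLevel: 1 = top of the org chart, higher numbers = further down.
  // A failure visible near the top of the chart carries more weight.
  const visibilityWeight = 1 / managementLevel;
  return userCount * visibilityWeight;
}

function externalProjectValue(marketSize, spendPerCustomer) {
  // Quantity of value: how many people, times what they will pay.
  return marketSize * spendPerCustomer;
}

console.log(internalProjectValue(200, 2));   // 100
console.log(externalProjectValue(5000, 20)); // 100000
```

The point of even a rough number like this is that every later document in the chain can be traced back to it.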

Then, the business can see that the customers have needs where they should adapt their product/service to achieve the best solution to a large market segment. The question for the strategists and enterprise analysts would be "What can I do to help our customers/employees solve their problems?"

Business analysts would then understand that the value perceived at the strategy level should be communicated downstream to the programmers clearly. That communication should be able to move directly into testing of that code because the target of the project development effort is to communicate business value to the technical staff. The analysts' question then is "How does what the customer/staff do right now help them get tasks done?" 

The designers have the task of taking the current processes and possible new processes and fitting them to the business assets in a way that improves the value. They have to think about improving life in general. The question for them is "How can we translate the requirements into code/real life?" The trick is to make it look simple.

The final stage in the value transition from idea to final service/product is that someone double-checks that the value has actually arrived. That's the job of the tester, who assures quality by verifying minor things throughout the project and validating at the end that the expectations were met. Their questions are "How can I guarantee quality?" and "How can I let everyone know how well they did?"

In making the move to measuring by value, the chain of FUD will change to the chain of quality. But there are lots more questions that I've asked and I'll continue this series by taking on those questions. Here's a sample of some:
  • How should scope be measured by value?
  • How should scope be organized by value?
  • What is the role of analysis?
  • What does a programmer need requirements to look like?
  • What are good business quality metrics for code?
  • Is TDD dead or alive?

Tuesday, July 8, 2014

New ThoughtWorks Technology Radar out today

Four trends are identified in the ThoughtWorks strategic IT report called Technology Radar, published twice a year. The most important one for me concerns JavaScript technologies and the challenge they bring for understanding. That is the reason I'm developing my new course, JavaScript Powered Web Apps, focusing on the language's application to building client-side logic for web pages and doing more with the browser. And it's been challenging without the guidance of book authors and coordinating corporations.
Here's the full excerpt on JavaScript:
Churn in the JavaScript World — We thought the rate of change in the Ruby open source space was rapid until the full rush of JavaScript frameworks arrived. JavaScript used to be a condiment technology, always used to augment other technologies. It has kept that role but expanded into its own platform with a staggering rate of change. Trying to understand the breadth of this space is daunting, and innovation is rampant. Like the Java and Ruby open source spaces, we hope it will eventually calm to at least a deluge.
The other three trends were 
  • Microservices and the Rise of the API (also somewhat aligned with the JavaScript trends), 
  • Conway's Law ("organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations") and
  • Re-decentralization 

Monday, April 21, 2014

JavaScript Powered Web Apps - new programming course

The world of web programming is moving ahead into what should be called Web 3.0 (the semantic web is just a pipe dream). It's using JavaScript as a unifying layer and re-imagining how the web can do what it does without the benefit of large back-end frameworks. Web 1.0 was delivering files to a client. Web 2.0 was letting clients think they were in control by faking a desktop application over HTTP. And now, in 3.0, we can have a true application enhanced with networking to services.

For the next several months I will be developing a new course to complement the classes that I wrote a few years back on HTML5, CSS, jQuery, jQuery Mobile, and JavaScript that bring these components together. I see tooling on web applications starting to mature and it is the right time to start to promote a new style of web application development. Except that there's no one way to do anything yet. Adobe is getting close to providing another great IDE with Edge Code but I think we're a ways off yet. Even Google might become a player here with their IDE code named Spark built with Dart and Polymer.

The new course, JavaScript Powered Web Apps, will walk students through building a "site" using combinations of node.js, nginx, SASS, Mongo, Bootstrap, Github, jQuery, jQuery Mobile, Grunt, AngularJS, Knockout, Express, etc. I'll probably do four days of sample sites and then show a web work flow and let students choose their own tools.

The course will assume some programming or design experience with a web site but most of the exercises will be scripted so that anyone can follow them. The students that likely will be most interested are those that don't have the back end skill set and see a tremendous advantage in learning only one language to do both front and back end coding. That means that anyone from high school that has learned the basics of HTML and CSS can enroll.

A lot of the training is on the administration and workflow of the tools which is harder to learn from books. I'll try to capture what I can for the exercises but if anyone has suggestions, I'm willing to listen. And as always, the class will continually be updated as the tools rev and newer tools emerge.

For instance, I'm still waiting to see how is going to impact the animation tools. It looks incredible but won't be completely public until May 19th when HTML5 Dev Conference in San Francisco starts. And I want to spin up a partial.js site as well I think. But I'll never know until it's finished. After all, a project's requirements are never finished until the project is over.

Thursday, January 23, 2014

Documentation as a control mechanism - not! Think communication.

People associate documentation in Agile with waste. In fact, the Agile Manifesto prefers "working software over comprehensive documentation" but does that mean that the purpose of documentation is secondary to the output of the project? Agile set us up with a poor dichotomy. I mean I prefer getting a paycheck over driving to work. Maybe we're asking the wrong question.

I don't see documentation as an optional part of the project. It's definitely an output and can be measured and that is the allure of the artifact. The traditional usage of documentation in a project is as a control mechanism when I look at the process outputs. It's often managers and specifically project managers who finalize a phase with the documentation. It's what they do. They design their work package by understanding the scope of the effort and if there is something measurable at the end it becomes a good work package.

Back in the darker ages of procedural coding, there was a movement to measure code by what it did. That entailed putting an estimate on the smallest granular operations of the computer in the code itself. That worked for a while, when the code was consistent in its granularity. But code has changed, and what we can hide in a line of code has become enormous. The function point estimation methods died.

Now we estimate, if at all, with a measurement of work package time completion and complexity (from the programmer's perspective) or confidence (from the project manager's perspective). The dependency is on the work package design. If we don't have a good description of what it is we're going to do, we can't plan it. Many projects flounder from a lack of analytic description.

So where does this good description come from? Well, your requirements are the place to find these descriptions. Of course, the best requirements are ideally the needs and wants of the stakeholders massaged into testable work packages, detailed down to repeatable tasks so that no business questions have to be asked, constrained by project limitations. But in reality, they are more of a garbage dump of what people said in excruciatingly long meetings.

The requirement document gets written and then we're on to the next phase. Follow along on your Gantt chart please. As a programmer, my input should be the output of the analyst. But with all the hubbub in programming circles about how to organize and manage testability, it looks like the analyst isn't doing a very good job. I don't see much evidence of usable requirements from my point of view either.

So, the Agile people are right. Processes without consumers are pure waste. Let's right-size this documentation by eliminating it. No one used it anyway. But what are we losing? We're losing the ability to record a decision and to think about the design of the business. Of course, if it wasn't good for the programmer, then it was useless.

But let's consider that the analyst can benefit the programmer. Then the documentation becomes a stepping stone to better code. Then there is communication of the needs of the business. Then there is less thinking on the role of the programmer who gets to focus on writing well-structured and reusable code.

The role of the documentation is really that of communication when the project scales. If you are the sole stakeholder and programmer, you probably have all the requirements circling in your head at any one time. No documentation is necessary. If two people know exactly what has to be done after a good agreeable meeting, no documentation has to be created. But if there is a memory loss, a sick day, a new member to the team, you will need some documentation. The need for documentation increases as the need to communicate increases.

The management of documentation needs a metric instead of lazily setting the goal to that of creating a greater return on investment (ROI). Just how do you measure that? My standards for measuring are in the more subjective realm whereby you produce the documentation, ask the user of that documentation if they understand, and then see if they are able to do their tasks without any further questions. The quality score descends as the need for answers or the amount of perceivable confusion increases. Get feedback.
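To make the feedback loop above concrete, here is a purely hypothetical scoring sketch. The function name and penalty values are mine, not an established metric; the only point it encodes is that the score descends as follow-up questions and visible confusion increase:

```javascript
// Hypothetical documentation-quality score based on reader feedback.
// Penalty weights are illustrative assumptions, not an industry standard.
function docQualityScore(followUpQuestions, confusionIncidents) {
  const start = 100;
  const penalty = followUpQuestions * 5 + confusionIncidents * 10;
  return Math.max(0, start - penalty);
}

console.log(docQualityScore(0, 0)); // 100: reader did their tasks with no further questions
console.log(docQualityScore(4, 3)); // 50: score descends as the need for answers grows
```

Whatever the exact weights, the measurement happens after the reader tries to use the document, not when it is signed off.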

Documentation is not a gate to the next phase and to be signed off. I'll take the stance a little further than the traditional "living document" style of writing. Since it is to be a communication mechanism, it has to always communicate the current understanding of what the project is about. Anyone and everyone can be a contributor but the use cases / user stories / work packages should be maintained by the analyst / technical writer role so that they achieve the best level of testability and detail. Wikis are good.

The trend of "barely good enough" documentation, I think, is allowing the programmer to use their analytic skills in place of poor business analysis skills in the workplace, which is sad but the best workable solution to getting the job done. Stop producing unusable business documents and let the programmer get on with the code. Why are programmers strongly in favor of commenting their code and Test Driven Development? Because those are the tools that get the documentation done a better way.

So, let's eschew the notion of controlling the project by requiring the project members to produce a result that isn't used in the next phase. Control the project by understanding the work package completions. The artifact that completes the work package is the code or the pseudo-code (the use case) in some form or another, not a project document.

So, is documentation secondary to the output of the project? Getting a paycheck over driving to work is comparing the result with a task for getting that result. It depends on whether you have to go to the office or not. Documentation is not secondary. It's just a question of whether you need to communicate more or not.

Wednesday, May 22, 2013

Thoughtworks Technology Radar

Free software, free books and Technology Radar. If you've sat in on any of my classes, you've heard me talk about those three things every time.

I have been reading Martin Fowler's writing for most of my IT training career and have found him practical, in-depth, and current. He also satisfies that craving for a little higher-level analysis that developer blogs don't always cover well; they often expose their lack of experience with other technologies, which limits their authority to change my opinion.

The new edition of TR is out today. It's taken me years to work through some of the recommendations that the ThoughtWorks think tank puts together, the ones considered to be cutting edge and worth keeping abreast of. They also don't mind telling you when a technology is not worth your time. Both kinds of advice are worth my time to read and understand.

The trends highlighted in this issue are:

  • Falling boundaries - cloud development, co-location, perimeterless enterprise
  • Proven practices to areas that missed them - CSS frameworks, database migrations for NoSQL, etc.
  • Lightweight analytics
  • Infrastructure as code
The four major areas that are reviewed are
  • Techniques
  • Platforms
  • Tools
  • Languages & Frameworks


Being mostly a web developer, I was happy to see that the Adopt recommendations in the Techniques section were for mobile testing on real mobile networks, moving away from the fake simulators, as well as using promises for asynchronous programming, giving assurance of feedback for us JavaScript/AJAX coders.
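For anyone who hasn't tried them, here is a minimal sketch of the promise style. The `delay` helper is a stand-in for any asynchronous operation, such as an AJAX request:

```javascript
// A stand-in for an async operation (an AJAX call, a timer, etc.).
function delay(ms, value) {
  return new Promise((resolve) => setTimeout(() => resolve(value), ms));
}

// Promises chain instead of nesting callbacks, and errors funnel
// to a single .catch at the end of the chain.
delay(50, "response")
  .then((body) => body.toUpperCase())
  .then((result) => console.log(result)) // prints RESPONSE
  .catch((err) => console.error(err));
```

That single error path is the "assurance of feedback": every step either produces a value for the next `.then` or lands in `.catch`.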

The next level of recommendation down from Adopt is Trial, which should be approached with a little more thought. You see HTML5 storage replacing cookies and Mobile First here, as well as responsive web design. I agree with all of those because they're not total solutions to a problem. What's interesting is their lack of concern for exhaustive browser-based testing, which sits in the Hold level, meaning don't worry about it.

For Windows and PowerShell people, you'll be glad to know that support for infrastructure tasks from Chef, Puppet, and Octopus (automated deployment of ASP.NET apps without PowerShell) has made Windows automation a much better choice.


Martin is following the NoSQL movement very closely and has put MongoDB as the choice for his Adopt level. Couchbase, Hadoop, and BigQuery are down one level in Trial. Node.js is down there too, probably as a technology too green to make it worth our while yet. But I'm waiting for their take on Polymer and Meteor. Also interesting, in the next level up from the basement, Assess, are PhoneGap (Apache Cordova) and Zepto.js, the smaller relative of jQuery.

I wasn't surprised to see WS-* holding the bottom place in the platforms with REST taking over web services slowly but surely.


I'm using NuGet for .NET development and was happy to see it in the top level. Check out Chocolatey NuGet as well if you do Windows administration. Maven is on Hold.

I've been searching for the right observer component, something that Google's Angular.js, Knockout.js, or Ember.js cover, but those also include the whole MV* framework thing, which I already have covered with either a Java or .NET web framework. Reactive Extensions (RxJS on the JavaScript side) didn't fare too well, sitting in the Assess level. My choice here for a solution to the observer pattern could be ReactJS or RxJS.

The one tool that surprised me was D3. I have been recommending Raphael for JavaScript charting and watching D3 some but it shows up now in the Adopt level due to better complementary libraries such as Rickshaw and Crossfilter.

Languages and Frameworks

CSS frameworks like SASS/SCSS and Compass are staying in the Adopt level. The web apps that are moving away from traditional client/server architecture have much to learn yet, but many frameworks are beginning to have business value, so they show up at the Trial level. These are HTML5 for offline applications, JavaScript as a platform, and JavaScript MV* frameworks. Twitter Bootstrap also shows up as an Assess. But Backbone.js and handwritten CSS remain in the Hold level, the same as last year.

A surprising observation was that Team Foundation Server caused productivity problems as a version control system. ThoughtWorks recommends Git, Perforce, or Subversion instead. It's a good thing that Visual Studio works with Git.

And just when you thought analytics couldn't get any better than Google Analytics, they see great promise in the data set aggregation and AWS/Hadoop management of your billions of web hits with Snowplow Analytics.

I'm sure I've missed some recommendations and packages people are using, such as that newfangled language Mel Tillis and Kenny Rogers started with a song about a paralyzed vet's wife going into town for the evening without him. But read through the assessment and mine the results for some great improvements to your technology stack.

Tuesday, July 10, 2012

HTML5 and future of the universe

Web site people are always catching up. You get involved in an extended project and by the time you're done, they've changed the rules again and it's time to learn a new technology. I used to think this was a problem. Now I see it as a responsibility to manage the information of an ever expanding set of hardware devices that are becoming digital.

As these digital devices come of age, they mature into a web access point because of the value it adds to the device. Some devices, like tablets, are born with the ability to talk to the web. Others, like DVD players, had to wait until they grew up.

HTML5 and CSS3 are expanding to fill the digital appetites of these new devices. As we innovate to market-test a device at every screen size possible, from phone to tablet to TV, the older development technologies will continually be updated until they break. So far, HTML is holding its own; CSS, not so much. Even the old dog JavaScript is doing well, though with some sturdy jQuery and CoffeeScript crutches.

The raft of programming languages not originally focused on the web that now try to manage that environment push the simple text retrieval process to be more app-like. In their desire to improve the asynchronous request-response communication model, they will, in my mind, eventually destroy it. Face it, it's slow. You can do faster communication with AJAX, which is why you see Google using so much of it on their web apps. It's why jQuery Mobile designed their GUI library for speed around AJAX, so much so that AJAX is the primary communication model there, one that merely tolerates the web request model.

Now a newer model of finer-grained I/O control in JavaScript, called the WebSocket API, is starting to appear. Did you not see this coming? It's not the end until each language has a way to use a simple library of functions/methods to talk to any device of your choice.
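The WebSocket shape looks roughly like this sketch. The URL and the JSON message format are hypothetical, and I've split the message parsing into its own function so the logic stands apart from any network:

```javascript
// Assume each incoming message carries a JSON string
// like {"symbol":"ABC","price":5} (a made-up format for this example).
function parseTick(event) {
  return JSON.parse(event.data);
}

// Opens a persistent, bidirectional connection; the browser pushes
// each server message to the handler as it arrives, no polling needed.
function connect(url) {
  const socket = new WebSocket(url); // browser-native constructor
  socket.addEventListener("open", () => socket.send("subscribe"));
  socket.addEventListener("message", (event) => {
    const tick = parseTick(event);
    console.log(tick.symbol, tick.price);
  });
  return socket;
}
```

Compared with AJAX, nothing here re-requests a page or even a resource; the connection stays open and data flows both ways.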

Languages are shifting little in popularity. A recent TIOBE trend announcement showing that the iOS platform language of choice, Objective-C, has been the fastest gainer over the last two years, taking the #3 spot on the chart, is the only major change. With Java trending downward and losing the lead to C, a venerable stalwart, I'm not seeing where anybody is picking up the slack. C# is on a slow trend upwards, but not that much. Even php is losing ground.

From reading language popularity articles, my guess is that JavaScript is picking up the slack. You can run a survey of projects on StackExchange and Github and find out that JavaScript has the top ranking there. Hacker News put JavaScript in the top three as well, with Python and Ruby taking the top spots. Even job rankings on show JavaScript in the top three to get a better sense of real world usage. Book sales from O'Reilly put JavaScript at #2, showing a mix of business and hobby use.

The next step for JavaScript would be to revive the server side version of the language and make it as available as php is for Apache. Then the major languages would write APIs to talk to JavaScript and we would have a programming interface for the web. Oh, yeah, and somebody do something in JavaScript to make CSS easier to work with. LESS is a good start.

So, if you are a web person, the future of the web looks like it centers on web application development, and JavaScript is taking center stage. APIs will be proliferating, and JavaScript/jQuery/minor language support plug-ins will be promoted. One of the APIs next on my bucket list is YQL. I think with the first book on YQL out from O'Reilly in a few months, we'll see an interest in mining the data of the web from JavaScript.

Excuse me now, I have to get back to the future and read about what's coming so when I start working again I won't be too far behind.

Friday, March 9, 2012

Impressive Shadow debuts at SXSW

Adobe is showing off a great new product for mobile developers at the SXSW conference in Austin this weekend. I caught the announcement and have been using it to achieve a better workflow. It seems the more devices you need to test, the better it works, but I'm just working with an iPad and an Android phone. It's great to sit back and watch all the screens update in real time all at once. And it's still a 1.0 product, with many features to come in the future.

Some of the things you start realizing when you leave Shadow open while you browse are that some sites don't do a good mobile design, some require constant authentication, some use AJAX to fake a new page request, and some have a great sense of adaptive design.

The main feature is a Chrome plug-in that talks to your iOS or Android app so that when you launch your Windows or Mac Shadow application with the apps talking to your local network, the apps will "shadow" what you do on your desktop. The more you like Developer Tools in Chrome the more you will like the product because it shows a webkit Developer Tools based window for your remote device.

The product is a very well timed conjunction of talent: the weinre open source code, acquired through Adobe's purchase of Nitobi (the maker of PhoneGap), and Adobe's BrowserLab. Adobe Shadow is also free and looks to get only better as they add support for Firefox and localhost development environments.

The 1.0 version is posted at Adobe Labs. This will definitely be a product to work with in my new Mobile Web Application Development using jQuery Mobile course here at Centriq.