Day Two of Essential Practice Skills for High-Impact Analytics Projects

The seminar was an intensive on Structured Problem Solving, a term I believe was coined by Professor Patrick Noonan. A book may be in the works. I certainly hope so, as tremendous wisdom and experience were packed into these two days. Referring to the slide deck will help, but not in the same way a comprehensive book would.

So what is Structured Problem Solving? It is a way to identify the real issue and then methodically move through steps including research, task identification and communication, ultimately resulting in action. Be sure to keep the end goal in mind. Anyone who loves data understands that looking for other insights “just because” can waste a tremendous amount of time. That extra effort may be interesting but may not be aligned with the immediate needs of the company or customer. We had a great conversation about scope creep and how it can come from both the requestor and the team doing the work.

The high-level steps of Structured Problem Solving are:

  • Define the Problem
  • Break Down the Problem
  • Plan the Work
  • Work the Plan
  • Build the Case
  • Tell the Story
  • Start the Change

I have been in charge of software development teams of various sizes for over 20 years, including internal developers and outsourced teams. During that time, I’ve seen requirements and management styles shift from Waterfall to Agile. Structured Problem Solving is especially enticing to me because the advantages of Agile and iterative methodologies have become so apparent on so many projects. Each of the steps above is tackled by the team in an iterative process and then refined or worked individually. Every team member understands the big picture, and every team member understands how their part fits into the whole. The final product truly is a gestalt.

Structured Problem Solving is a technique I intend to use both personally and professionally to bring clarity of thought and process to non-trivial tasks.

There were two other major highlights of the day. The first was data visualization. Since the recommended actions should be based on logic and facts, they should follow as natural conclusions of the research. The question is always how the data can be presented with the least cognitive load and the most clarity, especially when the results are being presented to non-technical people. Several examples of poorly chosen graphics were studied and better options discussed. Of course, Edward Tufte’s The Visual Display of Quantitative Information and Stephen Kosslyn’s Clear and to the Point: 8 Psychological Principles for Compelling PowerPoint Presentations were referenced and highlighted in many ways.

The second major highlight was the time we spent discussing creativity. Some people in the room had practice with structured creativity while the concept was new to others. I have used Brainstorming and Devil’s Advocate approaches effectively for years. The concepts of Brainsteering and Six Thinking Hats were both new to me. The bottom line: creative insight doesn’t happen accidentally. Creative insight happens within the context of intentionally focused thinking.

This course will definitely change my management style for years to come.

Day One of Essential Practice Skills for High-Impact Analytics Projects

Today I met Emory University Professor Patrick Noonan at the INFORMS seminar on Essential Practice Skills for High-Impact Analytics Projects. The day included many great gems that will guide my thinking for years to come, but an early idea was about the new nature of work and the need for businesses to “figure it out as we go.” The underlying assumption is that business practices and advancements in data collection and analytics are changing at such a rapid pace that there is really no opportunity to rely on prior techniques. Of course, that’s not universally true; there are many situations where our skills and experiences are directly applicable, so some problems can be solved quickly. However, there are many complex business questions that have never before been answerable because the data and tools required to answer them did not exist. This is new territory.

The bulk of today’s discussion was about decision-making processes and how to formulate a plan to answer key questions based on analytics. We discussed the advantages and limitations of existing frameworks such as SWOT and SMART, and how those limitations naturally lead to the use of Issue Trees. What is the Key Question to be asked? Because the first step in the process is always situation-specific, this becomes a highly customized process rather than a generic framework. Once the Key Question is agreed upon by key stakeholders, a plan of attack can be developed.
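
Since an Issue Tree is essentially a hierarchy of questions, here is a minimal sketch of how I picture one in code. The Key Question and sub-questions are invented for illustration and are not from the seminar.

# A hypothetical Issue Tree: the Key Question decomposes into sub-questions,
# and the leaves are the terminal questions that drive the task list.
issue_tree = {
    "Key Question: Why did Q3 customer churn increase?": {
        "Did churn rise across all customer segments?": {
            "Which segments lost the most accounts?": {},
            "Did high-value accounts churn at a different rate?": {},
        },
        "Did anything change in pricing or service?": {
            "Were there price increases during Q3?": {},
            "Did support response times degrade?": {},
        },
    },
}

def print_tree(node, depth=0):
    # Indent one level per layer of the tree; the leaves are terminal questions
    for question, children in node.items():
        print("  " * depth + "- " + question)
        print_tree(children, depth + 1)

print_tree(issue_tree)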

The most interesting exercise today was generating a task list from terminal questions based on the Issue Tree. After formulating the questions, we collectively created Proto-Graphs. I don’t know if Proto-Graphs is a Noonan-invented term or borrowed from someplace else. Proto-Graphs are sample layouts of graphs that should be created by the data team to help answer the Key Question. The creation of these Proto-Graphs led to a significant amount of disambiguation. For example, would a scatter plot or a histogram best represent the data? What time frame should be analyzed? What units are expected? It was surprising to me how many of the “obvious” assumptions were made differently by different team members. The process clarified the result before the expense of creating the graphic with real data was incurred. Another advantage was that our paper sketches were not dependent on the capabilities of any specific tool, so our sample visualizations were not constrained by what can easily be done. All of my new data visualization projects will use this technique.
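
We sketched ours on paper, but the same idea translates to code. Here is a minimal sketch of a Proto-Graph mocked up with placeholder data, assuming matplotlib; the chart type, metric, time frame, and units are invented examples of the decisions the exercise forces.

# A mock Proto-Graph: placeholder data only, used to agree on chart type,
# time frame, and units before any real data is pulled. Values are invented.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]   # assumed time frame
placeholder_values = [0, 0, 0, 0, 0, 0]               # no real data yet

fig, ax = plt.subplots()
ax.bar(months, placeholder_values)
ax.set_title("PROTO-GRAPH: Monthly churn rate (%)")   # agree on the metric and units up front
ax.set_xlabel("Month")
ax.set_ylabel("Churn rate (%)")
ax.set_ylim(0, 10)                                    # expected range, to be confirmed
plt.show()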

We also discussed how to clarify the Key Question. A highlight of my day was when Professor Noonan asked to use one of my quotes. We were defining ideas to clarify the Key Question and I said that it was essential to ensure that everyone understood the true goal. When asked about this, I replied that “our biggest failures at Blue Ridge Solutions were when we delivered what the client asked for.” What does that mean? Often, in the rush to get something done, business requests are made with a “let’s move quickly” mindset. Failure to investigate the true goal often leads to the wrong deliverable.

The E-Myth Revisited

The E-Myth Revisited has changed my perspective on owning and growing a small business.

There are parts of the book that are an obvious advertisement for Michael Gerber’s consulting business, but who can blame him for that? He’s on to something.

Thinking of myself in terms of the Technician, the Entrepreneur, and the Manager has been insightful. I started Blue Ridge Solutions as a reluctant entrepreneur. There was an immediate need to supplement my cash flow and a shortage of available high-tech jobs in Western NC. Thus, I created an environment where I could pursue my craft.

At some point the transition was made from the Technician (a highly skilled web developer) to the Manager, directing a team of qualified developers I had personally recruited. It was a good transition and allowed the company’s revenue to grow well beyond that of a single person.

Later, the transition to Entrepreneur occurred, though I’m less sure of exactly when that happened. Instead of managing my team, I was implementing processes (Systems, per Gerber) so that the team knew what was needed whether I was present or not. During this time, I definitely experienced the Entrepreneurial Seizure and experimented with different management styles somewhere between delegating and abdicating responsibilities.

Many of these ideas were brought to my attention as I went through the ScaleUp program. Sometimes it’s hard to see what’s directly under your nose, especially in the crisis and busyness of the moment. I wish I had read this before I started a business. The lessons learned will definitely apply throughout the rest of my career.

Pick up The E-Myth Revisited: Why Most Small Businesses Don’t Work and What to Do About It before going out on your own!

Remembering Web Browser Intelligence

I was feeling nostalgic today, so I looked up this old co-authored paper (1997) about personalizing the web. It was such a new field then. I’ve always been proud of the fact that I wrote the first version. Remember when the internet was slow? Advertisers were upset because we could model the relative response times of destination sites using an exponential distribution and place predictive green, yellow and red blocks next to each link. True, it destroyed the page layouts, but web pages weren’t as finely crafted back then. It was wildly useful during a time when a significant percentage of all links went to dead sites.
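
For old times’ sake, here is a minimal sketch of the idea as I remember it: fit an exponential model to observed response times, then map the chance of a slow response to a green, yellow or red indicator. The thresholds, sample data and function name are my own assumptions, not the original implementation.

# Rough sketch: treat a site's response times as exponentially distributed,
# then classify the probability of a slow response as green, yellow or red.
# Sample data and thresholds are invented for illustration.
import math

def traffic_light(observed_seconds, slow_threshold=5.0):
    # Maximum-likelihood rate for an exponential is 1 / sample mean
    mean = sum(observed_seconds) / len(observed_seconds)
    rate = 1.0 / mean
    p_slow = math.exp(-rate * slow_threshold)  # P(response time > threshold)
    if p_slow < 0.1:
        return "green"
    if p_slow < 0.4:
        return "yellow"
    return "red"

print(traffic_light([0.8, 1.2, 0.6, 1.5]))  # consistently fast site -> green
print(traffic_light([4.0, 7.5, 6.0, 9.0]))  # sluggish site -> red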

From the referenced web page…

IBM releases a software agent called Web Browser Intelligence that makes it easier for users to obtain, distribute, and control information on the Web.

Source: IBM unveils intelligent agent – CNET (from 1997)

Macs vs. Anything Else

MacBook Pro

I’m religious, but not about computers. I think they are tools—things to be used to accomplish other things. They can be used for entertainment, for development, for business, for communication. Definitely for streaming music. They can be used by the Russians to direct public opinion.

But they are just tools—things to allow us to accomplish greater things, not do greater things for us.

I’m sitting in my downtown coworking space. There are nine people here right now. Eight are using Macs. I’m using a Dell. Am I an outsider, or is there a new freelancer religion that I failed to grasp?

The irony that people attempting to identify with a fringe movement often end up looking the same has never escaped me.

p.s. I do prefer Bash. Perhaps that’s reason enough to change.

AI & Deep Learning Conference | GTC 2018

I’m headed out to this conference in March. It will be great to revisit my old stomping grounds. I used to walk to the convention center, even though it took about an hour.

But the important part is hearing about the latest innovations in AI, Analytics and Cryptocurrency and meeting industry influencers.

From the conference web page…

The world’s biggest and most important GPU developer conference. Learn how to harness the latest GPU technology. Enjoy face-to-face interactions with industry luminaries and NVIDIA experts.

Source: AI & Deep Learning Conference | GTC 2018 | NVIDIA

Learning From The Best

Very excited to be attending this conference in February!

From the conference web page…

Learn practical frameworks and systematic processes for addressing complex, real-world problems and how to facilitate effective action. This intensive hands-on course is developed by the largest organization of analytics professionals, INFORMS.

https://www.informs.org/Professional-Development/Continuing-Education/Essential-Practice-Skills-for-High-Impact-Analytics-Projects

Learn R, Python & Data Science Online | DataCamp

Started some online work with DataCamp this week. The first software I developed was in FORTRAN. Can you believe it? My professor told me that, to a Mechanical Engineer, all software is FORTRAN. I moved on to C. Those professors didn’t know what they were missing. C++ quickly became my happy place. And, from my perspective, most modern languages derive so much from that early work.

I can program PHP with my eyes closed. Try it and you’ll see just how tricky that is. MATLAB and Mathematica were used frequently in grad school, but it’s been a while. Python seems to be the language of choice for quick data set investigation and integration with AI tools.

Python list comprehensions are cool! That’s one slick line of code to adjust an entire dataset.

# Split the sentence into a list of words
phrase = 'List comprehensions are more interesting with numbers'.split()
# Pair each lowercased word with its length, all in one line
result = [[w.lower(), len(w)] for w in phrase]
print(result)
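
And since the phrase promises numbers, here is my own toy example of the same pattern adjusting a numeric dataset:

temps_f = [32, 68, 99, 212]
temps_c = [round((f - 32) * 5 / 9, 1) for f in temps_f]  # convert every reading to Celsius
print(temps_c)  # [0.0, 20.0, 37.2, 100.0]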

Let the journey begin!

From the DataCamp website…

Learn Data Science from the comfort of your browser, at your own pace with DataCamp’s video tutorials & coding challenges on R, Python, Statistics & more.

Source: Learn R, Python & Data Science Online | DataCamp