Product Development
Products over time - A story of my career
My product development journey started while I was in college. In the early '90s, as a student at Arizona State University working in the campus dining hall, I noticed that athletes coming in for meals had to figure out which foods would give them their daily portion of calories, and adding the numbers up took a while, creating longer wait times. On top of that, with the manual card system it was hard to find cards to display, and they got damaged very quickly. My software development brain kicked in, and I built a simple database that held all the menu cards and nutritional information, with a very basic command-line UI: athletes would enter their calorie target and, based on that day's menu, we would give them three options to choose from. This reduced wait times and made it easier for us to print the menu cards every day.
Seven years later I was hired at Global Connect Partners in Seattle, the second employee brought on to develop a billing solution for the wholesale telecom business. While sitting in a server room, working with a Microsoft SQL Server database and figuring out the solution, I had to take on the additional responsibility of managing our operations. Managing operations meant learning what we offered our customers and how we managed it. The seeds of the BlueWater software were planted during those 3 years. Global Connect Partners became edge2net and grew to 175 people worldwide, with revenue of approximately $200 million; I became Chief Architect and Director of IT as we grew. After its acquisition in 2001, I joined the 2 founders to build out the software product BlueWater (OrcaWave.net).
As CTO of an early-stage startup, I was involved with day-to-day product feature definition, story breakdown, and hands-on coding. Doing customer demos, talking to customers about product features and the roadmap, joining sales calls, and providing technical details about the architecture became my everyday life. Post-implementation training with customers allowed me to meet all of them in person across 5 continents: customers in the Netherlands, the UK, Tanzania, South Africa, Brazil, the Caribbean Islands, Central America, Asia, and the biggest ones here in North America. Those travels exposed me to different cultures, religions, and ways of working in different parts of the world. They also gave me a different perspective on product management and the pain customers feel while trying to operate a business. In today's connected world, all those customers are still connected and my friends, even though I have moved on from the telecom world.
Due to personal and family reasons, I had to exit the partnership and take a local opportunity in Fort Collins, CO. CA Technologies had just acquired Nimsoft and was expanding the team. After the hiring manager dug my resume out of the reject pile, I joined as a Manager on the Systems team, which was my first introduction to Agile methodologies (SAFe). Even though I had been doing a lot of these things at Orca Wave for 13 years, I learned the names and concepts at CA Technologies and received official certification. My ability to lead became evident to CA leadership very quickly when, by coordinating with the vendor, I got the build server up and running within a week after it had been down for 2 weeks.

I was tasked with 2 responsibilities: move the products of an acquired company with diminishing revenues from Austin, TX to Fort Collins, CO, and learn everything about building those products, their architecture, and the defect backlog, then build a team in Fort Collins to deliver the next versions of the product to our customers. At the same time, to avoid another build-server blunder, I had to design and implement a state-of-the-art infrastructure architecture that was resilient, performant, and easy to manage. Managing the 3 product development teams that moved from Austin was hard, as I had to learn the functionality, the architecture, and the personalities of the new people who relocated to Fort Collins. Some challenges were technical and some were product functionality challenges. The architectural challenges were easy to address: we reduced the cost of appliances and invested in building a containerized, software-based packet collector. The feature challenges took longer, as we had to reinvent the user experience, re-imagine the terminology, integrate with network monitoring tools, and provide deeper packet inspection analytics. Overall, as a team, we delivered happy customers, with existing customers buying more appliances at reduced cost and smaller enterprises implementing the software-based packet collector.
In 2017, CA launched a program called LEAP (Leadership, ) which was meant to prepare and promote internal talent to executive positions within CA Technologies.
Agile Methodology and Product Management
Even though we were following the concepts of Agile methodology as early as 2004-2005, I only learned the terms and concepts in 2014. At Orca Wave, we delivered a SaaS product with ARR from large customers. New features were released every quarter, with releases to our multi-tenant platform happening Thursdays at 9 pm, and defect fixes shipped every week. All new features were demonstrated to customers prior to release, release notes were available on our application website, and we tracked customer errors by having the application email us when errors occurred, taking proactive action if we saw a trend continue.
In 2014, at CA Technologies, I received my official training in Agile methodologies. We practiced the SAFe agile development method, and as time went on I learned a lot about Scrum, backlog grooming, daily standups, retrospectives, feature breakdown, and feature and story writing. Since I was hands-on with my team, it was great learning, which I used in my later days at CA to bring products and product features to market.
Taking what I learned at CA and adapting it to our needs at Blue Yonder helped me launch a net-new, multi-tenant, cloud-native SaaS product within 12 months, starting from hiring a team through signing beta customers. I learned which metrics and OKRs made sense to me and how to follow through with the team in measuring success.
Incremental Delivery
I am, and have always been, a believer in incremental delivery guided by a vision. As a product manager, I believe in defining a 2-3 year vision and planning what you need to do in the first year to move toward it. As you build and deliver in smaller increments of 2 weeks to a quarter, depending on the solution, customer conversations, industry trends, and unplanned environmental impacts let you adjust the vision and the delivery of product features during that first year. Once you explain the vision to customers (existing and new), you get enough feedback to adjust and earn their buy-in. This allows you to adapt as markets change, the competitive landscape shifts, and new technologies emerge.
How to measure success and what metrics to use?
I use some basic methods to measure success. A sprint team should be able to deliver a feature within a release cycle that is usable by customers and provides value. It is very important that grooming and breakdown happen in such a way that a feature can be delivered within a release cycle. An epic should be broken down into features that can be linked together so that multiple teams can work on them at the same time. Team velocity and the ability to estimate can be judged by this method; it then does not matter whether you use story points, hours, or man-days. I prefer story points, as they are a relative measure.
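As a toy illustration of what I mean by judging velocity and estimation together, here is a minimal sketch; the sprint data and field names are hypothetical, not from any team I ran:

```python
# Minimal sketch: rolling velocity and estimation accuracy from sprint history.
from dataclasses import dataclass

@dataclass
class Sprint:
    name: str
    committed_points: int   # story points planned at sprint start
    completed_points: int   # story points accepted at sprint end

def rolling_velocity(history: list[Sprint], window: int = 3) -> float:
    """Average completed points over the last `window` sprints."""
    recent = history[-window:]
    return sum(s.completed_points for s in recent) / len(recent)

def estimation_accuracy(history: list[Sprint]) -> float:
    """Completed-to-committed ratio; close to 1.0 means the team estimates well."""
    committed = sum(s.committed_points for s in history)
    completed = sum(s.completed_points for s in history)
    return completed / committed if committed else 0.0

history = [Sprint("S1", 30, 24), Sprint("S2", 28, 27), Sprint("S3", 30, 29)]
print(rolling_velocity(history))     # ~26.7 points per sprint
print(estimation_accuracy(history))  # ~0.91, estimates slightly optimistic
```

Because story points are relative, the absolute velocity number matters less than whether it stays stable and whether the accuracy ratio trends toward 1.0.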
SaaS Product Experience
Since 2001, when SaaS was still called ASP, I have delivered SaaS applications to Fortune 500 companies. I led a team that built a multi-tenant routing and billing solution for wholesale telecom, hosted it in a data center in Seattle, WA, and gave customers access through a web interface.
We pioneered weekly releases, keeping the product versionless for all customers, delivered white-label concepts within the product, and eliminated any service work needed to set up customers.
At Blue Yonder, I led the team that built a multi-tenant, Azure-native solution on a microservices architecture. Product features included ML-based task optimization and a Robotics Hub that integrated with various robotics companies, and we accomplished the product vision of Resource Orchestration, where a resource can be a human or a robot. Blue Yonder now markets that solution as WES (Warehouse Execution System).
The Logistyx parcel solution was already a cloud-native product; however, being single-tenant and lacking standard product practices had led to higher costs, longer implementation times, and lower product performance. As an engineering leader, I implemented standard product development practices, worked with the development team to make components multi-tenant to reduce costs, and started work on standard interfaces to ERP solutions. I managed teams distributed around the world across 4 time zones and established a nearshore development center in Peru. That development center is now a growing hub for e2open, and with no time-zone difference it is very efficient and cost-effective.
AI & ML Experience
My first introduction to machine learning came a long time ago at Orca Wave, before we even talked about machine learning. When customers uploaded their vendor costs, we would look at historical trends, predict the likely next movement, and recommend a pricing mix to offer their customers. Wholesale telecom is like a fast-food restaurant: you need to create a mix of products to sell, where you may bear a loss on one item but gain somewhere else. These methods, which we provided to large telecoms, resulted in a trend called bilaterals, where no money was exchanged, just a net settlement based on the volume of minutes.
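A toy sketch of that kind of trend-based prediction, assuming a simple least-squares trend over a vendor's historical per-minute rates (the route name and numbers are made up; the real system was far richer):

```python
# Minimal sketch: project the next per-minute rate for a route from a
# least-squares trend fitted over historical vendor rate uploads.
def predict_next_rate(historical_rates: list[float]) -> float:
    """Fit rate = slope * t + intercept over t = 0..n-1 and project period n."""
    n = len(historical_rates)
    t_mean = (n - 1) / 2
    r_mean = sum(historical_rates) / n
    cov = sum((t - t_mean) * (r - r_mean) for t, r in enumerate(historical_rates))
    var = sum((t - t_mean) ** 2 for t in range(n))
    slope = cov / var if var else 0.0
    return slope * n + (r_mean - slope * t_mean)

# A vendor's rate for one route, trending down across four uploads.
uk_mobile = [0.052, 0.050, 0.049, 0.047]
print(round(predict_next_rate(uk_mobile), 4))  # ~0.0455: feed into the pricing mix
```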
Taking that experience into packet capture and inspection at CA Technologies, we were able to teach the algorithm when to raise an alert based on learnings from large amounts of data, and to identify the bad actors trying to get into your networks, whether accidentally or on purpose.
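Reduced to a toy example, the alerting idea looks something like the sketch below: learn a baseline from history and alert only when behavior deviates well beyond normal variation. The thresholds and numbers are made up; the real product learned from far richer packet data.

```python
# Minimal sketch: learn a per-source traffic baseline and alert on outliers.
import statistics

def should_alert(history: list[float], current: float, z_threshold: float = 3.0) -> bool:
    """Alert when `current` sits more than z_threshold standard deviations above baseline."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against a perfectly flat history
    return (current - mean) / stdev > z_threshold

# Packets per minute from one source over a learning window (made-up numbers).
baseline = [120, 135, 128, 140, 125, 132, 138, 130]
print(should_alert(baseline, 134))  # False: within normal variation
print(should_alert(baseline, 520))  # True: looks like scanning or exfiltration
```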
Blue Yonder gave me an opportunity to apply those concepts to warehouse management task optimization. As we prioritized tasks using optimization techniques, when an end user overrode a priority, or when we saw that the time taken to complete a task differed from what the software anticipated, the algorithm would learn and adjust its parameters for the next optimization run. We still allowed end users to adjust parameters and tune the algorithms, so they would feel confident in them. Execution systems are tricky: you cannot stop execution for any reason, so we had to adjust our software to allow human intervention.
This human intervention led us to bring AI into the execution systems. We could take what we learned from ML models on historical data and apply decision making, based also on the decisions a human had made under the same circumstances. Eventually we planned to take these techniques further, applying generative AI to produce the data and scenarios needed to build out a simulation of warehouse operations.
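A minimal sketch of the duration-feedback loop described above, assuming a simple exponential-smoothing update (the task type, numbers, and the alpha parameter are illustrative; the production optimizer was considerably more involved):

```python
# Minimal sketch: blend observed completion times into the expected durations
# that the next task optimization run will use.
expected_duration: dict[str, float] = {"pick_small_item": 45.0}  # seconds

def record_completion(task_type: str, observed_seconds: float, alpha: float = 0.2) -> float:
    """Exponential smoothing: drift the expectation toward what actually happened.

    `alpha` controls how quickly new observations are trusted; keeping it
    adjustable lets an end user tune the algorithm and stay confident in it.
    """
    prior = expected_duration[task_type]
    expected_duration[task_type] = (1 - alpha) * prior + alpha * observed_seconds
    return expected_duration[task_type]

record_completion("pick_small_item", 60.0)   # the worker took longer than predicted
print(expected_duration["pick_small_item"])  # 48.0: the estimate drifts toward reality
```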
Analytics
Analytics, or trend analysis, has been around forever; it just became easier as technology evolved. At Orca Wave, using telecom data, we provided many different trends, statistics, and other graphs and charts for customers. As I moved into the network monitoring domain, the analytics became more granular rather than higher-level trends. All of us were trying to emulate the continuous graphing methods of MRTG and PRTG.
Supply chain is quite different. In execution, analytics are applied not only to warehouse execution but also to labor performance. Labor performance is tracked because some of the performance metrics are used to provide feedback to individuals as well as to set benchmarks and guidelines. A metric like scanning 450 cartons an hour is not determined without analyzing large amounts of data and the performance of many different individuals. Providing those analytics became the bread and butter of the task optimization systems I built and delivered. In addition, we tracked equipment performance within the applicable regulations. As my products integrated with robotics, we started presenting a value analysis of where robots are effective and where humans are effective, so customers could decide which areas to stock which way and have the right entity perform each task. Inventory robots, picking vehicles, and humans could not clash in the same aisle, so managing those tasks through analytics was fundamental to the task optimization engine.
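As a toy example of how a number like 450 cartons an hour falls out of the data rather than out of a gut feeling, here is a sketch that sets the benchmark at a percentile of observed scan rates; the observations and the chosen percentile are hypothetical:

```python
# Minimal sketch: derive a labor benchmark from observed worker performance.
import statistics

def benchmark_scan_rate(hourly_rates: list[float], percentile: float = 0.75) -> float:
    """Set the benchmark at a chosen percentile of observed hourly scan rates."""
    ordered = sorted(hourly_rates)
    index = min(int(percentile * len(ordered)), len(ordered) - 1)
    return ordered[index]

# Cartons scanned per hour across workers and shifts (made-up numbers).
observations = [310, 395, 420, 455, 470, 440, 380, 465, 430, 450]
print(benchmark_scan_rate(observations))  # 455: a stretch-but-reachable target
print(statistics.median(observations))    # 435.0: the typical rate, for feedback
```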
As I moved to the supplier and customer data inbound platform at e2open, we had large amounts of data but no dashboards, no analytics, no trends, no predictions. To jump-start that feature set, I embarked on a journey to evaluate technologies such as Snowflake, Yellowbrick, and other big data platforms. The idea was to funnel data into these technologies and let users bring their own analytics tools to run the queries they considered necessary for their business model. It was a successful concept when presented to customers, but my time at e2open did not allow me to complete it and take it to customers.
Mergers and Acquisitions
As part of the edge2net and Logistyx leadership teams, I was on the side being acquired by other companies; in both cases, I led the product and technology due diligence responses. As part of the M&A team at CA Technologies, I was on the acquirer's side and evaluated 6 different target companies over 6 months, resulting in 2 completed acquisitions. My role there focused on product, process, and technology due diligence.
Pricing for SaaS and Perpetual Licensing
Throughout my career, building pricing plans was never a cost-based exercise. I did keep track of costs, but as technology changed, markets changed, and the way consumers use applications changed, the pricing method had to evolve. The areas I focused on were the value provided by the product, how sticky the product could be with customers, which features are basic to the product, and how enhanced features would add additional value.
As technology changed, the cost of deployment changed, and in the case of multi-tenant products it became easier to manage performance for specific customers without running a separate instance. This allowed me to be aggressive in pricing, pushing the competition to either evolve their technology stack or justify their higher pricing.
I always want to be on the offensive in terms of pricing, and my belief is that if you can reduce the time to implement, customers see the value quickly and the product becomes an easy sell.
Cost-based pricing is good for n-tier architectures and single-tenant applications where you want to keep gross margins at a certain percentage (for example, pricing at cost divided by one minus the target margin), but deployment costs and the cost of keeping software current are not accounted for in this model as the customer base grows.
Platform Product Management
Concepts of a platform and a Common Data Model have been around for a long time. I used a common data model in telecom solutions: since different switching platforms provided different formats of call data records, we built an interface that consumed the data as provided and translated it into a common data model.
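A minimal sketch of that translation layer, with hypothetical field names and vendor formats: each switch format gets a small adapter that maps its raw record into one common call data record, so downstream billing and routing logic only ever sees the common model.

```python
# Minimal sketch: adapters translate vendor-specific call data records (CDRs)
# into one common data model consumed by billing and routing logic.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CommonCDR:
    origin: str          # calling number
    destination: str     # called number
    start_time: datetime
    duration_seconds: int

def from_switch_a(raw: dict) -> CommonCDR:
    """Switch A reports ISO timestamps and duration in seconds."""
    return CommonCDR(raw["ani"], raw["dnis"],
                     datetime.fromisoformat(raw["start"]), int(raw["secs"]))

def from_switch_b(fields: list[str]) -> CommonCDR:
    """Switch B emits delimited rows with duration in tenths of a minute."""
    return CommonCDR(fields[0], fields[1],
                     datetime.strptime(fields[2], "%Y%m%d%H%M%S"),
                     int(float(fields[3]) * 6))

records = [
    from_switch_a({"ani": "12065550100", "dnis": "442075550199",
                   "start": "2003-04-01T09:15:00", "secs": "240"}),
    from_switch_b(["12065550101", "5511955501234", "20030401091800", "35.0"]),
]
for cdr in records:  # downstream rating code sees only CommonCDR
    print(cdr.destination, cdr.duration_seconds)
```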