April 4, 2012 by cs
Winning enterprise-wide government contracts is no easy feat, but when you do win, it is critical to capture lessons learned. In many cases organizations are part of a larger team that typically includes a mix of products and services that must be delivered in an integrated manner. When managing an enterprise-wide government program, there are many things that can be done to ensure success. Here are five lessons learned from my experience:
Keep your eyes and ears open
Typically there are all kinds of different personalities and geographical and structural logistics involved. To maximize value for the customer, you should continually conduct assessments of future users to get a strong read on expectations. Establishing a total awareness of how the solution will look and feel with respect to the user experience is critical. The more they know about what they’re going to get before they get it, the better they will adjust once it’s there.
You should also be cognizant of bottlenecks users have run into in the past, and try to make sure those are not repeated. Discover what pain points exist – and really understand those intricacies – before moving forward.
One of the larger challenges we faced in a recent enterprise deployment was that each site location was used to conducting its own business, setting its own standards and meeting site-specific requirements. Part of deploying the solution for this customer was bringing cohesion and unity to users and administrators across the enterprise.
Keep your vendors involved as well. Read their white papers and product reports. Invite them on site while the work is being done so you can gain their input. The only way they can help you (and the customer) is for them to have a hands-on understanding of the whole solution.
Steady as you go
For a complex undertaking like this, you don’t want pedal-to-the-floor activity periods followed by others that are fairly quiet. That’s when you end up needing 30 hands on deck one week and then just three the next. It simply isn’t a good work model for large enterprise and international deployments. So it is critical to pace enterprise deployments with very even-keeled, consistent workloads to make the best use of everyone’s time, resources and investment.
Another important task is gathering requirements from the customer and understanding them. The customer provides their requirements; we, in turn, provide them with a design that meets those requirements. This drives changes within the deployment plan, as well as on site, to account for all design requirements.
Both of these tasks take a fair share of pre-project planning, but it’s worth it. Otherwise, you’re spending an enormous amount of time coordinating on the fly. That could make for a negative customer impression, and impact the certification and accreditation process. This process is something you should think about every step of the way. Because if you stumble, the entire effort is tossed for a loop – possibly indefinitely.
With training, timing is everything
The training experience means so much with respect to success. So be careful about when you schedule this. You can’t host training sessions at the last minute, because the sense of immediacy may lead to a bit of user panic. You want to give users a chance to familiarize themselves with the new equipment and system before it is thrust upon them. On the other hand, if training is conducted six months in advance, they will likely forget everything they learned by the time they have to make the transition.
To users this is a simple solution that they truly enjoy using. To administrators it brings together several already complex components into a single environment. In order for the enterprise to embrace and support the new technology, the administrators require vendor-level advanced training and hands-on experience after training.
That said, we have learned that retention is enhanced if participants are allowed to determine the training method. So offer up a number of options – in person, online, PowerPoint, simulations, etc. – and you’ll get better results/retention.
There is a period of time that the technology needs to transition from us to them. That period of time should be determined by both the customer and the solution designer, and is based on the experience of the on-site administrators. It is critical that the customer understands the importance of allocating the proper time and resources to achieve a smooth transition. There is no cookie cutter approach when handing the keys over to the customer, but rather open and honest dialogue always makes for a successful transition.
Even with a game plan in hand, keep in mind that circumstances will change. Requirements will shift. Schedules will get revised. There is always someone in the room who says, “We can’t change that; it is not part of the design,” or “the documentation says this.” Yes, there are cases in which you have to redo major designs. To stay on top of these shifts, maintain an active, open dialogue with the customer to understand the true requirements that your team must address. Remember that as requirements evolve or are discovered, it is our job to help the customer understand that changes are necessary and expected. The original design can be amended and documentation can be updated – ultimately resulting in a better overall solution.
Document your experiences
I have learned that being involved with large-scale enterprise deployments isn’t just a job. It’s an opportunity to learn how to effectively support a large customer. So it is important to capture your experiences in working documents that summarize lessons learned, so they can be passed on to the next location.
About the Author: Douglas Norton is a senior manager for professional services at Raytheon Trusted Computer Solutions. This article was published by Washington Technology on Mar. 26, 2012 at http://washingtontechnology.com/articles/2012/03/26/lessons-enterprise-deployment-advice.aspx.
February 16, 2012 by cs
Almost every proposal you write has a requirement for information on past performance. The government uses this information to evaluate how well your company has performed on similar programs and expects your past performance to be a predictor of how well you will perform on the program you’re currently bidding.
Because past performance can be an important discriminator in the evaluation and selection process, there are some things you should know about how to write your past performance response.
Past performance versus past experience
Past performance comprises a set of specific contracts that you select to demonstrate how well your company, or your team, has performed on contracts that are similar in size, scope, and complexity to your current bid.
Past experience, which is sometimes confused with past performance, is about the broader issue of what experience and expertise the bidding organization has gained from all of its contract work and the work of its teammates.
Select contracts to demonstrate past performance
Past performance is all about relevancy and how well you performed the work you’re referencing. The government will consider these two factors together when developing your past-performance score — and both are important. However, performance is more important than relevancy. It is better to showcase your best-performing contracts and argue that they are relevant than to select contracts that are highly relevant and had poor performance.
Expect the government evaluator to ask your customers how well you performed each contract. Typically, this happens via a formal past-performance questionnaire submission process and/or direct communication from the government evaluators to your customers. The government keeps two databases—the Contractor Performance Assessment Reporting System (CPARS) and the Past Performance Information Retrieval System (PPIRS)—to determine how well your company performs its contracts.
Government access is restricted to individuals working on source selections, including contractor responsibility determinations.
With CPARS, companies can regularly review their own ratings for each evaluated contract, but cannot check ratings for other companies. To access PPIRS information, a contractor must be registered in the Central Contractor Registration (CCR) system and must have created a Marketing Partner Identification Number (MPIN) in its CCR profile. Because past-performance ratings are such an important factor in proposal evaluations, every company should regularly review its CPARS ratings and challenge any evaluations it considers unfair.
Write your past performance summary
Each RFP will be very prescriptive about the information you need to provide when you describe each past-performance contract. While it may seem obvious, you really do need to provide all the requested information in order to submit a “compliant” proposal (see my Washington Technology article, “6 reasons your proposals fail,” October 2011).
You’ll be asked to provide information to show contract relevance, so keep this in mind when you write your response. Measures of relevance include contract size, scope and complexity, as well as the technical scope of work performed.
The description of the work is where you can stand out. Write your response to not only show that you performed relevant work — which every bidder does — but that you also had specific accomplishments that were meaningful to the government. Don’t just parrot back the statement of work from the contract you are citing. Focus on accomplishments because it’s these achievements that can make your contract past performance stand out from the crowd.
Most importantly, make sure you have outstanding past performance on the contracts you present. Confirm this information with your customers and with your teammates’ customers before you submit your proposal.
The government will read what you write, and they will validate the content. A good writer can present your past performance in a credible, compelling way, but if the underlying performance is less than desirable, it’s hard to overcome the truth.
About the Author: Bob Lohfeld is the chief executive officer of the Lohfeld Consulting Group. This article was published by Washington Technology on Feb. 10, 2012 at http://washingtontechnology.com/articles/2012/01/30/insights-lohfeld.aspx?s=wtdaily_130212.