The implementation phase of an electronic LMIS project is when you begin planning and conducting the technical setup and configuration of your system, and define which features will be used as-is, which are desired, and which will be customized or extended in the implementation.
The information in this section, 2-Implement, includes the Technical Setup Guide and Configuration Guide for OpenLMIS, as well as guidance on customization, versioning, training, and testing/QA.
A critical part of an OpenLMIS implementation is understanding how to engage the OpenLMIS Community to request features for future development or to build and contribute back to the OpenLMIS core codebase.
The steps below provide guidance on suggesting a new feature to the OpenLMIS Community, receiving support from the Community during an implementation, and understanding Community expectations.
The audience for this section is technical staff (the IT or software developers) who will be planning and conducting the technical work to deploy and maintain OpenLMIS.
This section is also important for non-technical audiences to review in order to identify what questions to ask and what technical activities to plan.
The OpenLMIS Technical Setup Guide contains the following topics for this section:
The audience for this section is technical staff along with non-technical implementers who are planning and supporting a deployment of OpenLMIS.
This section contains terminology and concepts that are important for the implementation team to understand, and that will help identify supporting data required for an implementation.
The OpenLMIS Implementer Configuration Guide contains the topics for this section:
Important Note: The OpenLMIS Configuration Guide is a live document that is updated as new development is completed. Please refer to the release documentation for guidance on feature deployments and versioning.
The audience for this section is technical staff (the IT or software developers) who will be planning and conducting the technical work to customize or extend OpenLMIS.
Non-technical audiences should skim or review this section in order to identify what questions to ask as they plan for any customization or extension.
As you plan for needed customizations or extensions, it is important to know that modifications to OpenLMIS can have significant cost as well as significant implications for ongoing effort required to upgrade over time.
The OpenLMIS version 3 series microservice architecture is designed to make customization and extension easier and to allow each implementation to stay up-to-date with new versions of OpenLMIS 3.x.
The ways in which a technical team might customize, override, fork, or extend the OpenLMIS codebase can make this process easier or can make it difficult and costly.
In general, the cost of an implementation will be lower the more that you can use the off-the-shelf version of OpenLMIS without modifying the software. It’s also important to engage with the OpenLMIS Community if you want to customize the software, because other implementations may have similar needs.
Before learning the technical details of customization and extension, you should review how version numbers and country-specific microservices and extensions work in OpenLMIS version 3: Country Versioning in OpenLMIS 3.
Once you are familiar with how versioning works in OpenLMIS, refer to the Technical Setup Guide section above for technical details on how to customize and extend OpenLMIS.
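To illustrate why an extension-friendly architecture keeps upgrade costs down, the sketch below shows the general extension-point pattern that a microservice design enables: core code depends only on an interface, and a country-specific module plugs in behind it without forking the core. All names here are hypothetical for illustration; this is not the actual OpenLMIS extension API.

```python
# Hypothetical sketch of an extension-point pattern, NOT the actual
# OpenLMIS extension API: the core service defines a narrow interface,
# and a country-specific implementation plugs in behind it, so the
# core codebase is never modified or forked.
from abc import ABC, abstractmethod


class OrderNumberGenerator(ABC):
    """Extension point: core code depends only on this interface."""

    @abstractmethod
    def generate(self, facility_code: str, sequence: int) -> str:
        ...


class DefaultOrderNumberGenerator(OrderNumberGenerator):
    """Behavior shipped with the off-the-shelf product."""

    def generate(self, facility_code: str, sequence: int) -> str:
        return f"ORDER-{facility_code}-{sequence:06d}"


class CountryOrderNumberGenerator(OrderNumberGenerator):
    """Country-specific override, maintained outside the core codebase."""

    def generate(self, facility_code: str, sequence: int) -> str:
        return f"MW-{facility_code}-{sequence:04d}"


def create_order_number(generator: OrderNumberGenerator,
                        facility_code: str, sequence: int) -> str:
    # Core logic never references a concrete implementation, so a new
    # core release can be adopted without merging country-specific code.
    return generator.generate(facility_code, sequence)
```

Because the country module only implements an interface the core already defines, upgrading to a new core version does not require re-applying local patches; this is the property that forking the codebase destroys.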
The purpose of User Acceptance Testing (UAT) is to confirm that the system performs as expected and meets the requirements of its users. Unlike functional testing, which is completed by the implementation team, UAT has actual users test the system to ensure it meets their needs (in other words, ensuring that users will accept the system). The results of UAT provide feedback to the OpenLMIS team to enhance features and functionality, as well as to address any defects identified.
UAT should be conducted prior to the system rollout and will inform the decision about whether the system is ready and rollout can proceed as planned, or whether issues or changes identified in UAT need to be addressed prior to rollout.
Once completed, UAT results should be compiled and reviewed by the team to make a “go / no-go” determination. If there are issues to be addressed (a “no-go”), those issues should be fixed, the UAT process repeated, and a subsequent review and go/no-go determination made.
Any software deployment plan should include buffer time for fixes or changes and a second UAT session in addition to the initial UAT, to ensure that issues identified in UAT can be addressed without completely disrupting the project timeline.
UAT test scripts should cover all scenarios that users will encounter in the system. Each testing script should include detailed, step-by-step instructions, and each step should have specifically defined results so the users know what the system should do (or look like) after they complete the steps. Testing should be conducted in all recommended browsers. Users record whether each step passes (meets the expected result) or fails (does not meet the expected result). If a step fails, the actual result should be recorded in detail for further investigation, and an issue can be opened to correct any problems identified.
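The test-script structure described above can be sketched as a simple machine-readable record, which also makes the go/no-go tally mechanical. The field names below are illustrative assumptions, not a prescribed OpenLMIS format.

```python
# Illustrative sketch of a UAT test script record; field names are
# assumptions, not a prescribed OpenLMIS format. Each step pairs an
# instruction with an expected result, and a failed step captures the
# actual result in detail for follow-up.
from dataclasses import dataclass, field


@dataclass
class UatStep:
    instruction: str         # what the user should do
    expected_result: str     # what the system should do or look like
    passed: bool = False
    actual_result: str = ""  # recorded in detail only when the step fails


@dataclass
class UatScript:
    scenario: str
    browser: str             # repeat the script in each recommended browser
    steps: list = field(default_factory=list)

    def failures(self):
        """Steps whose actual result did not match the expected result."""
        return [s for s in self.steps if not s.passed]

    def go_no_go(self) -> str:
        # Any failed step is a "no-go": fix the issue, then repeat UAT.
        return "go" if not self.failures() else "no-go"
```

For example, a script whose every step passes yields "go", while one failed step (with its recorded actual result) flips the determination to "no-go" until the issue is fixed and the script is rerun.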
Refer to section 1-Plan for guidance and an OpenLMIS Test Plan Template.
Functional testing that is completed by the implementation team is executed and tracked using test cases in Zephyr.
Implementers may also use these functional test cases as a baseline for the UAT test scripts.
Once requirements gathering, software development, training, and other activities are complete, you are ready for the software deployment or “go-live,” which is when users will start to actually use and interact with the production system.
In addition to releasing the production system (making it available to users), deployment also includes monitoring the system and providing support to end users, who will be interacting with it for the first time, to assist with any questions or issues.
At a minimum, routine support should be outlined in the support plan, whether that is an online issue-reporting tool, an e-mail address, or a phone line that users can call when they face problems using the system. Ideally, on-site support should be provided, or at least available, for users at go-live.
The team should also be in close communication to coordinate support requests, questions, and troubleshooting efforts.
To this end, it is helpful to create a deployment plan with the following information:
OpenLMIS is committed to user-centered design practices, ensuring that the community and initiative are building the best and most useful product possible.
When in-country health officials and users of LMIS systems participate in the design process for OpenLMIS, the initiative ensures that real-world situations are addressed and that the system is built to respond to users’ day-to-day needs. It can help to ensure that users are seeing the information that is most important to them, and that they are seeing it in a format that is useful and user-friendly. Previous OpenLMIS deployments have used user satisfaction surveys, human-centered design workshops, and other human-centered design methodologies to engage users in the process and ensure their feedback is reflected in the end result.
Read about the Francophone User-Centered Design Workshop for the OpenLMIS Vaccine Module. The Angola implementation of SIGLOFA utilized a User Satisfaction Calculation Tool to measure user feedback on the user interface (UI) and made improvements that responded to the user feedback.
Downloads: User Satisfaction Calculation Tool
User Centered Design for OpenLMIS Vaccine Module (ppt)
Human Centered Design Methodologies