How to Continuously Improve Apricot Software: A Case Study [Webinar]

In this recorded webinar, the Sonoma County Human Services Department shares lessons they’ve learned over a five-year journey with Social Solutions Apricot 360 software, specifically how they transitioned from initial implementation into a continuous improvement model that has kept their Apricot 360 database clean, healthy, and up to date.

The Sonoma County HSD team experienced a common outcome after their initial implementation. As their programs and workflows shifted, some of what they thought made sense at initial implementation didn’t make sense in day-to-day practice. Complete re-implementation wasn’t needed, but they did need strategies to support changing business requirements. In addition, they needed a way to manage feedback from over ten different user groups, which was a daunting challenge.

They settled on a continuous improvement framework that logs user feedback, prioritizes updates, and implements them on an ongoing basis. As a result, they’ve implemented new systems, abandoned components that are no longer relevant, and maintained focus on positive user experience and strong reporting capabilities. Sonoma County HSD is building capacity in Apricot one step at a time.

In the video recording and slides below, you will learn:

  • Why continuous improvement practices are important to managing Apricot
  • How to transition from implementation to continuous improvement
  • How to gather feedback from users and prioritize it
  • What is required to build capacity in Apricot long term (i.e. management buy-in, systems, expectations)
  • Which of Apricot’s technical features are good fits for continuous improvement

Webinar transcript (not including Q&A):

Welcome to our webinar, “Case Study of Sonoma County HSD, Building Capacity in Apricot.” We appreciate your interest in today’s topic and your interest in making the most of your Apricot system. Thank you for joining us.

[SLIDE]

My name is Jeff Haguewood. I am an Apricot software consultant at Sidekick Solutions. Sidekick Solutions is an independent consulting firm and Apricot Certified Implementation Partner. We help new and existing users make the most of Apricot.

I am joined today by Vickie Miller. Vickie is an Apricot administrator with the Sonoma County Human Services Department Family, Youth and Children’s Services Division, also known as FY&C, in Santa Rosa, CA. Vickie has worked with Apricot since 2015 and is the project lead for FY&C’s service provider referral system in Apricot, which we will learn more about shortly.

[SLIDE]

The concept for this webinar started with a common set of questions. We hear these often from other Apricot users. Vickie and I also find ourselves asking these questions.

  • How are other organizations using Apricot?
  • What works well for organizations like us that use Apricot?
  • What lessons learned might we apply to our own Apricot?

With a capable platform like Apricot, it may feel like you “don’t know what you don’t know.” It is helpful to learn how others are using Apricot. Whether you are a new user implementing Apricot for the first time or an existing Apricot user, we hope you’ll find takeaways in this webinar to make the most of your Apricot system.

[SLIDE]

This webinar is particularly exciting because Vickie and I will be sharing Sonoma County HSD’s journey with Apricot from Day 1 to today. We will start with a little background, for context, and then cover three phases of HSD’s Apricot journey.

  • Initial implementation
  • Transition to go-live
  • Adoption of a continuous improvement approach

This presentation will cover a lot of ground. Rest assured that following this webinar, you will receive an email with a recording of today’s presentation and a PDF copy of the slides. We will also open for questions at the end of the presentation.

Alright, let’s dive in. To start, I’ll hand it off to Vickie to give us some background on Sonoma County HSD.

[SLIDE]

Thank you Jeff.

The Sonoma County Human Services Department strives to support the health, safety and well-being of individuals, families and the community. We provide a diverse set of supports and services, and we collaborate with community-based agencies and service providers. Because we have a wide reach across the County, collecting data for analysis has always been a priority, as has the concept of shared measurement.

[SLIDE]

In 2011, the Sonoma County Human Services Department studied the viability of a county-wide shared measurement initiative. At a high level, shared measurement was defined as the ability for cross-sector partners in the community to collect, manage, review, and report on client progress and outcomes. We wanted to better understand what programs and services were adding up to improvements in the lives of families, youth, and children.

The shared measurement vision was to develop County-wide outcomes that would drive evidence-based practices and support community domains like education, income, and health. Multiple stakeholder groups were engaged in the process to determine what, if any, part of shared measurement could be implemented in Sonoma County, specifically whether implementing a data system for shared measurement would be possible. One conclusion summarized the primary challenges:

[SLIDE]

“There are strong views among important stakeholders that a shared data system to track program outcomes will never be feasible. Objections include: cost, the inability to link data across multiple systems, privacy issues, Community-Based Organization (CBO) resistance, and the data entry burden on CBOs.”

Ultimately, a data system for shared measurement was deemed too ambitious, too costly, and too burdensome on the proposed stakeholders and partners. The vision of a data system for shared measurement was put on pause.

[SLIDE]

The idea of shared measurement resurfaced in 2014. Upstream Investments, a part of the Planning, Research, Evaluation and Engagement Division known as PREE, took a second look at the implementation and management of a data system for shared measurement.

Building on the conclusions of the 2011 study, Upstream decided to pilot a small set of programs to test the overall concept of shared measurement. Upstream’s hypothesis was that they could get shared measurement “off the ground” with a smaller scope, which in turn would have a smaller overall budget. If we could show program results, regardless of whether those results were positive or negative, the County might be able to build momentum around shared measurement.

Upstream’s goal was to get the ball rolling. In contrast to the larger vision addressed in 2011, Upstream Investments defined a pilot program that included deeper provider collaboration, lower overall budgets and staff time, technology integration, and a model of shared outcome measures. The missing piece was a software platform to support the initiative.

Apricot was selected for the pilot and implementation began in 2014.

Before we explore our Apricot journey beginning with that initial implementation, let’s add a bit of context to our Apricot.

[SLIDE]

Apricot addresses our goals for shared measurement by being:

  • Easy for end users to get started with the platform
  • Customizable to various program workflows and requirements
  • Capable in its reporting features and tools, which are all built-in
  • Scalable for more users, programs, and capabilities

[SLIDE]

Our Apricot has:

  • 237 active users with 4 assigned administrators
  • Over 120 Tier 1 and Tier 2 forms
  • User groups across 2 HSD divisions and subdivided into 18 separate programs

It is a large system, but it didn’t start out that way.

[SLIDE]

We’ve incrementally scaled our Apricot to support the workflow and outcomes of each program. We use Apricot for programs that focus on advocacy, case management, information and referral, service coordination, collaboration and communication, constituent relationship management, application processing, and survey collection to name a few.

Let’s explore each program to give you a sense of the overall variety of programs we operate in Apricot.

[SLIDE]

FY&C provides case management and child welfare services to children and families. When a family needs additional services to support achieving positive outcomes, FY&C refers the family to community-based service providers that contract with the County to provide various supplemental services. We use Apricot to track and report on referrals made to contracted service providers.

Service providers have Apricot logins and collaborate on referrals with FY&C staff in Apricot. Apricot is a portal for sending and receiving information about a family, the services they are receiving, and the achievement/non-achievement of family goals in each service provider’s program.

Prior to Apricot, we used paper-based processes and fax machines to send referrals to service providers, with no way to check in on a referral’s active status without calling the service provider directly. With Apricot, we can monitor referral status and report on performance in real time.

[SLIDE]

Upstream Investments is a team within PREE that maintains a public directory of evidence-based programs and initiatives in Sonoma County, which is called the Portfolio. They advocate for evidence-based practices through training, technical assistance, and relationship building. Upstream uses Apricot as a constituent relationship management (CRM) system for processing applications to the Portfolio, tracking outreach activities, and logging events and attendance. Upstream also uses Apricot data to populate their searchable website directory, which saves time and gives their constituents access to Portfolio data online.

[SLIDE]

The Road to the Early Achievement and Development of Youth (READY) Initiative is a cross-sector partnership focused on increasing access to quality early childhood education and supporting a child’s transition to kindergarten. READY collaborates with 11 school districts in Sonoma County to assess the social-emotional and academic skills of students entering kindergarten. READY uses Apricot to collect survey data from two groups: educators and parents. Collection is done via Apricot web forms. The two surveys are then correlated via duplicate matching in Apricot to produce a full student profile. READY uses the complete student profile to summarize county-wide data and shares that data with schools and the community.
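As a rough illustration of the idea (not how Apricot implements it; the matching is handled by Apricot’s built-in duplicate matching), the Python sketch below joins parent and educator survey records on a hypothetical student_id field to produce a combined profile per student.

```python
# Rough illustration only: Apricot's duplicate matching does the real work.
# Field names ("student_id", "responses") are hypothetical.

def build_student_profiles(parent_surveys, educator_surveys):
    """Join parent and educator survey records on a shared student identifier."""
    profiles = {}
    for survey in parent_surveys:
        profiles.setdefault(survey["student_id"], {})["parent"] = survey["responses"]
    for survey in educator_surveys:
        profiles.setdefault(survey["student_id"], {})["educator"] = survey["responses"]
    # Only students with both surveys form a complete profile for county-wide summaries.
    return {sid: p for sid, p in profiles.items() if "parent" in p and "educator" in p}


if __name__ == "__main__":
    parents = [{"student_id": "A1", "responses": {"ready_to_learn": 4}}]
    educators = [{"student_id": "A1", "responses": {"social_emotional": 3}}]
    print(build_student_profiles(parents, educators))
```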

[SLIDE]

Keeping Kids in School (KKIS) is a collaborative effort to reduce school truancy and prevent juvenile delinquency in Sonoma County. KKIS is a partnership between the Sonoma County Probation Department, Sonoma County Court, Sonoma County Office of Education, Seneca Family of Agencies and 21 schools within 8 Sonoma County school districts.

You might begin to notice a theme of collaboration among multiple stakeholder groups within all of these programs. Internal and external users working together in a shared data system is a key feature of HSD’s Apricot system. More on that a bit later.

KKIS provides culturally relevant, family-centered case management services in home, school, and community settings. Apricot is a complete case management database and reporting platform for KKIS case managers and supervisors, covering service tracking, needs assessments, service plans and goals, attendance, behavior, and grades data, and exit surveys.

[SLIDE]

The City of Santa Rosa’s Violence Prevention Partnership (VPP) is a collaborative effort between local government, schools, parents, community partners and law enforcement to prevent violence in Santa Rosa and provide support to youth at risk of health and social problems and to their families.

VPP uses Apricot to track referrals into the program, participant status at intake, services received while enrolled, status at closure, and pre- and post-assessment scores. VPP aggregates data from multiple providers for program evaluation and aggregate reporting.

[SLIDE]

HART is our most recent addition to HSD’s Apricot, which we started in June. HART provides housing locator, information, and referral services to individuals and families that are victims of domestic violence. Apricot is a referral intake and case management system that HART uses for service coordination, following referrals from HSD’s various divisions. HART also uses Apricot for grant reporting on client demographics, victimization types, and services received in the program.

As you can see, the Sonoma County HSD Apricot is home to a variety of programs with different workflows, focus areas, outcomes, and operating procedures. Apricot’s flexibility and technical capabilities have been key, but we’ve also needed procedures of our own to help us manage, grow, and make the most of Apricot, especially as we’ve scaled up over time.

Designing those systems began with initial implementation.

[SLIDE]

We began initial Apricot implementation with a question, which presented a challenge we needed to address.

[SLIDE]

How do we quickly implement a single data system across two divisions, over a dozen programs, and with both internal HSD and external users?

We did not want a long-term or ongoing implementation. We needed a concrete end to each program’s implementation and we needed users in the system quickly so we could start collecting and analyzing data within the pilot programs. The whole goal was to “get the ball rolling.” The shared measurement pilot had no time to waste and Apricot implementation could not hold up the process.

There were three major takeaways from this phase.

[SLIDE]

We decided upon a staggered implementation timeline.

The scope of a single implementation project with so many programs was a barrier to getting started. We couldn’t bundle the whole project into one because we couldn’t devote full attention to all program implementations at the same time, especially since HSD staff were busy with existing roles as it was. We needed a tighter timeline that could sustain stakeholder focus.

We split the initial implementation into two groups. Although we didn’t want to wait too long to get started, we decided to start the second group 12 months after the start of the first. This provided the breathing room to commit to each group on its own.

We were able to complete the first group in eight months, including 13 programs under FY&C and two programs for PREE under the Upstream Investments Initiative.

Group 2 began five months after the conclusion of Group 1 and continued for four months. Group 2 configured three additional programs under the Upstream Investments Initiative: KKIS, VPP, and the Upstream CRM. READY was added in a third group in early 2016, and HART followed in 2019.

Although the initial two groups spanned 15 months total, HSD internal and external stakeholders were only required to invest two 3-4 month blocks of focused attention, one for each group.

Staggered implementation by program was a measured approach to implementation that maintained realistic goals amid the excitement at the start of the project.

[SLIDE]

In addition to a focused timeline, we also needed a consistent implementation methodology for each program.

Implementation in the groups began with form design and data entry configuration. We intentionally left report development, data quality tools, and workflow tools until after each program had launched.

It was a four-step process.

  1. Discovery and blueprint included an exploration of the use case for each program, program models and outcomes, reporting requirements, and feature requirements. Those were then translated into the Apricot platform and formalized in requirements documentation. HSD’s time commitment in this phase was high.
  2. With the requirements in place, we configured the data entry workflows in Apricot. This step offered HSD a short break before step three.
  3. Testing is where we refined the Apricot design. HSD stakeholders demoed use cases in the new Apricot and provided feedback and requested change orders.
  4. Because we had so many user groups participating in Apricot, we knew we needed formal documentation on “how the system worked.” We drafted custom user guides for each program, which users leaned on during training.

A consistent implementation methodology addressed Apricot’s configuration, but that didn’t address the expectations and responsibilities of each user group in Apricot. The collaborative aspects of HSD’s Apricot needed attention, especially with over 200 users participating.

[SLIDE]

Stakeholder agreements formalized our relationships.

In addition to the technical aspects of implementation, we felt it was important to memorialize our relationships with all stakeholder groups. We knew a shared data system was new territory, so we drafted formal documentation to set expectations, define responsibilities, and get early buy-in. The drafts used during initial implementation were straightforward. Some were formal contracts and others were Memoranda of Understanding (MOUs). Regardless of the format, the goal was the same: establish clear definitions of the mutual interest in shared measurement and Apricot use.

PREE and FY&C were both able to implement Apricot with collaboration from external, non-staff stakeholder groups and that collaboration still exists today. That in and of itself is a feat.

While the format of these documents has evolved over time and our various divisions and programs format them a bit differently, we believe these types of formal documents are essential. Whether you are collaborating on Apricot with an external provider or partner, or you are securing buy-in from two or more internal programs that share the same Apricot license, it’s a good thing to have these formalized agreements in place.

Our keys for these documents include:

  • A description of program and data system use
  • Identification of data formatting and quality expectations
  • Definitions around security and confidentiality provisions or procedures, especially if data sharing is involved
  • A list of role-based responsibilities, including explicit tasks for the players involved
  • An outline of financial commitments or budgets that support the relationships between the parties
  • A summary of performance expectations or standards for the program

[SLIDE]

The next chapter in our journey with Apricot is closely linked to the initial implementation. Although we were able to quickly configure Apricot for data entry, there was another hurdle to overcome.

[SLIDE]

How do we transition from implementation to steady-state within each program?

This was a daunting challenge. Although we had started this process before initial implementation, including the leg work of relationship building on the people side and configuration on the technical side, getting 200 users to actually use a new system and generating momentum toward steady-state was still ahead of us.

Looking back, the results of this transition were mixed.

While a majority of our programs got going right away, we experienced some obstacles with others. The transition identified a few gaps in our original plans for shared measurement and it also presented a few new opportunities. Upon reflection, these outcomes are common and to be expected. While most of what is planned works out, a change of course may be needed to avoid pitfalls and adjust for new opportunities discovered. For us, this meant two programs were abandoned as part of the shared measurement initiative and other programs were slower to get started.

So, why some hits and some misses?

[SLIDE]

Our conclusions on this point are subjective, but the majority of the difficulty we experienced in this transition was due to changing program assumptions at or near go-live. In most cases, this happened when new assumptions about the program were brought forward, the program was evolving during implementation, or in a few cases the program needed to go in a different direction.

Because Apricot is a custom system, changes to assumptions may require modifications which may delay user adoption and the transition to steady-state.

Other factors included:

  • delays in data migration, which slowed user onboarding because there was no historical data in the system, and
  • too little end user involvement during implementation discovery, which caused some users to be reluctant to use Apricot even after training.

On the whole, our major takeaway is that modifications are part of the game. Continuous improvement based on changing assumptions is something we will explore in more depth later in the webinar. Go-live showed how important it is to remain invested in capacity building for Apricot when the underlying assumptions of your system evolve.

[SLIDE]

To illustrate how things changed for HSD, let’s take a look at an example. Within the first six months after go-live, FY&C referral programs required major modifications to account for new assumptions about the program that surfaced after training.

We assumed that all referral workflows under FY&C would follow a template, to be used by all Service Providers, even across different types of programs. We hoped to standardize the referral process and gain efficiency in Apricot by doing so. The reality was that each referral process included subtle differences from the “template.” The nuances by program hindered our ability to onboard new users. Some users just didn’t take to the template model. We needed to adjust the template for each program independently and we did that by engaging service providers in a redesign process.

Second, we assumed that service providers would provide updates on referrals every 30 days. As we onboarded new users, it quickly became clear that both FY&C and service providers would benefit if service providers could be more involved in “status setting” for referrals, a more real-time collaboration. However, we were “boxed in” by our design because our permission set structure would not allow service providers to edit referrals submitted from FY&C. We wanted service providers to see referrals, but not edit them once received, so they were unable to update a referral’s status.

[SLIDE]

This is an ERD of the before and after, illustrating the change in data structure that was prompted within months after go-live for the first dozen or so FY&C programs.

Because we needed a way for service providers to collaborate on a record without the ability to edit the base referral, we added a new form and adjusted the linking and workflow of the system. This was a major change, but it created the opportunity for FY&C and service providers to use real-time caseload and data quality dashboards. We created a new form called Referral Status and updated the workflow so that Referral Feedback records were created from a Referral Status instead of the document folder. The Referral Status became the home base for service providers. We were stuck without the change, but we gained new capabilities by changing the program assumptions in Apricot.
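As a rough sketch of the structural change (not the actual Apricot schema; all names here are hypothetical), the post-change linking can be pictured like this: a provider-owned Referral Status sits between the FY&C Referral and the Referral Feedback records.

```python
# Hypothetical sketch of the post-change linking, not the actual Apricot schema.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Referral:                  # created by FY&C; providers can view but not edit
    referral_id: str
    family_name: str


@dataclass
class ReferralFeedback:          # goal/outcome feedback entered by the provider
    note: str


@dataclass
class ReferralStatus:            # the new form; the provider's "home base"
    referral_id: str             # links back to the FY&C referral
    status: str                  # e.g. "Pending", "Active", "Closed"
    feedback: List[ReferralFeedback] = field(default_factory=list)


# Providers update the ReferralStatus (and attach feedback) without touching
# the base Referral, which is what the permission structure previously blocked.
```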

[SLIDE]

Reflecting on the need for redesign immediately following implementation offers a handful of takeaways that are worth mentioning here.

  • First, although program leads may lead the charge during implementation discovery, it is important to include some end users in that process as well. Those closest to service delivery can expose assumptions that may impact Apricot’s design.
  • Second, the configuration and go-live may not go as you intend. As users begin to use the system day to day, they will inherently find gaps and new options for the design. Expect feedback that will prompt updates in the first 90 days at minimum.
  • Third, if your program assumptions change and the system needs modifications as a result, explore the opportunity in those modifications. Maintaining an improperly designed system that users are reluctant to adopt could cost more long-term than the short-term investment of making a change.

[SLIDE]

The current chapter of our journey is where we feel we’ve hit our stride.

[SLIDE]

But this phase has also come with unique goals and challenges.

  • How do we build capacity without negatively affecting what’s already in place?
  • How do we prioritize a steady stream of user feedback into developments that make a difference?

HSD wants to grow the vision of a shared data system, but also wants to sustain a positive user experience. In order to grow effectively with Apricot, we’ve needed systems that “keep us in our lane,” so to speak.

[SLIDE]

As we progressed past implementation and deeper into steady-state, we decided that our efforts needed to pass the “capacity building” test.

Is the change we intend to make a net increase in capability or feature set?

Does it enhance the system?

  • If no, we queue the project for later or abandon it.
  • If yes, then we will move forward based on the project’s priority in our queue.

We are cautious not to fall into “rearranging the deck chairs,” so to speak. It is easy to continue tinkering and perfecting the system along the same path. This is especially true of form design in Apricot. Multiple small changes to form configurations likely won’t level up the overall Apricot experience. As a result, we try to prioritize new projects that are a net positive in capability or functionality. We steer clear of net neutral. Building capacity in Apricot is about stacking new and valuable systems on top of one another, piece by piece.

[SLIDE]

To get our Apricot “right,” we treat the experience of each user group as its own “app,” building a set of tools, processes, and people to make the most of Apricot. Currently, we have 18 total programs, so 18 total apps.

On the whole, we think of capacity building of these “apps” in four domain areas. By sticking to these domains, we can focus on maximizing Apricot’s potential.

They are:

  1. Data entry and workflow
  2. Data quality
  3. Performance reporting
  4. User resources

[SLIDE]

For data entry and workflow, we focus on form/link configuration and field selection, form logic, linking fields, the workflow tool, and dynamic dashboards on the My Apricot home screen.

[SLIDE]

Each program “app” has one or multiple dashboards that prompt users to take the next step in the program’s workflow. This is an example of a dashboard for HART showing pending referrals that require intake and active cases. When pending referrals complete intake, they fall off the list and move to Active, which summarizes the active caseload for the program. Dashboards like this streamline navigation, reducing the need for clicks and the Tier 1 search.
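The dashboard itself is configured inside Apricot, but the logic it embodies is a simple status filter. A minimal sketch of that idea, assuming hypothetical fields like intake_complete and closed:

```python
# Minimal sketch of the filter logic behind a dashboard like HART's.
# The real version is configured in Apricot; field names here are hypothetical.

def split_dashboard(referrals):
    """Pending = intake not yet complete; Active = intake done, case still open."""
    pending = [r for r in referrals if not r.get("intake_complete")]
    active = [r for r in referrals if r.get("intake_complete") and not r.get("closed")]
    return pending, active


if __name__ == "__main__":
    records = [
        {"id": 1, "intake_complete": False, "closed": False},
        {"id": 2, "intake_complete": True, "closed": False},
        {"id": 3, "intake_complete": True, "closed": True},
    ]
    pending, active = split_dashboard(records)
    print(len(pending), "pending;", len(active), "active")  # 1 pending; 1 active
```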

[SLIDE]

For data quality, we try to put as much on the users as we can. We place data quality topics in dashboards so users can take action. All of our data quality tools are conditional, meaning they look for errors and clear them from the report when they are resolved.

[SLIDE]

Here is a data quality checklist for Lifelong Connections that highlights potential gaps in seven domains. Because these reports are conditional, they only display errors, which simplifies the process of data quality cleanup and validates whether our data is clean.
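The actual checks are built as Apricot report sections, but “conditional” boils down to each rule returning only the records that currently fail it, so a resolved record drops off the list. A minimal sketch with made-up rules and field names:

```python
# Sketch of a "conditional" data quality check: each rule returns only the
# records that currently fail it, so a fixed record disappears from the report.
# Rules and field names are hypothetical; the real checks are Apricot reports.

RULES = {
    "Missing closure reason": lambda r: r.get("closed") and not r.get("closure_reason"),
    "Missing date of birth": lambda r: not r.get("dob"),
}


def data_quality_report(records):
    return {name: [r["id"] for r in records if failed(r)]
            for name, failed in RULES.items()}


if __name__ == "__main__":
    cases = [
        {"id": 101, "closed": True, "closure_reason": None, "dob": "2010-04-01"},
        {"id": 102, "closed": False, "closure_reason": None, "dob": None},
    ]
    print(data_quality_report(cases))
    # {'Missing closure reason': [101], 'Missing date of birth': [102]}
```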

[SLIDE]

Performance reporting is a large topic. We think about capacity building with reports in two categories. First, each program “app” will have a deck of standard, template reports like quarterly or annual reports. Then we build new reports as ad hoc explorations to learn more about the data.

It’s important to note that we use multiple reporting platforms in the HSD system, and yes, sometimes we take the data to Excel for additional processing.

[SLIDE]

This is an example of a new report format we are using for FY&C contract outcomes. FY&C calls these Quarterly Process Reports or QPRs. FY&C can benchmark service provider performance to contract goals quickly using this type of template report in Apricot. Vickie and I are currently deploying this QPR format across all 13 programs.

[SLIDE]

Lastly, we focus on user resources. This includes data entry guides, workflow descriptions, and workflow diagrams. We place these tools on the home screen for quick access and host them in Google Drive so new versions are automatically available to end users.

[SLIDE]

This is a stack of resources for the CAPS program:

  • a 30-page user guide in Google Docs,
  • a three-page downloadable workflow description, and
  • a one-page printable workflow diagram with definitions.

Although we honor the four technical domains for Apricot capacity building, you may be asking how we generate new ideas and topics for improvement.

[SLIDE]

For all programs, we complete an annual system assessment. We request feedback from users and complete administrator reviews of core functionality.

We gather feedback from users by asking questions about their overall Apricot experience:

  • What is the most pressing challenge you have right now?
  • What is working for you in the system?
  • What are your nice-to-have and need-to-have features?
  • What would you change?
  • What are you tracking in Excel?

I complete site visits once per year (this is now a systematic procedure) with a focus on user feedback – this has been a valuable component of our continuous improvement process. In addition, we have an open door policy for service providers to contact me to give suggestions or let me know if they need something changed or added to their form or report. On an as-needed basis, we provide one-on-one meetings outside of the yearly site visits.

Annual system assessments have prompted re-implementation of every program in HSD’s Apricot system except for two. The two exceptions have undergone various updates to keep them relevant as the program evolves.

In addition to the 2.0 re-implementation, we also make regular ongoing improvements to Apricot. Gathering feedback from users has been helpful in this effort, but incorporating frequent user feedback was initially a challenge.

[SLIDE]

Gathering feedback is one thing, but prioritizing that feedback and deciding what to implement and what to queue for later is something I’ve learned with time. Both Jeff and I started by doing everything service providers requested. This wasn’t efficient or productive, and it created anxiety as we tried to give users everything they wanted while maintaining the integrity of the system.

We have learned to evaluate user feedback more thoughtfully. Sometimes we may need to say “no” to a request. Not all user feedback is implementation-ready, and some requests may not support the integrity of the system.

Administrators need to be selective in the revisions they take on because
a) there is limited time to work on Apricot and
b) even simple changes often end up requiring more work than initially perceived.

[SLIDE]

As of this year, we’ve implemented a new process to support those projects we do select.

We call it the System Change Checklist.

Given the sophistication of the Sonoma County system, we’ve found that any single change impacts many sub-systems: forms impact reports, reports impact permissions, and so on. Once we decide to pursue a project and before we make any change in Apricot, we follow this checklist to ensure the change we do make is comprehensive.

For example, let’s say we want to change a field value in the Closure Reason dropdown on a referral or enrollment, maybe the phrasing of an existing value.

  • Is this a form change? Yes.
  • We develop a specification for the change.
  • Is data migration required? Yes, we develop a data migration specification for the change.
  • Then, update the form and complete data migration.
  • Review which reports include the Closure Reason field for possible filter or formatting mismatches.

[SLIDE]

  • If report updates are required, copy existing reports, develop, and test changes before publishing live.
  • Update workflow diagrams and descriptions, plus user guides if needed.
  • No permission set updates are needed for a field change, but this step would apply if it were a new report.
  • Inform users and admins of change.

Making a subtle change like a field value in a dropdown can impact a number of sub-systems, which all need to be accounted for.
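The checklist is a procedural document rather than code, but the data migration step is easy to picture as a small remapping pass over exported records. A hypothetical sketch (the values and the field name are made up for illustration):

```python
# Hypothetical illustration of the data migration step in the checklist above:
# remap an old Closure Reason value to its new phrasing across exported records.
# The values and the "closure_reason" field name are made up for this example.

OLD_TO_NEW = {"Client refused services": "Client declined services"}


def migrate_closure_reason(records):
    changed = 0
    for record in records:
        old = record.get("closure_reason")
        if old in OLD_TO_NEW:
            record["closure_reason"] = OLD_TO_NEW[old]
            changed += 1
    return changed


if __name__ == "__main__":
    rows = [{"id": 1, "closure_reason": "Client refused services"},
            {"id": 2, "closure_reason": "Goals met"}]
    print(migrate_closure_reason(rows), "record(s) updated")  # 1 record(s) updated
```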

[SLIDE]

With the ongoing process of continuous improvement that we are committed to, I can’t overemphasize the importance of leadership buy-in for Apricot. Sonoma County leadership has given Apricot admins the autonomy to facilitate the processes described in this webinar. They don’t participate in the day-to-day operation of the system. They are involved in significant decision making when prompted by Apricot admins.

However, the type of autonomy required to make this relationship work requires investment. Leadership buy-in with a hands-off approach means allowing administrators to invest their time, trusting that admins will prioritize Apricot tasks appropriately and keep momentum building for the platform. It also means that leadership is willing to make new investments in Apricot, like new Apricot features, more user seats, and consulting support.

Leadership must be realistic in what Apricot is to the organization and its value in decision making. We at Sonoma County are seeing the investment in Apricot pay off. As we transition to data-based decision making, we are able to spot performance gaps before they become larger issues. Our Apricot system is highlighting areas where we might save on cost and deploy more effective resources in our community.

Our vision for shared measurement is in motion. The patience and perseverance to stick with the vision through growing pains has been a key to our success. That could not be done without leadership setting clear expectations and providing Apricot administrators the flexibility to do what they do best, which is to manage the system.

[SLIDE]

We want to wrap our webinar with an observation.

The HSD Apricot system is getting simpler.

That seems counterintuitive, especially given the focus on capacity building. As we increase the sophistication and capability of our reporting systems, the database architecture and back-end are getting simpler, a trend that has become more apparent in our recent 2.0 re-implementation projects.

Vickie and I are calling this “getting closer to the source,” where the source is the underlying assumptions of the program that is using Apricot. Much of what we thought was important years ago during initial implementation has been pruned back. Unused features, abandoned systems, and better approaches to use cases have led to a leaner, more vibrant Apricot platform. No doubt hindsight is 20/20, but much of what we thought was important at initial implementation isn’t important long-term, which is why we find continuous improvement in Apricot so important.

To put this into practice, we’ve adopted an implementation principle called “baseline design plus user pull.” This means that we develop a baseline system first and, during initial use of the system, allow users to pull new use cases, features, and updates. This ensures that the system requires fewer modifications long-term by prioritizing a foundational system and then building new features as user urgency dictates.

This is not a new concept and closely resembles lean methodologies in software development. Nowadays, we are much more inclined to trim back rather than add, by asking: is this necessary?

When Vickie and I meet to review priorities for HSD’s Apricot, we often step back to reflect on the many steps it has taken to get to where the system is today. Building capacity is a one-step-at-a-time process, a journey, that over time refines and grows your Apricot. It is a steady march toward your vision for the system. That’s our version of capacity building in a nutshell and is guiding growth of HSD’s Apricot into the next phase of the journey.

[SLIDE]

Thank you for attending today’s webinar and exploring this case study of Sonoma County HSD. Vickie and I are thrilled that you could join us.

Shortly, we will open up for questions. Before that, a few housekeeping items.

As a reminder, a recording of today’s webinar and the slides will be made available to you following today’s session. Please be on the lookout for an email with those resources.

[SLIDE]

If you enjoyed today’s presentation and our approach to Apricot software, let’s find some time to connect. Vickie’s and my doors are open. We’d love to learn more about your organization and your Apricot. Feel free to send us an email.

Thank you for attending today.

Let’s open our time up for questions.
