Hello, I am one of the founders of covidwa.com, a volunteer-based project that searches for vaccine availability. The project has moved very fast, with ups and downs, and it is too early to reflect on the full story, but one question we are often asked is how to replicate the effort in another state. This is a time-sensitive question, almost too late to ask, so let me share some learnings that might help you bootstrap the effort.
Start simple and launch within a couple of days.
We used Airtable, which is like a Google Sheets spreadsheet but better suited to embedding on the web. On one hand, our then-small team could enter all the clinics into a table as easily as in Google Sheets. On the other, an Airtable view can be embedded into a website, so you really don’t need to code anything.
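For illustration, embedding a shared Airtable view is a single iframe; the share ID below is a placeholder you would copy from Airtable’s “Embed this view” dialog, not a real base:

```html
<!-- Embed a shared Airtable view; replace shrXXXXXXXXXXXXXX with your
     own share id from Airtable's "Share view" / embed dialog. -->
<iframe
  class="airtable-embed"
  src="https://airtable.com/embed/shrXXXXXXXXXXXXXX"
  frameborder="0"
  width="100%"
  height="533"
  style="background: transparent; border: 1px solid #ccc;">
</iframe>
```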
For the website, take whatever is simple and allows custom HTML. A paid plan on Weebly, Wix, or similar will do. We used the Jekyll framework. As I said, you can put together a decent website with an Airtable on it quite quickly, without coding.
Keep the data structure minimal
The world is complex. Agree (in writing) on how you will model the information in your table. For each clinic we came up with four availability statuses (Yes, No, Not covered, Possible) and a separate “Restrictions” column. We used the “Possible” status when the website required authentication, suggested making a phone call, or we otherwise could not tell yes from no. The “Restrictions” column specified whether a clinic only served its local community or had a similar constraint. We did not replicate general eligibility criteria.
You need data
Data is what you need to figure out. Strive for coverage and simplicity. We leveraged several approaches: manual checks (especially for small clinics whose sites just said “no vaccine available, check in a week”), visualping.io, and homemade scrapers.
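A homemade scraper can be as crude as fetching a clinic page and looking for known phrases. The marker phrases below are illustrative; every site needs its own:

```python
# Phrases that reliably signal a status on a given clinic's page.
# These are examples; each scraped site needs its own markers.
NO_MARKERS = ("no vaccine available", "currently out of vaccine")
YES_MARKERS = ("schedule your appointment", "appointments available")

def parse_status(page_html: str) -> str:
    """Map raw page text to one of our availability statuses."""
    text = page_html.lower()
    if any(marker in text for marker in NO_MARKERS):
        return "No"
    if any(marker in text for marker in YES_MARKERS):
        return "Yes"
    # Can't tell from the page alone (login wall, phone-only, etc.)
    return "Possible"
```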
The key win in our original design was a simple API that allowed other people to plug their scrapers in very fast: we would issue a key for a scraper, and the scraper would report availability under that key. The scraper author could implement it in any language and run it in any environment. Eventually we consolidated the scrapers into a single system, but for a long time we had scrapers running as bash scripts on our own laptops.
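The reporting side of such a scraper needs nothing beyond an HTTP POST. The endpoint and payload below are hypothetical (our actual API differed), but they show the shape of the key-based design:

```python
import json
from urllib import request

API_URL = "https://example.org/api/v1/report"  # hypothetical endpoint

def build_report(api_key: str, status: str) -> bytes:
    """Serialize one availability report; the key identifies which
    clinic/scraper this report belongs to."""
    return json.dumps({"key": api_key, "status": status}).encode("utf-8")

def send_report(api_key: str, status: str) -> int:
    """POST the report to the aggregator; returns the HTTP status code."""
    req = request.Request(
        API_URL,
        data=build_report(api_key, status),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status
```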
Build a team
Plan what volunteers you need:
- People to monitor new clinics and add their info
- People to check availability for existing clinics
- Software developers to build scrapers and automate
- Business people to build relationships with similar efforts and share data. Without the help of others you won’t survive
- People for community and media outreach
- UX and Web design
- Customer support
Also keep your door open to other kinds of help. You never know: we had people suggest things we could not have thought of ourselves, for example ad credits. At the same time, make sure you build structure and a good onboarding experience before you let many people in. Volunteers should be given the means to become effective.
Strive for asynchronous communication, which means documenting critical agreements and how things need to be done (when it is important). Establish an open culture (everybody can see everything).
You can only help people if you are pretty much THE aggregator, not one of many. If there is already a project in your state, join it. If not, make yourself visible as soon as possible. Launch once you have at least some value to offer. Start promoting early. Get people to join you and drive the good.
While our tech stack is now opinionated (Postgres, Node.js, Python, Golang, AWS Lambda, Heroku, Netlify, to name a few), the start was as simple as the following:
- Static website with an embedded Airtable
- Several scrapers in bash and other languages
- Simple backend on Express (you can start even without it)
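Our backend was on Express, but the core idea fits in a page regardless of language: one POST endpoint and a key-keyed status store. Purely as a self-contained illustration, here is that idea sketched with Python’s standard library (keys and route are made up):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STATUSES = {}                # scraper API key -> last reported status
VALID_KEYS = {"demo-key"}    # in reality, issued per scraper

def apply_report(store, valid_keys, body: bytes) -> int:
    """Validate and apply one scraper report; returns an HTTP status code."""
    try:
        report = json.loads(body)
        key, status = report["key"], report["status"]
    except (ValueError, KeyError, TypeError):
        return 400           # malformed JSON or missing fields
    if key not in valid_keys:
        return 403           # unknown scraper key
    store[key] = status
    return 200

class ReportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        code = apply_report(STATUSES, VALID_KEYS, self.rfile.read(length))
        self.send_response(code)
        self.end_headers()

# To run: HTTPServer(("", 8080), ReportHandler).serve_forever()
```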