Visit the analytics dashboard here. Please do not share this link with non-members.
You can filter data by author and by title using the filter bar at the top of the dashboard.
Here are the key metrics we've been able to provide, and how to think about them:
- Reads. This is our best guess at the number of times your article was read by a person. It's mostly a measure of popularity, not impact. As a benchmark, our average article currently gets about 1,000 reads.
- Avg Mins Read and Avg Seconds Read. Together, these two columns represent the average amount of time people spend with your article (at some point we'll combine them into one HH:MM column, but SQL is lame). We view this mostly as a measure of impact/quality. As a benchmark, our average article is read for about 1 minute.
- Survey data (q1/a1 ... q10/a10). As you all know, we've been collecting survey data via our article page chat-bot. We're still figuring out what data to collect, what it means, and how to think about it. For now, treat survey data as a way to judge your articles against each other and a directional indicator of quality, but not as a concrete metric (that said, feel free to put these numbers on your CV).
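If you want the two time columns as one readable value before we fix the dashboard, you can merge them yourself. A minimal sketch, assuming "Avg Mins Read" holds whole minutes and "Avg Seconds Read" holds the leftover seconds (0-59) — check your export to confirm that's how your numbers are split:

```python
def fmt_read_time(avg_mins, avg_secs):
    """Merge the dashboard's two time columns into one M:SS string.

    Assumes 'Avg Mins Read' is whole minutes and 'Avg Seconds Read'
    is the remaining seconds (0-59); both column meanings are
    assumptions about the export, not guaranteed by the dashboard.
    """
    return f"{int(avg_mins)}:{int(avg_secs):02d}"

print(fmt_read_time(1, 5))   # -> 1:05
```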
There are a few considerations to keep in mind when working with our article data:
- Data covers site activity on MassiveSci.com only. Syndicated article data is not included in this dashboard, and usually cannot be provided.
- Data refreshes approximately every 2 hours. As such, it's not a good resource for real-time data. We recommend waiting at least 7 days after publication before putting article data on your CV, as articles tend to continue to be read and responded to during the week of publication, especially when we send out our weekly newsletter on Friday.
- If you don't see a complete list of articles when you filter by your name, visit your profile and search by title instead. Due to incomplete data, not all articles published before March 9, 2018 have author data.
- Article data starts on June 1, 2017.
- Survey data starts March 9, 2018, so take response data with a grain of salt for articles published before that date.
- Your article may not accrue responses for all survey questions. We ask 3 questions at a time and rotate them randomly, so it can take many responses before every question has collected data.
- We are currently testing different survey questions to see which ones most accurately reflect the impact of the piece. At the moment, we cannot say for sure that any question corresponds to reader behavior. However, we think the survey questions are directionally useful for understanding how audiences responded to your piece.
- If we change an article's title after publication, it may occasionally appear twice in results.
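On the question rotation above: since each response answers only 3 of the 10 questions (q1 through q10), getting data on every question is a coupon-collector problem. A rough simulation, with the question count and rotation size taken from the notes above, illustrates why coverage takes a while:

```python
import random

def responses_until_all_covered(num_questions=10, per_response=3,
                                trials=10000, seed=1):
    """Estimate the average number of survey responses needed before
    every question has at least one answer, when each response covers
    a random subset of `per_response` distinct questions.

    The 10-question / 3-at-a-time setup mirrors the survey described
    above; the uniform-random rotation is an assumption.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen = set()
        count = 0
        while len(seen) < num_questions:
            # One response answers `per_response` distinct questions.
            seen.update(rng.sample(range(num_questions), per_response))
            count += 1
        total += count
    return total / trials

print(round(responses_until_all_covered(), 1))
```

Under these assumptions the average lands well above the theoretical minimum of four responses, which is why sparsely-read articles often show gaps in some q/a columns.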
We plan to correct these issues over time, and will update this page accordingly.
Last updated March 14, 2018