Detroit Vacancy Rates: A Guide to Conscious Data Consumption

Blog post by Stephanie Quesnelle
June 2017

Data Driven Detroit (D3)

17.8%, 4.89%, 25.9%, 16.5%, 13.3%, 21.9%, 18.7%, 14.2%, 2.6%, 15%, 22.4%, 9.4%.

What do all of these numbers have in common?

They’ve all been used in the past year to describe the vacancy rate in Detroit. Some represent different parts of the city; others represent different types of buildings (residential vs. commercial), different types of tenants (renters vs. owners), or different data sources. Declining vacancy rates are touted as a mark of Detroit’s comeback, but they’re surprisingly hard to measure!

17.8%: Metro Detroit office vacancy rate (Newmark Grubb Knight Frank)
4.89%: Detroit rental vacancy rate (Department of Numbers)
25.9%: Detroit business vacancy rate (Drawing Detroit)
16.5%: Metro Detroit office vacancy rate, 2017 (DBusiness)
13.3%: Downtown Detroit office vacancy rate (Detroit News)
21.9%: Detroit vacancy rate (Drawing Detroit)
18.7%: Metro Detroit office vacancy rate, 2016 (DBusiness)
14.2%: Detroit Central Business District office vacancy rate (Newmark Grubb Knight Frank)
2.6%: Metro Detroit regional apartment vacancy rate (Crain’s Detroit)
15%: Detroit Central Business District office vacancy rate (DBusiness)
22.4%: Detroit residential vacancy rate (Drawing Detroit)
9.4%: Prominent downtown office space vacancy rate (Crain’s Detroit)

Although these numbers measure very different things, with very different methodologies, it would be easy to consume the data, turn to a friend, and say “Oh, Detroit’s vacancy rate is 26%” or “Detroit’s vacancy rate is 2.6%.” That’s why it’s important to be a responsible consumer of data.

When consuming or reporting data, it’s important to keep five things in mind about the number:

· Scope
· Geography
· Availability
· Scale
· Source/Methods

Scope

The scope of the data refers to the magnitude of the data collection. For example, if data come from a survey, like the Crain’s Detroit prominent office building vacancy rate, it’s important to pay attention to who was surveyed and what the response rate was. Smaller samples make data less reliable, as the sketch below illustrates.
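To see why sample size matters, here’s a minimal sketch in Python of the margin of error around a survey-based vacancy rate. The rate and sample sizes are invented for illustration; they don’t come from any of the sources above.

```python
import math

def margin_of_error(rate, n, z=1.96):
    """Approximate 95% margin of error for a proportion
    estimated from a simple random sample of size n."""
    return z * math.sqrt(rate * (1 - rate) / n)

rate = 0.094  # a hypothetical 9.4% vacancy rate

# The same headline rate is far less certain when it rests on
# 50 responses than on 5,000.
for n in (50, 500, 5000):
    print(f"n={n:>5}: {rate:.1%} +/- {margin_of_error(rate, n):.1%}")
```

With 50 responses, the 95% window spans roughly 1% to 18%; with 5,000, it narrows to under a percentage point either way.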

Another important aspect of scope is how long the data took to be collected. This is especially true when reporting with American Community Survey data. Each year, data are released for less-populated areas as 5-year average estimates and for more-populated areas as 1-year estimates. We wrote an informative blog post about the ACS methodology when the 3-year estimates were eliminated; one takeaway is that you can’t compare 1-year and 5-year ACS estimates with each other.

Geography

Paying attention to the geographic boundaries that the data represent is also important. In our list of vacancy rate data, we have prominent downtown office space, the Central Business District, downtown, the city of Detroit, and the Metro Detroit region. Each of these numbers contributes to a different narrative, so we have to be careful not to over-generalize a data point. A number for downtown Detroit can’t be assumed to represent all of Detroit, let alone the Metro Detroit region.

Availability

The first consideration around availability should be “are these data available for the time period that makes the most sense?” We still see people citing a literacy rate number from 1998, which we debunked back in 2011. Detroit is changing at a rapid pace, so ensuring that the numbers we cite are recent is important for capturing an accurate picture.

The second consideration is whether or not the data and methodology are publicly available. It’s good to have access to the data so you can run the numbers yourself and confirm that the aggregations are correct.
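When the underlying counts are published, that check can be a few lines of Python. The figures below are hypothetical, not real Detroit counts:

```python
# Hypothetical figures from a published report.
published_rate = 0.224    # the report claims a 22.4% vacancy rate
vacant_addresses = 56_000
total_addresses = 250_000

# Recompute the rate from the raw counts and compare.
recomputed = vacant_addresses / total_addresses
print(f"recomputed rate: {recomputed:.1%}")

# Flag a mismatch between the claim and the underlying counts.
assert abs(recomputed - published_rate) < 0.005, "aggregation doesn't match"
```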

Scale

Data are collected and reported at different levels of granularity. For example, the U.S. Postal Service data used to calculate Drawing Detroit’s statistics are collected at the address level, which is different from the building-level data collected by Jones Lang LaSalle, which in turn is different from the city-level data reported in the 2015 American Community Survey.
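Granularity changes what a “vacancy rate” even means. Here’s a toy sketch in Python, with invented numbers, showing how the same units produce different rates depending on whether you count addresses or whole buildings:

```python
# Toy data: buildings mapped to address-level vacancy flags
# (True = vacant). These numbers are invented for illustration.
buildings = {
    "A": [True, False, False, False, False],   # 1 of 5 units vacant
    "B": [False, False],                       # fully occupied
    "C": [True, True, True],                   # fully vacant
}

# Address-level rate: share of all addresses that are vacant.
flags = [f for units in buildings.values() for f in units]
address_rate = sum(flags) / len(flags)

# Building-level rate: share of buildings with any vacancy.
building_rate = sum(any(units) for units in buildings.values()) / len(buildings)

print(f"address-level:  {address_rate:.0%}")   # 40%
print(f"building-level: {building_rate:.0%}")  # 67%
```

Neither number is wrong; they answer different questions, which is why the level of aggregation should always travel with the statistic.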

Source/Methods

Understanding methodology encompasses all of the above considerations, but we’ll call it out specifically here as an understanding of how the data were obtained. For example, USPS vacancy data come from individual postal workers who flag addresses that appear vacant. For the Jones Lang LaSalle data, building owners reported vacancy rates and rents for their own buildings. Self-reporting can bias the data, because the person reporting usually has an incentive for the data to say something specific.

---

All of these factors affect the potential biases of the data itself. Every number that’s reported relies on people who bring their own set of opinions to everything from survey writing and administration to reporting and interpretation. Being conscious of the many challenges in measuring difficult-to-define aspects of our community is really important: it helps us be better consumers of information as we watch the news, read a magazine, or interact with people on social media.