Stop it: ESP deliverability rankings mean nothing


How many different ESPs or marketing automation platforms have you used to send significant amounts of mail? I'm up to eight, more or less. I've logged into others, too, but these are the ones that I've used to send "significant" amounts of mail. They are:
  1. Amazon SES
  2. AWeber
  3. Constant Contact
  4. Mailchimp
  5. Oracle Marketing Cloud (Responsys)
  6. Pardot
  7. Salesforce Marketing Cloud (ExactTarget)
  8. Sendy
I think Wombatmail might count as number nine, since, until very recently, I used my own Postfix MTA to deliver my newsletter and discussion list mail.

Anyway, what's my point? My point is, I've gotten mail delivered to the inbox, reliably, with all nine of these platforms. They can all get you 100% inbox delivery, if you do things right. From STRICTLY the perspective of "can they deliver the mail," they're all good. Each of them gives its clients, with varying levels of bells and whistles, the ability to be a good sender and enjoy deliverability success. I should know. I've tried them all.

So if/when you run into somebody saying, "Hey, I rank inbox percentages from different ESPs and some are better than others," be skeptical. Be very, very skeptical. Because whatever they think they're measuring, they're not actually measuring it.

That's not to say that one platform isn't better than another. They all have different features and capabilities. Vastly different user experiences. Different methods and depths of segmentation. Different content creation tools. Different integrations with other platforms and data sources. Different IP strategy calculations. Different prices. Different target markets. Some have better tools for marketing and sales tracking. Some are great for newsletters. Others are very strong on technical integrations and API automation. And so much more.

When you test one platform head-to-head against another and look at inbox placement results, there are other variables involved beyond the platform itself. Any inbox placement issues are a result of something being up with those other variables. Stuff like:
  - History of your domain on that platform
  - Newness of sending from that IP address
  - Different subdomains
  - Differing IP reputation between your IPs on the different platforms
  - Different content being sent from each platform
  - A greater history of spam complaints for sends from one platform versus the other
  - Differences in authentication configuration between the platforms
  - And a bunch of other stuff that all generally points back to the sender and the sender's reputation history
NONE of these amount to "Platform X is better than Platform Y when it comes to inbox delivery."

There are exceptions to this, but they are few and far between, and they're usually due to some sort of downtime or system issue at a given platform.

That's not to say that a platform doesn't have some requirements, some things it has to be doing, to help get your mail to the inbox. The platform is a part of it. But most of the modern platforms have their act together. They support feedback loops, one-click unsubscribe, double opt-in, SPF, and DKIM. If they don't accidentally send the wrong email to the wrong people or let you mail addresses that have unsubscribed, you're probably going to be fine.
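(For the curious: "one-click unsubscribe" here means the RFC 8058 header pair, which your platform normally generates for you. Here's a minimal sketch of what those headers look like on an outbound message, using Python's standard email library; the domains and the unsubscribe URL are placeholders, not anything from a real platform.)

```python
# Minimal sketch of RFC 8058 one-click unsubscribe headers on an outbound
# message. The addresses and the unsub.example.com endpoint are hypothetical;
# a real ESP generates and hosts these links for each recipient.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "news@mail.example.com"
msg["To"] = "subscriber@example.org"
msg["Subject"] = "March newsletter"
# Both headers are required for mailbox providers to honor one-click unsubscribe.
msg["List-Unsubscribe"] = "<https://unsub.example.com/u/abc123>, <mailto:unsub@example.com>"
msg["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"
msg.set_content("Hello! Here's this month's newsletter.")

print(msg)
```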

Occasionally, I'll run into a platform whose default domain authentication settings result in spam folder placement. The fix in those cases is inevitably to fully and properly implement DKIM authentication with your own domain, so that mailbox providers see you as YOU, not just as Random Platform Customer #732. DKIM authentication, with proper alignment, using your own From domain, is a best practice nowadays. I don't think it's the fault of the ESP or marketing automation platform if you have inbox placement issues before you've fully set up your domain authentication.
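(If "alignment" is fuzzy: it just means the d= domain in the DKIM signature matches, or shares an organizational domain with, your From domain. Here's a deliberately simplified sketch of that comparison; real relaxed alignment uses the Public Suffix List to find the organizational domain, which this toy version skips, and all the domains are made up.)

```python
# Simplified illustration of DKIM "relaxed" alignment: the DKIM d= domain must
# match the From: domain, or at least share its organizational domain.
# Real checks use the Public Suffix List; this sketch just compares the last
# two DNS labels, which is enough to show the idea.

def org_domain(domain: str) -> str:
    """Naive organizational domain: the last two DNS labels."""
    return ".".join(domain.lower().rstrip(".").split(".")[-2:])

def dkim_aligned(from_domain: str, dkim_d: str) -> bool:
    return org_domain(from_domain) == org_domain(dkim_d)

# Signed with the platform's default domain: NOT aligned with your From domain.
print(dkim_aligned("newsletter.example.com", "mailplatform.net"))  # False
# Signed with your own domain (or a subdomain of it): aligned.
print(dkim_aligned("newsletter.example.com", "dkim.example.com"))  # True
```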

Yes, sometimes a blocklist will list an entire ESP, or Microsoft might block the shared IP pool of some newsletter platform, but who's dealing with this issue or that one changes over time. The platforms tend to work hard to resolve those issues as best they can, and different days bring different challenges. A snapshot view of inbox placement doesn't drive good platform recommendations, because today's issue could (and often does) get resolved tomorrow. Or the "better" platform today could have to deal with one of those issues itself, tomorrow. It happens. It definitely happens.

There are even deliverability differentiators between different platforms; this just isn't the way to measure that. Some platforms are better than others at IP/domain strategy, some might only support dedicated or shared IP (and not both), some might allow for easier setup of DMARC or BIMI than others (or maybe even include a default DMARC record with settings that I would never recommend nowadays), but that's not the point.
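(On that last parenthetical: it's worth actually looking at what DMARC policy is published for your domain, since a platform-supplied default may not be what you'd choose. A quick sketch below, assuming the dnspython package is installed; example.com is a placeholder for your own sending domain.)

```python
# Minimal sketch for eyeballing the DMARC policy actually published for a
# domain -- handy for spotting a default record a platform may have set up
# on your behalf. Assumes dnspython is installed; example.com is a placeholder.
import dns.resolver

def dmarc_record(domain):
    """Return the DMARC TXT record for a domain, or None if there isn't one."""
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        txt = b"".join(rdata.strings).decode()
        if txt.lower().startswith("v=dmarc1"):
            return txt
    return None

print(dmarc_record("example.com"))
# e.g. 'v=DMARC1; p=none; rua=mailto:reports@example.com'
```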

Who really cares about this simplistic kind of "platform to inbox" measurement, anyway?

Historically, a certain type of questionable sender -- the "grey hats," if you will -- often skipped implementing their own authentication and would just try to ride the reputational coattails of the default domains in use by a given platform. And some of these folks loved to use multiple email sending platforms at a time. Send from ESP 1 on Tuesday, ESP 2 on Wednesday, or send to Yahoo from ESP 3 and to Comcast from ESP 4. That worked in 2012, and might still have worked in 2020, I think, but mailbox providers (Gmail, in particular) have since wised up to a lot of sender tricks, and the modern Gmail and Yahoo sender requirements generally reflect this. By requiring senders over a certain size to have proper, full authentication (with alignment) in place, they're saying that it no longer works to just sling as much mail as possible on the reputation of those shared, default domains.

So if you're THAT kind of sender, maybe you care about the differences in default domain reputation between platforms. As I mentioned, there is some variance there that I've observed. But for the rest of us, sending as ourselves, fully authenticating, and following permission best practices? Meh, don't worry about it.

(And a point of clarification: NO, I don't think inbox testing is bad or useless. It just has limitations and this is one of them. Like open tracking, I still recommend doing it, and reading the results directionally to observe trends over time, but don't trust it to tell you more than it can actually tell you reliably. YMMV.)