AppEsteem Blog

They say you gave informed consent (but did you really?)

We've recently noticed that some bundlers are helping apps with sneaky (and insufficient) ways to obtain what they claim is informed user consent.

Here's an example. Let's say you're installing a game on your PC. Once the game is installed, you get an offer for a free super cool web browser. In this case, the game installer is considered a "bundler," and if you do install that web browser, the browser's vendor will pay the game vendor.

Now imagine that, in the fine print, the offer claims that your acceptance means you agree to set the super cool web browser as your default browser. Here's a screenshot of just such an offer for Opera, shown while the consumer was installing Recuva:

The bundler may claim that you gave informed consent to change your default browser setting, but they're wrong. That's because only the app that needs your informed consent may obtain it. In the above example, Opera (and not the Recuva installer) needs to obtain informed user consent to set itself as your default browser.

It isn't hard for Opera to obtain your informed consent. They can ask you after they're installed if you want them to be your default web browser, or they can ask you during their install (instead of installing silently).

So hopefully it's clear that bundlers cannot obtain your informed consent for anything, including changing default programs or search, lowering your security posture, or transmitting sensitive information about you. If we see a bundler trying to trick consumers into thinking that they gave their informed consent, we'll take the following steps:

  1. We'll call out the bundler as an active Deceptor for failing ACR-014, which requires that an app "Is truthful and not misleading or confusing with the intent to deceive; can be substantiated; is not unfair."
  2. We'll call out any third-party ad network involved (in this case, Rise was making the offer); ad networks know better than to try to trick consumers into granting insufficient consent.
  3. We'll call out the app that claims other apps and websites obtained informed user consent on its behalf. It's the app's job to obtain informed user consent, not to ask another program or website to get it on its behalf.

 

 

Not yours! Consumer-linkable information belongs to the consumer.

Imagine you walk into my bookshop. As you peruse my books, I take note of where you pause and what you pick up. I ask you questions about what you're looking for, and I make suggestions. I'm hoping that the more I get to know you, the better I can serve you, and the better chance that you'll become a regular.

My imaginary bookshop is small, and my landlord offered me a free security system. It uses video cameras focused on my aisles, and it sends the video feed to the cloud, alerting me when it thinks somebody is stealing a book.

Now imagine that once you leave my bookshop, I take what I've learned about you and turn it into extra money for me: I sell your wish list to others, and I get advertisers to pay me to target you based on your book interests. Also, that free security system lets my landlord do the same, based on what else it learned about you while analyzing the security tapes.

My imaginary bookshop sounds both big brother-ish and unfair to my customers. Fortunately, it would be hard for me to run this kind of bookshop, because I'd have to put a big notice on my door that says something like this: "Attention: your entry into this bookshop is your consent to being tracked and targeted for future advertising, both by me and by my landlord, who in return has provided me a free security system. Scan here to see the full privacy policy."

My guess is that if I put that kind of disclosure on my imaginary bookshop's door, I wouldn't get many customers coming through it. My bookshop would be rather empty, because although customers are happy when I use data linked to them to give them a better shopping experience, they would not be happy if I used their information to monetize, trade, or sell. The reason? Any consumer-linkable information I collect belongs to the consumer, not to me.

This makes sense in the physical world, but somehow the Internet doesn't work this way. It should, but we're not there yet.

Somehow the Internet has convinced us that the price of better searches and entertainment only comes when others are allowed to trade, sell, or monetize the information that can be linked back to us. That's not fair, because your non-public information belongs to you, and, except for well-regulated and limited scenarios, nobody else should have the right to sell it, or use it to make money, or permit others to gather it from you.

In what kinds of scenarios can making a business out of consumer-linkable information be acceptable? Two examples come to mind: medical information and credit scores. Unlike the Internet, both medical information and credit scores involve well-regulated industries that have been built up around controlling how this information is protected, limited in its usage, and used appropriately. Strong consumer-focused laws make it clear that you own your data, and they provide protection against misuse. Your consent must be obtained, and only for specific use cases. We think these examples provide good models for how consumer-linkable information collected on the Internet should be handled.

Just because a company knows something personal about you, they don't have the right to sell that information, make money from it, or trade it to somebody else. And they don't have the right to let others collect more linkable information about you. If a company does any of this, especially online while you visit its websites or use its apps, that company is a polluter, and they should be stopped.

Here are just a few examples of how a website or app can pollute your internet experience and take advantage of your linkable information:

  • Google and Bing give you free searches in exchange for learning what you search for and click on, then use your information to charge advertisers to target you while you search.
  • Taboola and Outbrain pay more ad revenue to news websites that let them track and target you based on what they learn about your interests.
  • Google gives tens of millions of websites free analytics in exchange for collecting data on what you do on those websites.
  • Meta, Google, Microsoft, Yahoo, and so many other ad networks have huge businesses based on them selling advertisers access to your information so you can be better targeted.

All of the above examples are unacceptable, because they're all trading in something that doesn't belong to them: information linkable back to you. Only polluters aggregate other people's linkable information and try to turn it into money. Only polluters claim that you gave them consent to resell and repackage and build businesses around your linkable information.

Meanwhile, the privacy debate swirling around various governments seems to be focused on how consumers can request to see what linkable information companies are exploiting, and whether they can request to be forgotten. There is talk about "do not sell" flags in browsers that websites and ad networks can voluntarily consume, but without any teeth behind them. We think these initiatives are missing the basic point: no company should consider consumer-linkable information theirs to resell, to trade, or to monetize outside of providing better and relevant first-party experiences directly back to the consumer.

The BigTech companies who grow their businesses by exploiting consumer-linkable information have been successful in fending off regulation of how they purchase, aggregate, use, and monetize this information. This needs to change.

This is why we've added two new polluter indicators, focused on consumer-linkable information, to our list. AP-10 says companies can collect and use consumer-linkable information to improve their own direct services, but they can't sell it, monetize it, or use it to improve third-party services. AP-11 says companies can't let others collect consumer-linkable information on their sites or in their apps.

It's time to call out this exploitative behavior for what it is: internet pollution. It's time for it to stop.

 

ACR-013 is coming soon - are you ready?

Starting on April 1, when we find apps that violate ACR-013, we'll list them as active Deceptors. If you're an app vendor who makes offers to install additional software, or you're a third-party offer provider, this blog post is for you.

We think that app monetization through making offers for other apps is legitimate, and it should be allowed. That being said, we've identified a key scenario in which we believe it's almost impossible for consumers to distinguish accepting another offer from the work they're already performing. ACR-013 will be violated when certain kinds of offers are made during this key scenario.

 

ACR-013 only targets silent-install offers for unrelated software that interrupt committed user acquisition workflows without first obtaining consent. But that's quite a mouthful, so you can use this rubric to unpack it and see if ACR-013 applies to your offer:

 

  1. Was the offer presented during a committed user workflow, such as while the consumer was answering the install questions? If not, then ACR-013 doesn't apply to your offer.
  2. Was the offer presented in a way that required the consumer to answer or wait before continuing? If not, then ACR-013 doesn't apply to your offer.
  3. Will acceptance of the offer result in the silent installation of another app? If not, then ACR-013 doesn't apply to your offer.
  4. Is the offered software related or essential to your app? If it is, then ACR-013 doesn't apply to your offer.
  5. Immediately prior to making the offer, did you obtain explicit consent to do so? If you did, then ACR-013 doesn't apply to your offer.  (A big thank you to Keren and Itay at CSA for helping us understand the importance of this scenario -- we added this after their feedback)

 

Restating this with references to the rubric: ACR-013 only targets silent-install offers (#3) for unrelated software (#4) that interrupt (#2) committed user acquisition workflows (#1) without first obtaining consent (#5).
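
If it helps to think of the rubric as a decision procedure, here's a minimal sketch in Python. It's purely illustrative: the Offer fields and function name are our own shorthand for the five questions above, not anything defined by the ACR itself.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    """Illustrative description of an offer; the field names are ours, not part of the ACR."""
    in_committed_workflow: bool    # 1. shown during a committed user workflow (e.g., install questions)?
    blocks_workflow: bool          # 2. consumer must answer or wait to continue?
    silently_installs: bool        # 3. acceptance silently installs another app?
    related_or_essential: bool     # 4. offered software is related or essential to your app?
    prior_explicit_consent: bool   # 5. explicit consent to show the offer obtained immediately before?

def violates_acr013(offer: Offer) -> bool:
    """ACR-013 applies only when every rubric question goes the 'wrong' way."""
    return (
        offer.in_committed_workflow          # rubric #1
        and offer.blocks_workflow            # rubric #2
        and offer.silently_installs          # rubric #3
        and not offer.related_or_essential   # rubric #4
        and not offer.prior_explicit_consent # rubric #5
    )

# Example: a blocking, mid-install offer that silently installs unrelated software
# with no prior consent trips the rubric; flipping any one answer clears it.
assert violates_acr013(Offer(True, True, True, False, False))
assert not violates_acr013(Offer(True, True, True, False, True))
```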


So what to do if you think your offer is violating ACR-013? We suggest the following remedies (this isn't an exhaustive list, but it may spark your own creativity):

  1. Consider moving your offer out of the committed user workflow. For instance, if the offer was inside the install, you could place it at the end, after the consumer knows the installation is complete.
  2. Consider making your offer without requiring a response from the user to continue the workflow.
  3. Upon acceptance, consider not silently installing the app; instead, show a landing page or launch an interactive install.
  4. Consider showing how the offered software is essential or related to your app.
  5. Consider obtaining explicit consent for showing the offer. 

In our studies of the software monetization industry, we've identified many application vendors whose software offers already comply with ACR-013. But because we understand that this change may be disruptive to some application vendors, and also to some third-party offer providers, we've spent several months informing the industry, answering their questions, listening to the AVs, and improving the ACR to be sure we get it right. We "froze" the wording of ACR-013 in January, and since then we have been working on spreading the word throughout the industry.

We're grateful for the support we've received from CleanApps.org for hosting a webinar, and from CSA for distributing it to their constituents, collecting feedback, and holding a roundtable discussion.

Looking forward, we're confident that this ACR will significantly reduce the consumer dissatisfaction that comes from realizing that additional software was inadvertently installed, while preserving app vendors' ability to monetize through offers.

 

2022 Review, and What's Coming Next

2022 was a busy year for AppEsteem. Here's what we accomplished:

  1. We moved forward with two new ACRs that affect the bundler industry: ACR-013 (which prevents interrupting install, uninstall, or conversion workflows with unconsented offers), and ACR-060 (which requires offers to disclose the offering network). These two ACRs are meant to reduce consumer confusion and dissatisfaction, and they will go into effect on April 1. You can read more about them, including the requirements, our intent, and our guidance, on our Requirements checklist.
  2. We took a stand against ad pollution by publishing pollution indicators, and publicly calling out our first set of ad polluters.
  3. We updated our browser safety consumer apps and services (available on Browse.live) by releasing Browse.live Ad Control, a free browser extension that blocks ads from ad polluters, and Browse.live Search, an ad-pollution-free, anonymous search engine.
  4. We called out hundreds of active Deceptors, and we certified hundreds of clean, consumer-respecting apps.
  5. We ran monthly tests against the main AV products to determine how good they were at blocking Deceptors and allowing certified apps.

Not bad for a year in which we were mostly still working from home and postponing almost all customer visits.

In 2023, our mission won't change. We'll continue to help clean apps thrive by finding ways to protect consumers from getting tricked, scared, or fooled. Here's what we plan to focus on:

  1. We'll start enforcing the bundler ACRs (ACR-013 and ACR-060). We'll work to stop apps from violating these ACRs, including hunting for them, reaching out to them, and listing them on our active Deceptor list.
  2. We'll keep calling out Deceptors and Ad Polluters, so we can get them to clean up.
  3. We'll continue to expand our Browse.live consumer safety product line so that consumers can have safer and cleaner internet experiences.
  4. We'll look for more ways to encourage the AVs to better protect consumers, both on the system and in the browser. We'll do this with our feeds, our testing, our technology, and with releasing our own apps.

We're winning the fight against deceptive apps, and our clean ecosystem makes this possible. Thank you to our app makers who get their clean apps certified, our AV partners who use our Deceptor and Certified feeds to protect consumers, and our customers who use our Browse.live apps to make their internet experiences safer and cleaner.

Happy New Year from all of us at AppEsteem!

 

Security-reducing apps: a call to action

(Hong Jia and Dennis Batchelder)

We think that many AVs need to update their (potentially) unwanted software policies to make sure they can block apps that reduce security without first obtaining informed user consent. We gave a talk yesterday at AVAR 2022 in Singapore to make our case, show which AVs are currently struggling with protecting their customers against these apps, and ask them to update their policies so their customers can be better protected.

You can see the slides we used for the presentation here.

This was our abstract:

As AVs get better operationalized in their fight against unwanted software (UwS), their combined pressure is driving the software monetization industry toward finding the gaps in AV policies so they can continue to exploit consumers for easy money.

The big gap in AV policies these days, unfortunately, is around apps that make consumers' computers more vulnerable to attack. The result? A proliferation of apps that needlessly reduce their customers' security postures and set them up for future attacks, without first obtaining informed user consent. Examples include VPNs that install self-signed trusted root certificates and free apps that monetize by installing proxies that share the consumer's internet connection and processor.

Lately, these security-reducing apps that don't obtain informed consent are grabbing public attention: articles about them are popping up in both security blogs and computer industry news. Some platforms and AVs are beginning to respond, but only by adding detections after others have called these apps out. The platforms and AVs have been slow to update their policies and slow to detect these apps as UwS, which leaves a gap that software monetizers continue to exploit.

Our session will show examples of how these apps reduce their customers' security postures. We will highlight the platform and AV public policy gaps that have led to their spread. We'll make suggestions as to how AVs can enhance their policies to better protect their customers from these apps.

Be ready for December 13: remove the urgency from free scans

Last January, Microsoft posted a blog titled "Protecting customers from being intimidated into making an unnecessary purchase." The blog announced that effective March 1, they would be tightening up what they considered to be coercive messaging. The two new areas they called out were:

  1. Reporting the results in an exaggerated or alarming manner
  2. Requiring the user to "pay" to fix free scan results

We welcomed these changes, as it demonstrated Microsoft's resolve to go after the app vendors who were taking advantage of consumers to push unnecessary system utilities. But we also recognized that this was a significant change for many system utilities, including those that we had already certified.

Facing this change, we decided that the first step was to see if the anti-malware ecosystem could align on our understanding of Microsoft's principles. We worked with our security partners to come up with wording for a new application certification requirement (ACR-004). We also worked with many affected app vendors, CleanApps.org, compliance partners, and consumer groups to clarify the wording and provide examples of apps that either passed or failed ACR-004.

This took a few months to work through. These kinds of discussions are not easy, especially when the affected parties also include anti-malware vendors. But after all the discussions, we ended up with a requirement that we believe will both help consumers and still allow vendors to continue to demonstrate and monetize the value of their apps.

We set our enforcement date to be December 13, 2018. This means that any apps that do not meet ACR-004 by December 13, including new versions of apps that we have previously certified, may be added to our active Deceptor list.

ACR-004 states: "When showing free scan results with the intent to monetize, results are substantiated and avoid any exaggerated sense of urgency, and app provides free fixes for all free scan results shown when the fix is not anticipated to be permanent or the fix offered is an ongoing service."

So what does this mean? If you're using free system utility scan results to monetize your solution, keep the following points in mind:

  • Make sure your free scan results are truthful, detailed, and substantiated.
  • Don't map free scan results to graphs, gauges, meters, or other ways to "measure" how important they are.
  • Unless you're reporting on immediate threats to the system or consumer (a good example is active malware), don't use differentiating colors to highlight your free scan results.
  • Unless you're providing a one-time permanent fix that's not an ongoing subscription, let the consumer "try" your solution by fixing all the results you show for free.
  • If you're fixing free scan results for free as part of a "trial," don't pre-collect payment details or ask the consumer to perform any other tasks beyond providing their email.
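
As a rough illustration only, the checklist above can be treated as a set of independent checks. The sketch below is under our own assumptions; the attribute and function names are ours and are not part of ACR-004's wording.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FreeScanPresentation:
    """Illustrative model of how an app presents monetized free scan results."""
    results_substantiated: bool        # results are truthful, detailed, and verifiable
    uses_urgency_visuals: bool         # graphs, gauges, or meters that "measure" importance
    uses_alarm_colors: bool            # differentiating colors on results
    reports_immediate_threats: bool    # e.g., active malware
    fixes_all_results_free: bool       # consumer can fix everything shown for free
    one_time_permanent_fix: bool       # fix is a one-time permanent fix, not an ongoing service
    trial_collects_payment_info: bool  # free "trial" pre-collects payment details

def acr004_concerns(p: FreeScanPresentation) -> List[str]:
    """Return which of the points above (as we summarized them) the presentation may trip."""
    concerns = []
    if not p.results_substantiated:
        concerns.append("results must be truthful, detailed, and substantiated")
    if p.uses_urgency_visuals:
        concerns.append("don't map results to graphs, gauges, or meters")
    if p.uses_alarm_colors and not p.reports_immediate_threats:
        concerns.append("don't use differentiating colors for non-immediate issues")
    if not p.fixes_all_results_free and not p.one_time_permanent_fix:
        concerns.append("let the consumer fix all shown results for free")
    if p.trial_collects_payment_info:
        concerns.append("don't pre-collect payment details for a free trial")
    return concerns
```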

You can read more details and see both good and bad examples for ACR-004 on our requirements checklist. We're happy to help vendors understand ACR-004, and we offer both free and paid services to help companies comply.

 

Adjusting our Ad Injector/Blocker Requirements

Over the past few months, new standards for ads have been released by both BetterAds.org and the IAB. We think that these are in response to the proliferation of more and more ad blockers; the ad industry has started taking responsibility for the quality of online ads.

And while we felt this was great news for consumers, we also realized that it was time to update our own certification requirements for apps that inject or block ads. So we spent the past few months working with our customers, some of the larger ad injector vendors, compliance partners, various security and platform companies, and CleanApps.org.

This work drove significant changes: not only did we adjust the requirements, but some of the requirements were promoted to Deceptor-level. Starting in October, we'll be reviewing and calling out bad ad injectors and blockers and adding them to our active Deceptor list.

You can find a summary of the changes in the following ad injector requirement updates document. Please feel free to use this to understand the context behind the changes. Also, all the changes are live in our online requirements checklist.
