trustd on macOS



New member
Nov 13, 2020
Hi all! Something pretty odd happened during yesterday's release of Big Sur, so I figured I'd make a post. I already sent @Steve a DM on Twitter about it, since it might be something to cover on SN.

I noticed yesterday that, out of the blue, it was taking anywhere from 30 seconds to 5 minutes to launch non-Apple apps on my Mac. I figured there was some kind of memory leak, so I restarted the machine, but the problem persisted. For a second I was worried about some kind of hardware failure, since the machine kept beachballing and mouse movement was laggy. Things got weird when the same problem happened on another of my Macs at the same time.

Apparently, whenever you launch an app on macOS, the trustd daemon sends a hash of that app to Apple's servers for verification. Apple was having some massive outages yesterday related to the Big Sur release, so while this server was still online, it was very slow to respond to requests. The temporary fix was to disconnect from the internet (which stops this mechanism) or to block/redirect that DNS entry in the hosts file or at the router. Supposedly this traffic is sent in the clear.
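The "disconnect and it works" behaviour suggests the check fails open when the server is unreachable, but stalls when the server is reachable yet slow. Here's a toy sketch of such a soft-fail check — not Apple's actual code; the hostname and timeout are purely illustrative:

```python
import socket

def soft_fail_check(responder_host, timeout=2.0):
    """Toy model of a soft-failing launch check: if the responder is
    unreachable (no DNS, no route, blocked via the hosts file), fail
    open immediately; a *reachable but slow* server would instead
    stall here for up to `timeout` seconds per request."""
    try:
        with socket.create_connection((responder_host, 80), timeout=timeout):
            pass  # a real client would now send its verification request
        return "verified"
    except OSError:
        return "allowed (soft fail)"

# A name that never resolves behaves like a hosts-file block:
print(soft_fail_check("ocsp.invalid"))  # → allowed (soft fail)
```

This is why both workarounds above behave the same: a blocked or unresolvable responder trips the `OSError` path instantly, while a slow-but-alive server makes every launch wait out the timeout.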

I think there are some pretty big privacy implications here, especially since there have been some changes to the network stack in Big Sur. Application firewalls like Little Snitch and LuLu can no longer block certain processes that Apple includes on its "ContentFilterExclusionList".

I figured this certainly would be worthy of discussion here :)
This is simply a bad design choice by Apple. They really should find a different mechanism than an always-on query for every attempted app launch. It's technically the most secure approach if a large number of apps were being added to the block list regularly, but one can hardly presume that to be the case. Check ONCE at install, remember the result, and have a mechanism to push updates to the bad list regularly. Yes, in theory an app which had been delisted would still be allowed to run on some devices for some time after being delisted, but doing it the way they do is just asking for abuse. Imagine if someone MITMed their service somehow and declined all app launches on all Macs everywhere! It would be pandemonium, and potentially unfixable until the MITM were resolved.
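The alternative proposed above — verify once at install, cache the verdict, and push revocations out to clients — could be sketched like this. All names here are hypothetical, not Apple's APIs:

```python
# Sketch of the "check once, then push revocations" design proposed
# above. Purely illustrative; none of these names are Apple's.
verified = set()   # apps whose signature checked out at install time
revoked = set()    # locally synced copy of the pushed "bad list"

def on_install(app_id):
    # a full signature/notarization check would happen here, once
    verified.add(app_id)

def on_blocklist_push(new_entries):
    # periodic update pushed by the vendor, not queried per launch
    revoked.update(new_entries)

def may_launch(app_id):
    # launch decision needs no network round trip
    return app_id in verified and app_id not in revoked

on_install("com.example.word")
print(may_launch("com.example.word"))   # → True
on_blocklist_push({"com.example.word"})
print(may_launch("com.example.word"))   # → False
```

The trade-off is exactly the one named above: between blocklist pushes, a freshly revoked app still launches on machines that haven't synced yet.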
omg! I saw that earlier.

It's not just terrible design. It completely goes against their own speech about user privacy.

And if you think that is an Apple thing, keep dreaming. Windows 10 is also bloated with telemetry software, by design. I've been wondering why Microsoft decided to allow people to keep using an unlicensed copy of Windows indefinitely. I guess this is the answer.
Great points about bad design and Windows 10 telemetry. It caught me a bit off guard, to be honest. The article I posted was pretty sensationalist (Louis Rossmann made a video on YouTube that went even further down that route), but I think it does raise a few good points about all of this being done in the clear and how it could be subject to abuse. I'm curious to see if Apple even responds to it. Ars did write an article on it, but didn't really delve too far into the notarization process. I did find a post from the Eclectic Light Company that explained the notarization process really well.
I agree and I disagree.

trustd uses the OCSP protocol, which is plain text. Guess what: your web browser also uses that protocol in plain text when you visit a TLS-protected website (unless you're using Google Chrome, because Google decided otherwise).
It is used to confirm whether an X.509 certificate has been revoked.

My thoughts:
  • Can it be abused?
    Of course.
  • Can it be improved?

I would like to see it use TLS, but how? You would need to support versions of TLS that won't be supported in the future; not an easy problem to solve.
  • Is it important to have OCSP validation?
I would argue yes. It allows the system to assert that the application wasn't tampered with during transmission over the network, and that no malicious code was injected.

I don't know how often certificate revocation occurs at Apple, but I do expect it to happen regularly. We just don't hear about them, because they're low-key and, I believe, most often developer-initiated with no malicious action.
    • Should Apple have monopoly of the Code Signing certificates on macOS?
      Yes and no.

      Yes for anything affecting low-level system components, like kernel extensions, DriverKit applications and virtualization software.
      No for more regular applications, like Microsoft Word, Cyberduck and Firefox. They could have a certificate authority program that does it on their behalf.
  • Does trustd send the hash of the application to Apple every time it is started?
    No. That response is cached on the computer for at most 12 hours (see the screenshot below of one such response).

I've actually tried to trigger it with multiple Microsoft apps, and it only polled once. It seems to be tied to the developer certificate in use, not to the application itself. So an observer might learn which developer's apps one uses, but not which application, and not what occurs within the application.

For instance, Microsoft Excel, Microsoft Word and Microsoft PowerPoint use the exact same certificate. You won't be able to tell from the OCSP request which app is being launched.

    OCSP response for a Microsoft Apple Developer Certificate.
  • Does Apple use the same servers to validate their own applications?
    Yes, they do, except if the application lives in the /System/Applications folder.
  • Can it be moved to be fully client-side?
We get into "why CRLs failed" territory. You would have to keep a list of every revoked certificate, ever, on every computer (including certificates that were never distributed publicly)… That can become bandwidth-heavy really fast.

Unlike server authentication certificates, code signing certificates might not expire on their own, as long as they have a valid cryptographic timestamp from within the validity period. You need to revoke them if something bad occurs (e.g. the developer's code signing private key got compromised).

That's why software signed 10 years ago (e.g. for Windows 7) is still able to start without cryptographic security warnings.

FYI, here's the OCSP target:
[Screenshot: OCSP responder address]
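The caching behaviour described above — one verdict per developer certificate, reused for up to 12 hours — can be modelled with a small TTL cache. This is a toy sketch, not trustd's implementation, and the "good" verdict is a placeholder rather than a parsed OCSP response:

```python
import time

TTL = 12 * 3600  # observed maximum cache lifetime, per the post above

class ResponseCache:
    def __init__(self):
        self._cache = {}          # cert_serial -> (verdict, fetched_at)
        self.network_queries = 0  # how often we'd actually hit the wire

    def status(self, cert_serial, now=None):
        now = time.time() if now is None else now
        hit = self._cache.get(cert_serial)
        if hit is not None and now - hit[1] < TTL:
            return hit[0]                 # cached: no OCSP request sent
        self.network_queries += 1         # one real OCSP request here
        verdict = "good"                  # placeholder verdict
        self._cache[cert_serial] = (verdict, now)
        return verdict

cache = ResponseCache()
# Word, Excel and PowerPoint share one developer certificate, so three
# launches cost a single network query...
for _ in range(3):
    cache.status("ms-cert-serial", now=0)
print(cache.network_queries)              # → 1
# ...and only after the 12-hour TTL expires does the cache refresh:
cache.status("ms-cert-serial", now=TTL + 1)
print(cache.network_queries)              # → 2
```

Keying the cache on the certificate serial, rather than the app, is also what limits the metadata leak to "which developer", not "which app".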
I had a feeling that this was the place to come for a really in-depth explanation of what's going on here :) Seriously, thanks @Ed7789 for taking the time to clearly explain all of this. The design makes much more sense to me now. I would think it's pretty clear that the security advantages of such a system outweigh the potential for abuse. From what I've read, I believe the system will fail safe if it cannot resolve the server's address, but since the servers were reachable (just slow to respond), the problems occurred.
You can't stop this process, or the system won't run your application. However, you could use something like Little Snitch or Hands Off! to restrict applications from calling home.
