If I go to someone and ask for a therapy session, even if they are the most supportive, thoughtful person I could hope to find, it's not appropriate for them to hold it out as proper therapy. We have rules and restrictions on who is allowed to offer certain services, and for good reason. And if one asks their therapist about confidentiality, it's highly inappropriate for the therapist to misrepresent the confidentiality rules, again for good reason.
ChatGPT will gladly claim to be able to provide this support, and will promise complete anonymity. It will say that it's able to offer good advice and guidance. As you said, the text it generates will certainly be natural-sounding. It also won't be therapy. It definitely won't be anonymous.
A person who lies about what they're selling faces consequences. A disclaimer on the door of a "medicinalist" reading "No promise of truthful information; verify anything important" isn't going to shield them if they're selling arsenic as a cure-all. If a company wants to offer a service, there should be limits on what they can claim to be offering.
There are several companies that provide access to bank payments, but they all tend to have substantial limits, especially for US banking.