
How to get ChatGPT to break character and why SMS is the worst authentication method

Security weekly

Hi, and welcome to another Security weekly, where we laugh, we cry, and share the latest and greatest in security and tech news.

In this week's edition:
💬 Why SMS is the worst authentication factor around
🕺 How to get ChatGPT to break character with prompt engineering
Disable User Explains: Social Engineering
🔥 The quick and dirty

Reading time:

Why SMS is the worst authentication factor around

OK, quick question: who here uses SMS for any of their multi-factor accounts?

WELL YOU’RE WRONG!

Just kidding, just kidding. Maybe.

Twitter was once again in the news, this time because of its decision to disable SMS-based 2FA.

While I don't really have an opinion about it, and I couldn't give two shits about Twitter, it does raise a good question:
Are people still using SMS-based 2FA?

While not inherently "bad", SMS-based authentication is the most vulnerable of all MFA methods, with phone calls a close second.
Here's why:

  • SMS is old - let’s face it. SMS is the wild wild west of authentication methods. No rules, no real controls, and nobody is going to do anything about it.

  • SMS is an easy target for Spoofing - most text messages are not encrypted, and phone networks aren't known for their security, making the traffic easy to intercept and read.

  • SMS is an easy target for Phishing - you'd be surprised how many phone numbers are available on malicious sites. Yes, even yours (and mine). Lots of people get scammed every year by fake texts sent by 'their bank'.

  • SMS is vulnerable to SIM swapping - probably not something you'll be targeted by unless you're a high roller, but it's a very sophisticated attack that hands over your keys to the kingdom.

  • Mobile service providers are often targeted for Social Engineering - with so much access to valuable data, they're a hacker's dream come true.
    Watch this video of social engineering performed live at Def Con:

So, should I stop using SMS altogether?

Yes, and no.
If there is no other MFA option available, use SMS; it's still better than not using MFA at all.
If something like a one-time password (OTP) app is available, always use that.
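
For the curious: that authenticator app on your phone isn't doing anything magical. Below is a minimal sketch of how a time-based one-time password (TOTP, RFC 6238) gets generated, in Python, using only the standard library. The secret in the example is a placeholder for illustration; your provider hands you the real one when you scan the QR code.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32.upper())
    counter = int(time.time()) // interval           # current 30-second time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


if __name__ == "__main__":
    # Placeholder secret for illustration only; use the one your provider gives you.
    print(totp("JBSWY3DPEHPK3PXP"))
```

The important bit: the code is derived locally from a shared secret and the current time, so nothing travels over the phone network for an attacker to intercept, spoof, or redirect.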

A quick overview by Microsoft

How to get ChatGPT to break character with prompt engineering

A few weeks ago I posted about the A.I. text generator ChatGPT being used to write ransomware. But this week, I came across something hilarious...

Those who have played around with ChatGPT a bit will know it's crucial to put in the right prompt.
Too little detail, or too much filler, and you'll mess up your answer.
That's why some people can create beautiful paintings with DALL-E, while mine all look like my toddler drew them.

That’s where prompt engineering comes into play.

Prompt Engineering is a concept in A.I. that focuses on crafting the best possible input for the output you want, with whatever A.I. tool you're using.
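
To make that concrete, here's a rough sketch of the difference between a lazy prompt and an engineered one sent to the same model. I'm assuming the OpenAI chat completions endpoint here, with your API key in an OPENAI_API_KEY environment variable; the prompts and the model name are purely illustrative.

```python
import os

import requests

# Two prompts for the same task: one vague, one engineered with a role,
# an output format, and constraints.
VAGUE_PROMPT = "write something about phishing"
ENGINEERED_PROMPT = (
    "You are a security awareness trainer. Write three bullet points, "
    "max 20 words each, explaining to non-technical employees how to spot "
    "a phishing text message. Use plain language, no jargon."
)


def ask(prompt: str) -> str:
    """Send a single user prompt to the chat completions endpoint and return the reply."""
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # illustrative model name, swap in your own
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask(VAGUE_PROMPT))
    print(ask(ENGINEERED_PROMPT))
```

Same model, same endpoint; the only thing that changes is how much context and structure you give it, and that's usually the difference between a usable answer and toddler art.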

And like anything else out there, it's used both for good and for... bad?

ChatGPT has a filter which restrains it from making political statements, using swear words, being racist or bigoted, or doing any other thing that might offend someone. You know, the good stuff.

But with the correct prompting, that all changes.

Introducing DAN

Some clever people spent their time coming up with DAN (Do Anything Now), an alter ego for ChatGPT.

With the correct phrasing, ChatGPT can be made to assume the identity of DAN and ignore all the restraints it had.
People get really creative with these kinds of prompts, making ChatGPT - or rather DAN - say the weirdest stuff.

Of course this gets hot-fixed pretty quickly, but people are just as creative at finding new ways around the new rules.

Disable User Explains: Social Engineering

The use of deception to manipulate individuals into divulging confidential or personal information that may be used for fraudulent purposes.

In other words: a type of attack that targets the weakest link in the chain: the human.
Examples include the immensely popular WhatsApp impersonation scams, where the scammer pretends to be a family member in danger who needs money.

The quick and dirty

memes.xlsx on Twitter