False Face: Unit 42 Demonstrates the Alarming Ease of Synthetic Identity Creation
Summary:
Researchers at Palo Alto Networks’ Unit 42 have observed evidence of North Korean IT workers using real-time deepfake technology to infiltrate organizations through remote work positions, a tactic that poses significant challenges for defenders. The researchers outlined detection strategies intended to provide practical guidance for protecting organizations from these deceptive threat actors. They replicated the adversary’s process, creating a real-time deepfake with readily available tools and inexpensive consumer hardware despite having no prior experience. Both the talent acquisition and cybersecurity communities have recently reported a growing trend of job seekers employing sophisticated real-time deepfake technology during interviews. Investigators have found instances where candidates presented fake video feeds, often reusing the exact same virtual background across multiple applicant identities. Unit 42 researchers connected a documented case study, involving a Polish AI company that encountered two separate deepfake candidates, to DPRK IT worker operations, based on indicators shared in the case study that align with known DPRK IT worker TTPs.
Deepfake Detection Opportunities:
- Temporal consistency issues: Rapid head movements caused noticeable artifacts as the tracking system struggled to maintain accurate landmark positioning
- Occlusion handling: When the operator's hand passed over their face, the deepfake system failed to properly reconstruct the partially obscured face
- Lighting adaptation: Sudden changes in lighting conditions revealed inconsistencies in the rendering, particularly around the edges of the face
- Audio-visual synchronization: Slight delays between lip movements and speech were detectable under careful observation
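The temporal-consistency weakness above lends itself to a simple heuristic: track facial landmark positions frame to frame and flag abrupt jumps that suggest the tracker lost and re-acquired the face. The sketch below is a minimal, hypothetical illustration; it assumes landmark coordinates have already been extracted by a face tracker (e.g., MediaPipe or dlib, which are outside this sketch), and the threshold value is an assumption to be tuned on real footage.

```python
import math

def flag_temporal_jitter(landmark_frames, threshold=15.0):
    """Flag frames where facial landmarks jump farther (on average) than
    `threshold` pixels between consecutive frames -- a crude proxy for the
    tracking artifacts deepfake systems exhibit during rapid head movement.

    `landmark_frames` is a list of frames, each a list of (x, y) landmark
    coordinates produced by any face tracker.
    """
    flagged = []
    for i in range(1, len(landmark_frames)):
        prev, curr = landmark_frames[i - 1], landmark_frames[i]
        # Mean per-landmark displacement between consecutive frames
        disp = sum(math.dist(p, c) for p, c in zip(prev, curr)) / len(curr)
        if disp > threshold:
            flagged.append(i)
    return flagged

# Synthetic usage: stable micro-movement, then an abrupt jump at frame 3
frames = [
    [(100, 100), (140, 100)],  # frame 0
    [(101, 100), (141, 101)],  # frame 1: normal micro-movement
    [(102, 101), (142, 101)],  # frame 2
    [(160, 150), (200, 152)],  # frame 3: sudden jump -> suspicious
]
print(flag_temporal_jitter(frames))  # [3]
```

A production detector would combine several such signals (occlusion failures, lighting-edge artifacts, lip-sync offset) rather than rely on any single heuristic.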
Through this investigation and demonstration, the researchers concluded that although today’s deepfake technology still has limitations, those limitations are rapidly diminishing, and creating a convincing synthetic identity is becoming nearly effortless. North Korean APT groups have repeatedly demonstrated a strong focus on methods to fake or alter identities, and the interconnected nature of these groups further underscores the scale of the operations. This research highlights the startling accessibility of deepfake creation: combined with document forgery and audio/video manipulation tools, it gives adversaries a sophisticated, readily available social engineering lure.
Suggested Corrections:
The DPRK IT worker campaign demands close collaboration between human resources (HR) and information security teams. When both work together, it affords an organization more detection opportunities across the entire hiring and employment lifecycle.
For HR Teams:
- Ask candidates to turn their cameras on for interviews, including initial consultations
- Record these sessions (with proper consent) for potential forensic analysis
- Implement a comprehensive identity verification workflow that includes:
- Document authenticity verification using automated forensic tools that check for security features, tampering indicators and consistency of information across submitted documents
- ID verification with integrated liveness detection that requires candidates to present their physical ID while performing specific real-time actions
- Matching between ID documents and interviewee, ensuring the person interviewing matches their purported identification
- Train recruiters and technical interviewing teams to identify suspicious patterns in video interviews such as unnatural eye movement, lighting inconsistencies and audio-visual synchronization issues
- Have interviewers get comfortable with asking candidates to perform movements challenging for deepfake software (e.g., profile turns, hand gestures near the face or rapid head movements)
For Security Teams:
- Secure the hiring pipeline by recording job application IP addresses and checking they aren't from anonymizing infrastructure or suspicious geographic regions
- Enrich provided phone numbers to check if they are Voice over Internet Protocol (VoIP) carriers, particularly those commonly associated with identity concealment
- Maintain information sharing agreements with partner companies and participate in applicable Information Sharing and Analysis Centers (ISACs) to stay current on the latest synthetic identity techniques
- Identify and block software applications that enable virtual webcam installation on corporate-managed devices when there is no legitimate business justification for their use
- Monitor for abnormal network access patterns post-hiring, particularly connections to anonymizing services or unauthorized data transfers
- Deploy multi-factor authentication methods that require physical possession of devices, making identity impersonation more difficult
- Develop clear protocols for handling suspected synthetic identity cases, including escalation procedures and evidence preservation methods
- Create a security awareness program that educates all employees involved in hiring about synthetic identity red flags
- Establish technical controls that limit access for new employees until additional verification milestones are reached
- Document verification failures and share appropriate technical indicators with industry partners and relevant government agencies
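As one concrete example of securing the hiring pipeline, applicant source IPs can be screened against ranges associated with anonymizing infrastructure. The sketch below is illustrative only: the hardcoded ranges are stand-in assumptions, and in practice the list would be populated from a commercial threat-intelligence feed or the Tor Project’s published exit-node list and refreshed on a schedule.

```python
import ipaddress

# Hypothetical blocklist for illustration only -- real deployments would
# load and refresh these ranges from a threat-intelligence feed.
ANONYMIZER_RANGES = [
    ipaddress.ip_network("185.220.100.0/22"),  # example range seen in Tor exit lists
    ipaddress.ip_network("192.0.2.0/24"),      # RFC 5737 doc range, stand-in for a VPN provider
]

def screen_applicant_ip(ip_str):
    """Return True if the applicant's source IP falls inside a known
    anonymizing range and should be flagged for manual review."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in ANONYMIZER_RANGES)

print(screen_applicant_ip("192.0.2.15"))   # True  -> flag for review
print(screen_applicant_ip("203.0.113.7"))  # False -> no match in this list
```

A hit here should trigger manual review rather than automatic rejection, since VPN use alone is not proof of a synthetic identity.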
https://unit42.paloaltonetworks.com/north-korean-synthetic-identity-creation/