For most of the last decade, Kimsuky was known as the DPRK's diplomatic spy. Mandiant's April 2023 report on APT43 — the name Mandiant uses for the cluster that overlaps significantly with what Kaspersky, Microsoft, and CrowdStrike call Kimsuky — described a group that stole research from Korea-watchers, think tanks, and ministries. In late 2023 and through 2024, something shifted. The phishing lures shifted from Korea Foundation fellowships to developer job descriptions. GitHub repositories started appearing that looked almost, but not quite, like legitimate open-source projects. And npm and PyPI registries began logging a trickle of packages whose metadata pointed back to infrastructure that CISA and KISA had already burned in joint advisories.
Kimsuky is not the loudest DPRK cluster. Lazarus gets the headlines because Lazarus is the one that steals the cryptocurrency — the Ronin bridge hit in March 2022, the Atomic Wallet incident in June 2023, the Stake.com breach in September 2023. Kimsuky's work is quieter and, in some ways, more dangerous to the software ecosystem. When a group optimized for intelligence collection starts targeting the people who ship code, the blast radius is not a single theft event. It is the slow erosion of trust in upstream dependencies.
What changed in the tradecraft
The July 2023 CISA and NSA advisory AA23-187A described Kimsuky's operational playbook in terms that would have been familiar to any counterintelligence officer. Spear-phishing against academics, use of compromised webmail, long cultivation cycles. By contrast, the joint advisory from KISA and NPA published in Korean in early 2024 described something new: fake recruiter personas on LinkedIn and Telegram that were pitching "take-home coding challenges" to engineers working at South Korean defense contractors, semiconductor firms, and — more recently — cryptocurrency exchanges.
The coding challenges were the delivery mechanism. A zip file containing a Node.js or Python project. A README that asked the candidate to "run npm install and fix the failing test." Buried in the dependency graph was a package — usually a typosquat of a well-known library, sometimes a legitimate-looking utility published to a scoped namespace — that fetched a second-stage payload on install. The second stage was a BeaverTail or InvisibleFerret variant, the same malware family documented by Palo Alto's Unit 42 in their November 2023 "Contagious Interview" write-up. Kimsuky did not invent the pattern. Lazarus sub-clusters had been running it through 2023. But Kimsuky adopted it, tuned it for their espionage objectives rather than theft, and scaled it.
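That install-time hook is worth seeing concretely. In npm, any package.json in the dependency tree can declare lifecycle scripts (preinstall, install, postinstall, prepare) that run automatically during npm install. A minimal triage sketch, in Python for portability: the function name is our own, and a real tool would also need to cover yarn and pnpm. It walks a take-home project and surfaces those hooks before anyone runs the install.

```python
import json
from pathlib import Path

# Lifecycle hooks that npm executes automatically during `npm install`
LIFECYCLE_HOOKS = ("preinstall", "install", "postinstall", "prepare")

def find_install_hooks(project_root: str) -> list[tuple[str, str, str]]:
    """Return (manifest path, hook name, command) for every lifecycle script
    found in any package.json under the project root."""
    findings = []
    for manifest in Path(project_root).rglob("package.json"):
        try:
            scripts = json.loads(manifest.read_text()).get("scripts", {})
        except (OSError, json.JSONDecodeError):
            continue  # unreadable manifest; a production tool should flag this too
        for hook in LIFECYCLE_HOOKS:
            if hook in scripts:
                findings.append((str(manifest), hook, scripts[hook]))
    return findings
```

Running this over a "candidate challenge" zip before installing it costs seconds; a hook that curls a remote script or decodes a base64 blob is an immediate stop signal.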
Why developers, why now?
The answer most analysts give — that developers have credentials and access that matter to a state actor — is true but incomplete. The more interesting answer is that developers are the weakest link in every serious air-gapped environment. A defense contractor's classified network might be locked down tight, but the laptop that the lead engineer uses for prototyping and brings home on Fridays is not. It has GitHub credentials. It has AWS keys in a .env file the engineer forgot about. It has an SSH config that points at the jump host for the corporate VPN. And it executes arbitrary code every time the engineer runs npm install or pip install -r requirements.txt.
The Kimsuky operators seem to have understood this. The packages they pushed to npm in the first half of 2024 — Phylum published IOCs for twelve of them between March and June — were not designed to steal production data directly. They were designed to establish a foothold on a developer workstation and then wait. Wait for the developer to open a terminal connected to a staging environment. Wait for them to paste an API key into their shell. Wait for them to commit something they shouldn't have.
The package ecosystem angle
Kimsuky's npm activity is modest compared to what DPRK-affiliated groups pushed in 2023. Socket's research team counted roughly 320 malicious npm packages attributable to North Korean activity across all clusters during 2024 — most of them Lazarus-linked. Kimsuky's share was smaller but more targeted. Where Lazarus tended to name-squat popular packages (etherscan-sdk-beta, variations on web3-eth), Kimsuky's packages had names that only made sense if you had already been baited into a specific coding challenge: utility libraries whose names matched the job description of a specific target company.
This is a meaningful distinction. A broad name-squatting campaign can be caught by registry-side heuristics. A package uploaded specifically to be installed by a handful of engineers at a handful of companies is much harder to detect automatically. It requires the target organization to be watching what gets installed into its own build environments.
The CVE footprint
Kimsuky's tooling has not historically relied on novel zero-days. The group prefers credential theft and social engineering. But 2024 did see them adopt a few known-and-patched issues as opportunistic entry points. CVE-2024-21413, the Outlook moniker-link vulnerability disclosed by Check Point in February 2024, appeared in Kimsuky samples by summer. CVE-2023-38831, the WinRAR extension-spoofing bug from August 2023, continued to appear in Kimsuky spear-phishing attachments well into Q3 2024 — a reminder that a patched vulnerability still lives wherever users have not updated.
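Defenders can triage inbound archives for the WinRAR bug with a structural check. Public write-ups describe the exploit layout as a decoy filename ending in a trailing space paired with a same-named directory holding the payload; WinRAR 6.23 and later fix the parsing flaw. The sketch below is a heuristic, not a signature, and since Python's standard library only reads zip archives it will miss RAR-format samples.

```python
import zipfile

def looks_like_cve_2023_38831(archive_path: str) -> bool:
    """Flag zip archives that pair a trailing-space filename with a
    same-named directory, the structure used by the WinRAR exploit."""
    with zipfile.ZipFile(archive_path) as zf:
        names = zf.namelist()
    top_files = {n for n in names if "/" not in n}
    top_dirs = {n.split("/", 1)[0] for n in names if "/" in n}
    # "report.pdf " (file) colliding with "report.pdf /" (directory) is the tell
    return any(f.endswith(" ") and f in top_dirs for f in top_files)
```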
What engineering organizations should actually do
The guidance that CISA and KISA put out is correct but high-level: train your people, verify your suppliers, use MFA. The operational version, for an engineering org that takes this threat seriously, looks more like:
Treat the developer laptop as untrusted when it executes third-party package install scripts. Run npm install and pip install inside a sandbox that does not have access to SSH keys, cloud credentials, or the corporate VPN. Several organizations have moved to ephemeral dev containers for exactly this reason.
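Even short of a full sandbox, the environment handed to the install process can be stripped of credentials first. A sketch of that layer: the denylist below is illustrative, not exhaustive, and it does not stop filesystem reads, so it complements a container or ephemeral VM rather than replacing one.

```python
import os
import subprocess

# Illustrative denylist -- tune to what your workstations actually carry
SENSITIVE_PREFIXES = ("AWS_", "AZURE_", "GOOGLE_", "GITHUB_", "NPM_")
SENSITIVE_KEYS = {"SSH_AUTH_SOCK"}

def scrubbed_env() -> dict[str, str]:
    """Copy of the current environment with credential-bearing variables removed."""
    return {
        k: v for k, v in os.environ.items()
        if not k.startswith(SENSITIVE_PREFIXES) and k not in SENSITIVE_KEYS
    }

def scrubbed_pip_install(requirements: str, target_dir: str) -> int:
    """Install into a throwaway directory, with the scrubbed environment,
    so install-time code cannot read cloud keys out of os.environ."""
    cmd = ["pip", "install", "--no-cache-dir", "--target", target_dir,
           "-r", requirements]
    return subprocess.run(cmd, env=scrubbed_env()).returncode
```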
Maintain an allowlist of packages and versions that your codebase is permitted to depend on. A pull request that introduces a new dependency should require human review of the package's provenance, not just a green check from CI.
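That review gate can be enforced mechanically in CI. A sketch assuming npm's lockfileVersion 2/3 layout, where every installed package appears under the top-level "packages" map, checked against an allowlist of name==version entries:

```python
import json

def locked_deps(lockfile_path: str) -> set[str]:
    """name==version pairs from a package-lock.json (lockfileVersion 2/3)."""
    with open(lockfile_path) as f:
        lock = json.load(f)
    deps = set()
    for path, meta in lock.get("packages", {}).items():
        if not path:  # the "" key is the root project itself
            continue
        # strip the node_modules/ prefix, including nested installs
        name = path.split("node_modules/")[-1]
        deps.add(f"{name}=={meta.get('version', '?')}")
    return deps

def allowlist_violations(lockfile_path: str, allowlist: set[str]) -> set[str]:
    """Dependencies present in the lockfile but absent from the allowlist."""
    return locked_deps(lockfile_path) - allowlist
```

A non-empty violation set fails the build until a human has reviewed the new package's provenance.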
Log and review what gets installed. The output of npm audit and of tools like pip-audit is useful, but so is the much simpler practice of diffing lockfiles and asking why new transitive dependencies appeared.
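The lockfile diff can be as small as a set difference over package names, assuming the same lockfileVersion 2/3 format npm has emitted since v7:

```python
import json

def dep_names(lockfile_path: str) -> set[str]:
    """Direct and transitive dependency names from a lockfileVersion 2/3 file."""
    with open(lockfile_path) as f:
        lock = json.load(f)
    return {p.split("node_modules/")[-1]
            for p in lock.get("packages", {}) if p}

def new_dependencies(old_lock: str, new_lock: str) -> set[str]:
    """Names present in the new lockfile but not the old one -- each one is
    a 'why did this appear?' question for the PR author."""
    return dep_names(new_lock) - dep_names(old_lock)
```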
Treat the recruiter as part of the threat model. If your engineers are active on LinkedIn, that model includes a fake recruiter sending a "quick coding exercise." The appropriate response is not to block LinkedIn but to give engineers a safe, isolated environment to run untrusted code from strangers.
How Safeguard Helps
Safeguard applies reachability analysis to filter out the transitive dependencies that cannot actually be executed, so the signal from a Kimsuky-style targeted package is not buried under thousands of unreachable false positives. Griffin AI correlates newly published packages against known DPRK infrastructure indicators, flagging suspicious install-time behavior before it reaches a developer workstation. Our SBOM pipeline records every dependency that enters your build — including the ones that came in through a "candidate take-home project" — so incident responders have a ground-truth inventory. The TPRM module scores upstream maintainers against threat intelligence feeds, and policy gates block packages whose provenance does not meet your defined trust bar. Together, these controls turn the developer laptop from a soft target into an instrumented one.